Lattice Gauge Theories Within and Beyond the Standard Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelzer, Zechariah John
The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$ mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$ mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$ mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) …
Probing Supersymmetry with Neutral Current Scattering Experiments
NASA Astrophysics Data System (ADS)
Kurylov, A.; Ramsey-Musolf, M. J.; Su, S.
2004-02-01
We compute the supersymmetric contributions to the weak charges of the electron ($Q_W^e$) and the proton ($Q_W^p$) in the framework of the Minimal Supersymmetric Standard Model. We also consider the ratios of neutral-current to charged-current cross sections, $R_\nu$ and $R_{\bar\nu}$, in $\nu$ ($\bar\nu$)-nucleus deep inelastic scattering, and compare the supersymmetric corrections with the deviations of these quantities from the Standard Model predictions implied by the recent NuTeV measurement.
CP Asymmetries in B0 Decays Beyond the Standard Model
NASA Astrophysics Data System (ADS)
Dib, Claudio O.; London, David; Nir, Yosef
Of the many ingredients of the Standard Model that are relevant to the analysis of CP asymmetries in B0 decays, some are likely to hold even beyond the Standard Model while others are sensitive to new physics. Consequently, certain predictions are maintained while others may show dramatic deviations from the Standard Model. Many classes of models may show clear signatures when the asymmetries are measured: four quark generations, Z-mediated flavor-changing neutral currents, supersymmetry and “real superweak” models. On the other hand, models of left-right symmetry and multi-Higgs sectors with natural flavor conservation are unlikely to modify the Standard Model predictions.
Right-handed charged currents in the era of the Large Hadron Collider
Alioli, Simone; Cirigliano, Vincenzo; Dekens, Wouter Gerard; ...
2017-05-16
We discuss the phenomenology of right-handed charged currents in the framework of the Standard Model Effective Field Theory, in which they arise due to a single gauge-invariant dimension-six operator. We study the manifestations of the nine complex couplings of the W to right-handed quarks in collider physics, flavor physics, and low-energy precision measurements. We first obtain constraints on the couplings under the assumption that the right-handed operator is the dominant correction to the Standard Model at observable energies. We subsequently study the impact of degeneracies with other Beyond-the-Standard-Model effective interactions and identify observables, both at colliders and in low-energy experiments, that would uniquely point to right-handed charged currents.
Flavour-changing neutral currents making and breaking the standard model.
Archilli, F; Bettler, M-O; Owen, P; Petridis, K A
2017-06-07
The standard model of particle physics is our best description yet of fundamental particles and their interactions, but it is known to be incomplete. As yet undiscovered particles and interactions might exist. One of the most powerful ways to search for new particles is by studying processes known as flavour-changing neutral current decays, whereby a quark changes its flavour without altering its electric charge. One example of such a transition is the decay of a beauty quark into a strange quark. Here we review some intriguing anomalies in these decays, which have revealed potential cracks in the standard model, hinting at the existence of new phenomena.
Standardized Tests and Froebel's Original Kindergarten Model
ERIC Educational Resources Information Center
Jeynes, William H.
2006-01-01
The author argues that American educators rely on standardized tests at too early an age when administered in kindergarten, particularly given the original intent of kindergarten as envisioned by its founder, Friedrich Froebel. The author examines the current use of standardized tests in kindergarten and the Froebel model, including his emphasis…
Positive Matrix Factorization Model for environmental data analyses
Positive Matrix Factorization is a receptor model developed by EPA to provide scientific support for current ambient air quality standards and implement those standards by identifying and quantifying the relative contributions of air pollution sources.
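As a rough illustration of the factorization at the heart of a receptor model, the sketch below uses scikit-learn's plain NMF; this substitution is an assumption for illustration only, since EPA's PMF additionally weights each observation by its measurement uncertainty. It splits a samples-by-species concentration matrix into source contributions and source profiles.

```python
# A minimal non-negative factorization sketch (plain NMF, not EPA's
# uncertainty-weighted PMF): X (samples x species) ~= G (contributions) x F (profiles).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((365, 20))            # 365 daily samples x 20 chemical species (synthetic)

model = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
G = model.fit_transform(X)           # per-sample source contributions
F = model.components_                # per-source chemical profiles
print(G.shape, F.shape)              # (365, 5), (5, 20)
```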
ERIC Educational Resources Information Center
Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary
2012-01-01
The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…
Some Issues in Item Response Theory: Dimensionality Assessment and Models for Guessing
ERIC Educational Resources Information Center
Smith, Jessalyn
2009-01-01
Currently, standardized tests are widely used as a method to measure how well schools and students meet academic standards. As a result, measurement issues have become an increasingly popular topic of study. Unidimensional item response models are used to model latent abilities and specific item characteristics. This class of models makes…
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
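As a sketch of the simplest thing a site-specific import tool must do, the snippet below walks a DAVE-ML file with Python's standard XML parser and lists its declared variables. The element and attribute names follow my reading of the public DAVE-ML schema and the file name is hypothetical; a production importer would validate against the schema and also ingest the function tables.

```python
# A minimal sketch of reading a DAVE-ML model file (file name hypothetical;
# element/attribute names assumed from the public DAVE-ML schema;
# XML namespace handling omitted for brevity).
import xml.etree.ElementTree as ET

tree = ET.parse("example_aero_model.dml")
root = tree.getroot()

# DAVE-ML declares model signals as variableDef elements.
for var in root.iter("variableDef"):
    print(var.get("name"), var.get("varID"), var.get("units"))
```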
Electronic Model of a Ferroelectric Field Effect Transistor
NASA Technical Reports Server (NTRS)
MacLeod, Todd C.; Ho, Fat Duen; Russell, Larry (Technical Monitor)
2001-01-01
A pair of electronic models has been developed of a Ferroelectric Field Effect Transistor (FFET). These models can be used in standard electrical circuit simulation programs to simulate the main characteristics of the FFET. The models use the Schmitt trigger circuit as a basis for their design. One model uses bipolar junction transistors and one uses MOSFETs. Each model has the main characteristics of the FFET, which are the current hysteresis with different gate voltages and the decay of the drain current when the gate voltage is off. The drain current from each model has values similar to those of an actual FFET that was measured experimentally. The input and output resistances in the models are also similar to those of the FFET. The models are valid for all frequencies below RF levels. No attempt was made to model the high-frequency characteristics of the FFET. Each model can be used to design circuits using FFETs with standard electrical simulation packages. These circuits can be used in designing non-volatile memory circuits and logic circuits and are compatible with all SPICE-based circuit analysis programs. The models consist of only standard electrical components, such as BJTs, MOSFETs, diodes, resistors, and capacitors. Each model is compared to the experimental data measured from an actual FFET.
Modeling Bloch oscillations in ultra-small Josephson junctions
NASA Astrophysics Data System (ADS)
Vora, Heli; Kautz, Richard; Nam, Sae Woo; Aumentado, Jose
In a seminal paper, Likharev et al. developed a theory for ultra-small Josephson junctions with Josephson coupling energy (Ej) less than the charging energy (Ec) and showed that such junctions exhibit Bloch oscillations, which could be used to make a fundamental current standard that is a dual of the Josephson volt standard. Here, based on the model of Geigenmüller and Schön, we numerically calculate the current-voltage relationship of such an ultra-small junction, including various error processes present in a nanoscale Josephson junction, such as random quasiparticle tunneling events and Zener tunneling between bands. This model allows us to explore the parameter space to see the effect of each process on the width and height of the Bloch step and serves as a guide to determine whether it is possible to build a quantum current standard of metrological precision using Bloch oscillations.
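The duality mentioned above can be stated concretely: just as the Josephson volt standard locks voltage to frequency through V = (h/2e)f, Bloch oscillations lock current to frequency through I = 2ef. A minimal sketch of this relation (my illustration; the paper's numerical model of the junction dynamics is far richer):

```python
# Dual of the Josephson voltage-frequency relation: a junction whose Bloch
# oscillations are locked to a reference frequency f defines a current I = 2ef.
E_CHARGE = 1.602176634e-19  # elementary charge, C (exact SI value)

def bloch_frequency(bias_current_amps: float) -> float:
    """Bloch oscillation frequency (Hz) for a given DC bias current."""
    return bias_current_amps / (2.0 * E_CHARGE)

def current_from_frequency(f_hz: float) -> float:
    """Current defined by frequency-locked Bloch oscillations."""
    return 2.0 * E_CHARGE * f_hz

# Locking to a 1 GHz reference would define I = 2ef ~ 0.32 nA.
print(current_from_frequency(1e9))
```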
Endoscope field of view measurement.
Wang, Quanzeng; Khanicheh, Azadeh; Leiner, Dennis; Shafer, David; Zobel, Jurgen
2017-03-01
The current International Organization for Standardization (ISO) standard (ISO 8600-3: 1997 including Amendment 1: 2003) for determining endoscope field of view (FOV) does not accurately characterize some novel endoscopic technologies such as endoscopes with a close focus distance and capsule endoscopes. We evaluated the endoscope FOV measurement method (the FOV WS method) in the current ISO 8600-3 standard and proposed a new method (the FOV EP method). We compared the two methods by measuring the FOV of 18 models of endoscopes (one device for each model) from seven key international manufacturers. We also estimated the device-to-device variation of two models of colonoscopes by measuring several hundred devices. Our results showed that the FOV EP method was more accurate than the FOV WS method and could be used for all endoscopes. We also found that the labelled FOV values of many commercial endoscopes are significantly overstated. Our study can help endoscope users understand endoscope FOV and identify a proper method for FOV measurement. This paper can be used as a reference for revising the current endoscope FOV measurement standard.
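The FOV EP and FOV WS procedures themselves are defined in the paper and the standard; the sketch below shows only the basic angular-FOV geometry both rest on, assuming a flat target imaged edge-to-edge at a known working distance.

```python
# Angular FOV implied by a flat target of known diameter imaged edge-to-edge
# at a known working distance (simple geometry; not the FOV EP/WS procedure).
import math

def angular_fov_deg(target_diameter_mm: float, working_distance_mm: float) -> float:
    return math.degrees(2.0 * math.atan(target_diameter_mm / (2.0 * working_distance_mm)))

# Example: a 100 mm target filling the image at 50 mm gives a 90-degree FOV.
print(angular_fov_deg(100.0, 50.0))
```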
Advocating for Educational Standards in Counselor Licensure Laws
ERIC Educational Resources Information Center
Lawson, Gerard; Trepal, Heather C.; Lee, Robin W.; Kress, Victoria
2017-01-01
As the counseling profession evolves, educational standards for counselor licensure must be standardized from state to state. In this article, the authors discuss historical and current influences and present an advocacy model that has been used to standardize educational requirements in state counselor licensure laws.
CellML metadata standards, associated tools and repositories
Beard, Daniel A.; Britten, Randall; Cooling, Mike T.; Garny, Alan; Halstead, Matt D.B.; Hunter, Peter J.; Lawson, James; Lloyd, Catherine M.; Marsh, Justin; Miller, Andrew; Nickerson, David P.; Nielsen, Poul M.F.; Nomura, Taishin; Subramanium, Shankar; Wimalaratne, Sarala M.; Yu, Tommy
2009-01-01
The development of standards for encoding mathematical models is an important component of model building and model sharing among scientists interested in understanding multi-scale physiological processes. CellML provides such a standard, particularly for models based on biophysical mechanisms, and a substantial number of models are now available in the CellML Model Repository. However, there is an urgent need to extend the current CellML metadata standard to provide biological and biophysical annotation of the models in order to facilitate model sharing, automated model reduction and connection to biological databases. This paper gives a broad overview of a number of new developments on CellML metadata and provides links to further methodological details available from the CellML website. PMID:19380315
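As a rough sketch of the kind of biological annotation the extended metadata standard targets, the snippet below attaches a BioModels biology qualifier to a model component with rdflib. The component URI and the UniProt entry are hypothetical examples; real CellML metadata is embedded in the model file as RDF/XML.

```python
# A minimal RDF annotation sketch (component and target URIs hypothetical):
# link a model component to an external database entry via a biology qualifier.
from rdflib import Graph, Namespace, URIRef

BQBIOL = Namespace("http://biomodels.net/biology-qualifiers/")

g = Graph()
g.add((
    URIRef("http://models.cellml.org/example#sodium_channel"),  # hypothetical component
    BQBIOL["is"],                                               # bqbiol:is qualifier
    URIRef("http://identifiers.org/uniprot/P35499"),            # illustrative protein record
))
print(g.serialize(format="turtle"))
```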
Neutrino oscillations and Non-Standard Interactions
NASA Astrophysics Data System (ADS)
Farzan, Yasaman; Tórtola, Mariam
2018-02-01
Current neutrino experiments are measuring the neutrino mixing parameters with unprecedented accuracy. The upcoming generation of neutrino experiments will be sensitive to subdominant oscillation effects that can give information on the yet-unknown neutrino parameters: the Dirac CP-violating phase, the mass ordering, and the octant of θ23. Determining the exact values of neutrino mass and mixing parameters is crucial to test neutrino models and flavor symmetries designed to predict these neutrino parameters. In the first part of this review, we summarize the current status of the neutrino oscillation parameter determination. We consider the most recent data from all solar experiments and the atmospheric data from Super-Kamiokande, IceCube and ANTARES. We also implement the data from the reactor neutrino experiments KamLAND, Daya Bay, RENO and Double Chooz as well as the long-baseline neutrino data from MINOS, T2K and NOvA. If, in addition to the standard interactions, neutrinos have subdominant yet-unknown Non-Standard Interactions (NSI) with matter fields, extracting the values of these parameters will suffer from new degeneracies and ambiguities. We review such effects and formulate the conditions on the NSI parameters under which the precision measurement of neutrino oscillation parameters can be distorted. Like standard weak interactions, the non-standard interactions can be categorized into two groups: Charged Current (CC) NSI and Neutral Current (NC) NSI. Our focus will be mainly on neutral current NSI because it is possible to build a class of models that give rise to sizeable NC NSI with discernible effects on neutrino oscillation. These models are based on a new U(1) gauge symmetry with a gauge boson of mass ≲ 10 MeV. The UV-complete model should of course be electroweak invariant, which in general implies that, along with neutrinos, charged fermions also acquire new interactions, on which there are strong bounds. We enumerate the bounds that already exist on the electroweak symmetric models and demonstrate that it is possible to build viable models avoiding all these bounds. Finally, we review methods to test these models and suggest approaches to break the degeneracies that NSI induce in deriving neutrino mass parameters.
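To make the NC NSI degeneracy concrete, here is a minimal two-flavor sketch (my own illustration, not the review's machinery): a single ε_μτ parameter added to the matter term of the νμ-ντ Hamiltonian, with constant density and illustrative oscillation parameters.

```python
# Two-flavor nu_mu survival probability with a neutral-current NSI parameter
# eps_mutau in constant-density matter (illustrative parameters throughout).
import numpy as np
from scipy.linalg import expm

# Unit bookkeeping: with dm2 in eV^2 and E in GeV, a Hamiltonian entry
# h [eV^2/GeV] accumulates phase h * 5.068 * L[km] (the familiar
# 1.267 * dm2 * L / E corresponds to 5.068 / 4).
EV2_PER_GEV_TO_INV_KM = 5.068
VCC_EV = 7.63e-14           # sqrt(2) G_F n_e in eV for rho*Y_e = 1 g/cm^3
EV_TO_EV2_PER_GEV = 1.0e9   # 1 eV expressed in eV^2/GeV

def pmumu(E_GeV, L_km, dm2=2.5e-3, th23=0.85, eps_mt=0.0, rho_Ye=1.4):
    c2, s2 = np.cos(2 * th23), np.sin(2 * th23)
    h_vac = (dm2 / (4 * E_GeV)) * np.array([[-c2, s2], [s2, c2]])
    v = VCC_EV * rho_Ye * EV_TO_EV2_PER_GEV
    h_nsi = v * np.array([[0.0, eps_mt], [eps_mt, 0.0]])  # NC NSI, mu-tau sector
    u = expm(-1j * (h_vac + h_nsi) * EV2_PER_GEV_TO_INV_KM * L_km)
    return abs(u[0, 0]) ** 2

# NSI shifts the survival probability for Earth-crossing atmospheric neutrinos:
print(pmumu(10.0, 6000.0, eps_mt=0.0), pmumu(10.0, 6000.0, eps_mt=0.05))
```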
Assessment of a novel biomechanical fracture model for distal radius fractures
2012-01-01
Background Distal radius fractures (DRF) are among the most common fractures and often need surgical treatment, which has been validated through biomechanical tests. Currently a number of different fracture models are used, none of which resembles the in vivo fracture location. The aim of the study was to develop a new standardized fracture model for DRF (AO-23.A3) and compare its biomechanical behavior to the current gold standard. Methods Variable angle locking volar plates (ADAPTIVE, Medartis) were mounted on 10 pairs of fresh-frozen radii. The osteotomy location was alternated within each pair (New: 10 mm wedge 8 mm / 12 mm proximal to the dorsal / volar apex of the articular surface; Gold standard: 10 mm wedge 20 mm proximal to the articular surface). Each specimen was tested in cyclic axial compression (increasing load by 100 N per cycle) until failure or −3 mm displacement. Parameters assessed were stiffness, displacement and dissipated work calculated for each cycle, as well as ultimate load. Significance was tested using a linear mixed model and Wald test as well as t-tests. Results 7 female and 3 male pairs of radii aged 74 ± 9 years were tested. In most cases (7/10), the two groups showed similar mechanical behavior at low loads with increasing differences at increasing loads. Overall the novel fracture model showed significantly different biomechanical behavior from the gold standard model (p < 0.001). The average final loads resisted were significantly lower in the novel model (860 N ± 232 N vs. 1250 N ± 341 N; p = 0.001). Conclusion The novel biomechanical fracture model for DRF more closely mimics the in vivo fracture site and shows a significantly different biomechanical behavior with increasing loads when compared to the current gold standard. PMID:23244634
Payment models to support population health management.
Huerta, Timothy R; Hefner, Jennifer L; McAlearney, Ann Scheck
2014-01-01
To survey the policy-driven financial controls currently being used to drive physician change in the care of populations. This paper offers a review of current health care payment models and discusses the impact of each on the potential success of PHM initiatives. We present the benefits of a multi-part model, combining visit-based fee-for-service reimbursement with a monthly "care coordination payment" and a performance-based payment system. A multi-part model removes volume-based incentives and promotes efficiency. However, it is predicated on a pay-for-performance framework that requires standardized measurement. Application of this model is limited due to the current lack of standardized measurement of quality goals that are linked to payment incentives. Financial models dictated by health system payers are inextricably linked to the organization and management of health care. There is a need for better measurements and realistic targets as part of a comprehensive system of measurement assessment that focuses on practice redesign, with the goal of standardizing measurement of the structure and process of redesign. Payment reform is a necessary component of an accurate measure of the associations between practice transformation and outcomes important to both patients and society.
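A toy calculation (hypothetical numbers, not drawn from the paper) of the multi-part model the authors describe: visit-based fee-for-service, a monthly care coordination payment, and a performance payment gated on standardized quality measurement.

```python
# Toy three-part payment model: FFS + per-member-per-month + performance bonus.
def practice_revenue(visits, ffs_rate, members, pmpm, quality_score, bonus_pool):
    """Annual revenue; quality_score in [0, 1] scales the at-risk payment."""
    fee_for_service = visits * ffs_rate
    care_coordination = members * pmpm * 12
    performance = quality_score * bonus_pool
    return fee_for_service + care_coordination + performance

# 8,000 visits at $80, 2,500 members at $5 PMPM, 85% quality score vs a $100k pool:
print(practice_revenue(8_000, 80.0, 2_500, 5.0, 0.85, 100_000))
```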
Higgs-precision constraints on colored naturalness
Essig, Rouven; Meade, Patrick; Ramani, Harikrishnan; ...
2017-09-19
The presence of weak-scale colored top partners is among the simplest solutions to the Higgs hierarchy problem and allows for a natural electroweak scale. We examine the constraints on generic colored top partners coming solely from their effect on the production and decay rates of the observed Higgs with a mass of 125 GeV. We use the latest Higgs precision data from the Tevatron and the LHC as of EPS 2017 to derive the current limits on spin-0, spin-1/2, and spin-1 colored top partners. We also investigate the expected sensitivity from Run 3 and Run 4 of the LHC, as well as from possible future electron-positron and proton-proton colliders, including the ILC, CEPC, FCC-ee, and FCC-hh. We discuss constraints on top partners in the Minimal Supersymmetric Standard Model and Little Higgs theories. We also consider various model-building aspects (multiple top partners, modified couplings between the Higgs and Standard-Model particles, and non-Standard-Model Higgs sectors) and evaluate how these weaken the current limits and expected sensitivities. By modifying other Standard-Model Higgs couplings, we find that the best way to hide low-mass top partners from current data is through modifications of the top-Yukawa coupling, although future measurements of top-quark-pair production in association with a Higgs will extensively probe this possibility. We also demonstrate that models with multiple top partners can generically avoid current and future Higgs precision measurements. Nevertheless, some of the model parameter space can be probed with precision measurements at future electron-positron colliders of, for example, the e+e− → Zh cross section.
The Use of Regulatory Air Quality Models to Develop Successful Ozone Attainment Strategies
NASA Astrophysics Data System (ADS)
Canty, T. P.; Salawitch, R. J.; Dickerson, R. R.; Ring, A.; Goldberg, D. L.; He, H.; Anderson, D. C.; Vinciguerra, T.
2015-12-01
The Environmental Protection Agency (EPA) recently proposed lowering the 8-hr ozone standard to between 65 and 70 ppb. Not all regions of the U.S. are in attainment of the current 75 ppb standard, and it is expected that many regions currently in attainment will not meet the future, lower surface ozone standard. Ozone production is a nonlinear function of emissions, biological processes, and weather. Federal and state agencies rely on regulatory air quality models such as the Community Multi-Scale Air Quality (CMAQ) model and the Comprehensive Air Quality Model with Extensions (CAMx) to test ozone precursor emission reduction strategies that will bring states into compliance with the National Ambient Air Quality Standards (NAAQS). We will describe various model scenarios that simulate how future limits on emission of ozone precursors (i.e., NOx and VOCs) from sources such as power plants and vehicles will affect air quality. These scenarios are currently being developed by states required to submit a State Implementation Plan to the EPA. Projections from these future case scenarios suggest that strategies intended to control local ozone may also bring upwind states into attainment of the new NAAQS. Ground-based, aircraft, and satellite observations are used to ensure that air quality models accurately represent photochemical processes within the troposphere. We will highlight some of the improvements made to the CMAQ and CAMx model framework based on our analysis of NASA observations obtained by the OMI instrument on the Aura satellite and by the DISCOVER-AQ field campaign.
NASA Astrophysics Data System (ADS)
Espinosa, J. R.; Racco, D.; Riotto, A.
2018-03-01
For the current central values of the Higgs boson and top quark masses, the standard model Higgs potential develops an instability at a scale of the order of 10^{11} GeV. We show that a cosmological signature of such instability could be dark matter in the form of primordial black holes seeded by Higgs fluctuations during inflation. The existence of dark matter might not require physics beyond the standard model.
De Leeuw, R A; Westerman, Michiel; Nelson, E; Ket, J C F; Scheele, F
2016-07-08
E-learning is driving major shifts in medical education. Prioritizing learning theories and quality models improves the success of e-learning programs. Although many e-learning quality standards are available, few are focused on postgraduate medical education. We conducted an integrative review of the current postgraduate medical e-learning literature to identify quality specifications. The literature was thematically organized into a working model. Unique quality specifications (n = 72) were consolidated and re-organized into a six-domain model that we called the Postgraduate Medical E-learning Model (Postgraduate ME Model). This model was partially based on the ISO-19796 standard and drew on cognitive load multimedia principles. The domains of the model are preparation, software design and system specifications, communication, content, assessment, and maintenance. This review clarified the current state of postgraduate medical e-learning standards and specifications. It also synthesized these specifications into a single working model. To validate our findings, the next steps include testing the Postgraduate ME Model in controlled e-learning settings.
3D digital headform models of Australian cyclists.
Ellena, Thierry; Skals, Sebastian; Subic, Aleksandar; Mustafa, Helmy; Pang, Toh Yen
2017-03-01
Traditional 1D anthropometric data have been the primary source of information used by ergonomists for the dimensioning of head and facial gear. Although these data are simple to use and understand, they only provide univariate measures of key dimensions. 3D anthropometric data, however, describe the complete shape characteristics of the head surface, but are complicated to interpret due to the abundance of information they contain. Consequently, current headform standards based on 1D measurements may not adequately represent the actual head shape variations of the intended user groups. The purpose of this study was to introduce a set of new digital headform models representative of the adult cyclists' community in Australia. Four models were generated based on an Australian 3D anthropometric database of head shapes and a modified hierarchical clustering algorithm. Considerable shape differences were identified between our models and the current headforms from the Australian standard. We conclude that the design of head and facial gear based on current standards might not be favorable for optimal fitting results.
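The paper's modified hierarchical clustering algorithm is not reproduced here; the sketch below shows the generic version of the idea, assuming scans have already been aligned and sampled at corresponding points: Ward-linkage clustering of flattened surface coordinates, with each cluster mean taken as a representative headform.

```python
# Generic headform grouping sketch (synthetic stand-in data; assumes scans
# are pre-aligned with point correspondence): Ward clustering into 4 groups.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
scans = rng.normal(size=(200, 3 * 500))   # 200 scans x 500 aligned 3D points, flattened

tree = linkage(scans, method="ward")
labels = fcluster(tree, t=4, criterion="maxclust")

# One representative headform per group: the mean shape of its members.
headforms = np.array([scans[labels == k].mean(axis=0) for k in range(1, 5)])
print(headforms.shape)                    # (4, 1500)
```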
Hadker, Nandini; Garg, Suchita; Costanzo, Cory; van der Helm, Wim; Creeden, James
2013-05-01
To quantify the financial impact of adding a novel serum test to the current diagnostic toolkit for preeclampsia (PE) detection in Germany. A decision-analytic model was created to quantify the economic impact of adding a recently developed novel diagnostic test for PE (Roche Diagnostics, Rotkreuz, Switzerland) to current diagnostic practice in Germany. The model simulated a cohort of 1000 pregnant patients receiving obstetric care and quantified the budget impact of adding the novel test to current German PE detection and management practices. The model estimates that the costs associated with managing a typical pregnancy in Germany are €941 when the novel test is used versus €1579 with standard practice. This represents savings of €637 per pregnant woman, even when the test is used as a supplementary diagnostic tool. The savings are attributed to the novel test's ability to better classify patients relative to current practice, specifically, its ability to reduce false negatives by 67% and false positives by 71%. The novel PE test has the potential to provide substantial cost savings to German healthcare payers, even when used as an addition to standard practice. Better classification of patients at risk for developing PE and declassification of those that are not compared to current practice leads to economic savings for the healthcare system. Furthermore, by reducing the rates of false-positive and false-negative classification relative to current standard of care, the test helps better target healthcare spending and lowers overall costs associated with PE care.
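A toy sketch of the decision-analytic structure described above (all parameter values are illustrative, not the model's inputs): the expected per-patient cost of a screening strategy is driven by the test's false-positive and false-negative rates, and the cohort-level budget impact is the per-patient difference scaled to 1000 pregnancies.

```python
# Toy budget-impact calculation for adding a diagnostic test (illustrative numbers).
def expected_cost(prev, sens, spec, c_test, c_fp_workup, c_fn_outcome, c_tp_mgmt):
    """Expected cost per pregnancy for one screening strategy."""
    tp = prev * sens
    fn = prev * (1 - sens)
    fp = (1 - prev) * (1 - spec)
    return c_test + tp * c_tp_mgmt + fp * c_fp_workup + fn * c_fn_outcome

cohort = 1000
standard = expected_cost(0.05, 0.70, 0.80, 0.0, 1500.0, 12000.0, 4000.0)
with_test = expected_cost(0.05, 0.90, 0.94, 80.0, 1500.0, 12000.0, 4000.0)
print(cohort * (standard - with_test))   # cohort-level savings, EUR
```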
International Planetary Data Alliance (IPDA) Information Model
NASA Technical Reports Server (NTRS)
Hughes, John Steven; Beebe, R.; Guinness, E.; Heather, D.; Huang, M.; Kasaba, Y.; Osuna, P.; Rye, E.; Savorskiy, V.
2007-01-01
This document is the third deliverable of the International Planetary Data Alliance (IPDA) Archive Data Standards Requirements Identification project. The goal of the project is to identify a subset of the standards currently in use by NASA's Planetary Data System (PDS) that are appropriate for internationalization. As shown in the highlighted sections of Figure 1, the focus of this project is the Information Model component of the Data Architecture Standards, namely the object models, a data dictionary, and a set of data formats.
Ion transport and loss in the earth's quiet ring current. I - Data and standard model
NASA Technical Reports Server (NTRS)
Sheldon, R. B.; Hamilton, D. C.
1993-01-01
A study of the transport and loss of ions in the earth's quiet time ring current, in which the standard radial diffusion model developed for the high-energy radiation belt particles is compared with measurements of the lower-energy ring current ions, is presented. The data set provides ionic composition information in an energy range that includes the bulk of the ring current energy density, 1-300 keV/e. Protons are found to dominate the quiet time energy density at all altitudes, peaking near L of about 4 at 60 keV/cm^3, with much smaller contributions from O(+) (1-10 percent), He(+) (1-5 percent), and He(2+) (less than 1 percent). A minimization procedure is used to fit the amplitude of the standard electric radial diffusion coefficient, yielding 5.8 × 10^-11 R_E^2/s. Fluctuating ionospheric electric fields are suggested as the source of the additional diffusion detected.
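The abstract quotes only the fitted amplitude; as an illustration of how such a coefficient is used, the sketch below adopts the conventional power-law form D_LL = D0·L^n. The exponent is an assumption on my part: n ≈ 6 is often quoted for electrostatic fluctuations and n ≈ 10 for magnetic impulses.

```python
# Radial diffusion coefficient in the conventional power-law form (exponent assumed).
D0 = 5.8e-11   # fitted amplitude from the abstract, R_E^2/s

def d_ll(L: float, n: int = 6) -> float:
    """Radial diffusion coefficient D_LL in R_E^2/s."""
    return D0 * L**n

for L in (2.0, 3.0, 4.0):
    tau_days = 1.0 / d_ll(L) / 86400.0   # crude diffusion timescale ~ 1/D_LL
    print(f"L={L}: D_LL={d_ll(L):.2e} R_E^2/s, ~{tau_days:.0f} days")
```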
Exciting (the) Vacuum: Possible Manifestations of the Higgs particle at the LHC
David Kaplan
2017-12-09
The Higgs boson is the particle most anticipated at the LHC. However, there is currently no leading theory of electroweak symmetry breaking (and the 'Higgs mechanism'). The many possibilities suggest many ways the Higgs could appear in the detectors, some of which require non-standard search methods. I will review the current state of beyond-the-standard-model physics and the implications for Higgs physics. I then discuss some non-standard Higgs decays and suggest (perhaps naive) new experimental strategies for detecting the Higgs in such cases. In some models, while part of the new physics at the weak scale would be visible, the Higgs would be nearly impossible to detect.
Lee, Joohee; Kim, Jinseok; Lim, Hyunsung
2010-07-01
The purpose of the current study was to examine factors that influence rape myths among Korean college students. This study was particularly interested in the ways in which attitudes toward women and sexual double standard affect the relationship between gender and rape myths. Although the incidence of rape is a common concern in many current societies, within each society, the specific components of rape myths reflect the cultural values and norms of that particular society. A sample of 327 college students in South Korea completed the Korean Rape Myth Acceptance Scale-Revised, the Attitudes Toward Women Scale, and the Sexual Double Standard Scale. Structural equation modeling (SEM) was used to test hypothesized models. Results revealed that in three of the four models, rape survivor myths, rape perpetrator myths, and myths about the impact of rape, attitudes toward women were a more important predictor of rape myths than gender or sexual double standard. In the rape spontaneity myths model, on the other hand, sexual double standard was a more important predictor than gender or attitudes toward women. This study provides valuable information that can be useful in developing culturally specific rape prevention and victim intervention programs.
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han
2014-01-01
Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Community Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and extended, CCR+, model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
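The syntactic layer of such multilayered validation can be as simple as schema validation. A sketch with lxml, where both file names are hypothetical and the semantic checks against the metadata registry are not shown:

```python
# Syntactic validation of a CCR/CCR+ XML instance against an XML Schema
# (file names hypothetical; semantic metadata-registry checks not shown).
from lxml import etree

schema = etree.XMLSchema(etree.parse("ccr_plus.xsd"))
doc = etree.parse("patient_record.xml")

if schema.validate(doc):
    print("syntactically valid instance")
else:
    for err in schema.error_log:
        print(err.line, err.message)
```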
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.
The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. This Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed operation implementation. In fact, active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are included in the paper.
Glines, Wayne M; Markham, Anna
2018-05-01
Seventy-five years after the Hanford Site was initially created as the primary plutonium production site for atomic weapons development under the Manhattan Project, the American Nuclear Society and the Health Physics Society are sponsoring a conference from 30 September through 3 October 2018, in Pasco, Washington, titled "Applicability of Radiation Response Models to Low Dose Protection Standards." The goal of this conference is to use current scientific data to update the approach to regulating low-level radiation doses; i.e., to answer a quintessential question of radiation protection: how best to develop radiation protection standards that protect human populations against detrimental effects while allowing the beneficial uses of radiation and radioactive materials. Previous conferences (e.g., the "Wingspread Conference" and the "Arlie Conference") have attempted to address this question, but now, almost 20 years later, the key issues, goals, conclusions, and recommendations of those two conferences remain and are as relevant as they were then. Despite the best efforts of the conference participants and increased knowledge and understanding of the science underlying radiation effects in human populations, the bases of current radiation protection standards have evolved little. This 2018 conference seeks to provide a basis and path forward for evolving radiation protection standards to be more reflective of current knowledge and understanding of low dose response models.
Search for Muonic Dark Forces at BABAR
NASA Astrophysics Data System (ADS)
Godang, Romulus
2017-04-01
Many models of physics beyond the Standard Model predict the existence of light Higgs states, dark photons, and new gauge bosons mediating interactions between dark sectors and the Standard Model. Using the full data sample collected with the BABAR detector at the PEP-II e+e− collider, we report searches for a light non-Standard-Model Higgs boson, a dark photon, and a new muonic dark force mediated by a gauge boson (Z′) coupling only to the second and third lepton families. Our results significantly improve upon the current bounds and further constrain the remaining region of the allowed parameter space.
Bernard R. Parresol; Joe H. Scott; Anne Andreu; Susan Prichard; Laurie Kurth
2012-01-01
Currently geospatial fire behavior analyses are performed with an array of fire behavior modeling systems such as FARSITE, FlamMap, and the Large Fire Simulation System. These systems currently require standard or customized surface fire behavior fuel models as inputs that are often assigned through remote sensing information. The ability to handle hundreds or...
Informatics in radiology: an information model of the DICOM standard.
Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L
2011-01-01
The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care.
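For a feel of what the real-world model encodes, here is a minimal sketch (my illustration, not the DICOM Ontology itself) of the familiar containment hierarchy at its core, written as plain Python dataclasses:

```python
# Patient -> Study -> Series -> Image containment, the core of the real-world model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Image:
    sop_instance_uid: str

@dataclass
class Series:
    series_uid: str
    modality: str
    images: List[Image] = field(default_factory=list)

@dataclass
class Study:
    study_uid: str
    series: List[Series] = field(default_factory=list)

@dataclass
class Patient:
    patient_id: str
    studies: List[Study] = field(default_factory=list)

# One CT study containing a single series and image (UIDs illustrative):
pt = Patient("P001", [Study("1.2.3", [Series("1.2.3.4", "CT", [Image("1.2.3.4.5")])])])
print(len(pt.studies[0].series[0].images))
```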
Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S
2017-11-01
Electroencephalography (EEG), the direct recording of the electrical activity of populations of neurons, is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy: recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models.
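As one concrete example of the conditioning choices such guidelines standardize, the sketch below applies a zero-phase bandpass and a mains notch to a stand-in trace; the sampling rate and cutoffs are illustrative, not recommendations from the report.

```python
# Typical EEG signal-conditioning step: zero-phase bandpass + mains notch
# (illustrative parameters; synthetic stand-in signal).
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 1000.0
raw = np.random.default_rng(0).normal(size=10_000)   # stand-in for a recording

b, a = butter(4, [1.0, 100.0], btype="bandpass", fs=fs)
x = filtfilt(b, a, raw)                              # 1-100 Hz, zero phase

bn, an = iirnotch(60.0, Q=30.0, fs=fs)               # 60 Hz mains notch
x = filtfilt(bn, an, x)
print(x.shape)
```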
Simulation Model of A Ferroelectric Field Effect Transistor
NASA Technical Reports Server (NTRS)
MacLeod, Todd C.; Ho, Fat Duen; Russell, Larry W. (Technical Monitor)
2002-01-01
An electronic simulation model has been developed of a ferroelectric field effect transistor (FFET). This model can be used in standard electrical circuit simulation programs to simulate the main characteristics of the FFET. The model uses a previously developed algorithm that incorporates partial polarization as a basis for the design. The model has the main characteristics of the FFET, which are the current hysteresis with different gate voltages and the decay of the drain current when the gate voltage is off. The drain current has values matching actual FFETs that were measured experimentally. The input and output resistances in the model are similar to those of the FFET. The model is valid for all frequencies below RF levels. A variety of different ferroelectric material characteristics can be modeled. The model can be used to design circuits using FFETs with standard electrical simulation packages. These circuits can be used in designing non-volatile memory circuits and logic circuits, and the model is compatible with all SPICE-based circuit analysis programs. The model is a drop-in library that integrates seamlessly into a SPICE simulation. A comparison is made between the model and experimental data measured from an actual FFET.
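The published partial-polarization algorithm is not reproduced here; the following is a purely behavioral toy (all constants assumed) capturing the two signatures the abstract names: gate-voltage hysteresis and drain-current decay after the gate is switched off.

```python
# Behavioral toy FFET: hysteretic on/off drain current plus retention decay.
import math

I_ON, I_OFF = 1e-4, 1e-6    # on/off drain currents, A (assumed)
V_UP, V_DOWN = 1.5, -0.5    # switching thresholds, V (assumed)
TAU = 50.0                  # retention time constant, s (assumed)

def drain_current(v_gate, state, t_since_off=0.0):
    """Return (I_d, state); `state` is the remanent-polarization bit."""
    if v_gate > V_UP:
        state = 1            # polarization set: high-current branch
    elif v_gate < V_DOWN:
        state = 0            # polarization reset: low-current branch
    if state and abs(v_gate) < 1e-3:
        # Gate off: the retained drain current decays toward I_OFF.
        return I_OFF + (I_ON - I_OFF) * math.exp(-t_since_off / TAU), state
    return (I_ON if state else I_OFF), state

i1, s = drain_current(2.0, 0)           # write a '1'
i2, s = drain_current(0.0, s, 30.0)     # read 30 s after the gate is off
print(i1, i2)
```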
ERIC Educational Resources Information Center
Kaliski, Pamela K.; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna L.; Plake, Barbara S.; Reshetar, Rosemary A.
2013-01-01
The many-faceted Rasch (MFR) model has been used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR model for examining the quality of ratings obtained from a standard…
Modeling of Photoionized Plasmas
NASA Technical Reports Server (NTRS)
Kallman, Timothy R.
2010-01-01
In this paper I review the motivation and current status of modeling of plasmas exposed to strong radiation fields, as it applies to the study of cosmic X-ray sources. This includes some of the astrophysical issues which can be addressed, the ingredients for the models, the current computational tools, the limitations imposed by currently available atomic data, and the validity of some of the standard assumptions. I will also discuss ideas for the future: challenges associated with future missions, opportunities presented by improved computers, and goals for atomic data collection.
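Central to such models is the ionization parameter. As a worked example (numbers illustrative), ξ = L/(n r²) in the convention used by photoionization codes such as XSTAR:

```python
# Ionization parameter xi = L / (n r^2), erg cm/s (illustrative numbers).
def xi(L_erg_s: float, n_cm3: float, r_cm: float) -> float:
    return L_erg_s / (n_cm3 * r_cm**2)

# Gas of density 1e12 cm^-3 at 1e11 cm from a 1e37 erg/s X-ray source:
print(xi(1e37, 1e12, 1e11))   # 1000 -> highly ionized plasma
```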
Hidden sector dark matter and the Galactic Center gamma-ray excess: a closer look
Escudero, Miguel; Witte, Samuel J.; Hooper, Dan
2017-11-24
Stringent constraints from direct detection experiments and the Large Hadron Collider motivate us to consider models in which the dark matter does not directly couple to the Standard Model, but instead annihilates into hidden sector particles which ultimately decay through small couplings to the Standard Model. We calculate the gamma-ray emission generated within the context of several such hidden sector models, including those in which the hidden sector couples to the Standard Model through the vector portal (kinetic mixing with Standard Model hypercharge), through the Higgs portal (mixing with the Standard Model Higgs boson), or both. In each case, we identify broad regions of parameter space in which the observed spectrum and intensity of the Galactic Center gamma-ray excess can easily be accommodated, while providing an acceptable thermal relic abundance and remaining consistent with all current constraints. We also point out that cosmic-ray antiproton measurements could potentially discriminate some hidden sector models from more conventional dark matter scenarios.
Joint CPT and N resonance in compact atomic time standards
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Hohensee, Michael; Xiao, Yanhong; Phillips, David; Walsworth, Ron
2008-05-01
Currently, development efforts toward small, low-power atomic time standards use current-modulated VCSELs to generate phase-coherent optical sidebands that interrogate the hyperfine structure of alkali atoms such as rubidium. We describe and use a modified four-level quantum optics model to study the optimal operating regime of the joint CPT- and N-resonance clock. Resonant and non-resonant light shifts as well as modulation comb detuning effects play a key role in determining the optimal operating point of such clocks. We further show that our model is in good agreement with experimental tests performed using Rb-87 vapor cells.
Electric fence standards comport with human data and AC limits.
Kroll, Mark W; Perkins, Peter E; Panescu, Dorin
2015-08-01
The ubiquitous electric fence is essential to modern agriculture and has saved lives by reducing the number of livestock-automobile collisions. Modern safety standards such as IEC 60335-2-76 and UL 69 have played a role in this positive result. However, these standards are essentially based on energy and power (RMS current), which have a limited direct relationship to cardiac effects. We compared these standards to the bioelectrically more relevant units of charge and average current, in view of recent work on VF (ventricular fibrillation) induction and existing IEC AC current limits. There are 3 limits for the normal (low) pulsing rate: the IEC energy limit, the IEC current limit, and the UL current limit. We calculated the delivered charge allowed for each pulse duration under these limits and compared the results to a charge-based safety model derived from published human ventricular-fibrillation induction data. Both the IEC and UL also allow for rapid pulsing for up to 3 minutes. We calculated maximum outputs for various pulse durations assuming pulsing at 10, 20, and 30 pulses per second. These were then compared to standard utility power (AC) safety limits via the conversion factor of 7.4 to convert average current to RMS current for VF risk. The outputs of TASER electrical weapons (typically < 100 μC and ~100 μs duration) were also compared. The IEC and UL electric fence energizer normal-rate standards are conservative in comparison with actual human laboratory experiments. The IEC and UL electric fence energizer rapid-pulsing standards are consistent with accepted IEC AC current limits for commercially used pulse durations.
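The unit conversions the comparison turns on are simple enough to show directly (pulse values illustrative): the charge per pulse is Q = I_avg·t, and the paper's factor of 7.4 maps average current onto a VF-risk-equivalent RMS current.

```python
# Charge per pulse and average-to-RMS conversion for VF risk (illustrative values).
AVG_TO_RMS_VF = 7.4   # conversion factor cited in the abstract

def pulse_charge_uC(i_avg_mA: float, width_ms: float) -> float:
    """Delivered charge per pulse in microcoulombs (mA x ms = uC)."""
    return i_avg_mA * width_ms

def vf_equivalent_rms_mA(i_avg_mA: float) -> float:
    return AVG_TO_RMS_VF * i_avg_mA

print(pulse_charge_uC(1000.0, 0.1))   # a 1 A, 100 us pulse delivers 100 uC
print(vf_equivalent_rms_mA(15.0))     # 15 mA average -> 111 mA RMS-equivalent
```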
Searching for Physics Beyond the Standard Model and Beyond
NASA Astrophysics Data System (ADS)
Abdullah, Mohammad
The hierarchy problem, taken together with the various known puzzles in particle physics, gives us good reason to expect new physics to be discovered soon. We present multiple approaches to searching for physics beyond the standard model. First, two models with a minimal amount of theoretical guidance are analyzed using existing or simulated LHC data. Then, an extension of the Minimal Supersymmetric Standard Model (MSSM) is studied with an emphasis on the cosmological implications as well as the current and future sensitivity of colliders, direct detection, and indirect detection experiments. Finally, a more complete model of the MSSM is presented through which we attempt to resolve tension with observations within the context of gauge-mediated supersymmetry breaking.
ERIC Educational Resources Information Center
Li, Deping; Oranje, Andreas
2007-01-01
Two versions of a general method for approximating the standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
NASA Technical Reports Server (NTRS)
Estefan, J. A.; Sovers, O. J.
1994-01-01
The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970s. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi, and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
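Most of the candidate mapping functions named above (e.g., Herring's MTT and Niell's NMF) share the Marini continued-fraction form, normalized so that the mapping factor is unity at zenith. A minimal sketch with hypothetical coefficients; the real NMF coefficients vary with station latitude, height, and day of year:

```python
import math

def marini_mapping(elev_deg, a, b, c):
    """Continued-fraction mapping function (Marini form, normalized to 1 at zenith)."""
    s = math.sin(math.radians(elev_deg))
    num = 1 + a / (1 + b / (1 + c))
    den = s + a / (s + b / (s + c))
    return num / den

# Illustrative (hypothetical) hydrostatic coefficients
a, b, c = 1.2e-3, 2.9e-3, 62.6e-3
for e in (90, 30, 10, 5):
    print(f"elevation {e:2d} deg -> mapping factor {marini_mapping(e, a, b, c):.3f}")
```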
McNamara, Robert L; Wang, Yongfei; Partovian, Chohreh; Montague, Julia; Mody, Purav; Eddy, Elizabeth; Krumholz, Harlan M; Bernheim, Susannah M
2015-09-01
Electronic health records (EHRs) offer the opportunity to transform quality improvement by using clinical data for comparing hospital performance without the burden of chart abstraction. However, current performance measures using EHRs are lacking. With support from the Centers for Medicare & Medicaid Services (CMS), we developed an outcome measure of hospital risk-standardized 30-day mortality rates for patients with acute myocardial infarction for use with EHR data. As no appropriate source of EHR data is currently available, we merged clinical registry data from the Action Registry-Get With The Guidelines with claims data from CMS to develop the risk model (2009 data for development, 2010 data for validation). We selected candidate variables that could be feasibly extracted from current EHRs and do not require changes to standard clinical practice or data collection. We used logistic regression with stepwise selection and bootstrapping simulation for model development. The final risk model included 5 variables available on presentation: age, heart rate, systolic blood pressure, troponin ratio, and creatinine level. The area under the receiver operating characteristic curve was 0.78. Hospital risk-standardized mortality rates ranged from 9.6% to 13.1%, with a median of 10.7%. The odds of mortality for a high-mortality hospital (+1 SD) were 1.37 times those for a low-mortality hospital (-1 SD). This measure represents the first outcome measure endorsed by the National Quality Forum for public reporting of hospital quality based on clinical data in the EHR. By being compatible with current clinical practice and existing EHR systems, this measure is a model for future quality improvement measures.
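As a rough illustration of the modeling recipe described (logistic regression on five presentation variables, with bootstrap validation), here is a sketch on synthetic data; the variable names follow the abstract, but all values and the outcome relationship are fabricated for illustration only:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(68, 12, n),    # age (years)
    rng.normal(85, 20, n),    # heart rate (bpm)
    rng.normal(135, 25, n),   # systolic blood pressure (mmHg)
    rng.lognormal(0, 1, n),   # troponin ratio
    rng.normal(1.1, 0.4, n),  # creatinine (mg/dL)
])
logit = -8.0 + 0.05 * X[:, 0] + 0.02 * X[:, 1]      # fabricated outcome relation
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)

aucs = []                                # bootstrap the AUC, echoing the abstract
for _ in range(200):
    idx = rng.integers(0, n, n)
    aucs.append(roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1]))
print(f"AUC = {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")
```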
Primordial alchemy: from the Big Bang to the present universe
NASA Astrophysics Data System (ADS)
Steigman, Gary
Of the light nuclides observed in the universe today, D, 3He, 4He, and 7Li are relics from its early evolution. The primordial abundances of these relics, produced via Big Bang Nucleosynthesis (BBN) during the first half hour of the evolution of the universe, provide a unique window on Physics and Cosmology at redshifts ~10^{10}. Comparing the BBN-predicted abundances with those inferred from observational data tests the consistency of the standard cosmological model over ten orders of magnitude in redshift, constrains the baryon and other particle content of the universe, and probes both Physics and Cosmology beyond the current standard models. These lectures are intended to introduce students, both of theory and observation, to those aspects of the evolution of the universe relevant to the production and evolution of the light nuclides from the Big Bang to the present. The current observational data are reviewed and compared with the BBN predictions, and the implications for cosmology (e.g., universal baryon density) and particle physics (e.g., relativistic energy density) are discussed. While this comparison reveals the stunning success of the standard model(s), there are currently some challenges which leave the door open for more theoretical and observational work with potential implications for astronomy, cosmology, and particle physics.
Information model for digital exchange of soil-related data - potential modifications on ISO 28258
NASA Astrophysics Data System (ADS)
Schulz, Sina; Eberhardt, Einar; Reznik, Tomas
2017-04-01
The International Standard ISO 28258 "Digital exchange of soil-related data" provides an information model that describes the organization of soil data to facilitate data transfer between data producers, holders and users. The data model contains a fixed set of "core" soil feature types, data types and properties, whereas its customization happens at the data-provider level, e.g. by adding user-specific properties. Rules for encoding this information are given by a customized XML-based format (called "SoilML"). Some technical shortcomings are currently under consideration in the ISO working group. Directly after the publication of ISO 28258 in 2013, several conceptual and implementation issues concerning the information model were also identified; proposed changes such as renaming of feature types, modification of data types, enhancement of definitions, and addition of super-classes are part of the current revision process. Conceptual changes to the current ISO data model that are compatible with the Australian/New Zealand soil data model ANZSoilML and the EU INSPIRE Data Specifications on Soil are also discussed. The concept of a model with a limited set of properties that can be extended by the data provider should remain unaffected. This presentation aims to introduce and comment on the current ISO soil information model and the proposed modifications. Moreover, we want to discuss these adjustments with respect to the enhanced applicability of this International Standard.
The standard model on non-commutative space-time
NASA Astrophysics Data System (ADS)
Calmet, X.; Jurčo, B.; Schupp, P.; Wess, J.; Wohlgenannt, M.
2002-03-01
We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) × SU(2) × U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered.
Regression analysis of current-status data: an application to breast-feeding.
Grummer-Strawn, L M
1993-09-01
"Although techniques for calculating mean survival time from current-status data are well known, their use in multiple regression models is somewhat troublesome. Using data on current breast-feeding behavior, this article considers a number of techniques that have been suggested in the literature, including parametric, nonparametric, and semiparametric models as well as the application of standard schedules. Models are tested in both proportional-odds and proportional-hazards frameworks....I fit [the] models to current status data on breast-feeding from the Demographic and Health Survey (DHS) in six countries: two African (Mali and Ondo State, Nigeria), two Asian (Indonesia and Sri Lanka), and two Latin American (Colombia and Peru)." excerpt
NASA Astrophysics Data System (ADS)
Gries, C.; Winslow, L.; Shin, P.; Hanson, P. C.; Barseghian, D.
2010-12-01
At the North Temperate Lakes Long Term Ecological Research (NTL LTER) site, six buoys and one met station are maintained, each equipped with up to 20 sensors producing up to 45 separate data streams at a 1- or 10-minute frequency. Traditionally, this data volume has been managed in many matrix-type tables, each described in the Ecological Metadata Language (EML) and accessed online by a query system based on the provided metadata. To develop a more flexible information system, several technologies are currently being experimented with. We will review, compare and evaluate these technologies and discuss the constraints and advantages of network memberships and the implementation of standards. A Data Turbine server is employed to stream data from data logger files into a database, with the Real-time Data Viewer being used for monitoring sensor health. The Kepler workflow processor is being explored to introduce quality-control routines into this data stream, taking advantage of the Data Turbine actor. Kepler could replace traditional database triggers while adding visualization and advanced data access functionality for downstream modeling or other analytical applications. The data are currently streamed into the traditional matrix-type tables and into an Observation Data Model (ODM) following the CUAHSI ODM 1.1 specifications. In parallel, these sensor data are managed within the Global Lake Ecological Observatory Network (GLEON), where the software package Ziggy streams the data into a database of the VEGA data model. Contributing data to a network implies compliance with established standards for data delivery and data documentation. ODM- or VEGA-type data models are not easily described in EML, the metadata exchange standard for LTER sites, but provide many advantages from an archival standpoint. Both GLEON and CUAHSI have developed advanced data access capabilities based on their respective data models and data exchange standards, while LTER is currently in a phase of intense technology development which will eventually provide standardized data access that includes ecological data set types currently not covered by either ODM or VEGA.
Studies of non-standard effects in atmospheric neutrino oscillations of Super-Kamiokande
NASA Astrophysics Data System (ADS)
Wang, Wei
Neutrino oscillation due to mass eigenstate mixing has become the standard theory accounting for both solar and atmospheric neutrino data. This explanation indicates that neutrinos have small but non-vanishing masses, which is a sign of new physics beyond the Standard Model. In this dissertation, we compare the standard explanation with three types of alternative theories using Super-Kamiokande (SK) atmospheric neutrino data. The first type of non-standard theory involves sterile neutrinos. By using the neutral-current-enhanced data samples of SK and by considering the matter effect, we conclude that it is unlikely that sterile neutrinos are responsible for the SK atmospheric neutrino zenith angle distributions. Furthermore, we study the allowed sterile neutrino admixture in atmospheric neutrino mixing and find that an admixture of 23% sterile neutrino is allowed at the 90% confidence level based on a 2+2 mass hierarchy model. The second type of non-standard theory involves neutrino oscillation induced by violations of Lorentz invariance (LIV) and CPT symmetry (CPTV). The neutrino oscillations induced by the temporal components of the LIV and CPTV terms in the minimal Standard Model Extension (SME) have different energy and pathlength dependences compared to the standard oscillation. Our analysis indicates that the oscillation effects induced by the temporal components of the minimal SME are, taken separately, unlikely to explain the SK atmospheric neutrino data. By treating LIV- and CPTV-induced oscillations as sub-dominant effects, limits on symmetry-breaking parameters are established. The third category of non-standard theory involves vanishing neutrinos caused by neutrino decoherence and neutrino decay. Our study shows that these two non-oscillatory models are likewise unlikely to explain the SK atmospheric neutrino zenith angle distributions. By treating them as sub-dominant effects, limits on these two types of new physics are set based on several specific models. Among the models we test, oscillation between muon neutrinos and tau neutrinos remains the best model for explaining the SK atmospheric neutrino data. In most cases, the limits on new physics established in this study using SK atmospheric neutrino data are the best currently available.
Patient-derived Xenograft (PDX) Models In Basic and Translational Breast Cancer Research
Dobrolecki, Lacey E.; Airhart, Susie D.; Alferez, Denis G.; Aparicio, Samuel; Behbod, Fariba; Bentires-Alj, Mohamed; Brisken, Cathrin; Bult, Carol J.; Cai, Shirong; Clarke, Robert B.; Dowst, Heidi; Ellis, Matthew J.; Gonzalez-Suarez, Eva; Iggo, Richard D.; Kabos, Peter; Li, Shunqiang; Lindeman, Geoffrey J.; Marangoni, Elisabetta; McCoy, Aaron; Meric-Bernstam, Funda; Piwnica-Worms, Helen; Poupon, Marie-France; Reis-Filho, Jorge; Sartorius, Carol A.; Scabia, Valentina; Sflomos, George; Tu, Yizheng; Vaillant, François; Visvader, Jane E.; Welm, Alana; Wicha, Max S.
2017-01-01
Patient-derived xenograft (PDX) models of a growing spectrum of cancers are rapidly supplanting long-established traditional cell lines as preferred models for conducting basic and translational pre-clinical research. In breast cancer, to complement the now curated collection of approximately 45 long-established human breast cancer cell lines, a newly formed consortium of academic laboratories, currently from Europe, Australia, and North America, herein summarizes data on over 500 stably transplantable PDX models representing all three clinical subtypes of breast cancer (ER+, HER2+, and “Triple-negative” (TNBC)). Many of these models are well-characterized with respect to genomic, transcriptomic, and proteomic features, metastatic behavior, and treatment response to a variety of standard-of-care and experimental therapeutics. These stably transplantable PDX lines are generally available for dissemination to laboratories conducting translational research, and contact information for each collection is provided. This review summarizes current experiences related to PDX generation across participating groups, efforts to develop data standards for annotation and dissemination of patient clinical information that does not compromise patient privacy, efforts to develop complementary data standards for annotation of PDX characteristics and biology, and progress toward “credentialing” of PDX models as surrogates to represent individual patients for use in pre-clinical and co-clinical translational research. In addition, this review highlights important unresolved questions, as well as current limitations, that have hampered more efficient generation of PDX lines and more rapid adoption of PDX use in translational breast cancer research. PMID:28025748
Trap-assisted tunneling in InGaN/GaN single-quantum-well light-emitting diodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auf der Maur, M., E-mail: auf.der.maur@ing.uniroma2.it; Di Carlo, A.; Galler, B.
Based on numerical simulation and comparison with measured current characteristics, we show that the current in InGaN/GaN single-quantum-well light-emitting diodes at low forward bias can be accurately described by a standard trap-assisted tunneling model. The qualitative and quantitative differences in the current characteristics of devices with different emission wavelengths are demonstrated to be correlated in a physically consistent way with the tunneling model parameters.
A process-based standard for the Solar Energetic Particle Event Environment
NASA Astrophysics Data System (ADS)
Gabriel, Stephen
For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred around the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with not only the presence of different models, which in themselves could have quite distinct modelling approaches, but which could also be based on different data sets. In essence, a process-based standard approach overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have to be not only clearly and unambiguously defined but also subject to peer review. If a model meets all of these requirements, then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; most importantly, however, it will allow something which so far has been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set, and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one. Hence, when it comes to using these models for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who is in all likelihood not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition, and a resulting standard event list. A standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results when compared will depend only on the statistical model and not on the data set or event definition.
[Comparison of Flu Outbreak Reporting Standards Based on Transmission Dynamics Model].
Yang, Guo-jing; Yi, Qing-jie; Li, Qin; Zeng, Qing
2016-05-01
To compare the two current flu outbreak reporting standards for the purpose of better prevention and control of flu outbreaks. A susceptible-exposed-infectious/asymptomatic-removed (SEIAR) model without interventions was set up first, followed by a model with interventions based on the real situation. Simulated interventions were developed based on the two reporting standards and evaluated by the estimated duration of outbreaks, cumulative new cases, cumulative morbidity rates, decline in percentage of morbidity rates, and cumulative secondary cases. The basic reproductive number of the outbreak was estimated as 8.2. The simulation produced similar results to the real situation. The effect of interventions based on reporting standard one (10 accumulated new cases in a week) was better than that of interventions based on reporting standard two (30 accumulated new cases in a week). Reporting standard one (10 accumulated new cases in a week) is more effective for the prevention and control of flu outbreaks.
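For readers unfamiliar with the compartment structure, a minimal SEIAR system without interventions can be written as a set of ODEs; the sketch below uses illustrative parameter values, not those estimated in the study (whose basic reproductive number was 8.2):

```python
from scipy.integrate import solve_ivp

def seiar(t, y, beta, kappa, p, gamma, N):
    """SEIAR rates; force of infection from I plus half-weighted A (assumption)."""
    S, E, I, A, R = y
    lam = beta * (I + 0.5 * A) / N
    return [-lam * S,
            lam * S - kappa * E,
            p * kappa * E - gamma * I,
            (1 - p) * kappa * E - gamma * A,
            gamma * (I + A)]

N = 1000
y0 = [N - 1.0, 0.0, 1.0, 0.0, 0.0]
# Illustrative parameters: 1.5-day latency, 3-day infectious period, 70% symptomatic
sol = solve_ivp(seiar, (0, 60), y0, args=(1.6, 1 / 1.5, 0.7, 1 / 3, N))
print(f"final susceptible fraction: {sol.y[0, -1] / N:.2f}")
```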
Radiation Environment Modeling for Spacecraft Design: New Model Developments
NASA Technical Reports Server (NTRS)
Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray
2006-01-01
A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.
A Comparison of Career-Related Assessment Tools/Models. Final [Report].
ERIC Educational Resources Information Center
WestEd, San Francisco, CA.
This document contains charts that evaluate career related assessment items. Chart categories include: Purpose/Current Uses/Format; Intended Population; Oregon Career Related Learning Standards Addressed; Relationship to the Standards; Relationship to Endorsement Area Frameworks; Evidence of Validity; Evidence of Reliability; Evidence of Fairness…
Study on Standard Fatigue Vehicle Load Model
NASA Astrophysics Data System (ADS)
Huang, H. Y.; Zhang, J. P.; Li, Y. H.
2018-02-01
Based on measured truck data from three arterial expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to mid- and late-stage industrial areas was obtained by adopting the damage-equivalence principle, Miner's linear accumulation law, the water discharge method, and damage-ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability. It is of reference value for the fatigue design of bridges in China.
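The damage-equivalence step referred to above typically reduces a measured load spectrum to a single fatigue vehicle via Miner's rule; a sketch with an assumed S-N exponent and illustrative axle-load counts:

```python
def equivalent_load(loads_kN, counts, m=3.0):
    """Damage-equivalent load under Miner's rule with S-N exponent m:
    P_eq = (sum(n_i * P_i**m) / sum(n_i)) ** (1/m)."""
    damage = sum(n * p**m for p, n in zip(loads_kN, counts))
    return (damage / sum(counts)) ** (1.0 / m)

# Illustrative axle-load spectrum (kN) and cycle counts from a weigh-in-motion record
print(f"{equivalent_load([60.0, 100.0, 140.0], [5000, 1200, 150]):.1f} kN")
```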
Testing the Standard Model by precision measurement of the weak charges of quarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross Young; Roger Carlini; Anthony Thomas
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, limits the magnitude of possible contributions from physics beyond the Standard Model - setting a model-independent lower bound of ~1 TeV on the scale of new physics.
The top quark (20 years after the discovery)
Boos, Eduard; Brandt, Oleg; Denisov, Dmitri; ...
2015-09-10
On the twentieth anniversary of the observation of the top quark, we trace our understanding of this heaviest of all known particles from the prediction of its existence, through the searches and discovery, to the current knowledge of its production mechanisms and properties. We also discuss the central role of the top quark in the Standard Model and the windows that it opens for seeking new physics beyond the Standard Model.
Informatics in clinical research in oncology: current state, challenges, and a future perspective.
Chahal, Amar P S
2011-01-01
The informatics landscape of clinical trials in oncology has changed significantly in the last 10 years. The current state of the infrastructure for clinical trial management, execution, and data management is reviewed. The systems, their functionality, the users, and the standards available to researchers are discussed from the perspective of the oncologist-researcher. Challenges in complexity and in the processing of information are outlined. These challenges include the lack of communication and information-interchange between systems, the lack of simplified standards, and the lack of implementation and adherence to the standards that are available. The clinical toxicology criteria from the National Cancer Institute (CTCAE) are cited as a successful standard in oncology, and HTTP on the Internet is referenced for its simplicity. Differences in the management of information standards between industries are discussed. Possible future advances in oncology clinical research informatics are addressed. These advances include strategic policy review of standards and the implementation of actions to make standards free, ubiquitous, simple, and easily interpretable; the need to change from a local data-capture- or transaction-driven model to a large-scale data-interpretation model that provides higher value to the oncologist and the patient; and the need for information technology investment in a readily available digital educational model for clinical research in oncology that is customizable for individual studies. These new approaches, with changes in information delivery to mobile platforms, will set the stage for the next decade in clinical research informatics.
Wilber, William G.; Crawford, Charles G.; Peters, James G.
1979-01-01
The Indiana State Board of Health is developing a State water-quality management plan that includes establishing limits for wastewater effluents discharged into Indiana streams. A digital model calibrated to conditions in Silver Creek was used to develop alternatives for future waste loadings that would be compatible with Indiana stream water-quality standards defined for two critical hydrologic conditions, summer and winter low flows. Effluents from the Sellersburg and Clarksville-North wastewater-treatment facilities are the only point-source waste loads that significantly affect the water quality in the modeled segment of Silver Creek. Model simulations indicate that nitrification is the most significant factor affecting the dissolved-oxygen concentration in Silver Creek during summer and winter low flows. Natural streamflow in Silver Creek during the summer and annual 7-day, 10-year low flow is zero, so no benefit from dilution is provided. Present ammonia-nitrogen and dissolved-oxygen concentrations of effluent from the Sellersburg and Clarksville-North wastewater-treatment facilities will violate current Indiana water-quality standards for ammonia toxicity and dissolved oxygen during summer and winter low flows. The current biochemical-oxygen demand limits for the Sellersburg and Clarksville-North wastewater-treatment facilities are not sufficient to maintain an average dissolved-oxygen concentration of at least 5 milligrams per liter, the State's water-quality standard for streams. Calculations of the stream's assimilative capacity indicate that Silver Creek cannot assimilate additional waste loadings and meet current Indiana water-quality standards. (Kosco-USGS)
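Waste-load allocation models of this kind are typically built on the Streeter-Phelps oxygen-sag formulation; since the study found nitrification dominant, the sketch below adds a nitrogenous term to the classic carbonaceous one. All rates and loads are illustrative, not calibrated to Silver Creek:

```python
import numpy as np

def do_deficit(t, L0, N0, D0, kd=0.3, kn=0.25, ka=0.6):
    """DO deficit (mg/L) at travel time t (days): carbonaceous demand (kd),
    nitrogenous demand (kn), and reaeration (ka); rate constants illustrative."""
    cbod = kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t))
    nbod = kn * N0 / (ka - kn) * (np.exp(-kn * t) - np.exp(-ka * t))
    return cbod + nbod + D0 * np.exp(-ka * t)

t = np.linspace(0.0, 10.0, 101)
deficit = do_deficit(t, L0=8.0, N0=12.0, D0=1.0)   # mg/L loads at the outfall
DO_sat = 9.1                                       # assumed saturation DO (mg/L)
print(f"minimum DO along the reach: {DO_sat - deficit.max():.1f} mg/L")
```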
What is missing between model and Aura MLS observations in mesospheric OH?
NASA Astrophysics Data System (ADS)
Wang, S.; Li, K. F.; Zeng, Z.; Sander, S. P.; Shia, R. L.; Yung, Y. L.
2017-12-01
Recent Aura Microwave Limb Sounder observations show higher mesospheric OH levels than earlier versions and previous satellite observations. The current photochemical model with standard chemistry is not able to accurately simulate MLS OH in the mesosphere. In particular, the model significantly underestimates OH over the altitude range of 60-80 km. In standard middle-atmospheric chemistry, HOx over this altitude range is controlled mainly through the reactions H2O + hν (< 205 nm) → H + OH; H + O2 + M → HO2 + M; and OH + HO2 → H2O + O2. In an attempt to resolve the model-observation discrepancy, we adjust the rate coefficients of these reactions within the recommended uncertainty ranges using an objective Bayesian approach. However, reasonable perturbations to these reactions are not capable of resolving the mesospheric discrepancy without introducing disagreements in other regions of the atmosphere. We explore possible new reactions in the Earth's atmosphere that are not included in current standard models. Some candidate reactions and their potential impacts on mesospheric HOx chemistry will be discussed. Our results urge new laboratory studies of these candidate reactions, whose rate coefficients have never been measured for the relevant atmospheric conditions.
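The adjustment step can be illustrated with a toy maximum-a-posteriori calculation: scale factors on a production rate and a loss rate get log-normal priors tied to assumed uncertainties and are fit to an assumed observed/modeled OH ratio. The scaling law and all numbers below are illustrative, not the study's photochemical model:

```python
import numpy as np
from scipy.optimize import minimize

def oh_scale(f_prod, f_loss):
    """HOx steady state: OH scales roughly as sqrt(production / loss)."""
    return np.sqrt(f_prod / f_loss)

target, sigma_obs = 1.25, 0.05                 # assumed obs/model OH ratio and error
sigma_p, sigma_l = np.log(1.2), np.log(1.3)    # assumed 1-sigma rate uncertainties

def neg_log_post(x):
    lp, ll = x                                 # log scale factors on the two rates
    resid = (oh_scale(np.exp(lp), np.exp(ll)) - target) / (target * sigma_obs)
    return 0.5 * (resid**2 + (lp / sigma_p)**2 + (ll / sigma_l)**2)

best = minimize(neg_log_post, [0.0, 0.0])
print("MAP scale factors (production, loss):", np.exp(best.x))
```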
Predicting Student Performance in a Collaborative Learning Environment
ERIC Educational Resources Information Center
Olsen, Jennifer K.; Aleven, Vincent; Rummel, Nikol
2015-01-01
Student models for adaptive systems may not model collaborative learning optimally. Past research has either focused on modeling individual learning or for collaboration, has focused on group dynamics or group processes without predicting learning. In the current paper, we adjust the Additive Factors Model (AFM), a standard logistic regression…
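For context, the Additive Factors Model is a logistic regression in which success probability depends on a knowledge-component (KC) difficulty plus a KC-specific learning rate times the number of prior practice opportunities. A sketch on synthetic data, with per-student intercepts omitted for brevity:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_kcs, n_obs = 4, 3000
kc = rng.integers(0, n_kcs, n_obs)       # knowledge component of each step
opp = rng.integers(0, 10, n_obs)         # prior practice opportunities on that KC

# Design: intercept, KC difficulty dummies (first KC as baseline), and
# KC-specific opportunity counts (the learning-rate columns of the AFM)
X = np.zeros((n_obs, 1 + (n_kcs - 1) + n_kcs))
X[:, 0] = 1.0
for k in range(1, n_kcs):
    X[kc == k, k] = 1.0
for k in range(n_kcs):
    X[kc == k, n_kcs + k] = opp[kc == k]

eta = -0.5 + 0.15 * opp                  # synthetic truth: practice helps uniformly
y = (rng.random(n_obs) < 1 / (1 + np.exp(-eta))).astype(float)
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.params[n_kcs:])                # estimated per-KC learning rates (~0.15)
```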
New vector-like fermions and flavor physics
Ishiwata, Koji; Ligeti, Zoltan; Wise, Mark B.
2015-10-06
We study renormalizable extensions of the standard model that contain vector-like fermions in a (single) complex representation of the standard model gauge group. There are 11 models where the vector-like fermions Yukawa couple to the standard model fermions via the Higgs field. These models do not introduce additional fine-tunings. They can lead to, and are constrained by, a number of different flavor-changing processes involving leptons and quarks, as well as direct searches. An interesting feature of the models with strongly interacting vector-like fermions is that constraints from neutral meson mixings (apart from CP violation in $K^0$-$\overline{K}^0$ mixing) are not sensitive to higher scales than other flavor-changing neutral-current processes. We identify order 1/(4πM)^2 (where M is the vector-like fermion mass) one-loop contributions to the coefficients of the four-quark operators for meson mixing that are not suppressed by standard model quark masses and/or mixing angles.
New ANSI standard for thyroid phantom
Mallett, Michael W.; Bolch, Wesley E.; Fulmer, Philip C.; ...
2015-08-01
Here, a new ANSI standard titled “Thyroid Phantom Used in Occupational Monitoring” (Health Physics Society 2014) has been published. The standard establishes the criteria for acceptable design, fabrication, or modeling of a phantom suitable for calibrating in vivo monitoring systems to measure photon-emitting radionuclides deposited in the thyroid. The current thyroid phantom standard was drafted in 1973 (ANSI N44.3-1973), last reviewed in 1984, and a revision of the standard to cover a more modern approach was deemed warranted.
78 FR 45104 - Model Manufactured Home Installation Standards: Ground Anchor Installations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-26
... test methods for establishing working load design values of ground anchor assemblies used for new... anchor installations and establish standardized test methods to determine ground anchor performance and... currently no national test method for rating and certifying ground anchor assemblies in different soil...
Multiple imputation to account for measurement error in marginal structural models
Edwards, Jessie K.; Cole, Stephen R.; Westreich, Daniel; Crane, Heidi; Eron, Joseph J.; Mathews, W. Christopher; Moore, Richard; Boswell, Stephen L.; Lesko, Catherine R.; Mugavero, Michael J.
2015-01-01
Background Marginal structural models are an important tool for observational studies. These models typically assume that variables are measured without error. We describe a method to account for differential and non-differential measurement error in a marginal structural model. Methods We illustrate the method estimating the joint effects of antiretroviral therapy initiation and current smoking on all-cause mortality in a United States cohort of 12,290 patients with HIV followed for up to 5 years between 1998 and 2011. Smoking status was likely measured with error, but a subset of 3686 patients who reported smoking status on separate questionnaires composed an internal validation subgroup. We compared a standard joint marginal structural model fit using inverse probability weights to a model that also accounted for misclassification of smoking status using multiple imputation. Results In the standard analysis, current smoking was not associated with increased risk of mortality. After accounting for misclassification, current smoking without therapy was associated with increased mortality [hazard ratio (HR): 1.2 (95% CI: 0.6, 2.3)]. The HR for current smoking and therapy [0.4 (95% CI: 0.2, 0.7)] was similar to the HR for no smoking and therapy [0.4 (95% CI: 0.2, 0.6)]. Conclusions Multiple imputation can be used to account for measurement error in concert with methods for causal inference to strengthen results from observational studies. PMID:26214338
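The imputation component of this approach can be sketched in a few lines: a measurement model fit in the validation subgroup generates repeated draws of the true exposure, and the outcome model is refit on each completed data set. The sketch below uses synthetic data and a plain logistic outcome model; the actual analysis used an inverse-probability-weighted Cox marginal structural model and Rubin's rules to pool variances as well as point estimates:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 4000
true_smoke = rng.integers(0, 2, n).astype(float)
reported = np.where(rng.random(n) < 0.85, true_smoke, 1 - true_smoke)  # 15% misclassified
validated = rng.random(n) < 0.3                  # validation subgroup with true status
y = (rng.random(n) < 1 / (1 + np.exp(-(-2 + 0.5 * true_smoke)))).astype(float)

# 1) Measurement model fit in the validation subgroup: P(true = 1 | reported)
meas = sm.GLM(true_smoke[validated], sm.add_constant(reported[validated]),
              family=sm.families.Binomial()).fit()
p_true = meas.predict(sm.add_constant(reported))

# 2) Multiply impute true smoking where unvalidated; refit the outcome model each time
estimates = []
for _ in range(20):
    imp = np.where(validated, true_smoke, (rng.random(n) < p_true).astype(float))
    out = sm.GLM(y, sm.add_constant(imp), family=sm.families.Binomial()).fit()
    estimates.append(out.params[1])
print(f"pooled log-odds ratio: {np.mean(estimates):.2f}")
```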
Weak annihilation and new physics in charmless [Formula: see text] decays.
Bobeth, Christoph; Gorbahn, Martin; Vickers, Stefan
We use currently available data of nonleptonic charmless 2-body [Formula: see text] decays ([Formula: see text]) that are mediated by [Formula: see text] QCD- and QED-penguin operators to study weak annihilation and new-physics effects in the framework of QCD factorization. In particular we introduce one weak-annihilation parameter for decays related by [Formula: see text] quark interchange and test this universality assumption. Within the standard model, the data supports this assumption with the only exceptions in the [Formula: see text] system, which exhibits the well-known "[Formula: see text] puzzle", and some tensions in [Formula: see text]. Beyond the standard model, we simultaneously determine weak-annihilation and new-physics parameters from data, employing model-independent scenarios that address the "[Formula: see text] puzzle", such as QED-penguins and [Formula: see text] current-current operators. We discuss also possibilities that allow further tests of our assumption once improved measurements from LHCb and Belle II become available.
AIR QUALITY SIMULATION MODEL PERFORMANCE FOR ONE-HOUR AVERAGES
If a one-hour standard for sulfur dioxide were promulgated, air quality dispersion modeling in the vicinity of major point sources would be an important air quality management tool. Would currently available dispersion models be suitable for use in demonstrating attainment of suc...
NASA Technical Reports Server (NTRS)
Wilson, C.; Dye, R.; Reed, L.
1982-01-01
The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large-scale (1:24,000) US national mapping techniques is made. Using the results of this investigation and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.
Harmonization of standards for parabolic trough collector testing in solar thermal power plants
NASA Astrophysics Data System (ADS)
Sallaberry, Fabienne; Valenzuela, Loreto; Palacin, Luis G.; Leon, Javier; Fischer, Stephan; Bohren, Andreas
2017-06-01
The technology of parabolic trough collectors (PTC) is used widely in Concentrating Solar Power (CSP) plants worldwide. However, this type of large-size collector cannot yet be officially tested by an accredited laboratory and certified by an accredited certification body, as there is no standard adapted to its particularities, and the currently published standards for solar thermal collectors are not fully applicable to it. Recently, some standardization committees have been working on this technology. This paper aims to give a summary of the standardized testing methodology for large-size PTCs for CSP plants, presenting the physical model chosen for modeling the thermal performance of the collector in the new revision of standard ISO 9806 and the points still to be improved in the standard draft IEC 62862-3-2. In this paper, a summary of the testing validation performed with this new model on one parabolic trough collector installed in one of the test facilities at the Plataforma Solar de Almería (PSA) is also presented.
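In simplified form, quasi-dynamic performance models of the kind adopted in the ISO 9806 revision combine an optical gain term with an incidence-angle modifier, heat-loss terms, and a thermal-capacitance term. A sketch under that simplification, with illustrative coefficients (not those of any tested collector):

```python
import math

def ptc_power_W(Gb, theta_deg, Tm, Ta, dTm_dt, A=100.0,
                eta0=0.75, b0=0.1, c1=0.4, c5=5000.0):
    """Useful power (W): optical gain with incidence-angle modifier K(theta),
    linear heat loss c1 (W/m^2 K), thermal capacitance c5 (J/m^2 K); illustrative."""
    Kb = 1.0 - b0 * (1.0 / math.cos(math.radians(theta_deg)) - 1.0)
    return A * (eta0 * Kb * Gb - c1 * (Tm - Ta) - c5 * dTm_dt)

# 850 W/m^2 beam irradiance, 20 deg incidence, 300 C mean fluid, 25 C ambient, steady state
print(f"{ptc_power_W(850.0, 20.0, 300.0, 25.0, 0.0):.0f} W")
```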
ERIC Educational Resources Information Center
Wambu, Grace W.; Fisher, Teresa A.
2015-01-01
Despite the government's emphasis on guidance and counseling program implementation in Kenyan schools and a rapid increase in the number of trained school counselors, lack of standardized training curriculums, ethical standards, counseling models, and role ambiguity persist. This article reviews the historical development of guidance and…
Osteotomy models - the current status on pain scoring and management in small rodents.
Lang, Annemarie; Schulz, Anja; Ellinghaus, Agnes; Schmidt-Bleek, Katharina
2016-12-01
Fracture healing is a complex regeneration process which produces new bone tissue without scar formation. However, fracture healing disorders occur in approximately 10% of human patients and cause severe pain and reduced quality of life. Recently, the development of more standardized, sophisticated and commercially available osteosynthesis techniques reflecting clinical approaches has increased the use of small rodents such as rats and mice in bone healing research dramatically. Nevertheless, there is no standard for pain assessment, especially in these species, and consequently limited information regarding the welfare aspects of osteotomy models. Moreover, the selection of analgesics is restricted for osteotomy models since non-steroidal anti-inflammatory drugs (NSAIDs) are known to affect the initial, inflammatory phase of bone healing. Therefore, opioids such as buprenorphine and tramadol are often used. However, dosage data in the literature are varied. Within this review, we clarify the background of osteotomy models, explain the current status and challenges of animal welfare assessment, and provide an example score sheet including model specific parameters. Furthermore, we summarize current refinement options and present a brief outlook on further 3R research. © The Author(s) 2016.
Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin
2016-12-05
Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single-study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step, but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. In this review, we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate-data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted, and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. The identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal-parameter aggregate-data meta-analysis, 69.2% for time-to-event-parameter aggregate-data meta-analysis, 76.9% for association-parameter aggregate-data meta-analysis). In some cases, model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used, and of statistical methods applied.
Update to the USDA-ARS fixed-wing spray nozzle models
USDA-ARS?s Scientific Manuscript database
The current USDA ARS Aerial Spray Nozzle Models were updated to reflect both new standardized measurement methods and systems, as well as to increase the operational spray pressure, aircraft airspeed, and nozzle orientation angle limits. The new models were developed using both Central Composite Design...
Designing an evaluation framework for WFME basic standards for medical education.
Tackett, Sean; Grant, Janet; Mmari, Kristin
2016-01-01
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
The rare decay $K^+ \to \pi^+ \nu\bar{\nu}$ as a standard model "standard"
NASA Astrophysics Data System (ADS)
Bigi, I. I.; Gabbiani, F.
1991-12-01
A priori one would expect that extensions of the Standard Model can significantly enhance (or suppress) BR($K^+ \to \pi^+ \nu\bar{\nu}$). We have analyzed many different classes of such extensions: models with a non-minimal Higgs sector or with right-handed currents; fourth family extensions; SUSY in both the minimal and non-minimal version, and even with broken R-parity. We find that - apart from a few somewhat exotic exceptions - these extensions have little direct impact on $K^+ \to \pi^+ \nu\bar{\nu}$ due to constraints that are inferred from $B_d$-$\overline{B}_d$ and $K^0$-$\overline{K}^0$ mixing and upper bounds on $B \to K^*\gamma$. Accordingly, $K^+ \to \pi^+ \nu\bar{\nu}$ probes very cleanly the top mass and the KM parameter $|V_{td} V_{ts}|$, two fundamental parameters in the Standard Model.
Solar Luminosity on the Main Sequence, Standard Model and Variations
NASA Astrophysics Data System (ADS)
Ayukov, S. V.; Baturin, V. A.; Gorshkov, A. B.; Oreshina, A. V.
2017-05-01
Our Sun became a Main Sequence star 4.6 Gyr ago, according to the Standard Solar Model. At that time, the solar luminosity was 30% lower than the current value. This conclusion is based on the assumption that the Sun is fueled by thermonuclear reactions. If the Earth's albedo and emissivity in the infrared were unchanged during Earth's history, the oceans had to be frozen 2.3 Gyr ago. This contradicts geological data: there was liquid water on Earth 3.6-3.8 Gyr ago. This problem is known as the Faint Young Sun Paradox. We analyze the luminosity change in standard solar evolution theory. The increase of mean molecular weight in the central part of the Sun due to the conversion of hydrogen to helium leads to a gradual increase of luminosity with time on the Main Sequence. We also consider several exotic models: a fully mixed Sun; a drastic change of the pp reaction rate; and a Sun consisting of hydrogen and helium only. Solar neutrino observations, however, exclude most non-standard solar models.
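A commonly used analytic approximation for this Main Sequence brightening (often attributed to Gough 1981) reproduces the ~30% figure quoted above; a sketch:

```python
def solar_luminosity(t_gyr, t_now=4.6):
    """L/L_sun at solar age t_gyr since the ZAMS: L = 1 / (1 + 0.4*(1 - t/t_now))."""
    return 1.0 / (1.0 + 0.4 * (1.0 - t_gyr / t_now))

print(f"at the ZAMS: {solar_luminosity(0.0):.2f} L_sun (about 30% fainter)")
print(f"2.3 Gyr ago: {solar_luminosity(4.6 - 2.3):.2f} L_sun")
```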
NASA Astrophysics Data System (ADS)
Lingel, Karen; Skwarnicki, Tomasz; Smith, James G.
Penguin, or loop, decays of B mesons induce effective flavor-changing neutral currents, which are forbidden at tree level in the standard model. These decays give special insight into the CKM matrix and are sensitive to non-standard-model effects. In this review, we give a historical and theoretical introduction to penguins and a description of the various types of penguin processes: electromagnetic, electroweak, and gluonic. We review the experimental searches for penguin decays, including the measurements of the electromagnetic penguins b → sγ and B → K*γ and the gluonic penguins B → Kπ, B⁺ → ωK⁺ and B → η′K, and their implications for the standard model and new physics. We conclude by exploring the future prospects for penguin physics.
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
building information models (BIM) at the coordinated design stage of building construction. ... standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, currently supported by ... specifications, Construction Operations Building information exchange (COBie), Building Information Modeling (BIM)
New approach to flavor symmetry and an extended naturalness principle
NASA Astrophysics Data System (ADS)
Barr, S. M.
2010-09-01
A class of nonsupersymmetric extensions of the standard model is proposed in which there is a multiplicity of light scalar doublets in a multiplet of a nonabelian family group with the standard model Higgs doublet. Anthropic tuning makes the latter light, and consequently the other scalar doublets remain light because of the family symmetry. The family symmetry greatly constrains the pattern of flavor-changing neutral-current interactions (FCNC) and p decay operators coming from scalar-exchange. Such models show that useful constraints on model-building can come from an extended naturalness principle when the electroweak scale is anthropically tuned.
Precision measurement of the weak charge of the proton.
2018-05-01
Large experimental programmes in the fields of nuclear and particle physics search for evidence of physics beyond that explained by current theories. The observation of the Higgs boson completed the set of particles predicted by the standard model, which currently provides the best description of fundamental particles and forces. However, this theory's limitations include a failure to predict fundamental parameters, such as the mass of the Higgs boson, and the inability to account for dark matter and energy, gravity, and the matter-antimatter asymmetry in the Universe, among other phenomena. These limitations have inspired searches for physics beyond the standard model in the post-Higgs era through the direct production of additional particles at high-energy accelerators, which have so far been unsuccessful. Examples include searches for supersymmetric particles, which connect bosons (integer-spin particles) with fermions (half-integer-spin particles), and for leptoquarks, which mix the fundamental quarks with leptons. Alternatively, indirect searches using precise measurements of well predicted standard-model observables allow highly targeted alternative tests for physics beyond the standard model because they can reach mass and energy scales beyond those directly accessible by today's high-energy accelerators. Such an indirect search aims to determine the weak charge of the proton, which defines the strength of the proton's interaction with other particles via the well known neutral electroweak force. Because parity symmetry (invariance under the spatial inversion (x, y, z) → (-x, -y, -z)) is violated only in the weak interaction, it provides a tool with which to isolate the weak interaction and thus to measure the proton's weak charge [1]. Here we report the value 0.0719 ± 0.0045, where the uncertainty is one standard deviation, derived from our measured parity-violating asymmetry in the scattering of polarized electrons on protons, which is -226.5 ± 9.3 parts per billion (the uncertainty is one standard deviation). Our value for the proton's weak charge is in excellent agreement with the standard model [2] and sets multi-teraelectronvolt-scale constraints on any semi-leptonic parity-violating physics not described within the standard model. Our results show that precision parity-violating measurements enable searches for physics beyond the standard model that can compete with direct searches at high-energy accelerators and, together with astronomical observations, can provide fertile approaches to probing higher mass scales.
NASA Astrophysics Data System (ADS)
Marshall, R. H.; Gabrys, R.
2016-12-01
NASA Goddard Space Flight Center has developed a systemic educator professional development model for the integration of NASA climate change resources into the K-12 classroom. The desired outcome of this model is to prepare teachers in STEM disciplines to be globally engaged and knowledgeable of current climate change research and its potential for content relevancy alignment to standard-based curriculum. The application and mapping of the model is based on the state education needs assessment, alignment to the Next Generation Science Standards (NGSS), and implementation framework developed by the consortium of district superintendents and their science supervisors. In this presentation, we will demonstrate best practices for extending the concept of inquiry-based and project-based learning through the integration of current NASA climate change research into curriculum unit lessons. This model includes a significant teacher development component focused on capacity development for teacher instruction and pedagogy aimed at aligning NASA climate change research to related NGSS student performance expectations and subsequent Crosscutting Concepts, Science and Engineering Practices, and Disciplinary Core Ideas, a need that was presented by the district steering committee as critical for ensuring sustainability and high-impact in the classroom. This model offers a collaborative and inclusive learning community that connects classroom teachers to NASA climate change researchers via an ongoing consultant/mentoring approach. As a result of the first year of implementation of this model, Maryland teachers are implementing NGSS unit lessons that guide students in open-ended research based on current NASA climate change research.
ERIC Educational Resources Information Center
Yan, Duanli; Lewis, Charles; Stocking, Martha
It is unrealistic to suppose that standard item response theory (IRT) models will be appropriate for all new and currently considered computer-based tests. In addition to developing new models, researchers will need to give some attention to the possibility of constructing and analyzing new tests without the aid of strong models. Computerized…
Current Status of Multidisciplinary Care in Psoriatic Arthritis in Spain: NEXUS 2.0 Project.
Queiro, Rubén; Coto, Pablo; Joven, Beatriz; Rivera, Raquel; Navío Marco, Teresa; de la Cueva, Pablo; Alvarez Vega, Jose Luis; Narváez Moreno, Basilio; Rodriguez Martínez, Fernando José; Pardo Sánchez, José; Feced Olmos, Carlos; Pujol, Conrad; Rodríguez, Jesús; Notario, Jaume; Pujol Busquets, Manel; García Font, Mercè; Galindez, Eva; Pérez Barrio, Silvia; Urruticoechea-Arana, Ana; Hergueta, Merce; López Montilla, M Dolores; Vélez García-Nieto, Antonio; Maceiras, Francisco; Rodríguez Pazos, Laura; Rubio Romero, Esteban; Rodríguez Fernandez Freire, Lourdes; Luelmo, Jesús; Gratacós, Jordi
2018-02-26
1) To analyze the implementation of multidisciplinary care models in psoriatic arthritis (PsA) patients; 2) to define minimum and excellent standards of care. A survey was sent to clinicians who already performed multidisciplinary care or were in the process of undertaking it, asking about: 1) the type of multidisciplinary care model implemented; 2) the degree, priority and feasibility of the implementation of quality standards for the structure, process and results of care. In 6 regional meetings, the results of the survey were presented and discussed, and the ultimate priority of quality standards for care was defined. In a nominal group meeting, 11 experts (rheumatologists and dermatologists) analyzed the results of the survey and the regional meetings. With this information, they defined which standards of care are currently considered minimum and which are excellent. The simultaneous and parallel models of multidisciplinary care are those most widely implemented, but the implementation of quality standards is highly variable: in terms of structure it ranges from 22% to 74%, for process-related standards from 17% to 54%, and for results from 2% to 28%. Of the 25 original quality standards for care, 9 were considered only minimum, 4 were excellent, and 12 defined criteria both for a minimum level and for excellence. The definition of minimum and excellent quality standards for care will help achieve the goal of multidisciplinary care for patients with PsA, which is the best healthcare possible. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe
2007-01-01
A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for, and draft an information model based on, the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use-case scenarios was first developed, from which a set of requirements was derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the PDS. The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources, including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document, using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.
Searching for new physics at the frontiers with lattice quantum chromodynamics.
Van de Water, Ruth S
2012-07-01
Numerical lattice-quantum chromodynamics (QCD) simulations, when combined with experimental measurements, allow the determination of fundamental parameters of the particle-physics Standard Model and enable searches for physics beyond the Standard Model. We present the current status of lattice-QCD weak matrix element calculations needed to obtain the elements and phase of the Cabibbo-Kobayashi-Maskawa (CKM) matrix and to test the Standard Model in the quark-flavor sector. We then discuss evidence that may hint at the presence of new physics beyond the Standard Model CKM framework. Finally, we discuss two opportunities where we expect lattice QCD to play a pivotal role in searching for, and possibly discovering, new physics at upcoming high-intensity experiments: rare decays and the muon anomalous magnetic moment. The next several years may witness the discovery of new elementary particles at the Large Hadron Collider (LHC). The interplay between lattice QCD, high-energy experiments at the LHC, and high-intensity experiments will be needed to determine the underlying structure of whatever physics beyond the Standard Model is realized in nature. © 2012 New York Academy of Sciences.
Standard electrode potential, Tafel equation, and the solvation thermodynamics.
Matyushov, Dmitry V
2009-06-21
Equilibrium in the electronic subsystem across the solution-metal interface is considered to connect the standard electrode potential to the statistics of localized electronic states in solution. We argue that a correct derivation of the Nernst equation for the electrode potential requires a careful separation of the relevant time scales. An equation for the standard metal potential is derived, linking it to the thermodynamics of solvation. The Anderson-Newns model of electronic delocalization between the solution and the electrode is combined with a bilinear model of solute-solvent coupling, introducing nonlinear solvation into the theory of heterogeneous electron transfer. We are therefore able to address the question of how nonlinear solvation affects electrochemical observables. The transfer coefficient of electrode kinetics is shown to be equal to the derivative of the free energy, or generalized force, required to shift the unoccupied electronic level in the bulk. The transfer coefficient thus directly quantifies the extent of nonlinear solvation of the redox couple. The current model allows the transfer coefficient to deviate from the value of 0.5 of linear solvation models at zero electrode overpotential. The electrode current curves become asymmetric with respect to the change in sign of the electrode overpotential.
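For illustration, the asymmetry that a transfer coefficient alpha different from 0.5 produces in the current-overpotential curve can be seen directly in the textbook Butler-Volmer and Tafel relations, which the abstract's nonlinear-solvation result generalizes. The sketch below numerically evaluates those standard relations; it is not the paper's Anderson-Newns-based model, and the exchange current density and alpha values are hypothetical.

import numpy as np

F, R, T = 96485.0, 8.314, 298.15   # Faraday constant, gas constant, temperature
f = F / (R * T)                    # inverse thermal voltage, 1/V

def butler_volmer(eta, i0=1e-3, alpha=0.5):
    # Net electrode current density (A/cm^2) at overpotential eta (V)
    return i0 * (np.exp(alpha * f * eta) - np.exp(-(1.0 - alpha) * f * eta))

def tafel(eta, i0=1e-3, alpha=0.5):
    # High-overpotential (anodic) Tafel limit of the same expression
    return i0 * np.exp(alpha * f * eta)

eta = 0.2  # V
# alpha = 0.5 gives a curve symmetric under eta -> -eta; alpha != 0.5 does not
print(butler_volmer(eta, alpha=0.5), -butler_volmer(-eta, alpha=0.5))
print(butler_volmer(eta, alpha=0.4), -butler_volmer(-eta, alpha=0.4))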
Current National Approach to Healthcare ICT Standardization: Focus on Progress in New Zealand.
Park, Young-Taek; Atalag, Koray
2015-07-01
Many countries are trying to deliver high-quality healthcare services efficiently and at manageable cost, and healthcare information and communication technology (ICT) standardisation may play an important role in this effort. New Zealand provides a good model of healthcare ICT standardisation. The purpose of this study was to review the current healthcare ICT standardisation and its progress in New Zealand. This study reviewed reports regarding healthcare ICT standardisation in New Zealand. We also investigated relevant websites related to the healthcare ICT standards, most of which are run by the government. We then summarised the governance structure, standardisation processes, and their output regarding the current healthcare ICT standards status of New Zealand. New Zealand government bodies have established a set of healthcare ICT standards and clear guidelines and procedures for healthcare ICT standardisation. Government has actively participated in various enactments of healthcare ICT standards, from the inception of ideas to their eventual retirement. Great achievements in eHealth have already been realised, and various standards are currently utilised at all levels of healthcare, regionally and nationally. Standard clinical terminologies, such as the International Classification of Diseases (ICD) and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT), have been adopted, and Health Level Seven (HL7) standards are actively used in health information exchanges. The government of New Zealand has well-organised ICT institutions, guidelines, and regulations, as well as various programs, such as e-Medications and integrated care services. Local district health boards directly running hospitals have effectively adopted various new ICT standards. They might already be benefiting from improved efficiency resulting from healthcare ICT standardisation.
Comparison of Aerodynamic Resistance Parameterizations and Implications for Dry Deposition Modeling
Nitrogen deposition data used to support the secondary National Ambient Air Quality Standards and critical loads research derives from both measurements and modeling. Data sets with spatial coverage sufficient for regional scale deposition assessments are currently generated fro...
MODEL HARMONIZATION POTENTIAL AND BENEFITS
The IPCS Harmonization Project, which is currently ongoing under the auspices of the WHO, in the context of chemical risk assessment or exposure modeling, does not imply global standardization. Instead, harmonization is thought of as an effort to strive for consistency among appr...
Data needs for X-ray astronomy satellites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, T.
I review the current status of atomic data for X-ray astronomy satellites. This includes some of the astrophysical issues which can be addressed, current modeling and analysis techniques, computational tools, the limitations imposed by currently available atomic data, and the validity of standard assumptions. I also discuss the future: challenges associated with future missions and goals for atomic data collection.
Top quark rare decays via loop-induced FCNC interactions in extended mirror fermion model
NASA Astrophysics Data System (ADS)
Hung, P. Q.; Lin, Yu-Xiang; Nugroho, Chrisna Setyo; Yuan, Tzu-Chiang
2018-02-01
Flavor changing neutral current (FCNC) interactions in which a top quark t decays into Xq, where X represents a neutral gauge or Higgs boson and q an up or charm quark, are highly suppressed in the Standard Model (SM) due to the Glashow-Iliopoulos-Maiani mechanism. While current limits on the branching ratios of these processes from the Large Hadron Collider experiments are at the level of 10^-4, SM predictions are at least nine orders of magnitude below. In this work, we study some of these FCNC processes in the context of an extended mirror fermion model, originally proposed to implement the electroweak-scale seesaw mechanism for non-sterile right-handed neutrinos. We show that one can probe the process t → Zc for a wide range of parameter space, with branching ratios varying from 10^-6 to 10^-8, comparable with various new physics models including the general two Higgs doublet model with or without flavor violations at tree level, the minimal supersymmetric standard model with or without R-parity, and extra dimension models.
Search for the flavor-changing neutral-current decay t-->Zq in pp collisions at sqrt[s] = 1.96 TeV.
Aaltonen, T; Adelman, J; Akimoto, T; Albrow, M G; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Bednar, P; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Copic, K; Cordelli, M; Cortiana, G; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'Orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Handler, R; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hauser, J; Hays, C; Heck, M; Heijboer, A; Heinemann, B; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Koay, S A; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kusakabe, Y; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; 
Lazzizzera, I; LeCompte, T; Lee, E; Lee, S W; Leone, S; Lewis, J D; Lin, C S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lu, R-S; Lucchesi, D; Lueck, J; Luci, C; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; Mack, P; MacQueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlok, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Osterberg, K; Griso, S Pagan; Pagliarone, C; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Reisert, B; Rekovic, V; Renton, P; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Saarikko, H; Safonov, A; Sakumoto, W K; Saltó, O; Saltzberg, D; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Sherman, D; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Sutherland, M; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Würthwein, F; Wagner, P; Wagner, R G; Wagner, R L; Wagner-Kuhr, J; Wagner, W; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Wynne, S M; Yagil, A; Yamamoto, K; Yamaoka, 
J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zhang, X; Zheng, Y; Zucchelli, S
2008-11-07
We report a search for the flavor-changing neutral-current decay of the top quark t-->Zq (q=u, c) in pp collisions at sqrt[s]=1.96 TeV using a data sample corresponding to an integrated luminosity of 1.9 fb(-1) collected by the CDF II detector. This decay is strongly suppressed in the standard model and an observation of a signal at the Fermilab Tevatron would be an indication of physics beyond the standard model. Using Z + ≥4 jet final-state candidate events, with and without an identified bottom quark jet, we obtain an upper limit of B(t-->Zq) < 3.7% at 95% C.L.
Ecological models supporting environmental decision making: a strategy for the future
Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L.; Grimm, Volker
2010-01-01
Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent.
The LOM Approach -- A CALL for Concern?
ERIC Educational Resources Information Center
Armitage, Nicholas; Bowerman, Chris
2005-01-01
The LOM (Learning Object Model) approach to courseware design seems to be driven by a desire to increase access to education as well as use technology to enable a higher staff-student ratio than is currently possible. The LOM standard involves the use of standard metadata descriptions of content and adaptive content engines to deliver the…
A Review of Roles and Responsibilities: Restructuring for Excellence in the School System.
ERIC Educational Resources Information Center
Society for the Advancement of Excellence in Education, Kelowna (British Columbia).
Recommendations are presented for a new form of school governance in British Columbia that takes into account current research on effective schools. In the model described, the provincial government provides the funding, sets the core curriculum, standards, and outcomes, ensures standardized measurement and reporting, and supports field research.…
A quasi-current representation for information needs inspired by Two-State Vector Formalism
NASA Astrophysics Data System (ADS)
Wang, Panpan; Hou, Yuexian; Li, Jingfei; Zhang, Yazhou; Song, Dawei; Li, Wenjie
2017-09-01
Recently, a number of quantum theory (QT)-based information retrieval (IR) models have been proposed for modeling session search tasks, in which users issue queries continuously in order to describe their evolving information needs (IN). However, the standard formalism of QT cannot provide a complete description of a user's current IN, in the sense that it does not take 'future' information into consideration. Therefore, to seek a more proper and complete representation of users' IN, we construct a quasi-current IN representation inspired by the emerging Two-State Vector Formalism (TSVF). Motivated by the completeness of TSVF, a "two-state vector" derived from the 'future' (the current query) and the 'history' (the previous query) is employed to describe a user's quasi-current IN in a more complete way. Extensive experiments are conducted on the session tracks of TREC 2013 & 2014, and show that our model outperforms a series of compared IR models.
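As a purely illustrative sketch of the "two-state" idea (not the authors' actual retrieval model), one can combine a history state built from the previous query with a future state built from the current query, loosely mirroring TSVF's pre- and post-selected state pair; the toy embedding and scoring below are hypothetical.

import numpy as np

def embed(terms, vocab):
    # Toy bag-of-words embedding, L2-normalized
    v = np.zeros(len(vocab))
    for t in terms:
        if t in vocab:
            v[vocab[t]] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

vocab = {t: i for i, t in enumerate(["quantum", "retrieval", "session", "model"])}
history = embed(["quantum", "model"], vocab)        # previous query ("history")
future = embed(["quantum", "retrieval"], vocab)     # current query ("future")

# Two-state operator: outer product of the future (bra) and history (ket) states
two_state = np.outer(future, history)

doc = embed(["quantum", "retrieval", "model"], vocab)
# A document is scored against both boundary conditions at once
print(doc @ two_state @ doc)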
Hertäg, Loreen; Hass, Joachim; Golovko, Tatiana; Durstewitz, Daniel
2012-01-01
For large-scale network simulations, it is often desirable to have computationally tractable, yet in a defined sense still physiologically valid neuron models. In particular, these models should be able to reproduce physiological measurements, ideally in a predictive sense, and under different input regimes in which neurons may operate in vivo. Here we present an approach to parameter estimation for a simple spiking neuron model mainly based on standard f-I curves obtained from in vitro recordings. Such recordings are routinely obtained in standard protocols and assess a neuron's response under a wide range of mean-input currents. Our fitting procedure makes use of closed-form expressions for the firing rate derived from an approximation to the adaptive exponential integrate-and-fire (AdEx) model. The resulting fitting process is simple and about two orders of magnitude faster compared to methods based on numerical integration of the differential equations. We probe this method on different cell types recorded from rodent prefrontal cortex. After fitting to the f-I current-clamp data, the model cells are tested on completely different sets of recordings obtained by fluctuating ("in vivo-like") input currents. For a wide range of different input regimes, cell types, and cortical layers, the model could predict spike times on these test traces quite accurately within the bounds of physiological reliability, although no information from these distinct test sets was used for model fitting. Further analyses delineated some of the empirical factors constraining model fitting and the model's generalization performance. An even simpler adaptive LIF neuron was also examined in this context. Hence, we have developed a "high-throughput" model fitting procedure which is simple and fast, with good prediction performance, and which relies only on firing rate information and standard physiological data widely and easily available.
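As a small sketch of this kind of closed-form f-I fitting, the snippet below fits the textbook leaky integrate-and-fire rate equation (not the authors' AdEx approximation) to a hypothetical f-I data set with scipy; the rate for a suprathreshold input I is 1/(t_ref + tau*ln(I/(I - I_rh))), with rheobase current I_rh.

import numpy as np
from scipy.optimize import curve_fit

def lif_rate(I, tau, I_rh, t_ref):
    # Closed-form LIF firing rate (Hz); zero below the rheobase current I_rh
    I = np.asarray(I, dtype=float)
    rate = np.zeros_like(I)
    supra = I > I_rh
    rate[supra] = 1.0 / (t_ref + tau * np.log(I[supra] / (I[supra] - I_rh)))
    return rate

# Hypothetical mean-input currents (nA) and measured firing rates (Hz)
I_data = np.array([0.10, 0.20, 0.30, 0.40, 0.50])
f_data = np.array([0.0, 22.0, 42.0, 58.0, 73.0])

popt, _ = curve_fit(lif_rate, I_data, f_data, p0=[0.02, 0.12, 0.002])
print("tau (s), I_rh (nA), t_ref (s) =", popt)
# The fitted cell model would then be tested on held-out, in vivo-like inputs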
La Barbera, Luigi; Galbusera, Fabio; Wilke, Hans-Joachim; Villa, Tomaso
2016-09-01
To discuss whether the available standard methods for preclinical evaluation of posterior spine stabilization devices can represent basic everyday life activities, and how to compare the results obtained with different procedures. A comparative finite element study evaluated the ASTM F1717 and ISO 12189 standards against validated instrumented L2-L4 segments undergoing standing, upper body flexion and extension. The internal loads on the spinal rod and the maximum stress on the implant are analysed. The ISO-recommended anterior support stiffness and force allow bending moments measured in vivo on an instrumented physiological segment during upper body flexion to be reproduced. Despite the significance of the ASTM model from an engineering point of view, its overly conservative vertebrectomy model represents an unrealistic worst-case scenario. A method is proposed to determine the load to apply to assemblies with different anterior support stiffnesses to guarantee a comparable bending moment and reproduce specific everyday life activities. The study increases our awareness of the use of the current standards to achieve meaningful results that are easy to compare and interpret.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
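A minimal sketch of the combined idea, tracking a fitted model coefficient on an individuals control chart with moving-range limits, is given below; the coefficient series is hypothetical and the constants are the standard I-MR chart values.

import numpy as np

# Hypothetical fitted calibration coefficient from repeated check standard tests
coef = np.array([1.002, 0.998, 1.001, 1.004, 0.999, 1.003, 0.997, 1.025])

center = coef.mean()
mr = np.abs(np.diff(coef))          # moving ranges between successive tests
sigma = mr.mean() / 1.128           # d2 = 1.128 for moving ranges of size 2
ucl, lcl = center + 3.0 * sigma, center - 3.0 * sigma

for i, c in enumerate(coef):
    flag = "  <-- out of control" if not (lcl <= c <= ucl) else ""
    print(f"test {i}: coef = {c:.3f}{flag}")
print(f"CL = {center:.4f}, LCL = {lcl:.4f}, UCL = {ucl:.4f}")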
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support its improvement. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency-checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
A physiome standards-based model publication paradigm.
Nickerson, David P; Buist, Martin L
2009-05-28
In this era of widespread broadband Internet penetration and powerful Web browsers on most desktops, a shift in the publication paradigm for physiome-style models is envisaged. No longer will model authors simply submit an essentially textual description of the development and behaviour of their model. Rather, they will submit a complete working implementation of the model encoded and annotated according to the various standards adopted by the physiome project, accompanied by a traditional human-readable summary of the key scientific goals and outcomes of the work. While the final published, peer-reviewed article will look little different to the reader, in this new paradigm both reviewers and readers will be able to interact with, use and extend the models in ways that are not currently possible. Here, we review recent developments that are laying the foundations for this new model publication paradigm. Initial developments have focused on the publication of mathematical models of cellular electrophysiology, using technology based on a CellML- or Systems Biology Markup Language (SBML)-encoded implementation of the mathematical models. Here, we review the current state of the art and what needs to be done before such a model publication becomes commonplace.
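As a small illustration of what interacting with a standards-encoded model can look like, the sketch below reads an SBML file with the python-libsbml bindings and lists its components; "model.xml" is a placeholder for any published SBML-encoded model, and this is one possible workflow rather than the paradigm's prescribed tooling.

import libsbml  # pip install python-libsbml

doc = libsbml.readSBML("model.xml")     # placeholder filename
if doc.getNumErrors() > 0:
    doc.printErrors()                   # report parsing/consistency problems
else:
    model = doc.getModel()
    print("species:  ", model.getNumSpecies())
    print("reactions:", model.getNumReactions())
    for s in model.getListOfSpecies():
        print(" ", s.getId(), s.getInitialConcentration())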
Bio-Optical Measurement and Modeling of the California Current and Southern Oceans
NASA Technical Reports Server (NTRS)
Mitchell, B. Greg
2003-01-01
The SIMBIOS project's principal goals are to validate standard or experimental ocean color products through detailed bio-optical and biogeochemical measurements, and to combine ocean optical observations with modeling to contribute to satellite vicarious radiometric calibration and algorithm development.
Neutrino in standard model and beyond
NASA Astrophysics Data System (ADS)
Bilenky, S. M.
2015-07-01
After the discovery of the Higgs boson at CERN, the Standard Model acquired the status of the theory of the elementary particles in the electroweak range (up to about 300 GeV). What general conclusions can be inferred from the Standard Model? It appears that the Standard Model teaches us that, in the framework of such general principles as local gauge symmetry, unification of weak and electromagnetic interactions and Brout-Englert-Higgs spontaneous breaking of the electroweak symmetry, nature chooses the simplest possibilities. Two-component left-handed massless neutrino fields play a crucial role in the determination of the charged current structure of the Standard Model. The absence of right-handed neutrino fields in the Standard Model is the simplest, most economical possibility. In such a scenario the Majorana mass term is the only possibility for neutrinos to be massive and mixed. Such a mass term is generated by the lepton-number violating Weinberg effective Lagrangian. In this approach the three Majorana neutrino masses are suppressed with respect to the masses of the other fundamental fermions by the ratio of the electroweak scale to the scale of lepton-number violating physics. The discovery of neutrinoless double β-decay and the absence of transitions of flavor neutrinos into sterile states would be evidence in favor of the minimal scenario advocated here.
Geo3DML: A standard-based exchange format for 3D geological models
NASA Astrophysics Data System (ADS)
Wang, Zhangang; Qu, Honggang; Wu, Zixing; Wang, Xianghong
2018-01-01
A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels, based on existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, and is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).
NASA Astrophysics Data System (ADS)
Riddick, Andrew; Hughes, Andrew; Harpham, Quillon; Royse, Katherine; Singh, Anubha
2014-05-01
There has been increasing interest from both academic and commercial organisations over recent years in developing hydrologic and other environmental models in response to some of the major challenges facing the environment, for example environmental change and its effects and ensuring water resource security. This has resulted in a significant investment in modelling by many organisations, both in terms of financial resources and intellectual capital. To capitalise on the effort of producing models, it is necessary for the models to be both discoverable and appropriately described; if this is not undertaken, the effort of producing them will be wasted. However, whilst there are some recognised metadata standards relating to datasets, these may not completely address the needs of modellers, for example regarding input data. There also appears to be a lack of metadata schemes configured to encourage the discovery and re-use of the models themselves. The lack of an established standard for model metadata is considered to be a factor inhibiting the more widespread use of environmental models, particularly the use of linked model compositions which fuse together hydrologic models with models from other environmental disciplines. This poster presents the results of a Natural Environment Research Council (NERC) funded scoping study to understand the requirements of modellers and other end users for metadata about data and models. A user consultation exercise using an on-line questionnaire was undertaken to capture the views of a wide spectrum of stakeholders on how they currently manage metadata for modelling. This provided strong confirmation of our original supposition that there is a lack of systems and facilities to capture metadata about models. A number of specific gaps in current provision for data and model metadata were also identified, including the need for a standard means of recording detailed information about the modelling environment and the model code used, to assist the selection of models for linked compositions. Existing best practice, including the use of current metadata standards (e.g. ISO 19110, ISO 19115 and ISO 19119) and the metadata components of WaterML, was also evaluated. In addition to commonly used metadata attributes (e.g. spatial reference information), there was significant interest in recording a variety of additional metadata attributes, including more detailed information about temporal data and estimates of data accuracy and uncertainty. This poster describes the key results of this study, including a number of gaps in the provision of metadata for modelling, and outlines how these might be addressed. Overall, the scoping study has highlighted significant interest in addressing this issue within the environmental modelling community. There is therefore an impetus for on-going research, and we are seeking to take this forward through collaboration with other interested organisations. Progress towards an internationally recognised model metadata standard is suggested.
Outputs as Educator Effectiveness in the United States: Shifting towards Political Accountability
ERIC Educational Resources Information Center
Piro, Jody S.; Mullen, Laurie
2013-01-01
The definition of educator effectiveness is being redefined by econometric modeling that takes student achievement on standardized tests as evidence. While the reasons that econometric frameworks are in vogue are many, it is clear that the strength of such models lies in the quantifiable evidence of student learning. Current accountability models frame…
Regional-scale air quality models are being used to demonstrate attainment of the ozone air quality standard. In current regulatory applications, a regional-scale air quality model is applied for a base year and a future year with reduced emissions using the same meteorological ...
A cautionary note concerning the use of stabilized weights in marginal structural models.
Talbot, Denis; Atherton, Juli; Rossi, Amanda M; Bacon, Simon L; Lefebvre, Geneviève
2015-02-28
Marginal structural models (MSMs) are commonly used to estimate the causal effect of a time-varying treatment in the presence of time-dependent confounding. When fitting an MSM to data, the analyst must specify both the structural model for the outcome and the treatment models for the inverse-probability-of-treatment weights. The use of stabilized weights is recommended because they are generally less variable than the standard weights. In this paper, we are concerned with the use of the common stabilized weights when the structural model considers only partial treatment history, such as the current or most recent treatments. We present various examples of settings where these stabilized weights yield biased inferences while the standard weights do not. These issues are first investigated on the basis of simulated data and subsequently exemplified using data from the Honolulu Heart Program. Unlike common stabilized weights, we find that basic stabilized weights offer some protection against bias in structural models designed to estimate current or most recent treatment effects. Copyright © 2014 John Wiley & Sons, Ltd.
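For readers unfamiliar with the construction, the sketch below contrasts standard and stabilized weights for a single time point (the paper's concern is treatment histories, but the building block is the same); the simulated data and models are purely illustrative.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
L = rng.normal(size=500)                          # confounder
A = rng.binomial(1, 1.0 / (1.0 + np.exp(-L)))     # treatment depends on L
df = pd.DataFrame({"L": L, "A": A})

# Denominator model: P(A = 1 | L); numerator: marginal P(A = 1)
denom = LogisticRegression().fit(df[["L"]], df["A"])
p1 = denom.predict_proba(df[["L"]])[:, 1]
p1_marg = df["A"].mean()

# Standard weights put 1 on top; stabilized weights put the marginal on top
w = np.where(df["A"] == 1, 1.0 / p1, 1.0 / (1.0 - p1))
sw = np.where(df["A"] == 1, p1_marg / p1, (1.0 - p1_marg) / (1.0 - p1))
print("standard:   mean %.2f, sd %.2f" % (w.mean(), w.std()))
print("stabilized: mean %.2f, sd %.2f" % (sw.mean(), sw.std()))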
Design principles for shift current photovoltaics
Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; ...
2017-01-25
While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W^-1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.
Design principles for shift current photovoltaics
Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; Coh, Sinisa; Moore, Joel E.
2017-01-01
While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. Our method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells. PMID:28120823
Current progress in patient-specific modeling
2010-01-01
We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236
NASA Astrophysics Data System (ADS)
Curciarello, Francesca
2016-04-01
e+e- collider experiments at the intensity frontier are naturally suited to probe the existence of a force beyond the Standard Model between WIMPs, the most viable dark matter candidates. The mediator of this new force, known as the dark photon, should be a new vector gauge boson very weakly coupled to the Standard Model photon. No significant signal has been observed so far. I will report on current limits set on the coupling factor ε^2 between the photon and the dark photon by e+e- collider experiments.
Testing the standard model by precision measurement of the weak charges of quarks.
Young, R D; Carlini, R D; Thomas, A W; Roche, J
2007-09-21
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, places tight constraints on the size of possible contributions from physics beyond the standard model. Consequently, this result improves the lower bound on the scale of relevant new physics to approximately 1 TeV.
Modeling of Schottky barrier diode characteristics on heteroepitaxial β-gallium oxide thin films
NASA Astrophysics Data System (ADS)
Splith, Daniel; Müller, Stefan; von Wenckstern, Holger; Grundmann, Marius
2018-02-01
When investigating Schottky contacts on heteroepitaxial β-Ga2O3 thin films, several non-idealities are observed in the current-voltage characteristics which cannot be accounted for with the standard diode current models. In this article, we therefore employed a model for the rigorous calculation of the diode currents in order to understand the origin of these non-idealities. Using the model and a few parameters determined from the measurements, we were able to simulate the characteristics in good agreement with the measured data for temperatures between 30 °C and 150 °C. Fitting of the simulated curves to the measured curves allows a deeper insight into the microscopic origins of said non-idealities.
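For context, the "standard diode current model" referred to here is typically the thermionic-emission law; the sketch below evaluates it over the study's temperature range with illustrative (not fitted) parameters and a Richardson constant of roughly the value often quoted for β-Ga2O3.

import numpy as np

q, kB = 1.602e-19, 1.381e-23       # elementary charge (C), Boltzmann constant (J/K)

def thermionic_current(V, T, phi_b=1.0, n=1.1, area=1e-3, A_star=41.0):
    # phi_b: barrier height (eV); n: ideality factor; area (cm^2);
    # A_star: effective Richardson constant (A cm^-2 K^-2), ~41 for beta-Ga2O3
    I_s = area * A_star * T**2 * np.exp(-q * phi_b / (kB * T))
    return I_s * (np.exp(q * V / (n * kB * T)) - 1.0)

for T in (303.15, 373.15, 423.15):  # 30, 100, 150 degrees C, as in the study
    print(f"T = {T:.0f} K, I(0.5 V) = {thermionic_current(0.5, T):.3e} A")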
DEVELOPMENT OF A MODEL FOR REAL TIME CO CONCENTRATIONS NEAR ROADWAYS
Although emission standards for mobile sources continue to be tightened, tailpipe emissions in urban areas continue to be a major source of human exposure to air toxics. Current human exposure models using simplified assumptions based on fixed air monitoring stations and region...
Improvements to Wire Bundle Thermal Modeling for Ampacity Determination
NASA Technical Reports Server (NTRS)
Rickman, Steve L.; Iannello, Christopher J.; Shariff, Khadijah
2017-01-01
Determining current carrying capacity (ampacity) of wire bundles in aerospace vehicles is critical not only to safety but also to efficient design. Published standards provide guidance on determining wire bundle ampacity but offer little flexibility for configurations where wire bundles of mixed gauges and currents are employed with varying external insulation jacket surface properties. Thermal modeling has been employed in an attempt to develop techniques to assist in ampacity determination for these complex configurations. Previous developments allowed analysis of wire bundle configurations but were constrained to configurations comprising fewer than 50 elements. Additionally, for vacuum analyses, configurations with very low emittance external jackets suffered from numerical instability in the solution. A new thermal modeler is presented that allows for larger configurations and is not constrained in low bundle infrared emissivity calculations. Formulation of key internal radiation and interface conductance parameters is discussed, including the effects of temperature and air pressure on wire-to-wire thermal conductance. Test cases comparing model-predicted ampacity with that calculated from standards documents are presented.
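As a minimal sketch of the energy balance underlying ampacity in vacuum, where a jacketed wire sheds its I^2*R heating only by radiation, the snippet below solves for the steady temperature and the current that just reaches a rated temperature; all dimensions and properties are illustrative, and a real bundle requires the wire-to-wire conductance network discussed above.

import numpy as np
from scipy.optimize import brentq

sigma = 5.670e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
eps = 0.8                   # jacket IR emissivity (lower eps -> hotter wire)
d, length = 1.0e-3, 1.0     # wire diameter and length, m
A = np.pi * d * length      # radiating surface area, m^2
R20, alpha = 0.021, 0.0039  # ohm/m near 20 C (~1 mm^2 copper), temp coefficient
T_env = 343.0               # surrounding temperature, K

def resistance(T):
    return R20 * length * (1.0 + alpha * (T - 293.15))

def steady_temperature(I):
    balance = lambda T: I**2 * resistance(T) - eps * sigma * A * (T**4 - T_env**4)
    return brentq(balance, T_env + 1e-6, 3000.0)

T_rated = 473.0             # 200 C rated insulation temperature (illustrative)
ampacity = brentq(lambda I: steady_temperature(I) - T_rated, 0.1, 100.0)
print(f"single-wire vacuum ampacity ~ {ampacity:.1f} A")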
Patrick Reilly, J
2014-10-01
Differences between the IEEE C95 standards (C95.6-2002 and C95.1-2005) and the ICNIRP-2010 guidelines in the low-frequency range (1 Hz-100 kHz) appear across the frequency spectrum. Factors accounting for the lack of convergence include: differences between the IEEE standards and the ICNIRP guidelines with respect to biological induction models, stated objectives, the data trail from experimentally derived thresholds through physical and biological principles, selection and justification of safety/reduction factors, use of probability models, compliance standards for the limbs as distinct from the whole body, defined population categories, strategies for central nervous system protection below 20 Hz, and correspondence of environmental electric field limits with contact currents. This paper discusses these factors and makes the case for adoption of the limits in the IEEE standards.
A modeling analysis of alternative primary and secondary US ozone standards in urban and rural areas
NASA Astrophysics Data System (ADS)
Nopmongcol, Uarporn; Emery, Chris; Sakulyanontvittaya, Tanarit; Jung, Jaegun; Knipping, Eladio; Yarwood, Greg
2014-12-01
This study employed the High-Order Decoupled Direct Method (HDDM) of sensitivity analysis in a photochemical grid model to determine US anthropogenic emissions reductions required from 2006 levels to meet alternative US primary (health-based) and secondary (welfare-based) ozone (O3) standards. Applying the modeling techniques developed by Yarwood et al. (2013), we specifically evaluated sector-wide emission reductions needed to meet primary standards in the range of 60-75 ppb, and secondary standards in the range of 7-15 ppm-h, in 22 cities and at 20 rural sites across the US for NOx-only, combined NOx and VOC, and VOC-only scenarios. Site-specific model biases were taken into account by applying adjustment factors separately for the primary and secondary standard metrics, analogous to the US Environmental Protection Agency's (EPA) relative response factor technique. Both bias-adjusted and unadjusted results are presented and analyzed. We found that the secondary metric does not necessarily respond to emission reductions the same way the primary metric does, indicating sensitivity to their different forms. Combined NOx and VOC reductions are most effective for cities, whereas NOx-only reductions are sufficient at rural sites. Most cities we examined require more than 50% US anthropogenic emission reductions from 2006 levels to meet the current primary 75 ppb US standard and secondary 15 ppm-h target. Most rural sites require less than 20% reductions to meet the primary 75 ppb standard and less than 40% reductions to meet the secondary 15 ppm-h target. Whether the primary standard is protective of the secondary standard depends on the combination of alternative standard levels. Our modeling suggests that the current 75 ppb standard achieves a 15 ppm-h secondary target in most (17 of 22) cities, but only half of the rural sites; the inability for several western cities and rural areas to achieve the seasonally-summed secondary 15 ppm-h target while meeting the 75 ppb primary target is likely driven by higher background O3 that is commonly reported in the western US. However, a 70 ppb primary standard is protective of a 15 ppm-h secondary standard in all cities and 18 of 20 rural sites we examined, and a 60 ppb primary standard is protective of a 7 ppm-h secondary standard in all cities and 19 of 20 rural sites. If EPA promulgates separate primary and secondary standards, exceedance areas will need to develop and demonstrate control strategies to achieve both. This HDDM analysis provides an illustrative screening assessment by which to estimate emissions reductions necessary to satisfy both standards.
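A minimal sketch of the projection step (not the full photochemical-model/HDDM workflow): with first- and second-order sensitivity coefficients in hand, the ozone response to an anthropogenic emission scale factor x is a Taylor expansion, and the required reduction is the root of O3(x) = target. All coefficient values below are hypothetical.

import numpy as np
from scipy.optimize import brentq

O3_base = 82.0   # ppb, modeled base-year design value (hypothetical)
S1 = 28.0        # ppb per unit emission scaling, first-order sensitivity
S2 = -9.0        # ppb per unit^2, second-order sensitivity

def o3(x):
    # x = 1 reproduces the base year; x = 0 removes US anthropogenic emissions
    dx = x - 1.0
    return O3_base + S1 * dx + 0.5 * S2 * dx**2

target = 75.0    # ppb, primary standard level under consideration
x_required = brentq(lambda x: o3(x) - target, 0.0, 1.0)
print(f"required reduction ~ {(1.0 - x_required) * 100:.0f}% of US anthropogenic emissions")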
The spread of European models of engineering education: the challenges faced in emerging countries
NASA Astrophysics Data System (ADS)
Gardelle, Linda; Cardona Gil, Emmanuel; Benguerna, Mohamed; Bolat, Altangul; Naran, Boldmaa
2017-03-01
The major European models of engineering training (the German, the British and the French model) spread throughout the world during the twentieth century. Historical heritage, cultural proximity and language explain the openly expressed faithfulness to one system in some countries. In these countries, the inherited national standards are now complemented by international standards or are in direct competition with new influences. This article attempts, through the existing literature, interviews and on-site investigations, to analyse current engineering training in some emerging countries and its relations with European models, the objective being to analyse the evolution of local systems and thus the challenges and issues raised by the dissemination of European models.
Singlino resonant dark matter and 125 GeV Higgs boson in high-scale supersymmetry.
Ishikawa, Kazuya; Kitahara, Teppei; Takimoto, Masahiro
2014-09-26
We consider a singlino dark matter (DM) scenario in a singlet extension of the minimal supersymmetric standard model, the so-called nearly minimal supersymmetric standard model. We find that with high-scale supersymmetry breaking the singlino can obtain a sizable radiative correction to its mass, which opens a window for the DM scenario with resonant annihilation via the exchange of the Higgs boson. We show that the current DM relic abundance and the Higgs boson mass can be explained simultaneously. This scenario can be fully probed by XENON1T.
Chen, Kai; Zhou, Lian; Chen, Xiaodong; Bi, Jun; Kinney, Patrick L
2017-05-01
Few multicity studies have addressed the health effects of ozone in China due to the scarcity of ozone monitoring data. A critical scientific and policy-relevant question is whether a threshold exists in the ozone-mortality relationship. Using a generalized additive model and a univariate random-effects meta-analysis, this research evaluated the relationship between short-term ozone exposure and daily total mortality in seven cities of Jiangsu Province, China during 2013-2014. Spline, subset, and threshold models were applied to further evaluate whether a safe threshold level exists. This study found strong evidence that short-term ozone exposure is significantly associated with premature total mortality. A 10 μg/m^3 increase in the average of the current and previous days' maximum 8-h average ozone concentration was associated with a 0.55% (95% posterior interval: 0.34%, 0.76%) increase of total mortality. This finding is robust when considering the confounding effect of PM2.5, PM10, NO2, and SO2. No consistent evidence was found for a threshold in the ozone-mortality concentration-response relationship down to concentrations well below the current Chinese Ambient Air Quality Standard (CAAQS) level 2 standard (160 μg/m^3). Our findings suggest that ozone concentrations below the current CAAQS level 2 standard could still induce increased mortality risks in Jiangsu Province, China. Continuous air pollution control measures could yield important health benefits in Jiangsu Province, China, even in cities that meet the current CAAQS level 2 standard. Copyright © 2017 Elsevier Inc. All rights reserved.
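For reference, such per-10 μg/m^3 excess risks come from a log-linear rate model, so the percent increase scales as 100*(exp(beta*delta) - 1); the sketch below back-derives beta from the pooled 0.55% estimate and rescales it, purely as an arithmetic illustration.

import numpy as np

delta = 10.0                                  # ug/m^3 increment used in the paper
pct = 0.55                                    # pooled excess total mortality, %
beta = np.log(1.0 + pct / 100.0) / delta      # implied log-linear coefficient

for rise in (10.0, 40.0, 80.0):
    excess = 100.0 * (np.exp(beta * rise) - 1.0)
    print(f"{rise:.0f} ug/m^3 increase -> {excess:.2f}% excess mortality")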
Chen, Kai; Zhou, Lian; Chen, Xiaodong; Bi, Jun; Kinney, Patrick L.
2017-01-01
Background Few multicity studies have addressed the health effects of ozone in China due to the scarcity of ozone monitoring data. A critical scientific and policy-relevant question is whether a threshold exists in the ozone-mortality relationship. Methods Using a generalized additive model and a univariate random-effects meta-analysis, this research evaluated the relationship between short-term ozone exposure and daily total mortality in seven cities of Jiangsu Province, China during 2013–2014. Spline, subset, and threshold models were applied to further evaluate whether a safe threshold level exists. Results This study found strong evidence that short-term ozone exposure is significantly associated with premature total mortality. A 10 μg/m3 increase in the average of the current and previous days’ maximum 8-h average ozone concentration was associated with a 0.55% (95% posterior interval: 0.34%, 0.76%) increase of total mortality. This finding is robust when considering the confounding effect of PM2.5, PM10, NO2, and SO2. No consistent evidence was found for a threshold in the ozone-mortality concentration-response relationship down to concentrations well below the current Chinese Ambient Air Quality Standard (CAAQS) level 2 standard (160 μg/m3). Conclusions Our findings suggest that ozone concentrations below the current CAAQS level 2 standard could still induce increased mortality risks in Jiangsu Province, China. Continuous air pollution control measures could yield important health benefits in Jiangsu Province, China, even in cities that meet the current CAAQS level 2 standard. PMID:28231551
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
Comparison of air-kerma strength determinations for HDR (192)Ir sources.
Rasmussen, Brian E; Davis, Stephen D; Schmidt, Cal R; Micka, John A; Dewerd, Larry A
2011-12-01
To perform a comparison of the interim air-kerma strength standard for high dose rate (HDR) (192)Ir brachytherapy sources maintained by the University of Wisconsin Accredited Dosimetry Calibration Laboratory (UWADCL) with measurements of the various source models using modified techniques from the literature. The current interim standard was established by Goetsch et al. in 1991 and has remained unchanged to date. The improved, laser-aligned seven-distance apparatus of the University of Wisconsin Medical Radiation Research Center (UWMRRC) was used to perform air-kerma strength measurements of five different HDR (192)Ir source models. The results of these measurements were compared with those from well chambers traceable to the original standard. Alternative methodologies for interpolating the (192)Ir air-kerma calibration coefficient from the NIST air-kerma standards at (137)Cs and 250 kVp x rays (M250) were investigated and intercompared. As part of the interpolation method comparison, the Monte Carlo code EGSnrc was used to calculate updated values of A(wall) for the Exradin A3 chamber used for air-kerma strength measurements. The effects of air attenuation and scatter, room scatter, and the solution method were investigated in detail. The average measurements when using the inverse N(K) interpolation method for the Classic Nucletron, Nucletron microSelectron, VariSource VS2000, GammaMed Plus, and Flexisource were found to be 0.47%, -0.10%, -1.13%, -0.20%, and 0.89% different from the existing standard, respectively. A further investigation of the differences observed between the sources was performed using MCNP5 Monte Carlo simulations of each source model inside a full model of an HDR 1000 Plus well chamber. Although the differences between the source models were found to be statistically significant, the equally weighted average difference between the seven-distance measurements and the well chambers was 0.01%, confirming that it is not necessary to update the current standard maintained at the UWADCL.
Connecting dark matter annihilation to the vertex functions of Standard Model fermions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Jason; Light, Christopher, E-mail: jkumar@hawaii.edu, E-mail: lightc@hawaii.edu
We consider scenarios in which dark matter is a Majorana fermion which couples to Standard Model fermions through the exchange of charged mediating particles. The matrix elements for various dark matter annihilation processes are then related to one-loop corrections to the fermion-photon vertex, where dark matter and the charged mediators run in the loop. In particular, in the limit where Standard Model fermion helicity mixing is suppressed, the cross section for dark matter annihilation to various final states is related to corrections to the Standard Model fermion charge form factor. These corrections can be extracted in a gauge-invariant manner from collider cross sections. Although current measurements from colliders are not precise enough to provide useful constraints on dark matter annihilation, improved measurements at future experiments, such as the International Linear Collider, could improve these constraints by several orders of magnitude, allowing them to surpass the limits obtainable by direct observation.
Signals from flavor changing scalar currents at the future colliders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atwood, D.; Reina, L.; Soni, A.
1996-11-22
We present a general phenomenological analysis of a class of Two Higgs Doublet Models with Flavor Changing Neutral Currents arising at the tree level. The existing constraints mainly affect the couplings of the first two generations of quarks, leaving open the possibility of non-negligible Flavor Changing couplings of the top quark. The next generation of lepton and hadron colliders will offer the right environment to study the physics of the top quark and to unravel the presence of new physics beyond the Standard Model. In this context we discuss some interesting signals from Flavor Changing Scalar Neutral Currents.
12 CFR 1807.503 - Project completion.
Code of Federal Regulations, 2012 CFR
2012-01-01
... applicable: One of three model codes (Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI)); or the Council of American Building Officials (CABO) one or two family... must meet the current edition of the Model Energy Code published by the Council of American Building...
Beyond standard model searches in the MiniBooNE experiment
Katori, Teppei; Conrad, Janet M.
2014-08-05
The MiniBooNE experiment has contributed substantially to beyond standard model searches in the neutrino sector. The experiment was originally designed to test the Δm² ~ 1 eV² region of the sterile neutrino hypothesis by observing νe (ν̄e) charged current quasielastic signals from a νμ (ν̄μ) beam. MiniBooNE observed excesses of νe and ν̄e candidate events in neutrino and antineutrino mode, respectively. To date, these excesses have not been explained within the neutrino standard model (νSM), the standard model extended for three massive neutrinos; confirmation is required by future experiments such as MicroBooNE. MiniBooNE also provided an opportunity for precision studies of Lorentz violation. The results set strict limits for the first time on several parameters of the standard-model extension, the generic formalism for considering Lorentz violation. Most recently, an extension to MiniBooNE running, with a beam tuned in beam-dump mode, is being performed to search for dark sector particles. This review describes these studies, demonstrating that short baseline neutrino experiments are rich environments for new physics searches.
Physiological pharmacokinetic/pharmacodynamic models require Vmax, Km values for the metabolism of OPs by tissue enzymes. Current literature values cannot be easily used in OP PBPK models (i.e., parathion and chlorpyrifos) because standard methodologies were not used in their ...
Characterization of YBa2Cu3O7, including critical current density Jc, by trapped magnetic field
NASA Technical Reports Server (NTRS)
Chen, In-Gann; Liu, Jianxiong; Weinstein, Roy; Lau, Kwong
1992-01-01
Spatial distributions of persistent magnetic field trapped by sintered and melt-textured ceramic-type high-temperature superconductor (HTS) samples have been studied. The trapped field can be reproduced by a model of the current consisting of two components: (1) a surface current Js and (2) a uniform volume current Jv. This Js + Jv model gives a satisfactory account of the spatial distribution of the magnetic field trapped by different types of HTS samples. The magnetic moment can be calculated, based on the Js + Jv model, and the result agrees well with that measured by standard vibrating sample magnetometer (VSM). As a consequence, Jc predicted by VSM methods agrees with Jc predicted from the Js + Jv model. The field mapping method described is also useful to reveal the granular structure of large HTS samples and regions of weak links.
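The magnetic moment implied by the Js + Jv decomposition above can be written down directly for a cylindrical sample; the sketch below does so numerically, with invented geometry and current magnitudes (the variable names are ours, not the authors').

    import numpy as np

    # Hedged sketch: magnetic moment of a cylindrical HTS sample (radius R,
    # height h) carrying a rim sheet current Js [A/m] plus a uniform
    # azimuthal volume current density Jv [A/m^2]. Numbers are invented.
    R, h = 0.010, 0.005        # m
    Js, Jv = 2.0e3, 1.0e7      # A/m, A/m^2

    m_surface = np.pi * R**2 * (Js * h)            # one loop of area pi R^2
    r = np.linspace(0.0, R, 10001)
    m_volume = h * np.trapz(np.pi * r**2 * Jv, r)  # nested loops: dm = pi r^2 Jv dr dz
    print(f"total moment m = {m_surface + m_volume:.4e} A m^2")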
Search for light gauge bosons of the dark sector at the Mainz Microtron.
Merkel, H; Achenbach, P; Ayerbe Gayoso, C; Bernauer, J C; Böhm, R; Bosnar, D; Debenjak, L; Denig, A; Distler, M O; Esser, A; Fonvieille, H; Friščić, I; Middleton, D G; Müller, U; Nungesser, L; Pochodzalla, J; Rohrbeck, M; Sánchez Majos, S; Schlimme, B S; Schoth, M; Sirca, S; Weinriefer, M
2011-06-24
A new exclusion limit for the electromagnetic production of a light U(1) gauge boson γ' decaying to e+e− was determined by the A1 Collaboration at the Mainz Microtron. Such light gauge bosons appear in several extensions of the standard model and are also discussed as candidates for the interaction of dark matter with standard model matter. In electron scattering from a heavy nucleus, the existing limits for a narrow state coupling to e+e− were reduced by nearly an order of magnitude in the range of the lepton pair mass of 210 MeV/c² ...
Taming Many-Parameter BSM Models with Bayesian Neural Networks
NASA Astrophysics Data System (ADS)
Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.
2017-09-01
The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.
Simple standard model extension by heavy charged scalar
NASA Astrophysics Data System (ADS)
Boos, E.; Volobuev, I.
2018-05-01
We consider a Standard Model (SM) extension by a heavy charged scalar gauged only under the U(1)_Y weak hypercharge gauge group. Such an extension, being gauge invariant with respect to the SM gauge group, is a simple special case of the well-known Zee model. Since the interactions of the charged scalar with the Standard Model fermions turn out to be significantly suppressed compared to the Standard Model interactions, the charged scalar provides an example of a long-lived charged particle that is interesting to search for at the LHC. We present the pair and single production cross sections of the charged scalar at different colliders and the possible decay widths for various boson masses. It is shown that the current ATLAS and CMS searches at 8 and 13 TeV collision energy lead to bounds on the scalar boson mass of about 300-320 GeV. The limits are expected to be much larger for higher collision energies and, assuming 15 ab⁻¹ of integrated luminosity, reach about 2.7 TeV at a future 27 TeV LHC, thus covering the most interesting mass region.
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize test sensitivity and specificity of a diagnostic test of interest. Because of the correlation between test sensitivity and specificity, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most of the scenarios, even when data were generated assuming the standard LMM. We also illustrated the methods using two real data sets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
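Both transformations named above have simple closed forms; a minimal sketch (the function names are ours):

    import math

    def arcsine_sqrt(x, n):
        """Arcsine square-root transform of a proportion x/n."""
        return math.asin(math.sqrt(x / n))

    def freeman_tukey_double_arcsine(x, n):
        """Freeman-Tukey double arcsine transform of x events out of n."""
        return 0.5 * (math.asin(math.sqrt(x / (n + 1.0)))
                      + math.asin(math.sqrt((x + 1.0) / (n + 1.0))))

    # Example: 45 true positives among 50 diseased subjects (sensitivity 0.90)
    print(arcsine_sqrt(45, 50))                  # ~1.249
    print(freeman_tukey_double_arcsine(45, 50))  # ~1.237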
Teacher Professional Development That Meets 21st Century Science Education Standards
NASA Astrophysics Data System (ADS)
van der Veen, Wil E.; Roelofsen Moody, T.
2011-01-01
The National Academies are working with several other groups to develop new National Science Education Standards, with the intention that they will be adopted by all states. It is critical that the science education community uses these new standards when planning teacher professional development and understands the potential implementation challenges. As a first step in developing these new standards, the National Research Council (NRC) recently published a draft Framework for Science Education. This framework describes the major scientific ideas and practices that all students should be familiar with by the end of high school. Following recommendations from the NRC Report "Taking Science to School" (NRC, 2007), it emphasizes the importance of integrating science practices with the learning of science content. These same recommendations influenced the recently revised New Jersey Science Education Standards. Thus, the revised New Jersey standards can be valuable as a case study for curriculum developers and professional development providers. While collaborating with the New Jersey Department of Education on the development of these revised science standards, we identified two critical needs for successful implementation. First, we found that many currently used science activities must be adapted to meet the revised standards and that new activities must be developed. Second, teacher professional development is needed to model the integration of science practices with the learning of science content. With support from the National Space Grant Foundation we developed a week-long Astronomy Institute, which was presented in the summers of 2009 and 2010. We will briefly describe our professional development model and how it helped teachers to bridge the gap between the standards and their current classroom practice. We will provide examples of astronomy activities that were either adapted or developed to meet the new standards. Finally, we will briefly discuss the evaluation results.
Executable Architecture of Net Enabled Operations: State Machine of Federated Nodes
2009-11-01
verbal descriptions from operators) of the current Command and Control (C2) practices into model form. In theory these should be Standard Operating Procedures (SOPs) that execute as a thread from start to finish. Because a large quantity of data will be required to ensure that the model reflects the true processes, the authors recommend that the state machine ...
The National Ambient Air Quality Standards for particulate matter (PM) and the federal regional haze regulations place some emphasis on the assessment of fine particle (PM2.5) concentrations. Current air quality models need to be improved and evaluated against observations to a...
The status of military specifications with regard to atmospheric turbulence
NASA Technical Reports Server (NTRS)
Moorhouse, David J.; Heffley, Robert K.
1987-01-01
The features of atmospheric disturbances that are significant to aircraft flying qualities are discussed. Next follows a survey of proposed models. Lastly, there is a discussion of the content and application of the model contained in the current flying qualities specification and the forthcoming MIL-Standard.
Health risk evaluation needs precise measurement and modeling of human exposures in microenvironments to support review of current air quality standards. The particulate matter emissions from motor vehicles are a major component of human exposures in urban microenvironments. Cu...
Economic analysis of tree improvement: A status report
George F. Dutrow
1974-01-01
Review of current literature establishes that most authors believe that tree improvement expands production, although some point out drawbacks and alternatives. Both softwood and hardwood improvement programs have been analyzed. The authors used various models, economic assumptions, and standards of measurement, but available data were limited. Future models should...
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
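For context, the ADC maps discussed above conventionally come from a mono-exponential signal model, S(b) = S0·exp(−b·ADC); a minimal two-point sketch with invented signal values:

    import numpy as np

    # Two-b-value ADC estimate: ADC = ln(S_b0 / S_b1) / (b1 - b0), b in s/mm^2.
    b0, b1 = 0.0, 1000.0
    s_b0 = np.array([1000.0, 980.0, 1020.0])  # invented signals at b = 0
    s_b1 = np.array([330.0, 360.0, 310.0])    # invented signals at b = 1000

    adc = np.log(s_b0 / s_b1) / (b1 - b0)     # units: mm^2/s
    print(adc)                                # ~1.1e-3 mm^2/s, water-like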
Volume 2: Compendium of Abstracts
2017-06-01
simulation work using a standard running model for legged systems, the Spring Loaded Inverted Pendulum (SLIP) Model. In this model, the dynamics of a single...bar SLIP model is analyzed using a basin-of-attraction analysis to determine the optimal configuration for running at different velocities and...acquisition, and the automatic target acquisition were then compared to each other. After running trials with the current system, it will be
Latent Class Analysis of Incomplete Data via an Entropy-Based Criterion
Larose, Chantal; Harel, Ofer; Kordas, Katarzyna; Dey, Dipak K.
2016-01-01
Latent class analysis is used to group categorical data into classes via a probability model. Model selection criteria then judge how well the model fits the data. When addressing incomplete data, the current methodology restricts the imputation to a single, pre-specified number of classes. We seek to develop an entropy-based model selection criterion that does not restrict the imputation to one number of clusters. Simulations show the new criterion performing well against the current standards of AIC and BIC, while a family studies application demonstrates how the criterion provides more detailed and useful results than AIC and BIC. PMID:27695391
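A minimal sketch of the entropy term such a criterion builds on, computed from the posterior class-membership matrix of a latent class model (the exact penalty used in the paper may differ; this is only the entropy ingredient):

    import numpy as np

    def classification_entropy(post):
        """Entropy of an n-by-K posterior class-probability matrix; zero when
        every observation is assigned to one class with certainty."""
        p = np.clip(post, 1e-12, 1.0)   # guard against log(0)
        return float(-np.sum(p * np.log(p)))

    # Invented posteriors for 3 observations and 2 latent classes
    post = np.array([[0.95, 0.05],
                     [0.50, 0.50],
                     [0.10, 0.90]])
    print(classification_entropy(post))  # larger => fuzzier classification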
Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization
NASA Astrophysics Data System (ADS)
Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad
2017-02-01
Malay Twitter messages deviate markedly from the original language. Malay Tweets are currently in wide use, especially in the Malay Archipelago, so it is important to build a normalization system that can translate Malay Tweet language into standard Malay. Research in natural language processing has mainly focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words, and interjections, into standard Malay. The research uses a language model and an n-gram model.
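A minimal sketch of the hybrid idea: a noisy-to-standard dictionary proposes candidates and a bigram language model picks the most probable sequence. The tiny dictionary and probabilities below are invented placeholders, not the authors' resources.

    # Hedged sketch of dictionary + bigram-LM normalization.
    from itertools import product

    NOISY2STD = {"x": ["tidak"], "nk": ["nak", "hendak"], "sy": ["saya"]}
    BIGRAM = {("saya", "nak"): 0.6, ("saya", "hendak"): 0.4,
              ("nak", "tidak"): 0.5, ("hendak", "tidak"): 0.5}

    def candidates(token):
        return NOISY2STD.get(token, [token])

    def normalize(tokens):
        """Pick the candidate sequence maximizing the bigram-probability product."""
        best, best_p = None, -1.0
        for seq in product(*(candidates(t) for t in tokens)):
            p = 1.0
            for a, b in zip(seq, seq[1:]):
                p *= BIGRAM.get((a, b), 1e-6)  # back off to a small floor
            if p > best_p:
                best, best_p = seq, p
        return " ".join(best)

    print(normalize("sy nk x".split()))  # -> "saya nak tidak"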
Single Top Production at Next-to-Leading Order in the Standard Model Effective Field Theory.
Zhang, Cen
2016-04-22
Single top production processes at hadron colliders provide information on the relation between the top quark and the electroweak sector of the standard model. We compute the next-to-leading order QCD corrections to the three main production channels: t-channel, s-channel, and tW associated production, in the standard model including operators up to dimension six. The calculation can be matched to parton shower programs and can therefore be directly used in experimental analyses. The QCD corrections are found to significantly impact the extraction of the current limits on the operators, because of both the improved accuracy and the better precision of the theoretical predictions. In addition, the distributions of some of the key discriminating observables are modified in a nontrivial way, which could change the interpretation of measurements in terms of UV complete models.
The computation of standard solar models
NASA Technical Reports Server (NTRS)
Ulrich, Roger K.; Cox, Arthur N.
1991-01-01
Procedures for calculating standard solar models with the usual simplifying approximations of spherical symmetry, no mixing except in the surface convection zone, no mass loss or gain during the solar lifetime, and no separation of elements by diffusion are described. The standard network of nuclear reactions among the light elements is discussed including rates, energy production and abundance changes. Several of the equation of state and opacity formulations required for the basic equations of mass, momentum and energy conservation are presented. The usual mixing-length convection theory is used for these results. Numerical procedures for calculating the solar evolution, and current evolution and oscillation frequency results for the present sun by some recent authors are given.
First direct constraints on Fierz interference in free-neutron β decay
NASA Astrophysics Data System (ADS)
Hickerson, K. P.; Sun, X.; Bagdasarova, Y.; Bravo-Berguño, D.; Broussard, L. J.; Brown, M. A.-P.; Carr, R.; Currie, S.; Ding, X.; Filippone, B. W.; García, A.; Geltenbort, P.; Hoagland, J.; Holley, A. T.; Hong, R.; Ito, T. M.; Knecht, A.; Liu, C.-Y.; Liu, J. L.; Makela, M.; Mammei, R. R.; Martin, J. W.; Melconian, D.; Mendenhall, M. P.; Moore, S. D.; Morris, C. L.; Pattie, R. W.; Pérez Galván, A.; Picker, R.; Pitt, M. L.; Plaster, B.; Ramsey, J. C.; Rios, R.; Saunders, A.; Seestrom, S. J.; Sharapov, E. I.; Sondheim, W. E.; Tatar, E.; Vogelaar, R. B.; VornDick, B.; Wrede, C.; Young, A. R.; Zeck, B. A.; UCNA Collaboration
2017-10-01
Precision measurements of free-neutron β decay have been used to precisely constrain our understanding of the weak interaction. However, the neutron Fierz interference term b_n, which is particularly sensitive to beyond-standard-model tensor currents at the TeV scale, has thus far eluded measurement. Here we report the first direct constraints on this term, finding b_n = 0.067 ± 0.005 (stat) +0.090/−0.061 (sys), consistent with the standard model. The uncertainty is dominated by absolute energy reconstruction and the linearity of the β spectrometer energy response.
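For reference, the Fierz term enters the unpolarized β-energy spectrum as a (1 + b_n·m_e/E_e) distortion of the allowed shape; schematically, in LaTeX:

    % Schematic allowed beta spectrum with a Fierz distortion:
    %   E_e = total electron energy, E_0 = endpoint, p_e = electron momentum.
    \frac{d\Gamma}{dE_e} \;\propto\; p_e\, E_e\, (E_0 - E_e)^2
        \left( 1 + b_n \frac{m_e}{E_e} \right)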
The early universe history from contraction-deformation of the Standard Model
NASA Astrophysics Data System (ADS)
Gromov, N. A.
2017-03-01
The evolution of elementary particles in the early Universe, from the Planck time up to several milliseconds, is presented. The developed theory is based on the high-temperature (high-energy) limit of the Standard Model, which is generated by the contractions of its gauge groups. At infinite temperature all particles lose their masses. Only massless neutral Z-bosons, massless quarks, neutrinos, and photons survive in this limit. The weak interactions become long-range and are mediated by neutral currents, and quarks have only one color degree of freedom.
Wang, Haiyin; Jin, Chunlin; Jiang, Qingwu
2017-11-20
Traditional Chinese medicine (TCM) is an important part of China's medical system. Because of the prolonged low price of TCM procedures and the lack of an effective mechanism for dynamic price adjustment, the development of TCM has markedly lagged behind Western medicine. The World Health Organization (WHO) has emphasized the need to enhance the development of alternative and traditional medicine when creating national health care systems. The establishment of scientific and appropriate mechanisms to adjust the price of medical procedures in TCM is crucial to promoting the development of TCM. This study examined the incorporation of value indicators (data on basic manpower expended, time spent, technical difficulty, and degree of risk) into the latest standards for the price of medical procedures in China, and it offers a price adjustment model with the relative price ratio as a key index. The study examined 144 TCM procedures and found that their prices were based mainly on the value of the medical care provided; on average, medical care provided accounted for 89% of the price. Current price levels were generally low: on average, the current price accounted for 56% of the standardized value of a procedure, and this share was markedly lower for acupuncture, moxibustion, special treatment with TCM, and comprehensive TCM procedures. A total of 79 procedures were selected and adjusted by priority, which significantly optimized the relationship between the price of TCM procedures and the suggested price (p < 0.01). This study suggests that adjustment of the price of medical procedures based on a standardized value parity model is a scientific and suitable method of price adjustment that can serve as a reference for other provinces and municipalities in China and for other countries and regions that rely mainly on fee-for-service (FFS) medical care.
The Slow Controls System of the New Muon g-2 Experiment at Fermilab
NASA Astrophysics Data System (ADS)
Eads, Michael; New Muon g-2 Collaboration
2015-04-01
The goal of the new muon g-2 experiment (E-989), currently under construction at Fermi National Accelerator Laboratory, is to measure the anomalous gyromagnetic ratio of the muon with unprecedented precision. The uncertainty goal of the experiment, 0.14 ppm, represents a four-fold improvement over the current best measurement of this value and has the potential to increase the current three standard deviation disagreement with the predicted standard model value to five standard deviations. Measuring the operating conditions of the experiment will be essential to achieving these uncertainty goals. This talk will describe the design and the current status of E-989's slow controls system. This system, based on the MIDAS Slow Control Bus, will be used to measure and record currents, voltages, temperatures, humidities, pressures, flows, and other data that are collected asynchronously with the injection of the muon beam. The system consists of a variety of sensors and front-end electronics that interface to back-end data acquisition, data storage, and data monitoring systems. Parts of the system are already operational and the full system will be completed before beam commissioning begins in 2017.
NASA Astrophysics Data System (ADS)
Joyce, C. J.; Tobiska, W. K.; Copeland, K.; Smart, D. F.; Shea, M. A.; Nowicki, S.; Atwell, W.; Benton, E. R.; Wilkins, R.; Hands, A.; Gronoff, G.; Meier, M. M.; Schwadron, N.
2017-12-01
Despite its potential for causing a wide range of harmful effects, including health hazards to airline passengers and damage to aircraft and satellite electronics, atmospheric radiation remains a relatively poorly defined risk, lacking sufficient measurements and modelling to fully evaluate the dangers posed. While our reliance on airline travel has increased dramatically over time, there remains an absence of international guidance and standards to protect aircraft passengers from potential health impacts due to radiation exposure. This subject has been gaining traction within the scientific community in recent years, with an expanding number of models with increasing capabilities being made available to evaluate atmospheric radiation hazards. We provide a general description of these modelling efforts, including the physics and methods used by the models, as well as their data inputs and outputs. We also discuss the current capacity for model validation via measurements and discuss the needs for the next generation of models, both in terms of their capabilities and the measurements required to validate them. This review of the status of atmospheric radiation modelling is part of a larger series of studies made as part of the SAFESKY program, with other efforts focusing on the underlying physics and implications, measurements and regulations/standards of atmospheric radiation.
Sensitivity of solar g-modes to varying G cosmologies
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Sills, Ken; Demarque, Pierre; Krauss, Lawrence M.
1995-01-01
The sensitivity of the solar g-mode oscillation spectrum to variability in the universal gravitational constant G is described. Solar models in varying G cosmologies were constructed by evolving a zero-age main-sequence stellar model to the Sun's current age, while allowing the value of G to change according to the power law G(t) ∝ t^(−β), where β ≈ δG/(GH) and H is the Hubble constant. All solar models were constrained to the observed luminosity and radius at the current age of the Sun by adjusting the helium abundance and the mixing-length parameter of the models in the usual way for standard stellar models. Low-l g-mode oscillation periods were calculated for each of the models and compared to the claimed observation of the solar g-mode oscillation spectrum by Hill & Gu (1990). If one accepts Hill & Gu's claims, then within the uncertainties of the physics of the solar model calculation, our models rule out all but δG/(GH) ≲ 0.05. In other words, we conclude that G could not have varied by more than 2% over the past 4.5 Gyr, the lifetime of the present-day Sun. This result lends independent support to the validity of the standard solar model.
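The quoted 2% figure follows from the power law itself; a minimal check, assuming a present cosmic age of 13.8 Gyr (the abstract does not state the value used):

    # G(t) ~ t^(-beta) with beta ~ delta G/(G H): fractional change of G
    # between the Sun's ZAMS epoch and today. Cosmic age is an assumption.
    t_now = 13.8e9           # yr, assumed current age of the Universe
    t_zams = t_now - 4.5e9   # yr, zero-age main sequence epoch of the Sun
    beta = 0.05              # the abstract's bound on delta G/(G H)

    change = (t_zams / t_now) ** beta - 1.0   # G(t_now)/G(t_zams) - 1
    print(f"|delta G / G| over 4.5 Gyr: {abs(change):.2%}")  # ~2%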
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
General squark flavour mixing: constraints, phenomenology and benchmarks
De Causmaecker, Karen; Fuks, Benjamin; Herrmann, Bjorn; ...
2015-11-19
Here, we present an extensive study of non-minimal flavour violation in the squark sector in the framework of the Minimal Supersymmetric Standard Model. We investigate the effects of multiple non-vanishing flavour-violating elements in the squark mass matrices by means of a Markov Chain Monte Carlo scanning technique and identify parameter combinations that are favoured by both current data and theoretical constraints. We then detail the resulting distributions of the flavour-conserving and flavour-violating model parameters. Based on this analysis, we propose a set of benchmark scenarios relevant for future studies of non-minimal flavour violation in the Minimal Supersymmetric Standard Model.
Engine Family Groups for Verification of Clean Diesel Technology
These documents show engine family boxes that represent groupings of engine families with similar characteristics (i.e., the emissions standards that the engines were built to) for current and past model years.
LHC benchmark scenarios for the real Higgs singlet extension of the standard model
Robens, Tania; Stefaniak, Tim
2016-05-13
Here, we present benchmark scenarios for searches for an additional Higgs state in the real Higgs singlet extension of the Standard Model in Run 2 of the LHC. The scenarios are selected such that they fulfill all relevant current theoretical and experimental constraints, but can potentially be discovered at the current LHC run. We take into account the results presented in earlier work and update the experimental constraints from relevant LHC Higgs searches and signal rate measurements. The benchmark scenarios are given separately for the low mass and high mass region, i.e. the mass range where the additional Higgs state is lighter or heavier than the discovered Higgs state at around 125 GeV. They have also been presented in the framework of the LHC Higgs Cross Section Working Group.
Final Report on Jobin Yvon Contained Inductively Coupled Plasma Emission Spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennebaker, F.M.
2003-03-17
A new Inductively Coupled Plasma -- Emission Spectrometer (ICP-ES) was recently purchased and installed in Lab B-147/151 at SRTC. The contained JY Model Ultima 170-C ICP-ES has been tested and compared to current ADS ICP-ES instrumentation. The testing has included both performance tests to evaluate instrumental ability, and the measurement of matrix standards commonly analyzed by ICP-ES at Savannah River. In developing operating procedures for this instrument, we have implemented the use of internal standards and off-peak background subtraction. Both of these techniques are recommended by EPA SW-846 ICP-ES methods and are common to current ICP-ES operations. Based on the testing and changes, the JY Model Ultima 170-C ICP-ES provides improved performance for elemental analysis of radioactive samples in the Analytical Development Section.
Meisamy, Sina; Hines, Catherine D G; Hamilton, Gavin; Sirlin, Claude B; McKenzie, Charles A; Yu, Huanzhou; Brittain, Jean H; Reeder, Scott B
2011-03-01
To prospectively compare an investigational version of a complex-based chemical shift-based fat fraction magnetic resonance (MR) imaging method with MR spectroscopy for the quantification of hepatic steatosis. This study was approved by the institutional review board and was HIPAA compliant. Written informed consent was obtained before all studies. Fifty-five patients (31 women, 24 men; age range, 24-71 years) were prospectively imaged at 1.5 T with quantitative MR imaging and single-voxel MR spectroscopy, each within a single breath hold. The effects of T2* correction, spectral modeling of fat, and magnitude fitting for eddy current correction on fat quantification with MR imaging were investigated by reconstructing fat fraction images from the same source data with different combinations of error correction. Single-voxel T2-corrected MR spectroscopy was used to measure fat fraction and served as the reference standard. All MR spectroscopy data were postprocessed at a separate institution by an MR physicist who was blinded to MR imaging results. Fat fractions measured with MR imaging and MR spectroscopy were compared statistically to determine the correlation (r2), and the slope and intercept as measures of agreement between MR imaging and MR spectroscopy fat fraction measurements, to determine whether MR imaging can help quantify fat, and examine the importance of T2* correction, spectral modeling of fat, and eddy current correction. Two-sided t tests (significance level, P = .05) were used to determine whether estimated slopes and intercepts were significantly different from 1.0 and 0.0, respectively. Sensitivity and specificity for the classification of clinically significant steatosis were evaluated. Overall, there was excellent correlation between MR imaging and MR spectroscopy for all reconstruction combinations. However, agreement was only achieved when T2* correction, spectral modeling of fat, and magnitude fitting for eddy current correction were used (r2 = 0.99; slope ± standard deviation = 1.00 ± 0.01, P = .77; intercept ± standard deviation = 0.2% ± 0.1, P = .19). T1-independent chemical shift-based water-fat separation MR imaging methods can accurately quantify fat over the entire liver, by using MR spectroscopy as the reference standard, when T2* correction, spectral modeling of fat, and eddy current correction methods are used. © RSNA, 2011.
ERIC Educational Resources Information Center
Kelava, Augustin; Nagengast, Benjamin
2012-01-01
Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent…
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K.T.
2016-01-01
In developed countries, legionellae are among the most important water-based bacterial pathogens, with infections typically resulting from management failures of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae should be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. PMID:26928563
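Quantitative risk assessment models of the kind criticized above typically combine an exposure estimate with a dose-response function; a minimal sketch using the common exponential form (the parameter value is illustrative only, not Legionella-specific guidance):

    import math

    def p_infection_exponential(dose_cfu, r=6.0e-2):
        """Exponential dose-response model: P = 1 - exp(-r * dose).
        The r value here is an illustrative placeholder."""
        return 1.0 - math.exp(-r * dose_cfu)

    # Illustrative inhaled doses spanning four orders of magnitude
    for dose in (0.1, 1.0, 10.0, 100.0):
        p = p_infection_exponential(dose)
        print(f"dose = {dose:6.1f} CFU -> P(infection) = {p:.4f}")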
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
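A minimal sketch of the record/variable-pruning idea argued for above: fit an ordinary least-squares effort model and greedily drop the multiplier whose removal most improves leave-one-out error. The toy data and procedure are ours, not the paper's actual method.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented toy data: log-effort driven by 2 informative multipliers of 5.
    n, k = 40, 5
    X = rng.normal(size=(n, k))
    y = 1.0 + 0.9 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(scale=0.3, size=n)

    def loo_rmse(X, y):
        """Leave-one-out RMSE of an ordinary least-squares fit."""
        errs = []
        for i in range(len(y)):
            m = np.ones(len(y), dtype=bool)
            m[i] = False
            A = np.column_stack([np.ones(m.sum()), X[m]])
            coef, *_ = np.linalg.lstsq(A, y[m], rcond=None)
            errs.append(y[i] - np.concatenate(([1.0], X[i])) @ coef)
        return float(np.sqrt(np.mean(np.square(errs))))

    cols = list(range(k))
    while len(cols) > 1:
        base = loo_rmse(X[:, cols], y)
        trial = [loo_rmse(X[:, [c for c in cols if c != d]], y) for d in cols]
        best = int(np.argmin(trial))
        if trial[best] >= base:
            break                    # no deletion helps; stop pruning
        cols.pop(best)
    print("retained multipliers:", cols)  # toy run keeps the informative columns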
Dark forces coupled to nonconserved currents
NASA Astrophysics Data System (ADS)
Dror, Jeff A.; Lasenby, Robert; Pospelov, Maxim
2017-10-01
New light vectors with dimension-4 couplings to Standard Model states have (energy/vector mass)²-enhanced production rates unless the current they couple to is conserved. These processes allow us to derive new constraints on the couplings of such vectors that are significantly stronger than those in the previous literature for a wide variety of models. Examples include vectors with axial couplings to quarks and vectors coupled to currents (such as baryon number) that are only broken by the chiral anomaly. Our new limits arise from a range of processes, including rare Z decays and flavor-changing meson decays, and rule out a number of phenomenologically motivated proposals.
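Schematically, the enhancement above comes from the longitudinal mode of a massive vector X coupled to a nonconserved current; in LaTeX:

    % Longitudinal-mode growth for a vector X with coupling g_X to a
    % nonconserved current (schematic):
    \mathcal{M}\,(\text{SM process} + X_L) \;\sim\; g_X\, \frac{E}{m_X},
    \qquad
    \sigma \;\propto\; g_X^2 \left( \frac{E}{m_X} \right)^{2}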
NASA Astrophysics Data System (ADS)
Testi, D.; Schito, E.; Menchetti, E.; Grassi, W.
2014-11-01
Constructions built in Italy before 1945 (about 30% of the total built stock) feature low energy efficiency. Retrofit actions in this field can lead to valuable energetic and economic savings. In this work, we ran a dynamic simulation of a historical building of the University of Pisa during the heating season. We firstly evaluated the energy requirements of the building and the performance of the existing natural gas boiler, validated with past billings of natural gas. We also verified the energetic savings obtainable by the substitution of the boiler with an air-to-water electrically-driven modulating heat pump, simulated through a cycle-based model, evaluating the main economic metrics. The cycle-based model of the heat pump, validated with manufacturers' data available only at specified temperature and load conditions, can provide more accurate results than the simplified models adopted by current technical standards, thus increasing the effectiveness of energy audits.
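A minimal sketch of why a cycle- or map-based model can beat a fixed seasonal rating: COP varies with source temperature and part load. The second-law efficiency and part-load coefficients below are invented placeholders, not the paper's validated cycle model.

    # Hedged sketch: COP of an air-to-water modulating heat pump as a function
    # of source/sink temperature and part-load ratio. Coefficients are invented.

    def cop(t_source_c, t_sink_c, part_load, eta2=0.42, c_pl=0.9):
        """Carnot-based COP with a simple part-load degradation term."""
        t_sink = t_sink_c + 273.15
        carnot = t_sink / (t_sink - (t_source_c + 273.15))
        pl_factor = part_load / (c_pl * part_load + (1.0 - c_pl))  # = 1 at full load
        return eta2 * carnot * pl_factor

    # Mild vs cold outdoor air, 45 C supply water, half load
    print(cop(7.0, 45.0, 0.5))    # milder air -> higher COP (~3.2)
    print(cop(-7.0, 45.0, 0.5))   # colder air -> lower COP (~2.3)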
NASA Astrophysics Data System (ADS)
Nickles, Cassandra; Goodman, Matthew; Saez, Jose; Issakhanian, Emin
2016-11-01
California's current drought has renewed public interest in recycled water from Water Reclamation Plants (WRPs). It is critical that the recycled water meets public health standards. This project consists of simulating the transport of an instantaneous conservative tracer through the WRP chlorine contact tanks. Local recycled water regulations stipulate a minimum 90-minute modal contact time during disinfection at peak dry weather design flow. In-situ testing is extremely difficult given flowrate dependence on real world sewage line supply and recycled water demand. Given as-built drawings and operation parameters, the chlorine contact tanks are modeled to simulate extreme situations, which may not meet regulatory standards. The turbulent flow solutions are used as the basis to model the transport of a turbulently diffusing conservative tracer added instantaneously to the inlet of the reactors. This tracer simulates the transport through advection and dispersion of chlorine in the WRPs. Previous work validated the models against experimental data. The current work shows the predictive value of the simulations.
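The 90-minute rule above refers to the modal contact time, i.e. the time at which the outlet concentration of a pulse-injected tracer peaks; a minimal sketch of extracting it from simulated or measured residence-time data (the arrays are synthetic):

    import numpy as np

    # Modal contact time = time of peak outlet concentration after an
    # instantaneous (pulse) tracer injection. Data below are invented.
    t_min = np.linspace(0.0, 240.0, 481)                  # minutes
    c_out = np.exp(-0.5 * ((t_min - 105.0) / 20.0) ** 2)  # synthetic outlet pulse

    t_modal = t_min[np.argmax(c_out)]
    print(f"modal contact time = {t_modal:.1f} min,",
          "meets 90-min rule" if t_modal >= 90.0 else "fails 90-min rule")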
Scialla, Michele A; Canter, Kimberly S; Chen, Fang Fang; Kolb, E Anders; Sandler, Eric; Wiener, Lori; Kazak, Anne E
2018-03-01
With published evidence-based Standards for Psychosocial Care for Children with Cancer and their Families, it is important to know the current status of their implementation. This paper presents data on delivery of psychosocial care related to the Standards in the United States. Pediatric oncologists, psychosocial leaders, and administrators in pediatric oncology from 144 programs completed an online survey. Participants reported on the extent to which psychosocial care consistent with the Standards was implemented and was comprehensive and state of the art. They also reported on specific practices and services for each Standard and the extent to which psychosocial care was integrated into broader medical care. Participants indicated that psychosocial care consistent with the Standards was usually or always provided at their center for most of the Standards. However, only half of the oncologists (55.6%) and psychosocial leaders (45.6%) agreed or strongly agreed that their psychosocial care was comprehensive and state of the art. Types of psychosocial care provided included evidence-based and less established approaches but were most often provided when problems were identified, rather than proactively. The perception of state of the art care was associated with practices indicative of integrated psychosocial care and the extent to which the Standards are currently implemented. Many oncologists and psychosocial leaders perceive that the delivery of psychosocial care at their center is consistent with the Standards. However, care is quite variable, with evidence for the value of more integrated models of psychosocial services. © 2017 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Charbonneau-Gowdy, Paula
2015-01-01
Current debates on quality standards in education often look to the levels of an increasingly diverse array of literacies as a measure of that standard. At the same time, while mobile technologies are profoundly changing the way we live, communicate and learn in our everyday lives, relatively little seems to be known about their potential to…
Malone, Matthew; Goeres, Darla M; Gosbell, Iain; Vickery, Karen; Jensen, Slade; Stoodley, Paul
2017-02-01
The concept of biofilms in human health and disease is now widely accepted as a cause of chronic infection. Typically, biofilms show remarkable tolerance to many forms of treatment and to the host immune response. This has led to a vast increase in research to identify new (and sometimes old) anti-biofilm strategies that demonstrate effectiveness against these tolerant phenotypes. Areas covered: Unfortunately, a standardized methodological approach to biofilm models has not been adopted, leading to a large disparity between testing conditions. This has made it almost impossible to compare data across multiple laboratories, leaving large gaps in the evidence. Furthermore, many biofilm models testing anti-biofilm strategies aimed at the medical arena have not considered the matter of relevance to an intended application. This may explain why some in vitro models based on methodological designs that do not consider relevance to an intended application fail when applied in vivo at the clinical level. Expert commentary: This review will explore the issues that need to be considered in developing performance standards for anti-biofilm therapeutics and provide a rationale for the need to standardize models/methods that are clinically relevant. We also provide some rationale as to why no standards currently exist.
The Economic Impact of the President’s 2013 Budget
2012-04-01
and capital. According to the Solow-type model, people base their decisions about working and saving primarily on current economic... model developed by Robert Solow. CBO's life-cycle growth model is an overlapping-generations general-equilibrium model that is based on a standard... services produced in a given period by the labor and capital supplied by the country's residents, regardless of where the labor
Gordon, Sarah; Daneshian, Mardas; Bouwstra, Joke; Caloni, Francesca; Constant, Samuel; Davies, Donna E; Dandekar, Gudrun; Guzman, Carlos A; Fabian, Eric; Haltner, Eleonore; Hartung, Thomas; Hasiwa, Nina; Hayden, Patrick; Kandarova, Helena; Khare, Sangeeta; Krug, Harald F; Kneuer, Carsten; Leist, Marcel; Lian, Guoping; Marx, Uwe; Metzger, Marco; Ott, Katharina; Prieto, Pilar; Roberts, Michael S; Roggen, Erwin L; Tralau, Tewes; van den Braak, Claudia; Walles, Heike; Lehr, Claus-Michael
2015-01-01
Models of the outer epithelia of the human body - namely the skin, the intestine and the lung - have found valid applications in both research and industrial settings as attractive alternatives to animal testing. A variety of approaches to model these barriers are currently employed in such fields, ranging from the utilization of ex vivo tissue to reconstructed in vitro models, and further to chip-based technologies, synthetic membrane systems and, of increasing current interest, in silico modeling approaches. An international group of experts in the field of epithelial barriers was convened from academia, industry and regulatory bodies to present both the current state of the art of non-animal models of the skin, intestinal and pulmonary barriers in their various fields of application, and to discuss research-based, industry-driven and regulatory-relevant future directions for both the development of new models and the refinement of existing test methods. Issues of model relevance and preference, validation and standardization, acceptance, and the need for simplicity versus complexity were focal themes of the discussions. The outcomes of workshop presentations and discussions, in relation to both current status and future directions in the utilization and development of epithelial barrier models, are presented by the attending experts in the current report.
Lim, Maria A; Louie, Brenton; Ford, Daniel; Heath, Kyle; Cha, Paulyn; Betts-Lacroix, Joe; Lum, Pek Yee; Robertson, Timothy L; Schaevitz, Laura
2017-01-01
Despite a broad spectrum of anti-arthritic drugs currently on the market, there is a constant demand to develop improved therapeutic agents. Efficient compound screening and rapid evaluation of treatment efficacy in animal models of rheumatoid arthritis (RA) can accelerate the development of clinical candidates. Compound screening by evaluation of disease phenotypes in animal models facilitates preclinical research by enhancing understanding of human pathophysiology; however, there is still a continuous need to improve methods for evaluating disease. Current clinical assessment methods are challenged by the subjective nature of scoring-based methods, time-consuming longitudinal experiments, and the requirement for better functional readouts with relevance to human disease. To address these needs, we developed a low-touch, digital platform for phenotyping preclinical rodent models of disease. As a proof of concept, we utilized the rat collagen-induced arthritis (CIA) model of RA and developed the Digital Arthritis Index (DAI), an objective and automated behavioral metric that does not require human-animal interaction during the measurement and calculation of disease parameters. The DAI detected the development of arthritis similarly to standard in vivo methods, including ankle joint measurements and arthritis scores, and demonstrated a positive correlation to ankle joint histopathology. The DAI also determined responses to multiple standard-of-care (SOC) treatments and nine repurposed compounds predicted by the SMarTR(TM) Engine to have varying degrees of impact on RA. The disease profiles generated by the DAI complemented those generated by standard methods. The DAI is a highly reproducible and automated approach that can be used in conjunction with standard methods for detecting RA disease progression and conducting phenotypic drug screens.
Rethinking Teacher Evaluation: A Conversation about Statistical Inferences and Value-Added Models
ERIC Educational Resources Information Center
Callister Everson, Kimberlee; Feinauer, Erika; Sudweeks, Richard R.
2013-01-01
In this article, the authors provide a methodological critique of the current standard of value-added modeling forwarded in educational policy contexts as a means of measuring teacher effectiveness. Conventional value-added estimates of teacher quality are attempts to determine to what degree a teacher would theoretically contribute, on average,…
Going with the Flow: Challenging Students to Make Assumptions
ERIC Educational Resources Information Center
Felton, Mathew D.; Anhalt, Cynthia O.; Cortez, Ricardo
2015-01-01
Many current and future teachers have little experience with modeling and how to integrate it into their teaching. However, with the introduction of the Common Core State Standards for Mathematics (CCSSM) and its emphasis on mathematical modeling in all grades (CCSSI 2010), this integration has become paramount. Therefore, middle-grades teachers…
A Comparison of Exposure Control Procedures in CATS Using the GPC Model
ERIC Educational Resources Information Center
Leroux, Audrey J.; Dodd, Barbara G.
2016-01-01
The current study compares the progressive-restricted standard error (PR-SE) exposure control method with the Sympson-Hetter, randomesque, and no exposure control (maximum information) procedures using the generalized partial credit model with fixed- and variable-length CATs and two item pools. The PR-SE method administered the entire item pool…
Large Dataset of Acute Oral Toxicity Data Created for Testing in Silico Models (ASCCT meeting)
Acute toxicity data are a common requirement for substance registration in the US. Currently only data derived from animal tests are accepted by regulatory agencies, and the standard in vivo tests use lethality as the endpoint. Non-animal alternatives such as in silico models are ...
USDA-ARS?s Scientific Manuscript database
Effects of hydraulic redistribution (HR) on hydrological, biogeochemical, and ecological processes have been demonstrated in the field, but the current generation of standard earth system models does not include a representation of HR. Though recent studies have examined the effect of incorporating ...
Staying the Course: A Model Leadership Preparation Program that Goes the Distance
ERIC Educational Resources Information Center
Griffin, Leslie L.; Taylor, Thomas R.; Varner, Lynn W.; White, Carole L.
2012-01-01
The Educational Leadership Master's Cohort Program at Delta State University was redesigned in 2011 to reflect current industry standards and streamline the program. Since its inception in the mid-1990s, the program has provided a nationally acclaimed model for principal preparation. The recent redesign focused on preparing school principals to…
Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory
ERIC Educational Resources Information Center
Wynne, Heather Marie
2014-01-01
Achievement goal theory, and thus the empirical measures stemming from the research, are currently divided between two conceptual approaches, namely the reason- versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…
Robust model comparison disfavors power law cosmology
NASA Astrophysics Data System (ADS)
Shafer, Daniel L.
2015-05-01
Late-time power law expansion has been proposed as an alternative to the standard cosmological model and shown to be consistent with some low-redshift data. We test power law expansion against the standard flat ΛCDM cosmology using goodness-of-fit and model comparison criteria. We consider type Ia supernova (SN Ia) data from two current compilations (JLA and Union2.1) along with a current set of baryon acoustic oscillation (BAO) measurements that includes the high-redshift Lyman-α forest measurements from BOSS quasars. We find that neither power law expansion nor ΛCDM is strongly preferred over the other when the SN Ia and BAO data are analyzed separately, but that power law expansion is strongly disfavored by the combination. We treat the R_h = ct cosmology (a constant rate of expansion) separately and find that it is conclusively disfavored by all combinations of data that include SN Ia observations, and that it provides a poor overall fit when systematic errors in the SN Ia measurements are ignored, despite a recent claim to the contrary. We discuss this claim and some concerns regarding hidden model dependence in the SN Ia data.
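As a rough illustration of this kind of model comparison (not the paper's actual analysis or data), the sketch below fits a flat ΛCDM model and a power-law expansion model to made-up supernova-like distance moduli and compares them by χ² and AIC; H0, the redshifts, and the data values are all assumptions for illustration.

```python
# A minimal sketch: compare flat LambdaCDM with power-law expansion a(t) ~ t^n
# against hypothetical SN-like distance data using chi-square and AIC.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

C_KM_S = 299792.458  # speed of light [km/s]
H0 = 70.0            # assumed Hubble constant [km/s/Mpc]

def dl_lcdm(z, om):
    """Luminosity distance [Mpc] in flat LambdaCDM."""
    integral, _ = quad(lambda zp: 1.0 / np.sqrt(om * (1 + zp)**3 + 1 - om), 0, z)
    return (1 + z) * C_KM_S / H0 * integral

def dl_powerlaw(z, n):
    """Luminosity distance [Mpc] for a(t) ~ t^n, i.e. E(z) = (1+z)^(1/n)."""
    integral, _ = quad(lambda zp: (1 + zp)**(-1.0 / n), 0, z)
    return (1 + z) * C_KM_S / H0 * integral

def mu(dl):
    return 5 * np.log10(dl) + 25  # distance modulus for d_L in Mpc

# Hypothetical data: redshifts, distance moduli, errors (illustrative only).
z_obs = np.array([0.05, 0.2, 0.5, 0.8, 1.1])
mu_obs = np.array([36.8, 40.0, 42.3, 43.5, 44.3])
sigma = np.full_like(mu_obs, 0.15)

def chi2(model, param):
    pred = np.array([mu(model(z, param)) for z in z_obs])
    return np.sum(((mu_obs - pred) / sigma)**2)

fit_lcdm = minimize_scalar(lambda om: chi2(dl_lcdm, om), bounds=(0.05, 0.95), method="bounded")
fit_pl = minimize_scalar(lambda n: chi2(dl_powerlaw, n), bounds=(0.5, 3.0), method="bounded")

# AIC = chi2_min + 2k; both models have one free parameter here (k = 1).
for name, fit in [("LambdaCDM", fit_lcdm), ("power law", fit_pl)]:
    print(f"{name}: best-fit param = {fit.x:.3f}, chi2 = {fit.fun:.2f}, AIC = {fit.fun + 2:.2f}")
```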
Elementary particles in the early Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gromov, N.A., E-mail: gromov@dm.komisc.ru
The high-temperature limit of the Standard Model generated by the contractions of gauge groups is discussed. Contraction parameters of the gauge group SU(2) of the Electroweak Model and the gauge group SU(3) of Quantum Chromodynamics are taken identical and tend to zero as the temperature increases. Properties of the elementary particles change drastically in the infinite-temperature limit: all particles lose their masses, and all quarks are monochromatic. Electroweak interactions become long-range and are mediated by neutral currents. Particles of different kinds do not interact; it looks like a stratification with only one sort of particle in each stratum. The Standard Model passes through several stages in this limit, which are distinguished by the powers of the contraction parameter. For each stage, intermediate models are constructed and the exact expressions for the respective Lagrangians are presented. The developed approach describes the evolution of the Standard Model in the early Universe from the Big Bang up to the end of several nanoseconds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenger, Andreas
2009-01-01
The study of processes involving flavour-changing neutral currents provides a particularly promising probe for New Physics beyond the Standard Model of particle physics. These processes are forbidden at tree level and proceed through loop processes, which are strongly suppressed in the Standard Model. Cross-sections for these processes can be significantly enhanced by contributions from new particles, as proposed in most extensions of the Standard Model. This thesis presents searches for two flavour-changing neutral current decays, B± → K±μ+μ− and B0_d → K*μ+μ−. The analysis was performed on 4.1 fb⁻¹ of data collected by the DØ detector in Run II of the Fermilab Tevatron. Candidate events for the decay B± → K±μ+μ− were selected using a multi-variate analysis technique and the number of signal events determined by a fit to the invariant mass spectrum. Normalising to the known branching fraction for B± → J/ψK±, a branching fraction of B(B± → K±μ+μ−) = (6.45 ± 2.24 (stat) ± 1.19 (syst)) × 10⁻⁷ was measured. The branching fraction for the decay B0_d → K*μ+μ− was determined in a similar way. Normalising to the known branching fraction for B0_d → J/ψK*, a branching fraction of B(B0_d → K*μ+μ−) = (11.15 ± 3.05 (stat) ± 1.94 (syst)) × 10⁻⁷ was measured. All measurements are in agreement with the Standard Model.
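As a minimal sketch of the normalization procedure described above (not the thesis' actual yields or efficiencies), the following shows how a rare-decay branching fraction follows from fitted signal and reference yields, an efficiency ratio, and the reference branching fractions; all numbers below are hypothetical or merely PDG-like.

```python
# A minimal sketch of normalizing a rare-decay branching fraction to a
# reference channel. All yields and efficiencies are made up; only the
# formula structure is the point.
n_sig, n_ref = 120.0, 9500.0       # fitted signal / reference yields (made up)
eff_sig, eff_ref = 0.021, 0.019    # selection efficiencies (made up)
br_ref = 1.0e-3                    # assumed B(B+- -> J/psi K+-) (illustrative)
br_jpsi_mumu = 5.96e-2             # B(J/psi -> mu+ mu-) (PDG-like value)

br_sig = (n_sig / n_ref) * (eff_ref / eff_sig) * br_ref * br_jpsi_mumu
print(f"B(B -> K mu mu) ~ {br_sig:.2e}")
```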
NASA Astrophysics Data System (ADS)
Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan
2018-03-01
Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining SNI (Indonesian National Standard) certification. This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for both product and process. In this research, a readiness-assessment model for obtaining SNI certification was designed for SMEs. The stages of model development follow the innovation approach of Roger (2003). Variables that affect the readiness of SMEs are obtained from the product certification requirements established by BSN (the National Standardization Agency) and LSPro (the certification body). The model is used to map the readiness of SMEs' products for SNI certification, with the level of readiness determined by the percentage of compliance with those requirements. Based on the results of this study, five variables were determined to be the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.
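A minimal sketch of the readiness-mapping idea follows, assuming hypothetical requirement checklists and an invented mapping from compliance percentage to readiness level; the study's actual five variables and thresholds are not given in this abstract.

```python
# A minimal sketch: score an SME's compliance against certification
# requirements grouped into variables, then report the percentage fulfilled.
# Variable names, checks, and level thresholds are hypothetical.
requirements = {
    "product quality": [True, True, False, True],
    "production process": [True, False, False],
    "quality management": [True, True],
    "facilities": [False, True, True],
    "documentation": [True, True, True, False],
}

for variable, checks in requirements.items():
    pct = 100.0 * sum(checks) / len(checks)
    print(f"{variable:20s}: {pct:5.1f}% of requirements met")

overall = 100.0 * sum(sum(c) for c in requirements.values()) / sum(len(c) for c in requirements.values())
label = ("very weak", "weak", "medium", "ready", "fully ready")[min(int(overall // 20), 4)]
print(f"overall readiness: {overall:.1f}% -> {label}")
```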
Modeling the Ocean Tide for Tidal Power Generation Applications
NASA Astrophysics Data System (ADS)
Kawase, M.; Gedney, M.
2014-12-01
Recent years have seen renewed interest in the ocean tide as a source of energy for electrical power generation. Unlike in the 1960s, when the tidal barrage was the predominant method of power extraction considered and implemented, the current methodology favors operation of a free-stream turbine, or an array of them, in strong tidal currents. As tidal power generation moves from pilot-scale projects to actual array implementations, numerical modeling of tidal currents is expected to play an increasing role in site selection, resource assessment, array design, and environmental impact assessment. In this presentation, a simple, coupled ocean/estuary model designed for research into fundamental aspects of tidal power generation is described. The model consists of a Pacific Ocean-size rectangular basin and a connected fjord-like embayment with dimensions similar to those of Puget Sound, Washington, one of the potential power generation sites in the United States. The model is forced by an idealized lunar tide-generating potential. The study focuses on the energetics of a tidal system including tidal power extraction at both global and regional scales. The hyperbolic nature of the governing shallow water equations means that the consequences of tidal power extraction cannot be confined to local waters but are global in extent. Modeling power extraction with a regional model with standard boundary conditions introduces uncertainties of 3-25% in the power extraction estimate, depending on the level of extraction. Power extraction in the model has a well-defined maximum (~800 MW in a standard case) that is in agreement with previous theoretical studies. Natural energy dissipation and tidal power extraction strongly interact; for a turbine array of a given capacity, the higher the level of natural dissipation, the lower the power the array can extract. Conversely, power extraction leads to a decrease in the level of natural dissipation as well as in the tidal range and the current speed. In the standard case considered, at maximum power extraction the tidal range in the estuary is reduced by 37% and the natural dissipation by 78% from the unperturbed state. Thus, environmental consequences of power generation are likely to become the limiting factor on the scale of resource development before the physical maximum is reached.
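The existence of a well-defined extraction maximum, and the interaction with natural dissipation, can be seen in a toy quadratic-drag channel model (not the authors' coupled ocean/estuary model): with forcing F balanced by natural drag k_n·u² plus turbine drag k_t·u², the extracted power k_t·u³ peaks at k_t = 2·k_n, and extraction reduces the natural dissipation k_n·u³.

```python
# A toy quadratic-drag channel model illustrating the extraction maximum
# and the reduction of natural dissipation. All quantities are in
# arbitrary units; this is not the presentation's numerical model.
import numpy as np

F, k_n = 1.0, 1.0                       # forcing and natural drag coefficient
k_t = np.linspace(0.0, 10.0, 201)       # turbine drag coefficient (swept)
u = np.sqrt(F / (k_n + k_t))            # steady flow speed from drag balance
p_turbine = k_t * u**3                  # extracted power
p_natural = k_n * u**3                  # natural dissipation

i = np.argmax(p_turbine)
print(f"max extraction at k_t ~ {k_t[i]:.2f} (theory: k_t = 2*k_n = {2 * k_n})")
print(f"natural dissipation there: {p_natural[i]:.3f} vs {p_natural[0]:.3f} unperturbed")
```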
Paraboloid magnetospheric magnetic field model and the status of the model as an ISO standard
NASA Astrophysics Data System (ADS)
Alexeev, I.
A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from a large magnetospheric database, but its applicability is limited mainly to quiet conditions in the solar wind along the Earth's orbit. Contrary to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. This is the reason why the paraboloid magnetospheric model is constructed with a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of solar wind and IMF parameters. This approach is based on a priori information about the structure of the global magnetospheric current systems; each current system is included as a separate block (module) in the magnetospheric model. As shown by spacecraft magnetometer data, three current systems are the main contributors to the external magnetospheric magnetic field: magnetopause currents, the ring current, and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation for each of these large-scale current systems in the magnetosphere…
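A minimal sketch of the modular idea follows: the external field is the sum of independent current-system modules, each with its own driving parameters. The formulas below are schematic placeholders invented for illustration, not the paraboloid model's actual analytical solutions of the Laplace equation.

```python
# Schematic modular superposition of external-field contributions [nT].
def b_magnetopause(standoff_re=10.0):
    """Schematic magnetopause-current term, scaled by standoff distance."""
    return 25.0 * (10.0 / standoff_re)**3

def b_ring_current(r_re, dst=-50.0):
    """Schematic ring-current term, scaled by a Dst-like index."""
    return dst * (3.0 / max(r_re, 3.0))**3

def b_tail(r_re, flux_scale=1.0):
    """Schematic tail-current-sheet term."""
    return -15.0 * flux_scale / (1.0 + (r_re / 10.0)**2)

r = 6.6  # geocentric distance [Earth radii], roughly geosynchronous orbit
total = b_magnetopause() + b_ring_current(r) + b_tail(r)
print(f"schematic external field at {r} RE: {total:.1f} nT")
```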
Letter-case information and the identification of brand names.
Perea, Manuel; Jiménez, María; Talero, Fernanda; López-Cañada, Soraya
2015-02-01
A central tenet of most current models of visual-word recognition is that lexical units are activated on the basis of case-invariant abstract letter representations. Here, we examined this assumption by using a unique type of words: brand names. The rationale of the experiments is that brand names are archetypically printed either in lowercase (e.g., adidas) or uppercase (e.g., IKEA). This allows us to present the brand names in their standard or non-standard case configuration (e.g., adidas, IKEA vs. ADIDAS, ikea, respectively). We conducted two experiments with a brand-decision task ('is it a brand name?'): a single-presentation experiment and a masked priming experiment. Results in the single-presentation experiment revealed faster identification times for brand names in their standard case configuration than in their non-standard case configuration (i.e., adidas faster than ADIDAS; IKEA faster than ikea). In the masked priming experiment, we found faster identification times for brand names when they were preceded by an identity prime that matched their standard case configuration than when it did not (i.e., faster response times to adidas-adidas than to ADIDAS-adidas). Taken together, the present findings strongly suggest that letter-case information forms part of a brand name's graphemic information, thus posing some limits on current models of visual-word recognition.
Accounting principles, revenue recognition, and the profitability of pharmacy benefit managers.
McLean, Robert A; Garis, Robert I
2005-03-01
To contrast pharmacy benefit management (PBM) companies' measured profitability under two accounting standards: the standard that PBMs are currently allowed to employ under Generally Accepted Accounting Principles (GAAP), and a second standard, seemingly more congruent with the PBM business model, that treats the PBM as an agent of the plan sponsor. Sources: Financial Accounting Standards Board (FASB) Emerging Issues Task Force Issue 99-19, U.S. Securities and Exchange Commission 10-K filings, and the financial accounting literature. Under GAAP record keeping, PBM industry profitability appears modest: using currently applied GAAP, the PBM treats all payment from the plan sponsor as revenue and all payment to the pharmacy as cost of revenue. However, the PBM functions, in practice, as an entity that passes through money collected from one party (the sponsor) to other parties (dispensing pharmacies). It would therefore seem that the nature of PBM cash flows would be more accurately recorded on a pass-through basis. When the PBM is evaluated using an accounting method that recognizes the pass-through nature of its business, the PBM profit margin increases dramatically. Current GAAP standards make traditional financial statement analysis of PBMs unrevealing, and may hide genuinely outstanding financial performance. Investors, regulators, pharmacies, and the FASB all have an interest in moving to clarify this accounting anomaly.
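A minimal arithmetic sketch of the revenue-recognition point, with made-up numbers: the same dollar profit yields a very different margin depending on whether the full sponsor payment or only the retained spread is booked as revenue.

```python
# Gross (current GAAP) treatment books the sponsor payment as revenue;
# pass-through treatment books only the retained spread. Numbers are
# hypothetical, in $M.
sponsor_payment = 1_000.0   # paid by plan sponsor
pharmacy_payment = 960.0    # passed through to dispensing pharmacies
operating_cost = 20.0       # PBM's own costs

profit = sponsor_payment - pharmacy_payment - operating_cost
margin_gross = profit / sponsor_payment                             # revenue = full payment
margin_passthrough = profit / (sponsor_payment - pharmacy_payment)  # revenue = retained spread

print(f"profit = ${profit:.0f}M")
print(f"margin under gross accounting: {margin_gross:.1%}")
print(f"margin under pass-through accounting: {margin_passthrough:.1%}")
```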
NASA Astrophysics Data System (ADS)
Dumitrache, P.; Goanţă, A. M.
2017-08-01
The ability of a cab to ensure operator protection under the shock loading that occurs when a machine rolls over, or when the cab is struck by falling objects, is one of the most important performance criteria that machines and mobile equipment must satisfy. The experimental method provides the most accurate information on the behaviour of protective structures, but generates high costs due to the experimental installations and the structures that may be destroyed during the experiments. In these circumstances, numerical simulation of the actual problem (a mechanical shock applied to a strength structure) is a perfectly viable alternative, given that current hardware and software performance provides the necessary support for obtaining results with an acceptable level of accuracy. In this context, the paper proposes using FEA platforms for virtual testing of actual cab strength structures, using finite element models based on 3D models generated in CAD environments. In addition to the economic advantage mentioned above, and although the results obtained by finite element simulation are affected by a number of simplifying assumptions, adequate modelling of the phenomenon can successfully support the design of structures that meet the safety performance criteria imposed by current standards. The first section of the paper presents the general context of the safety performance requirements imposed by current standards on cab strength structures. The following section is dedicated to the peculiarities of finite element modelling in problems that require simulation of the behaviour of structures subjected to shock loading. The final section is dedicated to a case study and to future objectives.
Heat Transfer Analysis in Wire Bundles for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Rickman, S. L.; Iamello, C. J.
2016-01-01
Design of wiring for aerospace vehicles relies on an understanding of "ampacity" which refers to the current carrying capacity of wires, either, individually or in wire bundles. Designers rely on standards to derate allowable current flow to prevent exceedance of wire temperature limits due to resistive heat dissipation within the wires or wire bundles. These standards often add considerable margin and are based on empirical data. Commercial providers are taking an aggressive approach to wire sizing which challenges the conventional wisdom of the established standards. Thermal modelling of wire bundles may offer significant mass reduction in a system if the technique can be generalized to produce reliable temperature predictions for arbitrary bundle configurations. Thermal analysis has been applied to the problem of wire bundles wherein any or all of the wires within the bundle may carry current. Wire bundles present analytical challenges because the heat transfer path from conductors internal to the bundle is tortuous, relying on internal radiation and thermal interface conductance to move the heat from within the bundle to the external jacket where it can be carried away by convective and radiative heat transfer. The problem is further complicated by the dependence of wire electrical resistivity on temperature. Reduced heat transfer out of the bundle leads to higher conductor temperatures and, hence, increased resistive heat dissipation. Development of a generalized wire bundle thermal model is presented and compared with test data. The steady state heat balance for a single wire is derived and extended to the bundle configuration. The generalized model includes the effects of temperature varying resistance, internal radiation and thermal interface conductance, external radiation and temperature varying convective relief from the free surface. The sensitivity of the response to uncertainties in key model parameters is explored using Monte Carlo analysis.
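A minimal sketch of the single-wire steady-state heat balance described above follows, assuming illustrative values for current, resistance, temperature coefficient, convective coefficient, and emissivity: the dissipation I²R(T) is balanced against convective and radiative losses and solved for the wire temperature with a root finder.

```python
# Steady-state single-wire heat balance: I^2*R(T) = convection + radiation.
# All parameter values are illustrative assumptions, not the paper's data.
from scipy.optimize import brentq

SIGMA = 5.670e-8                        # Stefan-Boltzmann constant [W/m^2/K^4]
I = 10.0                                # current [A]
R0, ALPHA, T0 = 0.016, 0.0039, 293.15   # resistance [ohm/m] at T0, temp. coefficient
H = 10.0                                # convective coefficient [W/m^2/K] (assumed)
EPS = 0.8                               # surface emissivity (assumed)
D = 2.0e-3                              # wire outer diameter [m]
T_AMB = 293.15                          # ambient temperature [K]

AREA = 3.141592653589793 * D            # surface area per metre of wire [m^2/m]

def balance(t):
    """Dissipation minus losses per metre of wire; zero at steady state."""
    r = R0 * (1 + ALPHA * (t - T0))     # temperature-dependent resistance
    q_in = I**2 * r
    q_out = H * AREA * (t - T_AMB) + EPS * SIGMA * AREA * (t**4 - T_AMB**4)
    return q_in - q_out

t_wire = brentq(balance, T_AMB + 0.01, 1500.0)
print(f"steady-state wire temperature: {t_wire - 273.15:.1f} C")
```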
Impedance spectroscopy of tripolar concentric ring electrodes with Ten20 and TD246 pastes.
Nasrollaholhosseini, Seyed Hadi; Herrera, Daniel Salazar; Besio, Walter G
2017-07-01
Electrodes are used to transform ionic currents into electrical currents in biological systems. Modeling the electrode-electrolyte interface could help optimize the performance of the electrode interface to achieve higher signal-to-noise ratios. There are previous reports of accurate models for single-element biomedical electrodes. In this paper, we measured the impedance of both tripolar concentric ring electrodes and standard cup electrodes by electrochemical impedance spectroscopy (EIS) using both Ten20 and TD246 electrode pastes. Furthermore, we applied the model to show that it can predict the performance of the electrode-electrolyte interface for tripolar concentric ring electrodes (TCREs) that are used to record brain signals.
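As a rough illustration of electrode-interface modeling (not the paper's fitted model or parameters), the sketch below evaluates the impedance of a textbook interface circuit: a solution resistance in series with a charge-transfer resistance shunted by a double-layer capacitance.

```python
# A simple electrode-electrolyte interface model: Z = Rs + Rct / (1 + j*w*Rct*Cdl).
# Parameter values are illustrative, not fitted TCRE or cup-electrode values.
import numpy as np

R_S = 1.0e3     # solution/paste resistance [ohm] (assumed)
R_CT = 50.0e3   # charge-transfer resistance [ohm] (assumed)
C_DL = 0.5e-6   # double-layer capacitance [F] (assumed)

freqs = np.logspace(0, 4, 5)   # 1 Hz .. 10 kHz
omega = 2 * np.pi * freqs
z = R_S + R_CT / (1 + 1j * omega * R_CT * C_DL)

for f, zi in zip(freqs, z):
    print(f"{f:8.1f} Hz: |Z| = {abs(zi) / 1e3:7.2f} kOhm, "
          f"phase = {np.degrees(np.angle(zi)):6.1f} deg")
```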
Support of Multidimensional Parallelism in the OpenMP Programming Model
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Jost, Gabriele
2003-01-01
OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test it with benchmark codes and a cloud modeling code.
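The proposed constructs themselves are not shown in this abstract, so the sketch below is a loose analogy only (plain Python threads, not OpenMP): it distributes a two-dimensional iteration space among a pool of workers in blocks, the kind of multidimensional work distribution the extensions aim to express directly in the programming model.

```python
# A rough analogy of 2-D blocked work distribution among workers.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

NX, NY, BLOCK = 8, 8, 4  # grid size and block edge (illustrative)

def process_block(origin):
    """Each worker handles one 2-D block of the iteration space."""
    i0, j0 = origin
    return sum(i * j for i in range(i0, i0 + BLOCK) for j in range(j0, j0 + BLOCK))

blocks = list(product(range(0, NX, BLOCK), range(0, NY, BLOCK)))
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(process_block, blocks))

serial = sum(i * j for i in range(NX) for j in range(NY))
assert total == serial   # blocked parallel result matches the serial loop nest
print(f"2-D blocked sum = {total}")
```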
Congsheng Fu; Guiling Wang; Michael L. Goulden; Russell L. Scott; Kenneth Bible; Zoe G. Cardon
2016-01-01
Effects of hydraulic redistribution (HR) on hydrological, biogeochemical, and ecological processes have been demonstrated in the field, but the current generation of standard earth system models does not include a representation of HR. Though recent studies have examined the effect of incorporating HR into land surface models, few (if any) have done cross-site...
Benchmarking on Tsunami Currents with ComMIT
NASA Astrophysics Data System (ADS)
Sharghi vand, N.; Kanoglu, U.
2015-12-01
Before the 2004 Indian Ocean tsunami there were no standards for the validation and verification of tsunami numerical models; even so, a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which are used in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held February 9-10, 2015 in Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models for tsunami currents. Three of the benchmark problems were: current measurements of the Japan 2011 tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316) developed by NCTR. The modeling results are compared with the required benchmark data, show good agreement, and are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
The Earth's magnetosphere modeling and ISO standard
NASA Astrophysics Data System (ADS)
Alexeev, I.
The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from the large magnetospheric database (Fairfield et al. 1994), which contains Earth's magnetospheric magnetic field measurements accumulated over many years. The applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth's orbit. But contrary to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The latest version of the Tsyganenko model has been constructed for a geomagnetic storm time interval. This version is based on a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of solar wind and IMF parameters. The same method has been used previously for construction of the paraboloid model. This method is based on a priori information about the structure of the global magnetospheric current systems; each current system is included as a separate block (module) in the magnetospheric model. As shown by spacecraft magnetometer data, three current systems are the main contributors to the external magnetospheric magnetic field: magnetopause currents, the ring current, and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace…
On the Higgs-like boson in the minimal supersymmetric 3-3-1 model
NASA Astrophysics Data System (ADS)
Ferreira, J. G.; Pires, C. A. de S.; da Silva, P. S. Rodrigues; Siqueira, Clarissa
2018-03-01
It is imperative that any proposal of new physics beyond the standard model possess a Higgs-like boson with a mass of 125 GeV and couplings with the standard particles that recover the branching ratios and signal strengths as measured by CMS and ATLAS. We address this issue within the supersymmetric version of the minimal 3-3-1 model. For this, we develop the Higgs potential with a focus on the lightest Higgs provided by the model. Our aim is to verify whether it recovers the properties of the Standard Model Higgs. We calculate its mass up to one-loop level, taking into account all contributions provided by the model. In regard to its couplings, we restrict our investigation to couplings of the Higgs-like boson with the standard particles only. We then calculate the dominant branching ratios and the respective signal strengths and confront our results with the recent measurements of CMS and ATLAS. As distinctive aspects, we remark that our Higgs-like boson mediates flavor-changing neutral processes and has as a signature the decay t → h + c. We calculate its branching ratio and compare it with current bounds. We also show that the Higgs potential of the model is stable in the region of parameter space employed in our calculations.
Minimal Unified Resolution to R_{K^{(*)}} and R(D^{(*)}) Anomalies with Lepton Mixing.
Choudhury, Debajyoti; Kundu, Anirban; Mandal, Rusa; Sinha, Rahul
2017-10-13
It is a challenging task to explain, in terms of a simple and compelling new physics scenario, the intriguing discrepancies between the standard model expectations and the data for the neutral-current observables R_{K} and R_{K^{*}}, as well as the charged-current observables R(D) and R(D^{*}). We show that this can be achieved in an effective theory with only two unknown parameters. In addition, this class of models predicts some interesting signatures in the context of both B decays as well as high-energy collisions.
Sabzghabaei, Foroogh; Salajeghe, Mahla; Soltani Arabshahi, Seyed Kamran
2017-01-01
Background: In this study, ambulatory care training in Firoozgar hospital was evaluated against the Iranian national standards of undergraduate medical education related to ambulatory education, using the Baldrige Excellence Model. In addition, suggestions for promoting education quality, given the current condition of ambulatory education in Firoozgar hospital and the national standards, were offered using gap analysis. Methods: This descriptive analytic study was an evaluation research performed using the standard checklists published by the office of the undergraduate medical education council. Data were collected by surveying documents, interviewing, and observing the processes based on the Baldrige Excellence Model. After confirming the validity and reliability of the checklists, we evaluated the establishment level of the national standards of undergraduate medical education in the clinics of this hospital in four domains: educational program, evaluation, training and research resources, and faculty members. Data were analyzed against the national standards of undergraduate medical education related to ambulatory education and scored using the Baldrige table. Finally, the quality level of the current condition was rated as very appropriate, appropriate, medium, weak, or very weak. Results: In the educational program domain, 62% of the standards were appropriate; in evaluation, 48%; in training and research resources, 46%; in faculty members, 68%; and overall, 56%. Conclusion: The most successful domains were educational program and faculty members, while the evaluation and training and research resources domains had medium performance. Some domains and indicators were rated weak and their quality needs to be improved, so it is suggested that the necessary facilities and improvements be provided, attending to the quality level of the national standards of ambulatory education. PMID: 29951400
Defending Against Advanced Persistent Threats Using Game-Theory.
Rass, Stefan; König, Sandra; Schauer, Stefan
2017-01-01
Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APT turns them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker's incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for an advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system's protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game theoretic models, whose technical solution presented in this work may be of independent interest.
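As a minimal sketch of the classical starting point (a plain zero-sum matrix game, not the paper's generalized games under uncertainty), the following computes a defender's security strategy by linear programming; the payoff matrix is made up.

```python
# Compute a defender's security strategy in a zero-sum matrix game via LP.
import numpy as np
from scipy.optimize import linprog

# loss[i, j]: defender plays i, attacker plays j (lower is better for defender).
# Hypothetical 3x3 loss matrix for illustration.
loss = np.array([[2.0, 5.0, 1.0],
                 [4.0, 1.0, 3.0],
                 [1.0, 4.0, 4.0]])
m, n = loss.shape

# Variables: x (defender mixed strategy) and v (game value). Minimize v
# subject to expected loss under every pure attack <= v, sum(x) = 1, x >= 0.
c = np.concatenate([np.zeros(m), [1.0]])
a_ub = np.hstack([loss.T, -np.ones((n, 1))])   # loss.T @ x - v <= 0
b_ub = np.zeros(n)
a_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
res = linprog(c, A_ub=a_ub, b_ub=b_ub, A_eq=a_eq, b_eq=[1.0],
              bounds=[(0, None)] * m + [(None, None)])

x, v = res.x[:m], res.x[m]
print(f"security strategy: {np.round(x, 3)}, guaranteed worst-case loss: {v:.3f}")
```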
The Graphical Representation of the Digital Astronaut Physiology Backbone
NASA Technical Reports Server (NTRS)
Briers, Demarcus
2010-01-01
This report summarizes my internship project with the NASA Digital Astronaut Project to analyze the Digital Astronaut (DA) physiology backbone model. The Digital Astronaut Project (DAP) applies integrated physiology models to support space biomedical operations, and to assist NASA researchers in closing knowledge gaps related to human physiologic responses to space flight. The DA physiology backbone is a set of integrated physiological equations and functions that model the interacting systems of the human body. The current release of the model is HumMod (Human Model) version 1.5 and was developed over forty years at the University of Mississippi Medical Center (UMMC). The physiology equations and functions are scripted in an XML schema specifically designed for physiology modeling by Dr. Thomas G. Coleman at UMMC. Currently it is difficult to examine the physiology backbone without being knowledgeable of the XML schema. While investigating and documenting the tags and algorithms used in the XML schema, I proposed a standard methodology for a graphical representation. This standard methodology may be used to transcribe graphical representations from the DA physiology backbone. In turn, the graphical representations can allow examination of the physiological functions and equations without the need to be familiar with the computer programming languages or markup languages used by DA modeling software.
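As a rough illustration of transcribing an XML-scripted physiology backbone into a graphical form, the sketch below parses a toy model file into "variable feeds block" edges that could then be drawn; the tag names are invented for illustration and are not HumMod's actual schema.

```python
# Parse a toy XML physiology model into a dependency-graph edge list.
import xml.etree.ElementTree as ET

model_xml = """
<model>
  <block name="cardiac_output">
    <uses var="heart_rate"/><uses var="stroke_volume"/>
  </block>
  <block name="mean_arterial_pressure">
    <uses var="cardiac_output"/><uses var="total_peripheral_resistance"/>
  </block>
</model>
"""

edges = []
root = ET.fromstring(model_xml)
for block in root.iter("block"):
    for dep in block.iter("uses"):
        edges.append((dep.get("var"), block.get("name")))

# Each edge "a -> b" reads: variable a feeds the equation block for b.
for src, dst in edges:
    print(f"{src} -> {dst}")
```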
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the workflow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry-standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Extending and expanding the life of older current meters
Strahle, W.J.; Martini, Marinna A.
1995-01-01
The EG&G Model 610 VACM and Model 630 VMCM are standards for ocean current measurements. It is simple to add peripheral sensors to the data stream of the VACM by use of add-on CMOS circuitry. The firmware control of the VMCM makes it virtually impossible to add sampling of additional sensors. Most of the electronic components used in the VACM are obsolete or difficult to replace and the VMCM will soon follow suit. As a result, the USGS joined WHOI in the development of a PCMCIA data storage system to replace the cassette recording system in the VACM. Using the same PCMCIA recording package as the controller and recorder for the VMCM, a user-friendly VMCM is being designed. PCMCIA cards are rapidly becoming an industry standard with a wide range of storage capacities. By upgrading the VACM and VMCM to PCMCIA storage systems with a flexible microprocessor, they will continue to be viable instruments.
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook
NASA Technical Reports Server (NTRS)
Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.
1986-01-01
The EASY5 macro component models developed for spacecraft power system simulation are described. A brief explanation of how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional groups: converter power stage models, compensator models, current-feedback models, constant-frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.
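As a minimal sketch in the spirit of a converter power stage model (not one of the EASY5 macro components themselves), the following steps the state-space averaged equations of an ideal buck converter with forward Euler; the component values and duty cycle are illustrative assumptions.

```python
# State-space averaged model of an ideal buck converter, stepped in time.
V_IN, L, C, R, DUTY = 12.0, 100e-6, 47e-6, 5.0, 0.5   # illustrative values
DT, STEPS = 1e-6, 5000                                 # 1 us step, 5 ms total

i_l, v_c = 0.0, 0.0   # states: inductor current, capacitor (output) voltage
for _ in range(STEPS):
    di = (DUTY * V_IN - v_c) / L    # averaged inductor equation
    dv = (i_l - v_c / R) / C        # capacitor / load equation
    i_l += di * DT
    v_c += dv * DT

print(f"averaged output after {STEPS * DT * 1e3:.1f} ms: "
      f"{v_c:.2f} V (ideal steady state: {DUTY * V_IN:.1f} V)")
```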
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey T.; Hill, Roger; Walker, Andy
The use of the term 'availability' to describe a photovoltaic (PV) system and power plant has been fraught with confusion for many years. A term that is meant to describe equipment operational status is often omitted, misapplied, or inaccurately combined with PV performance metrics due to attempts to measure performance and reliability through the lens of traditional power plant language. This paper discusses three areas where current research in standards, contract language, and performance modeling is improving the way availability is used with regard to photovoltaic systems and power plants.
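A minimal sketch of keeping the two notions separate, with hypothetical numbers: availability tracks equipment operational status only, while a performance index compares delivered energy with a model prediction.

```python
# Availability (operational status) kept distinct from performance (energy).
hours_in_period = 24 * 30
hours_down = 36              # inverter outage hours (made up)
energy_actual = 11.2e3       # kWh delivered (made up)
energy_modeled = 12.5e3      # kWh predicted by a performance model (made up)

availability = (hours_in_period - hours_down) / hours_in_period
performance_index = energy_actual / energy_modeled

print(f"availability: {availability:.1%}")            # equipment status only
print(f"performance index: {performance_index:.1%}")  # separate, energy-based metric
```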
Search for new physics with a monojet and missing transverse energy in pp collisions at √s = 7 TeV.
Chatrchyan, S; Khachatryan, V; Sirunyan, A M; Tumasyan, A; Adam, W; ...
2011-11-11
A study of events with missing transverse energy and an energetic jet is performed using pp collision data at a center-of-mass energy of 7 TeV. The data were collected by the CMS detector at the LHC, and correspond to an integrated luminosity of 36 pb⁻¹. An excess of these events over standard model contributions is a signature of new physics such as large extra dimensions and unparticles. The number of observed events is in good agreement with the prediction of the standard model, and significant extension of the current limits on parameters of new physics benchmark models is achieved.
Evaluation of LOINC for Representing Constitutional Cytogenetic Test Result Reports
Heras, Yan Z.; Mitchell, Joyce A.; Williams, Marc S.; Brothman, Arthur R.; Huff, Stanley M.
2009-01-01
Genetic testing is becoming increasingly important to medical practice. Integrating genetics and genomics data into electronic medical records is crucial in translating genetic discoveries into improved patient care. Information technology, especially Clinical Decision Support Systems, holds great potential to help clinical professionals take full advantage of genomic advances in their daily medical practice. However, issues relating to standard terminology and information models for exchanging genetic testing results remain relatively unexplored. This study evaluates whether the current LOINC standard is adequate to represent constitutional cytogenetic test result reports using sample result reports from ARUP Laboratories. The results demonstrate that current standard terminology is insufficient to support the needs of coding cytogenetic test results. The terminology infrastructure must be developed before clinical information systems will be able to handle the high volumes of genetic data expected in the near future. PMID:20351857
NASA Astrophysics Data System (ADS)
Menzel, R.; Paynter, D.; Jones, A. L.
2017-12-01
Due to their relatively low computational cost, radiative transfer models in global climate models (GCMs) run on traditional CPU architectures generally consist of shortwave and longwave parameterizations over a small number of wavelength bands. With the rise of newer GPU and MIC architectures, however, the performance of high-resolution line-by-line radiative transfer models may soon approach that of the physical parameterizations currently employed in GCMs. Here we present an analysis of the performance of a new line-by-line radiative transfer model under development at GFDL. Although originally designed to specifically exploit GPU architectures through the use of CUDA, the radiative transfer model has recently been extended to include OpenMP in an effort to also effectively target MIC architectures such as Intel's Xeon Phi. Using input data provided by the upcoming Radiative Forcing Model Intercomparison Project (RFMIP, part of CMIP6), we compare model results and performance data for various model configurations and spectral resolutions run on both GPU and Intel Knights Landing architectures to analogous runs of the standard Oxford Reference Forward Model on traditional CPUs.
49 CFR 526.5 - Earning offsetting monetary credits in future model years.
Code of Federal Regulations, 2011 CFR
2011-10-01
... UNDER THE AUTOMOBILE FUEL EFFICIENCY ACT OF 1980 § 526.5 Earning offsetting monetary credits in future... for the class of automobiles which may fail to comply with a fuel economy standard and for any other classes of automobiles from which credits may be transferred, for the current model year and for each...
ERIC Educational Resources Information Center
Brown, Elizabeth
2009-01-01
The current rate of change suggests that scholarly communications issues, such as new publication models and technologies that connect library and research tools, will continue into the foreseeable future. As models evolve, standards develop, and scientists' communication patterns change, librarians will need to embrace transitional…
49 CFR 526.5 - Earning offsetting monetary credits in future model years.
Code of Federal Regulations, 2013 CFR
2013-10-01
... UNDER THE AUTOMOBILE FUEL EFFICIENCY ACT OF 1980 § 526.5 Earning offsetting monetary credits in future... for the class of automobiles which may fail to comply with a fuel economy standard and for any other classes of automobiles from which credits may be transferred, for the current model year and for each...
49 CFR 526.5 - Earning offsetting monetary credits in future model years.
Code of Federal Regulations, 2014 CFR
2014-10-01
... UNDER THE AUTOMOBILE FUEL EFFICIENCY ACT OF 1980 § 526.5 Earning offsetting monetary credits in future... for the class of automobiles which may fail to comply with a fuel economy standard and for any other classes of automobiles from which credits may be transferred, for the current model year and for each...
49 CFR 526.5 - Earning offsetting monetary credits in future model years.
Code of Federal Regulations, 2012 CFR
2012-10-01
... UNDER THE AUTOMOBILE FUEL EFFICIENCY ACT OF 1980 § 526.5 Earning offsetting monetary credits in future... for the class of automobiles which may fail to comply with a fuel economy standard and for any other classes of automobiles from which credits may be transferred, for the current model year and for each...
Can R-parity violation hide vanilla supersymmetry at the LHC?
NASA Astrophysics Data System (ADS)
Asano, Masaki; Rolbiecki, Krzysztof; Sakurai, Kazuki
2013-01-01
Current experimental constraints on a large part of the parameter space of supersymmetric models rely on the large missing energy signature. This is usually provided by the lightest neutralino, whose stability is ensured by R-parity. However, if R-parity is violated, the lightest neutralino decays into standard model particles and the missing energy cut is no longer efficient. In particular, UDD-type R-parity violation induces neutralino decay to three quarks, which potentially leads to the most difficult signal to search for at hadron colliders. In this paper, we study the constraints on R-parity violating supersymmetric models using same-sign dilepton and multijet signatures. We show that gluinos and squarks lighter than 1 TeV are already excluded in the constrained minimal supersymmetric standard model with R-parity violation if their masses are approximately equal. We also analyze constraints in a simplified model with R-parity violation. We compare how R-parity violation changes some of the observables typically used to distinguish a supersymmetric signal from standard model backgrounds.
2011-01-01
Background The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education, (i.e. PPME communities) are facing considerable operational and technical obstacles in sharing data and models. Findings We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
B→πll Form Factors for New Physics Searches from Lattice QCD.
Bailey, Jon A; Bazavov, A; Bernard, C; Bouchard, C M; DeTar, C; Du, Daping; El-Khadra, A X; Freeland, E D; Gámiz, E; Gottlieb, Steven; Heller, U M; Kronfeld, A S; Laiho, J; Levkova, L; Liu, Yuzhi; Lunghi, E; Mackenzie, P B; Meurice, Y; Neil, E; Qiu, Si-Wei; Simone, J N; Sugar, R; Toussaint, D; Van de Water, R S; Zhou, Ran
2015-10-09
The rare decay B→πℓ⁺ℓ⁻ arises from b→d flavor-changing neutral currents and could be sensitive to physics beyond the standard model. Here, we present the first ab initio QCD calculation of the B→π tensor form factor f_T. Together with the vector and scalar form factors f₊ and f₀ from our companion work [J. A. Bailey et al., Phys. Rev. D 92, 014024 (2015)], these parametrize the hadronic contribution to B→π semileptonic decays in any extension of the standard model. We obtain the total branching ratio BR(B⁺→π⁺μ⁺μ⁻) = 20.4(2.1)×10⁻⁹ in the standard model, which is the most precise theoretical determination to date, and agrees with the recent measurement from the LHCb experiment [R. Aaij et al., J. High Energy Phys. 12 (2012) 125].
NASA Astrophysics Data System (ADS)
Rohmanu, Ajar; Everhard, Yan
2017-04-01
Technological development, especially in the field of electronics, is very fast. One development in electronics hardware is the Flexible Flat Cable (FFC), which serves as a connection medium between the main board and other hardware parts. FFC production includes testing and measurement of FFC quality. Currently, this testing and measurement is done manually, with an operator observing a Light Emitting Diode (LED), which causes many problems. In this study, a computational FFC quality test is built on an open-source embedded system. The method is a short/open test based on a 4-wire (Kelvin) Ohm's-law measurement, with fuzzy logic as the decision maker for the measurement results, implemented on an open-source Arduino data logger. The system uses an INA219 current sensor to read the voltage value, from which the resistance of the FFC is obtained. To validate the system, black-box testing was performed, along with accuracy and precision tests using the standard-deviation method. Testing the system on three sample models gave standard deviations of 1.921 for the first model, 4.567 for the second, and 6.300 for the third, with Standard Errors of the Mean (SEM) of 0.304, 0.736, and 0.996, respectively. The average resistance-measurement tolerances were 3.50%, 4.45%, and 5.18% for the three models, within the standard resistance-measurement tolerance, and productivity improved to 118.33%. These results are expected to improve quality and productivity in the FFC testing process.
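The arithmetic behind this test reduces to Ohm's law on a 4-wire (Kelvin) reading followed by simple dispersion statistics. A minimal sketch is given below, assuming hypothetical readings and helper names; the INA219 register interface, the fuzzy-logic pass/fail stage, and the Arduino wiring described in the abstract are omitted.

```python
# Sketch only: hypothetical readings; R = V/I per the 4-wire Ohm's-law
# approach, then standard deviation and SEM as used in the abstract.
import statistics

def ffc_resistance(voltage_v, current_a):
    # 4-wire (Kelvin) sensing: the sense leads carry no test current,
    # so lead resistance drops out and R = V / I directly.
    return voltage_v / current_a

# Hypothetical repeated readings for one FFC conductor: (volts, amps).
readings = [(0.051, 0.100), (0.052, 0.100), (0.050, 0.100)]
resistances = [ffc_resistance(v, i) for v, i in readings]

mean_r = statistics.mean(resistances)
std_r = statistics.stdev(resistances)            # precision of the rig
sem_r = std_r / len(resistances) ** 0.5          # Standard Error of the Mean
tolerance_pct = 100 * std_r / mean_r             # relative tolerance, %

print(f"R = {mean_r:.3f} ohm, sigma = {std_r:.4f}, "
      f"SEM = {sem_r:.4f}, tolerance = {tolerance_pct:.2f}%")
```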
Zimmerman, Tammy M.
2008-01-01
The Lake Erie beaches in Pennsylvania are a valuable recreational resource for Erie County. Concentrations of Escherichia coli (E. coli) at monitored beaches in Presque Isle State Park in Erie, Pa., occasionally exceed the single-sample bathing-water standard of 235 colonies per 100 milliliters resulting in potentially unsafe swimming conditions and prompting beach managers to post public advisories or to close beaches to recreation. To supplement the current method for assessing recreational water quality (E. coli concentrations from the previous day), a predictive regression model for E. coli concentrations at Presque Isle Beach 2 was developed from data collected during the 2004 and 2005 recreational seasons. Model output included predicted E. coli concentrations and exceedance probabilities--the probability that E. coli concentrations would exceed the standard. For this study, E. coli concentrations and other water-quality and environmental data were collected during the 2006 recreational season at Presque Isle Beach 2. The data from 2006, an independent year, were used to test (validate) the 2004-2005 predictive regression model and compare the model performance to the current method. Using 2006 data, the 2004-2005 model yielded more correct responses and better predicted exceedances of the standard than the use of E. coli concentrations from the previous day. The differences were not pronounced, however, and more data are needed. For example, the model correctly predicted exceedances of the standard 11 percent of the time (1 out of 9 exceedances that occurred in 2006) whereas using the E. coli concentrations from the previous day did not result in any correctly predicted exceedances. After validation, new models were developed by adding the 2006 data to the 2004-2005 dataset and by analyzing the data in 2- and 3-year combinations. Results showed that excluding the 2004 data (using 2005 and 2006 data only) yielded the best model. Explanatory variables in the 2005-2006 model were log10 turbidity, bird count, and wave height. The 2005-2006 model correctly predicted when the standard would not be exceeded (specificity) with a response of 95.2 percent (178 out of 187 nonexceedances) and correctly predicted when the standard would be exceeded (sensitivity) with a response of 64.3 percent (9 out of 14 exceedances). In all cases, the results from predictive modeling produced higher percentages of correct predictions than using E. coli concentrations from the previous day. Additional data collected each year can be used to test and possibly improve the model. The results of this study will aid beach managers in more rapidly determining when waters are not safe for recreational use and, subsequently, when to close a beach or post an advisory.
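The report does not publish its fitted equation, but a model of this kind is commonly a logistic regression of exceedance on the named explanatory variables. The sketch below is a hypothetical illustration with synthetic data, assuming scikit-learn is available; the coefficients bear no relation to the actual Presque Isle model.

```python
# Hypothetical sketch: logistic regression of standard exceedance on
# log10(turbidity), bird count, and wave height, with synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    np.log10(rng.uniform(1.0, 100.0, n)),  # log10 turbidity
    rng.integers(0, 50, n),                # bird count
    rng.uniform(0.0, 1.5, n),              # wave height (m)
])
y = (rng.random(n) < 0.1).astype(int)      # 1 = E. coli > 235 col/100 mL

model = LogisticRegression().fit(X, y)
p_exceed = model.predict_proba(X[:1])[0, 1]  # exceedance probability
print(f"predicted exceedance probability: {p_exceed:.2f}")
```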
Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate
NASA Astrophysics Data System (ADS)
Samaras, C.; Cook, L.
2015-12-01
Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable to a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: involve all stakeholders (owners, financers, insurance, regulators, affected public, climate/weather scientists, etc.) in key decisions; use low-regret, adaptive strategies, such as robust decision making and the observational method; comply with relevant standards and regulations, and exceed their requirements where appropriate; and publish design studies and performance/failure investigations to extend the body of knowledge for the advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities, and account rationally for climate change in revised engineering standards and codes. This presentation describes initial research on decision making under uncertainty for climate-resilient infrastructure design.
Reiner, Bruce I
2017-10-01
Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard-of-care analysis, a determination fundamental to medical malpractice. Beyond these intrinsic biases, current peer review has other deficiencies, including a lack of standardization, objectivity, and automation, and its retrospective practice. An alternative model to address these deficiencies would be one which is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in the creation of a standardized, referenceable peer review database, which could further assist in customizable education, technology refinement, and implementation of real-time context- and user-specific decision support.
Analytical evaluation of current starch methods used in the international sugar industry: Part I.
Cole, Marsha; Eggleston, Gillian; Triplett, Alexa
2017-08-01
Several analytical starch methods exist in the international sugar industry to mitigate starch-related processing challenges and assess the quality of traded end-products. These methods use iodometric chemistry, mostly potato starch standards, and similar solubilization strategies, but they had not been comprehensively compared. In this study, industrial starch methods were compared to the USDA Starch Research method using simulated raw sugars. The type of starch standard, solubilization approach, iodometric reagents, and detection wavelength all affected total starch determination in simulated raw sugars. Simulated sugars containing potato starch were more accurately measured by the industrial methods, whereas those containing corn starch, a better model for sugarcane starch, were accurately measured only by the USDA Starch Research method. Use of a potato starch standard curve over-estimated starch concentrations. Among the variables studied, the starch standard, solubilization approach, and detection wavelength most affected the sensitivity, accuracy/precision, and detection/quantification limits of the current industry starch methods.
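The over-estimation from a mismatched standard curve is simple to see numerically. In the sketch below the absorbance-per-concentration slopes are hypothetical, chosen only to reproduce the direction of the effect reported above (the corn-starch sample responding more strongly per unit mass than the potato standard under the iodometric method).

```python
# Hypothetical calibration sketch: read a corn-starch sample off a
# potato-starch standard curve and observe the over-estimate.
import numpy as np

conc = np.array([0.0, 50.0, 100.0, 200.0])   # standard concentrations (mg/L)
abs_potato = 0.0028 * conc                   # potato-starch response (hypothetical)
slope, intercept = np.polyfit(conc, abs_potato, 1)

true_corn_conc = 100.0                       # mg/L actually present
measured_abs = 0.0040 * true_corn_conc       # corn response (hypothetical, stronger)
estimated = (measured_abs - intercept) / slope
print(f"true {true_corn_conc:.0f} mg/L reads as {estimated:.0f} mg/L")  # ~143
```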
Thermal neutral format based on the step technology
NASA Technical Reports Server (NTRS)
Almazan, P. Planas; Legal, J. L.
1995-01-01
The exchange of models is one of the most serious problems currently encountered in the practice of spacecraft thermal analysis. Essentially, the problem originates in the diversity of computing environments used across different sites and the consequent proliferation of native tool formats. Furthermore, increasing pressure to reduce development life-cycle time has generated growing interest in so-called spacecraft concurrent engineering. In this context, the realization of the interdependencies between different disciplines and the proper communication between them become critical issues. The use of a neutral format represents a step forward in addressing these problems. Such a means of communication is adopted by consensus; a neutral format is not directly tied to any specific tool and is kept under stringent change control. Currently, most of the groups promoting exchange formats are contributing their experience to STEP, the Standard for the Exchange of Product Model Data, which is being developed under the auspices of the International Organization for Standardization (ISO 10303). This paper presents the efforts made in Europe to provide the spacecraft thermal analysis community with a Thermal Neutral Format (TNF) based on STEP. Following an introduction with some background information, the paper presents the characteristics of the STEP standard. Later, the first efforts to produce a STEP Spacecraft Thermal Application Protocol are described. Finally, the paper presents the currently harmonized European activities that follow up and extend earlier work in the area.
NASA Astrophysics Data System (ADS)
Chaturvedi, K.; Willenborg, B.; Sindram, M.; Kolbe, T. H.
2017-10-01
Semantic 3D city models play an important role in solving complex real-world problems and are being adopted by many cities around the world. A wide range of application and simulation scenarios directly benefit from the adoption of international standards such as CityGML. However, most simulations involve properties whose values vary with respect to time, and current-generation semantic 3D city models do not support time-dependent properties explicitly. In this paper, details of solar potential simulations operating on the CityGML standard are provided, assessing and estimating solar energy production for the roofs and facades of 3D building objects in different ways. Furthermore, the paper demonstrates how the time-dependent simulation results are better represented inline within 3D city models using the so-called Dynamizer concept. This concept not only allows representing the simulation results in standardized ways, but also delivers a method to enhance static city models with such dynamic property values, making the city models truly dynamic. The Dynamizer concept has been implemented as an Application Domain Extension of the CityGML standard within the OGC Future City Pilot Phase 1. The results are given in this paper.
A Criterion to Control Nonlinear Error in the Mixed-Mode Bending Test
NASA Technical Reports Server (NTRS)
Reeder, James R.
2002-01-01
The mixed-mode bending test has been widely used to measure delamination toughness and was recently standardized by ASTM as Standard Test Method D6671-01. This simple test is a combination of the standard Mode I (opening) test and a Mode II (sliding) test. This test uses a unidirectional composite test specimen with an artificial delamination subjected to bending loads to characterize when a delamination will extend. When the displacements become large, the linear theory used to analyze the results of the test yields errors in the calculated toughness values. The current standard places no limit on the specimen loading, and therefore test data can be created using the standard that are significantly in error. A method of limiting the error that can be incurred in the calculated toughness values is needed. In this paper, nonlinear models of the MMB test are refined. One of the nonlinear models is then used to develop a simple criterion for prescribing conditions where the nonlinear error will remain below 5%.
Information risk and security modeling
NASA Astrophysics Data System (ADS)
Zivic, Predrag
2005-03-01
This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, and NSA configuration guidelines, and the metrics used at this level. The IT operational standards view on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.
Toward (finally!) ruling out Z and Higgs mediated dark matter models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escudero, Miguel; Berlin, Asher; Hooper, Dan
2016-12-01
In recent years, direct detection, indirect detection, and collider experiments have placed increasingly stringent constraints on particle dark matter, exploring much of the parameter space associated with the WIMP paradigm. In this paper, we focus on the subset of WIMP models in which the dark matter annihilates in the early universe through couplings to either the Standard Model Z or the Standard Model Higgs boson. Considering fermionic, scalar, and vector dark matter candidates within a model-independent context, we find that the overwhelming majority of these dark matter candidates are already ruled out by existing experiments. In the case of Z-mediated dark matter, the only scenarios that are not currently excluded are those in which the dark matter is a fermion with an axial coupling and with a mass either within a few GeV of the Z resonance (m_DM ≃ m_Z/2) or greater than 200 GeV, or with a vector coupling and with m_DM > 6 TeV. Several Higgs-mediated scenarios are currently viable if the mass of the dark matter is near the Higgs pole (m_DM ≃ m_H/2). Otherwise, the only scenarios that are not excluded are those in which the dark matter is a scalar (vector) heavier than 400 GeV (1160 GeV) with a Higgs portal coupling, or a fermion with a pseudoscalar (CP violating) coupling to the Standard Model Higgs boson. With the exception of dark matter with a purely pseudoscalar coupling to the Higgs, it is anticipated that planned direct detection experiments will probe nearly the entire range of models considered in this study.
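The surviving Z-mediated parameter space quoted above is compact enough to encode directly; the toy filter below does so, taking the thresholds verbatim from the abstract and rendering the hedged "within a few GeV of the Z resonance" as an assumed ±4 GeV window.

```python
# Toy encoding of the abstract's Z-mediated survival regions; the +/-4 GeV
# pole window is an assumption standing in for "within a few GeV".
M_Z = 91.19  # GeV

def z_mediated_viable(spin, coupling, m_dm_gev):
    """True if the candidate lies in a region the abstract lists as
    not currently excluded (Z-mediated case only)."""
    if spin != "fermion":
        return False                          # scalar/vector DM: ruled out
    if coupling == "axial":
        return abs(m_dm_gev - M_Z / 2) < 4.0 or m_dm_gev > 200.0
    if coupling == "vector":
        return m_dm_gev > 6000.0              # m_DM > 6 TeV
    return False

print(z_mediated_viable("fermion", "axial", 45.0))    # near m_Z/2 -> True
print(z_mediated_viable("fermion", "vector", 500.0))  # -> False
```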
NASA Astrophysics Data System (ADS)
Gladkov, Svyatoslav; Kochmann, Julian; Reese, Stefanie; Hütter, Markus; Svendsen, Bob
2016-04-01
The purpose of the current work is the comparison of thermodynamic model formulations for chemically and structurally inhomogeneous solids at finite deformation based on "standard" non-equilibrium thermodynamics [SNET: e. g. S. de Groot and P. Mazur, Non-equilibrium Thermodynamics, North Holland, 1962] and the general equation for non-equilibrium reversible-irreversible coupling (GENERIC) [H. C. Öttinger, Beyond Equilibrium Thermodynamics, Wiley Interscience, 2005]. In the process, non-isothermal generalizations of standard isothermal conservative [e. g. J. W. Cahn and J. E. Hilliard, Free energy of a non-uniform system. I. Interfacial energy. J. Chem. Phys. 28 (1958), 258-267] and non-conservative [e. g. S. M. Allen and J. W. Cahn, A macroscopic theory for antiphase boundary motion and its application to antiphase domain coarsening. Acta Metall. 27 (1979), 1085-1095; A. G. Khachaturyan, Theory of Structural Transformations in Solids, Wiley, New York, 1983] diffuse interface or "phase-field" models [e. g. P. C. Hohenberg and B. I. Halperin, Theory of dynamic critical phenomena, Rev. Modern Phys. 49 (1977), 435-479; N. Provatas and K. Elder, Phase Field Methods in Material Science and Engineering, Wiley-VCH, 2010.] for solids are obtained. The current treatment is consistent with, and includes, previous works [e. g. O. Penrose and P. C. Fife, Thermodynamically consistent models of phase-field type for the kinetics of phase transitions, Phys. D 43 (1990), 44-62; O. Penrose and P. C. Fife, On the relation between the standard phase-field model and a "thermodynamically consistent" phase-field model. Phys. D 69 (1993), 107-113] on non-isothermal systems as a special case. In the context of no-flux boundary conditions, the SNET- and GENERIC-based approaches are shown to be completely consistent with each other and result in equivalent temperature evolution relations.
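For orientation, the GENERIC structure invoked above has the standard abstract form (in Öttinger's notation; this equation is background, not a result of the paper): the state x evolves through an energy-driven reversible contribution and an entropy-driven irreversible one, with degeneracy conditions guaranteeing that the reversible part conserves entropy and the irreversible part conserves energy.

```latex
\dot{x} = L(x)\,\frac{\delta E}{\delta x} + M(x)\,\frac{\delta S}{\delta x},
\qquad
L\,\frac{\delta S}{\delta x} = 0,
\qquad
M\,\frac{\delta E}{\delta x} = 0,
```

with E the total energy, S the total entropy, L antisymmetric (Poisson) and M symmetric and positive semi-definite (friction).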
Sneutrino dark matter in gauged inverse seesaw models for neutrinos.
An, Haipeng; Dev, P S Bhupal; Cai, Yi; Mohapatra, R N
2012-02-24
Extending the minimal supersymmetric standard model to explain small neutrino masses via the inverse seesaw mechanism can lead to a new light supersymmetric scalar partner which can play the role of inelastic dark matter (IDM). It is a linear combination of the superpartners of the neutral fermions in the theory (the light left-handed neutrino and two heavy standard model singlet neutrinos), which can be very light, with mass in the ~5-20 GeV range, as suggested by some current direct detection experiments. The IDM in this class of models has a keV-scale mass splitting, which is intimately connected to the small Majorana masses of neutrinos. We predict the differential scattering rate and annual modulation of the IDM signal, which can be testable at future germanium- and xenon-based detectors.
NASA Technical Reports Server (NTRS)
Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol
2003-01-01
The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from component models - atmosphere, ocean, ice, land, chemistry, solid earth, etc. - merged together through a coupling program which is responsible for the exchange of data between the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in OSSE experiments to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the Japanese Earth Simulator's theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.
NASA Astrophysics Data System (ADS)
Stier, P.; Schutgens, N. A. J.; Bian, H.; Boucher, O.; Chin, M.; Ghan, S.; Huneeus, N.; Kinne, S.; Lin, G.; Myhre, G.; Penner, J. E.; Randles, C.; Samset, B.; Schulz, M.; Yu, H.; Zhou, C.
2012-09-01
Simulated multi-model "diversity" in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in model components relevant for forcing calculations, and the associated "host-model uncertainties" are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host-model uncertainties in aerosol forcing experiments through prescription of identical aerosol radiative properties in nine participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is -4.51 W m⁻² and the inter-model standard deviation is 0.70 W m⁻², corresponding to a relative standard deviation of 15%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.26 W m⁻², and the standard deviation increases to 1.21 W m⁻², corresponding to a significant relative standard deviation of 96%. However, the top-of-atmosphere forcing variability owing to absorption is low, with relative standard deviations of 9% clear-sky and 12% all-sky. Scaling the forcing standard deviation for a purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host-model uncertainties could explain about half of the overall sulfate forcing diversity of 0.13 W m⁻² in the AeroCom Direct Radiative Effect experiment. Host-model errors in aerosol radiative forcing are largest in regions of uncertain host-model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host-model uncertainties are an important component of aerosol forcing uncertainty that requires further attention.
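The "relative standard deviation" used throughout is simply the inter-model standard deviation over the magnitude of the multi-model mean; the short check below reproduces the purely scattering case quoted above (the quoted inputs are rounded, so the result lands at 15.5%).

```python
# Relative standard deviation for the purely scattering case in the abstract.
forcing_mean = -4.51   # W m^-2, global-mean all-sky TOA forcing
forcing_std = 0.70     # W m^-2, inter-model standard deviation

rel_std = forcing_std / abs(forcing_mean)
print(f"relative standard deviation: {rel_std:.1%}")   # 15.5%, ~ the quoted 15%
```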
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harding, Samuel F.; Romero-Gomez, Pedro D. J.; Richmond, Marshall C.
Standards provide recommendations for the best practices in the installation of current meters for measuring fluid flow in closed conduits. These include PTC-18 and IEC-41. Both of these standards refer to the requirements of the ISO Standard 3354 for cases where the velocity distribution is assumed to be regular and the flow steady. Due to the nature of the short converging intakes of Kaplan hydroturbines, these assumptions may be invalid if current meters are intended to be used to characterize turbine flows. In this study, we examine a combination of measurement guidelines from both ISO standards by means of virtual current meters (VCM) set up over a simulated hydroturbine flow field. To this purpose, a computational fluid dynamics (CFD) model was developed to model the velocity field of a short converging intake of the Ice Harbor Dam on the Snake River, in the State of Washington. The detailed geometry and resulting wake of the submersible traveling screen (STS) at the first gate slot was of particular interest in the development of the CFD model using a detached eddy simulation (DES) turbulence solution. An array of virtual point velocity measurements were extracted from the resulting velocity field to simulate VCM at two virtual measurement (VM) locations at different distances downstream of the STS. The discharge through each bay was calculated from the VM using the graphical integration solution to the velocity-area method. This method of representing practical velocimetry techniques in a numerical flow field has been successfully used in a range of marine and conventional hydropower applications. A sensitivity analysis was performed to observe the effect of the VCM array resolution on the discharge error. The downstream VM section required 11–33% less VCM in the array than the upstream VM location to achieve a given discharge error. In general, more instruments were required to quantify the discharge at high levels of accuracy when the STS was introduced because of the increased spatial variability of the flow velocity.
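The velocity-area method mentioned above amounts to integrating the velocity component normal to the measurement section over the bay's cross-sectional area. A minimal sketch with a hypothetical rectangular grid of point velocities follows; the study itself used a graphical integration over the VCM array rather than this trapezoidal shortcut.

```python
# Hypothetical velocity-area integration: Q = integral of v_normal over the
# cross section, approximated on a rectangular grid of virtual point readings.
import numpy as np

y = np.linspace(0.0, 6.0, 7)          # horizontal station positions (m)
z = np.linspace(0.0, 4.0, 5)          # vertical station positions (m)
v = 2.0 + 0.1 * np.add.outer(z, y)    # v[i, j]: normal velocity at (z[i], y[j]), m/s

# Integrate across the width, then over the depth. (numpy >= 2.0 calls this
# np.trapezoid; on older versions use np.trapz.)
q = np.trapezoid(np.trapezoid(v, y, axis=1), z)
print(f"discharge Q = {q:.1f} m^3/s")
```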
Ferrets as Models for Influenza Virus Transmission Studies and Pandemic Risk Assessments
Barclay, Wendy; Barr, Ian; Fouchier, Ron A.M.; Matsuyama, Ryota; Nishiura, Hiroshi; Peiris, Malik; Russell, Charles J.; Subbarao, Kanta; Zhu, Huachen
2018-01-01
The ferret transmission model is extensively used to assess the pandemic potential of emerging influenza viruses, yet experimental conditions and reported results vary among laboratories. Such variation can be a critical consideration when contextualizing results from independent risk-assessment studies of novel and emerging influenza viruses. To streamline interpretation of data generated in different laboratories, we provide a consensus on experimental parameters that define risk-assessment experiments of influenza virus transmissibility, including disclosure of variables known or suspected to contribute to experimental variability in this model, and advocate adoption of more standardized practices. We also discuss current limitations of the ferret transmission model and highlight continued refinements and advances to this model ongoing in laboratories. Understanding, disclosing, and standardizing the critical parameters of ferret transmission studies will improve the comparability and reproducibility of pandemic influenza risk assessment and increase the statistical power and, perhaps, accuracy of this model. PMID:29774862
New Models and Methods for the Electroweak Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Linda
2017-09-26
This is the Final Technical Report to the US Department of Energy for grant DE-SC0013529, New Models and Methods for the Electroweak Scale, covering the time period April 1, 2015 to March 31, 2017. The goal of this project was to maximize the understanding of fundamental weak scale physics in light of current experiments, mainly the ongoing run of the Large Hadron Collider and the space-based satellite experiments searching for signals of Dark Matter annihilation or decay. This research program focused on the phenomenology of supersymmetry, Higgs physics, and Dark Matter. The properties of the Higgs boson are currently being measured by the Large Hadron Collider and could be a sensitive window into new physics at the weak scale. Supersymmetry is the leading theoretical candidate to explain the naturalness of the electroweak theory; however, new model space must be explored, as the Large Hadron Collider has disfavored much of the minimal model parameter space. In addition, the nature of Dark Matter, the mysterious particle that makes up 25% of the mass of the universe, is still unknown. This project sought to address measurements of the Higgs boson couplings to the Standard Model particles, new LHC discovery scenarios for supersymmetric particles, and new measurements of Dark Matter interactions with the Standard Model, both in collider production and annihilation in space. Accomplishments include creating new tools for analyses of models in which Dark Matter annihilates into multiple Standard Model particles, including new visualizations of bounds for models with various Dark Matter branching ratios; benchmark studies for new discovery scenarios of Dark Matter at the Large Hadron Collider for Higgs-Dark Matter and gauge boson-Dark Matter interactions; new target analyses to detect direct decays of the Higgs boson into challenging final states like pairs of light jets; and new phenomenological analysis of non-minimal supersymmetric models, namely the set of Dirac Gaugino models.
Redesigning the Preparation of All Teachers within the Framework of an Integrated Program Model
ERIC Educational Resources Information Center
Hardman, Michael L.
2009-01-01
It is incumbent on universities to reflect current research on effective teacher preparation and respond to the changing needs of the 21st century. These needs include the knowledge and skills to instruct diverse students; an increasing emphasis on standards and an integrated curriculum model; and the call for all educators to work together to…
Model-Based, Noninvasive Monitoring of Intracranial Pressure
2013-07-01
patients. A physiologically based model relates ICP to simultaneously measured waveforms of arterial blood pressure (ABP), obtained via radial... ABP and CBFV are currently measured as the clinical standard of care. The project's major accomplishments include: assembling a suitable system for... synchronized arterial blood pressure (ABP) and cerebral blood flow velocity (CBFV) waveform measurements that can be obtained quite routinely. Our processing
NASA Technical Reports Server (NTRS)
Chuss, David
2010-01-01
The Cosmic Microwave Background (CMB) has provided a wealth of information about the history and physics of the early Universe. Much progress has been made on uncovering the emerging Standard Model of Cosmology by such experiments as COBE and WMAP, and ESA's Planck Surveyor will likely increase our knowledge even more. Despite the success of this model, mysteries remain. Currently understood physics does not offer a compelling explanation for the homogeneity, flatness, and the origin of structure in the Universe. Cosmic Inflation, a brief epoch of exponential expansion, has been posited to explain these observations. If inflation is a reality, it is expected to produce a background spectrum of gravitational waves that will leave a small polarized imprint on the CMB. Discovery of this signal would give the first direct evidence for inflation and provide a window into physics at scales beyond those accessible to terrestrial particle accelerators. I will briefly review aspects of the Standard Model of Cosmology and discuss our current efforts to design and deploy experiments to measure the polarization of the CMB with the precision required to test inflation.
Novel Soft-Pion Theorem for Long-Range Nuclear Parity Violation.
Feng, Xu; Guo, Feng-Kun; Seng, Chien-Yeah
2018-05-04
The parity-odd effect in the standard model weak neutral current reveals itself in the long-range parity-violating nuclear potential generated by pion exchanges in the ΔI=1 channel with the parity-odd pion-nucleon coupling constant h_{π}^{1}. Despite decades of experimental and theoretical efforts, the size of this coupling constant is still not well understood. In this Letter, we derive a soft-pion theorem relating h_{π}^{1} and the neutron-proton mass splitting induced by an artificial parity-even counterpart of the ΔI=1 weak Lagrangian, and demonstrate that the theorem still holds exactly at next-to-leading order in chiral perturbation theory. A considerable amount of simplification is expected in the study of h_{π}^{1} using either lattice or other QCD models, following its reduction from a parity-odd proton-neutron-pion matrix element to a simpler spectroscopic quantity. The theorem paves the way to much more precise calculations of h_{π}^{1}, and thus a quantitative test of the strangeness-conserving neutral current interaction of the standard model is foreseen.
Search for sterile neutrino oscillations in muon neutrino disappearance at MINOS/MINOS+
NASA Astrophysics Data System (ADS)
Todd, Jacob; Minos+ Collaboration
2017-01-01
A wide variety of neutrino oscillation phenomena are well-described by the standard three-flavour neutrino model, but some anomalies exist. The LSND and MiniBooNE experiments have measured electron antineutrino appearance in excess of standard oscillation predictions, which points to the possibility of a sterile neutrino with higher mass than the presently known states. MINOS, a two-detector, long-baseline neutrino oscillation experiment, was optimized for the measurement of muon neutrino disappearance in the NuMI neutrino beam. A sterile neutrino responsible for the LSND and MiniBooNE excesses would cause distortions in the charged current and neutral current MINOS spectra, which permits the search for sterile neutrinos at MINOS. In close collaboration with the Daya Bay reactor neutrino experiment, MINOS has placed strong constraints on the sterile neutrino parameter space for a model with one additional sterile neutrino. Further, the extension of data collection with MINOS+, which samples the NuMI beam in a medium energy configuration, markedly increases the sensitivity of the combined MINOS and MINOS+ sample to a 3+1-flavour sterile neutrino model.
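For reference, the distortion searched for here is governed, in the limit where the standard atmospheric and solar oscillation terms are switched off, by the familiar two-flavor-like survival probability of the 3+1 model (standard phenomenology, not quoted in the abstract):

```latex
P(\nu_\mu \to \nu_\mu) \simeq 1 - \sin^2(2\theta_{\mu\mu})\,
\sin^2\!\left(\frac{1.27\,\Delta m_{41}^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
\qquad
\sin^2 2\theta_{\mu\mu} = 4\,|U_{\mu 4}|^2\bigl(1 - |U_{\mu 4}|^2\bigr),
```

so muon-neutrino disappearance constrains the mixing matrix element U_{μ4} as a function of the mass splitting Δm²₄₁.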
Yu, Jeong Il; Park, Won; Choi, Doo Ho; Huh, Seung Jae; Nam, Seok Jin; Kim, Seok Won; Lee, Jeong Eon; Kil, Won Ho; Im, Young-Hyuck; Ahn, Jin Seok; Park, Yeon Hee; Cho, Eun Yoon
2015-08-01
This study was conducted to establish a prognostic model in patients with pathologic N1 (pN1) breast cancer who have not undergone elective nodal irradiation (ENI) under the current standard management and to suggest possible indications for ENI. We performed a retrospective study with patients with pN1 breast cancer who received the standard local and preferred adjuvant chemotherapy treatment without neoadjuvant chemotherapy and ENI from January 2005 to June 2011. Most of the indicated patients received endocrine and trastuzumab therapy. In 735 enrolled patients, the median follow-up period was 58.4 months (range, 7.2-111.3 months). Overall, 55 recurrences (7.4%) developed, and locoregional recurrence was present in 27 patients (3.8%). Recurrence-free survival was significantly related to lymphovascular invasion (P = .04, hazard ratio [HR], 1.83; 95% confidence interval [CI], 1.03-2.88), histologic grade (P = .03, HR, 2.57; 95% CI, 1.05-6.26), and nonluminal A subtype (P = .02, HR, 3.04; 95% CI, 1.23-7.49) in multivariate analysis. The prognostic model was established by these 3 prognostic factors. Recurrence-free survival was less than 90% at 5 years in cases with 2 or 3 factors. The prognostic model has stratified risk groups in pN1 breast cancer without ENI. Patients with 2 or more factors should be considered for ENI.
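The resulting decision rule is a simple factor count, sketched below directly from the description above (function names are hypothetical; the threshold is the study's: 5-year recurrence-free survival fell below 90% with 2 or 3 factors, motivating ENI at 2 or more).

```python
# Factor-count sketch of the pN1 prognostic model described above.
def pn1_risk_factors(lvi, high_grade, nonluminal_a):
    """Count the three adverse factors: lymphovascular invasion,
    high histologic grade, and nonluminal A subtype."""
    return sum([bool(lvi), bool(high_grade), bool(nonluminal_a)])

def consider_eni(n_factors):
    # RFS < 90% at 5 years with 2-3 factors -> consider ENI.
    return n_factors >= 2

n = pn1_risk_factors(lvi=True, high_grade=False, nonluminal_a=True)
print(n, consider_eni(n))   # 2 True
```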
Peering beyond the horizon with standard sirens and redshift drift
NASA Astrophysics Data System (ADS)
Jimenez, Raul; Raccanelli, Alvise; Verde, Licia; Matarrese, Sabino
2018-04-01
An interesting test of the nature of the Universe is to measure the global spatial curvature of the metric in a model-independent way, at a level of |Ω_k| < 10^-4, or, if possible, at the cosmic variance level of the amplitude of the CMB fluctuations, |Ω_k| ≈ 10^-5. A limit of |Ω_k| < 10^-4 would yield stringent tests of several models of inflation. Further, improving the constraint by an order of magnitude would help reduce "model confusion" in standard parameter estimation. Moreover, if the curvature were measured to be at the level of the amplitude of the CMB fluctuations, it would offer a powerful test of the inflationary paradigm and would indicate that our Universe must be significantly larger than the current horizon. On the contrary, in the context of standard inflation, measuring a value above the CMB fluctuations would lead us to conclude that the Universe is not much larger than the currently observed horizon; this can also be interpreted as the presence of large fluctuations outside the horizon. However, it has proven difficult, so far, to find observables that can achieve such a level of accuracy and, most of all, be model-independent. Here we propose a method that can in principle achieve that; this is done by making minimal assumptions and using distance probes that are cosmology-independent: gravitational waves, redshift drift and cosmic chronometers. We discuss what kind of observations are needed in principle to achieve the desired accuracy.
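For orientation, the two ingredients that make such a test possible can be written compactly: the curvature parameter Ω_k sets how the line-of-sight integral of 1/H(z) maps onto the transverse comoving distance probed by standard sirens, while the redshift drift measures H(z) directly (the Sandage-Loeb effect). This is a schematic sketch of the standard relations (written for Ω_k > 0; the Ω_k < 0 case follows by continuing sinh to sin), not the paper's full formalism:

```latex
d_M(z) = \frac{c}{H_0\sqrt{\Omega_k}}\,
         \sinh\!\left(\sqrt{\Omega_k}\int_0^z \frac{H_0\,dz'}{H(z')}\right),
\qquad
\dot{z} = (1+z)\,H_0 - H(z)
```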
Pérez, Teresa; Makrestsov, Nikita; Garatt, John; Torlakovic, Emina; Gilks, C Blake; Mallett, Susan
The Canadian Immunohistochemistry Quality Control program monitors clinical laboratory performance for estrogen receptor and progesterone receptor tests used in breast cancer treatment management in Canada. Current methods assess sensitivity and specificity at each time point, compared with a reference standard. We investigate alternative performance analysis methods to enhance the quality assessment. We used 3 methods of analysis: meta-analysis of sensitivity and specificity of each laboratory across all time points; sensitivity and specificity at each time point for each laboratory; and fitting models for repeated measurements to examine differences between laboratories adjusted by test and time point. In total, 88 laboratories participated in quality control at up to 13 time points, typically using 37 to 54 histology samples. In the meta-analysis across all time points, no laboratory had sensitivity or specificity below 80%. Current methods, presenting sensitivity and specificity separately for each run, result in wide 95% confidence intervals, typically spanning 15% to 30%. Models of a single diagnostic outcome demonstrated that 82% to 100% of laboratories had no difference from the reference standard for estrogen receptor and 75% to 100% for progesterone receptor, with the exception of 1 progesterone receptor run. Laboratories with significant differences from the reference standard identified with generalized estimating equation modeling also had reduced performance by meta-analysis across all time points. The Canadian Immunohistochemistry Quality Control program has a good design, and with this modeling approach has sufficient precision to measure performance at each time point and to allow laboratories with significantly lower performance to be targeted for advice.
Ehrlich, Matthias; Schüffny, René
2013-01-01
One of the major outcomes of neuroscientific research is models of Neural Network Structures (NNSs). Descriptions of these models usually consist of a non-standardized mixture of text, figures, and other means of visual information communication in print media. However, as neuroscience is an interdisciplinary domain by nature, a standardized way of consistently representing models of NNSs is required. While generic descriptions of such models in textual form have recently been developed, a formalized way of schematically expressing them does not exist to date. Hence, in this paper we present Neural Schematics as a concept, inspired by similar approaches from other disciplines, for a generic two-dimensional representation of said structures. After introducing NNSs in general, a set of current visualizations of models of NNSs is reviewed and analyzed for what information they convey and how their elements are rendered. This analysis then allows for the definition of general items and symbols to consistently represent these models as Neural Schematics on a two-dimensional plane. We illustrate the possibilities an agreed-upon standard can yield with sample diagrams transformed into Neural Schematics and an example application for the design and modeling of large-scale NNSs.
Integrated Medical Model (IMM) 4.0 Verification and Validation (VV) Testing (HRP IWS 2016)
NASA Technical Reports Server (NTRS)
Walton, M; Kerstman, E.; Arellano, J.; Boley, L.; Reyes, D.; Young, M.; Garcia, Y.; Saile, L.; Myers, J.
2016-01-01
Timeline, partial treatment, and alternate medications were added to the IMM to improve the fidelity of this model to enhance decision support capabilities. Using standard design reference missions, IMM VV testing compared outputs from the current operational IMM (v3) with those from the model with added functionalities (v4). These new capabilities were examined in a comparative, stepwise approach as follows: a) comparison of the current operational IMM v3 with the enhanced functionality of timeline alone (IMM 4.T), b) comparison of IMM 4.T with the timeline and partial treatment (IMM 4.TPT), and c) comparison of IMM 4.TPT with timeline, partial treatment and alternative medication (IMM 4.0).
Current state of the mass storage system reference model
NASA Technical Reports Server (NTRS)
Coyne, Robert
1993-01-01
IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces and, in the distributed case, the associated protocols for each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees and chairs; IEEE standards activity board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.
Probing new physics with atmospheric neutrinos at KM3NeT-ORCA
NASA Astrophysics Data System (ADS)
Coelho, João A. B.
2017-09-01
We present the prospects of ORCA searches for new physics phenomena using atmospheric neutrinos. Focus is given to exploiting the impact of strong matter effects on the oscillation of atmospheric neutrinos in light of extended models, such as sterile neutrinos and non-standard interactions. In the presence of light sterile neutrinos that mix with active neutrinos, additional resonances and suppressions may occur at different energies. One may also use neutrino oscillations to probe the properties of coherent forward scattering, which may be altered by new interactions beyond the Standard Model. Preliminary studies show that ORCA would be able to probe some parameters of these models with sensitivity up to one order of magnitude better than current constraints.
Six-quark decays of the Higgs boson in supersymmetry with R-parity violation.
Carpenter, Linda M; Kaplan, David E; Rhee, Eun-Jung
2007-11-23
Both electroweak precision measurements and simple supersymmetric extensions of the standard model prefer a mass of the Higgs boson less than the experimental lower limit (on a standard-model-like Higgs boson) of 114 GeV. We show that supersymmetric models with R parity violation and baryon-number violation have a significant range of parameter space in which the Higgs boson dominantly decays to six jets. These decays are much more weakly constrained by current CERN LEP analyses and would allow for a Higgs boson mass near that of the Z. In general, lighter scalar quark and other superpartner masses are allowed. The Higgs boson would potentially be discovered at hadron colliders via the appearance of new displaced vertices.
Absolute Spectrophotometric Calibration to 1% from the FUV through the near-IR
NASA Astrophysics Data System (ADS)
Finley, David
2005-07-01
We propose a significant improvement to the existing HST calibration. The current calibration is based on three primary DA white dwarf standards, GD 71, GD 153, and G 191-B2B. The standard fluxes are calculated using NLTE models, with effective temperatures and gravities that were derived from Balmer line fits using LTE models. We propose to improve the accuracy and internal consistency of the calibration by deriving corrected effective temperatures and gravities based on fitting the observed line profiles with updated NLTE models, and including the fit results from multiple STIS spectra, rather than the (usually) 1 or 2 ground-based spectra used previously. We will also determine the fluxes for 5 new, fainter primary or secondary standards, extending the standard V magnitude faint limit from 13.4 to 16.5, and extending the wavelength coverage from 0.1 to 2.5 microns. The goal is to achieve an overall flux accuracy of 1%, which will be needed, for example, for the upcoming supernova survey missions to measure the equation of state of the dark energy that is accelerating the expansion of the universe.
Truong, Dennis Q.; Magerowski, Greta; Blackburn, George L.; Bikson, Marom; Alonso-Alonso, Miguel
2013-01-01
Recent studies show that acute neuromodulation of the prefrontal cortex with transcranial direct current stimulation (tDCS) can decrease food craving, attentional bias to food, and actual food intake. These data suggest potential clinical applications for tDCS in the field of obesity. However, optimal stimulation parameters in obese individuals are uncertain. One fundamental concern is whether a thick, low-conductivity layer of subcutaneous fat around the head can affect current density distribution and require dose adjustments during tDCS administration. The aim of this study was to investigate the role of head fat on the distribution of current during tDCS and evaluate whether dosing standards for tDCS developed for adult individuals in general are adequate for the obese population. We used MRI-derived high-resolution computational models that delineated fat layers in five human heads from subjects with body mass index (BMI) ranging from “normal-lean” to “super-obese” (20.9 to 53.5 kg/m2). Data derived from these simulations suggest that head fat influences tDCS current density across the brain, but its relative contribution is small when other components of head anatomy are added. Current density variability between subjects does not appear to have a direct and/or simple link to BMI. These results indicate that guidelines for the use of tDCS can be extrapolated to obese subjects without sacrificing efficacy and/or treatment safety; the recommended standard parameters can lead to the delivery of adequate current flow to induce neuromodulation of brain activity in the obese population.
Current challenges in fundamental physics
NASA Astrophysics Data System (ADS)
Egana Ugrinovic, Daniel
The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.
Spacetime Curvature and Higgs Stability after Inflation.
Herranen, M; Markkanen, T; Nurmi, S; Rajantie, A
2015-12-11
We investigate the dynamics of the Higgs field at the end of inflation in the minimal scenario consisting of an inflaton field coupled to the standard model only through the nonminimal gravitational coupling ξ of the Higgs field. Such a coupling is required by renormalization of the standard model in curved space, and in the current scenario also by vacuum stability during high-scale inflation. We find that for ξ≳1, rapidly changing spacetime curvature at the end of inflation leads to significant production of Higgs particles, potentially triggering a transition to a negative-energy Planck scale vacuum state and causing an immediate collapse of the Universe.
2016-09-01
A government-commissioned review of data security across health and care has led to the proposal of new standards for security and options for a consent/opt-out model. Standards include that all staff complete appropriate annual data security training and pass a mandatory test provided through the revised Information Governance Toolkit, that personal confidential data is only accessible to staff who need it for their current role, and that access is removed as soon as it is no longer required. The consent/opt-out model is outlined under 8 statements, and includes certain circumstances where it will not apply, for example, where there is an overriding public interest, or mandatory legal requirement.
Strategic Deployment of Clinical Models.
Goossen, William
2016-01-01
The selection, implementation, and certification of electronic health records (EHRs) could benefit from the required use of one of the established clinical model approaches. For the lifelong record of data about individuals, issues arise about the permanence and preservation of data during, or even beyond, a lifetime. Current EHRs do not fully adhere to pertinent standards for clinical data, even though it has been known for some 20-plus years that standardization of health data is a cornerstone of patient safety, interoperability, data retrieval for various purposes, and the lifelong preservation of such data. This paper briefly introduces the issues and gives a brief recommendation for future work in this area.
Swat, M J; Moodie, S; Wimalaratne, S M; Kristensen, N R; Lavielle, M; Mari, A; Magni, P; Smith, M K; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, A C; Kaye, R; Keizer, R; Kloft, C; Kok, J N; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, H B; Parra-Guillen, Z P; Plan, E; Ribba, B; Smith, G; Trocóniz, I F; Yvon, F; Milligan, P A; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N
2015-06-01
The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps.
Hybrid simulations of magnetic reconnection with kinetic ions and fluid electron pressure anisotropy
Le, A.; Daughton, W.; Karimabadi, H.; ...
2016-03-16
We present the first hybrid simulations with kinetic ions and recently developed equations of state for the electron fluid appropriate for reconnection with a guide field. The equations of state account for the main anisotropy of the electron pressure tensor. Magnetic reconnection is studied in two systems, an initially force-free current sheet and a Harris sheet. The hybrid model with the equations of state is compared to two other models, hybrid simulations with isothermal electrons and fully kinetic simulations. Including the anisotropic equations of state in the hybrid model provides a better match to the fully kinetic model. In agreement with fully kinetic results, the main feature captured is the formation of an electron current sheet that extends several ion inertial lengths. This electron current sheet modifies the Hall magnetic field structure near the X-line, and it is not observed in the standard hybrid model with isotropic electrons. The saturated reconnection rate in this regime nevertheless remains similar in all three models. Implications for global modeling are discussed.
NASA Technical Reports Server (NTRS)
Bilitza, D.; Reinisch, B.; Gallagher, D.; Huang, X.; Truhlik, V.; Nsumei, P.
2007-01-01
The goal of this LWS tools effort is the development of a new data-based F-region TOpside and PLAsmasphere (TOPLA) model for the electron density (Ne) and temperature (Te), for inclusion in the International Reference Ionosphere (IRI) model, using newly available satellite data and models for these regions. The IRI model is the de facto international standard for specification of ionospheric parameters and is currently being considered as an ISO Technical Specification for the ionosphere. Our effort is directed towards improving the topside part of the model and extending it into the plasmasphere. Specifically, we plan to overcome the following shortcomings of the current IRI topside model: (1) overestimation of densities above 700 km by a factor of 2 and more, (2) unrealistically steep density profiles at high latitudes during very high solar activities, (3) no solar cycle variations and no semi-annual variations for the electron temperature, and (4) discontinuities or unphysical gradients when merging with plasmaspheric models. We will report on first accomplishments and on the current status of the project.
A cloud-based information repository for bridge monitoring applications
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.
2016-04-01
This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, particularly with instrumented sensors, collects a significant amount of data. In addition to sensor data, a wide variety of information such as bridge geometry, analysis models, and sensor descriptions needs to be stored. Data management plays an important role in facilitating data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed and provide a means to enable integration and facilitate interoperability, current BrIM standards mostly support information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSiBridge, a commercial software package widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for the data repository. Cloud service infrastructure is deployed to enhance the scalability, flexibility, and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.
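To make the flavor of such a repository concrete, the sketch below shows one plausible NoSQL document linking the three kinds of information the abstract describes: geometry (BrIM), an analysis-model reference, and sensor metadata. All field names, paths, and values are hypothetical illustrations, not the OpenBrIM extension actually defined in the paper:

```python
import json

# Hypothetical document for a document-oriented (NoSQL) bridge repository.
# Field names and file references are illustrative only.
bridge_doc = {
    "bridge_id": "telegraph-road-bridge",
    "geometry": {"brim_ref": "openbrim/trb_geometry.xml", "spans_m": [18.3, 38.1, 18.3]},
    "analysis_model": {"tool": "CSiBridge", "model_ref": "models/trb_fe_model.bdb"},
    "sensors": [
        {"sensor_id": "acc-01", "type": "accelerometer",
         "location": {"span": 2, "x_m": 12.7},
         "sensorml_ref": "sensors/acc-01_sensorml.xml"},
    ],
}
print(json.dumps(bridge_doc, indent=2))  # one self-describing record per bridge
```

Keeping geometry, model, and sensor records in one self-describing document is what lets a schema-flexible store scale across bridges on a cloud platform.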
Kranz, J; Sommer, K-J; Steffens, J
2014-05-01
Patient safety and risk/complication management rank among the current megatrends in modern medicine, which has undoubtedly become more complex. In time-critical, error-prone, and difficult situations, which occur repeatedly in everyday clinical practice, guidelines are ill-suited for acting rapidly and intelligently. The establishment and consistent use of standard operating procedures, as in commercial aviation, offers a possible strategic approach. These medical decision-making aids - quick reference cards - are short, optimized instructions that enable a standardized procedure in case of medical claims.
Inclusion of policies on ethical standards in animal experiments in biomedical science journals.
Rands, Sean A
2011-11-01
Most published biomedical research involving animal models is evaluated carefully to ensure that appropriate ethical standards are met. In the current study, 500 journals randomly selected from MedLine were assessed for whether they presented animal research. Of the 138 journals that did, the instructions to authors of 85 (61.6%) included a requirement for author assurance of adherence to ethical standards during experiments involving animals. In comparison to a wider range of biologic journals, biomedical science journals were more likely to have some sort of ethical policy concerning the reporting and presentation of animal experiments.
Usability and accessibility in consumer health informatics current trends and future challenges.
Goldberg, Larry; Lide, Bettijoyce; Lowry, Svetlana; Massett, Holly A; O'Connell, Trisha; Preece, Jennifer; Quesenbery, Whitney; Shneiderman, Ben
2011-05-01
It is a truism that, for innovative eHealth systems to have true value and impact, they must first and foremost be usable and accessible by clinicians, consumers, and other stakeholders. In this paper, current trends and future challenges in the usability and accessibility of consumer health informatics will be described. Consumer expectations of their healthcare providers and healthcare records in this new era of consumer-directed care will be explored, and innovative visualizations, assistive technologies, and other ways that healthcare information is currently being provided and/or shared will be described. Challenges for ensuring the usability of current and future systems will also be discussed. An innovative model for conducting systematic, timely, user-centered research on consumer-facing websites at the National Cancer Institute (NCI) and the ongoing efforts at the National Institute of Standards and Technology (NIST) to promote health information technology (HIT) usability standards and evaluation criteria will also be presented.
Run Environment and Data Management for Earth System Models
NASA Astrophysics Data System (ADS)
Widmann, H.; Lautenschlager, M.; Fast, I.; Legutke, S.
2009-04-01
The Integrating Model and Data Infrastructure (IMDI) developed and maintained by the Model and Data Group (M&D) comprises the Standard Compile Environment (SCE) and the Standard Run Environment (SRE). The IMDI software has a modular design, which allows a suite of model components to be combined and coupled, and tasks to be executed independently and on various platforms. Furthermore, the modular structure enables extension to new model combinations and new platforms. The SRE presented here enables the configuration and execution of earth system model experiments, from model integration up to storage and visualization of data. We focus on recently implemented tasks such as synchronous database filling, graphical monitoring, and automatic generation of metadata in XML form during run time. We also address the capability to run experiments in heterogeneous IT environments, with different computing systems for model integration, data processing, and storage. These features are demonstrated for model configurations and platforms used in current or upcoming projects, e.g. MILLENNIUM or IPCC AR5.
Defending Against Advanced Persistent Threats Using Game-Theory
König, Sandra; Schauer, Stefan
2017-01-01
Advanced persistent threats (APT) combine a variety of different attack forms ranging from social engineering to technical exploits. The diversity and usual stealthiness of APT turns them into a central problem of contemporary practical system security, since information on attacks, the current system status or the attacker’s incentives is often vague, uncertain and in many cases even unavailable. Game theory is a natural approach to model the conflict between the attacker and the defender, and this work investigates a generalized class of matrix games as a risk mitigation tool for an advanced persistent threat (APT) defense. Unlike standard game and decision theory, our model is tailored to capture and handle the full uncertainty that is immanent to APTs, such as disagreement among qualitative expert risk assessments, unknown adversarial incentives and uncertainty about the current system state (in terms of how deeply the attacker may have penetrated into the system’s protective shells already). Practically, game-theoretic APT models can be derived straightforwardly from topological vulnerability analysis, together with risk assessments as they are done in common risk management standards like the ISO 31000 family. Theoretically, these models come with different properties than classical game theoretic models, whose technical solution presented in this work may be of independent interest.
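As a point of reference for the generalized games the paper studies, the classical baseline is a zero-sum matrix game solved by linear programming: the defender picks a mixed strategy maximizing the worst-case payoff over attacker actions. The sketch below solves that baseline with SciPy; the 3x3 payoff matrix is an invented toy example, and the paper's distribution-valued payoffs are not modeled here:

```python
import numpy as np
from scipy.optimize import linprog

# Toy payoff matrix to the defender (rows: defense actions, cols: attack actions).
A = np.array([[ 0.0, -1.0,  0.5],
              [ 1.0,  0.0, -0.5],
              [-0.5,  1.0,  0.0]])
m, n = A.shape

# Variables z = (x_1..x_m, v): mixed strategy x and guaranteed game value v.
c = np.zeros(m + 1); c[-1] = -1.0                      # maximize v == minimize -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])              # v <= sum_i x_i A_ij for each column j
b_ub = np.zeros(n)
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])  # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, 1)] * m + [(None, None)]                 # v is unbounded

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[-1]
print("defender mixed strategy:", np.round(x, 3), "guaranteed value:", round(v, 3))
```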
ERIC Educational Resources Information Center
Isakson, Carol
2005-01-01
In this article, the author presents several Web sites supporting electronic presentation skills. The sites featured here will help fine-tune one's skills in modeling effective presentations and provide suggestions for managing student presentations meeting National Educational Technology Standards (NETS). Most use PowerPoint, the current industry…
Student Centered Curriculum: Elementary School
ERIC Educational Resources Information Center
Rondone, Atria
2014-01-01
Student-centered learning has an important place in education because it fosters student engagement and allows the traditional micromanaging teacher to transform into a guide. The current education model emphasizes teacher control and curriculum based on standardized testing, which stunts students' natural learning processes. This study…
Frequency domain model for analysis of paralleled, series-output-connected Mapham inverters
NASA Technical Reports Server (NTRS)
Brush, Andrew S.; Sundberg, Richard C.; Button, Robert M.
1989-01-01
The Mapham resonant inverter is characterized as a two-port network driven by a selected periodic voltage. The two-port model is then used to model a pair of Mapham inverters connected in series and employing phasor voltage regulation. It is shown that the model is useful for predicting power output in paralleled inverter units, and for predicting harmonic current output of inverter pairs, using standard power flow techniques. Some sample results are compared to data obtained from testing hardware inverters.
A per-cent-level determination of the nucleon axial coupling from quantum chromodynamics.
Chang, C C; Nicholson, A N; Rinaldi, E; Berkowitz, E; Garron, N; Brantley, D A; Monge-Camacho, H; Monahan, C J; Bouchard, C; Clark, M A; Joó, B; Kurth, T; Orginos, K; Vranas, P; Walker-Loud, A
2018-06-01
The axial coupling of the nucleon, g_A, is the strength of its coupling to the weak axial current of the standard model of particle physics, in much the same way as the electric charge is the strength of the coupling to the electromagnetic current. This axial coupling dictates the rate at which neutrons decay to protons, the strength of the attractive long-range force between nucleons, and other features of nuclear physics. Precision tests of the standard model in nuclear environments require a quantitative understanding of nuclear physics that is rooted in quantum chromodynamics, a pillar of the standard model. The importance of g_A makes it a benchmark quantity to determine theoretically, a difficult task because quantum chromodynamics is non-perturbative, precluding known analytical methods. Lattice quantum chromodynamics provides a rigorous, non-perturbative definition of quantum chromodynamics that can be implemented numerically. It has been estimated that a precision of two per cent would be possible by 2020 if two challenges are overcome [1,2]: contamination of g_A from excited states must be controlled in the calculations, and statistical precision must be improved markedly [2-10]. Here we use an unconventional method [11] inspired by the Feynman-Hellmann theorem that overcomes these challenges. We calculate a g_A value of 1.271 ± 0.013, which has a precision of about one per cent.
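Schematically, the Feynman-Hellmann idea is to add a source term for the axial current to the Hamiltonian and read g_A off the response of the nucleon energy, avoiding an explicit three-point function. This is a sketch of the general idea, not the paper's exact conventions or lattice implementation:

```latex
H_\lambda = H_{\mathrm{QCD}} + \lambda \int d^3x\,
            \bar{q}\,\gamma_3\gamma_5\,\tfrac{\tau_3}{2}\,q ,
\qquad
\frac{\partial E_\lambda}{\partial\lambda}
  = \left\langle \psi_\lambda \middle| \frac{\partial H_\lambda}{\partial\lambda} \middle| \psi_\lambda \right\rangle ,
\qquad
g_A = \left.\frac{\partial E_N(\lambda)}{\partial\lambda}\right|_{\lambda=0}
```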
Integrity modelling of tropospheric delay models
NASA Astrophysics Data System (ADS)
Rózsa, Szabolcs; Bastiaan Ober, Pieter; Mile, Máté; Ambrus, Bence; Juni, Ildikó
2017-04-01
The effect of the neutral atmosphere on signal propagation is routinely estimated by various tropospheric delay models in satellite navigation. Although numerous studies in the literature investigate the accuracy of these models, for safety-of-life applications it is crucial to study and model the worst-case performance of these models at very low recurrence frequencies. The main objective of the INTegrity of TROpospheric models (INTRO) project, funded by the ESA PECS programme, is to establish a model (or models) of the residual error of existing tropospheric delay models for safety-of-life applications. Such models are required to overbound rare tropospheric delays and should thus include the tails of the error distributions. Their use should lead to safe error bounds on the user position and should allow computation of protection levels for the horizontal and vertical position errors. The current tropospheric model from the RTCA SBAS Minimum Operational Performance Standards has an associated residual error of 0.12 meters in the vertical direction. This value is derived by simply extrapolating the observed distribution of the residuals into the tail (where no data are present) and taking the point where the cumulative distribution would reach an exceedance level of 10^-7. While the resulting standard deviation is much higher than the standard deviation that best fits the data (0.05 meters), it is surely conservative for most applications. In the context of the INTRO project, some widely used and newly developed tropospheric delay models (e.g. RTCA MOPS, ESA GALTROPO and GPT2W) were tested using 16 years of daily ERA-INTERIM reanalysis numerical weather model data and the ray-tracing technique. The results showed that the performance of some of the widely applied models has a clear seasonal dependency and is also affected by geographical position. In order to provide a more realistic, but still conservative, estimate of the residual error of tropospheric delays, the mathematical formulation of the overbounding models is currently under development. This study introduces the main findings of the residual-error analysis of the studied tropospheric delay models and discusses the preliminary analysis of the integrity model development for safety-of-life applications.
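The tail-overbounding step described above can be illustrated with a toy computation: choose the smallest zero-mean Gaussian sigma whose tail is at least as heavy as the empirical residual distribution at a set of exceedance levels. This is a simplified illustration on invented data, not the INTRO project's actual overbounding formulation; certifying a 10^-7 bound requires principled extrapolation beyond the observed tail:

```python
import numpy as np
from scipy.stats import norm

def overbound_sigma(residuals, probs=(1e-2, 1e-3, 1e-4)):
    """Smallest zero-mean Gaussian sigma whose two-sided tail bounds the
    empirical exceedance at each requested probability level (toy version)."""
    r = np.abs(np.asarray(residuals))
    return max(np.quantile(r, 1.0 - p) / norm.ppf(1.0 - p / 2.0) for p in probs)

# Synthetic heavy-tailed tropospheric residuals (metres), for illustration only.
rng = np.random.default_rng(1)
res = rng.standard_t(df=4, size=1_000_000) * 0.05
print(f"best-fit std: {res.std():.3f} m, overbounding sigma: {overbound_sigma(res):.3f} m")
```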
On the recovery of electric currents in the liquid core of the Earth
NASA Astrophysics Data System (ADS)
Kuslits, Lukács; Prácser, Ernő; Lemperger, István
2017-04-01
Inverse geodynamo modelling has become a standard method for obtaining a more accurate image of the processes within the outer core. In this poster, excerpts from the preliminary results of another approach are presented. It concerns the possibility of recovering the currents within the liquid core directly, using Main Magnetic Field data. Different systems of charge flow can be approximated with various geometries. Based on previous geodynamo simulations, current coils can furnish a good initial geometry for such an estimation. The presentation introduces our preliminary test results and a study of the reliability of the applied inversion algorithm for different numbers of coils, distributed in a grid symbolizing the domain between the inner-core and core-mantle boundaries. We shall also present inverted current structures using Main Field model data.
CMB constraints on β-exponential inflationary models
NASA Astrophysics Data System (ADS)
Santos, M. A.; Benetti, M.; Alcaniz, J. S.; Brito, F. A.; Silva, R.
2018-03-01
We analyze a class of generalized inflationary models proposed in ref. [1], known as β-exponential inflation. We show that this kind of potential can arise in the context of brane cosmology, where the field describing the size of the extra dimension is interpreted as the inflaton. We discuss the observational viability of this class of models in light of the latest Cosmic Microwave Background (CMB) data from the Planck Collaboration through a Bayesian analysis, and impose tight constraints on the model parameters. We find that the CMB data alone weakly prefer the minimal standard model (ΛCDM) over β-exponential inflation. However, when current local measurements of the Hubble parameter, H0, are considered, the β-inflation model is moderately preferred over the ΛCDM cosmology, making the study of this class of inflationary models interesting in the context of the current H0 tension.
NASA Astrophysics Data System (ADS)
Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Herzog, P.; Knecht, A.; Köster, U.; Kraev, I. S.; Porobic, T.; Prashanth, P. N.; Towner, I. S.; Tramm, C.; Zákoucký, D.; Severijns, N.
2014-09-01
Background: Precision measurements at low energy search for physics beyond the standard model in a way complementary to searches for new particles at colliders. In the weak sector, the most general β-decay Hamiltonian contains, besides vector and axial-vector terms, also scalar, tensor, and pseudoscalar terms. Current limits on the scalar and tensor coupling constants from neutron and nuclear β decay are at the level of several percent. Purpose: Extracting new information on tensor coupling constants by measuring the β-asymmetry parameter in the pure Gamow-Teller decay of 67Cu, thereby testing the V-A structure of the weak interaction. Method: An iron sample foil into which the radioactive nuclei were implanted was cooled down to mK temperatures in a 3He-4He dilution refrigerator. An external magnetic field of 0.1 T, in combination with the internal hyperfine magnetic field, oriented the nuclei. The anisotropic β radiation was observed with planar high-purity germanium detectors operating at a temperature of about 10 K. An on-line measurement of the β asymmetry of 68Cu was performed as well for normalization purposes. Systematic effects were investigated using geant4 simulations. Results: The experimental value, Ã = 0.587(14), is in agreement with the standard model value of 0.5991(2) and is interpreted in terms of physics beyond the standard model. The limits obtained on possible tensor-type charged currents in the weak interaction Hamiltonian are -0.045 < (C_T + C_T')/C_A < 0.159 (90% C.L.). Conclusions: The obtained limits are comparable to limits from other correlation measurements in nuclear β decay and contribute to further constraining tensor coupling constants.
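For context, the observable behind the quoted Ã is the standard angular distribution of β particles emitted from polarized nuclei; the measured asymmetry folds the nuclear polarization P and the kinematic factor v/c into the count-rate anisotropy. A schematic form (not the paper's full expression with experiment-specific correction factors):

```latex
W(\theta) \;\propto\; 1 + \frac{v}{c}\,P\,\tilde{A}\,\cos\theta
```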
Establishing a Network of faint DA white dwarfs as Spectrophotometric Standards
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Narayan, Gautham; Holberg, Jay; Matheson, Thomas; Olszewski, Edward; Stubbs, Christopher; Bohlin, Ralph; Sabbi, Elena; Deustua, Susana; Rest, Armin; Axelrod, Tim; MacKenty, John W.; Camarota, Larry; Gilliland, Ron
2015-08-01
Systematic uncertainties in photometric calibration are the dominant source of error in current type Ia supernova dark energy studies, as well as in other forefront cosmology efforts, e.g. photometric-redshift determinations for weak-lensing mass tomography. Current and next-generation ground-based all-sky surveys require a network of calibration stars with 1) known SEDs (to properly and unambiguously take into account filter differences), and 2) a common photometric zeropoint scale across the sky to sub-percent accuracy. We are using a combination of HST panchromatic photometry and ground-based spectroscopy to establish such an essential network of faint primary photometric standards, exploiting the well-understood spectral energy distributions of DA white dwarf stars, which are free from the complications of observing through the Earth's time-variable atmosphere. The Balmer features in the spectra are used to deduce the two parameters (temperature and log(g)) from which we model the spectral energy distribution (SED) of these pure-hydrogen-atmosphere stars. By comparing against panchromatic broadband HST photometry, and allowing for an achromatic zero-point adjustment and a mild scaling of the interstellar reddening, we find that model prediction and observation agree to a few milli-mag. By combining the zero-point and reddening adjustments with the modeled SED, for each star we obtain the incident SED above the terrestrial atmosphere, thus establishing these objects as spectrophotometric standards. We are pursuing 23 objects between 16 and 19 mag spread uniformly over the sky around the equator and northern mid-latitudes, with plans to extend this to southern latitudes. This precision photometric heritage from HST will benefit essentially all existing and upcoming survey projects and, in particular, directly addresses one of the current barriers to understanding the nature of dark energy.
Gusts and shear within hurricane eyewalls can exceed offshore wind turbine design standards
NASA Astrophysics Data System (ADS)
Worsnop, Rochelle P.; Lundquist, Julie K.; Bryan, George H.; Damiani, Rick; Musial, Walt
2017-06-01
Offshore wind energy development is underway in the U.S., with proposed sites located in hurricane-prone regions. Turbine design criteria outlined by the International Electrotechnical Commission do not encompass the extreme wind speeds and directional shifts of hurricanes stronger than category 2. We examine a hurricane's turbulent eyewall using large-eddy simulations with Cloud Model 1. Gusts and mean wind speeds near the eyewall of a category 5 hurricane exceed the current Class I turbine design thresholds of 50 m s^-1 mean wind and 70 m s^-1 gusts. The largest gust factors occur at the eye-eyewall interface. Further, shifts in wind direction suggest that turbines must rotate or yaw faster than current practice. Although current design standards omit mention of wind direction change across the rotor layer, large values (15-50°) suggest that veer should be considered.
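The threshold comparison in this abstract reduces to simple arithmetic on a (mean wind, gust) pair, as the toy check below shows; the sample values are invented, not simulation output from the paper:

```python
# IEC Class I design thresholds quoted in the abstract (m/s).
IEC_CLASS_I_MEAN = 50.0   # 10-min mean wind
IEC_CLASS_I_GUST = 70.0   # short-duration gust

def exceeds_class_i(mean_wind, gust):
    """Flag which Class I thresholds a (mean, gust) wind pair exceeds."""
    return {"mean": mean_wind > IEC_CLASS_I_MEAN, "gust": gust > IEC_CLASS_I_GUST}

# Hypothetical eyewall samples (mean, gust) in m/s, for illustration only.
for mean_wind, gust in [(45.0, 63.0), (55.0, 80.0), (70.0, 100.0)]:
    print(f"mean={mean_wind}, gust={gust}, gust factor={gust / mean_wind:.2f},",
          exceeds_class_i(mean_wind, gust))
```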
Constraints and spandrels of interareal connectomes
Rubinov, Mikail
2016-01-01
Interareal connectomes are whole-brain wiring diagrams of white-matter pathways. Recent studies have identified modules, hubs, module hierarchies and rich clubs as structural hallmarks of these wiring diagrams. An influential current theory postulates that connectome modules are adequately explained by evolutionary pressures for wiring economy, but that the other hallmarks are not explained by such pressures and are therefore less trivial. Here, we use constraint network models to test these postulates in current gold-standard vertebrate and invertebrate interareal-connectome reconstructions. We show that empirical wiring-cost constraints inadequately explain connectome module organization, and that simultaneous module and hub constraints induce the structural byproducts of hierarchies and rich clubs. These byproducts, known as spandrels in evolutionary biology, include the structural substrate of the default-mode network. Our results imply that currently standard connectome characterizations are based on circular analyses or double dipping, and we emphasize an integrative approach to future connectome analyses for avoiding such pitfalls.
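The constraint-model logic, stripped to its simplest form, is: generate surrogate networks that preserve a chosen constraint, then ask whether a structural hallmark survives in the surrogates. The sketch below uses degree-preserving edge swaps on a stand-in graph; the paper's models additionally constrain wiring cost and module membership, which this toy does not:

```python
import networkx as nx

G = nx.connected_watts_strogatz_graph(60, 6, 0.1, seed=1)  # stand-in "connectome"

def degree_preserving_surrogate(G, swaps_per_edge=10, seed=0):
    """Surrogate network preserving the degree sequence via double-edge swaps."""
    H = G.copy()
    nx.double_edge_swap(H, nswap=swaps_per_edge * H.number_of_edges(),
                        max_tries=10**6, seed=seed)
    return H

observed = nx.average_clustering(G)
null = [nx.average_clustering(degree_preserving_surrogate(G, seed=s)) for s in range(20)]
print(f"observed clustering {observed:.3f} vs null mean {sum(null) / len(null):.3f}")
```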
Recent evaluations of crack-opening-area in circumferentially cracked pipes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, S.; Brust, F.; Ghadiali, N.
1997-04-01
Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields, which are present because of the expected dynamic effects of pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses, which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks, as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to quantitatively assess the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.
NASA Astrophysics Data System (ADS)
Takano, Yukinori; Hirata, Akimasa; Fujiwara, Osamu
Human exposure to electric and/or magnetic fields at low frequencies may cause direct effects such as nerve stimulation and excitation. Therefore, basic restrictions are specified in terms of the induced current density in the ICNIRP guidelines and the in-situ electric field in the IEEE standard. An external electric or magnetic field that does not produce induced quantities exceeding the basic restriction is used as the reference level. The relationship between the basic restriction and the reference level for low-frequency electric and magnetic fields has been investigated using European anatomic models, but only to a limited extent for Japanese models, especially for electric-field exposures. In addition, that relationship has not been well discussed. In the present study, we calculated the induced quantities in anatomic Japanese male and female models exposed to electric and magnetic fields at the reference level. A quasi-static finite-difference time-domain (FDTD) method was applied to analyze this problem. As a result, the spatially averaged induced current density was found to be more sensitive to the averaging algorithm than the in-situ electric field. For electric and magnetic field exposure at the ICNIRP reference level, the maximum values of the induced current density for different averaging algorithms were smaller than the basic restriction in most cases. For exposures at the reference level in the IEEE standard, the maximum electric fields in the brain were larger than the basic restriction for the brain, while those in the spinal cord and heart were smaller.
Method of Calculating the Correction Factors for Cable Dimensioning in Smart Grids
NASA Astrophysics Data System (ADS)
Simutkin, M.; Tuzikova, V.; Tlusty, J.; Tulsky, V.; Muller, Z.
2017-04-01
One of the main causes of overloading of electrical equipment by higher-harmonic currents is the rapid increase in the number of non-linear power consumers. Non-sinusoidal voltages and currents affect the operation of electrical equipment, reducing its lifetime, increasing voltage and power losses in the network, and reducing its capacity. Existing standards that limit the emission of harmonic currents cannot guarantee a safe interference level in the power grid. The article presents a method for determining a correction factor for the long-term allowable current of a cable, which accounts for this influence. Thermal processes in the cable under non-sinusoidal current flow were described using mathematical models in the Elcut software. The theoretical principles, methods, and mathematical models developed in the article allow the correction factor accounting for higher harmonics in the current spectrum to be calculated for network equipment under any type of non-linear load.
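One common way to frame such a correction factor (a simplified illustration, not the Elcut-based thermal model of the paper) is to require that total I²R heating with harmonics not exceed the sinusoidal rating, with the AC resistance rising with harmonic order because of skin effect. Under those assumptions the derating factor follows directly from the harmonic spectrum:

```python
import numpy as np

def harmonic_derating_factor(orders, amplitudes, r_ratio=lambda h: np.sqrt(h)):
    """Correction factor k <= 1 on the cable's long-term allowable RMS current
    so that total I^2*R heating with harmonics matches the sinusoidal rating.
    amplitudes are I_h/I_1; r_ratio(h) ~ R_AC(h*f1)/R_AC(f1), with sqrt(h) a
    crude skin-effect approximation used here only for illustration."""
    a = np.asarray(amplitudes, dtype=float)
    r = np.array([r_ratio(h) for h in orders])
    return float(np.sqrt(a.dot(a) / (a * a).dot(r)))

# Hypothetical spectrum of a six-pulse rectifier load (illustrative values):
orders = [1, 5, 7, 11, 13]
amps = [1.0, 0.20, 0.14, 0.09, 0.08]
print(f"derating factor k = {harmonic_derating_factor(orders, amps):.3f}")  # ~0.95
```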
A composite model for the 750 GeV diphoton excess
Harigaya, Keisuke; Nomura, Yasunori
2016-03-14
We study a simple model in which the recently reported 750 GeV diphoton excess arises from a composite pseudo Nambu-Goldstone boson — hidden pion — produced by gluon fusion and decaying into two photons. The model only introduces an extra hidden gauge group at the TeV scale with a vectorlike quark in the bifundamental representation of the hidden and standard model gauge groups. We calculate the masses of all the hidden pions and analyze their experimental signatures and constraints. We find that two colored hidden pions must be near the current experimental limits, and hence are probed in the near future. We study the physics of would-be stable particles — the composite states that do not decay purely by the hidden and standard model gauge dynamics — in detail, including constraints from cosmology. We discuss possible theoretical structures above the TeV scale, e.g. conformal dynamics and supersymmetry, and their phenomenological implications. We also discuss an extension of the minimal model in which there is an extra hidden quark that is singlet under the standard model and has a mass smaller than the hidden dynamical scale. This provides two standard model singlet hidden pions that can both be viewed as diphoton/diboson resonances produced by gluon fusion. We discuss several scenarios in which these (and other) resonances can be used to explain various excesses seen in the LHC data.
ERIC Educational Resources Information Center
Porter, Priscilla
Students focus on people who make a difference. The unit features men and women whose achievements have had a direct or indirect influence in the students' lives. These individuals include heroes from long ago and the recent past along with people who are currently active in the local community. The unit is crafted around biographies. It aims to…
Combustible Cartridge Cases: Current Status and Future Prospects
1992-08-01
and presents new models of combustible cartridge case burning. Following a summary of experimental data, it concludes with proposals for further…
NASA Astrophysics Data System (ADS)
Tokuhama-Espinosa, Tracey Noel
Concepts from neuroeducation, commonly referred to in the popular press as "brain-based learning," have been applied indiscriminately and inconsistently to classroom teaching practices for many years. While standards exist in neurology, psychology, and pedagogy, there are no agreed-upon standards at their intersection, neuroeducation, and a formal bridge linking the fields is missing. This study used grounded theory development to determine the parameters of the emerging neuroeducational field based on a meta-analysis of the literature over the past 30 years, which included over 2,200 documents. This research resulted in a new model for neuroeducation. The design of the new model was followed by a Delphi survey of 20 international experts from six countries, which further refined the model contents over several months of reflection. Finally, the revised model was compared to existing information sources, including the popular press, peer-reviewed journals, academic publications, teacher-training textbooks, and the Internet, to determine to what extent standards in neuroeducation are met in the current literature. This study determined that standards in the emerging field, labeled Mind, Brain, and Education: The Science of Teaching and Learning after the Delphi rounds, are the union of standards in the parent fields of neuroscience, psychology, and education. Additionally, the Delphi expert panel agreed upon the goals of the new discipline, its history, its thought leaders, and a model for judging the quality of information. The study culminated in a new model of the academic discipline of Mind, Brain, and Education science, which explains the tenets, principles, and instructional guidelines supported by the meta-analysis of the literature and the Delphi responses.
Robust constraint on cosmic textures from the cosmic microwave background.
Feeney, Stephen M; Johnson, Matthew C; Mortlock, Daniel J; Peiris, Hiranya V
2012-06-15
Fluctuations in the cosmic microwave background (CMB) contain information which has been pivotal in establishing the current cosmological model. These data can also be used to test well-motivated additions to this model, such as cosmic textures. Textures are a type of topological defect that can be produced during a cosmological phase transition in the early Universe, and which leave characteristic hot and cold spots in the CMB. We apply Bayesian methods to carry out a rigorous test of the texture hypothesis, using full-sky data from the Wilkinson Microwave Anisotropy Probe. We conclude that current data do not warrant augmenting the standard cosmological model with textures. We rule out at 95% confidence models that predict more than 6 detectable cosmic textures on the full sky.
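The Bayesian machinery behind such a test is model comparison through the evidences: the posterior odds between the texture-augmented model and the standard model follow from the marginal likelihoods. A schematic statement of the standard relations (with flat model priors, the Bayes factor alone decides):

```latex
\frac{P(M_{\mathrm{tex}}\mid D)}{P(M_{\Lambda\mathrm{CDM}}\mid D)}
 = \frac{Z_{\mathrm{tex}}}{Z_{\Lambda\mathrm{CDM}}}\,
   \frac{P(M_{\mathrm{tex}})}{P(M_{\Lambda\mathrm{CDM}})},
\qquad
Z_M = \int P(D\mid\theta,M)\,P(\theta\mid M)\,d\theta
```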
Electronic field emission models beyond the Fowler-Nordheim one
NASA Astrophysics Data System (ADS)
Lepetit, Bruno
2017-12-01
We propose several quantum mechanical models to describe electronic field emission from first principles. These models allow us to quantitatively correlate the electronic emission current with the details of the electrode surface at the atomic scale. They all rely on electronic potential energy surfaces obtained from three-dimensional density functional theory calculations. They differ in the quantum mechanical methods (exact or perturbative, time-dependent or time-independent) used to describe tunneling through the electronic potential energy barrier. Comparing these models with one another and with the standard Fowler-Nordheim model in the context of one-dimensional tunneling allows us to assess how the approximations made in each model affect the accuracy of the computed current. Among these methods, the time-dependent perturbative one provides a well-balanced trade-off between accuracy and computational cost.
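For reference, the Fowler-Nordheim baseline these models are compared against has a closed form in its elementary triangular-barrier version. The sketch below evaluates it with the usual first and second FN constants; field-enhancement and image-charge corrections are deliberately omitted, and the sample fields are illustrative:

```python
import numpy as np

A_FN = 1.541434e-6   # first Fowler-Nordheim constant, A eV V^-2
B_FN = 6.830890e9    # second Fowler-Nordheim constant, eV^-3/2 V m^-1

def fowler_nordheim_j(field_v_per_m, workfunction_ev):
    """Emitted current density J (A/m^2) for surface field F and work function phi,
    in the elementary (uncorrected) Fowler-Nordheim form:
        J = (A/phi) * F^2 * exp(-B * phi^{3/2} / F)."""
    F, phi = field_v_per_m, workfunction_ev
    return A_FN * F**2 / phi * np.exp(-B_FN * phi**1.5 / F)

for F in (3e9, 5e9, 8e9):  # typical emission fields in V/m; phi = 4.5 eV (e.g. tungsten)
    print(f"F = {F:.0e} V/m -> J = {fowler_nordheim_j(F, 4.5):.3e} A/m^2")
```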
NASA Astrophysics Data System (ADS)
Northup, E. A.; Kusterer, J.; Quam, B.; Chen, G.; Early, A. B.; Beach, A. L., III
2015-12-01
The current ICARTT file format standards were developed to fulfill the data management needs of the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) campaign in 2004. The goal of the ICARTT file format was to establish a common and simple-to-use data file format to promote data exchange and collaboration among science teams with similar science objectives. ICARTT has been the NASA standard since 2010 and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Despite this level of acceptance, there are a number of issues with the current ICARTT format, especially concerning machine readability. To enhance usability, the ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data producers, users (e.g. modelers), and data managers to collaborate on developing criteria for this file format. Ultimately, this is a cross-agency effort to improve and aggregate the metadata records being produced. After conducting a survey to identify deficiencies in the current format, we determined which were considered most important by the various communities. Numerous recommendations were made to improve the file format while maintaining backward compatibility. The recommendations made to date, along with their advantages and limitations, will be discussed.
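The machine-readability concern is easy to appreciate from how an ICARTT (FFI 1001) file is parsed: the header-line count on line 1 tells a reader how much free-form text to skip before the comma-separated data begin. A minimal reader sketch follows; real files also carry scale factors and missing-data flags in the header, which production code must honor but this sketch ignores:

```python
def read_icartt(path):
    """Minimal ICARTT (FFI 1001) reader: returns column names and data rows.
    Ignores scale factors and missing-data flags declared in the header."""
    with open(path) as f:
        nlhead = int(f.readline().split(",")[0])              # line 1: "NLHEAD, FFI"
        header = [f.readline() for _ in range(nlhead - 1)]
        columns = [c.strip() for c in header[-1].split(",")]  # last header line: names
        data = [[float(v) for v in line.split(",")] for line in f if line.strip()]
    return columns, data

# Usage (hypothetical file name):
# cols, rows = read_icartt("MYDATA_AIRCRAFT_20040715_R0.ict")
```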
Reinventing School Libraries: Alternatives, Models and Options for the Future.
ERIC Educational Resources Information Center
Haycock, Ken
1998-01-01
In collaboration with teachers, teacher librarians have a positive impact on student achievement in content areas, information literacy, and reading motivation and ability. Current professional issues of teacher librarians include role clarification, appropriate preservice education, standards for student information literacy, articulation of…
Improvements to the YbF electron electric dipole moment experiment
NASA Astrophysics Data System (ADS)
Sauer, B. E.; Rabey, I. M.; Devlin, J. A.; Tarbutt, M. R.; Ho, C. J.; Hinds, E. A.
2017-04-01
The standard model of particle physics predicts that the permanent electric dipole moment (EDM) of the electron is very nearly zero. Many extensions to the standard model predict an electron EDM just below current experimental limits. We are currently working to improve the sensitivity of the Imperial College YbF experiment. We have implemented combined laser-radiofrequency pumping techniques that both increase the number of molecules participating in the EDM experiment and increase the probability of detection. Combined, these techniques give nearly two orders of magnitude increase in experimental sensitivity. At this enhanced sensitivity, magnetic effects that were previously negligible become important. We have developed a new way to construct the electrodes for the electric field plates which minimizes the effect of magnetic Johnson noise. The new YbF experiment is expected to be comparable in sensitivity to the most sensitive measurements of the electron EDM to date. We will also discuss laser cooling techniques which promise an even larger increase in sensitivity.
Implications of Higgs’ universality for physics beyond the Standard Model
NASA Astrophysics Data System (ADS)
Goldman, T.; Stephenson, G. J.
2017-06-01
We emulate Cabibbo by assuming a kind of universality for fermion mass terms in the Standard Model. We show that this is consistent with all current data and with the concept that deviations from what we term Higgs’ universality are due to corrections from currently unknown physics of nonetheless conventional form. The application to quarks is straightforward, while the application to leptons makes use of the recognition that Dark Matter can provide the “sterile” neutrinos needed for the seesaw mechanism. Requiring agreement with neutrino oscillation results leads to the prediction that the mass eigenstates of the sterile neutrinos are separated by quadratically larger ratios than for the charged fermions. Using consistency with the global fit to LSND-like, short-baseline oscillations to determine the scale of the lowest mass sterile neutrino strongly suggests that the recently observed astrophysical 3.55 keV γ-ray line is also consistent with the mass expected for the second most massive sterile neutrino in our analysis.
Probing SUSY effects in K_S^0 → μ+μ-
NASA Astrophysics Data System (ADS)
Chobanova, Veronika; D'Ambrosio, Giancarlo; Kitahara, Teppei; Martínez, Miriam Lucio; Santos, Diego Martínez; Fernández, Isabel Suárez; Yamamoto, Kei
2018-05-01
We explore supersymmetric contributions to the decay K_S^0 → μ+μ-, in light of current experimental data. The Standard Model (SM) predicts B(K_S^0 → μ+μ-) ≈ 5 × 10^-12. We find that contributions arising from flavour-violating Higgs penguins can enhance the branching fraction up to ≈ 35 × 10^-12 within different scenarios of the Minimal Supersymmetric Standard Model (MSSM), as well as suppress it down to ≈ 0.78 × 10^-12. Regions with fine-tuned parameters can bring the branching fraction up to the current experimental upper bound, 8 × 10^-10. The mass degeneracy of the heavy Higgs bosons in the MSSM induces correlations between B(K_S^0 → μ+μ-) and B(K_L^0 → μ+μ-). Predictions for the CP asymmetry in K^0 → μ+μ- decays in the context of the MSSM are also given, and can be up to eight times bigger than in the SM.
NASA Occupant Protection Standards Development
NASA Technical Reports Server (NTRS)
Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles
2012-01-01
Historically, spacecraft landing systems have been tested with human volunteers, because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive safety and racing, a new approach was developed. The approach has several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; (5) use the validated IARVs to update standards and vehicle requirements.
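To make the Brinkley approach concrete, a hedged sketch of a dynamic-response calculation: a single-degree-of-freedom spring-damper model driven by the seat acceleration, with DR(t) = ω_n² x(t)/g. The ω_n and ζ values below are placeholders, not the certified per-axis Brinkley coefficients.

```python
# Hedged Brinkley-type dynamic response sketch: integrate
# x'' + 2*zeta*omega_n*x' + omega_n**2 * x = a(t)
# and track the peak of DR(t) = omega_n**2 * x(t) / g.
import math

G = 9.81  # m/s^2

def dynamic_response(accel, dt, omega_n=60.0, zeta=0.2):
    """Peak dynamic response index for an acceleration trace [m/s^2]."""
    x, v, dr_peak = 0.0, 0.0, 0.0
    for a in accel:
        x_dd = a - 2.0 * zeta * omega_n * v - omega_n**2 * x
        v += x_dd * dt          # semi-implicit Euler step
        x += v * dt
        dr_peak = max(dr_peak, abs(omega_n**2 * x / G))
    return dr_peak

# Example: a 50 ms half-sine landing pulse peaking at 10 g.
dt = 1e-4
pulse = [10 * G * math.sin(math.pi * i / 500) for i in range(500)]
print(dynamic_response(pulse, dt))
```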
Building-up a database of spectro-photometric standards from the UV to the NIR
NASA Astrophysics Data System (ADS)
Vernet, J.; Kerber, F.; Mainieri, V.; Rauch, T.; Saitta, F.; D'Odorico, S.; Bohlin, R.; Ivanov, V.; Lidman, C.; Mason, E.; Smette, A.; Walsh, J.; Fosbury, R.; Goldoni, P.; Groot, P.; Hammer, F.; Kaper, L.; Horrobin, M.; Kjaergaard-Rasmussen, P.; Royer, F.
2010-11-01
We present results of a project aimed at establishing a set of 12 spectro-photometric standards over a wide wavelength range from 320 to 2500 nm. Currently no such set of standard stars covering the near-IR is available. Our strategy is to extend the useful range of existing well-established optical flux standards (Oke 1990, Hamuy et al. 1992, 1994) into the near-IR by means of integral field spectroscopy with SINFONI at the VLT combined with state-of-the-art white dwarf stellar atmospheric models (TMAP, Holberg et al. 2008). As a solid reference, we use two primary HST standard white dwarfs GD71 and GD153 and one HST secondary standard BD+17 4708. The data were collected through an ESO “Observatory Programme” over ~40 nights between February 2007 and September 2008.
Realistic model for a fifth force explaining anomaly in 8Be* → 8Be e+e- decay
NASA Astrophysics Data System (ADS)
Gu, Pei-Hong; He, Xiao-Gang
2017-06-01
We propose a theoretical model to explain a 6.8σ anomaly recently reported in the opening angle and invariant mass distributions of e+e- pairs produced in the transition of the excited 8Be* nucleus to its ground state 8Be. The anomaly is explained by a fifth force mediated by a 17 MeV X boson through the decay 8Be* → 8Be X followed by X → e+e-. The X boson comes from an extension of the standard model with two additional U(1) gauge symmetries, producing a protophobic pure vector current interaction with quarks. The model also contains axial-vector current interactions. Although such axial-vector current interactions are strongly constrained by measurements of parity violation in electron-quark scattering, their contributions cancel out in the isoscalar interaction for 8Be* → 8Be X. It is remarkable that the model parameters needed to explain the anomaly survive all known low-energy experimental constraints. The model may also alleviate the long-standing (g - 2)μ anomaly and can be probed by the LHCb experiment.
NASA Astrophysics Data System (ADS)
Biswas, Jhumoor; John, Kuruvilla; Farooqui, Zuber
The recent Intergovernmental Panel on Climate Change report predicts significant temperature increases over the century, which constitute the pulse of climate variability in a region. A modeling study was performed to identify the potential impact of temperature perturbations on tropospheric ozone concentrations in South Texas. A future-case modeling scenario, which incorporates appropriate emission reduction strategies without accounting for climatic inconsistencies, was used in this study. The photochemical modeling was undertaken for a high ozone episode of 13-20 September 1999, and a future modeling scenario was projected for ozone episode days in 2007 utilizing the meteorological conditions prevalent in the base year. The temperatures were increased uniformly throughout the simulation domain and through the vertical layers by 2°C, 3°C, 4°C, 5°C, and 6°C, respectively, in the future-year modeling case. These temperature perturbations represented the outcome of extreme climate change within the study region. Significant changes in peak ozone concentrations were predicted by the photochemical model. For the 6°C temperature perturbation, the greatest amplification in the maximum 8-h ozone concentrations within urban areas of the modeling domain was approximately 12 ppb. In addition, transboundary flux from major industrialized urban areas played a major role in supplementing the high ozone concentrations during the perturbed temperature scenarios. The United States Environmental Protection Agency (USEPA) is currently proposing stricter 8-h ozone standards. The effect of temperature perturbations on ozone exceedances based on current and potentially more stringent future National Ambient Air Quality Standards (NAAQS) was also studied. Temperatures had an appreciable spatial impact on the 8-h ozone exceedances, with a considerable increase in the spatial area exceeding the NAAQS for 8-h ozone within the study region for each successive increase in temperature. The number of exceedances of the 8-h ozone standard increased significantly with each degree rise in temperature, with the problem becoming even more acute in light of stricter proposed future ozone standards.
Spatial data standards meet meteorological data - pushing the boundaries
NASA Astrophysics Data System (ADS)
Wagemann, Julia; Siemen, Stephan; Lamy-Thepaut, Sylvie
2017-04-01
The data archive of the European Centre for Medium-Range Weather Forecasts (ECMWF) holds around 120 PB of data and is the world's largest archive of meteorological data. This information is of great value for many Earth Science disciplines, but the complexity of the data (up to five dimensions and different time-axis domains) and its native data format GRIB, while being an efficient archive format, limit the overall data uptake, especially by users outside the MetOcean domain. ECMWF's MARS WebAPI is a very efficient and flexible system for expert users to access and retrieve meteorological data, though it is challenging for users outside the MetOcean domain. With the help of web-based standards for data access and processing, ECMWF wants to make more than 1 PB of meteorological and climate data more easily accessible to users across different Earth Science disciplines. As the climate data provider for the H2020 project EarthServer-2, ECMWF explores the feasibility of giving on-demand access to its MARS archive via the OGC standard interface Web Coverage Service (WCS). Despite the potential a WCS for climate and meteorological data offers, the standards-based modelling of meteorological and climate data entails many challenges and reveals the boundaries of the current Web Coverage Service 2.0 standard. Challenges range from valid semantic data models for meteorological data to optimal and efficient data structures for a scalable web service. The presentation reviews the applicability of the current Web Coverage Service 2.0 standard to meteorological and climate data and discusses the challenges that must be overcome to achieve real interoperability and to ensure the conformant sharing and exchange of meteorological data.
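As an illustration of the access pattern under discussion, a hedged sketch of a WCS 2.0 GetCoverage request in key-value-pair form; the endpoint URL and coverage identifier are hypothetical, and only standard parameters are used.

```python
# Hedged sketch of an OGC WCS 2.0 GetCoverage request of the kind a climate
# WCS would answer. BASE and coverageId are hypothetical placeholders.
from urllib.parse import urlencode

BASE = "https://example.ecmwf.int/wcs"  # hypothetical endpoint
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "temperature_2m",              # hypothetical coverage
    "subset": 'ansi("2000-01-01T00:00:00Z")',    # trim the time axis
    "format": "application/netcdf",
}
print(f"{BASE}?{urlencode(params)}")
```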
Exotic quarks in Twin Higgs models
Cheng, Hsin -Chia; Jung, Sunghoon; Salvioni, Ennio; ...
2016-03-14
The Twin Higgs model provides a natural theory for the electroweak symmetry breaking without the need of new particles carrying the standard model gauge charges below a few TeV. In the low energy theory, the only probe comes from the mixing of the Higgs fields in the standard model and twin sectors. However, an ultraviolet completion is required below ~ 10 TeV to remove residual logarithmic divergences. In non-supersymmetric completions, new exotic fermions charged under both the standard model and twin gauge symmetries have to be present to accompany the top quark, thus providing a high energy probe of the model. Some of them carry standard model color, and may therefore be copiously produced at current or future hadron colliders. Once produced, these exotic quarks can decay into a top together with twin sector particles. If the twin sector particles escape the detection, we have the irreducible stop-like signals. On the other hand, some twin sector particles may decay back into the standard model particles with long lifetimes, giving spectacular displaced vertex signals in combination with the prompt top quarks. This happens in the Fraternal Twin Higgs scenario with typical parameters, and sometimes is even necessary for cosmological reasons. We study the potential displaced vertex signals from the decays of the twin bottomonia, twin glueballs, and twin leptons in the Fraternal Twin Higgs scenario. As a result, depending on the details of the twin sector, the exotic quarks may be probed up to ~ 2.5 TeV at the LHC and beyond 10 TeV at a future 100 TeV collider, providing a strong test of this class of ultraviolet completions.
On standardization of basic datasets of electronic medical records in traditional Chinese medicine.
Zhang, Hong; Ni, Wandong; Li, Jing; Jiang, Youlin; Liu, Kunjing; Ma, Zhaohui
2017-12-24
Standardization of electronic medical records, so as to enable resource sharing and information exchange among medical institutions, has become inevitable in view of ever-increasing medical information. The current research is an effort towards the standardization of a basic dataset of electronic medical records in traditional Chinese medicine. In this work, an outpatient clinical information model and an inpatient clinical information model are created to adequately depict the diagnosis processes and treatment procedures of traditional Chinese medicine. To be backward compatible with the existing dataset standard created for western medicine, the new standard must be a superset of the existing standard. Thus, the two models are checked against the existing standard in conjunction with 170,000 medical record cases. If a case cannot be covered by the existing standard due to the particularities of Chinese medicine, then either an existing data element is expanded with Chinese medicine content or a new data element is created. Some dataset subsets are also created to group and record Chinese medicine special diagnoses and treatments such as acupuncture. The outcome of this research is a proposal of standardized traditional Chinese medicine medical record datasets. The proposal has been verified successfully in three medical institutions with hundreds of thousands of medical records. A new dataset standard for traditional Chinese medicine is proposed in this paper. The proposed standard, covering traditional Chinese medicine as well as western medicine, is expected to be approved by the authority soon. Widespread adoption of this proposal will enable traditional Chinese medicine hospitals and institutions to easily exchange information and share resources.
Workflow of CAD / CAM Scoliosis Brace Adjustment in Preparation Using 3D Printing.
Weiss, Hans-Rudolf; Tournavitis, Nicos; Nan, Xiaofeng; Borysov, Maksym; Paul, Lothar
2017-01-01
High correction bracing is the most effective conservative treatment for patients with scoliosis during growth. Even today, braces for the treatment of scoliosis are made by casting patients, although computer-aided design (CAD) and computer-aided manufacturing (CAM) are available, with every possibility to standardize pattern-specific brace treatment and improve wearing comfort. CAD/CAM brace production mainly relies on carving a polyurethane foam model, which is the basis for vacuum-forming a polyethylene (PE) or polypropylene (PP) brace. The purpose of this short communication is to describe the workflow currently used and to outline future requirements with respect to 3D printing technology. The steps of virtual brace adjustment as available today are described in this paper, along with an outline of the great potential of future 3D printing technology. For 3D printing of scoliosis braces it is necessary to establish easy-to-use software plug-ins that allow adding 3D printing technology to the current workflow of virtual CAD/CAM brace adjustment. Textures and structures can be added to the brace models at certain well-defined locations, offering the potential of more wearing comfort without losing in-brace correction. Advances have to be made in the field of CAD/CAM software tools with respect to the design and generation of individually structured brace models based on currently well-established and standardized scoliosis brace libraries.
Constraining viscous dark energy models with the latest cosmological data
NASA Astrophysics Data System (ADS)
Wang, Deng; Yan, Yang-Jie; Meng, Xin-He
2017-10-01
Based on the assumption that dark energy possessing bulk viscosity permeates the universe homogeneously and isotropically, we propose three new viscous dark energy (VDE) models to characterize the accelerating universe. By constraining these three models with the latest cosmological observations, we find that they deviate only slightly from the standard cosmological model and can effectively alleviate the current H_0 tension between the local observation by the Hubble Space Telescope and the global measurement by the Planck Satellite. Interestingly, we conclude that a spatially flat universe in our VDE model with cosmic curvature is still supported by current data, and that the scale-invariant primordial power spectrum is strongly excluded, at least at the 5.5σ confidence level, in the three VDE models, in agreement with the Planck result. We also give the 95% upper limits of the typical bulk viscosity parameter η in the three VDE scenarios.
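For orientation, the usual Eckart-type parametrization behind bulk-viscous dark energy (the paper's three VDE models may refine this): the viscous stress shifts the effective pressure entering the continuity equation.

```latex
% Eckart-type bulk viscosity sketch; eta is the bulk viscosity parameter
% and H the Hubble rate.
\[
  p_{\mathrm{eff}} = p_{\mathrm{de}} - 3\,\eta\,H ,
  \qquad
  \dot{\rho}_{\mathrm{de}} + 3H\left(\rho_{\mathrm{de}} + p_{\mathrm{eff}}\right) = 0 .
\]
```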
Neiger, Brad L; Thackeray, Rosemary; Fagen, Michael C
2011-03-01
Priority setting is an important component of systematic planning in health promotion and also factors into the development of a comprehensive evaluation plan. The basic priority rating (BPR) model was introduced more than 50 years ago and includes criteria that should be considered in any priority setting approach (i.e., use of predetermined criteria, standardized comparisons, and a rubric that controls bias). Although the BPR model has provided basic direction in priority setting, it does not represent the broad array of data currently available to decision makers. Elements in the model also give more weight to the impact of communicable diseases compared with chronic diseases. For these reasons, several modifications are recommended to improve the BPR model and to better assist health promotion practitioners in the priority setting process. The authors also suggest a new name, BPR 2.0, to represent this revised model.
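For context, the classic Hanlon-style BPR combines problem size, seriousness (double-weighted) and intervention effectiveness into a 0-100 score; a hedged sketch follows, with the PEARL feasibility screen as a 0/1 multiplier (BPR 2.0 revises these components, so treat the weights as the traditional ones, not the revision's).

```python
# Hedged sketch of the classic basic priority rating:
# BPR = (A + 2B) * C / 3, with A = size (0-10), B = seriousness (0-10,
# double-weighted), C = effectiveness (0-10); max score is 100.
def basic_priority_rating(size, seriousness, effectiveness, pearl=1):
    """Return a 0-100 priority score for a health problem."""
    for value in (size, seriousness, effectiveness):
        if not 0 <= value <= 10:
            raise ValueError("component scores must lie in [0, 10]")
    return (size + 2 * seriousness) * effectiveness / 3.0 * pearl

print(basic_priority_rating(size=6, seriousness=8, effectiveness=7))  # ~51.3
```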
Search for nonstandard neutrino interactions with IceCube DeepCore
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Al Samarai, I.; Altmann, D.; Andeen, K.; Anderson, T.; Ansseau, I.; Anton, G.; Argüelles, C.; Auffenberg, J.; Axani, S.; Bagherpour, H.; Bai, X.; Barron, J. P.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; BenZvi, S.; Berley, D.; Bernardini, E.; Besson, D. Z.; Binder, G.; Bindig, D.; Blaufuss, E.; Blot, S.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Bourbeau, E.; Bourbeau, J.; Bradascio, F.; Braun, J.; Brayeur, L.; Brenzke, M.; Bretz, H.-P.; Bron, S.; Brostean-Kaiser, J.; Burgman, A.; Carver, T.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cross, R.; Day, M.; de André, J. P. A. M.; De Clercq, C.; DeLaunay, J. J.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dujmovic, H.; Dumm, J. P.; Dunkman, M.; Dvorak, E.; Eberhardt, B.; Ehrhardt, T.; Eichmann, B.; Eller, P.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Franckowiak, A.; Friedman, E.; Fuchs, T.; Gaisser, T. K.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Giang, W.; Glauch, T.; Glüsenkamp, T.; Goldschmidt, A.; Gonzalez, J. G.; Grant, D.; Griffith, Z.; Haack, C.; Hallgren, A.; Halzen, F.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Hokanson-Fasig, B.; Hoshina, K.; Huang, F.; Huber, M.; Hultqvist, K.; Hünnefeld, M.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Kalaczynski, P.; Kang, W.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. L.; Kheirandish, A.; Kim, J.; Kim, M.; Kintscher, T.; Kirby, C.; Kiryluk, J.; Kittler, T.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Köpke, L.; Kopper, C.; Kopper, S.; Koschinsky, J. P.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, M.; Krückl, G.; Kunnen, J.; Kunwar, S.; Kurahashi, N.; Kuwabara, T.; Kyriacou, A.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lauber, F.; Lennarz, D.; Lesiak-Bzdak, M.; Leuermann, M.; Liu, Q. R.; Lu, L.; Lünemann, J.; Luszczak, W.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mancina, S.; Maruyama, R.; Mase, K.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Micallef, J.; Momenté, G.; Montaruli, T.; Moore, R. W.; Moulai, M.; Nahnhauer, R.; Nakarmi, P.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Peiffer, P.; Pepper, J. A.; Pérez de los Heros, C.; Pieloth, D.; Pinat, E.; Plum, M.; Price, P. B.; Przybylski, G. T.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Rea, I. C.; Reimann, R.; Relethford, B.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Rysewyk, D.; Sälzer, T.; Sanchez Herrera, S. E.; Sandrock, A.; Sandroos, J.; Santander, M.; Sarkar, S.; Sarkar, S.; Satalecka, K.; Schlunder, P.; Schmidt, T.; Schneider, A.; Schoenen, S.; Schöneberg, S.; Schumacher, L.; Seckel, D.; Seunarine, S.; Soedingrekso, J.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stachurska, J.; Stamatikos, M.; Stanev, T.; Stasik, A.; Stettner, J.; Steuer, A.; Stezelberger, T.; Stokstad, R. 
G.; Stößl, A.; Strotjohann, N. L.; Stuttard, T.; Sullivan, G. W.; Sutherland, M.; Taboada, I.; Tatar, J.; Tenholt, F.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Tung, C. F.; Turcati, A.; Turley, C. F.; Ty, B.; Unger, E.; Usner, M.; Vandenbroucke, J.; Van Driessche, W.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Vehring, M.; Vogel, E.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandler, F. D.; Wandkowsky, N.; Waza, A.; Weaver, C.; Weiss, M. J.; Wendt, C.; Werthebach, J.; Westerhoff, S.; Whelan, B. J.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wolf, M.; Wood, J.; Wood, T. R.; Woolsey, E.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Yuan, T.; Zoll, M.; IceCube Collaboration
2018-04-01
As atmospheric neutrinos propagate through the Earth, vacuumlike oscillations are modified by Standard Model neutral- and charged-current interactions with electrons. Theories beyond the Standard Model introduce heavy, TeV-scale bosons that can produce nonstandard neutrino interactions. These additional interactions may modify the Standard Model matter effect, producing a measurable deviation from the prediction for atmospheric neutrino oscillations. The result described in this paper constrains nonstandard interaction parameters, building upon a previous analysis of atmospheric muon-neutrino disappearance with three years of IceCube DeepCore data. The best fit for the muon-to-tau flavor-changing term is ε_μτ = -0.0005, with a 90% C.L. allowed range of -0.0067 < ε_μτ < 0.0081. This result is more restrictive than recent limits from other experiments for ε_μτ. Furthermore, our result is complementary to a recent constraint on ε_μτ using another publicly available IceCube high-energy event selection. Together, they constitute the world's best limits on nonstandard interactions in the μ-τ sector.
Spatial features register: toward standardization of spatial features
Cascio, Janette
1994-01-01
As the need to share spatial data increases, more than agreement on a common format is needed to ensure that the data is meaningful to both the importer and the exporter. Effective data transfer also requires common definitions of spatial features. To achieve this, part 2 of the Spatial Data Transfer Standard (SDTS) provides a model for a spatial features data content specification and a glossary of features and attributes that fit this model. The model provides a foundation for standardizing spatial features. The glossary now contains only a limited subset of hydrographic and topographic features. For it to be useful, terms and definitions must be included for other categories, such as base cartographic, bathymetric, cadastral, cultural and demographic, geodetic, geologic, ground transportation, international boundaries, soils, vegetation, water, and wetlands, and the set of hydrographic and topographic features must be expanded. This paper will review the philosophy of the SDTS part 2 and the current plans for creating a national spatial features register as one mechanism for maintaining part 2.
Water impacts of CO2 emission performance standards for fossil fuel-fired power plants.
Talati, Shuchi; Zhai, Haibo; Morgan, M Granger
2014-10-21
We employ an integrated systems modeling tool to assess the water impacts of the new source performance standards recently proposed by the U.S. Environmental Protection Agency for limiting CO2 emissions from coal- and gas-fired power plants. The implementation of amine-based carbon capture and storage (CCS) for 40% CO2 capture to meet the current proposal will increase plant water use by roughly 30% in supercritical pulverized coal-fired power plants. The specific amount of added water use varies with power plant and CCS designs. More stringent emission standards than the current proposal would require CO2 emission reductions for natural gas combined-cycle (NGCC) plants via CCS, which would also increase plant water use. When examined over a range of possible future emission standards from 1100 to 300 lb CO2/MWh gross, new baseload NGCC plants consume roughly 60-70% less water than coal-fired plants. A series of adaptation approaches to secure low-carbon energy production and improve the electric power industry's water management in the face of future policy constraints are discussed both quantitatively and qualitatively.
Hadley, Wendy; Sato, Amy; Kuhl, Elizabeth; Rancourt, Diana; Oster, Danielle; Lloyd-Richardson, Elizabeth
2015-01-01
Objective: Adolescent weight control interventions demonstrate variable findings, with inconsistent data regarding the appropriate role for parents. The current study examined the efficacy of a standard adolescent behavioral weight control (BWC) intervention that also targeted parent–adolescent communication and parental modeling of healthy behaviors (Standard Behavioral Treatment + Enhanced Parenting; SBT + EP) compared with a standard BWC intervention (SBT). Methods: 49 obese adolescents (M age = 15.10; SD = 1.33; 76% female; 67.3% non-Hispanic White) and a caregiver were randomly assigned to SBT or SBT + EP. Adolescent and caregiver weight and height, parental modeling, and weight-related communication were obtained at baseline and at the end of the 16-week intervention. Results: Significant decreases in adolescent weight and increases in parental self-monitoring were observed across both conditions. Analyses of covariance revealed a trend for greater reduction in weight and negative maternal commentary among SBT condition participants. Conclusions: Contrary to hypotheses, targeting parent–adolescent communication and parental modeling did not lead to better outcomes in adolescent weight control. PMID:25294840
Morales, Juan F; Montoto, Sebastian Scioli; Fagiolino, Pietro; Ruiz, Maria E
2017-01-01
The Blood-Brain Barrier (BBB) is a physical and biochemical barrier that restricts the entry of certain drugs to the Central Nervous System (CNS), while allowing the passage of others. The ability to predict the permeability of a given molecule through the BBB is a key aspect in CNS drug discovery and development, since neurotherapeutic agents with molecular targets in the CNS should be able to cross the BBB, whereas peripherally acting agents should not, to minimize the risk of CNS adverse effects. In this review we examine and discuss QSAR approaches and current availability of experimental data for the construction of BBB permeability predictive models, focusing on the modeling of the biorelevant parameter unbound partitioning coefficient (Kp,uu). Emphasis is made on two possible strategies to overcome the current limitations of in silico models: considering the prediction of brain penetration as a multifactorial problem, and increasing experimental datasets through accurate and standardized experimental techniques.
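The parameter at the center of the review is defined as the ratio of unbound (free) steady-state concentrations across the barrier:

```latex
% Unbound brain-to-plasma partition coefficient targeted by the QSAR models.
\[
  K_{p,uu} \;=\; \frac{C_{u,\mathrm{brain}}}{C_{u,\mathrm{plasma}}}
  \;=\; K_{p}\,\frac{f_{u,\mathrm{brain}}}{f_{u,\mathrm{plasma}}},
\]
% with C_u the unbound concentrations and f_u the unbound fractions in
% each matrix.
```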
Electro-thermo-optical simulation of vertical-cavity surface-emitting lasers
NASA Astrophysics Data System (ADS)
Smagley, Vladimir Anatolievich
A three-dimensional electro-thermal simulator based on the double-layer approximation for the active region was coupled to optical-gain and optical-field numerical simulators to provide a self-consistent steady-state solution of VCSEL current-voltage and current-output power characteristics. A methodology for VCSEL modeling was established and applied to model a standard 850-nm VCSEL based on a GaAs active region and a novel intracavity-contacted 400-nm GaN-based VCSEL. Results of the GaAs VCSEL simulation were in good agreement with experiment. Correlations between current injection and radiative mode profiles were observed. Physical sub-models of transport, optical gain and cavity optical field were developed. Carrier transport through DBRs was studied. The problem of optical fields in the VCSEL cavity was treated numerically by the effective frequency method. All the sub-models were connected through a spatially inhomogeneous rate equation system. It was shown that conventional uncoupled analysis of each separate physical phenomenon is insufficient to describe VCSEL operation.
Physics-based coastal current tomographic tracking using a Kalman filter.
Wang, Tongchen; Zhang, Ying; Yang, T C; Chen, Huifang; Xu, Wen
2018-05-01
Ocean acoustic tomography uses measurements of two-way travel-time differences between nodes deployed on the perimeter of a survey area to invert for, and map, the ocean current inside the area. Data at different times can be related using a Kalman filter, and given an ocean circulation model, one can in principle nowcast and even forecast the current distribution given an initial distribution and/or the travel-time difference data on the boundary. However, an ocean circulation model requires many inputs (many of them often not available) and is impractical for estimation of the current field. A simplified form of the discretized Navier-Stokes equation is used to show that the future velocity state is just a weighted spatial average of the current state. These weights could be obtained from an ocean circulation model, but here, in a data-driven approach, auto-regressive methods are used to obtain the time- and space-dependent weights from the data. It is shown, based on simulated data, that the current field tracked using a Kalman filter (with an arbitrary initial condition) is more accurate than that estimated by the standard methods, where data at different times are treated independently. Real data are also examined.
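A hedged sketch of the tracking scheme described: the AR-fitted weights supply the state-transition matrix (the future field as a weighted spatial average of the present one), and boundary travel-time differences enter as linear measurements in a standard Kalman recursion. The matrices A, H, Q, R are assumed given here.

```python
# One predict/update cycle of a standard Kalman filter for the current-field
# state x, with A the AR-fitted spatial-averaging dynamics, H the linear map
# from current field to boundary travel-time differences, and Q, R the
# process and measurement noise covariances (all assumed known).
import numpy as np

def kalman_step(x, P, y, A, H, Q, R):
    # Predict with the AR spatial-average dynamics.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with the tomographic travel-time data on the boundary.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```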
Toward a consistent modeling framework to assess multi-sectoral climate impacts.
Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin
2018-02-13
Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Building an Evaluation Scale using Item Response Theory.
Lalor, John P; Wu, Hao; Yu, Hong
2016-11-01
Evaluation of NLP methods requires testing against a previously vetted gold-standard test set and reporting standard metrics (accuracy/precision/recall/F1). The current assumption is that all items in a given test set are equal with regard to difficulty and discriminating power. We propose Item Response Theory (IRT) from psychometrics as an alternative means for gold-standard test-set generation and NLP system evaluation. IRT is able to describe characteristics of individual items (their difficulty and discriminating power) and can account for these characteristics in its estimation of human intelligence or ability for an NLP task. In this paper, we demonstrate IRT by generating a gold-standard test set for Recognizing Textual Entailment. By collecting a large number of human responses and fitting our IRT model, we show that our IRT model compares NLP systems with performance in a human population and is able to provide more insight into system performance than standard evaluation metrics. We show that a high accuracy score does not always imply a high IRT score, which depends on the item characteristics and the response pattern.
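A hedged sketch of the item model this line of work builds on, here the two-parameter logistic (2PL): each item has a discrimination a and difficulty b, and a system's ability θ is estimated from its response pattern (grid-search maximum likelihood for brevity; the paper's actual fitting procedure may differ).

```python
# 2PL IRT sketch: P(correct | theta) = 1 / (1 + exp(-a * (theta - b))).
import math

def p_correct(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items, grid=None):
    """responses: list of 0/1 outcomes; items: list of (a, b) per item."""
    grid = grid or [g / 10.0 for g in range(-40, 41)]
    def loglik(theta):
        return sum(
            math.log(p_correct(theta, a, b)) if r
            else math.log(1.0 - p_correct(theta, a, b))
            for r, (a, b) in zip(responses, items)
        )
    return max(grid, key=loglik)  # ability with highest likelihood

print(estimate_theta([1, 1, 0, 1],
                     [(1.2, -0.5), (0.8, 0.0), (1.5, 1.0), (1.0, 0.3)]))
```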
Standardizing an End-to-end Accounting Service
NASA Technical Reports Server (NTRS)
Greenberg, Edward; Kazz, Greg
2006-01-01
Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard exists neither for spacecraft operations nor for tracing the relationships among mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data, specifically the mission data products, created by those activities. In order for space agencies to cross-support one another for data accountability and tracing, and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability and tracing needs to be developed. We first describe the end-to-end accounting service model and the functionality that supports the service. This model describes how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management is explored. Finally, we show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter.
Data dictionary and formatting standard for dissemination of geotechnical data
Benoit, J.; Bobbitt, J.I.; Ponti, D.J.; Shimel, S.A.; ,
2004-01-01
A pilot system for archiving and web dissemination of geotechnical data collected and stored by various agencies is currently under development. Part of the scope of this project, sponsored by the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) and by the Pacific Earthquake Engineering Research Center (PEER) Lifelines Program, is the development of a data dictionary and formatting standard. This paper presents the data model along with the basic structure of the data dictionary tables for this pilot system.
Majumder, Rupamanjari; Jangsangthong, Wanchana; Feola, Iolanda; Ypey, Dirk L.; Pijnappels, Daniël A.; Panfilov, Alexander V.
2016-01-01
Atrial fibrillation (AF) is the most frequent form of arrhythmia occurring in the industrialized world. Because of its complex nature, each identified form of AF requires specialized treatment. Thus, an in-depth understanding of the bases of these arrhythmias is essential for therapeutic development. A variety of experimental studies aimed at understanding the mechanisms of AF are performed using primary cultures of neonatal rat atrial cardiomyocytes (NRAMs). Previously, we have shown that the distinct advantage of NRAM cultures is that they allow standardized, systematic, robust re-entry induction in the presence of a constitutively-active acetylcholine-mediated K+ current (IKACh-c). Experimental studies dedicated to mechanistic explorations of AF, using these cultures, often use computer models for detailed electrophysiological investigations. However, currently, no mathematical model for NRAMs is available. Therefore, in the present study we propose the first model for the action potential (AP) of a NRAM with constitutively-active acetylcholine-mediated K+ current (IKACh-c). The descriptions of the ionic currents were based on patch-clamp data obtained from neonatal rats. Our monolayer model closely mimics the action potential duration (APD) restitution and conduction velocity (CV) restitution curves presented in our previous in vitro studies. In addition, the model reproduces the experimentally observed dynamics of spiral wave rotation, in the absence and in the presence of drug interventions, and in the presence of localized myofibroblast heterogeneities. PMID:27332890
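A hedged Hodgkin-Huxley-style sketch of how such an action-potential model is advanced in time: the membrane equation sums the fitted ionic currents, including the constitutively active IKACh-c. The current functions below are toy placeholders, not the paper's fitted formulations.

```python
# Membrane update dV/dt = -(I_ion + I_stim) / C_m, explicit Euler.
# Units assumed: V in mV, dt in ms, currents in pA/pF, C_m normalized to 1.
def step_membrane(V, dt, currents, C_m=1.0, I_stim=0.0):
    """Advance membrane potential V by one explicit-Euler step."""
    I_ion = sum(I(V) for I in currents)
    return V - dt * (I_ion + I_stim) / C_m

# Toy placeholder currents (linear conductances for illustration only).
I_K1 = lambda V: 0.10 * (V + 85.0)    # inward rectifier toward E_K ~ -85 mV
I_KACh = lambda V: 0.05 * (V + 85.0)  # stand-in for constitutively active IKACh-c

V = -80.0
for _ in range(1000):  # 10 ms with an inward (depolarizing) stimulus
    V = step_membrane(V, dt=0.01, currents=[I_K1, I_KACh], I_stim=-2.0)
print(V)
```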
An information theory approach to the density of the earth
NASA Technical Reports Server (NTRS)
Graber, M. A.
1977-01-01
Information theory provides a technique that takes experimentally determined numbers and produces a uniquely specified best density model satisfying those numbers. A model was generated using five numerical parameters: the mass of the Earth, its moment of inertia, and three zero-node torsional normal modes (L = 2, 8, 26). In order to determine the stability of the solution, six additional density models were generated, in each of which the period of one of the three normal modes was increased or decreased by one standard deviation. The superposition of the seven models is shown; it indicates that current knowledge of the torsional modes is sufficient to specify the density in the upper mantle, but the lower mantle and core will require smaller standard deviations before they can be accurately specified.
Electron magnetic reconnection without ion coupling in Earth's turbulent magnetosheath
NASA Astrophysics Data System (ADS)
Phan, T. D.; Eastwood, J. P.; Shay, M. A.; Drake, J. F.; Sonnerup, B. U. Ö.; Fujimoto, M.; Cassak, P. A.; Øieroset, M.; Burch, J. L.; Torbert, R. B.; Rager, A. C.; Dorelli, J. C.; Gershman, D. J.; Pollock, C.; Pyakurel, P. S.; Haggerty, C. C.; Khotyaintsev, Y.; Lavraud, B.; Saito, Y.; Oka, M.; Ergun, R. E.; Retino, A.; Le Contel, O.; Argall, M. R.; Giles, B. L.; Moore, T. E.; Wilder, F. D.; Strangeway, R. J.; Russell, C. T.; Lindqvist, P. A.; Magnes, W.
2018-05-01
Magnetic reconnection in current sheets is a magnetic-to-particle energy conversion process that is fundamental to many space and laboratory plasma systems. In the standard model of reconnection, this process occurs in a minuscule electron-scale diffusion region1,2. On larger scales, ions couple to the newly reconnected magnetic-field lines and are ejected away from the diffusion region in the form of bi-directional ion jets at the ion Alfvén speed3-5. Much of the energy conversion occurs in spatially extended ion exhausts downstream of the diffusion region6. In turbulent plasmas, which contain a large number of small-scale current sheets, reconnection has long been suggested to have a major role in the dissipation of turbulent energy at kinetic scales7-11. However, evidence for reconnection plasma jetting in small-scale turbulent plasmas has so far been lacking. Here we report observations made in Earth's turbulent magnetosheath region (downstream of the bow shock) of an electron-scale current sheet in which diverging bi-directional super-ion-Alfvénic electron jets, parallel electric fields and enhanced magnetic-to-particle energy conversion were detected. Contrary to the standard model of reconnection, the thin reconnecting current sheet was not embedded in a wider ion-scale current layer and no ion jets were detected. Observations of this and other similar, but unidirectional, electron jet events without signatures of ion reconnection reveal a form of reconnection that can drive turbulent energy transfer and dissipation in electron-scale current sheets without ion coupling.
Model Training Guide. Firefighter I.
ERIC Educational Resources Information Center
Hagevig, William A.; Gallagher, Leigh S.
This firefighter training guide for a 180-hour course was developed to assist training officers in planning training with emphasis on conformance to recommended National Fire Protection Association (NFPA 1001) standards. The material in the guide is referenced to current editions of the International Fire Service Training Association manuals and…
Amphibian model species Xenopus tropicalis is currently being utilized by EPA in the development of a standardized in vivo reproductive toxicity assay. Perturbations to the hypothalamic-pituitary-gonadal axis from exposure to endocrine disrupting compounds during larval develop...
The Talent Search Model: Past, Present, and Future
ERIC Educational Resources Information Center
Swiatek, Mary Ann
2007-01-01
Typical standardized achievement tests cannot provide accurate information about gifted students' abilities because they are not challenging enough for such students. Talent searches solve this problem through above-level testing--using tests designed for older students to raise the ceiling for younger, gifted students. Currently, talent search…
Global ozone and air quality: a multi-model assessment of risks to human health and crops
NASA Astrophysics Data System (ADS)
Ellingsen, K.; Gauss, M.; van Dingenen, R.; Dentener, F. J.; Emberson, L.; Fiore, A. M.; Schultz, M. G.; Stevenson, D. S.; Ashmore, M. R.; Atherton, C. S.; Bergmann, D. J.; Bey, I.; Butler, T.; Drevet, J.; Eskes, H.; Hauglustaine, D. A.; Isaksen, I. S. A.; Horowitz, L. W.; Krol, M.; Lamarque, J. F.; Lawrence, M. G.; van Noije, T.; Pyle, J.; Rast, S.; Rodriguez, J.; Savage, N.; Strahan, S.; Sudo, K.; Szopa, S.; Wild, O.
2008-02-01
Within ACCENT, a European Network of Excellence, eighteen atmospheric models from the U.S., Europe, and Japan calculated present (2000) and future (2030) concentrations of ozone at the Earth's surface with hourly temporal resolution. Comparison of model results with surface ozone measurements in 14 world regions indicates that levels and seasonality of surface ozone in North America and Europe are characterized well by global models, with annual average biases typically within 5-10 nmol/mol. However, comparison with rather sparse observations over some regions suggest that most models overestimate annual ozone by 15-20 nmol/mol in some locations. Two scenarios from the International Institute for Applied Systems Analysis (IIASA) and one from the Intergovernmental Panel on Climate Change Special Report on Emissions Scenarios (IPCC SRES) have been implemented in the models. This study focuses on changes in near-surface ozone and their effects on human health and vegetation. Different indices and air quality standards are used to characterise air quality. We show that often the calculated changes in the different indices are closely inter-related. Indices using lower thresholds are more consistent between the models, and are recommended for global model analysis. Our analysis indicates that currently about two-thirds of the regions considered do not meet health air quality standards, whereas only 2-4 regions remain below the threshold. Calculated air quality exceedances show moderate deterioration by 2030 if current emissions legislation is followed and slight improvements if current emissions reduction technology is used optimally. For the "business as usual" scenario severe air quality problems are predicted. We show that model simulations of air quality indices are particularly sensitive to how well ozone is represented, and improved accuracy is needed for future projections. Additional measurements are needed to allow a more quantitative assessment of the risks to human health and vegetation from changing levels of surface ozone.
Phenomenological Model of Current Sheet Canting in Pulsed Electromagnetic Accelerators
NASA Technical Reports Server (NTRS)
Markusic, Thomas; Choueiri, E. Y.
2003-01-01
The phenomenon of current sheet canting in pulsed electromagnetic accelerators is the departure of the plasma sheet (that carries the current) from a plane that is perpendicular to the electrodes to one that is skewed, or tipped. Review of the pulsed electromagnetic accelerator literature reveals that current sheet canting is a ubiquitous phenomenon, occurring in all of the standard accelerator geometries. Developing an understanding of current sheet canting is important because it can detract from the propellant-sweeping capabilities of current sheets and, hence, negatively impact the overall efficiency of pulsed electromagnetic accelerators. In the present study, it is postulated that depletion of plasma near the anode, resulting from diamagnetic drift induced by the axial density gradient, occurs during the early stages of the discharge, creating a density gradient normal to the anode with a characteristic length on the order of the ion skin depth. Rapid penetration of the magnetic field through this region ensues, due to the Hall effect, leading to a canted current front ahead of the initial current conduction channel. In this model, once the current sheet reaches appreciable speeds, entrainment of stationary propellant replenishes plasma in the anode region, inhibiting further Hall-convective transport of the magnetic field; however, the previously established tilted current sheet remains at a fairly constant canting angle for the remainder of the discharge cycle, exerting a transverse J x B force which drives plasma toward the cathode and accumulates it there. This proposed sequence of events has been incorporated into a phenomenological model. The model predicts that canting can be reduced by using low atomic mass propellants with high propellant loading number density; the model results are shown to give qualitative agreement with experimentally measured trends in the mass dependence of the canting angle.
Search for Decays of the Λ$$0\\atop{b}$$ Baryon with the D0 Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camacho, Enrique
2011-11-25
This thesis presents work I performed within the D0 Collaboration to measure the branching ratio of the Λ$$0\\atop{b}$$ baryon in the channel Λ$$0\\atop{b}$$ → J/ΨΛ0. b-hadrons such as the Λ$$0\\atop{b}$$ are currently the subject of much research in both the theoretical and experimental particle physics communities. Measurements of the production and decays of b-hadrons can improve the understanding of the electroweak and strong interactions described by the Standard Model of particle physics, as well as providing opportunities to search for physics beyond the Standard Model.
Battat, James B R; Chandler, John F; Stubbs, Christopher W
2007-12-14
We present constraints on violations of Lorentz invariance based on archival lunar laser-ranging (LLR) data. LLR measures the Earth-Moon separation by timing the round-trip travel of light between the two bodies and is currently accurate to the equivalent of a few centimeters (parts in 10^11 of the total distance). By analyzing these LLR data under the standard-model extension (SME) framework, we derived six observational constraints on dimensionless SME parameters that describe potential Lorentz violation. We found no evidence for Lorentz violation at the 10^-6 to 10^-11 level in these parameters. This work constitutes the first LLR constraints on SME parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caprini, Chiara; Tamanini, Nicola, E-mail: chiara.caprini@cea.fr, E-mail: nicola.tamanini@cea.fr
We perform a forecast analysis of the capability of the eLISA space-based interferometer to constrain models of early and interacting dark energy using gravitational wave standard sirens. We employ simulated catalogues of standard sirens given by merging massive black hole binaries visible by eLISA, with an electromagnetic counterpart detectable by future telescopes. We consider three-arm mission designs with arm lengths of 1, 2 and 5 million km, 5 years of mission duration and the best-level low frequency noise as recently tested by the LISA Pathfinder. Standard sirens with eLISA give access to an intermediate range of redshift 1 ≲ z ≲ 8, and can therefore provide competitive constraints on models where the onset of the deviation from ΛCDM (i.e. the epoch when early dark energy starts to be non-negligible, or when the interaction with dark matter begins) occurs relatively late, at z ≲ 6. If instead early or interacting dark energy is relevant already in the pre-recombination era, current cosmological probes (especially the cosmic microwave background) are more efficient than eLISA in constraining these models, except possibly in the interacting dark energy model if the energy exchange is proportional to the energy density of dark energy.
Considering Planetary Constraints and Dynamic Screening in Solar Evolution Modeling
NASA Astrophysics Data System (ADS)
Wood, Suzannah R.; Mussack, Katie; Guzik, Joyce A.
2018-01-01
The ‘faint early sun problem’ remains unsolved. It consists of the apparent contradiction between the standard solar model prediction of lower luminosity (70% of current luminosity) and the observations of liquid water on early Earth and Mars. The presence of liquid water on early Earth and Mars should not be neglected and should be used as a constraint for solar evolution modeling. In addition, modifications to standard solar models are needed to address the discrepancy with the solar structure inferred from helioseismology given the latest solar abundance determinations. Here we utilize three different solar abundance determinations: GN93 (Grevesse & Noels, 1993), AGS05 (Asplund et al., 2005), and AGSS09 (Asplund et al., 2009). We propose an early mass-loss model with an initial solar mass between 1.07 and 1.15 solar masses and an exponentially decreasing mass-loss rate to meet conditions in the early solar system (Wood et al., submitted). Additionally, we investigate the effects of dynamic screening and the new OPLIB opacities from Los Alamos (Colgan et al., 2016). We show the effects of these modifications to the standard solar evolution models on the interior structure, neutrino fluxes, sound speed, p-mode frequencies, convection zone depth, and envelope helium and element abundances of the model sun at the present day.
ERIC Educational Resources Information Center
Conley-Ware, Lakita D.
2010-01-01
This research addresses a real world cyberspace problem, where currently no cross industry standard methodology exists. The goal is to develop a model for identification and detection of vulnerabilities and threats of cyber-crime or cyber-terrorism where cyber-technology is the vehicle to commit the criminal or terrorist act (CVCT). This goal was…
Proposed reporting model update creates dialogue between FASB and not-for-profits.
Mosrie, Norman C
2016-04-01
Seeing a need to refresh the current guidelines, the Financial Accounting Standards Board (FASB) proposed an update to the financial accounting and reporting model for not-for-profit entities. In a response to solicited feedback, the board is now revisiting its proposed update and has set forth a plan to finalize its new guidelines. The FASB continues to solicit and respond to feedback as the process progresses.
NASA Astrophysics Data System (ADS)
Stier, P.; Schutgens, N. A. J.; Bellouin, N.; Bian, H.; Boucher, O.; Chin, M.; Ghan, S.; Huneeus, N.; Kinne, S.; Lin, G.; Ma, X.; Myhre, G.; Penner, J. E.; Randles, C. A.; Samset, B.; Schulz, M.; Takemura, T.; Yu, F.; Yu, H.; Zhou, C.
2013-03-01
Simulated multi-model "diversity" in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in the model components relevant for forcing calculations, and the associated "host-model uncertainties" are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties in aerosol forcing experiments through prescription of identical aerosol radiative properties in twelve participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is -4.47 Wm-2 and the inter-model standard deviation is 0.55 Wm-2, corresponding to a relative standard deviation of 12%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.04 Wm-2, and the standard deviation increases to 1.01 Wm-2, corresponding to a significant relative standard deviation of 97%. However, the top-of-atmosphere forcing variability owing to absorption (subtracting the scattering case from the case with scattering and absorption) is low, with absolute (relative) standard deviations of 0.45 Wm-2 (8%) clear-sky and 0.62 Wm-2 (11%) all-sky. Scaling the forcing standard deviation for a purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host model uncertainties could explain about 36% of the overall sulfate forcing diversity of 0.11 Wm-2 in the AeroCom Direct Radiative Effect experiment. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks, or in areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that requires further attention.
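The "diversity" statistic quoted above is simply the inter-model standard deviation of the global-mean forcings, often reported relative to the multi-model mean. A minimal illustration (the twelve per-model values below are invented stand-ins, not the study's data):

```python
import numpy as np

# hypothetical per-model global-mean TOA forcings (W m-2) for twelve models
forcings = np.array([-4.1, -4.9, -4.3, -5.2, -4.5, -3.9,
                     -4.6, -4.4, -5.0, -4.2, -4.7, -3.8])

mean, std = forcings.mean(), forcings.std(ddof=1)   # inter-model spread
print(f"mean = {mean:.2f} W m-2, std = {std:.2f} W m-2, "
      f"relative std = {100 * std / abs(mean):.0f}%")
```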
Integrating Dynamic Data and Sensors with Semantic 3D City Models in the Context of Smart Cities
NASA Astrophysics Data System (ADS)
Chaturvedi, K.; Kolbe, T. H.
2016-10-01
Smart cities provide effective integration of human, physical and digital systems operating in the built environment. Advancements in city and landscape models, sensor web technologies, and simulation methods play a significant role in city analyses and in improving the quality of life of citizens and the governance of cities. Semantic 3D city models can provide substantial benefits and can become a central information backbone for smart city infrastructures. However, current-generation semantic 3D city models are static in nature and do not support dynamic properties and sensor observations. In this paper, we propose a new concept called Dynamizer that allows highly dynamic data to be represented and provides a method for injecting dynamic variations of city object properties into the static representation. The approach also provides the direct capability to model complex patterns based on statistics and general rules, as well as real-time sensor observations. The concept is implemented as an Application Domain Extension for the CityGML standard. However, it could also be applied to other GML-based application schemas, including the European INSPIRE data themes and national standards for topography and cadasters like the British Ordnance Survey Mastermap or the German cadaster standard ALKIS.
Cosmology in Mirror Twin Higgs and neutrino masses
NASA Astrophysics Data System (ADS)
Chacko, Zackaria; Craig, Nathaniel; Fox, Patrick J.; Harnik, Roni
2017-07-01
We explore a simple solution to the cosmological challenges of the original Mirror Twin Higgs (MTH) model that leads to interesting implications for experiment. We consider theories in which both the standard model and mirror neutrinos acquire masses through the familiar seesaw mechanism, but with a low right-handed neutrino mass scale of order a few GeV. In these νMTH models, the right-handed neutrinos leave the thermal bath while still relativistic. As the universe expands, these particles eventually become nonrelativistic, and come to dominate the energy density of the universe before decaying. Decays to standard model states are preferred, with the result that the visible sector is left at a higher temperature than the twin sector. Consequently the contribution of the twin sector to the radiation density in the early universe is suppressed, allowing the current bounds on this scenario to be satisfied. However, the energy density in twin radiation remains large enough to be discovered in future cosmic microwave background experiments. In addition, the twin neutrinos are significantly heavier than their standard model counterparts, resulting in a sizable contribution to the overall mass density in neutrinos that can be detected in upcoming experiments designed to probe the large scale structure of the universe.
Extension of the general thermal field equation for nanosized emitters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyritsakis, A., E-mail: akyritsos1@gmail.com; Xanthakis, J. P.
2016-01-28
During the previous decade, Jensen et al. developed a general analytical model that successfully describes electron emission from metals in both the field and thermionic regimes, as well as in the transition region. In that development, the standard image-corrected triangular potential barrier was used. This barrier model is valid only for planar surfaces and therefore cannot be used in general for modern nanometric emitters. In a recent publication, the authors showed that the standard Fowler-Nordheim theory can be generalized for highly curved emitters if a quadratic term is included in the potential model. In this paper, we extend this generalization to high temperatures and include both the thermal and intermediate regimes. This is achieved by applying the general method developed by Jensen to the quadratic barrier model of our previous publication. We obtain results that are in good agreement with fully numerical calculations for radii R > 4 nm, while our calculated current density differs by a factor of up to 27 from the one predicted by Jensen's standard General-Thermal-Field (GTF) equation. Our extended GTF equation has applications to modern sharp electron sources, beam simulation models, and vacuum breakdown theory.
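For orientation, the planar theory that this work generalizes reduces, in the cold-field limit, to the Murphy-Good form of the Fowler-Nordheim equation. A minimal sketch (the constants and the Forbes-style approximations for the barrier correction factors are standard textbook choices, not values from the paper):

```python
import math

A_FN = 1.541434e-6   # first Fowler-Nordheim constant (A eV V^-2)
B_FN = 6.830890e9    # second Fowler-Nordheim constant (eV^-3/2 V m^-1)

def fn_current_density(F, phi=4.5):
    """Planar Murphy-Good FN current density (A/m^2) for a local surface
    field F (V/m) and work function phi (eV), with simple approximations
    for the image-correction factors v(f) and t(f)."""
    f = 1.439964e-9 * F / phi ** 2                 # scaled barrier field
    v = 1.0 - f + (f / 6.0) * math.log(f)          # barrier-lowering correction
    t = 1.0 + f / 9.0 - (f / 18.0) * math.log(f)
    return (A_FN / (t ** 2 * phi)) * F ** 2 * math.exp(-v * B_FN * phi ** 1.5 / F)

print(f"J = {fn_current_density(5e9):.2e} A/m^2")  # typical field-emission regime
```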
Bortot, Maria; Laughter, Melissa Ronni; Stein, Madia; Rocker, Adam; Patel, Vikas; Park, Daewon
2018-05-16
A quaternized reverse thermal gel (RTG) aimed at replacing current surgical incision drapes (SIDs) was designed and characterized. The antimicrobial efficacy of the quaternized RTG was analyzed using both in vitro and in vivo models and was compared to standard SIDs. Polymer characterization was completed using both nuclear magnetic resonance (1H NMR) and lower critical solution temperature (LCST) analysis. Biocompatibility was assessed using a standard cell viability assay. The in vitro antimicrobial efficacy of the polymer was analyzed against four common bacteria species using a time-kill test. The in vivo antimicrobial efficacy of the polymer and standard SIDs were compared using a murine model aimed at mimicking surgical conditions. NMR confirmed the polymer structure and the presence of quaternized groups and alkyl chains. The polymer displayed a LCST of 34 °C and a rapid rate of gelation, allowing stable gel formation when applied to skin. Once quaternized, the polymer displayed an increased kill-rate of bacteria compared to the unquaternized polymer. In experiments aimed at mimicking surgical conditions, the quaternized polymer showed statistically comparable bacteria-killing capacity to the standard SID and even surpassed the SID in killing capacity at various time points. A novel approach to replacing current SIDs was developed using an antimicrobial polymer system with RTG properties. The RTG properties of this polymer maintain a liquid state at low temperatures and a gel upon heating, allowing this polymer to form a tight coating when applied to skin. Furthermore, this polymer achieved excellent antimicrobial properties in both in vitro and in vivo models. With further optimization, this polymer system has the potential to replace and streamline presurgical patient preparations through its easy application and beneficial antimicrobial properties.
Monotone Approximation for a Nonlinear Size and Class Age Structured Epidemic Model
2006-02-22
2017-06-01
maintenance times from the fleet are randomly resampled when running the model to enhance model realism. The use of a simulation model to represent the... helicopter regiment. 2. Attack Helicopter UH TIGER: The EC665, or Airbus Helicopter TIGER (Figure 3), is a four-bladed, twin-engine multi-role attack... migrated into the automated management system SAP Standard Product Family (SASPF), and the usage clock starts to run with the amount of the current
NASA Astrophysics Data System (ADS)
Yu, Hesheng; Thé, Jesse
2016-11-01
The prediction of the dispersion of air pollutants in urban areas is of great importance to public health, homeland security, and environmental protection. Computational Fluid Dynamics (CFD) has emerged as an effective tool for pollutant dispersion modelling. This paper reports and quantitatively validates the shear stress transport (SST) k-ω turbulence closure model and its transitional variant for pollutant dispersion in complex urban environments for the first time. Sensitivity analysis is performed to establish recommendations for the proper use of turbulence models in urban settings. The current SST k-ω simulation is validated rigorously against extensive experimental data using the hit rate for the velocity components, and the "factor of two" of observations (FAC2) and fractional bias (FB) for the concentration field. The simulation results show that the current SST k-ω model can predict the flow field well, with an overall hit rate of 0.870, and concentration dispersion with FAC2 = 0.721 and FB = 0.045. The flow simulation of the current SST k-ω model is slightly inferior to that of a detached eddy simulation (DES), but better than that of the standard k-ε model. However, the current study performs best among these three model approaches when validated against measurements of pollutant dispersion in the atmosphere. This work aims to provide recommendations for the proper use of CFD to predict pollutant dispersion in urban environments.
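The two concentration metrics named above have standard definitions in air-quality model evaluation: FAC2 is the fraction of predictions within a factor of two of the paired observations, and FB compares the means. A minimal sketch of both (the sample arrays are hypothetical):

```python
import numpy as np

def fac2(pred, obs):
    """Fraction of predictions within a factor of two of the observations."""
    ratio = np.asarray(pred, float) / np.asarray(obs, float)
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

def fractional_bias(pred, obs):
    """FB = 2 * (mean(obs) - mean(pred)) / (mean(obs) + mean(pred))."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

obs = np.array([1.0, 2.0, 4.0, 8.0])     # hypothetical paired samples
pred = np.array([1.1, 1.7, 5.0, 20.0])
print(f"FAC2 = {fac2(pred, obs):.2f}, FB = {fractional_bias(pred, obs):.2f}")
```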
Spatially-Resolved Beam Current and Charge-State Distributions for the NEXT Ion Engine
NASA Technical Reports Server (NTRS)
Pollard, James E.; Diamant, Kevin D.; Crofton, Mark W.; Patterson, Michael J.; Soulas, George C.
2010-01-01
Plume characterization tests with the 36-cm NEXT ion engine are being performed at The Aerospace Corporation using engineering-model and prototype-model thrusters. We have examined the beam current density and xenon charge-state distribution as functions of position on the accel grid. To measure the current density ratio j++/j+, a collimated E × B probe was rotated through the plume with the probe oriented normal to the accel electrode surface at a distance of 82 cm. The beam current density jb versus radial position was measured with a miniature planar probe at 3 cm from the accel. Combining the j++/j+ and jb data yielded the ratio of total Xe+2 current to total Xe+1 current (J++/J+) at forty operating points in the standard throttle table. The production of Xe+2 and Xe+3 was measured as a function of propellant utilization to support performance and lifetime predictions for an extended throttle table. The angular dependence of jb was measured at intermediate and far-field distances to assist with plume modeling and to evaluate the thrust loss due to beam divergence. Thrust correction factors were derived from the total doubles-to-singles current ratio and from the far-field divergence data.
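With the doubles-to-singles ratio in hand, the usual multiply-charged-ion thrust correction follows directly; the textbook form below (extended to triples) is assumed here and may differ in detail from the authors' factors:

```python
import math

def thrust_correction(R_double, R_triple=0.0):
    """Thrust correction factor alpha = (1 + R2/sqrt(2) + R3/sqrt(3)) /
    (1 + R2 + R3), where R2 = J++/J+ and R3 = J+++/J+ are current ratios."""
    num = 1.0 + R_double / math.sqrt(2.0) + R_triple / math.sqrt(3.0)
    den = 1.0 + R_double + R_triple
    return num / den

print(f"alpha = {thrust_correction(0.05):.4f}")  # ~1.4% thrust reduction at J++/J+ = 0.05
```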
Martínez-Flores, Francisco; Sandoval-Zamora, Hugo; Machuca-Rodriguez, Catalina; Barrera-López, Araceli; García-Cavazos, Ricardo; Madinaveitia-Villanueva, Juan Antonio
2016-01-01
Tissue storage is a medical process whose regulation and harmonisation are still in progress in the scientific world. International standards require that the safety and efficacy of human allografts, such as skin and other tissues, be ensured. The activities of skin and tissue banks currently involve recovery, processing, storage and distribution, which are positively correlated with the technological and scientific advances of current biomedical science. A description is presented of the operational model of the Skin and Tissue Bank at INR as a successful case of procurement, recovery and preservation of skin and tissues for therapeutic uses, with high safety and biological quality. The essential and standard guidelines are presented as keystones for a tissue recovery program based on scientific evidence, within an ethical and legal framework, and a model is proposed for a complete overview of tissue and organ donation programs in Mexico. Finally, essential proposals are made for improving the efficacy of organ and tissue transplantation programs. Copyright © 2015 Academia Mexicana de Cirugía A.C. Published by Masson Doyma México S.A. All rights reserved.
Geant4 Simulations for the Radon Electric Dipole Moment Search at TRIUMF
NASA Astrophysics Data System (ADS)
Rand, Evan; Bangay, Jack; Bianco, Laura; Dunlop, Ryan; Finlay, Paul; Garrett, Paul; Leach, Kyle; Phillips, Andrew; Svensson, Carl; Sumithrarachchi, Chandana; Wong, James
2010-11-01
The existence of a permanent electric dipole moment (EDM) requires the violation of time-reversal symmetry (T) or, equivalently, the violation of charge conjugation C and parity P (CP). Although no particle EDM has yet been found, current theories beyond the Standard Model, e.g. multiple-Higgs theories, left-right symmetry, and supersymmetry, predict EDMs within current experimental reach. In fact, present limits on the EDMs of the neutron, electron and ^199Hg atom have significantly reduced the parameter spaces of these models. The measurement of a non-zero EDM would be a direct measurement of the violation of time-reversal symmetry, and would represent a clear signal of new physics beyond the Standard Model. Recent theoretical calculations predict large enhancements in the atomic EDMs for atoms with octupole-deformed nuclei, making odd-A Rn isotopes prime candidates for the EDM search. The Geant4 simulations presented here are essential for the development towards an EDM measurement. They provide an accurate description of γ-ray scattering and backgrounds in the experimental apparatus, and are being used to study the overall sensitivity of the RnEDM experiment at TRIUMF in Vancouver, B.C.
Chou, Wei-Lung; Wang, Chih-Ta; Chang, Wen-Chun; Chang, Shih-Yu
2010-08-15
In this study, metal hydroxides generated during electrocoagulation (EC) were used to remove the chemical oxygen demand (COD) of oxide chemical mechanical polishing (oxide-CMP) wastewater from a semiconductor manufacturing plant. Adsorption studies were conducted in a batch system for various current densities and temperatures. The COD concentration in the oxide-CMP wastewater was effectively removed, decreasing by more than 90% and resulting in a final wastewater COD concentration below the Taiwan discharge standard (100 mg L^-1). Since the processed wastewater quality exceeded the direct discharge standard, the effluent could be considered for reuse. The adsorption kinetic studies showed that the EC process was best described by the pseudo-second-order kinetic model at the various current densities and temperatures. The experimental data were also tested against different adsorption isotherm models to describe the EC process. The Freundlich adsorption isotherm model predictions matched the experimental observations satisfactorily. Thermodynamic parameters, including the Gibbs free energy, enthalpy, and entropy, indicated that the COD adsorption of oxide-CMP wastewater on metal hydroxides was feasible, spontaneous and endothermic in the temperature range of 288-318 K. Copyright 2010 Elsevier B.V. All rights reserved.
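The pseudo-second-order model mentioned above is usually fitted in its linearized form t/q_t = 1/(k2·qe²) + t/qe, so qe and k2 fall out of a straight-line fit of t/q_t against t. A minimal sketch with hypothetical batch data (not values from the paper):

```python
import numpy as np

t = np.array([5.0, 10, 20, 30, 60, 90])     # contact time (min), hypothetical
qt = np.array([22.0, 35, 50, 58, 70, 74])   # COD adsorbed (mg/g), hypothetical

# linearized pseudo-second-order form: t/qt = 1/(k2*qe**2) + t/qe
slope, intercept = np.polyfit(t, t / qt, 1)
qe = 1.0 / slope                             # equilibrium adsorption capacity (mg/g)
k2 = 1.0 / (intercept * qe ** 2)             # rate constant (g mg^-1 min^-1)
print(f"qe = {qe:.1f} mg/g, k2 = {k2:.2e} g/(mg min)")
```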
NASA Satellite Data for Seagrass Health Modeling and Monitoring
NASA Technical Reports Server (NTRS)
Spiering, Bruce A.; Underwood, Lauren; Ross, Kenton
2011-01-01
Time-series-derived information for coastal waters will be used to provide input data for the Fong and Harwell model. The current MODIS land mask limits where the model can be applied; this project will: a) apply MODIS data with resolution higher than the standard products (250-m vs. 1-km); b) seek to refine the land mask; c) explore nearby areas to use as proxies for time series directly over the beds. Novel processing approaches will be leveraged from other NASA projects and customized as inputs for seagrass productivity modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.
This paper presents the current status of the simplified wind turbine models used for power system stability analysis, based on ongoing work within IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, is focused on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models for conducting power system simulations.
Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H
2004-06-01
Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
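As a concrete taste of the standard, the python-libsbml bindings can assemble and serialize a minimal SBML model in a few calls. A sketch (identifiers and values are arbitrary; validation and error handling are omitted):

```python
import libsbml  # pip install python-libsbml

doc = libsbml.SBMLDocument(3, 2)            # SBML Level 3 Version 2
model = doc.createModel()
model.setId("minimal_example")

c = model.createCompartment()               # one well-mixed compartment
c.setId("cell"); c.setSize(1.0); c.setConstant(True)

s = model.createSpecies()                   # one species inside it
s.setId("A"); s.setCompartment("cell")
s.setInitialConcentration(1.0)
s.setConstant(False); s.setBoundaryCondition(False)
s.setHasOnlySubstanceUnits(False)

print(libsbml.writeSBMLToString(doc))       # serialized SBML/XML document
```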
Fermionic dark matter and neutrino masses in a B - L model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sánchez-Vega, B. L.; Schmitz, E. R.
2015-09-01
In this work we present a common framework for neutrino masses and dark matter. Specifically, we work with a local B - L extension of the standard model which has three right-handed neutrinos, n_Ri, and some extra scalars, Φ, φ_i, besides the standard model fields. The n_Ri's have nonstandard B - L quantum numbers and thus couple to different scalars. This model has the attractive property that an almost automatic Z_2 symmetry acting only on a fermionic field, n_R3, is present. Taking advantage of this Z_2 symmetry, we study both the neutrino mass generation via a natural seesaw mechanism at low energy and the possibility of n_R3 being a dark matter candidate. For this last purpose, we study its relic abundance and its compatibility with the current direct detection experiments.
Canuto, Enrico; Acuña-Bravo, Wilber; Agostani, Marco; Bonadei, Marco
2014-07-01
Solenoid current regulation is well known and standard in any proportional electro-hydraulic valve. The goal is to provide a wide-band transfer function from the reference to the measured current, thus making the solenoid a fast and ideal force actuator within the limits of the power supplier. The power supplier is usually a Pulse Width Modulation (PWM) amplifier, which fixes the voltage bound and the Nyquist frequency of the regulator. Typical analog regulators include three main terms: a feedforward channel, a proportional feedback channel and electromotive force compensation. The latter compensation may be accomplished by integrative feedback. Here the problem is faced through a model-based design (Embedded Model Control), on the basis of a wide-band embedded model of the solenoid which includes the effect of eddy currents. To this end, model parameters must be identified. The embedded model includes a stochastic disturbance dynamics capable of estimating and correcting the electromotive contribution together with parametric uncertainty, variability and state dependence. The embedded model, which is fed by the measured current and the supplied voltage, becomes a state predictor of the controllable and disturbance dynamics. The control law combines a reference generator, state feedback and disturbance rejection to dispatch the PWM amplifier with the appropriate duty cycle. Modeling, identification and control design are outlined together with experimental results. A comparison with an existing analog regulator is also provided. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
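The three-term structure described above (feedforward, proportional feedback, and integral action standing in for electromotive-force compensation) can be caricatured in a few lines. All plant values and gains below are hypothetical; the paper's Embedded Model Control design is richer, including an eddy-current model and a disturbance-state predictor:

```python
# Discrete-time toy loop: feedforward + P feedback + integral EMF rejection
# driving an R-L solenoid model.
R, L, dt = 2.0, 5e-3, 1e-4        # resistance (ohm), inductance (H), step (s)
kp, ki = 8.0, 4000.0              # feedback gains (assumed)

i, integ, v_emf = 0.0, 0.0, 0.3   # current (A), integrator state, unknown back-EMF (V)
i_ref = 1.0                       # current setpoint (A)

for _ in range(2000):
    err = i_ref - i
    integ += err * dt
    u = R * i_ref + kp * err + ki * integ   # commanded (duty-cycle averaged) voltage
    i += (u - R * i - v_emf) / L * dt       # solenoid electrical dynamics

print(f"steady-state current = {i:.3f} A")  # integral term rejects the EMF offset
```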
Simulating Turbulent Wind Fields for Offshore Turbines in Hurricane-Prone Regions (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Damiani, R.; Musial, W.
Extreme wind load cases are among the most important external conditions in the design of offshore wind turbines in hurricane-prone regions. Furthermore, in these areas, the increase in load with storm return period is higher than in extra-tropical regions. However, current standards have limited information on the appropriate models for simulating wind loads from hurricanes. This study investigates turbulent wind models for load analysis of offshore wind turbines subjected to hurricane conditions. The extreme wind models suggested in IEC 61400-3 and API/ABS (a widely used standard in the oil and gas industry) are investigated. The present study further examines the wind turbine response subjected to hurricane wind loads. The three-dimensional wind simulator TurbSim is modified to include the API wind model. Wind fields simulated using the IEC and API wind models are used with an offshore wind turbine model established in FAST to calculate turbine loads and response.
Direct 3-D morphological measurements of silicone rubber impression using micro-focus X-ray CT.
Kamegawa, Masayuki; Nakamura, Masayuki; Fukui, Yu; Tsutsumi, Sadami; Hojo, Masaki
2010-01-01
Three-dimensional computer models of dental arches play a significant role in prosthetic dentistry. The microfocus X-ray CT scanner has the advantage of capturing precise 3D shapes of deep fossae, and we propose a new method of measuring the three-dimensional morphology of a dental impression directly, which eliminates the conversion process to dental casts. Measurement precision and accuracy were evaluated using a standard gage composed of steel balls that simulates the dental arch. Measurement accuracy, defined as the standard deviation of the distance distribution between superimposed models, was determined to be ±0.050 mm in comparison with a CAD model. Impressions and casts of an actual dental arch were scanned by microfocus X-ray CT and the three-dimensional models were compared. The impression model had finer morphology, especially around the cervical margins of the teeth. Within the limitations of the current study, direct three-dimensional impression modeling was successfully demonstrated using microfocus X-ray CT.
Cosmological Models and Stability
NASA Astrophysics Data System (ADS)
Andersson, Lars
Principles in the form of heuristic guidelines or generally accepted dogma play an important role in the development of physical theories. In particular, philosophical considerations and principles figure prominently in the work of Albert Einstein. As mentioned in the talk by Jiří Bičák at this conference, Einstein formulated the equivalence principle, an essential step on the road to general relativity, during his time in Prague 1911-1912. In this talk, I would like to discuss some aspects of cosmological models. As cosmology is an area of physics where "principles" such as the "cosmological principle" or the "Copernican principle" play a prominent role in motivating the class of models which form part of the current standard model, I will start by comparing the role of the equivalence principle to that of the principles used in cosmology. I will then briefly describe the standard model of cosmology to give a perspective on some mathematical problems and conjectures on cosmological models, which are discussed in the later part of this paper.
Sirunyan, A. M.; Tumasyan, A.; Adam, W.; ...
2017-07-03
Here, a search for the production of a single top quark in association with a Z boson is presented, both to identify the expected standard model process and to search for flavour-changing neutral current interactions. The data sample corresponds to an integrated luminosity of 19.7 fb−1 recorded by the CMS experiment at the LHC in proton-proton collisions at √s = 8 TeV. Final states with three leptons (electrons or muons) and at least one jet are investigated. An event yield compatible with tZq standard model production is observed, and the corresponding cross section is measured to be σ(pp → tZq → ℓνbℓ+ℓ−q) = 10^{+8}_{−7} fb with a significance of 2.4 standard deviations. No flavour-changing neutral current production of tZq is observed. Exclusion limits at 95% confidence level on the branching fractions of a top quark decaying to a Z boson and an up or a charm quark are found to be Β(t → Zu) < 0.022% and Β(t → Zc) < 0.049%.
Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.
Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark
2016-03-16
The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
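Schematically, principal component regression as used here reduces each training voltammogram to scores on a few principal components and regresses the known concentrations on those scores; unknown scans are then projected through the same pipeline. A minimal sketch with hypothetical array shapes (real training sets pair measured cyclic voltammograms with known analyte concentrations):

```python
import numpy as np

def pcr_fit(V, C, k=3):
    """V: (n_scans, n_potentials) training currents; C: (n_scans, n_analytes)
    known concentrations. Returns the mean scan, loadings and regression."""
    mean = V.mean(axis=0)
    U, S, Wt = np.linalg.svd(V - mean, full_matrices=False)
    P = Wt[:k].T                                   # top-k principal components
    T = (V - mean) @ P                             # training scores
    F, *_ = np.linalg.lstsq(T, C, rcond=None)      # map scores -> concentrations
    return mean, P, F

def pcr_predict(V_new, mean, P, F):
    return (V_new - mean) @ P @ F                  # concentrations for new scans

rng = np.random.default_rng(0)                     # hypothetical training set
V = rng.normal(size=(20, 100)); C = rng.normal(size=(20, 2))
mean, P, F = pcr_fit(V, C)
print(pcr_predict(V[:3], mean, P, F).shape)        # -> (3, 2)
```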
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in the inconsistent representation of business processes, and the interoperability issues, in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that implementation will become more challenging as the mechanism evolves to include more industrial sectors and industries in its ambit. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforementioned challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined) for information exchange. The detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Zhang, Qingyu; Fan, Juwang; Yang, Weidong; Chen, Bixin; Zhang, Lijuan; Liu, Jiaoyu; Wang, Jingling; Zhou, Chunyao; Chen, Xuan
2017-07-01
Vehicle deterioration and technological change influence emission factors (EFs). In this study, the impacts of vehicle deterioration and emission standards on the EFs of regulated pollutants (carbon monoxide [CO], hydrocarbon [HC], and nitrogen oxides [NOx]) for gasoline light-duty trucks (LDTs) were investigated using inspection and maintenance (I/M) data obtained with a chassis dynamometer method. Pollutant EFs for LDTs varied markedly with accumulated mileage and emission standards, and the trends in EFs are associated with accumulated mileage. In addition, the study also found that in most cases the median EFs of CO, HC, and NOx are higher than the basic EFs in the International Vehicle Emissions (IVE) model; therefore, the present study provides correction factors for the IVE model relative to the corresponding emission standards and mileages. Currently, vehicle emissions are major contributors to air pollution in cities, especially in developing countries. Emission factors play a key role in creating emission inventories and estimating emissions. Deterioration, represented by vehicle age and accumulated mileage, and changes in emission standards markedly influence emission factors. In addition, the results provide correction factors for application in the IVE model at the regional level.
Building a Case for Blocks as Kindergarten Mathematics Learning Tools
ERIC Educational Resources Information Center
Kinzer, Cathy; Gerhardt, Kacie; Coca, Nicole
2016-01-01
Kindergarteners need access to blocks as thinking tools to develop, model, test, and articulate their mathematical ideas. In the current educational landscape, resources such as blocks are being pushed to the side and being replaced by procedural worksheets and academic "seat time" in order to address standards. Mathematics research…
A PRELIMINARY EVALUATION OF MODELS-3 CMAQ USING PARTICULATE MATTER DATA FROM THE IMPROVE NETWORK
The Clean Air Act and its Amendments require the United States Environmental Protection Agency (EPA) to establish National Ambient Air Quality Standards for Particulate Matter (PM) and to assess current and future air quality regulations designed to protect human health and wel...
Effective Strategies for Teaching in K-8 Classrooms
ERIC Educational Resources Information Center
Moore, Kenneth D.; Hansen, Jacqueline
2011-01-01
Featuring a wealth of reflection activities and connections to standards, this concise, easy-to-read teaching methods text equips students with the content knowledge and skills they need to become effective K-8 teachers. The book maximizes instructional flexibility, reflects current educational issues, highlights recent research, and models best…
Critical Supports for Secondary Educators in Common Core State Standard Implementation
ERIC Educational Resources Information Center
Ruchti, Wendy P.; Jenkins, Susan J.; Agamba, Joachim
2013-01-01
Teacher professional development (PD) is a complex, ongoing challenge as educational systems attempt to deliver excellent programming in pursuit of increased student achievement (Opfer and Pedder 2011). This article examines Idaho Total Instructional Alignment (TIA), a model for teacher PD that is currently being utilized in secondary schools…
Challenges in the global QCD analysis of parton structure of nucleons
NASA Astrophysics Data System (ADS)
Tung, Wu-Ki
2000-12-01
We briefly summarize the current status of global QCD analysis of the parton structure of the nucleon and then highlight the open questions and challenges which confront this endeavor on which much of the phenomenology of the Standard Model and the search of New Physics depend.
Fractions, Number Lines, Third Graders
ERIC Educational Resources Information Center
Cramer, Kathleen; Ahrendt, Sue; Monson, Debra; Wyberg, Terry; Colum, Karen
2017-01-01
The Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) outlines ambitious goals for fraction learning, starting in third grade, that include the use of the number line model. Understanding and constructing fractions on a number line are particularly complex tasks. The current work of the authors centers on ways to successfully…
Personality Types as an Indicator of Online Student Success and Retention
ERIC Educational Resources Information Center
Meredith, Ben P.
2011-01-01
With online education courses within public institutions realizing lower than average retention and success rates for students, current retention practices and models are falling woefully short of providing workable, viable answers to keeping students and helping them be successful without lowering academic standards. While a relationship between…
Despite tremendous efforts towards regulating and controlling tropospheric ozone (O3) formation, over 70 million people currently live in U.S. counties which exceed the National Ambient Air Quality Standard (NAAQS) set for 03. These high 03 concentrations alone cost the U.S. ap...
The Philosophy and Foundations of Vocational Education.
ERIC Educational Resources Information Center
MSS Information Corp., New York, NY.
The introductory volume in a new series on vocational education, the book surveys recent literature on the philosophy and foundations of this relatively new field. Opening papers deal with the objectives of vocational education departments in high schools, current standards of technological and industrial education, and models for comprehensive…
Neutrinos: What are they? Neutrinos are members of the Standard Model, belonging to a class of [...] the mass could be and the mass differences between flavors of neutrinos, although there are many current experiments designed to probe this question. The difficulty lies in the fact that neutrinos are [...]
Spin and precision electroweak physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marciano, W.J.
1994-12-01
A perspective on fundamental parameters and precision tests of the Standard Model is given. Weak neutral current reactions are discussed with emphasis on those processes involving (polarized) electrons. The role of electroweak radiative corrections in determining the top quark mass and probing for "new physics" is described.
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between the health information systems of hospitals and primary care organizations is currently an important challenge in improving the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing the clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual-model approach, which distinguishes information and knowledge, the latter being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayan Ghosh, Jeff Hammond
OpenSHMEM is a community effort to unify and standardize the SHMEM programming model. MPI (Message Passing Interface) is a well-known community standard for parallel programming using distributed memory. The most recent release of MPI, version 3.0, was designed in part to support programming models like SHMEM. OSHMPI is an implementation of the OpenSHMEM standard using MPI-3 for the Linux operating system. It is the first implementation of SHMEM over MPI one-sided communication and has the potential to be widely adopted due to the portability and wide availability of Linux and MPI-3. OSHMPI has been tested on a variety of systems and implementations of MPI-3, including InfiniBand clusters using MVAPICH2 and SGI shared-memory supercomputers using MPICH. Current support is limited to Linux but may be extended to Apple OS X if there is sufficient interest. The code is open source via https://github.com/jeffhammond/oshmpi
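The core idea (SHMEM-style one-sided puts mapped onto MPI-3 RMA) can be illustrated with the mpi4py bindings. This is an editorial sketch of the concept, not OSHMPI's actual C implementation, and it assumes two ranks:

```python
# Run with: mpiexec -n 2 python put_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

buf = np.zeros(4, dtype='i')             # stand-in for a symmetric-heap array
win = MPI.Win.Create(buf, comm=comm)     # expose it to one-sided access

if rank == 0:
    data = np.arange(4, dtype='i')
    win.Lock(1)                          # passive-target epoch on PE 1
    win.Put(data, target_rank=1)         # the one-sided "shmem_put" analogue
    win.Unlock(1)                        # completes the transfer at the target

comm.Barrier()                           # crude stand-in for shmem_barrier_all
if rank == 1:
    print("PE 1 received:", buf)
win.Free()
```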
Robust tuning of robot control systems
NASA Technical Reports Server (NTRS)
Minis, I.; Uebel, M.
1992-01-01
The computed torque control problem is examined for a robot arm with flexible, geared, joint drive systems which are typical in many industrial robots. The standard computed torque algorithm is not directly applicable to this class of manipulators because of the dynamics introduced by the joint drive system. The proposed approach to computed torque control combines a computed torque algorithm with a torque controller at each joint. Three such control schemes are proposed. The first scheme uses the joint torque control system currently implemented on the robot arm and a novel form of the computed torque algorithm. The other two use the standard computed torque algorithm and a novel model-following torque control system. Standard tasks and performance indices are used to evaluate the performance of the controllers. Both numerical simulations and experiments are used in the evaluation. The study shows that all three proposed systems lead to improved tracking performance over a conventional PD controller.
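For reference, the standard computed-torque law the text builds on commands τ = M(q)(q̈_des + K_d ė + K_p e) + C(q, q̇)q̇ + g(q); in the proposed schemes this command is then tracked by a per-joint torque controller. A minimal sketch with placeholder two-joint dynamics (not the paper's robot model):

```python
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
    """tau = M(q)(qdd_des + Kd*ed + Kp*e) + C(q, qd) @ qd + g(q)."""
    e, ed = q_des - q, qd_des - qd
    return M(q) @ (qdd_des + Kd @ ed + Kp @ e) + C(q, qd) @ qd + g(q)

# placeholder two-joint dynamics terms (illustrative only)
M = lambda q: np.diag([1.0, 0.5])            # inertia matrix
C = lambda q, qd: np.zeros((2, 2))           # Coriolis/centrifugal terms
g = lambda q: np.zeros(2)                    # gravity torques
Kp, Kd = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])

tau = computed_torque(np.zeros(2), np.zeros(2),
                      np.array([0.1, -0.2]), np.zeros(2), np.zeros(2),
                      M, C, g, Kp, Kd)
print(tau)                                    # joint torque commands
```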
Air Traffic Control Improvement Using Prioritized CSMA
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
Simulations, using version 7 of the industry-standard network simulation software OPNET, are presented of two applications of the Aeronautical Telecommunications Network (ATN), Controller Pilot Data Link Communications (CPDLC) and Automatic Dependent Surveillance-Broadcast mode (ADS-B), over VHF Data Link mode 2 (VDL-2). Communication is modeled for air traffic between just three cities. All aircraft are assumed to have the same equipage. The simulation involves Air Traffic Control (ATC) ground stations and 105 aircraft taking off, flying realistic free-flight trajectories, and landing over a 24-hr period. All communication is modeled as unreliable. Collision-less, prioritized carrier sense multiple access (CSMA) is successfully tested. The statistics presented include latency, queue length, and packet loss. This research suggests that a communications system simpler than the currently envisioned standard may not only suffice but also surpass the performance of that standard at a lower deployment cost.
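The collision-less, prioritized CSMA idea can be caricatured in a few lines: stations sense the channel and defer a number of idle slots equal to a unique priority rank, so the highest-priority ready station always transmits alone. This toy sketch is an editorial illustration, far simpler than the OPNET/VDL-2 model:

```python
def contend(ready):
    """ready: station -> unique priority rank (0 = highest). Each station
    defers 'rank' idle slots while sensing the carrier; the first to time
    out transmits, everyone else hears it and defers, so no collisions."""
    for slot in range(max(ready.values()) + 1):
        starters = [s for s, p in ready.items() if p == slot]
        if starters:
            return starters[0], slot   # unique ranks -> exactly one starter

winner, slot = contend({"ATC uplink": 0, "CPDLC": 1, "ADS-B": 2})
print(f"{winner} transmits after {slot} idle slot(s)")
```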
Study of CP-violating charge asymmetries of single muons and like-sign dimuons in $$p\\bar{p}$$ collisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abazov, V. M.; Abbott, B.; Acharya, B. S.
2014-01-01
We measure the inclusive single muon charge asymmetry and the like-sign dimuon charge asymmetry in $$p \\bar{p}$$ collisions using the full data set of 10.4 fb$$^{-1}$$ collected with the D0 detector at the Fermilab Tevatron. The standard model predictions of the charge asymmetries induced by CP violation are small in magnitude compared to the current experimental precision, so non-zero measurements could indicate new sources of CP violation. The measurements differ from the standard model predictions of CP violation in these asymmetries with a significance of 3.6 standard deviations. These results are interpreted in a framework of $B$ meson mixing within the CKM formalism to measure the relative width difference $$\\Delta\\Gamma_d/\\Gamma_d$$ between the mass eigenstates of the $$B^0_d$$ meson system, and the semileptonic charge asymmetries $$a^d_{sl}$$ and $$a^s_{sl}$$ of $$B^0_d$$ and $$B^0_s$$ mesons, respectively.
Thermoelectric converters for alternating current standards
NASA Astrophysics Data System (ADS)
Anatychuk, L. I.; Taschuk, D. D.
2012-06-01
Thermoelectric converters of alternating current remain the preferred instruments for realizing alternating-current standards. This work presents the results of the design and manufacture of an alternating current converter for a military alternating-current standard in Ukraine. Results of the simulation of temperature distribution in converter elements, and ways of optimization to improve the accuracy of alternating current signal reproduction, are presented. Results of metrological trials are given. The quality of a thermoelectric material specially created for alternating current metrology is verified. The converter was used in an alternating current standard for the frequency range from 10 Hz to 30 MHz. The efficiency of using thermoelectric signal converters in measuring instruments is confirmed.
High Energy Colliders and Hidden Sectors
NASA Astrophysics Data System (ADS)
Dror, Asaf Jeff
This thesis explores two dominant frontiers of theoretical physics: high energy colliders and hidden sectors. The Large Hadron Collider (LHC) is just starting to reach its maximum operational capabilities. However, already with the current data, large classes of models are being put under significant pressure. It is crucial to understand whether the (thus far) null results are a consequence of the lack of a solution to the hierarchy problem around the weak scale, or whether they require expanding the search strategy employed at the LHC. It is the duty of the current generation of physicists to design new searches to ensure that no stone is left unturned. To this end, we study the sensitivity of the LHC to the couplings in the Standard Model top sector. We find it can significantly improve the measurements of the Zt_Rt_R coupling through a novel search strategy that makes use of an implied unitarity violation in such models. Analogously, we show that other couplings in the top sector can also be measured with the same technique. Furthermore, we critically analyze a set of anomalies in the LHC data and how they may arise from consistent UV completions. We also propose a technique to measure the lifetimes of new colored particles with non-trivial spin. While the high energy frontier will continue to take data, it is likely the only collider of its kind for the next couple of decades. On the other hand, low-energy experiments have a promising future, with many new proposed experiments to probe the existence of particles well below the weak scale but with small couplings to the Standard Model. In this work we survey the different possibilities, focusing on the constraints as well as possible new hidden sector dynamics. In particular, we show that vector portals which couple to an anomalous current, e.g., baryon number, are significantly constrained by flavor-changing meson decays and rare Z decays. Furthermore, we present a new mechanism for dark matter freezeout which depletes the dark sector through an out-of-equilibrium decay into the Standard Model.
NASA Astrophysics Data System (ADS)
Kattge, J.; Knorr, W.; Raddatz, T.; Wirth, C.
2009-04-01
Photosynthetic capacity is one of the most sensitive parameters of terrestrial biosphere models, and its representation in global scale simulations has been severely hampered by a lack of systematic analyses using a sufficiently broad database. Due to its coupling to stomatal conductance, changes in the parameterisation of photosynthetic capacity may potentially influence transpiration rates and vegetation surface temperature. Here, we provide a constrained parameterisation of photosynthetic capacity for different plant functional types in the context of the photosynthesis model proposed by Farquhar et al. (1980), based on a comprehensive compilation of leaf photosynthesis rates and leaf nitrogen content. Mean values of photosynthetic capacity were implemented into the coupled climate-vegetation model ECHAM5/JSBACH and modelled gross primary production (GPP) is compared to a compilation of independent observations at stand scale. Compared to the current standard parameterisation, the root-mean-squared difference between modelled and observed GPP is substantially reduced for almost all PFTs by the new parameterisation of photosynthetic capacity. We find a systematic depression of NUE (photosynthetic capacity divided by leaf nitrogen content) on certain tropical soils that are known to be deficient in phosphorus. The photosynthetic capacity of tropical trees derived in this study is substantially lower than the standard estimates currently used in terrestrial biosphere models. This causes a decrease in modelled GPP while significantly increasing modelled tropical vegetation surface temperatures, by up to 0.8°C. These results emphasise the importance of a constrained parameterisation of photosynthetic capacity not only for the carbon cycle, but also for the climate system.
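In the Farquhar et al. (1980) model referenced above, photosynthetic capacity enters as Vcmax, the maximum carboxylation rate, through the Rubisco-limited assimilation rate A = Vcmax(Ci − Γ*)/(Ci + Kc(1 + O/Ko)) − Rd. A minimal sketch with commonly used 25 °C kinetic constants (assumed here, not the paper's fitted values):

```python
def rubisco_limited_A(Vcmax, Ci, Gamma_star=42.75, Kc=404.9, Ko=278.4,
                      O=210.0, Rd=1.0):
    """Net Rubisco-limited assimilation (umol m-2 s-1); Ci, Kc and
    Gamma* in umol/mol, Ko and O in mmol/mol (25 degC values)."""
    return Vcmax * (Ci - Gamma_star) / (Ci + Kc * (1.0 + O / Ko)) - Rd

# lowering Vcmax (photosynthetic capacity) lowers assimilation, and hence
# GPP, roughly in proportion
for Vcmax in (60.0, 40.0):
    print(f"Vcmax = {Vcmax}: A = {rubisco_limited_A(Vcmax, Ci=245.0):.1f}")
```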
Citygml and the Streets of New York - a Proposal for Detailed Street Space Modelling
NASA Astrophysics Data System (ADS)
Beil, C.; Kolbe, T. H.
2017-10-01
Three-dimensional semantic city models are increasingly used for the analysis of large urban areas. Until now the focus has mostly been on buildings. Nonetheless, many applications could also benefit from detailed models of public street space for further analysis. However, there are only a few guidelines for representing roads within city models. Therefore, related standards dealing with street modelling are examined and discussed. Nearly all street representations are based on linear abstractions. However, there are many use cases that require, or would benefit from, a detailed geometrical and semantic representation of street space. A variety of potential applications for detailed street space models are presented. Subsequently, based on related standards as well as on user requirements, a concept for a CityGML-compliant representation of street space in multiple levels of detail is developed. In the course of this process, the CityGML Transportation model of the currently valid OGC standard CityGML 2.0 is examined to discover possibilities for further development. Moreover, a number of improvements are presented. Finally, based on open data sources, the proposed concept is implemented within a semantic 3D city model of New York City, generating a detailed 3D street space model for the entire city. As a result, 11 thematic classes, such as roadbeds, sidewalks and traffic islands, are generated and enriched with a large number of thematic attributes.
NASA Astrophysics Data System (ADS)
Irby, I.; Friedrichs, M. A. M.
2017-12-01
Human impacts on the Chesapeake Bay through increased nutrient run-off as a result of land-use change, urbanization, and industrialization, have resulted in a degradation of water quality over the last half-century. These direct impacts, compounded with human-induced climate changes such as warming, rising sea level, and changes in precipitation, have elevated the conversation surrounding the future of the Bay's water quality. As a result, in 2010, a Total Maximum Daily Load (TMDL) was established for the Chesapeake Bay that limited nutrient and sediment input in an effort to increase dissolved oxygen. This research utilizes a multiple model approach to evaluate confidence in the estuarine water quality modeling portion of the TMDL. One of the models is then used to assess the potential impact climate change may have on the success of currently mandated nutrient reduction levels in 2050. Results demonstrate that although the models examined differ structurally and in biogeochemical complexity, they project a similar attainment of regulatory water quality standards after nutrient reduction, while also establishing that meeting water quality standards is relatively independent of hydrologic conditions. By developing a Confidence Index, this research identifies the locations and causes of greatest uncertainty in modeled projections of water quality. Although there are specific locations and times where the models disagree, this research lends an increased degree of confidence in the appropriateness of the TMDL levels and in the general impact nutrient reductions will have on Chesapeake Bay water quality under current environmental conditions. However, when examining the potential impacts of climate change, this research shows that the combined impacts of increasing temperature, sea level, and river flow negatively affect dissolved oxygen throughout the Chesapeake Bay and impact progress towards meeting the water quality standards associated with the TMDL with increased temperature as the primary culprit. These results, having been continually shared with the regulatory TMDL modelers, will aid in the decision making for the 2017 TMDL Mid-Point Assessment.
Curnock, Esther; Bowie, Paul; Pope, Lindsey; McKay, John
2012-03-23
The UK general practitioner (GP) appraisal system is deemed to be an inadequate source of performance evidence to inform a future medical revalidation process. A long-running voluntary model of external peer review in the west of Scotland provides feedback by trained peers on the standard of GP colleagues' core appraisal activities and may 'add value' in strengthening the robustness of the current system in support of revalidation. A significant minority of GPs has participated in the peer feedback model, but a clear majority has yet to engage with it. We aimed to explore the views of non-participants to identify barriers to engagement and attitudes to external peer review as a means to inform the current appraisal system. We conducted semi-structured interviews with a sample of west of Scotland GPs who had yet to participate in the peer review model. A thematic analysis of the interview transcriptions was conducted using a constant comparative approach. Thirteen GPs were interviewed, of whom nine were male. Four core themes were identified in relation to the perceived and experienced 'value' placed on the topics discussed and their relevance to routine clinical practice and professional appraisal: 1. Value of the appraisal improvement activity. 2. Value of external peer review. 3. Value of the external peer review model and host organisation and 4. Attitudes to external peer review. GPs in this study questioned the 'value' of participation in the external peer review model and the national appraisal system over the standard of internal feedback received from immediate work colleagues. There was a limited understanding of the concept, context and purpose of external peer review and some distrust of the host educational provider. Future engagement with the model by these GPs is likely to be influenced by policy to improve the standard of appraisal and contract-related activities, rather than a self-directed recognition of learning needs.
Probing into the effectiveness of self-isolation policies in epidemic control
NASA Astrophysics Data System (ADS)
Crokidakis, Nuno; Duarte Queirós, Sílvio M.
2012-06-01
In this work, we examine how reliably an epidemic disease, mimicked by a susceptible-infected-susceptible (SIS) model defined on a complex network, can be controlled and quelled by means of current and implementable quarantine and isolation policies. Specifically, we consider that each individual in the network is originally linked to individuals of two types: members of the same household and acquaintances. The topology of this network evolves according to a probability q that represents the quarantine or isolation process, in which the connection with acquaintances is severed following standard policies of epidemic control. Within current policies of self-isolation and standard infection rates, our results show that the propagation is either controllable only for hypothetical rates of compliance or not controllable at all.
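To make the mechanism concrete, here is a minimal sketch (not the authors' code) of a discrete-time SIS epidemic on a household-plus-acquaintance network, in which each acquaintance link is severed with probability q to mimic self-isolation; the infection probability, recovery probability, and all network sizes are illustrative assumptions.

```python
import random

# Illustrative parameters (assumptions, not values from the paper)
N_HOUSEHOLDS, HOUSEHOLD_SIZE = 200, 4
N_ACQUAINTANCES = 3        # extra links attempted per individual
BETA, MU = 0.06, 0.1       # infection / recovery probabilities per step
Q = 0.7                    # probability an acquaintance link is severed

random.seed(1)
n = N_HOUSEHOLDS * HOUSEHOLD_SIZE
neighbors = [set() for _ in range(n)]

# Household links: fully connect each household.
for h in range(N_HOUSEHOLDS):
    members = range(h * HOUSEHOLD_SIZE, (h + 1) * HOUSEHOLD_SIZE)
    for i in members:
        for j in members:
            if i != j:
                neighbors[i].add(j)

# Acquaintance links, each kept only with probability 1 - Q (isolation).
for i in range(n):
    for _ in range(N_ACQUAINTANCES):
        j = random.randrange(n)
        if j != i and random.random() > Q:
            neighbors[i].add(j)
            neighbors[j].add(i)

infected = set(random.sample(range(n), 5))
for step in range(200):
    new_infected = set(infected)
    for i in infected:
        for j in neighbors[i]:
            if j not in infected and random.random() < BETA:
                new_infected.add(j)
        if random.random() < MU:      # recovery back to susceptible
            new_infected.discard(i)
    infected = new_infected

print(f"prevalence after 200 steps: {len(infected) / n:.3f}")
```

Sweeping Q from 0 to 1 in such a toy model shows the qualitative effect the paper studies: severing acquaintance links lowers the endemic prevalence, but household transmission alone can keep the epidemic alive.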
NASA Technical Reports Server (NTRS)
Beach, Aubrey; Northup, Emily; Early, Amanda; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao
2015-01-01
The current data management practices for NASA airborne field projects have successfully served science team data needs over the past 30 years to achieve project science objectives; however, users have discovered a number of issues in terms of data reporting and format. The ICARTT format, a NASA standard since 2010, is currently the most popular among the airborne measurement community. Although easy for humans to use, the format standard is not sufficiently rigorous to be machine-readable. This makes data use and management tedious and resource-intensive, and also creates problems in Distributed Active Archive Center (DAAC) data ingest procedures and distribution. Further, most DAACs use metadata models that concentrate on satellite data observations, making them less prepared to deal with airborne data.
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including the terms in a document published by the International Organization for Standardization (ISO). Many terms that are also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.
Progress Toward a Format Standard for Flight Dynamics Models
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2006-01-01
In the beginning, there was FORTRAN, and it was... not so good. But it was universal, and all flight simulator equations of motion were coded with it. Then came ACSL, C, Ada, C++, C#, Java, FORTRAN-90, Matlab/Simulink, and a number of other programming languages. Since the halcyon punch card days of 1968, models of aircraft flight dynamics have proliferated in training devices, desktop engineering and development computers, and control design textbooks. With the rise of industry teaming and increased reliance on simulation for procurement decisions, aircraft and missile simulation models are created, updated, and exchanged with increasing frequency. However, there is no real lingua franca to facilitate the exchange of models from one simulation user to another. The current state of the art is such that several staff-months, if not staff-years, are required to 'rehost' each release of a flight dynamics model from one simulation environment to another. If a standard data package or exchange format were universally adopted, the cost and time of sharing and updating aerodynamics, control laws, mass and inertia, and other flight dynamic components of the equations of motion of an aircraft or spacecraft simulation could be drastically reduced. A 2002 paper estimated that over $6 million in savings could be realized for one military aircraft type alone. This paper describes the efforts of the American Institute of Aeronautics and Astronautics (AIAA) to develop a flight dynamics model exchange standard based on XML and HDF-5 data formats.
High Energy Phenomenology - Proceedings of the Workshop
NASA Astrophysics Data System (ADS)
Pérez, Miguel A.; Huerta, Rodrigo
1992-06-01
The Table of Contents for the full book PDF is as follows: * Preface * Radiative Corrections in the Electroweak Standard Model * Introduction * The Electroweak Standard Model and its Renormalization * Basic Properties of the Standard Model * Renormalization of the Standard Model * Calculation of Radiative Corrections * One-Loop Integrals * Corrected Matrix Elements and Cross Sections * Photonic Corrections * Physical Applications and Results * Parameter Relations in Higher Orders * Decay Widths * Z Physics * W-Pair Production * Higgs Production in e+e- Annihilation * Conclusion * Appendix: Feynman Rules * References * Hadron Collider Physics * Introduction * e+e- Annihilation * The Standard Model * The Drell-Yan Process in Hadronic Collisions * The Structure Functions * Hadronic Z Production * Hadronic W Production * The Transverse Mass * Quark Decays of W's * Weak Interactions * Neutrino Scattering * Weak Neutral Currents * The Standard Model * Symmetries and Lagrangians * Spontaneous Symmetry Breaking * The Standard Model Again * Experimental Situation * Appendix * References * Lectures on Heavy Quark Effective Theory * Introduction * Motivation * Physical Intuition * The Heavy Quark Effective Theory * The Effective Lagrangian and its Feynman Rules * What is an Effective Theory? * The Effective Theory Beyond Tree Level * External Currents * Leading-Logs or No Leading-Logs: A Digression * Sample Calculations * Symmetries * Flavor-SU(N) * Spin-SU(2) * Spectrum * Strong Transitions * Covariant Representation of States * Meson Decay Constants * Preliminaries * Formal Derivation: Green Functions * Quick and Dirty Derivation: States in the HQET * Vector Meson Decay Constant * Corrections * Form Factors in B̄ → Deν and B̄ → D*eν * Preliminaries * Form Factors in the HQET * Form Factors in Order αs * 1/MQ * The Correcting Lagrangian * The Corrected Currents * Corrections of Order mc/mb * Corrections of Order Λ̄/mc and Λ̄/mb * Conclusions and More * Inclusive Semileptonic Decay Rates * B̄ → πeν̄ and B̄ → ρeν̄ * Rare B̄ Decays * e+e- → BB̄ * Λb → ΛcDs vs Λb → ΛcD*s * Factorization * A Last Word (or Two) * References * An Overview of Nonleptonic Decays of B, D, K Mesons and CP-Noninvariance * Generic Ways to Study Nonleptonic Decays and CP-Noninvariance * The Quark-Diagram Scheme * Invariants of the CKM and the Universal Decay-Amplitude CP-Noninvariance Factor XCP * Implications of Measuring Partial-Decay-Rate Asymmetries in B± Decays and in Neutral B Decays such as B0, B̄0 → KS J/ψ * Nonleptonic Decays of D Mesons: From the CKM Non- and Singly-Suppressed Decays to the Predictions of Doubly-Suppressed Decays * Charm Meson D Decays into Vector and Pseudoscalar Bosons, D → VP * Charm Meson Decays into Pseudoscalar-Pseudoscalar Mesons, D → PP * Charm Meson Decays into Vector-Vector Mesons, D → VV * Nonleptonic Decays of B Mesons * The CKM Non-Suppressed Decays * Interesting Features in the Rare B Meson Decays * CP-Noninvariance in K Meson Decays * Implications of Measurement of Re(ε′/ε) * Other Important Searches for Decay-Amplitude CP-Noninvariance in Strange Particles * Some Generic Properties of Decay-Amplitude CP-Noninvariance * References * Top Quark Physics * Introduction * The Top Quark Exists * Upper Limit on Mt * Other Constraints on Mt * Production of Top * Hadron Colliders * SM Top Decays * Detecting SM Tops - Signatures * Model-Independent Lower Limit on Mt * Determining the Charge of a New Heavy Quark * When the Top Quark is Detected * Top Decays - A Window to New Physics? * - Decay to Supersymmetric Partners * - Decay to Charged Higgs Bosons * - Flavor-Changing Neutral Current Decays * - Other Possibilities * New Information Once Top is Observed * Studying the Top Decay Couplings * The Top Quark at NLC * Measuring Mt - How Well? * Sharper Predictions for Many Observables * Measuring Vts, Vtd, Vtb and Γ(t → bW) * Top Polarization Predictions - A New Observable * Testing QCD Polarization Predictions * Correlation of Top Spin Direction with Final b, l+ Directions and Top Mass Measurements * Measuring P±t * General Top Couplings * One-Loop Corrections to Top Decay * Decay Helicity Amplitudes * New Sources of CP Violation at the Weak Scale? * The Effect of Top Loops on Higgs Masses * Is t → Wb a Background for Studying TeV WW Interactions? * Predictions for Mt * Final Remarks * References * High Precision Radiative Corrections in the Semileptonic Decays of Hyperons * On the Decay W± → P±γ * The Decay H0 → γγ and Physics Beyond the Standard Model * Neutrino Masses and Double Beta Decay * Neutrino Oscillations in a Medium: Analytic Calculation of Nonadiabatic Transitions * Gauge-Invariant Perturbation Theory Near a Gauge Resonance * Lower Dimensional Divergences in Gauge Theories * Strange Stars: Which is the Ground State of QCD at Finite Baryon Number? * Experimental Signatures of the SU(5)c Color Model * Generalized Supersymmetric Quantum Mechanics * Chern-Simons Theories in 2+1 Dimensions * List of Participants
Measurement of inclusive radiative B-meson decay B decaying to X(S) meson-gamma
NASA Astrophysics Data System (ADS)
Ozcan, Veysi Erkcan
Radiative decays of the B meson, B → Xsγ, proceed via virtual flavor-changing neutral current processes that are sensitive to contributions from high mass scales, either within the Standard Model of electroweak interactions or beyond. In the Standard Model, these transitions are sensitive to the weak interactions of the top quark, and relatively robust predictions of the inclusive decay rate exist. Significant deviation from these predictions could be interpreted as an indication of processes not included in the minimal Standard Model, such as interactions of charged Higgs or SUSY particles. The analysis of the inclusive photon spectrum from B → Xsγ decays is rather challenging due to high backgrounds from photons emitted in the decay of mesons in B decays, as well as from e+e- annihilation to low-mass quark and lepton pairs. Based on 88.5 million BB̄ events collected by the BABAR detector, the photon spectrum above 1.9 GeV is presented. By comparison of the first and second moments of the photon spectrum with QCD predictions (calculated in the kinetic scheme), QCD parameters describing the bound state of the b quark in the B meson are extracted: mb = 4.45 ± 0.16 GeV/c² and μπ² = 0.65 ± 0.29 GeV². These parameters are useful input to non-perturbative QCD corrections to the semileptonic B decay rate and the determination of the CKM parameter Vub. Based on these parameters and the heavy quark expansion, the full branching fraction is obtained as BR(B → Xsγ; Eγ > 1.6 GeV) = (4.05 ± 0.32stat ± 0.38syst ± 0.29model) × 10⁻⁴. This result is in good agreement with previous measurements, and the statistical and systematic errors are comparable. It is also in good agreement with the theoretical Standard Model predictions, and thus within the present errors there is no indication of any interactions not accounted for in the Standard Model. This finding implies strong constraints on physics beyond the Standard Model.
Development of a model for occipital fixation--validation of an analogue bone material.
Mullett, H; O'Donnell, T; Felle, P; O'Rourke, K; FitzPatrick, D
2002-01-01
Several implant systems may be used to fuse the skull to the upper cervical spine (occipitocervical fusion). Current biomechanical evaluation is restricted by the limitations of human cadaveric specimens. This paper describes the design and validation of a synthetic testing model of the occipital bone. Data from thickness measurements and pull-out strength testing of a series of human cadaveric skulls were used in the design of a high-density rigid polyurethane foam model. The synthetic occipital model demonstrated repeatable and consistent morphological and biomechanical properties. The model provides a standardized environment for the evaluation of occipital implants.
Intelligence Reach for Expertise (IREx)
NASA Astrophysics Data System (ADS)
Hadley, Christina; Schoening, James R.; Schreiber, Yonatan
2015-05-01
IREx is a search engine for next-generation analysts to find collaborators. U.S. Army Field Manual 2.0 (Intelligence) calls for collaboration within and outside the area of operations, but finding the best collaborator for a given task can be challenging. IREx will be demonstrated as part of the Actionable Intelligence Technology Enabled Capability Demonstration (AI-TECD) at the E15 field exercises at Ft. Dix in July 2015. It includes a Task Model for describing a task and its prerequisite competencies, plus a User Model (i.e., a user profile) for individuals to assert their capabilities and other relevant data. These models use a canonical suite of ontologies as a foundation, which enables robust queries and also keeps the models logically consistent. IREx also supports learning validation, where a learner who has completed a course module can search and find a suitable task to practice and demonstrate that their new knowledge can be used in the real world for its intended purpose. The IREx models are in the initial phase of a process to develop them as an IEEE standard. This initiative is currently an approved IEEE Study Group, after which follows a standards working group, then a balloting group, and, if all goes well, an IEEE standard.
A review of the quantum current standard
NASA Astrophysics Data System (ADS)
Kaneko, Nobu-Hisa; Nakamura, Shuji; Okazaki, Yuma
2016-03-01
The electric current, voltage, and resistance standards are the most important standards related to electricity and magnetism. Of these three standards, only the ampere, which is the unit of electric current, is an International System of Units (SI) base unit. However, even with modern technology, relatively large uncertainty exists regarding the generation and measurement of current. As a result of various innovative techniques based on nanotechnology and novel materials, new types of junctions for quantum current generation and single-electron current sources have recently been proposed. These newly developed methods are also being used to investigate the consistency of the three quantum electrical effects, i.e. the Josephson, quantum Hall, and single-electron tunneling effects, a consistency test known as 'the quantum metrology triangle'. This article describes recent research and related developments regarding current standards and quantum-metrology-triangle experiments.
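As background for the consistency test mentioned above, the three quantum electrical effects tie voltage, resistance, and current to frequency and fundamental constants. A schematic summary (standard textbook relations, stated here for orientation rather than taken from the article) is:

```latex
\[
V = \frac{f_J}{K_J}, \quad K_J = \frac{2e}{h} \quad \text{(Josephson effect)}
\]
\[
R = \frac{R_K}{i}, \quad R_K = \frac{h}{e^2} \quad \text{(quantum Hall effect, plateau index } i\text{)}
\]
\[
I = Q_X f_P, \quad Q_X = e \quad \text{(single-electron pump driven at frequency } f_P\text{)}
\]
```

Closing the triangle via Ohm's law, I = V/R, amounts to testing whether the product K_J R_K Q_X equals exactly 2; any deviation would signal a correction to one of the three quantum relations.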
Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A
2016-06-01
To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. A modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition was simulated during their hospital stay and for up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with a standard hospital diet. In addition to standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost-saving at AU$425 per patient and marginally more effective, with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective, with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
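For readers unfamiliar with how such results are summarized, here is a minimal sketch of the standard incremental cost-effectiveness arithmetic, using the point estimates reported above; the willingness-to-pay threshold is an illustrative assumption, not a figure from the study.

```python
# Point estimates from the abstract: nutritional support vs standard care.
delta_cost = -425.0    # AU$ per patient (negative = cost saving)
delta_qaly = 0.005     # quality-adjusted life years gained

# Assumed willingness-to-pay threshold (illustrative only).
wtp = 50_000.0         # AU$ per QALY

# Net monetary benefit: positive values favor the intervention.
nmb = wtp * delta_qaly - delta_cost
print(f"net monetary benefit per patient: AU${nmb:,.0f}")

# When an intervention is cheaper AND more effective, it "dominates"
# standard care, and no ICER (delta_cost / delta_qaly) need be quoted.
if delta_cost < 0 and delta_qaly > 0:
    print("intervention dominates standard care")
```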
Alonso, Rodrigo; Fernandez Martinez, Enrique; Gavela, M. B.; ...
2016-12-22
The gauging of the lepton flavour group is considered in the Standard Model context and in its extension with three right-handed neutrinos. The anomaly cancellation conditions lead to a Seesaw mechanism as underlying dynamics for all leptons; in addition, it requires a phenomenologically viable setup which leads to Majorana masses for the neutral sector: the type I Seesaw Lagrangian in the Standard Model case and the inverse Seesaw in the extended model. Within the minimal extension of the scalar sector, the Yukawa couplings are promoted to scalar fields in the bifundamental of the flavour group. The resulting low-energy Yukawa couplings are proportional to inverse powers of the vacuum expectation values of those scalars; the protection against flavour-changing neutral currents differs from that of Minimal Flavour Violation. In every case, the μ-τ flavour sector exhibits rich and promising phenomenological signals.
Folded supersymmetry with a twist
Cohen, Timothy; Craig, Nathaniel; Lou, Hou Keong; ...
2016-03-30
Folded supersymmetry (f-SUSY) stabilizes the weak scale against radiative corrections from the top sector via scalar partners whose gauge quantum numbers differ from their Standard Model counterparts. This non-trivial pairing of states can be realized in extra-dimensional theories with appropriate supersymmetry-breaking boundary conditions. We present a class of calculable f-SUSY models that are parametrized by a non-trivial twist in 5D boundary conditions and can accommodate the observed Higgs mass and couplings. Although the distinctive phenomenology associated with the novel folded states should provide strong evidence for this mechanism, the most stringent constraints are currently placed by conventional supersymmetry searches. As a result, these models remain minimally fine-tuned in light of LHC8 data and provide a range of both standard and exotic signatures accessible at LHC13.
Novel dark matter phenomenology at colliders
NASA Astrophysics Data System (ADS)
Wardlow, Kyle Patrick
While a suitable candidate particle for dark matter (DM) has yet to be discovered, it is possible one will be found by experiments currently investigating physics on the weak scale. If discovered on that energy scale, the dark matter will likely be producible in significant quantities at colliders like the LHC, allowing the properties of the dark matter, and the underlying physical model that characterizes it, to be precisely determined. I assume that the dark matter will be produced as one of the decay products of a new massive resonance related to physics beyond the Standard Model, and, using the energy distributions of the associated visible decay products, I develop techniques for determining the symmetry that protects these potential dark matter candidates from decaying into lighter Standard Model (SM) particles and for simultaneously measuring the masses of both the dark matter candidate and the particle from which it decays.
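One kinematic relation often used in energy-distribution analyses of this kind (a standard two-body decay result, stated here as background rather than as this thesis's specific method) is that for a parent particle P decaying to a visible particle v and an invisible particle χ, the visible energy in the parent rest frame is fixed by the masses; for a massless visible particle and an unpolarized, isotropically decaying parent sample, the peak of the lab-frame energy distribution of v coincides with this rest-frame value:

```latex
\[
E_v^{*} \;=\; \frac{M_P^2 + m_v^2 - m_\chi^2}{2 M_P}
\qquad \xrightarrow{\; m_v \to 0 \;} \qquad
E_v^{\text{peak (lab)}} \;=\; \frac{M_P^2 - m_\chi^2}{2 M_P}.
\]
```

Measuring such energy peaks in several decay channels constrains combinations of the parent and dark matter masses, which is why energy distributions carry mass information at all.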
Chorvat, Terrence; McCabe, Kevin
2004-01-01
Much has been written about how law as an institution has developed to solve many problems that human societies face. Inherent in all of these explanations are models of how humans make decisions. This article discusses what current neuroscience research tells us about the mechanisms of human decision making of particular relevance to law. This research indicates that humans are both more capable of solving many problems than standard economic models predict, but also limited in ways those models ignore. This article discusses how law is both shaped by our cognitive processes and also shapes them. The article considers some of the implications of this research for improving our understanding of how our current legal regimes operate and how the law can be structured to take advantage of our neural mechanisms to improve social welfare. PMID:15590613
Vehicle track segmentation using higher order random fields
Quach, Tu -Thach
2017-01-09
Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.
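To illustrate the idea of encoding track statistics as 3-by-3 patterns (a toy sketch of the general technique, not the authors' model or learned potentials), the following scores a binary segmentation by summing pattern costs over every 3-by-3 window, penalizing isolated pixels and rewarding straight, connected runs:

```python
import numpy as np

def pattern_cost(window):
    """Toy cost for a 3x3 binary window: reward line-like structure."""
    center = window[1, 1]
    if center == 0:
        return 0.0
    n_on = window.sum() - center
    if n_on == 0:
        return 2.0        # isolated pixel: likely noise
    # Straight horizontal, vertical, or diagonal continuation is cheap.
    lines = [window[1, 0] and window[1, 2],   # horizontal
             window[0, 1] and window[2, 1],   # vertical
             window[0, 0] and window[2, 2],   # diagonal
             window[0, 2] and window[2, 0]]   # anti-diagonal
    return 0.0 if any(lines) else 1.0

def energy(label_img):
    """Sum of 3x3 pattern costs over a binary image."""
    h, w = label_img.shape
    return sum(pattern_cost(label_img[i:i+3, j:j+3])
               for i in range(h - 2) for j in range(w - 2))

# A straight track scores lower energy than the same pixels scattered.
track = np.zeros((8, 8), dtype=int); track[4, 1:7] = 1
noise = np.zeros((8, 8), dtype=int)
noise[(np.array([1, 3, 5, 2, 6]), np.array([1, 4, 2, 6, 5]))] = 1
print(energy(track), energy(noise))  # track energy < noise energy
```

In the paper's approach, such pattern statistics are learned and applied at multiple scales inside a higher order random field, rather than hand-coded as above.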
An Engineered Membrane to Measure Electroporation: Effect of Tethers and Bioelectronic Interface
Hoiles, William; Krishnamurthy, Vikram; Cranfield, Charles G.; Cornell, Bruce
2014-01-01
This article reports on the construction and predictive models for a platform comprising an engineered tethered membrane. The platform provides a controllable and physiologically relevant environment for the study of the electroporation process. The mixed self-assembled membrane is formed via a rapid solvent exchange technique. The membrane is tethered to the gold electrode and includes an ionic reservoir separating the membrane and gold surface. Above the membrane, there is an electrolyte solution and a gold counterelectrode. A voltage is applied between the gold electrodes and the current measured. The current is dependent on the energy required to form aqueous pores and the conductance of each pore. A two-level predictive model, consisting of a macroscopic and a continuum model, is developed to relate the pore dynamics to the measured current. The macroscopic model consists of an equivalent circuit model of the tethered membrane, and asymptotic approximations to the Smoluchowski-Einstein equation of electroporation that is dependent on the pore conductance and the energy required to form aqueous pores. The continuum model is a generalized Poisson-Nernst-Planck (GPNP) system, where an activity coefficient accounting for steric effects of ions is added to the standard PNP system. The GPNP is used to evaluate the conductance of aqueous pores and the electrical energy required to form the pores. As an outcome of the setup of the device and the two-level model, biologically important variables can be estimated from experimental measurements. To validate the accuracy of the two-level model, the predicted current is compared with experimentally measured current for different tethering densities. PMID:25229142
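For orientation, a generic form of the Poisson-Nernst-Planck system with a steric-activity correction of the kind described above is sketched below (a textbook-style statement, with symbols chosen here for illustration: c_i are ion concentrations, φ the electric potential, and γ_i the activity coefficients accounting for finite ion size):

```latex
\[
\nabla \cdot (\epsilon \nabla \phi) = -\sum_i z_i e\, c_i,
\qquad
\frac{\partial c_i}{\partial t} = -\nabla \cdot \mathbf{J}_i,
\]
\[
\mathbf{J}_i = -D_i \left( \nabla c_i
  + \frac{z_i e}{k_B T}\, c_i \nabla \phi
  + c_i \nabla \ln \gamma_i \right).
\]
```

Setting γ_i = 1 recovers the standard PNP system; the additional ∇ ln γ_i term is what encodes the steric (finite-size) effects the GPNP model adds.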
Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael
2017-09-01
The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability.
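As a rough illustration of the modelling idea (not the authors' code; the component means, spreads, and the measurement noise below are invented for the example), a normal mixture can be fitted to observed inhibition zone diameters, and the rate at which methodological variation pushes isolates across a candidate breakpoint can then be estimated:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic "true" diameters: resistant and susceptible subpopulations.
true_d = np.concatenate([rng.normal(12, 1.5, 400),   # resistant-like
                         rng.normal(26, 2.0, 600)])  # susceptible-like
observed = true_d + rng.normal(0, 1.0, true_d.size)  # methodological noise

# Fit a two-component normal mixture to the observed diameters.
gm = GaussianMixture(n_components=2, random_state=0)
gm.fit(observed.reshape(-1, 1))
print("component means:", gm.means_.ravel())

# Categorization error rate at a candidate breakpoint: isolates whose
# true and observed diameters fall on opposite sides of the breakpoint.
cbp = 18.0
errors = np.mean((true_d < cbp) != (observed < cbp))
print(f"error rate at breakpoint {cbp} mm: {errors:.4f}")
```

A zone of methodological uncertainty then corresponds to a diameter band around the breakpoint within which no reliable call is made; widening it trades reportable isolates for a lower error rate, which is the optimization the paper performs.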
Highlights from High Energy Neutrino Experiments at CERN
NASA Astrophysics Data System (ADS)
Schlatter, W.-D.
2015-07-01
Experiments with high-energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the nucleon quark structure and of the weak current, together with the precise measurement of the weak mixing angle. These results established a new level of precision for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed the first quantitative tests of QCD.
CP violation in h → ττ and LFV h → μτ
Hayreter, Alper; He, Xiao-Gang; Valencia, German
2016-06-30
The CMS Collaboration has reported a possible lepton flavor violating (LFV) signal in h → μτ. Whereas this decay does not occur in the standard model (SM), we point out that new physics responsible for this type of decay would, in general, also produce charge-parity (CP) violation in h → ττ. We estimate the size of this effect in a model-independent manner and find that a large asymmetry, of order 25%, is allowed by current constraints.
Lu, Biao; Miao, Yong; Vigneron, Pascale; Chagnault, Vincent; Grand, Eric; Wadouachi, Anne; Postel, Denis; Pezron, Isabelle; Egles, Christophe; Vayssade, Muriel
2017-04-01
Sugar-based surfactants present surface-active properties and relatively low cytotoxicity. They are often considered safe alternatives to currently used surfactants in cosmetic industries. In this study, four sugar-based surfactants, each with an eight-carbon alkyl chain bound to a glucose or a maltose headgroup through an amide linkage, were synthesized and compared to two standard surfactants. The cytotoxic and irritant effects of the surfactants were evaluated using two biologically relevant models: a 3D dermal model (mouse fibroblasts embedded in collagen gel) and reconstituted human epidermis (RHE, multi-layered human keratinocytes). Results show that three of the synthesized surfactants possess lower cytotoxicity than the standard surfactants, as demonstrated in the 3D dermal model. Moreover, the IC50s of the surfactants against the 3D dermal model are higher than the IC50s obtained with the 2D dermal model (monolayer mouse fibroblasts). Both synthesized and standard surfactants show no irritant effects after 48 h of topical application on RHE. Throughout the study, we demonstrate the difficulty of linking the physico-chemical properties of surfactants to their cytotoxicity in complex models. More importantly, our data suggest that, prior to in vivo tests, a complete understanding of surfactant cytotoxicity or irritancy potential requires a combination of cellular and tissue models.
Martinez, Sydney A; Beebe, Laura A; Thompson, David M; Wagener, Theodore L; Terrell, Deirdra R; Campbell, Janis E
2018-01-01
The inverse association between socioeconomic status and smoking is well established, yet the mechanisms that drive this relationship are unclear. We developed and tested four theoretical models of the pathways that link socioeconomic status to current smoking prevalence using a structural equation modeling (SEM) approach. Using data from the 2013 National Health Interview Survey, we selected four indicator variables (poverty ratio, personal earnings, educational attainment, and employment status) that we hypothesize underlie a latent variable, socioeconomic status. We measured direct, indirect, and total effects of socioeconomic status on smoking on four pathways through four latent variables representing social cohesion, financial strain, sleep disturbance, and psychological distress. Results of the model indicated that the probability of being a smoker decreased by 26% of a standard deviation for every one standard deviation increase in socioeconomic status. The direct effects of socioeconomic status on smoking accounted for the majority of the total effects, but the overall model also included significant indirect effects. Of the four mediators, sleep disturbance and psychological distress had the largest total effects on current smoking. We explored the use of structural equation modeling in epidemiology to quantify effects of socioeconomic status on smoking through four social and psychological factors to identify potential targets for interventions. A better understanding of the complex relationship between socioeconomic status and smoking is critical as we continue to reduce the burden of tobacco and eliminate health disparities related to smoking.
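A minimal sketch of how such a structural equation model could be specified is given below (illustrative only: it assumes the semopy Python package and its lavaan-style model syntax, uses made-up column names, and omits the four mediating latent variables that the published model includes):

```python
import pandas as pd
import semopy

# Hypothetical data frame with one row per survey respondent;
# column names are invented for this sketch.
data = pd.read_csv("nhis_2013_subset.csv")

desc = """
# Latent socioeconomic status measured by four indicators.
SES =~ poverty_ratio + personal_earnings + education + employment

# Direct effect of SES on current smoking (mediators omitted here).
smoking ~ SES
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())   # loadings, path coefficient, standard errors
```

The full model would add measurement and structural equations for social cohesion, financial strain, sleep disturbance, and psychological distress, so that the total effect of SES decomposes into direct and mediated (indirect) paths.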
Power flow analysis and optimal locations of resistive type superconducting fault current limiters.
Zhang, Xiuchang; Ruiz, Harold S; Geng, Jianzhao; Shen, Boyang; Fu, Lin; Zhang, Heng; Coombs, Tim A
2016-01-01
Based on conventional approaches for the integration of resistive-type superconducting fault current limiters (SFCLs) in electric distribution networks, SFCL models largely rely on the insertion of a step or exponential resistance that is determined by a predefined quenching time. In this paper, we expand the scope of the aforementioned models by considering the actual behaviour of an SFCL in terms of the temperature-dependent power-law relation between electric field and current density that is characteristic of high-temperature superconductors. Our results are compared to the step-resistance models for the sake of discussion and clarity of the conclusions. Both SFCL models were integrated into a power system model built according to the UK power standard, to study the impact of these protection strategies on the performance of the overall electricity network. As a representative renewable energy source, a 90 MVA wind farm was considered for the simulations. Three fault conditions were simulated, and the fault-current reductions predicted by both fault-current-limiting models were compared across multiple current measuring points and allocation strategies. Consequently, we have shown that incorporating the E-J characteristics and thermal properties of the superconductor at the simulation level of electric power systems is crucial for estimating reliability and determining the optimal locations of resistive-type SFCLs in distributed power networks. Our results may aid decision-making by distribution network operators regarding investment in and promotion of SFCL technologies, as it is possible to determine the maximum number of SFCLs necessary to protect against different fault conditions at multiple locations.
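For concreteness, the power-law constitutive relation referred to above is commonly written as E = E_c (J/J_c(T))^n. The sketch below (with illustrative parameter values, not those of the paper's UK network model) shows how it yields a smooth, temperature-dependent limiter response instead of an instantaneous resistance step:

```python
import numpy as np

# Illustrative HTS parameters (assumptions for the sketch).
E_C = 1e-4                        # V/m, electric-field criterion (1 uV/cm)
N = 25                            # power-law exponent, typical for HTS tapes
JC0, TC, T0 = 2.5e8, 92.0, 77.0   # A/m^2; critical and operating temperatures
A = 1e-6                          # m^2, conductor cross-section

def jc(T):
    """Critical current density, falling linearly to zero at Tc."""
    return JC0 * max((TC - T) / (TC - T0), 0.0)

def e_field(J, T):
    """E-J power law below Tc; grows explosively once J exceeds Jc(T)."""
    j_c = jc(T)
    return E_C * (J / j_c) ** N if j_c > 0 else float("inf")

J = 3e8  # A/m^2, fault-level current density, above Jc at 77 K
for T in (77.0, 80.0, 85.0):
    E = e_field(J, T)
    R_per_m = E / (J * A)         # effective ohm per metre of conductor
    print(f"T = {T:4.1f} K -> E = {E:.3e} V/m, R = {R_per_m:.3e} ohm/m")
```

Because the developed electric field heats the tape and raises T, which in turn lowers J_c and raises E further, the limiting resistance builds up dynamically; the step-resistance models replace this feedback loop with a fixed quenching time.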
Nevers, Meredith B.; Whitman, Richard L.
2011-01-01
Efforts to improve public health protection in recreational swimming waters have focused on obtaining real-time estimates of water quality. Current monitoring techniques rely on the time-intensive culturing of fecal indicator bacteria (FIB) from water samples, but rapidly changing FIB concentrations result in management errors that lead to the public being exposed to high FIB concentrations (type II error) or beaches being closed despite acceptable water quality (type I error). Empirical predictive models may provide a rapid solution, but their effectiveness at improving health protection has not been adequately assessed. We sought to determine whether emerging monitoring approaches could effectively reduce the risk of illness exposure by minimizing management errors. We examined four monitoring approaches (inactive, current protocol, a single predictive model for all beaches, and individual models for each beach) with increasing refinement at 14 Chicago beaches, using historical monitoring and hydrometeorological data, and compared management outcomes using different standards for decision-making. Predictability (R2) of FIB concentration improved with model refinement at all beaches but one. Predictive models did not always reduce the number of management errors and therefore the overall illness burden. Use of a Chicago-specific single-sample standard, rather than the widely used default of 235 E. coli CFU/100 ml, together with predictive modeling resulted in the greatest number of open beach days without any increase in public health risk. These results emphasize that emerging monitoring approaches such as empirical models are not equally applicable at all beaches, and combining monitoring approaches may expand beach access.
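The management-error bookkeeping used in such comparisons is simple enough to state in code. This sketch (illustrative, with the 235 CFU/100 ml figure from the abstract as the default) counts type I errors (beach closed though water was acceptable) and type II errors (beach open though water exceeded the standard):

```python
def management_errors(decisions, cultured_fib, standard=235.0):
    """Count beach-management errors against culture results.

    decisions    -- list of bools: True = beach closed that day
    cultured_fib -- list of E. coli CFU/100 ml from culturing
    """
    type_i = type_ii = 0
    for closed, fib in zip(decisions, cultured_fib):
        exceeded = fib > standard
        if closed and not exceeded:
            type_i += 1      # closed despite acceptable water
        elif not closed and exceeded:
            type_ii += 1     # open despite poor water quality
    return type_i, type_ii

# Example: decisions from a predictive model vs. cultured results.
print(management_errors([True, False, False, True],
                        [120.0, 480.0, 90.0, 600.0]))  # -> (1, 1)
```

Because culture results arrive a day late, the "current protocol" effectively makes each day's decision from yesterday's sample, which is why even a modest same-day predictive model can change the error balance.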
Constant-current control method of multi-function electromagnetic transmitter.
Xue, Kaichang; Zhou, Fengdao; Wang, Shuang; Lin, Jun
2015-02-01
Based on the requirements of controlled-source audio-frequency magnetotellurics, DC resistivity, and induced polarization, a constant-current control method is proposed. Using the current waveforms required in prospecting as a standard, the causes of current waveform distortion and the effects of that distortion on prospecting are analyzed. A cascaded topology is adopted to achieve a 40 kW constant-current transmitter. The response speed and precision are analyzed. According to the power circuit of the transmitting system, the circuit structure of the pulse width modulation (PWM) constant-current controller is designed. After establishing the power circuit model of the transmitting system and the PWM constant-current controller model, analyzing the influence of ripple current, and designing an open-loop transfer function according to the amplitude-frequency characteristic curves, the parameters of the PWM constant-current controller are determined. The open-loop transfer function indicates that the loop gain is no less than 28 dB below 160 Hz, which ensures the response speed of the transmitting system; the phase margin is 45°, which ensures the stability of the transmitting system. Experimental results verify that the proposed constant-current control method can keep the control error below 4% and can effectively suppress load variations caused by the capacitance of the earth load.
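As a rough companion to the loop-shaping figures quoted above (at least 28 dB of loop gain below 160 Hz and a 45° phase margin), the following checks gain and phase margins for an assumed open-loop transfer function; the PI gains and plant pole are invented placeholders, not the paper's design values:

```python
import numpy as np
from scipy import signal

# Assumed open loop: PI controller * first-order converter plant.
# L(s) = Kp * (1 + 1/(Ti*s)) * K0 / (1 + s/w0)
KP, TI, K0, W0 = 4.0, 1e-3, 10.0, 2 * np.pi * 2000.0

num = np.polymul([KP * TI, KP], [K0])          # Kp*(Ti*s + 1) * K0
den = np.polymul([TI, 0.0], [1.0 / W0, 1.0])   # Ti*s * (s/w0 + 1)
loop = signal.TransferFunction(num, den)

w = np.logspace(1, 6, 2000)                    # rad/s
w, mag_db, phase_deg = signal.bode(loop, w)

f = w / (2 * np.pi)
print(f"loop gain at 160 Hz: {np.interp(160.0, f, mag_db):.1f} dB")

# Phase margin: phase above -180 deg at the gain-crossover frequency.
i_c = np.argmin(np.abs(mag_db))                # |L| closest to 0 dB
print(f"phase margin: {180.0 + phase_deg[i_c]:.1f} deg")
```

The integrator in the PI term supplies the large low-frequency gain that drives the steady-state current error toward zero, while the crossover placement sets the phase margin and hence the damping of the current loop.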
Gasson, Natalie; Johnson, Andrew R.; Booth, Leon; Loftus, Andrea M.
2018-01-01
This study examined whether standard cognitive training, tailored cognitive training, transcranial direct current stimulation (tDCS), standard cognitive training + tDCS, or tailored cognitive training + tDCS improved cognitive function and functional outcomes in participants with PD and mild cognitive impairment (PD-MCI). Forty-two participants with PD-MCI were randomized to one of six groups: (1) standard cognitive training, (2) tailored cognitive training, (3) tDCS, (4) standard cognitive training + tDCS, (5) tailored cognitive training + tDCS, or (6) a control group. Interventions lasted 4 weeks, with cognitive and functional outcomes measured at baseline, post-intervention, and follow-up. The trial was registered with the Australian New Zealand Clinical Trials Registry (ANZCTR: 12614001039673). While controlling for moderator variables, Generalized Linear Mixed Models (GLMMs) showed that, compared with the control group, the intervention groups demonstrated statistically significant improvements of variable magnitude across executive function, attention/working memory, memory, language, activities of daily living (ADL), and quality of life (QOL; Hedges' g range = 0.01 to 1.75). More outcomes improved for the groups that received standard or tailored cognitive training combined with tDCS. Participants with PD-MCI receiving cognitive training (standard or tailored) or tDCS demonstrated significant improvements on cognitive and functional outcomes, and combining these interventions provided greater therapeutic effects. PMID:29780572
[Undergraduate psychiatric training in Turkey].
Cıngı Başterzi, Ayşe Devrim; Tükel, Raşit; Uluşahin, Aylin; Coşkun, Bülent; Alkın, Tunç; Murat Demet, Mehmet; Konuk, Numan; Taşdelen, Bahar
2010-01-01
The current trend in medical education is to abandon the experience-based traditional model and embrace the competency-based education (CBE) model. The basic principle behind CBE is standardization. The first step in standardization is to determine what students must know, what they must accomplish, and what attitude they should display, together with the establishment of educational goals. One of the goals of the Psychiatric Association of Turkey, Psychiatric Training Section, is to standardize psychiatric training in Turkish medical schools. This study aimed to determine the current state of undergraduate psychiatric training in Turkish medical schools. Questionnaires were sent to the psychiatry department chairs of 41 medical schools. Data were analyzed using descriptive statistical methods. Of the 41 department chairs that were sent the questionnaire, 29 (70%) completed and returned them, of which 16 (66.7%) reported that they had already defined goals and educational objectives for their undergraduate psychiatric training programs. The Core Education Program, prepared by the Turkish Medicine and Health Education Council, was predominantly used at 9 (37.5%) medical schools. Pre-clinical and clinical training schedules varied between medical schools. In all, 3 of the medical schools did not offer internships in psychiatry. The majority of chairs emphasized the importance of mood disorders (49.9%) and anxiety disorders (40%), suggesting that these disorders should be treatable by general practitioners. Computer technology was commonly used for lecturing; however, utilization of interactive and skill-based teaching methods was limited. The most commonly used evaluation methods were written examination (87.5%) during preclinical training and oral examination (91.6%) during clinical training. The most important finding of this study was the lack of a standardized curriculum for psychiatric training in Turkey. Standardization of psychiatric training in Turkish medical schools must be developed.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
NASA Astrophysics Data System (ADS)
Merritt, David
2017-02-01
I argue that some important elements of the current cosmological model are 'conventionalist' in the sense defined by Karl Popper. These elements include dark matter and dark energy; both are auxiliary hypotheses that were invoked in response to observations that falsified the standard model as it existed at the time. The use of conventionalist stratagems in response to unexpected observations implies that the field of cosmology is in a state of 'degenerating problemshift' in the language of Imre Lakatos. I show that the 'concordance' argument, often put forward by cosmologists in support of the current paradigm, is weaker than the convergence arguments that were made in the past in support of the atomic theory of matter or the quantization of energy.
Decay of standard-model-like Higgs boson h →μ τ in a 3-3-1 model with inverse seesaw neutrino masses
NASA Astrophysics Data System (ADS)
Nguyen, T. Phong; Le, T. Thuy; Hong, T. T.; Hue, L. T.
2018-04-01
By adding new gauge singlets of neutral leptons, improved versions of the 3-3-1 models with right-handed neutrinos have recently been introduced in order to explain recent experimental neutrino oscillation data through the inverse seesaw mechanism. We prove that these models predict promising signals of lepton-flavor-violating decays of the standard-model-like Higgs boson, h₁⁰ → μτ, eτ, which are suppressed in the original versions. One-loop contributions to these decay amplitudes are introduced in the unitary gauge. Based on a numerical investigation, we find that the branching ratios of the decays h₁⁰ → μτ, eτ can reach values of 10⁻⁵ in the regions of parameter space satisfying the current experimental data on the decay μ → eγ. Values of 10⁻⁴ appear when the Yukawa couplings of the leptons are close to the perturbative limit. Some interesting properties of these regions of parameter space are also discussed.
Probing particle and nuclear physics models of neutrinoless double beta decay with different nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogli, G. L.; Rotunno, A. M.
2009-07-01
Half-life estimates for neutrinoless double beta decay depend on particle physics models for lepton-flavor violation, as well as on nuclear physics models for the structure and transitions of candidate nuclei. Different models considered in the literature can be contrasted, via prospective data, with a 'standard' scenario characterized by light Majorana neutrino exchange and by the quasiparticle random phase approximation, for which the theoretical covariance matrix has been recently estimated. We show that, assuming future half-life data in four promising nuclei (⁷⁶Ge, ⁸²Se, ¹³⁰Te, and ¹³⁶Xe), the standard scenario can be distinguished from a few nonstandard physics models, while being compatible with alternative state-of-the-art nuclear calculations (at 95% C.L.). Future signals in different nuclei may thus help to discriminate at least some decay mechanisms, without being spoiled by current nuclear uncertainties. Prospects for possible improvements are also discussed.
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.
The purpose of this document is to identify laws, rules, model codes, codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), and to document experience to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, or revisions to existing CSR, along with the supporting research and documentation necessary to foster the deployment of safe ESS.
Comparison of Predictive Modeling Methods of Aircraft Landing Speed
NASA Technical Reports Server (NTRS)
Diallo, Ousmane H.
2012-01-01
Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing the controller interventions necessary for avoiding separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from operations of a major airline for a passenger transport aircraft type at the Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed error prediction by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network model's errors represents more than a 5% reduction compared to the RSE model errors, and at least a 10% reduction relative to the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state of the art.
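The two modeling approaches compared above can be sketched in a few lines; this is an illustrative mock-up on synthetic data (the real work used airline operational variables such as weight and winds, which are not reproduced here):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Synthetic stand-ins for predictors (e.g., weight, wind, flap setting).
X = rng.normal(size=(2000, 3))
speed = 135 + 8*X[:, 0] - 5*X[:, 1] + 3*np.tanh(X[:, 2]) \
        + rng.normal(0, 2, 2000)          # knots; invented relation

X_tr, X_te = X[:1500], X[1500:]
y_tr, y_te = speed[:1500], speed[1500:]

# Response surface equation: second-order polynomial regression.
rse = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rse.fit(X_tr, y_tr)

# Neural network: the nonlinear regression alternative.
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                  random_state=0).fit(X_tr, y_tr)

for name, model in (("RSE", rse), ("NN", nn)):
    err = y_te - model.predict(X_te)
    print(f"{name}: landing-speed error std = {err.std():.2f} kt")
```

The comparison metric mirrors the paper's: the standard deviation of the prediction error on held-out landings, evaluated with the full and the reduced predictor sets.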
On the contribution of active galactic nuclei to the high-redshift metagalactic ionizing background
NASA Astrophysics Data System (ADS)
D'Aloisio, Anson; Upton Sanderbeck, Phoebe R.; McQuinn, Matthew; Trac, Hy; Shapiro, Paul R.
2017-07-01
Motivated by the claimed detection of a large population of faint active galactic nuclei (AGNs) at high redshift, recent studies have proposed models in which AGNs contribute significantly to the z > 4 H I ionizing background. In some models, AGNs are even the chief sources of reionization. If confirmed, these models would require a complete revision of the standard view that galaxies dominated the high-redshift ionizing background. It has been suggested that AGN-dominated models can better account for two recent observations that appear to be in conflict with the standard view: (1) large opacity variations in the z ˜ 5.5 H I Ly α forest, and (2) slow evolution in the mean opacity of the He II Ly α forest. Large spatial fluctuations in the ionizing background, arising from the brightness and rarity of AGNs, may account for the former, while the earlier onset of He II reionization in these models may account for the latter. Here we show that models in which AGN emissions source ≳50 per cent of the ionizing background generally provide a better fit to the observed H I Ly α forest opacity variations than standard galaxy-dominated models. However, we argue that these AGN-dominated models are in tension with constraints on the thermal history of the intergalactic medium (IGM). Under standard assumptions about the spectra of AGNs, we show that the earlier onset of He II reionization heats the IGM well above recent temperature measurements. We further argue that the slower evolution of the mean opacity of the He II Ly α forest relative to simulations may reflect deficiencies in current simulations rather than favour AGN-dominated models as has been suggested.
Phenomenology of ultrahigh energy neutrino interactions and fluxes
NASA Astrophysics Data System (ADS)
Hussain, Shahid
There are several models that predict the existence of high and ultrahigh energy (UHE) neutrinos: neutrinos with energies above 10^15 eV. No man-made machine, existing or planned, can produce particles of such high energies. It is the energies of these neutrinos that make them very interesting to the particle physics and astrophysics communities; these neutrinos can be a unique tool to study unknown regimes of energy, space, and time. Consequently, there is intense experimental activity focused on the detection of these neutrinos, though no UHE neutrinos have been detected so far. Indeed, most UHE neutrino flux models predict that the fluxes of these neutrinos may be too small to be detected by current detectors. Therefore, more powerful detectors are being built, and we are at the beginning of a new and exciting era in neutrino astronomy. The interactions and fluxes of UHE neutrinos are both unknown experimentally. Our focus here is to explore, by numerically calculating observable signals from these neutrinos, the different scenarios that can arise from the interplay of UHE neutrino interaction and flux models. Given several AGN and cosmogenic neutrino flux models, we look at two possibilities for neutrino interactions: (i) neutrinos have standard model weak interactions at ultrahigh energies; (ii) neutrino interactions are enhanced around a TeV mass scale, as implied by low-scale gravity models with extra dimensions. The standard model weak and the low-scale gravity enhanced neutrino-nucleon interactions of UHE neutrinos both produce observable signals. In the standard model, charged current neutrino-nucleon interactions give muons, taus, and particle showers, and neutral current interactions give particle showers. In low-scale gravity, micro black hole formation (and subsequent decay) and graviton exchange both give particle showers. Muons, taus, and showers can be detected by the optical Cherenkov radiation they produce; showers can also be detected by the coherent radio Cherenkov signal they produce, which is much more powerful than their optical Cherenkov signal. We give the formalism for calculating muon, tau, and shower rates for optical (IceCube-like) Cherenkov detectors and shower rates for radio (RICE-like) Cherenkov detectors. Our focus is on simulation of the radio signal from neutrino-initiated showers and calculation of the expected neutrino-initiated shower rates for RICE. Finally, given the calculated rates for muons, taus, and showers, we discuss what we can say about models for UHE neutrino fluxes and interactions.
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
Visualizing and Validating Metadata Traceability within the CDISC Standards.
Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine
2017-01-01
The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Information Standards Consortium data standards. Within regulated clinical research, traceability is a requirement and indicates that the analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) development of a traceability query capability for retrieval and visualization of traceability information.
Groth, Kevin M; Granata, Kevin P
2008-06-01
Due to the mathematical complexity of current musculoskeletal spine models, there is a need for computationally efficient models of the intervertebral disk (IVD). The aim of this study is to develop a mathematical model that will adequately describe the motion of the IVD under axial cyclic loading as well as maintain computational efficiency for use in future musculoskeletal spine models. Several studies have successfully modeled the creep characteristics of the IVD using the three-parameter viscoelastic standard linear solid (SLS) model. However, when the SLS model is subjected to cyclic loading, it underestimates the load relaxation, the cyclic modulus, and the hysteresis of the human lumbar IVD. A viscoelastic standard nonlinear solid (SNS) model was therefore used to predict the response of the human lumbar IVD subjected to low-frequency vibration. Nonlinear behavior of the SNS model was introduced by adding a strain-dependent elastic modulus to the SLS model. Parameters of the SNS model were estimated from experimental load-deformation and stress-relaxation curves obtained from the literature. The SNS model was able to predict the cyclic modulus of the IVD at frequencies of 0.01 Hz, 0.1 Hz, and 1 Hz. Furthermore, the SNS model was able to quantitatively predict the load relaxation at a frequency of 0.01 Hz. However, model performance was unsatisfactory when predicting load relaxation and hysteresis at higher frequencies (0.1 Hz and 1 Hz). The SLS model of the lumbar IVD may require strain-dependent elastic and viscous behavior to represent the dynamic response to compressive strain.
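To make the three-parameter viscoelastic formulation above concrete, here is a minimal sketch that integrates a generic standard linear solid (Zener) constitutive equation under cyclic strain and reports a cyclic modulus and hysteresis loop area. The parameter values, variable names, and explicit-Euler integration are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Three-parameter standard linear solid (Zener) model, one common form:
#   sigma_dot = (E1 + E2)*eps_dot + (E1*E2/eta)*eps - (E2/eta)*sigma
# Hypothetical parameter values, for illustration only.
E1, E2, eta = 1.0e6, 3.0e6, 5.0e7        # Pa, Pa, Pa*s
f = 0.1                                   # loading frequency, Hz
t = np.linspace(0.0, 5.0 / f, 5000)       # five loading cycles
dt = t[1] - t[0]

eps = 0.025 * (1.0 - np.cos(2.0 * np.pi * f * t))   # cyclic compressive strain
eps_dot = np.gradient(eps, dt)

sigma = np.zeros_like(t)
for i in range(len(t) - 1):
    sigma_dot = ((E1 + E2) * eps_dot[i]
                 + (E1 * E2 / eta) * eps[i]
                 - (E2 / eta) * sigma[i])
    sigma[i + 1] = sigma[i] + dt * sigma_dot        # explicit Euler step

last = t >= 4.0 / f                                  # final cycle only
print(f"cyclic modulus ~ {np.ptp(sigma[last]) / np.ptp(eps[last]):.3e} Pa")
# Signed area of the final stress-strain loop approximates the hysteresis.
loop_area = np.sum(sigma[last][:-1] * np.diff(eps[last]))
print(f"hysteresis loop area ~ {loop_area:.3e} J/m^3")
```

The SNS variant described in the abstract would replace the constant E1 with a function of the instantaneous strain, which is a one-line change inside the loop.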
Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam
2014-01-01
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The curve fitting employed with the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high-resolution scans. The current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed up the curve fitting used to obtain the pharmacokinetic parameters. The results show that, using the frequency-domain approach, curve fitting is computationally more efficient than with the time-domain approach.
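As a rough illustration of why a frequency-domain formulation speeds up Tofts-model fitting, the sketch below evaluates the standard Tofts convolution with FFTs inside a least-squares fit, replacing the O(n^2) time-domain integral with O(n log n) work per iteration. The arterial input function, sampling grid, and parameter values are invented; this is a sketch of the idea rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

# Standard Tofts model: Ct(t) = Ktrans * integral_0^t Cp(u) exp(-kep*(t-u)) du.
t = np.linspace(0.0, 300.0, 512)            # s, uniform sampling (needed for FFT)
dt = t[1] - t[0]
cp = (t / 30.0) * np.exp(1.0 - t / 30.0)    # hypothetical arterial input function

def tofts_fft(t, ktrans, kep):
    """Evaluate the Tofts convolution via zero-padded FFTs."""
    n = len(t)
    kernel = np.exp(-kep * t) * dt          # discretized exponential kernel
    spec = np.fft.rfft(cp, 2 * n) * np.fft.rfft(kernel, 2 * n)
    return ktrans * np.fft.irfft(spec, 2 * n)[:n]

# Synthetic "measured" tissue curve with noise, then a nonlinear fit.
rng = np.random.default_rng(0)
ct_meas = tofts_fft(t, 0.25, 0.01) + rng.normal(0.0, 0.002, t.size)
(ktrans, kep), _ = curve_fit(tofts_fft, t, ct_meas, p0=(0.1, 0.005))
print(f"fitted Ktrans = {ktrans:.3f}, kep = {kep:.4f}")
```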
Interferometric tests of Planckian quantum geometry models
Kwon, Ohkyung; Hogan, Craig J.
2016-04-19
The effects of Planck-scale quantum geometry on measurements with interferometers are estimated with standard physics and with a variety of proposed extensions. It is shown that the effects are negligible in standard field theory with canonically quantized gravity. Statistical noise levels are estimated in a variety of proposals for nonstandard metric fluctuations, and these alternatives are constrained using upper bounds on stochastic metric fluctuations from LIGO. Idealized models of several interferometer system architectures are used to predict signal noise spectra in a quantum geometry that cannot be described by a fluctuating metric, in which position noise arises from holographic bounds on directional information. Lastly, predictions in this case are shown to be close to current and projected experimental bounds.
Recent results on rare B decays with BaBar
NASA Astrophysics Data System (ADS)
Margoni, Martino; BaBar Collaboration
2017-04-01
Flavor Changing Neutral Current transitions b → sl+l- and b → sγ provide an excellent laboratory for the search for physics beyond the Standard Model. Standard Model tests are performed through measurements of the lepton forward-backward asymmetry AFB and the longitudinal K* polarization FL in the decay B →K*l+l-, and the search for the rare decay B+ →K+τ+τ-. From the study of the Kπ+π- system in B radiative-penguin decays, the time-dependent CP asymmetry in the decay B0 →KS0 π+π- γ is measured, together with the branching fractions of B+ →K+π-π+ γ and B0 →K0π-π+ γ.
Understanding flavour at the LHC
Nir, Yosef
2018-05-22
Huge progress in flavour physics has been achieved by the two B-factories and the Tevatron experiments. This progress has, however, deepened the new physics flavour puzzle: If there is new physics at the TeV scale, why aren't flavour changing neutral current processes enhanced by orders of magnitude compared to the standard model predictions? The forthcoming ATLAS and CMS experiments can potentially solve this puzzle. Perhaps even more surprisingly, these experiments can potentially lead to progress in understanding the standard model flavour puzzle: Why is there smallness and hierarchy in the flavour parameters? Thus, a rich and informative flavour program is awaiting us not only in the flavour-dedicated LHCb experiment, but also in the high-pT ATLAS and CMS experiments.
Predicting ESI/MS Signal Change for Anions in Different Solvents.
Kruve, Anneli; Kaupmees, Karl
2017-05-02
LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiencies of different compounds in the ESI source differ by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting ionization efficiencies in the ESI source with a model that uses physicochemical parameters of the analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated both in flow-injection and in chromatographic mode with gradient elution.
Comparison of a novel fixation device with standard suturing methods for spinal cord stimulators.
Bowman, Richard G; Caraway, David; Bentley, Ishmael
2013-01-01
Spinal cord stimulation is a well-established treatment for chronic neuropathic pain of the trunk or limbs. Currently, the standard method of fixation is to affix the leads of the neuromodulation device to soft tissue, fascia, or ligament by manually tying general suture. A novel semiautomated device is proposed that may be advantageous compared to the current standard. Comparison testing in an excised caprine spine and a simulated benchtop model was performed. Three tests were performed: 1) perpendicular pull from the fascia of the caprine spine; 2) axial pull from the fascia of the caprine spine; and 3) axial pull from Mylar film. Six samples of each configuration were tested for each scenario. Standard 2-0 Ethibond was compared with a novel semiautomated device (Anulex fiXate). Upon completion of testing, statistical analysis was performed for each scenario. For perpendicular pull in the caprine spine, the failure load for standard suture was 8.95 lbs (standard deviation 1.39), whereas for fiXate it was 15.93 lbs (standard deviation 2.09). For axial pull in the caprine spine, the failure load for standard suture was 6.79 lbs (standard deviation 1.55), whereas for fiXate it was 12.31 lbs (standard deviation 4.26). For axial pull in Mylar film, the failure load for standard suture was 10.87 lbs (standard deviation 1.56), whereas for fiXate it was 19.54 lbs (standard deviation 2.24). These data suggest that the novel semiautomated device offers a method of fixation that may be utilized in lieu of standard suturing methods for securing neuromodulation devices, and may in fact provide a more secure fixation than standard suturing. © 2012 International Neuromodulation Society.
Kim, Younggy; Walker, W Shane; Lawler, Desmond F
2012-05-01
In electrodialysis desalination, the boundary layer near ion-exchange membranes limits the overall rate of ionic separation due to concentration polarization over tens of micrometers in that layer. Under high-current conditions, this sharp concentration gradient, which creates substantial ionic diffusion, can drive a preferential separation for certain ions depending on their concentration and diffusivity in the solution. Thus, this study tested the hypothesis that the boundary layer affects the competitive transport between di- and mono-valent cations, which is known to be governed primarily by partitioning with cation-exchange membranes. A laboratory-scale electrodialyzer was operated at steady state with a mixture of 10 mM KCl and 10 mM CaCl2 at various flow rates. Increased flows increased the relative calcium transport. A two-dimensional model was built with analytical solutions of the Nernst-Planck equation. In the model, the boundary layer thickness was treated as a random variable defined by three statistical parameters: the mean, the standard deviation, and the correlation coefficient between the thicknesses of the two boundary layers facing each other across a spacer. Model simulations with the Monte Carlo method found that greater calcium separation was achieved with a smaller mean, a greater standard deviation, or a more negative correlation coefficient. The model and experimental results were compared for the cationic transport number as well as the current-potential relationship. The mean boundary layer thickness was found to decrease from 40 to less than 10 μm as the superficial water velocity increased from 1.06 to 4.24 cm/s. The standard deviation was greater than the mean thickness at slower water velocities and smaller at faster water velocities. Copyright © 2012 Elsevier Ltd. All rights reserved.
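The stochastic treatment of the boundary-layer thickness described above can be sketched as follows: the two thicknesses facing each other across a spacer are drawn as correlated Gaussian random variables for Monte Carlo use. The numbers, names, and the truncation at a small positive value are illustrative assumptions; the analytical Nernst-Planck transport solution itself is not reproduced here.

```python
import numpy as np

# Three statistical parameters of the boundary-layer thickness (illustrative).
mean, std, rho = 20e-6, 10e-6, -0.5          # m, m, correlation coefficient
cov = std**2 * np.array([[1.0, rho],
                         [rho, 1.0]])

rng = np.random.default_rng(1)
samples = rng.multivariate_normal([mean, mean], cov, size=100_000)
samples = np.clip(samples, 1e-7, None)       # a thickness cannot be negative

delta_1, delta_2 = samples.T                 # the two facing boundary layers
print(f"mean thickness:     {samples.mean():.2e} m")
print(f"sample correlation: {np.corrcoef(delta_1, delta_2)[0, 1]:+.2f}")
```

Each sampled pair would then feed the analytical transport solution, with the cationic transport number averaged over the ensemble.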
Beyond the Standard Model: The pragmatic approach to the gauge hierarchy problem
NASA Astrophysics Data System (ADS)
Mahbubani, Rakhi
The current favorite solution to the gauge hierarchy problem, the Minimal Supersymmetric Standard Model (MSSM), is looking increasingly fine-tuned, as recent results from LEP-II have pushed it to regions of its parameter space where a light Higgs seems unnatural. Given this fact, it seems sensible to explore other approaches to this problem; we study three alternatives here. The first is a Little Higgs theory, in which the Higgs particle is realized as the pseudo-Goldstone boson of an approximate global chiral symmetry and so is naturally light. We analyze precision electroweak observables in the Minimal Moose model, one example of such a theory, and look for regions in its parameter space that are consistent with current limits on these. It is also possible to find a solution within a supersymmetric framework by adding to the MSSM superpotential a λS H_u H_d term and UV-completing with new strong dynamics, under which S is a composite, before λ becomes non-perturbative. This allows us to increase the MSSM tree-level Higgs mass bound to a value that alleviates the supersymmetric fine-tuning problem with elementary Higgs fields, maintaining gauge coupling unification in a natural way. Finally, we try an entirely different tack, in which we do not attempt to solve the hierarchy problem but rather assume that the tuning of the Higgs can be explained in some unnatural way, for instance from environmental considerations. With this philosophy in mind, we study in detail the low-energy phenomenology of the minimal extension of the Standard Model with a dark matter candidate and gauge coupling unification, consisting of additional fermions with the quantum numbers of SUSY higgsinos and a singlet.
Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O
2017-08-01
Medical processes can be modeled using different methods and notations. Currently used modeling systems such as Business Process Model and Notation (BPMN) are not capable of describing highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes, using the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN, and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to extend the system to more complex situations that might arise during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM, which makes it possible to depict complex processes with complex decisions. This combination offers a significant advantage for modeling perioperative processes.
ERIC Educational Resources Information Center
Cohen, Julie; Schuldt, Lorien Chambers; Brown, Lindsay; Grossman, Pamela
2016-01-01
Background/Context: Current efforts to build rigorous teacher evaluation systems have increased interest in standardized classroom observation tools as reliable measures for assessing teaching. However, many argue these instruments can also be used to effect change in classroom practice. This study investigates a model of professional development…
Development of a Hand Held Thromboelastograph
2014-01-01
…on a prototype model, and there was no indication of damage; the device was found to comply with IEC 61010-1. Currently, loss of calibration has not been… Standards. Task 4 - PCM Certification Testing: Subtask 4a: IEC 60601-1; Subtask 4b: IEC 60601-1-2; Subtask 4c: ISO 10993; Subtask 4d: ISTA 2A
Testing the Efficacy of Theoretically Derived Improvements in the Treatment of Social Phobia
ERIC Educational Resources Information Center
Rapee, Ronald M.; Gaston, Jonathan E.; Abbott, Maree J.
2009-01-01
Recent theoretical models of social phobia suggest that targeting several specific cognitive factors in treatment should enhance treatment efficacy over that of more traditional skills-based treatment programs. In the current study, 195 people with social phobia were randomly allocated to 1 of 3 treatments: standard cognitive restructuring plus in…
The Evolution of SCORM to Tin Can API: Implications for Instructional Design
ERIC Educational Resources Information Center
Lindert, Lisa; Su, Bude
2016-01-01
Integrating and documenting formal and informal learning experiences is challenging using the current Shareable Content Object Reference Model (SCORM) eLearning standard, which limits the media and data that are obtained from eLearning. In response to SCORM's limitations, corporate, military, and academic institutions have collaborated to develop…
The Dependency Axiom and the Relation between Agreement and Movement
ERIC Educational Resources Information Center
Linares Scarcerieau, Carlo Andrei
2012-01-01
Agreement and movement go hand in hand in a number of constructions across languages, and this correlation has played an important role in syntactic theory. The current standard approach to this "movement-agreement connection" is the Agree+EPP model, whose EPP component has often been questioned on conceptual grounds. The goal of this…
The potential impacts of development on wildlands in El Dorado County, California
Shawn C. Saving; Gregory B. Greenwood
2002-01-01
We modeled future development in rapidly urbanizing El Dorado County, California, to assess ecological impacts of expanding urbanization and effectiveness of standard policy mitigation efforts. Using raster land cover data and county parcel data, we constructed a footprint of current development and simulated future development using a modified stochastic flood-fill...
Distributional Effects of Educational Improvements: Are We Using the Wrong Model?
ERIC Educational Resources Information Center
Bourguignon, Francois; Rogers, F. Halsey
2007-01-01
Measuring the incidence of public spending in education requires an intergenerational framework distinguishing between what current and future generations--that is, parents and children--give and receive. In standard distributional incidence analysis, households are assumed to receive a benefit equal to what is spent on their children enrolled in…
False Accusations: A Growing Fear in the Classroom
ERIC Educational Resources Information Center
Bradley, Jon
2011-01-01
Male role models are becoming increasingly scarce in Canadian classrooms, and the demographics indicate that the current low numbers will continue to decline. New teachers are quite prepared to take up the pedagogical issues raised by changing standards and a changing demographic; however, the spectre of violence and false accusations adds a level…
Emerging Evidence for Instructional Practice: Repeated Viewings of Sign Language Models
ERIC Educational Resources Information Center
Beal-Alvarez, Jennifer S.; Huston, Sandra G.
2014-01-01
Current initiatives in education, such as No Child Left Behind and the National Common Core Standards movement, call for the use of evidence-based practices, or those instructional practices that are supported by documentation of their effectiveness related to student learning outcomes, including students with special needs. While hearing loss is…
Enhanced diffusion of pollutants by self-propulsion.
Zhao, Guanjia; Stuart, Emma J E; Pumera, Martin
2011-07-28
Current environmental models mostly account for the passive participation of pollutants in their environmental propagation. Here we demonstrate the paradigm-changing concept that pollutants can propagate themselves with a rate that is greater than the rate for standard molecular diffusion by five orders of magnitude. This journal is © the Owner Societies 2011
Information Technology Programming Standards and Annual Project Maintenance Costs
ERIC Educational Resources Information Center
Mynyk, John
2012-01-01
Organizations that depend on the use of IT in their business models must maintain their systems and keep them current to survive (Filipek, 2008; Kulkarni, Kumar, Mookerjee, & Sethi, 2009; Unterkalmsteiner et al., 2012). As most IT departments allocate as much as 80% of their budget to maintain stability while leaving only the other…
(PRESENTED IN ALBERTA, CANADA) A PERFORMANCE EVALUATION OF THE 2004 RELEASE OF MODELS-3 CMAQ
The Clean Air Act and its Amendments require that the U.S. Environmental Protection Agency (EPA) establish National Ambient Air Quality Standards for O3 and particulate matter and to assess current and future air quality regulations designed to protect human health and...
Use of model-predicted "transference ratios" is currently under consideration by the US EPA in the formulation of a Secondary National Ambient Air Quality Standard for oxidized nitrogen and oxidized sulfur. This term is an empirical parameter defined for oxidized sulfur (TS) as th...
Does H → γγ taste like vanilla new physics?
NASA Astrophysics Data System (ADS)
Almeida, L. G.; Bertuzzo, E.; Machado, P. A. N.; Funchal, R. Zukanovich
2012-11-01
We analyse the interplay between the Higgs-to-diphoton rate and electroweak precision measurement constraints in extensions of the Standard Model with new uncolored charged fermions that do not mix with the ordinary ones. We also compute the pair production cross sections for the lightest fermion and compare them with current bounds.
International Futures (IFs): A Global Issues Simulation for Teaching and Research.
ERIC Educational Resources Information Center
Hughes, Barry B.
This paper describes the International Futures (IFs) computer assisted simulation game for use with undergraduates. Written in Standard Fortran IV, the model currently runs on mainframe or mini computers, but has not been adapted for micros. It has been successfully installed on Harris, Burroughs, Telefunken, CDC, Univac, IBM, and Prime machines.…
Group Work during International Disaster Outreach Projects: A Model to Advance Cultural Competence
ERIC Educational Resources Information Center
West-Olatunji, Cirecie; Henesy, Rachel; Varney, Melanie
2015-01-01
Given the rise in disasters worldwide, counselors will increasingly be called upon to respond. Current accreditation standards require that programs train students to become skillful in disaster/crisis interventions. Group processing to enhance self-awareness and improve conceptualization skills is an essential element of such training. This…
Final progress report for "To continue to explore the energy frontier with the ATLAS detector"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mallik, Usha
2016-09-13
We are trying to understand the origin of our universe and the gap between the current theory, the Standard Model (SM), and the very beginning; the question is what led to the universe we see today, and what conditions might have led to the SM.
NASA Astrophysics Data System (ADS)
Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan
2012-03-01
Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This reconstruction method combines the idealized system representation known from the standard filtered back projection (FBP) algorithm with the strength of iterative reconstruction by including a noise model in the reconstruction scheme. The algorithm models how noise propagates through the reconstruction steps, feeds this model back into the loop, and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast-to-noise ratio is studied using the low-contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast-to-noise ratio, images from ASiR can be obtained using 60% less current, leading to a dose reduction of the same amount.
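For reference, the contrast-to-noise ratio used in phantom studies of this kind reduces to a simple region-of-interest statistic, sketched below. The synthetic images, masks, and insert contrast are placeholders; actual Catphan measurements are not reproduced here.

```python
import numpy as np

def cnr(image, roi_mask, bg_mask):
    """Contrast-to-noise ratio between a low-contrast insert and background."""
    roi, bg = image[roi_mask], image[bg_mask]
    return abs(roi.mean() - bg.mean()) / bg.std()

# Hypothetical FBP-like (noisier) and ASiR-like (noise-reduced) slices.
rng = np.random.default_rng(2)
fbp  = rng.normal(40.0, 12.0, (256, 256))
asir = rng.normal(40.0, 6.0, (256, 256))
roi = np.zeros((256, 256), bool); roi[100:120, 100:120] = True
bg  = np.zeros((256, 256), bool); bg[40:80, 40:80] = True
fbp[roi] += 6.0; asir[roi] += 6.0            # same low-contrast insert in both
print(f"CNR, FBP-like:  {cnr(fbp, roi, bg):.2f}")
print(f"CNR, ASiR-like: {cnr(asir, roi, bg):.2f}")
```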
Duckworth, Angela L.; Quinn, Patrick D.; Tsukayama, Eli
2013-01-01
The increasing prominence of standardized testing to assess student learning motivated the current investigation. We propose that standardized achievement test scores assess competencies determined more by intelligence than by self-control, whereas report card grades assess competencies determined more by self-control than by intelligence. In particular, we suggest that intelligence helps students learn and solve problems independent of formal instruction, whereas self-control helps students study, complete homework, and behave positively in the classroom. Two longitudinal, prospective studies of middle school students support predictions from this model. In both samples, IQ predicted changes in standardized achievement test scores over time better than did self-control, whereas self-control predicted changes in report card grades over time better than did IQ. As expected, the effect of self-control on changes in report card grades was mediated in Study 2 by teacher ratings of homework completion and classroom conduct. In a third study, ratings of middle school teachers about the content and purpose of standardized achievement tests and report card grades were consistent with the proposed model. Implications for pedagogy and public policy are discussed.
Setting performance standards for medical practice: a theoretical framework.
Southgate, L; Hays, R B; Norcini, J; Mulholland, H; Ayers, B; Woolliscroft, J; Cusimano, M; McAvoy, P; Ainsworth, M; Haist, S; Campbell, M
2001-05-01
The assessment of performance in the real world of medical practice is now widely accepted as the goal of assessment at the postgraduate level. This is largely a validity issue, as it is recognised that tests of knowledge and clinical simulations cannot on their own really measure how medical practitioners function in the broader health care system. However, the development of standards for performance-based assessment is not as well understood as in competency assessment, where simulations can more readily reflect narrower issues of knowledge and skills. This paper proposes a theoretical framework for the development of standards that reflect the more complex world in which experienced medical practitioners work. The paper reflects the combined experiences of a group of education researchers and the results of literature searches that included identifying current health system data sources that might contribute information to the measurement of standards. Standards that reflect the complexity of medical practice may best be developed through an "expert systems" analysis of clinical conditions for which desired health care outcomes reflect the contribution of several health professionals within a complex, three-dimensional, contextual model. Examples of the model are provided, but further work is needed to test validity and measurability.
Energy Storage System Safety: Plan Review and Inspection Checklist
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pam C.; Conover, David R.
Codes, standards, and regulations (CSR) governing the design, construction, installation, commissioning, and operation of the built environment are intended to protect the public health, safety, and welfare. While these documents change over time to address new technology and new safety challenges, there is generally some lag time between the introduction of a technology into the market and the time it is specifically covered in model codes and standards developed in the voluntary sector. After their development, there is also a timeframe of at least a year or two until the codes and standards are adopted. Until existing model codes and standards are updated or new ones are developed and then adopted, one seeking to deploy energy storage technologies or needing to verify the safety of an installation may be challenged in trying to apply currently implemented CSR to an energy storage system (ESS). The Energy Storage System Guide for Compliance with Safety Codes and Standards (CG), developed in June 2016, is intended to help address the acceptability of the design and construction of stationary ESSs, their component parts, and the siting, installation, commissioning, operation, maintenance, and repair/renovation of ESS within the built environment.
Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas
2016-06-15
Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment, using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that the regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
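To illustrate the stage-wise residual diagnostic at issue, the sketch below fits a single-stage Q-function by ordinary least squares on synthetic data and summarizes the residual-versus-fitted relationship. It shows only the standard diagnostic that the article argues can be misleading, not the authors' modified procedure; all data and names are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
s2 = rng.normal(size=n)                      # stage-2 patient state summary
a2 = rng.integers(0, 2, size=n)              # stage-2 treatment indicator
y = 1.0 + 0.5 * s2 + a2 * (0.8 - 0.6 * s2) + rng.normal(0.0, 1.0, n)

# Working regression model for the stage-2 Q-function.
X = np.column_stack([np.ones(n), s2, a2, a2 * s2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

# A correctly specified model should show no residual trend in fitted values.
print(f"residual-vs-fitted correlation: {np.corrcoef(fitted, resid)[0, 1]:+.3f}")
```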
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balazs, Csaba; Conrad, Jan; Farmer, Ben
Imaging atmospheric Cherenkov telescopes (IACTs) that are sensitive to potential γ-ray signals from dark matter (DM) annihilation above ~50 GeV will soon be superseded by the Cherenkov Telescope Array (CTA). CTA will have a point source sensitivity an order of magnitude better than currently operating IACTs and will cover a broad energy range between 20 GeV and 300 TeV. Using effective field theory and simplified models to calculate γ-ray spectra resulting from DM annihilation, we compare the prospects to constrain such models with CTA observations of the Galactic center with current and near-future measurements at the Large Hadron Collider (LHC) and direct detection experiments. Here, for DM annihilations via vector or pseudoscalar couplings, CTA observations will be able to probe DM models out of reach of the LHC, and, if DM is coupled to standard fermions by a pseudoscalar particle, beyond the limits of current direct detection experiments.
The Standard Model: how far can it go and how can we tell?
Butterworth, J M
2016-08-28
The Standard Model of particle physics encapsulates our current best understanding of physics at the smallest distances and highest energies. It incorporates quantum electrodynamics (the quantized version of Maxwell's electromagnetism) and the weak and strong interactions, and has survived unmodified for decades, save for the inclusion of non-zero neutrino masses after the observation of neutrino oscillations in the late 1990s. It describes a vast array of data over a wide range of energy scales. I review a selection of these successes, including the remarkably successful prediction of a new scalar boson, a qualitatively new kind of object observed in 2012 at the Large Hadron Collider. New calculational techniques and experimental advances challenge the Standard Model across an ever-wider range of phenomena, now extending significantly above the electroweak symmetry breaking scale. I will outline some of the consequences of these new challenges, and briefly discuss what is still to be found. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. © 2016 The Author(s).
Developing the Precision Magnetic Field for the E989 Muon g{2 Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Matthias W.
The experimental value of $(g-2)_\mu$ historically has been and remains an important probe of the Standard Model and proposed extensions. Previous measurements of $(g-2)_\mu$ exhibit a persistent statistical tension with calculations using the Standard Model, implying that the theory may be incomplete and constraining possible extensions. The Fermilab Muon g-2 experiment, E989, endeavors to increase the precision over previous experiments by a factor of four and probe more deeply into the tension with the Standard Model. The $(g-2)_\mu$ experimental implementation measures two spin precession frequencies defined by the magnetic field: proton precession and muon precession. The value of $(g-2)_\mu$ is derived from a relationship between the two frequencies. The precision of the magnetic field measurements and the overall magnetic field uniformity achieved over the muon storage volume are thus two undeniably important aspects of the experiment in minimizing uncertainty. The current thesis details the methods employed to achieve the magnetic field goals and the results of that effort.
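For context, the frequency relationship alluded to above is conventionally written as follows; this is the standard muon g-2 convention supplied for the reader, not an equation quoted from the thesis. With $\omega_a$ the anomalous spin precession frequency and $\omega_p$ the proton precession frequency measured in the same magnetic field,

```latex
a_\mu \;=\; \frac{\omega_a/\omega_p}{\mu_\mu/\mu_p - \omega_a/\omega_p}
      \;=\; \frac{R}{\lambda - R},
\qquad R \equiv \frac{\omega_a}{\omega_p},
\qquad \lambda \equiv \frac{\mu_\mu}{\mu_p},
```

where the muon-to-proton magnetic moment ratio $\lambda$ is taken from external measurements.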
NASA Technical Reports Server (NTRS)
Teague, E. C.; Vorburger, T. V.; Scire, F. E.; Baker, S. M.; Jensen, S. W.; Gloss, B. B.; Trahan, C.
1982-01-01
Current work by the National Bureau of Standards at the NASA National Transonic Facility (NTF) to evaluate the performance of stylus instruments for determining the topography of models under investigation is described, along with instrumentation for characterization of the surface microtopography. Potential areas of surface effects are reviewed, and the need for finer-surfaced models for the NTF high Reynolds number flows is stressed. Current stylus instruments have tip radii as large as 25 microns, and three models with surface finishes of 4-6, 8-10, and 12-15 micro-in. rms were fabricated for tests with a stylus with a tip radius of 1 micron and a 50 mg force. Work involving three-dimensional stylus profilometry is discussed in terms of stylus displacement being converted to digital signals, and the design of a light-scattering instrument capable of measuring the surface finish on curved objects is presented.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, G.; Wu, C.; Li, X.; Song, P.
2013-12-01
The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases, and various models and vocabularies have been drafted and applied by industrial companies for urban geological data. Issues such as duplicate and ambiguous definitions of terms and differing coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national-standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standard data storage. The overall purpose of this work is to set up a common data platform that provides an information sharing service. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. The underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with the national standards to build a mapping table. The attributes of the various urban geological data entity models are reduced to several categories according to their application phases and domains, and a logical data model is then set up as a standard format to design data file structures for a relational database. (2) A multi-level data dictionary for data standardization constraints. Three levels of data dictionary are designed: the model data dictionary manages system database files and enhances maintenance of the whole database system; the attribute dictionary organizes the fields used in database tables; and the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods. A comprehensive data dictionary manages system operation and security. (3) An extension of system data management functions based on the data dictionary. The data item constraint input function makes use of the standard term and code dictionary to produce standard input results. The attribute dictionary organizes all the fields of an urban geological information database to ensure consistent term use across fields. The model dictionary is used to generate a database operation interface automatically, with standard semantic content supplied via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System, South-East China, with satisfactory results.
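A minimal sketch of the three-level data-dictionary idea in item (2) follows; the tables, fields, and codes are invented for illustration, and the actual GB-standard code lists are not reproduced.

```python
# Level 3: standard terms and codes (term and code dictionary).
term_code = {
    "QS": "Quaternary sediment",
    "GW": "Groundwater aquifer",
}
# Level 2: standard field definitions (attribute dictionary).
attributes = {
    "litho_code": {"type": "char(2)", "domain": term_code},
    "depth_m":    {"type": "float",   "unit": "m"},
}
# Level 1: which database tables use which fields (model dictionary).
models = {
    "borehole_layer": ["litho_code", "depth_m"],
}

def validate(table, record):
    """Reject records whose coded fields are not standard codes."""
    for field in models[table]:
        spec = attributes[field]
        if "domain" in spec and record[field] not in spec["domain"]:
            raise ValueError(f"{field}={record[field]!r} is not a standard code")

validate("borehole_layer", {"litho_code": "QS", "depth_m": 12.5})  # passes
```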
Using RDF to Model the Structure and Process of Systems
NASA Astrophysics Data System (ADS)
Rodriguez, Marko A.; Watkins, Jennifer H.; Bollen, Johan; Gershenson, Carlos
Many systems can be described in terms of networks of discrete elements and their various relationships to one another. A semantic network, or multi-relational network, is a directed labeled graph consisting of a heterogeneous set of entities connected by a heterogeneous set of relationships. Semantic networks serve as a promising general-purpose modeling substrate for complex systems. Various standardized formats and tools are now available to support practical, large-scale semantic network models. First, the Resource Description Framework (RDF) offers a standardized semantic network data model that can be further formalized by ontology modeling languages such as RDF Schema (RDFS) and the Web Ontology Language (OWL). Second, the recent introduction of highly performant triple-stores (i.e. semantic network databases) allows semantic network models on the order of 109 edges to be efficiently stored and manipulated. RDF and its related technologies are currently used extensively in the domains of computer science, digital library science, and the biological sciences. This article will provide an introduction to RDF/RDFS/OWL and an examination of its suitability to model discrete element complex systems.
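As a small, concrete illustration of the RDF data model, the sketch below builds a tiny multi-relational network with the rdflib Python library and queries it with SPARQL; the namespace and terms are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/system/")
g = Graph()
g.bind("ex", EX)

# A heterogeneous set of entities joined by heterogeneous relationships.
g.add((EX.nodeA, RDF.type, EX.Component))
g.add((EX.nodeB, RDF.type, EX.Component))
g.add((EX.nodeA, EX.connectedTo, EX.nodeB))
g.add((EX.nodeA, EX.label, Literal("sensor")))

# Traverse the directed labeled graph with a SPARQL query.
for s, p, o in g.query("SELECT ?s ?p ?o WHERE { ?s ?p ?o }"):
    print(s, p, o)
```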
Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression
Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.; ...
2017-01-18
Currently, Constraint-Based Reconstruction and Analysis (COBRA) is the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Furthermore, standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We also developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging problems tested here. DQQ will enable extensive use of large linear and nonlinear models in systems biology and other applications involving multiscale data.
LORENZ: a system for planning long-bone fracture reduction
NASA Astrophysics Data System (ADS)
Birkfellner, Wolfgang; Burgstaller, Wolfgang; Wirth, Joachim; Baumann, Bernard; Jacob, Augustinus L.; Bieri, Kurt; Traud, Stefan; Strub, Michael; Regazzoni, Pietro; Messmer, Peter
2003-05-01
Long bone fractures belong to the most common injuries encountered in routine clinical trauma surgery. Preoperative assessment and decision making are usually based on standard 2D radiographs of the injured limb. Given that a 3D imaging modality such as computed tomography (CT) is not used for diagnosis in clinical routine, we have designed LORENZ, a fracture reduction planning tool based on such standard radiographs. Taking into account the considerable success of so-called image-free navigation systems for total knee replacement in orthopaedic surgery, we assume that a similar tool for long bone fracture reduction should have considerable impact on computer-aided trauma surgery in a standard clinical routine setup. The case of long bone fracture reduction is, however, somewhat more complicated, since more than scale-independent angles indicating biomechanical measures such as varus and valgus are involved. Reduction path planning requires that the individual anatomy and the classification of the fracture be taken into account. In this paper, we present the basic ideas of this planning tool, its current state, and the methodology chosen. LORENZ takes one or more conventional radiographs of the broken limb as input data. In addition, one or more x-rays of the opposite healthy bone are taken and mirrored if necessary. The most adequate CT model is selected from a database; currently, this is achieved by using a scale-space approach on the digitized x-ray images and comparing standard perspective renderings to these x-rays. After finding a CT volume with a similar bone, a triangulated surface model is generated, and the surgeon can break the bone and arrange the fragments in 3D according to the x-ray images of the broken bone. Common osteosynthesis plates and implants can be loaded from CAD datasets and are visualized as well. In addition, LORENZ renders virtual x-ray views of the fracture reduction process. The hybrid surface/voxel rendering engine of LORENZ also features full collision detection of fragments and implants using the RAPID collision detection library. The reduction path is saved, and a TCP/IP interface to a robot for executing the reduction was added. LORENZ is platform independent and was programmed using Qt, AVW, and OpenGL. We present a prototype for computer-aided fracture reduction planning based on standard radiographs. First tests on clinical CT/x-ray image pairs showed good performance; a current effort focuses on improving the speed of model retrieval by using orthonormal image moment decomposition, and on clinical evaluation for both training and surgical planning purposes. Furthermore, user-interface aspects are currently under evaluation and will be discussed.
The Application of the SPASE Metadata Standard in the U.S. and Worldwide
NASA Astrophysics Data System (ADS)
Thieman, J. R.; King, T. A.; Roberts, D.
2012-12-01
The Space Physics Archive Search and Extract (SPASE) metadata standard for Heliophysics and related data is now an established standard within the NASA-funded space and solar physics community and is spreading to international groups within that community. Development of SPASE has involved a number of international partners, and the current version of the SPASE Metadata Model (version 2.2.2) has not needed any structural modifications since January 2011. The SPASE standard has been adopted by groups such as NASA's Heliophysics division, the Canadian Space Science Data Portal (CSSDP), Canada's AUTUMN network, Japan's Inter-university Upper atmosphere Global Observation NETwork (IUGONET), the Centre de Données de la Physique des Plasmas (CDPP), and the near-Earth space data infrastructure for e-Science (ESPAS). In addition, portions of the SPASE dictionary have been modeled in semantic web ontologies for use with reasoners and semantic searches. While we anticipate additional modifications to the model in the future to accommodate simulation and model data, these changes will not affect the data descriptions already generated for instrument-related datasets. Examples of SPASE descriptions can be viewed at
Gasqui, Patrick; Trommenschlager, Jean-Marie
2017-08-21
Milk production in dairy cow udders is a complex and dynamic physiological process that has resisted explanatory modelling thus far. The current standard model, Wood's model, is empirical in nature, represents yield in daily terms, and was published in 1967. Here, we have developed a dynamic and integrated explanatory model that describes milk yield at the scale of the milking session. Our approach allowed us to formally represent and mathematically relate biological features of known relevance while accounting for stochasticity and conditional elements in the form of explicit hypotheses, which could then be tested and validated using real-life data. Using an explanatory mathematical and biological model to explore a physiological process and pinpoint potential problems (i.e., "problem finding"), it is possible to filter out unimportant variables that can be ignored, retaining only those essential to generating the most realistic model possible. Such modelling efforts are multidisciplinary by necessity. It is also helpful downstream because model results can be compared with observed data, via parameter estimation using maximum likelihood and statistical testing using model residuals. The process in its entirety yields a coherent, robust, and thus repeatable, model.
A Ball Lightning Model as a Possible Explanation of Recently Reported Cavity Lights
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fryberger, David; /SLAC
The salient features of cavity lights, in particular mobile luminous objects (MLOs), as experimentally observed in superconducting accelerator cavities, are summarized. A model based upon standard electromagnetic interactions between a small particle and the 1.5 GHz cavity excitation field is described. This model can explain some features of these data, in particular the existence of particle orbits without wall contact. While this result is an important success for the model, it is detailed why the model as it stands is incomplete. It is argued that no avenues for a suitable extension of the model through established physics appear evident, which motivates an investigation of a model based upon a more exotic object: ball lightning. As discussed, further motivation derives from the fact that there are significant similarities in many of the qualitative features of ball lightning and MLOs, even though they appear in quite different circumstances and differ in scale by orders of magnitude. The ball lightning model, which incorporates electromagnetic charges and currents, is based on a symmetrized set of Maxwell's equations in which the electromagnetic sources and fields are characterized by a process called dyality rotation. It is shown that a consistent mathematical description of dyality rotation as a physical process can be achieved by adding suitable (phenomenological) current terms to supplement the usual current terms in the symmetrized Maxwell's equations. These currents, which enable the conservation of electric and magnetic charge, are called vacuum currents. It is shown that the proposed ball lightning model offers a good qualitative explanation of the perplexing aspects of the MLO data. Avenues for further study are indicated.
Experience with abstract notation one
NASA Technical Reports Server (NTRS)
Harvey, James D.; Weaver, Alfred C.
1990-01-01
The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the Abstract Syntax Notation One standard (ASN.1) and the Basic Encoding Rules standard (BER), which collectively address this problem. When used within the presentation layer of the Open Systems Interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
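As a concrete illustration of what the BER half of that pairing does, the sketch below hand-encodes an ASN.1 INTEGER in BER's tag-length-value layout; a production system would instead use a codec generated by a compiler like the one described. Long-form length octets are omitted for brevity.

```python
# Hand-rolled BER encoding of an ASN.1 INTEGER, illustrating the
# tag-length-value layout that gives heterogeneous hosts a common wire
# format. Production code would use a codec generated from the ASN.1 spec.
def ber_encode_integer(value: int) -> bytes:
    # Find the minimal number of two's-complement content octets.
    length = 1
    while not -(1 << (8 * length - 1)) <= value < (1 << (8 * length - 1)):
        length += 1
    content = value.to_bytes(length, byteorder="big", signed=True)
    # Tag 0x02 = universal, primitive, INTEGER; short-form length octet.
    assert len(content) < 128, "long-form lengths omitted in this sketch"
    return bytes([0x02, len(content)]) + content

print(ber_encode_integer(300).hex())   # -> '0202012c'
print(ber_encode_integer(-1).hex())    # -> '0201ff'
```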
Prototyping an online wetland ecosystem services model using open model sharing standards
Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.
2011-01-01
Great interest currently exists in developing ecosystem models to forecast how ecosystem services may change under alternative land-use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging because modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models, as well as limitations on computing, storage, and connectivity. Traditional standalone and closed-network systems cannot fully support sharing and integrating interdisciplinary geospatial models from disparate sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
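Concretely, a WPS-published model is reached through a small set of HTTP operations. The sketch below builds the key-value-pair (KVP) form of two of them; the endpoint URL and the process identifier are hypothetical placeholders, not the project's actual service.

```python
# Sketch of the OGC WPS key-value-pair (KVP) HTTP interface through which
# a published model is discovered and described. The endpoint URL and the
# process identifier are hypothetical placeholders.
from urllib.parse import urlencode

ENDPOINT = "https://example.org/wps"  # hypothetical service endpoint

def wps_url(request: str, **params: str) -> str:
    query = {"service": "WPS", "version": "1.0.0", "request": request, **params}
    return ENDPOINT + "?" + urlencode(query)

# List every model the server publishes.
print(wps_url("GetCapabilities"))
# Ask for the inputs/outputs of one wetland model (identifier is made up).
print(wps_url("DescribeProcess", identifier="WaterStorageModel"))
```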
Information-Flow-Based Access Control for Web Browsers
NASA Astrophysics Data System (ADS)
Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu
The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy [1], the current de facto standard for the Web browser security model. We propose a new browser security model that allows fine-grained access control in client-side Web applications for secure mashups and user-generated content. The model is based on information-flow-based access control (IBAC) in order to cope with the dynamic nature of client-side Web applications and to accurately determine the privileges of scripts in the event-driven programming model.
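The following is a minimal sketch of the general IBAC idea rather than the paper's exact mechanism: each value carries a label recording the origins that influenced it, labels join when values combine, and a flow to a network sink is permitted only if the sink is trusted for every origin in the label.

```python
# Minimal sketch of information-flow-based access control (IBAC) for
# mashup-style scripts: values carry a label (the set of origins that
# influenced them), labels join on combination, and a sink may receive a
# value only if it is trusted for every origin in the label.
# Illustrates the general idea, not the paper's exact mechanism.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Labeled:
    value: str
    origins: frozenset = field(default_factory=frozenset)

def combine(a: Labeled, b: Labeled) -> Labeled:
    # Join in the label lattice: union of contributing origins.
    return Labeled(a.value + b.value, a.origins | b.origins)

def send(sink_origin: str, data: Labeled, trust: dict) -> None:
    if data.origins <= trust.get(sink_origin, frozenset()):
        print(f"allowed: send to {sink_origin}")
    else:
        raise PermissionError(f"flow to {sink_origin} leaks {set(data.origins)}")

user_note = Labeled("secret", frozenset({"https://bank.example"}))
ad_text = Labeled("ad", frozenset({"https://ads.example"}))
mixed = combine(user_note, ad_text)

trust = {"https://bank.example": frozenset({"https://bank.example"})}
send("https://bank.example", user_note, trust)     # allowed
try:
    send("https://bank.example", mixed, trust)     # ad origin taints the value
except PermissionError as err:
    print("blocked:", err)
```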
Earth Global Reference Atmospheric Model (GRAM) Overview and Updates: DOLWG Meeting
NASA Technical Reports Server (NTRS)
White, Patrick
2017-01-01
What is Earth-GRAM (Global Reference Atmospheric Model):
- Provides monthly means and standard deviations for any point in the atmosphere, with monthly, geographic, and altitude variation.
- Earth-GRAM is a C++ software package, currently distributed as Earth-GRAM 2016.
- Atmospheric variables include pressure, density, temperature, horizontal and vertical winds, speed of sound, and atmospheric constituents.
- Used by the engineering community for its ability to create atmospheric dispersions at rapid runtime; often embedded in trajectory simulation software.
- Not a forecast model; does not readily capture localized atmospheric effects.
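Earth-GRAM itself is distributed as a C++ package; the Python sketch below only illustrates why monthly means and standard deviations suffice for Monte Carlo trajectory work, namely that dispersed profiles can be drawn around the mean. The numbers are illustrative, and this is not Earth-GRAM's algorithm or interface (a real generator would also correlate perturbations across altitude).

```python
# Illustration of why monthly means plus standard deviations are useful to
# trajectory engineers: dispersed density profiles for Monte Carlo runs.
# NOT Earth-GRAM's algorithm or API; all values are made up.
import numpy as np

rng = np.random.default_rng(42)
altitudes_km = np.array([0.0, 10.0, 20.0, 30.0])
mean_density = np.array([1.225, 0.414, 0.089, 0.018])   # kg/m^3, illustrative
sigma_density = 0.05 * mean_density                     # hypothetical 5% sigma

def dispersed_profile() -> np.ndarray:
    # One perturbed profile; a real generator would correlate altitudes.
    return mean_density + sigma_density * rng.standard_normal(mean_density.size)

profiles = np.array([dispersed_profile() for _ in range(1000)])
print("sample std at 10 km:", profiles[:, 1].std())
```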
OPM: The Open Porous Media Initiative
NASA Astrophysics Data System (ADS)
Flemisch, B.; Flornes, K. M.; Lie, K.; Rasmussen, A.
2011-12-01
The principal objective of the Open Porous Media (OPM) initiative is to develop a simulation suite that is capable of modeling industrially and scientifically relevant flow and transport processes in porous media and of bridging the gap between the different application areas of porous media modeling, including reservoir mechanics, CO2 sequestration, biological systems, and product development of engineered media. The OPM initiative will provide long-lasting, efficient, and well-maintained open-source software for flow and transport in porous media, built on modern software principles. The suite is released under the GNU General Public License (GPL). Our motivation is to provide a means to unite industry and public research on simulation of flow and transport in porous media. For academic users, we seek to provide a software infrastructure that facilitates testing of new ideas on models with industry-standard complexity, while at the same time giving the researcher control over discretization and solvers. Similarly, we aim to accelerate the technology transfer from academic institutions to professional companies by making new research results available as free software of professional standard. The OPM initiative is currently supported by six research groups in Norway and Germany and funded by existing grants from public research agencies as well as from Statoil Petroleum and Total E&P Norge. However, a full-scale development of the OPM initiative requires substantially more funding and the involvement of more research groups and potential end users. In this talk, we will provide an overview of the current activities in the OPM initiative. Special emphasis will be given to demonstrating the synergies achieved by combining the strengths of individual open-source software components. In particular, a new fully implicit solver developed within the DUNE-based simulator DuMux could be enhanced by the ability to read industry-standard Eclipse input files and to run on grids given in corner-point format. Examples taken from the SPE comparative solution projects and CO2 sequestration benchmarks illustrate the current capabilities of the simulation suite.
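To make "control over discretization and solvers" concrete, the sketch below solves single-phase incompressible Darcy flow on a 1D grid with a two-point flux approximation (TPFA), the kind of kernel such simulation suites are built on. It is not OPM's API; the grid, permeability, and boundary pressures are made up.

```python
# Minimal 1D incompressible single-phase Darcy solve with a two-point flux
# approximation (TPFA). Not OPM's API; all physical values are made up.
import numpy as np

n, dx, mu = 50, 1.0, 1.0e-3                 # cells, cell size [m], viscosity [Pa s]
perm = np.full(n, 1.0e-13)                  # permeability [m^2]

# Harmonic-average transmissibility on interior faces (unit cross-section).
trans = 2.0 / (mu * dx) / (1.0 / perm[:-1] + 1.0 / perm[1:])

# Assemble the TPFA system A p = b, then fix pressures in the end cells.
A = np.zeros((n, n))
b = np.zeros(n)
for f, t in enumerate(trans):
    i, j = f, f + 1
    A[i, i] += t; A[j, j] += t
    A[i, j] -= t; A[j, i] -= t
A[0, :] = 0; A[0, 0] = 1; b[0] = 2.0e7      # 200 bar at the left end
A[-1, :] = 0; A[-1, -1] = 1; b[-1] = 1.0e7  # 100 bar at the right end

pressure = np.linalg.solve(A, b)
print(pressure[:5])                          # linear profile for uniform perm
```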
Information Technology: A Tool to Cut Health Care Costs
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Maly, K. J.; Overstreet, C. M.; Foudriat, E. C.
1996-01-01
Old Dominion University embarked on a project to see how current computer technology could be applied to reduce the cost and/or improve the efficiency of health care services. We designed and built a prototype for an integrated medical record system (MRS). The MRS is written in Tool Command Language/Toolkit (Tcl/Tk). While the initial version of the prototype had patient information hard-coded into the system, later versions used an INGRES database for storing patient information. Currently, we have proposed an object-oriented model for implementing the MRS. These projects involve developing information systems for physicians and medical researchers to enhance their ability to provide improved treatment at reduced cost. The move to computerized patient records is well underway: several standards exist for laboratory records, and several groups are working on standards for other portions of the patient record.
Scientific misconduct, the pharmaceutical industry, and the tragedy of institutions.
Cohen-Kohler, Jillian Clare; Esmail, Laura C
2007-09-01
This paper examines how current legislative and regulatory models fail to adequately govern the pharmaceutical industry towards ethical scientific conduct. In the context of a highly profit-driven industry, governments need to ensure that ethical and legal standards are not only in place for companies but also enforceable. Using examples from both industrialized and developing countries, we demonstrate that without sufficient controls there is a risk that corporate behaviour will transgress ethical boundaries. We submit that there is a critical need for urgent drug regulatory reform. Robust regulatory structures must be in place that enforce corporate governance mechanisms, to ensure that pharmaceutical companies maintain ethical standards in drug research and development and in the marketing of pharmaceuticals. The pharmaceutical industry also needs to adopt authentic "corporate social responsibility" policies, as current policies and practices are insufficient.
Dark matter direct detection of a fermionic singlet at one loop
NASA Astrophysics Data System (ADS)
Herrero-García, Juan; Molinaro, Emiliano; Schmidt, Michael A.
2018-06-01
The strong direct detection limits could be pointing to dark matter-nucleus scattering at loop level. We study in detail the prototype example of an electroweak-singlet (Dirac or Majorana) dark matter fermion coupled to an extended dark sector, which is composed of a new fermion and a new scalar. Given the strong limits on colored particles from direct and indirect searches, we assume that the fields of the new dark sector are color singlets. We outline the possible simplified models, including the well-motivated cases in which the extra scalar or fermion is a Standard Model particle, as well as the possible connection to neutrino masses. We compute the contributions to direct detection from the photon, Z, and Higgs penguins for arbitrary quantum numbers of the dark sector. Furthermore, we derive compact expressions in certain limits, i.e., when all new particles are heavier than the dark matter mass and when the fermion running in the loop is light, like a Standard Model lepton. We study in detail the predicted direct detection rate and how current and future direct detection limits constrain the model parameters. In the case where dark matter couples directly to Standard Model leptons, we find an interesting interplay between lepton flavor violation, direct detection, and the observed relic abundance.
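Schematically, and in generic notation rather than the paper's conventions, the three penguins map onto effective operators of the following forms (the photon-penguin charge-radius term exists only for Dirac $\chi$, since the vector current vanishes for Majorana fermions):

```latex
% Schematic loop-induced operators for a singlet DM fermion chi coupling
% to quarks q; generic notation, not the paper's exact conventions.
\begin{align}
  \mathcal{L}_\gamma &\supset \frac{b_\chi}{\Lambda^2}\,
      \bar\chi\gamma^\mu\chi\,\partial^\nu F_{\mu\nu}
      \quad\text{(charge radius; Dirac $\chi$ only)},\\
  \mathcal{L}_Z &\supset c_Z\,\bar\chi\gamma^\mu\gamma_5\chi\,
      \bar q\,\gamma_\mu\!\left(g_V^q - g_A^q\gamma_5\right) q
      \quad\text{($Z$ penguin)},\\
  \mathcal{L}_h &\supset c_h\,\bar\chi\chi\,\bar q q
      \quad\text{(Higgs penguin, spin-independent)}.
\end{align}
```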
[Current situation of the standardization of acupuncture and moxibustion in Taiwan].
Pan, Li-Jia; Cui, Rui; Zhan, Bi-Yu; Liao, Cai-Yan; Cao, Qi-Hua; Li, Gui-Lan; Guo, Yi
2012-09-01
The current situation of the standardization of acupuncture and moxibustion in the Taiwan region is introduced in this paper from three aspects: the state of development of acupuncture and moxibustion standards in Taiwan, the implementation of Taiwan district standards, and the standardization of acupuncture and moxibustion in Taiwan. At present, the relevant standards of acupuncture and moxibustion in Taiwan include only the standard operating procedure for acupuncture and moxibustion, the reference guideline for safe operation in the medical service centers of traditional Chinese medicine, and the faculty standard for Chinese medicine hospitals, etc. It is concluded that the current situation of the standardization of acupuncture and moxibustion reflects weak awareness of standardization within the industry, insufficient enterprise standards, a small number of implemented standards, and narrow coverage.
Izaguirre, Eder; Lin, Tongyan; Shuve, Brian
2017-03-15
Here, we propose new searches for axion-like particles (ALPs) produced in flavor-changing neutral current (FCNC) processes. This proposal exploits the often-overlooked coupling of ALPs to W± bosons, which leads to FCNC production of ALPs even in the absence of a direct coupling to fermions. Our proposed searches for resonant ALP production in decays such as B→K(*)a, a→γγ, and K→πa, a→γγ could greatly improve upon the current sensitivity to ALP couplings to Standard Model particles. Finally, we also determine analogous constraints and discovery prospects for invisibly decaying ALPs.
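The W coupling referred to above is conventionally written as an $aW\widetilde{W}$ operator (normalizations vary between references; this form is schematic):

```latex
% Conventional form of the ALP coupling to W bosons that drives the FCNC
% production; normalization is schematic. i = SU(2) adjoint index.
\mathcal{L} \supset c_{WW}\,\frac{a}{f_a}\,
    W^{i}_{\mu\nu}\,\widetilde{W}^{i\,\mu\nu},
\qquad
\widetilde{W}^{i\,\mu\nu} \equiv
    \tfrac{1}{2}\,\epsilon^{\mu\nu\rho\sigma} W^{i}_{\rho\sigma}.
```

At one loop, W exchange converts this operator into effective $b\to s\,a$ and $s\to d\,a$ vertices, which is why $B\to K^{(*)}a$ and $K\to\pi a$ decays probe the coupling even when the ALP has no tree-level coupling to fermions.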
2013-10-01
a GE unit and 100 images from a Hologic unit. These were reviewed during Dr. Harvey's visit to Toronto in October 2012. The ... patient underwent the standard-of-practice 4-view mammogram. Following this, a different technologist obtained a second craniocaudal image of the left ... project and one related to a current event. Representatives from the project were present to provide information at the Charlottesville Four