Physics at a 100 TeV pp Collider: Standard Model Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mangano, M. L.; Zanderighi, G.; Aguilar Saavedra, J. A.
This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET
The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...
77 FR 61307 - New Postal Product
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-09
...: Transfer Mail Processing Cost Model for Machinable and Irregular Standard Mail Parcels to the Mail Processing Cost Model for Parcel Select/Parcel Return Service. The Postal Service proposes to move the machinable and irregular cost worksheets contained in the Standard Mail parcel mail processing cost model to...
2012-06-01
...checking leads to an improvement in the quality and success of enterprise software development. Business Process Modeling Notation (BPMN) is an emerging standard that allows business processes to be captured in a standardized format. BPMN lacks formal semantics, which leaves many of its features...
Needed: A Standard Information Processing Model of Learning and Learning Processes.
ERIC Educational Resources Information Center
Carifio, James
One strategy to prevent confusion as new paradigms emerge is to have professionals in the area develop and use a standard model of the phenomenon in question. The development and use of standard models in physics, genetics, archaeology, and cosmology have been very productive. The cognitive revolution in psychology and education has produced a…
Fixation of strategies with the Moran and Fermi processes in evolutionary games
NASA Astrophysics Data System (ADS)
Liu, Xuesong; He, Mingfeng; Kang, Yibin; Pan, Qiuhui
2017-10-01
A model of stochastic evolutionary game dynamics with a finite population was built. It combines the standard Moran and Fermi rules with two strategies, cooperation and defection. We obtain expressions for the fixation probabilities and fixation times. The one-third rule, which has been found in the frequency-dependent Moran process, also holds for our model. We obtain the conditions for a strategy to be an evolutionarily stable strategy in our model, and then make a comparison with the standard Moran process. Besides, the analytical results show that, compared with the standard Moran process, fixation occurs with higher probability under a prisoner's dilemma game and a coordination game, but with lower probability under a coexistence game. The simulation results show that the fixation time in our mixed process is lower than in the standard Fermi process. In comparison with the standard Moran process, fixation always takes more time on average in spatial populations, regardless of the game. In addition, the fixation time decreases with the growth of the number of neighbors.
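For readers unfamiliar with the frequency-dependent Moran process that this abstract builds on, the following minimal Python sketch estimates the fixation probability of cooperation by Monte Carlo. The payoff values, selection intensity, and population size are illustrative assumptions, not the paper's settings.

```python
import random

# Illustrative 2x2 game payoffs (assumptions, not the paper's values):
# cooperator vs cooperator -> a, cooperator vs defector -> b,
# defector  vs cooperator -> c, defector  vs defector  -> d
a, b, c, d = 3.0, 1.0, 4.0, 2.0  # a prisoner's dilemma (c > a > d > b)

def payoffs(i, N):
    """Average payoffs of a cooperator and a defector when i of N players cooperate."""
    f_c = (a * (i - 1) + b * (N - i)) / (N - 1)
    f_d = (c * i + d * (N - i - 1)) / (N - 1)
    return f_c, f_d

def moran_fixation(i0, N, w=0.5, runs=20000):
    """Monte Carlo estimate of the probability that cooperation fixates
    in a frequency-dependent Moran process with selection intensity w."""
    fixed = 0
    for _ in range(runs):
        i = i0
        while 0 < i < N:
            f_c, f_d = payoffs(i, N)
            # Fitness: 1 - w + w * payoff (standard Moran convention)
            F_c, F_d = 1 - w + w * f_c, 1 - w + w * f_d
            # Birth proportional to fitness, death uniformly at random
            birth_c = random.random() < i * F_c / (i * F_c + (N - i) * F_d)
            death_c = random.random() < i / N
            i += (birth_c and not death_c) - (death_c and not birth_c)
        fixed += (i == N)
    return fixed / runs

print(moran_fixation(i0=1, N=20))  # fixation probability of a single cooperator
```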
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
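As a concrete illustration of the standardized-interface idea described above, here is a minimal Python sketch of a BMI-style component. The control and description method names follow the published BMI convention, but the toy linear-reservoir physics, the variable name, and the simplified signatures are illustrative assumptions, not CSDMS code.

```python
class BmiLikeModel:
    """Sketch of a process model wrapped with a BMI-style interface.
    Method names follow the BMI convention; the toy linear-reservoir
    physics is purely illustrative."""

    # --- Model control functions ---
    def initialize(self, config_file=None):
        self.time, self.dt, self.end_time = 0.0, 1.0, 10.0
        self.storage = 100.0   # state variable [mm]
        self.k = 0.1           # recession coefficient [1/day]

    def update(self):
        runoff = self.k * self.storage
        self.storage -= runoff * self.dt
        self.time += self.dt

    def finalize(self):
        pass

    # --- Model description functions ---
    def get_output_var_names(self):
        # CSDMS Standard Names pair an object with a quantity (illustrative)
        return ("land_surface_water__runoff_volume_flux",)

    def get_var_units(self, name):
        return {"land_surface_water__runoff_volume_flux": "mm d-1"}[name]

    def get_value(self, name):
        return self.k * self.storage

# A caller (e.g. a framework) drives the model without knowing its internals:
m = BmiLikeModel()
m.initialize()
while m.time < m.end_time:
    m.update()
m.finalize()
```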
NASA Astrophysics Data System (ADS)
Putra, A.; Masril, M.; Yurnetti, Y.
2018-04-01
One of the causes of students' low competence achievement in high school physics learning is a learning process that has not been able to develop students' creativity in problem solving. This is reflected in teachers' lesson plans, which are not in accordance with the National Education Standard. This study aims to produce a reconstruction model of physics learning that fulfills the competency standards, content standards, and assessment standards in accordance with the applicable curriculum standards. The development process comprises needs analysis, product design, product development, implementation, and product evaluation. The research process involved two peer reviewers, four expert judges, and two study groups of high school students in Padang. The data obtained, both qualitative and quantitative, were collected through documentation, observation, questionnaires, and tests. The research has so far reached the product development stage, yielding a physics lesson plan model that meets content validity and construct validity in terms of the fulfillment of Basic Competences, Content Standards, Process Standards, and Assessment Standards.
Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin
2017-06-28
Access to patient data within a hospital or between hospitals is still problematic, since a variety of information systems is in use, applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time and resource consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We apply a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected by interviews with a neurosurgeon and patient data analysis. For modelling of patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR Foundation. The patient information model forms the core of the development process, which comprises the EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS. The reuse-oriented process model provides a suitable abstraction of both the modeling and the development of an EHR-centered EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the basis of the model-driven development. The EHRs and the EHRS were developed with openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability. The standardized representation visualizes data and can in this way support the decision process of clinicians.
.NET INTEROPERABILITY GUIDELINES
The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
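To make the mediation step concrete, the following hypothetical Python fragment shows how a framework-side caller might use BMI description functions to detect a unit mismatch between two coupled components and insert a conversion. The UNIT_FACTORS table, the component objects, and their exact method usage are illustrative assumptions, not CSDMS framework code.

```python
# Hypothetical framework-side coupling step: the caller matches an output
# variable of one component to an input variable of another by CSDMS
# Standard Name and mediates a unit mismatch with a conversion service.
UNIT_FACTORS = {("mm d-1", "m s-1"): 0.001 / 86400.0}  # illustrative table

def couple_step(producer, consumer, std_name):
    value = producer.get_value(std_name)          # BMI getter
    u_out = producer.get_var_units(std_name)      # BMI description function
    u_in = consumer.get_var_units(std_name)
    if u_out != u_in:                             # mediate the difference
        value *= UNIT_FACTORS[(u_out, u_in)]
    consumer.set_value(std_name, value)           # BMI setter
    consumer.update()                             # advance one time step
```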
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
The Emerging Importance of Business Process Standards in the Federal Government
2006-02-23
delivers enough value for its commercialization into the general industry. Today, we are seeing standards such as SOA, BPMN and BPEL hit that...Process Modeling Notation (BPMN) and the Business Process Execution Language (BPEL). BPMN provides a standard representation for capturing and...execution. The combination of BPMN and BPEL offers organizations the potential to standardize processes in a distributed environment, enabling
A process-based standard for the Solar Energetic Particle Event Environment
NASA Astrophysics Data System (ADS)
Gabriel, Stephen
For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which could not only take quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have to be clearly and unambiguously defined and also subject to peer review. If the model meets all of these requirements, then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; most importantly, however, it allows something which so far has been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult, if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one. Hence, when it comes to using these models for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who is in all likelihood not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition and a resulting standard event list. The standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results, when compared, will depend only on the statistical model and not on the data set or event definition.
Reliability Analysis and Standardization of Spacecraft Command Generation Processes
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Grenander, Sven; Evensen, Ken
2011-01-01
• In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
• The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
• Applicable human reliability metrics for performing these atomic-level tasks are available.
• The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
• The PRA models are executed using data from human reliability data banks.
• The Periodic Table is related to the PRA models via Fault Links.
Measuring health care process quality with software quality measures.
Yildiz, Ozkan; Demirörs, Onur
2012-01-01
Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally and externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although the government is able to issue mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9] and the Analytic Hierarchy Process (AHP), this model is used to measure a firm's capability to fulfill a government standard in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes were collected and processed to find the technological capabilities that the firm should improve to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
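The AHP step mentioned above reduces expert pairwise judgments to priority weights. A minimal sketch of that computation follows, with a made-up 3x3 comparison matrix; the criteria and judgment values are illustrative, not those elicited in the study.

```python
import numpy as np

# Illustrative AHP step: derive priority weights for three technology
# criteria from a pairwise comparison matrix (values are made up).
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)             # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

# Saaty's consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # random consistency index
print(w, CI / RI)                       # CR < 0.1 is conventionally acceptable
```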
Yiu, Sean; Tom, Brian Dm
2017-01-01
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
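The computational crux described here, evaluating a high-dimensional integral as a multivariate normal CDF, can be illustrated in a few lines. The covariance and integration limits below are arbitrary stand-ins, not the psoriatic arthritis model.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Sketch of the computational kernel only: the paper's transformation
# reduces the marginal likelihood's high-dimensional integral to a
# multivariate normal CDF, which standard numerical routines evaluate.
rng = np.random.default_rng(0)
d = 8                                  # dimension of the integral
L = rng.normal(size=(d, d))
Sigma = L @ L.T + d * np.eye(d)        # an arbitrary positive-definite covariance
upper = rng.normal(size=d)             # upper integration limits

# P(Z <= upper) for Z ~ N(0, Sigma): an 8-dimensional integral in one call
p = multivariate_normal(mean=np.zeros(d), cov=Sigma).cdf(upper)
print(p)
```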
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colaneri, Luca
2017-04-01
With the experimental discovery of the Higgs boson, the Standard Model has been considered verified in all its predictions. The Standard Model, though, is still considered an incomplete theory, because it fails to address many theoretical and phenomenological issues. Among those, it does not provide any viable Dark Matter candidate. Many Beyond-Standard-Model theories, such as the Supersymmetric Standard Model, provide possible solutions. In this work we report the experimental observations that led to considering the existence of a new force, mediated by a new massive vector boson, that could address all the observed phenomenology. This new dark force could open an observational channel between the Standard Model and a new Dark Sector, conveyed by the interaction of the Standard Model photon with the massive dark photon, also called the A'. The purpose of this work was to develop an independent study of the background processes and the implementation of an independent event generator, to better understand the kinematics of the particles produced in the process e⁻ + W → e⁻ + W′ + e⁺ + e⁻ and to validate, or invalidate, the official event generator.
ERIC Educational Resources Information Center
Stone, Gregory Ethan; Koskey, Kristin L. K.; Sondergeld, Toni A.
2011-01-01
Typical validation studies on standard setting models, most notably the Angoff and modified Angoff models, have ignored construct development, a critical aspect associated with all conceptualizations of measurement processes. Stone compared the Angoff and objective standard setting (OSS) models and found that Angoff failed to define a legitimate…
Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O
2017-08-01
Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to extend the model to cover more complex situations that might arise during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination makes it possible to depict complex processes with complex decisions and offers a significant advantage for modeling perioperative processes.
Cook, David J; Thompson, Jeffrey E; Suri, Rakesh; Prinsen, Sharon K
2014-01-01
The absence of standardization in the surgical care process, exemplified in a "solution shop" model, can lead to unwarranted variation, increased cost, and reduced quality. A comprehensive effort was undertaken to improve the quality of care around indwelling bladder catheter use following surgery by creating a "focused factory" model within the cardiac surgical practice. Baseline compliance with Surgical Care Improvement Inf-9, removal of the urinary catheter by the end of surgical postoperative day 2, was determined. Comparison of baseline data to postintervention results showed clinically important reductions in the duration of indwelling bladder catheters as well as a marked reduction in practice variation. Following the intervention, Surgical Care Improvement Inf-9 guidelines were met in 97% of patients. Although the clinical quality improvement was notable, the process used to accomplish it (identification of patients suitable for standardized pathways, protocol application, and electronic systems to support the standardized practice model) has potentially greater relevance than the specific clinical results. © 2013 by the American College of Medical Quality.
eSPEM - A SPEM Extension for Enactable Behavior Modeling
NASA Astrophysics Data System (ADS)
Ellner, Ralf; Al-Hilank, Samir; Drexler, Johannes; Jung, Martin; Kips, Detlef; Philippsen, Michael
OMG's SPEM - by means of its (semi-)formal notation - allows for a detailed description of development processes and methodologies, but can only be used for a rather coarse description of their behavior. Concepts for a more fine-grained behavior model are considered out of scope of the SPEM standard and have to be provided by other standards like BPDM/BPMN or UML. However, a coarse granularity of the behavior model often impedes a computer-aided enactment of a process model. Therefore, in this paper we present eSPEM, an extension of SPEM, that is based on the UML meta-model and focused on fine-grained behavior and life-cycle modeling and thereby supports automated enactment of development processes.
New vector-like fermions and flavor physics
Ishiwata, Koji; Ligeti, Zoltan; Wise, Mark B.
2015-10-06
We study renormalizable extensions of the standard model that contain vector-like fermions in a (single) complex representation of the standard model gauge group. There are 11 models where the vector-like fermions Yukawa couple to the standard model fermions via the Higgs field. These models do not introduce additional fine-tunings. They can lead to, and are constrained by, a number of different flavor-changing processes involving leptons and quarks, as well as direct searches. An interesting feature of the models with strongly interacting vector-like fermions is that constraints from neutral meson mixings (apart from CP violation in $K^0$-$\overline{K}^0$ mixing) are not sensitive to higher scales than other flavor-changing neutral-current processes. We identify order $1/(4\pi M)^2$ (where M is the vector-like fermion mass) one-loop contributions to the coefficients of the four-quark operators for meson mixing that are not suppressed by standard model quark masses and/or mixing angles.
Retinal Information Processing for Minimum Laser Lesion Detection and Cumulative Damage
1992-09-17
Wolbarsht, Myron L.
...possible beneficial visual function of the small retinal image movements. B. Visual System Models: Prior models of visual system information processing have...against standard secondary sources whose calibrations can be traced to the National Bureau of Standards. B. Electrophysiological Techniques: Extracellular
Designing an evaluation framework for WFME basic standards for medical education.
Tackett, Sean; Grant, Janet; Mmari, Kristin
2016-01-01
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows variations in measurements to be tracked over time and provides an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
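A minimal sketch of the SPC tracking step follows, assuming simulated coefficient values rather than actual wind tunnel data: each fitted coefficient is monitored on an individuals control chart with 3-sigma limits estimated from the moving range.

```python
import numpy as np

# Sketch: track a fitted calibration coefficient across repeated check
# standard tests with an individuals (X) control chart. The moving-range
# estimate of sigma and the 3-sigma limits are standard SPC practice;
# the data are simulated, not wind tunnel measurements.
rng = np.random.default_rng(1)
coef = 0.52 + 0.004 * rng.normal(size=30)   # coefficient from 30 test entries

mr = np.abs(np.diff(coef))                  # moving ranges of successive tests
sigma_hat = mr.mean() / 1.128               # d2 = 1.128 for subgroups of size 2
center = coef.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((coef > ucl) | (coef < lcl))[0]
print(center, ucl, lcl, out_of_control)
```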
The MP (Materialization Pattern) Model for Representing Math Educational Standards
NASA Astrophysics Data System (ADS)
Choi, Namyoun; Song, Il-Yeol; An, Yuan
Representing natural language with UML has been an important research issue for various reasons. Little work has been done on modeling imperative-mood sentences, which are the sentence structure of math educational standard statements. In this paper, we propose the MP (Materialization Pattern) model, which captures the semantics of English sentences used in math educational standards. The MP model is based on the Reed-Kellogg sentence diagrams and creates MP schemas with the UML notation. The MP model explicitly represents the semantics of the sentences by extracting math concepts and the cognitive processes applied to those concepts from math educational standard statements, and it simplifies modeling. The MP model is also developed to be used for aligning math educational standard statements via schema matching.
Service Oriented Architecture for Coast Guard Command and Control
2007-03-01
Operations BPEL4WS The Business Process Execution Language for Web Services BPMN Business Process Modeling Notation CASP Computer Aided Search Planning...Business Process Modeling Notation (BPMN) provides a standardized graphical notation for drawing business processes in a workflow. Software tools
Standard model of knowledge representation
NASA Astrophysics Data System (ADS)
Yin, Wensheng
2016-09-01
Knowledge representation is at the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, can express process knowledge, and at the same time has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to solve problems involving imprecise and inconsistent knowledge.
A standard satellite control reference model
NASA Technical Reports Server (NTRS)
Golden, Constance
1994-01-01
This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.
2005 v4.3 Technical Support Document
Emissions Modeling for the Final Mercury and Air Toxics Standards Technical Support Document describes how updated 2005 NEI, version 2 emissions were processed for air quality modeling in support of the final Mercury and Air Toxics Standards (MATS).
76 FR 296 - Periodic Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
... part would update the mail processing portion of the Parcel Select/Parcel Return Service cost models...) processing cost model that was filed as Proposal Seven on September 8, 2010. Proposal Thirteen at 1. These... develop the Standard Mail/non-flat machinable (NFM) mail processing cost model. It also proposes to use...
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system can easily monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
Impact of the hard-coded parameters on the hydrologic fluxes of the land surface model Noah-MP
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Samaniego, Luis; Clark, Martyn; Wulfmeyer, Volker; Attinger, Sabine; Thober, Stephan
2016-04-01
Land surface models incorporate a large number of processes, described by physical, chemical and empirical equations. The process descriptions contain a number of parameters that can be soil or plant type dependent and are typically read from tabulated input files. Land surface models may have, however, process descriptions that contain fixed, hard-coded numbers in the computer code, which are not identified as model parameters. Here we searched for hard-coded parameters in the computer code of the land surface model Noah with multiple process options (Noah-MP) to assess the importance of the fixed values on restricting the model's agility during parameter estimation. We found 139 hard-coded values in all Noah-MP process options, which are mostly spatially constant values. This is in addition to the 71 standard parameters of Noah-MP, which mostly get distributed spatially by given vegetation and soil input maps. We performed a Sobol' global sensitivity analysis of Noah-MP to variations of the standard and hard-coded parameters for a specific set of process options; 42 standard parameters and 75 hard-coded parameters were active with the chosen process options. The sensitivities of the hydrologic output fluxes latent heat and total runoff, as well as their component fluxes, were evaluated at twelve catchments of the Eastern United States with very different hydro-meteorological regimes. Noah-MP's hydrologic output fluxes are sensitive to two thirds of its standard parameters. The most sensitive parameter is, however, a hard-coded value in the formulation of soil surface resistance for evaporation, which proved to be oversensitive in other land surface models as well. Surface runoff is sensitive to almost all hard-coded parameters of the snow processes and the meteorological inputs. These parameter sensitivities diminish in total runoff. Assessing these parameters in model calibration would require detailed snow observations or the calculation of hydrologic signatures of the runoff data. Latent heat and total runoff exhibit very similar sensitivities towards standard and hard-coded parameters in Noah-MP because of their tight coupling via the water balance. It should therefore be comparable to calibrate Noah-MP either against latent heat observations or against river runoff data. Latent heat and total runoff are sensitive to both plant and soil parameters. Calibrating only a sub-set of parameters, for example only soil parameters, thus limits the ability to derive realistic model parameters. It is thus recommended to include the most sensitive hard-coded model parameters that were exposed in this study when calibrating Noah-MP.
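A sketch of the Sobol' screening workflow using the SALib package follows; the three parameters, their bounds, and the toy flux function are stand-ins for Noah-MP's 42 standard and 75 hard-coded active parameters and its hydrologic outputs.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Sketch of the Sobol' screening idea: vary standard and hard-coded
# parameters together and rank their influence on a model output.
# Parameter names, bounds and the toy flux function are illustrative.
problem = {
    "num_vars": 3,
    "names": ["soil_resistance_coeff",  # a 'hard-coded' value exposed as a parameter
              "snow_albedo", "rooting_depth"],
    "bounds": [[1.0, 10.0], [0.4, 0.9], [0.2, 2.0]],
}

X = saltelli.sample(problem, 1024)          # Sobol' sampling design
Y = np.array([x[0] ** 0.5 + 2.0 * x[1] + 0.1 * x[0] * x[2] for x in X])

Si = sobol.analyze(problem, Y)              # first-order and total-order indices
print(dict(zip(problem["names"], Si["ST"])))
```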
Mathematical Modeling: A Structured Process
ERIC Educational Resources Information Center
Anhalt, Cynthia Oropesa; Cortez, Ricardo
2015-01-01
Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…
CellML metadata standards, associated tools and repositories
Beard, Daniel A.; Britten, Randall; Cooling, Mike T.; Garny, Alan; Halstead, Matt D.B.; Hunter, Peter J.; Lawson, James; Lloyd, Catherine M.; Marsh, Justin; Miller, Andrew; Nickerson, David P.; Nielsen, Poul M.F.; Nomura, Taishin; Subramanium, Shankar; Wimalaratne, Sarala M.; Yu, Tommy
2009-01-01
The development of standards for encoding mathematical models is an important component of model building and model sharing among scientists interested in understanding multi-scale physiological processes. CellML provides such a standard, particularly for models based on biophysical mechanisms, and a substantial number of models are now available in the CellML Model Repository. However, there is an urgent need to extend the current CellML metadata standard to provide biological and biophysical annotation of the models in order to facilitate model sharing, automated model reduction and connection to biological databases. This paper gives a broad overview of a number of new developments on CellML metadata and provides links to further methodological details available from the CellML website. PMID:19380315
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
Capability Maturity Model IntegrationSM (CMMI®) [Davis 2009]. SM Team Software Process, TSP, and Capability Maturity Model Integration are service...STP Software Test Plan TEP Test and Evaluation Plan TSP Team Software Process V&V verification and validation...Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions Tim Morrow (Software Engineering Institute) Robert Seacord (Software
Creating Royal Australian Navy Standard Operating Procedures using Flow Diagrams
2015-08-01
DST-Group-TR-3137 UNCLASSIFIED Acronyms 4TQ 4TQ Toolkit ABR Australian Book of Reference ADF Australian Defence Force BPMN Business...steps to perform the activity. Object Management Group's (OMG) Business Process Model and Notation (BPMN) [10] is becoming the standard to use when...Department of Defence 10. Object Management Group, Business Process Model and Notation (BPMN), version 2.0. 2011, Object Management Group: http
Using the Modification Index and Standardized Expected Parameter Change for Model Modification
ERIC Educational Resources Information Center
Whittaker, Tiffany A.
2012-01-01
Model modification is oftentimes conducted after discovering a badly fitting structural equation model. During the modification process, the modification index (MI) and the standardized expected parameter change (SEPC) are 2 statistics that may be used to aid in the selection of parameters to add to a model to improve the fit. The purpose of this…
A review of the solar array manufacturing industry costing standards
NASA Technical Reports Server (NTRS)
1977-01-01
The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
In order to use mature object-oriented tools and languages in software process modeling, and to make software process models accord better with industrial standards, it is necessary to study object-oriented modeling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modeling tool, this paper combines Petri net modeling with object-oriented modeling ideas and provides an implementation method for converting Petri-net-based EPMM models into object models based on object-oriented description.
Standardization efforts of digital pathology in Europe.
Rojo, Marcial García; Daniel, Christel; Schrader, Thomas
2012-01-01
EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.
Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.
ERIC Educational Resources Information Center
Schiano, Diane J.; And Others
Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…
Towards Automatic Validation and Healing of Citygml Models for Geometric and Semantic Consistency
NASA Astrophysics Data System (ADS)
Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.
2013-09-01
A steadily growing number of application fields for large 3D city models have emerged in recent years. As in many other domains, data quality is recognized as a key factor for successful business. Quality management is mandatory in the production chain nowadays. Automated domain-specific tools are widely used for validation of business-critical data, but common standards defining correct geometric modeling are still not precise enough to provide a sound basis for data validation of 3D city models. Although the workflow for 3D city models is well established from data acquisition to processing, analysis and visualization, quality management is not yet a standard part of this workflow. Processing data sets with unclear specifications leads to erroneous results and application defects. We show that this problem persists even if the data are standard compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.
Fractional Ornstein-Uhlenbeck for index prices of FTSE Bursa Malaysia KLCI
NASA Astrophysics Data System (ADS)
Chen, Kho Chia; Bahar, Arifah; Ting, Chee-Ming
2014-07-01
This paper studies the Ornstein-Uhlenbeck model that incorporates long-memory stochastic volatility, known as the fractional Ornstein-Uhlenbeck model. The existence of long-range dependence in the index prices of the FTSE Bursa Malaysia KLCI is determined by the Hurst exponent. The empirical distribution of unobserved volatility is estimated using the particle filtering method. The performance of the fractional Ornstein-Uhlenbeck model and the standard Ornstein-Uhlenbeck process was compared. The mean square errors of the fractional Ornstein-Uhlenbeck model indicated that it describes the index prices better than the standard Ornstein-Uhlenbeck process.
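For concreteness, here is a minimal simulation sketch of the fractional Ornstein-Uhlenbeck dynamics dX_t = theta (mu - X_t) dt + sigma dB^H_t, driven by fractional Gaussian noise generated via Cholesky factorization of its covariance; all parameter values are illustrative, not estimates for the KLCI.

```python
import numpy as np

# Sketch of a fractional Ornstein-Uhlenbeck simulation: mean-reverting
# dynamics driven by fractional Brownian motion with Hurst exponent H.
rng = np.random.default_rng(42)
H, theta, mu, sigma = 0.7, 0.5, 1.0, 0.2
n, dt = 500, 1.0 / 250.0

# Fractional Gaussian noise via Cholesky factorization of its covariance:
# gamma(k) = 0.5 * (|k+1|^2H - 2|k|^2H + |k-1|^2H), scaled by dt^2H
k = np.arange(n)
gamma = 0.5 * (np.abs(k - 1) ** (2 * H) - 2 * k ** (2 * H) + (k + 1) ** (2 * H))
cov = gamma[np.abs(k[:, None] - k[None, :])] * dt ** (2 * H)
dB = np.linalg.cholesky(cov) @ rng.normal(size=n)

# Euler scheme for dX = theta * (mu - X) dt + sigma dB^H
X = np.empty(n + 1)
X[0] = mu
for t in range(n):
    X[t + 1] = X[t] + theta * (mu - X[t]) * dt + sigma * dB[t]
```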
NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2013-01-01
The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" and subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.
Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard
ERIC Educational Resources Information Center
Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.
2017-01-01
This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in inconsistent representation of business processes, and the interoperability issues in cap-and-trade mechanisms like PAT, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves, including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders), and how the aforesaid reasons affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, 62325 combined), for information exchange. The detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of a specific avionics hardware/software system. This standard defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
[Establishment of database with standard 3D tooth crowns based on 3DS MAX].
Cheng, Xiaosheng; An, Tao; Liao, Wenhe; Dai, Ning; Yu, Qing; Lu, Peijun
2009-08-01
A database of standard 3D tooth crowns lays the groundwork for dental CAD/CAM systems. In this paper, we design standard tooth crowns in 3DS MAX 9.0 and build a database of these models. First, key lines are collected from standard tooth pictures. We then use 3DS MAX 9.0 to design the digital tooth model based on these lines; during the design process, it is important to refer to the standard plaster tooth model. Testing shows that the standard tooth models designed with this method are accurate and adaptable; furthermore, it is easy to perform operations on the models such as deformation and translation. This method provides a new way to build a database of standard 3D tooth crowns and a basis for dental CAD/CAM systems.
Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi
2016-08-01
Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.
CERT Resilience Management Model (CERT-RMM) V1.1: NIST Special Publication Crosswalk Version 1
2011-11-01
International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC) 27000 series, COBIT, the British Standards Institution’s BS 25999...and ISO 24762; includes quantitative process measurements that can be used to ensure operational resilience processes are performing as intended
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams with technical code. First experiences of using the Business Process Model and Notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (e.g., screening procedures or analytical processes). With the BPM standard, a communication method for sharing process knowledge among laboratories is thus also available. © 2014 Society for Laboratory Automation and Screening.
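To give a concrete flavor of BPMN as laboratory documentation, here is a small Python sketch that extracts task names from a BPMN 2.0 file, e.g. as a building block for the kind of process documentation the paper describes; the file name is a hypothetical placeholder.

    import xml.etree.ElementTree as ET

    NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}
    root = ET.parse("assay.bpmn").getroot()
    for process in root.findall("bpmn:process", NS):
        print("process:", process.get("id"))
        for node in process.iter():
            # Concrete BPMN task elements (userTask, serviceTask, scriptTask,
            # ...) all end in "Task", so a suffix test catches them all.
            if node.tag.endswith("Task"):
                print("  task:", node.get("name", node.get("id")))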
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hewamanage, Samantha Kaushalya
2011-01-01
A model-independent, signature-based search for physics beyond the Standard Model is performed in the photon + jets + missing transverse energy channel in pp̄ collisions at a center-of-mass energy of 1.96 TeV using the CDF II detector. Events with a photon + jets are predicted by the Standard Model and also by many theoretical models beyond the Standard Model. In the Standard Model, the main mechanisms for photon + jets production include quark-antiquark annihilation and quark-gluon scattering. No intrinsic missing transverse energy is present in any of these Standard Model processes. In this search, photon + ≥1 jet and photon + ≥2 jet events are analyzed with and without a minimum requirement on the missing transverse energy. Numerous mass and kinematic distributions are studied and no significant excess over the background prediction is found. All results indicate good agreement with expectations of the Standard Model.
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture, which is being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Organization for Standardization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Architecture for Survivable System Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1991-01-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address the technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture, which is being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Organization for Standardization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Digital data registration and differencing compression system
NASA Technical Reports Server (NTRS)
Ransford, Gary A. (Inventor); Cambridge, Vivien J. (Inventor)
1990-01-01
A process is disclosed for X-ray registration and differencing which results in more efficient compression. Differencing of a registered modeled subject image with a modeled reference image forms a differenced image for compression with conventional compression algorithms. Obtention of a modeled reference image includes modeling a relatively unrelated standard reference image upon a three-dimensional model, which three-dimensional model is also used to model the subject image for obtaining the modeled subject image. The registration process of the modeled subject image and modeled reference image translationally correlates such modeled images for resulting correlation thereof in spatial and spectral dimensions. Prior to compression, a portion of the image falling outside a designated area of interest may be eliminated, for subsequent replenishment with a standard reference image. The compressed differenced image may be subsequently transmitted and/or stored, for subsequent decompression and addition to a standard reference image so as to form a reconstituted or approximated subject image at either a remote location and/or at a later moment in time. Overall effective compression ratios of 100:1 are possible for thoracic X-ray digital images.
Digital Data Registration and Differencing Compression System
NASA Technical Reports Server (NTRS)
Ransford, Gary A. (Inventor); Cambridge, Vivien J. (Inventor)
1996-01-01
A process for X-ray registration and differencing results in more efficient compression. Differencing of registered modeled subject image with a modeled reference image forms a differenced image for compression with conventional compression algorithms. Obtention of a modeled reference image includes modeling a relatively unrelated standard reference image upon a three-dimensional model, which three-dimensional model is also used to model the subject image for obtaining the modeled subject image. The registration process of the modeled subject image and modeled reference image translationally correlates such modeled images for resulting correlation thereof in spatial and spectral dimensions. Prior to compression, a portion of the image falling outside a designated area of interest may be eliminated, for subsequent replenishment with a standard reference image. The compressed differenced image may be subsequently transmitted and/or stored, for subsequent decompression and addition to a standard reference image so as to form a reconstituted or approximated subject image at either a remote location and/or at a later moment in time. Overall effective compression ratios of 100:1 are possible for thoracic X-ray digital images.
Digital data registration and differencing compression system
NASA Technical Reports Server (NTRS)
Ransford, Gary A. (Inventor); Cambridge, Vivien J. (Inventor)
1992-01-01
A process for X-ray registration and differencing that results in more efficient compression is discussed. Differencing of a registered modeled subject image with a modeled reference image forms a differenced image for compression with conventional compression algorithms. Obtention of a modeled reference image includes modeling a relatively unrelated standard reference image upon a three-dimensional model, which three-dimensional model is also used to model the subject image for obtaining the modeled subject image. The registration process of the modeled subject image and modeled reference image translationally correlates such modeled images for resulting correlation thereof in spatial and spectral dimensions. Prior to compression, a portion of the image falling outside a designated area of interest may be eliminated, for subsequent replenishment with a standard reference image. The compressed differenced image may be subsequently transmitted and/or stored, for subsequent decompression and addition to a standard reference image so as to form a reconstituted or approximated subject image at either a remote location and/or at a later moment in time. Overall effective compression ratios of 100:1 are possible for thoracic X-ray digital images.
The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH).
García-Rojo, Marcial; Gonçalves, Luís; Blobel, Bernd
2012-01-01
The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH) is a European COST Action that ran from 2007 to 2011. COST Actions are funded by the COST (European Cooperation in the field of Scientific and Technical Research) Agency, supported by the Seventh Framework Programme for Research and Technological Development (FP7) of the European Union. EURO-TELEPATH's main objectives were evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical professionals in a networked environment. The project was organized in four working groups. Working Group 1, "Business modeling in pathology," designed the main pathology processes (Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy) using Business Process Modeling Notation (BPMN). Working Group 2, "Informatics standards in pathology," was dedicated to promoting the development and application of informatics standards in pathology, collaborating with Integrating the Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Working Group 3, "Images: Analysis, Processing, Retrieval and Management," worked on the use of virtual or digital slides, which are fostering the use of image processing and analysis in pathology not only for research purposes but also in daily practice. Working Group 4, "Technology and Automation in Pathology," focused on studying the adequacy of existing technical solutions, including, e.g., the quality of images obtained by slide scanners and the efficiency of image analysis applications. Major outcomes of this Action are the collaboration with international health informatics standardization bodies to foster the development of standards for digital pathology, offering a new approach to workflow analysis based on business process modeling. Health terminology standardization research has become a topic of high interest. Future research should focus on standardization of automatic image analysis and tissue microarray imaging.
NASA Astrophysics Data System (ADS)
Liu, Zhenchen; Lu, Guihua; He, Hai; Wu, Zhiyong; He, Jian
2018-01-01
Reliable drought prediction is fundamental for water resource managers to develop and implement drought mitigation measures. Considering that drought development is closely related to the spatial-temporal evolution of large-scale circulation patterns, we developed a conceptual prediction model of seasonal drought processes based on atmospheric and oceanic standardized anomalies (SAs). Empirical orthogonal function (EOF) analysis is first applied to drought-related SAs at 200 and 500 hPa geopotential height (HGT) and sea surface temperature (SST). Subsequently, SA-based predictors are built based on the spatial pattern of the first EOF modes. This drought prediction model is essentially the synchronous statistical relationship between 90-day-accumulated atmospheric-oceanic SA-based predictors and SPI3 (3-month standardized precipitation index), calibrated using a simple stepwise regression method. Predictor computation is based on forecast atmospheric-oceanic products retrieved from the NCEP Climate Forecast System Version 2 (CFSv2), indicating the lead time of the model depends on that of CFSv2. The model can make seamless drought predictions for operational use after a year-to-year calibration. Model application to four recent severe regional drought processes in China indicates its good performance in predicting seasonal drought development, despite its weakness in predicting drought severity. Overall, the model can be a worthy reference for seasonal water resource management in China.
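A compact sketch of this pipeline, under strong simplifications (synthetic data, only the leading EOF per field, and plain least squares standing in for the paper's stepwise regression):

    import numpy as np

    def leading_eof_predictor(anom):
        # anom: (time, space) standardized anomalies; return the time series
        # of projections onto the leading EOF (first right singular vector).
        _, _, vt = np.linalg.svd(anom, full_matrices=False)
        return anom @ vt[0]

    rng = np.random.default_rng(0)
    t, n = 120, 500                          # 120 months, 500 grid cells
    hgt500 = rng.standard_normal((t, n))     # synthetic 500 hPa HGT anomalies
    sst = rng.standard_normal((t, n))        # synthetic SST anomalies
    spi3 = rng.standard_normal(t)            # stand-in for observed SPI3

    X = np.column_stack([np.ones(t),
                         leading_eof_predictor(hgt500),
                         leading_eof_predictor(sst)])
    beta, *_ = np.linalg.lstsq(X, spi3, rcond=None)
    spi3_hat = X @ beta                      # calibrated synchronous estimate

In operational use the predictor fields would come from CFSv2 forecasts rather than random numbers, which is what gives the scheme its lead time.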
Specifications of Standards in Systems and Synthetic Biology.
Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan
2015-09-04
Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the ‘COmputational Modeling in BIology’ NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. There are two yearly meetings: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings with a focus on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far, the different standards were published and made accessible through the standards' web pages or preprint services. The aim of this special issue is to provide a single, easily accessible, and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology; it will be published annually to provide an opportunity for standards development groups to communicate updated specifications.
A Standard Kinematic Model for Flight Simulation at NASA Ames
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1975-01-01
A standard kinematic model for aircraft simulation exists at NASA Ames on a variety of computer systems, one of which is used to control the Flight Simulator for Advanced Aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems, such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.
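As a flavor of what a standard kinematic model must provide, below is a generic quaternion attitude-propagation step in Python; this is textbook strapdown kinematics with simple Euler integration, not the FSAA code itself.

    import numpy as np

    def quat_derivative(quat, rates):
        # quat = [q0, q1, q2, q3] attitude quaternion; rates = (p, q, r)
        # body-axis angular rates in rad/s.
        p, q, r = rates
        omega = np.array([[0.0,  -p,  -q,  -r],
                          [p,   0.0,   r,  -q],
                          [q,    -r, 0.0,   p],
                          [r,     q,  -p, 0.0]])
        return 0.5 * omega @ quat

    def step(quat, rates, dt):
        quat = quat + dt * quat_derivative(quat, rates)  # Euler integration
        return quat / np.linalg.norm(quat)               # renormalize

    quat = np.array([1.0, 0.0, 0.0, 0.0])                # wings-level attitude
    quat = step(quat, rates=(0.0, 0.0, 0.1), dt=0.02)    # gentle yaw, 50 Hz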
The accuracy of ultrashort echo time MRI sequences for medical additive manufacturing.
van Eijnatten, Maureen; Rijkhorst, Erik-Jan; Hofman, Mark; Forouzanfar, Tymour; Wolff, Jan
2016-01-01
Additively manufactured bone models, implants and drill guides are becoming increasingly popular amongst maxillofacial surgeons and dentists. To date, such constructs are commonly manufactured using CT technology, which involves ionizing radiation. Recently, ultrashort echo time (UTE) MRI sequences have been developed that allow radiation-free imaging of facial bones. The aim of the present study was to assess the feasibility of UTE MRI sequences for medical additive manufacturing (AM). Three morphologically different dry human mandibles were scanned using a CT and an MRI scanner. Additionally, optical scans of all three mandibles were made to acquire a "gold standard". All CT and MRI scans were converted into Standard Tessellation Language (STL) models and geometrically compared with the gold standard. To quantify the accuracy of the AM process, the CT, MRI and gold-standard STL models of one of the mandibles were additively manufactured, optically scanned, and compared with the original gold-standard STL model. Geometric differences between all three CT-derived STL models and the gold standard were <1.0 mm. All three MRI-derived STL models generally presented deviations <1.5 mm in the symphyseal and mandibular area. The AM process introduced minor deviations of <0.5 mm. This study demonstrates that MRI using UTE sequences is a feasible alternative to CT in generating STL models of the mandible and would therefore be suitable for surgical planning and AM. Further in vivo studies are necessary to assess the usability of UTE MRI sequences in clinical settings.
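A sketch of the geometric comparison step, assuming the numpy-stl and scipy packages and hypothetical file names: sample both STL surfaces at their triangle vertices and report nearest-neighbour deviations against the gold standard.

    import numpy as np
    from scipy.spatial import cKDTree
    from stl import mesh  # numpy-stl

    gold = mesh.Mesh.from_file("optical_gold_standard.stl").vectors.reshape(-1, 3)
    test = mesh.Mesh.from_file("ute_mri_mandible.stl").vectors.reshape(-1, 3)

    # Distance from every test-mesh vertex to its nearest gold-standard vertex.
    d, _ = cKDTree(gold).query(test)
    print(f"mean deviation {d.mean():.2f} mm, "
          f"95th percentile {np.percentile(d, 95):.2f} mm")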
Fox Valley Technical College Quality First Process Model.
ERIC Educational Resources Information Center
Fox Valley Technical Coll., Appleton, WI.
An overview is provided of the Quality First Process Model developed by Fox Valley Technical College (FVTC), Wisconsin, to provide guidelines for quality instruction and service consistent with the highest educational standards. The 16-step model involves activities that should be adaptable to any organization. The steps of the quality model are…
Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.
ERIC Educational Resources Information Center
Pawlowski, Jan M.
Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…
Standard model light-by-light scattering in SANC: Analytic and numeric evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.r
2010-11-15
The implementation of the Standard Model process γγ → γγ through fermion and boson loops into the framework of the SANC system, and the additional precomputation modules used for the calculation of massive box diagrams, are described. The computation of this process takes into account the nonzero mass of the loop particles. The covariant and helicity amplitudes for this process, some particular cases of the D₀ and C₀ Passarino-Veltman functions, and also numerical results of the corresponding SANC module evaluation are presented. Whenever possible, the results are compared with those existing in the literature.
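For reference, the scalar Passarino-Veltman functions mentioned above are one-loop integrals; in one common convention (normalizations vary between packages, so this is a reminder rather than SANC's exact definition), the three-point function reads

$$ C_0\big(p_1^2,\,p_2^2,\,(p_1+p_2)^2;\,m_1^2,\,m_2^2,\,m_3^2\big) = \frac{1}{i\pi^2}\int \mathrm{d}^4q\; \frac{1}{\big(q^2-m_1^2\big)\big((q+p_1)^2-m_2^2\big)\big((q+p_1+p_2)^2-m_3^2\big)}, $$

with D₀ defined analogously with a fourth propagator in the denominator.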
Template for success: using a resident-designed sign-out template in the handover of patient care.
Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P
2011-01-01
Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Bai, Yu; Katahira, Kentaro; Ohira, Hideki
2014-01-01
Humans are capable of correcting their actions based on actions performed in the past, and this ability enables them to adapt to a changing environment. The computational field of reinforcement learning (RL) has provided a powerful explanation for understanding such processes. Recently, the dual learning system, modeled as a hybrid model that incorporates value updating based on reward-prediction error and learning-rate modulation based on a surprise signal, has gained attention as a model for explaining various neural signals. However, the functional significance of the hybrid model has not been established. In the present study, we used computer simulation to address the functional significance of the hybrid model in a probabilistic reversal learning task. The hybrid model was found to perform better than the standard RL model over a wide range of parameter settings. These results suggest that the hybrid model is more robust against the mistuning of parameters compared with the standard RL model when decision-makers continue to learn stimulus-reward contingencies that can change abruptly. The parameter-fitting results also indicated that the hybrid model fit better than the standard RL model for more than 50% of the participants, which suggests that the hybrid model has more explanatory power for the behavioral data than the standard RL model. PMID:25161635
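One common formalization of such a hybrid model (a sketch with illustrative parameter names, not necessarily the paper's exact equations) couples a Rescorla-Wagner value update to a Pearce-Hall style learning rate driven by the absolute prediction error:

    import numpy as np

    def hybrid_update(V, alpha, choice, reward, eta=0.3, kappa=0.5):
        delta = reward - V[choice]                    # reward-prediction error
        alpha = eta * abs(delta) + (1 - eta) * alpha  # surprise-modulated rate
        V[choice] += kappa * alpha * delta            # value update
        return V, alpha

    rng = np.random.default_rng(1)
    V, alpha = np.zeros(2), 1.0
    p_reward = [0.8, 0.2]                             # stimulus-reward contingency
    for t in range(200):
        if t == 100:
            p_reward.reverse()                        # abrupt reversal
        choice = int(rng.random() < 0.5)              # random choices, for brevity
        reward = float(rng.random() < p_reward[choice])
        V, alpha = hybrid_update(V, alpha, choice, reward)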
NASA Astrophysics Data System (ADS)
Jöckel, P.; Sander, R.; Kerkweg, A.; Tost, H.; Lelieveld, J.
2005-02-01
The development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes requires coupling of the different domains (land, ocean, atmosphere, ...). One strategy is to link existing domain-specific models with a universal coupler, i.e. an independent standalone program organizing the communication between other programs. In many cases, however, a much simpler approach is more feasible. We have developed the Modular Earth Submodel System (MESSy). It comprises (1) a modular interface structure to connect submodels to a base model, (2) an extendable set of such submodels for miscellaneous processes, and (3) a coding standard. MESSy is therefore not a coupler in the classical sense, but exchanges data between a base model and several submodels within one comprehensive executable. The internal complexity of the submodels is controllable in a transparent and user-friendly way. This provides remarkable new possibilities to study feedback mechanisms (by two-way coupling). Note that the MESSy approach and the coupler approach can be combined. For instance, an atmospheric model implemented according to the MESSy standard could easily be coupled to an ocean model by means of an external coupler. The vision is to ultimately form a comprehensive ESM which includes a large set of submodels, and a base model which contains only a central clock and runtime control. This can be reached stepwise, since each process can be included independently. Starting from an existing model, process submodels can be reimplemented according to the MESSy standard. This procedure guarantees the availability of a state-of-the-art model for scientific applications at any time of the development. In principle, MESSy can be implemented into any kind of model, either global or regional. So far, the MESSy concept has been applied to the general circulation model ECHAM5 and a number of process box models.
Maximizing your Process Improvement ROI through Harmonization
2008-03-01
ISO 12207) provide comprehensive guidance on what system and software engineering processes are needed. The frameworks of Six Sigma provide specific...reductions. Their veloci-Q Enterprise integrated system includes ISO 9001, CMM, P-CMM, TL9000, British Standard 7799, and Six Sigma. They estimate a 30...at their discretion. And they chose to blend process maturity models and ISO standards to support their objective regarding the establishment of
Building Dynamic Conceptual Physics Understanding
ERIC Educational Resources Information Center
Trout, Charlotte; Sinex, Scott A.; Ragan, Susan
2011-01-01
Models are essential to the learning and doing of science, and systems thinking is key to appreciating many environmental issues. The National Science Education Standards include models and systems in their unifying concepts and processes standard, while the AAAS Benchmarks include them in their common themes chapter. Hyerle and Marzano argue for…
Travaux Neuchatelois de Linguistique (TRANEL) (Neuchatel Working Papers in Linguistics), Volume 14.
ERIC Educational Resources Information Center
Py, Bernard, Ed.; Rubattel, Christian, Ed.
1989-01-01
Three papers in linguistics, all in French, are presented. "La délocutivité lexicale en français standard: esquisse d'un modèle dérivationnel" ("Lexical Delocutivity in Standard French: Sketch of a Derivational Model"), by Marc Bonhomme, examines the process by which certain expressions become neologisms. "La terminologie…
A comparison of BPMN 2.0 with other notations for manufacturing processes
NASA Astrophysics Data System (ADS)
García-Domínguez, A.; Marcos, Mariano; Medina, I.
2012-04-01
In order to study their current practices and improve on them, manufacturing firms need to view their processes from several viewpoints at various abstraction levels. Several notations have been developed for this purpose, such as Value Stream Mappings or IDEF models. More recently, the BPMN 2.0 standard from the Object Management Group has been proposed for modeling business processes. A process organizes several activities (manual or automatic) into a single higher-level entity, which can be reused elsewhere in the organization. Its potential for standardizing business interactions is well-known, but there is little work on using BPMN 2.0 to model manufacturing processes. In this work some of the previous notations are outlined and BPMN 2.0 is positioned among them after discussing it in more depth. Some guidelines on using BPMN 2.0 for manufacturing are offered, and its advantages and disadvantages in comparison with the other notations are presented.
Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.
Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray
2017-07-11
Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as a widely used approach for modeling these important processes. Though great efforts have been put into developing efficient PBE numerical models, challenges still remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers on ever-improving graphics processing units (GPUs) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved, given that single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, with a significant speedup over the standard CG solver on CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on GPU, with the diagonal format best suited for our standard finite-difference linear systems. Further efficiency may be possible with matrix-free operations and an integrated grid stencil setup specifically tailored for the banded matrices in PBE-specific linear systems.
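To illustrate the winning solver/format combination on the CPU side (this mirrors the algorithm, not the authors' CUDA implementation), a Jacobi-preconditioned CG on a small finite-difference matrix stored in DIA format:

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import LinearOperator, cg

    n = 50                                            # 1-D Poisson, for brevity
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="dia")
    b = np.ones(n)

    # Jacobi preconditioner: divide the residual by the matrix diagonal.
    Minv = LinearOperator((n, n), matvec=lambda r: r / A.diagonal())
    x, info = cg(A, b, M=Minv)
    assert info == 0 and np.allclose(A @ x, b, atol=1e-3)

The DIA format stores only the nonzero diagonals, which matches the banded structure of finite-difference PBE systems and explains why it performed best in the study.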
Light dark matter through assisted annihilation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dey, Ujjal Kumar; Maity, Tarak Nath; Ray, Tirtha Sankar, E-mail: ujjal@cts.iitkgp.ernet.in, E-mail: tarak.maity.physics@gmail.com, E-mail: tirthasankar.ray@gmail.com
2017-03-01
In this paper we investigate light dark matter scenarios where annihilation to Standard Model particles at tree level is kinematically forbidden. In such cases annihilation can be aided by massive Standard Model-like species, called assisters, in the initial state that enhance the available phase space, opening up novel tree-level processes. We investigate the feasibility of such non-standard assisted annihilation processes to reproduce the observed relic density of dark matter. We present a simple scalar dark matter-scalar assister model where this is realised. We find that if the dark matter and assister are relatively degenerate, the required relic density can be achieved for keV-MeV scale dark matter. We briefly discuss the cosmological constraints on such dark matter scenarios.
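For orientation, the textbook freeze-out estimate (a standard approximation, not taken from this paper) ties the relic abundance to the thermally averaged annihilation cross section,

$$ \Omega_{\rm DM}h^2 \;\simeq\; \frac{3\times 10^{-27}\ \mathrm{cm^3\,s^{-1}}}{\langle\sigma v\rangle}, $$

so the assisted channels must generate a sufficiently large ⟨σv⟩ even though the unassisted tree-level annihilation to Standard Model particles is kinematically closed.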
Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs
NASA Astrophysics Data System (ADS)
Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.
2006-12-01
An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter, and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments have been designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing, and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out the key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
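For concreteness, a WPS 1.0.0 Execute request can be issued as a simple key-value-pair GET, as defined by the OGC WPS specification; the endpoint, process identifier, and inputs below are hypothetical stand-ins for the OWS-4 services:

    import requests

    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "SubsetModelOutput",   # hypothetical process name
        "datainputs": "layer=GOCART_aerosol;time=2006-08-01T00:00:00Z;height=500",
    }
    resp = requests.get("https://example.org/wps", params=params, timeout=30)
    print(resp.status_code, resp.headers.get("Content-Type"))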
Drell-Yan process as an avenue to test a noncommutative standard model at the Large Hadron Collider
NASA Astrophysics Data System (ADS)
J, Selvaganapathy; Das, Prasanta Kumar; Konar, Partha
2016-06-01
We study the Drell-Yan process at the Large Hadron Collider in the presence of the noncommutative extension of the Standard Model. Using the Seiberg-Witten map, we calculate the production cross section to first order in the noncommutative parameter Θμν. Although this idea has been evolving for a long time, only a limited amount of phenomenological analysis has been completed, mostly in the context of the linear collider. A notable feature of this nonminimal noncommutative standard model is that it not only modifies the couplings of the SM production channel but also allows additional nonstandard vertices which can play a significant role. Hence, in the Drell-Yan process, as studied in the present analysis, one also needs to account for the gluon fusion process at tree level. Some of the characteristic signatures, such as oscillatory azimuthal distributions, are an outcome of the momentum-dependent effective couplings. We explore the noncommutative scale Λ_NC ≥ 0.4 TeV, considering machine energies ranging from 7 to 13 TeV.
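In such constructions the noncommutativity of spacetime is usually parametrized as (a standard convention consistent with the Λ_NC quoted above, not a result of this paper)

$$ [\hat{x}^\mu,\hat{x}^\nu] \;=\; i\,\Theta^{\mu\nu} \;=\; \frac{i\,c^{\mu\nu}}{\Lambda_{\rm NC}^2}, $$

where c^{μν} is a dimensionless antisymmetric matrix, so that corrections to cross sections enter suppressed by powers of ŝ/Λ_NC².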
[Numerical simulation and operation optimization of biological filter].
Zou, Zong-Sen; Shi, Han-Chang; Chen, Xiang-Qiang; Xie, Xiao-Qing
2014-12-01
BioWin software and two sensitivity analysis methods were used to simulate the denitrification biological filter (DNBF) + biological aerated filter (BAF) process at the Yuandang Wastewater Treatment Plant. Based on the BioWin model of the DNBF + BAF process, operation data from September 2013 were used for sensitivity analysis and model calibration, and operation data from October 2013 were used for model validation. The results indicated that the calibrated model could accurately simulate practical DNBF + BAF processes, and that the most sensitive parameters were those related to biofilm, OHOs, and aeration. After validation and calibration, the model was used for process optimization by simulating operation results under different conditions. The results showed that the best operating condition for discharge standard B was: reflux ratio = 50%, no methanol addition, influent C/N = 4.43; while the best operating condition for discharge standard A was: reflux ratio = 50%, influent COD = 155 mg·L⁻¹ after methanol addition, influent C/N = 5.10.
Falat, Lukas; Marcek, Dusan; Durisova, Maria
2016-01-01
This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with a K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making bad decisions in the decision-making process.
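A condensed sketch of the hybrid idea (with plain least squares and quantile-based centers standing in for the paper's genetic-algorithm and K-means components, and synthetic data in place of USD/CAD ticks):

    import numpy as np

    def rbf_features(x, centers, width):
        # Gaussian radial basis expansion of a scalar input series.
        return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

    rng = np.random.default_rng(2)
    rate = np.cumsum(rng.standard_normal(500)) * 0.001 + 1.3  # synthetic series
    x, y = rate[:-1], rate[1:]                # one-step-ahead training pairs

    centers = np.quantile(x, np.linspace(0.05, 0.95, 10))
    Phi = rbf_features(x, centers, width=x.std() / 2)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # linear output layer

    pred = Phi @ w
    err = y - pred
    ma = np.convolve(err, np.ones(5) / 5, mode="same")  # moving average of errors
    corrected = pred + ma                     # error-enhanced hybrid forecast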
2012-07-01
Information Modeling (BIM) is the process of generating and managing building data during a facility's entire life cycle. New BIM standards for...cycle Building Information Modeling (BIM) as a new standard for building information data repositories can serve as the foundation for automation and... Building Information Modeling (BIM) is defined as "a digital representation of physical and functional
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology includes limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
NASA Astrophysics Data System (ADS)
Thober, S.; Cuntz, M.; Mai, J.; Samaniego, L. E.; Clark, M. P.; Branch, O.; Wulfmeyer, V.; Attinger, S.
2016-12-01
Land surface models incorporate a large number of processes, described by physical, chemical and empirical equations. The agility of the models to react to different meteorological conditions is artificially constrained by having hard-coded parameters in their equations. Here we searched for hard-coded parameters in the computer code of the land surface model Noah with multiple process options (Noah-MP) to assess the model's agility during parameter estimation. We found 139 hard-coded values in all Noah-MP process options in addition to the 71 standard parameters. We performed a Sobol' global sensitivity analysis to variations of the standard and hard-coded parameters. The sensitivities of the hydrologic output fluxes latent heat and total runoff, their component fluxes, as well as photosynthesis and sensible heat were evaluated at twelve catchments of the Eastern United States with very different hydro-meteorological regimes. Noah-MP's output fluxes are sensitive to two-thirds of its standard parameters. The most sensitive parameter is, however, a hard-coded value in the formulation of soil surface resistance for evaporation, which proved to be oversensitive in other land surface models as well. Latent heat and total runoff show very similar sensitivities towards standard and hard-coded parameters. They are sensitive to both soil and plant parameters, which means that model calibrations of hydrologic or land surface models should take both soil and plant parameters into account. Sensible and latent heat exhibit almost the same sensitivities, so that calibration or sensitivity analysis can be performed with either of the two. Photosynthesis has almost the same sensitivities as transpiration, which are different from the sensitivities of latent heat. Including photosynthesis and latent heat in model calibration might therefore be beneficial. Surface runoff is sensitive to almost all hard-coded snow parameters. These sensitivities are, however, diminished in total runoff. It is thus recommended to include the most sensitive hard-coded model parameters exposed in this study when calibrating Noah-MP.
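As an illustration of the screening methodology, a Sobol' analysis with the SALib package on a toy response function; in the study itself the response would come from Noah-MP runs, and the parameter names below are invented:

    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["soil_resistance", "stomatal_k", "snow_albedo"],
        "bounds": [[1.0, 100.0], [0.01, 1.0], [0.4, 0.9]],
    }
    X = saltelli.sample(problem, 512)         # N should be a power of two
    Y = np.sqrt(X[:, 0]) + 10 * X[:, 1] + 0.1 * X[:, 1] * X[:, 2]  # toy response
    Si = sobol.analyze(problem, Y)
    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name:16s} first-order={s1:5.2f} total={st:5.2f}")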
NASA Standard for Models and Simulations: Credibility Assessment Scale
NASA Technical Reports Server (NTRS)
Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody
2009-01-01
As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang, and Steele [iv] provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough recount of the entire development process, major issues, key decisions, and all review processes is available in Ref. [v]. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and gives an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper and are indicated by quotation marks.
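The Credibility Assessment Scale rates an M&S effort on eight factors, each on a 0-4 level scale. As a toy illustration only (the factor names follow NASA-STD-7009, but the example scores and the worst-case roll-up convention are assumptions, not the Standard's prescribed reporting):

    FACTORS = {
        "Verification": 3, "Validation": 2, "Input Pedigree": 3,
        "Results Uncertainty": 2, "Results Robustness": 1,
        "Use History": 3, "M&S Management": 4, "People Qualifications": 3,
    }
    for factor, score in FACTORS.items():
        print(f"{factor:22s} {score}/4")
    # One plausible (illustrative) conservative roll-up: the weakest factor.
    print("worst-case roll-up:", min(FACTORS.values()))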
CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3
2012-06-01
OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata include key information...objectives. Once the processes are identified we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models
Molenaar, Peter C M
2008-01-01
It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid for investigating developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrarily large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.
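The gist of the argument can be reproduced in a few lines: give each subject its own (chaotic) growth dynamics and compare cross-sectional statistics with a single subject's own-time statistics. A toy sketch, with illustrative parameters:

    import numpy as np

    rng = np.random.default_rng(3)
    T, N = 100, 50
    r = rng.uniform(2.5, 3.9, size=N)     # each subject's own growth parameter
    x = np.full(N, 0.5)
    traj = np.empty((T, N))
    for t in range(T):
        traj[t] = x
        x = r * x * (1 - x)               # logistic growth map, per subject

    # Inter-individual (cross-sectional) statistics vs. one subject's
    # intra-individual (own-time) statistics need not agree at all.
    print("cross-section at final time: mean %.3f sd %.3f"
          % (traj[-1].mean(), traj[-1].std()))
    print("subject 0 across time:       mean %.3f sd %.3f"
          % (traj[:, 0].mean(), traj[:, 0].std()))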
Chatrchyan, S; Khachatryan, V; Sirunyan, A M; Tumasyan, A; et al. (CMS Collaboration)
Cutajar, M; Dauncey, P; Davies, G; Della Negra, M; Ferguson, W; Fulcher, J; Futyan, D; Gilbert, A; Guneratne Bryer, A; Hall, G; Hatherell, Z; Hays, J; Iles, G; Jarvis, M; Karapostoli, G; Lyons, L; Magnan, A-M; Marrouche, J; Mathias, B; Nandi, R; Nash, J; Nikitenko, A; Papageorgiou, A; Pela, J; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rogerson, S; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sparrow, A; Stoye, M; Tapper, A; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardle, N; Whyntie, T; Chadwick, M; Cole, J E; Hobson, P R; Khan, A; Kyberd, P; Leggat, D; Leslie, D; Martin, W; Reid, I D; Symonds, P; Teodorescu, L; Turner, M; Hatakeyama, K; Liu, H; Scarborough, T; Charaf, O; Henderson, C; Rumerio, P; Avetisyan, A; Bose, T; Fantasia, C; Heister, A; Lawson, P; Lazic, D; Rohlf, J; Sperka, D; St John, J; Sulak, L; Alimena, J; Bhattacharya, S; Cutts, D; Ferapontov, A; Heintz, U; Jabeen, S; Kukartsev, G; Laird, E; Landsberg, G; Luk, M; Narain, M; Nguyen, D; Segala, M; Sinthuprasith, T; Speer, T; Tsang, K V; Breedon, R; Breto, G; Calderon De La Barca Sanchez, M; Chauhan, S; Chertok, M; Conway, J; Conway, R; Cox, P T; Dolen, J; Erbacher, R; Gardner, M; Houtz, R; Ko, W; Kopecky, A; Lander, R; Miceli, T; Pellett, D; Ricci-Tam, F; Rutherford, B; Searle, M; Smith, J; Squires, M; Tripathi, M; Vasquez Sierra, R; Andreev, V; Cline, D; Cousins, R; Duris, J; Erhan, S; Everaerts, P; Farrell, C; Hauser, J; Ignatenko, M; Jarvis, C; Plager, C; Rakness, G; Schlein, P; Traczyk, P; Valuev, V; Weber, M; Babb, J; Clare, R; Dinardo, M E; Ellison, J; Gary, J W; Giordano, F; Hanson, G; Jeng, G Y; Liu, H; Long, O R; Luthra, A; Nguyen, H; Paramesvaran, S; Sturdy, J; Sumowidagdo, S; Wilken, R; Wimpenny, S; Andrews, W; Branson, J G; Cerati, G B; Cittolin, S; Evans, D; Golf, F; Holzner, A; Kelley, R; Lebourgeois, M; Letts, J; Macneill, I; Mangano, B; Padhi, S; Palmer, C; Petrucciani, G; Pieri, M; Sani, M; Sharma, V; Simon, S; Sudano, E; Tadel, M; Tu, Y; Vartak, A; Wasserbaech, S; Würthwein, F; Yagil, A; Yoo, J; Barge, D; Bellan, R; Campagnari, C; D'Alfonso, M; Danielson, T; Flowers, K; Geffert, P; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lowette, S; Mccoll, N; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; West, C; Apresyan, A; Bornheim, A; Chen, Y; Di Marco, E; Duarte, J; Gataullin, M; Ma, Y; Mott, A; Newman, H B; Rogan, C; Spiropulu, M; Timciuc, V; Veverka, J; Wilkinson, R; Xie, S; Yang, Y; Zhu, R Y; Akgun, B; Azzolini, V; Calamba, A; Carroll, R; Ferguson, T; Iiyama, Y; Jang, D W; Liu, Y F; Paulini, M; Vogel, H; Vorobiev, I; Cumalat, J P; Drell, B R; Edelmaier, C J; Ford, W T; Gaz, A; Heyburn, B; Luiggi Lopez, E; Smith, J G; Stenson, K; Ulmer, K A; Wagner, S R; Alexander, J; Chatterjee, A; Eggert, N; Gibbons, L K; Heltsley, B; Khukhunaishvili, A; Kreis, B; Mirman, N; Nicolas Kaufman, G; Patterson, J R; Ryd, A; Salvati, E; Sun, W; Teo, W D; Thom, J; Thompson, J; Tucker, J; Vaughan, J; Weng, Y; Winstrom, L; Wittich, P; Winn, D; Abdullin, S; Albrow, M; Anderson, J; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Bloch, I; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Elvira, V D; Fisk, I; Freeman, J; Gao, Y; Green, D; Gutsche, O; Hanlon, J; Harris, R M; Hirschauer, J; Hooberman, B; Jindariani, S; Johnson, M; Joshi, U; Kilminster, B; Klima, B; Kunori, S; Kwan, S; Leonidopoulos, C; Linacre, J; Lincoln, D; Lipton, R; Lykken, J; Maeshima, K; Marraffino, J M; Maruyama, S; Mason, D; McBride, P; Mishra, K; Mrenna, S; Musienko, Y; 
Newman-Holmes, C; O'Dell, V; Prokofyev, O; Sexton-Kennedy, E; Sharma, S; Spalding, W J; Spiegel, L; Tan, P; Taylor, L; Tkaczyk, S; Tran, N V; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wu, W; Yang, F; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Bourilkov, D; Chen, M; Cheng, T; Das, S; De Gruttola, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fisher, M; Fu, Y; Furic, I K; Gartner, J; Hugon, J; Kim, B; Konigsberg, J; Korytov, A; Kropivnitskaya, A; Kypreos, T; Low, J F; Matchev, K; Milenovic, P; Mitselmakher, G; Muniz, L; Park, M; Remington, R; Rinkevicius, A; Sellers, P; Skhirtladze, N; Snowball, M; Yelton, J; Zakaria, M; Gaultney, V; Hewamanage, S; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Bochenek, J; Chen, J; Diamond, B; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prosper, H; Veeraraghavan, V; Weinberg, M; Baarmand, M M; Dorney, B; Hohlmann, M; Kalakhety, H; Vodopiyanov, I; Adams, M R; Anghel, I M; Apanasevich, L; Bai, Y; Bazterra, V E; Betts, R R; Bucinskaite, I; Callner, J; Cavanaugh, R; Evdokimov, O; Gauthier, L; Gerber, C E; Hofman, D J; Khalatyan, S; Lacroix, F; Malek, M; O'Brien, C; Silkworth, C; Strom, D; Turner, P; Varelas, N; Akgun, U; Albayrak, E A; Bilki, B; Clarida, W; Duru, F; Griffiths, S; Merlo, J-P; Mermerkaya, H; Mestvirishvili, A; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Onel, Y; Ozok, F; Sen, S; Tiras, E; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bolognesi, S; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Hu, G; Maksimovic, P; Rappoccio, S; Swartz, M; Whitbeck, A; Baringer, P; Bean, A; Benelli, G; Kenny Iii, R P; Murray, M; Noonan, D; Sanders, S; Stringer, R; Tinti, G; Wood, J S; Zhukova, V; Barfuss, A F; Bolton, T; Chakaberia, I; Ivanov, A; Khalil, S; Makouski, M; Maravin, Y; Shrestha, S; Svintradze, I; Gronberg, J; Lange, D; Wright, D; Baden, A; Boutemeur, M; Calvert, B; Eno, S C; Gomez, J A; Hadley, N J; Kellogg, R G; Kirn, M; Kolberg, T; Lu, Y; Marionneau, M; Mignerey, A C; Pedro, K; Peterman, A; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Twedt, E; Apyan, A; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; Dutta, V; Gomez Ceballos, G; Goncharov, M; Hahn, K A; Kim, Y; Klute, M; Krajczar, K; Li, W; Luckey, P D; Ma, T; Nahn, S; Paus, C; Ralph, D; Roland, C; Roland, G; Rudolph, M; Stephans, G S F; Stöckli, F; Sumorok, K; Sung, K; Velicanu, D; Wenger, E A; Wolf, R; Wyslouch, B; Yang, M; Yilmaz, Y; Yoon, A S; Zanetti, M; Cooper, S I; Dahmes, B; De Benedetti, A; Franzoni, G; Gude, A; Kao, S C; Klapoetke, K; Kubota, Y; Mans, J; Pastika, N; Rusack, R; Sasseville, M; Singovsky, A; Tambe, N; Turkewitz, J; Cremaldi, L M; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Avdeeva, E; Bloom, K; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kravchenko, I; Lazo-Flores, J; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Godshalk, A; Iashvili, I; Jain, S; Kharchilava, A; Kumar, A; Shipkowski, S P; Smith, K; Alverson, G; Barberis, E; Baumgartel, D; Chasco, M; Haley, J; Nash, D; Trocino, D; Wood, D; Zhang, J; Anastassov, A; Kubik, A; Mucia, N; Odell, N; Ofierzynski, R A; Pollack, B; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Brinkerhoff, A; Hildreth, M; Jessop, C; Karmgard, D J; Kolb, J; Lannon, K; Luo, W; Lynch, S; Marinelli, N; Morse, D M; Pearson, T; Planer, M; Ruchti, R; Slaunwhite, J; Valls, N; Wayne, M; Wolf, M; Bylsma, B; Durkin, L S; Hill, C; Hughes, R; Kotov, K; Ling, T Y; Puigh, D; 
Rodenburg, M; Vuosalo, C; Williams, G; Winer, B L; Adam, N; Berry, E; Elmer, P; Gerbaudo, D; Halyo, V; Hebda, P; Hegeman, J; Hunt, A; Jindal, P; Lopes Pegna, D; Lujan, P; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Quan, X; Raval, A; Safdi, B; Saka, H; Stickland, D; Tully, C; Werner, J S; Zuranski, A; Acosta, J G; Brownson, E; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Zatserklyaniy, A; Alagoz, E; Barnes, V E; Benedetti, D; Bolla, G; Bortoletto, D; De Mattia, M; Everett, A; Hu, Z; Jones, M; Koybasi, O; Kress, M; Laasanen, A T; Leonardo, N; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Shipsey, I; Silvers, D; Svyatkovskiy, A; Vidal Marono, M; Yoo, H D; Zablocki, J; Zheng, Y; Guragain, S; Parashar, N; Adair, A; Boulahouache, C; Ecklund, K M; Geurts, F J M; Padley, B P; Redjimi, R; Roberts, J; Zabel, J; Betchart, B; Bodek, A; Chung, Y S; Covarelli, R; de Barbaro, P; Demina, R; Eshaq, Y; Ferbel, T; Garcia-Bellido, A; Goldenzweig, P; Han, J; Harel, A; Miner, D C; Vishnevskiy, D; Zielinski, M; Bhatti, A; Ciesielski, R; Demortier, L; Goulianos, K; Lungu, G; Malik, S; Mesropian, C; Arora, S; Barker, A; Chou, J P; Contreras-Campana, C; Contreras-Campana, E; Duggan, D; Ferencek, D; Gershtein, Y; Gray, R; Halkiadakis, E; Hidas, D; Lath, A; Panwalkar, S; Park, M; Patel, R; Rekovic, V; Robles, J; Rose, K; Salur, S; Schnetzer, S; Seitz, C; Somalwar, S; Stone, R; Thomas, S; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Eusebi, R; Flanagan, W; Gilmore, J; Kamon, T; Khotilovich, V; Montalvo, R; Osipenkov, I; Pakhotin, Y; Perloff, A; Roe, J; Safonov, A; Sakuma, T; Sengupta, S; Suarez, I; Tatarinov, A; Toback, D; Akchurin, N; Damgov, J; Dragoiu, C; Dudero, P R; Jeong, C; Kovitanggoon, K; Lee, S W; Libeiro, T; Roh, Y; Volobouev, I; Appelt, E; Delannoy, A G; Florez, C; Greene, S; Gurrola, A; Johns, W; Johnston, C; Kurt, P; Maguire, C; Melo, A; Sharma, M; Sheldon, P; Snook, B; Tuo, S; Velkovska, J; Arenton, M W; Balazs, M; Boutle, S; Cox, B; Francis, B; Goodell, J; Hirosky, R; Ledovskoy, A; Lin, C; Neu, C; Wood, J; Yohay, R; Gollapinni, S; Harr, R; Karchin, P E; Kottachchi Kankanamge Don, C; Lamichhane, P; Sakharov, A; Anderson, M; Belknap, D A; Borrello, L; Carlsmith, D; Cepeda, M; Dasu, S; Friis, E; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Herndon, M; Hervé, A; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Mohapatra, A; Ojalvo, I; Palmonari, F; Pierro, G A; Ross, I; Savin, A; Smith, W H; Swanson, J
A search for physics beyond the standard model is performed with events having one or more hadronically decaying τ leptons, highly energetic jets, and large transverse momentum imbalance. The data sample corresponds to an integrated luminosity of 4.98 fb⁻¹ of proton-proton collisions at √s = 7 TeV collected with the CMS detector at the LHC in 2011. The number of observed events is consistent with predictions for standard model processes. Lower limits on the mass of the gluino in supersymmetric models are determined.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-23
... parts of the risk adjustment process--the risk adjustment model, the calculation of plan average... risk adjustment process. The risk adjustment model calculates individual risk scores. The calculation...'' to mean all data that are used in a risk adjustment model, the calculation of plan average actuarial...
Striking a Balance: Students' Tendencies to Oversimplify or Overcomplicate in Mathematical Modeling
ERIC Educational Resources Information Center
Gould, Heather; Wasserman, Nicholas H.
2014-01-01
With the adoption of the "Common Core State Standards for Mathematics" (CCSSM), the process of mathematical modeling has been given increased attention in mathematics education. This article reports on a study intended to inform the implementation of modeling in classroom contexts by examining students' interactions with the process of…
Ontology based standardization of Petri net modeling for signaling pathways.
Takai-Igarashi, Takako
2005-01-01
Given the wide use of Petri nets in modeling and analyzing large, complicated signaling networks, the semantics of Petri nets needs systematization to ensure consistency and reusability of the models. This paper reports on the standardization of units of Petri nets on the basis of an ontology that gives an intrinsic definition to the process of signaling in signaling pathways.
Health level 7 development framework for medication administration.
Kim, Hwa Sun; Cho, Hune
2009-01-01
We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.
Adaptive Filtering Using Recurrent Neural Networks
NASA Technical Reports Server (NTRS)
Parlos, Alexander G.; Menon, Sunil K.; Atiya, Amir F.
2005-01-01
A method for adaptive (or, optionally, nonadaptive) filtering has been developed for estimating the states of complex process systems (e.g., chemical plants, factories, or manufacturing processes at some level of abstraction) from time series of measurements of system inputs and outputs. The method is based partly on the fundamental principles of the Kalman filter and partly on the use of recurrent neural networks. The standard Kalman filter involves an assumption of linearity of the mathematical model used to describe a process system. The extended Kalman filter accommodates a nonlinear process model but still requires linearization about the state estimate. Both the standard and extended Kalman filters involve the often unrealistic assumption that process and measurement noise are zero-mean, Gaussian, and white. In contrast, the present method does not involve any assumptions of linearity of process models or of the nature of process noise; on the contrary, few (if any) assumptions are made about process models, noise models, or the parameters of such models. In this regard, the method can be characterized as one of nonlinear, nonparametric filtering. The method exploits the unique ability of neural networks to approximate nonlinear functions. In a given case, the process model is limited mainly by limitations of the approximation ability of the neural networks chosen for that case. Moreover, despite the lack of assumptions regarding process noise, the method yields minimum-variance filters. In that they do not require statistical models of noise, the neural-network-based state filters of this method are comparable to conventional nonlinear least-squares estimators.
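For orientation, the linear baseline that this method generalizes can be stated compactly. The following is a minimal sketch of one predict/update cycle of the standard Kalman filter, which makes the linearity and Gaussian-noise assumptions discussed above explicit; all symbols are generic placeholders rather than quantities from the paper:

import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    # One predict/update cycle of the standard (linear) Kalman filter.
    # x, P: prior state estimate and covariance; u, z: control input and measurement.
    # A, B, H: linear process and measurement matrices (the linearity assumption).
    # Q, R: process- and measurement-noise covariances (zero-mean Gaussian assumption).
    x_pred = A @ x + B @ u                 # propagate state through the linear model
    P_pred = A @ P @ A.T + Q               # propagate uncertainty
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)  # correct with the measurement residual
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

The neural-network approach replaces the fixed matrices A and H with nonlinear maps learned from the input/output time series, removing the assumptions made explicit here.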
Guide to solar reference spectra and irradiance models
NASA Astrophysics Data System (ADS)
Tobiska, W. Kent
The international standard for determining solar irradiances was published by the International Standards Organization (ISO) in May 2007. The document, ISO 21348 Space Environment (natural and artificial) - Process for determining solar irradiances, describes the process for representing solar irradiances. We report on the next progression of standards work, i.e., the development of a guide that identifies solar reference spectra and irradiance models for use in engineering design or scientific research. This document will be produced as an AIAA Guideline and ISO Technical Report. It will describe the content of the reference spectra and models, uncertainties and limitations, technical basis, data bases from which the reference spectra and models are formed, publication references, and sources of computer code for reference spectra and solar irradiance models, including those which provide spectrally-resolved lines as well as solar indices and proxies and which are generally recognized in the solar sciences. The document is intended to assist aircraft and space vehicle designers and developers, heliophysicists, geophysicists, aeronomers, meteorologists, and climatologists in understanding available models, comparing sources of data, and interpreting engineering and scientific results based on different solar reference spectra and irradiance models.
Modeling aging effects on two-choice tasks: response signal and response time data.
Ratcliff, Roger
2008-12-01
In the response signal paradigm, a test stimulus is presented, and then at one of a number of experimenter-determined times, a signal to respond is presented. Response signal, standard response time (RT), and accuracy data were collected from 19 college-age and 19 60- to 75-year-old participants in a numerosity discrimination task. The data were fit with 2 versions of the diffusion model. Response signal data were modeled by assuming a mixture of processes, those that have terminated before the signal and those that have not terminated; in the latter case, decisions are based on either partial information or guessing. The effects of aging on performance in the regular RT task were explained the same way in the models, with a 70- to 100-ms increase in the nondecision component of processing, more conservative decision criteria, and more variability across trials in drift and the nondecision component of processing, but little difference in drift rate (evidence). In the response signal task, the primary reason for a slower rise in the response signal functions for older participants was variability in the nondecision component of processing. Overall, the results were consistent with earlier fits of the diffusion model to the standard RT task for college-age participants and to the data from aging studies using this task in the standard RT procedure. Copyright (c) 2009 APA, all rights reserved.
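The mixture assumption in the abstract lends itself to a short simulation. The sketch below, with illustrative parameter values not taken from the paper, generates drift-diffusion trials and scores accuracy at a response-signal time: trials whose process has already hit a boundary respond from that decision, while unterminated trials guess at chance (partial-information responding is omitted for brevity):

import numpy as np

rng = np.random.default_rng(0)

def response_signal_accuracy(t_signal, drift=0.2, bound=0.08, ndt=0.3,
                             sigma=0.1, dt=0.001, n_trials=2000):
    # Accuracy at a response-signal lag under the mixture assumption:
    # terminated trials answer from the boundary they hit; the rest guess.
    steps = int(max(t_signal - ndt, 0.0) / dt)   # decision time before the signal
    correct = 0
    for _ in range(n_trials):
        x, hit = 0.0, None
        for _ in range(steps):
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            if x >= bound:
                hit = True       # terminated at the correct boundary
                break
            if x <= -bound:
                hit = False      # terminated at the error boundary
                break
        correct += (rng.random() < 0.5) if hit is None else hit
    return correct / n_trials

for t in (0.3, 0.5, 0.8):
    print(t, response_signal_accuracy(t))    # accuracy grows with signal lag

Slower growth of this function with signal lag is what the paper attributes, for older participants, to variability in the nondecision component.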
Current Status of Multidisciplinary Care in Psoriatic Arthritis in Spain: NEXUS 2.0 Project.
Queiro, Rubén; Coto, Pablo; Joven, Beatriz; Rivera, Raquel; Navío Marco, Teresa; de la Cueva, Pablo; Alvarez Vega, Jose Luis; Narváez Moreno, Basilio; Rodriguez Martínez, Fernando José; Pardo Sánchez, José; Feced Olmos, Carlos; Pujol, Conrad; Rodríguez, Jesús; Notario, Jaume; Pujol Busquets, Manel; García Font, Mercè; Galindez, Eva; Pérez Barrio, Silvia; Urruticoechea-Arana, Ana; Hergueta, Merce; López Montilla, M Dolores; Vélez García-Nieto, Antonio; Maceiras, Francisco; Rodríguez Pazos, Laura; Rubio Romero, Esteban; Rodríguez Fernandez Freire, Lourdes; Luelmo, Jesús; Gratacós, Jordi
2018-02-26
1) To analyze the implementation of multidisciplinary care models in psoriatic arthritis (PsA) patients; 2) to define minimum and excellent standards of care. A survey was sent to clinicians who already performed multidisciplinary care or were in the process of undertaking it, asking about: 1) the type of multidisciplinary care model implemented; 2) the degree, priority and feasibility of the implementation of quality standards for the structure, process and results of care. In 6 regional meetings the results of the survey were presented and discussed, and the final priority of the quality standards for care was defined. In a nominal group meeting, 11 experts (rheumatologists and dermatologists) analyzed the results of the survey and the regional meetings. With this information, they defined which standards of care are currently considered minimum and which are excellent. The simultaneous and parallel models of multidisciplinary care are the most widely implemented, but the implementation of quality standards is highly variable: for structure standards it ranges from 22% to 74%, for process standards from 17% to 54%, and for result standards from 2% to 28%. Of the 25 original quality standards for care, 9 were considered minimum only, 4 excellent, and 12 defined criteria both for a minimum level and for excellence. The definition of minimum and excellent quality standards for care will help achieve the goal of multidisciplinary care for patients with PsA, which is the best healthcare possible. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Needs assessment of land use modeling for FSUTMS, phase 1.
DOT National Transportation Integrated Search
2012-01-06
In Florida, the transportation demand modeling process is done through the Cube software, following the statewide modeling system (The Florida Standard Urban Transportation Model Structure or FSUTMS). Other states and metropolitan planning organizati...
Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.
ERIC Educational Resources Information Center
Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn
This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewart or Deming Cycle, a method that aids in continuous analysis and improvement through a…
Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
1977-01-01
Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for an N(+)-P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is the prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results mapping terminologies.
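The abstract does not spell out the confidence model, so the following is only a plausible minimal sketch of crowdsourced mapping aggregation: each organization's proposed mapping is a vote, and agreement across sources yields a confidence score (all names and codes are hypothetical):

from collections import Counter

def aggregate_mappings(proposals):
    # proposals: organization -> proposed standard code for one local term.
    # Returns the majority code plus the fraction of sources that agree.
    votes = Counter(proposals.values())
    code, count = votes.most_common(1)[0]
    return code, count / len(proposals)

# Three hypothetical hospitals mapping the local term "heart attack" to ICD-10:
print(aggregate_mappings({"hosp_a": "I21", "hosp_b": "I21", "hosp_c": "I25"}))
# -> ('I21', 0.666...)

High-diversity terms like the one above, where sources disagree, are the natural candidates for the user rating and interactive evaluation the abstract describes.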
European standardization effort: interworking the goal
NASA Astrophysics Data System (ADS)
Mattheus, Rudy A.
1993-09-01
In the European Standardization Committee (CEN), the technical committee responsible for standardization activities in Medical Informatics (CEN TC 251) has agreed upon the directions of the scopes to follow in this field. They are described in the Directory of the European Standardization Requirements for Healthcare Informatics and Programme for the Development of Standards, adopted on 02-28-1991 by CEN/TC 251 and approved by CEN/BT. Top-down objectives describe the common framework and items like terminology and security; more bottom-up oriented items describe fields like medical imaging and multi-media. The draft standard is described: the general framework model and object-oriented model, the interworking aspects, the relation to ISO standards, and the DICOM proposal. This paper also discusses the constraints on the standardization work, which in turn influence the standardization process.
Business process modeling in healthcare.
Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd
2012-01-01
The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.
Interference between light and heavy neutrinos for 0 νββ decay in the left–right symmetric model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, Fahim; Neacsu, Andrei; Horoi, Mihai
Neutrinoless double-beta decay is proposed as an important low energy phenomenon that could test beyond the Standard Model physics. There are several potentially competing beyond the Standard Model mechanisms that can induce the process. It thus becomes important to disentangle the different processes. In the present study we consider the interference effect between the light left-handed and heavy right-handed Majorana neutrino exchange mechanisms. The decay rate, and consequently, the phase-space factors for the interference term are derived, based on the left–right symmetric model. The numerical values for the interference phase-space factors for several nuclides are calculated, taking into consideration the relativistic Coulomb distortion of the electron wave function and the finite size of the nucleus. As a result, the variation of the interference effect with the Q-value of the process is studied.
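In the usual 0νββ factorization, the interference enters the inverse half-life as a cross term between the two exchange mechanisms. Schematically, with G the phase-space factors, M the nuclear matrix elements, η the dimensionless lepton-number-violating parameters, and ψ their relative phase (the conventions here are illustrative, not those of the paper):

\left[T^{0\nu}_{1/2}\right]^{-1}
  = G_{\nu}\,\lvert M_{\nu}\,\eta_{\nu}\rvert^{2}
  + G_{N}\,\lvert M_{N}\,\eta_{N}\rvert^{2}
  + 2\cos\psi\; G_{\mathrm{int}}\,\lvert M_{\nu}\,\eta_{\nu}\rvert\,\lvert M_{N}\,\eta_{N}\rvert

The third term is the interference contribution whose phase-space factor G_int the paper computes for several nuclides.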
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper investigates three concepts. First, that a STEP AP242 model with embedded PMI can be used for CAD-to-CAM and CAD-to-CMM data exchange, and that doing so is valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based data interoperability in the pursuit of the MBE vision. Finally, it explores the interaction between CAD and CMM processes and determines whether the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based data interoperability, which would provide maximum value and impact to industry.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... Proposed Information Collection to OMB ``Logic Model'' Grant Performance Report Standard AGENCY: Office of... proposal. Applicants of HUD Federal Financial Assistance are required to indicate intended results and impacts. Grant recipients report against their baseline performance standards. This process standardizes...
Parameter Variability and Distributional Assumptions in the Diffusion Model
ERIC Educational Resources Information Center
Ratcliff, Roger
2013-01-01
If the diffusion model (Ratcliff & McKoon, 2008) is to account for the relative speeds of correct responses and errors, it is necessary that the components of processing identified by the model vary across the trials of a task. In standard applications, the rate at which information is accumulated by the diffusion process is assumed to be normally…
Exploring Yellowstone National Park with Mathematical Modeling
ERIC Educational Resources Information Center
Wickstrom, Megan H.; Carr, Ruth; Lackey, Dacia
2017-01-01
Mathematical modeling, a practice standard in the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010), is a process by which students develop and use mathematics as a tool to make sense of the world around them. Students investigate a real-world situation by asking mathematical questions; along the way, they need to decide how to use…
Coherence Threshold and the Continuity of Processing: The RI-Val Model of Comprehension
ERIC Educational Resources Information Center
O'Brien, Edward J.; Cook, Anne E.
2016-01-01
Common to all models of reading comprehension is the assumption that a reader's level of comprehension is heavily influenced by their standards of coherence (van den Broek, Risden, & Husbye-Hartman, 1995). Our discussion focuses on a subcomponent of the readers' standards of coherence: the coherence threshold. We situate this discussion within…
Pair production processes and flavor in gauge-invariant perturbation theory
NASA Astrophysics Data System (ADS)
Egger, Larissa; Maas, Axel; Sondenheimer, René
2017-12-01
Gauge-invariant perturbation theory is an extension of ordinary perturbation theory which describes strictly gauge-invariant states in theories with a Brout-Englert-Higgs effect. Such gauge-invariant states are composite operators which have necessarily only global quantum numbers. As a consequence, flavor is exchanged for custodial quantum numbers in the Standard Model, recreating the fermion spectrum in the process. Here, we study the implications of such a description, possibly also for the generation structure of the Standard Model. In particular, this implies that scattering processes are essentially bound-state-bound-state interactions, and require a suitable description. We analyze the implications for the pair-production process e⁺e⁻ → f̄f at a linear collider to leading order. We show how ordinary perturbation theory is recovered as the leading contribution. Using a PDF-type language, we also assess the impact of sub-leading contributions. To lowest order, we find that the result is mainly influenced by how large the contribution of the Higgs at large x is. This gives an interesting, possibly experimentally testable, scenario for the formal field theory underlying the electroweak sector of the Standard Model.
Efficient model learning methods for actor-critic control.
Grondman, Ivo; Vaandrager, Maarten; Buşoniu, Lucian; Babuska, Robert; Schuitema, Erik
2012-06-01
We propose two new actor-critic algorithms for reinforcement learning. Both algorithms use local linear regression (LLR) to learn approximations of the functions involved. A crucial feature of the algorithms is that they also learn a process model, and this, in combination with LLR, provides an efficient policy update for faster learning. The first algorithm uses a novel model-based update rule for the actor parameters. The second algorithm does not use an explicit actor but learns a reference model which represents a desired behavior, from which desired control actions can be calculated using the inverse of the learned process model. The two novel methods and a standard actor-critic algorithm are applied to the pendulum swing-up problem, in which the novel methods achieve faster learning than the standard algorithm.
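As a sketch of the model-learning ingredient, local linear regression can be implemented as a memory of observed transitions plus a least-squares fit over the k nearest neighbours of a query; the class below is a minimal illustration under that reading of the abstract, not the authors' code:

import numpy as np

class LLRModel:
    # Memory-based process model: store (input, next-state) samples and
    # predict by fitting a local linear model to the k nearest neighbours.
    def __init__(self, k=10):
        self.k, self.X, self.Y = k, [], []

    def add(self, x, y):
        self.X.append(np.asarray(x, dtype=float))
        self.Y.append(np.asarray(y, dtype=float))

    def predict(self, x):
        X, Y = np.array(self.X), np.array(self.Y)
        x = np.asarray(x, dtype=float)
        idx = np.argsort(np.linalg.norm(X - x, axis=1))[:self.k]
        A = np.hstack([X[idx], np.ones((len(idx), 1))])    # add a bias column
        beta, *_ = np.linalg.lstsq(A, Y[idx], rcond=None)  # local least squares
        return np.append(x, 1.0) @ beta

In the actor-critic setting, predict() would supply the next-state estimates used either for the model-based actor update or, in the second algorithm, for inverting the process model to obtain control actions.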
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
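As a concrete, purely hypothetical illustration of the three interfaces discussed above, a model's metadata record might separate description, execution, and deployment concerns as follows; every field name is invented for this sketch and is not taken from the paper or from the WPS specification:

# Hypothetical metadata record for an encapsulated geo-analysis model.
model_record = {
    "description": {                     # model-description interface
        "name": "SimpleRunoffModel",
        "abstract": "Rainfall-runoff model; rich-text docs and cases linked.",
        "docs": "https://example.org/models/runoff",   # placeholder URL
    },
    "execution": {                       # model-execution interface
        "entrypoint": "run_model.sh",
        "inputs":  [{"id": "rainfall", "type": "geotiff"}],
        "outputs": [{"id": "discharge", "type": "csv"}],
    },
    "deployment": {                      # model-deployment interface
        "os": "linux",
        "dependencies": ["gdal>=3.0", "python>=3.8"],
    },
}

Keeping deployment dependencies in the record is what would let a model user copy a service onto another host, the scenario the authors identify as problematic today.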
Size reduction techniques for vital compliant VHDL simulation models
Rich, Marvin J.; Misra, Ashutosh
2006-08-01
A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise-time and the fall-time of that instance. The system repeats this process for every delay value in the standard delay file that corresponds to every instance of every logic gate in the logic model. The system then outputs a reduced-size standard delay file containing the super generics for every instance of every logic gate in the logic model.
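Reading "super generics" as aggregated worst-case rise/fall delays per instance, the reduction step might look like the minimal sketch below; the input format is invented to stand in for parsed standard-delay-file entries and is not the patented method itself:

from collections import defaultdict

def build_super_generics(delay_records):
    # Collapse many per-path delay entries into one worst-case
    # (rise, fall) pair per gate instance, shrinking the delay data.
    # delay_records: iterable of (instance, rise_ns, fall_ns) tuples.
    worst = defaultdict(lambda: [0.0, 0.0])
    for inst, rise, fall in delay_records:
        worst[inst][0] = max(worst[inst][0], rise)
        worst[inst][1] = max(worst[inst][1], fall)
    return {inst: tuple(rf) for inst, rf in worst.items()}

records = [("u1.nand2", 0.12, 0.10), ("u1.nand2", 0.15, 0.09),
           ("u2.inv", 0.05, 0.06)]
print(build_super_generics(records))
# {'u1.nand2': (0.15, 0.10), 'u2.inv': (0.05, 0.06)}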
Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; ...
2015-05-05
We present for the first time the full automation of collider predictions matched with parton showers at next-to-leading order accuracy in QCD within nontrivial extensions of the standard model. The sole inputs required from the user are the model Lagrangian and the process of interest. As an application of the above, we explore scenarios beyond the standard model where new colored scalar particles can be pair produced in hadron collisions. Using simplified models to describe the new field interactions with the standard model, we present precision predictions for the LHC within the MadGraph5_aMC@NLO framework.
AN OVERVIEW OF THE INTEROPERABILITY ROADMAP FOR COM/.NET-BASED CAPE-OPEN
The CAPE-OPEN standard interfaces have been designed to permit flexibility and modularization of process modeling environments (PMEs) in order to use process modeling components such as unit operation or thermodynamic property models across a range of tools employed in the life...
Command Process Modeling & Risk Analysis
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2011-01-01
Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within command and control. These models include simulation analysis and probabilistic risk assessment models.
CMS Physics Technical Design Report, Volume II: Physics Performance
NASA Astrophysics Data System (ADS)
CMS Collaboration
2007-06-01
CMS is a general purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics based on its unprecedented collision energy and luminosity when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb⁻¹ or less. The analysis tools that have been developed are applied to study in great detail, and with all the methodology of performing an analysis on CMS data, specific benchmark processes upon which to gauge the performance of CMS. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z′ and supersymmetric particles, B_s production, and processes in heavy ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Besides these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model and also in theories beyond the Standard Model, for integrated luminosities ranging from 1 fb⁻¹ to 30 fb⁻¹. The Standard Model processes include QCD, B-physics, diffraction, detailed studies of the top quark properties, and electroweak physics topics such as the W and Z⁰ boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new vector boson high mass states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2–6 describe examples of full analyses, with photons, electrons, muons, jets, missing E_T, B-mesons and τ's, and for quarkonia in heavy ion collisions. Chapters 7–15 describe the physics reach for Standard Model processes, Higgs discovery and searches for new physics beyond the Standard Model.
Development of NASA's Models and Simulations Standard
NASA Technical Reports Server (NTRS)
Bertch, William J.; Zang, Thomas A.; Steele, Martin J.
2008-01-01
From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.
A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management-architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenger, Andreas
2009-01-01
The study of processes involving flavour-changing neutral currents provides a particularly promising probe for New Physics beyond the Standard Model of particle physics. These processes are forbidden at tree level and proceed through loop processes, which are strongly suppressed in the Standard Model. Cross-sections for these processes can be significantly enhanced by contributions from new particles as they are proposed in most extensions of the Standard Model. This thesis presents searches for two flavour-changing neutral current decays, B± → K±μ+μ− and B0d → K*μ+μ−. The analysis was performed on 4.1 fb⁻¹ of data collected by the DØ detector in Run II of the Fermilab Tevatron. Candidate events for the decay B± → K±μ+μ− were selected using a multi-variate analysis technique and the number of signal events was determined by a fit to the invariant mass spectrum. Normalising to the known branching fraction for B± → J/ψK±, a branching fraction of B(B± → K±μ+μ−) = (6.45 ± 2.24 (stat) ± 1.19 (syst)) × 10⁻⁷ was measured. The branching fraction for the decay B0d → K*μ+μ− was determined in a similar way. Normalising to the known branching fraction for B0d → J/ψK*, a branching fraction of B(B0d → K*μ+μ−) = (11.15 ± 3.05 (stat) ± 1.94 (syst)) × 10⁻⁷ was measured. All measurements are in agreement with the Standard Model.
Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio
2017-01-01
The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models based on the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801
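As a rough illustration of the discovery process such models describe, the sketch below Monte-Carlo-simulates an idealized advertiser/scanner pair with no scanning gaps. The timing values are placeholders within the ranges the standard permits, not parameters from the paper.

```python
# Idealized Monte Carlo sketch of BLE neighbour discovery (no scanning
# gaps). Timing values are placeholders within ranges the standard allows.
import random

ADV_INTERVAL = 0.100      # advertising interval (s); spec adds 0-10 ms delay
SCAN_INTERVAL = 0.300     # scanner period (s)
SCAN_WINDOW = 0.030       # active scanning portion of each period (s)
ADV_DURATION = 0.000376   # rough on-air time of one advertising PDU (s)
SIM_TIME = 30.0

def discovery_time(rng: random.Random) -> float:
    """Return the time of the first advertisement fully inside a scan window."""
    t = rng.uniform(0.0, ADV_INTERVAL)  # random phase of the advertiser
    while t < SIM_TIME:
        phase = t % SCAN_INTERVAL       # position within the scanner's cycle
        if phase + ADV_DURATION <= SCAN_WINDOW:
            return t
        t += ADV_INTERVAL + rng.uniform(0.0, 0.010)  # advDelay per the spec
    return float("inf")                 # not discovered within SIM_TIME

rng = random.Random(1)
samples = [discovery_time(rng) for _ in range(10_000)]
found = [s for s in samples if s != float("inf")]
print(f"discovered: {100 * len(found) / len(samples):.1f}%, "
      f"mean latency: {sum(found) / len(found):.3f} s")
```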
Modeling Bloch oscillations in ultra-small Josephson junctions
NASA Astrophysics Data System (ADS)
Vora, Heli; Kautz, Richard; Nam, Sae Woo; Aumentado, Jose
In a seminal paper, Likharev et al. developed a theory for ultra-small Josephson junctions with Josephson coupling energy (Ej) less than the charging energy (Ec) and showed that such junctions demonstrate Bloch oscillations which could be used to make a fundamental current standard that is a dual of the Josephson volt standard. Here, based on the model of Geigenmüller and Schön, we numerically calculate the current-voltage relationship of such an ultra-small junction which includes various error processes present in a nanoscale Josephson junction such as random quasiparticle tunneling events and Zener tunneling between bands. This model allows us to explore the parameter space to see the effect of each process on the width and height of the Bloch step and serves as a guide to determine whether it is possible to build a quantum current standard of a metrological precision using Bloch oscillations.
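The duality exploited here ties the Bloch-oscillation frequency directly to the dc current, mirroring the Josephson frequency-voltage relation; these are the standard textbook relations, shown for reference:

```latex
% Bloch oscillations in an ultra-small junction (E_J < E_C): a dc current I
% drives charge oscillations at the Bloch frequency
f_B = \frac{I}{2e},
% the exact dual of the Josephson voltage-frequency relation
f_J = \frac{2eV}{h}.
```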
Lepton number violation in theories with a large number of standard model copies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich
2011-03-01
We examine lepton number violation (LNV) in theories with a saturated black-hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, violation of lepton number can be a potential phenomenological problem of this N-copy extension of the standard model, since, owing to the low quantum-gravity scale, black holes may induce TeV-scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U(1)_{B-L}. Then, due to the existence of a specific compensation mechanism between contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.
DICOM static and dynamic representation through unified modeling language
NASA Astrophysics Data System (ADS)
Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.
2004-04-01
The DICOM standard, as all standards do, specifies in a generic way the management of digital medical images and their related information in network and storage-media environments. However, understanding the specifications for a particular implementation is not trivial work. This work is therefore about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library for developing DICOM-conformant PACS applications.
NASA Astrophysics Data System (ADS)
Faber, Tracy L.; Garcia, Ernest V.; Lalush, David S.; Segars, W. Paul; Tsui, Benjamin M.
2001-05-01
The spline-based Mathematical Cardiac Torso (MCAT) phantom is a realistic software simulation designed to simulate single photon emission computed tomographic (SPECT) data. It incorporates a heart model of known size and shape; thus, it is invaluable for measuring accuracy of acquisition, reconstruction, and post-processing routines. New functionality has been added by replacing the standard heart model with left ventricular (LV) epicardial and endocardial surface points detected from actual patient SPECT perfusion studies. LV surfaces detected from standard post-processing quantitation programs are converted through interpolation in space and time into new B-spline models. Perfusion abnormalities are added to the model based on results of standard perfusion quantification. The new LV is translated and rotated to fit within existing atria and right ventricular models, which are scaled based on the size of the LV. Simulations were created for five different patients with myocardial infarctions who had undergone SPECT perfusion imaging. Shape, size, and motion of the resulting activity map were compared visually to the original SPECT images. In all cases, size, shape, and motion of simulated LVs matched well with the original images. Thus, realistic simulations with known physiologic and functional parameters can be created for evaluating efficacy of processing algorithms.
Relativistic diffusion processes and random walk models
NASA Astrophysics Data System (ADS)
Dunkel, Jörn; Talkner, Peter; Hänggi, Peter
2007-02-01
The nonrelativistic standard model for a continuous, one-parameter diffusion process in position space is the Wiener process. As is well known, the Gaussian transition probability density function (PDF) of this process is in conflict with special relativity, as it permits particles to propagate faster than the speed of light. A frequently considered alternative is provided by the telegraph equation, whose solutions avoid superluminal propagation speeds but suffer from singular (noncontinuous) diffusion fronts on the light cone, which are unlikely to exist for massive particles. It is therefore advisable to explore other alternatives as well. In this paper, a generalized Wiener process is proposed that is continuous, avoids superluminal propagation, and reduces to the standard Wiener process in the nonrelativistic limit. The corresponding relativistic diffusion propagator is obtained directly from the nonrelativistic Wiener propagator, by rewriting the latter in terms of an integral over actions. The resulting relativistic process is non-Markovian, in accordance with the known fact that nontrivial continuous, relativistic Markov processes in position space cannot exist. Hence, the proposed process defines a consistent relativistic diffusion model for massive particles and provides a viable alternative to the solutions of the telegraph equation.
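For reference, the Gaussian propagator of the standard Wiener process that the paper generalizes is, in one dimension with diffusion constant D:

```latex
% Transition PDF of the nonrelativistic Wiener process:
p(x, t \mid x_0, t_0) =
  \frac{1}{\sqrt{4\pi D (t - t_0)}}
  \exp\!\left[-\frac{(x - x_0)^2}{4 D (t - t_0)}\right]
% Its tails extend beyond |x - x_0| = c (t - t_0), i.e., outside the light
% cone, which is the superluminal-propagation conflict noted above.
```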
Gasqui, Patrick; Trommenschlager, Jean-Marie
2017-08-21
Milk production in dairy cow udders is a complex and dynamic physiological process that has resisted explanatory modelling thus far. The current standard model, Wood's model, is empirical in nature, represents yield in daily terms, and was published in 1967. Here, we have developed a dynamic and integrated explanatory model that describes milk yield at the scale of the milking session. Our approach allowed us to formally represent and mathematically relate biological features of known relevance while accounting for stochasticity and conditional elements in the form of explicit hypotheses, which could then be tested and validated using real-life data. Using an explanatory mathematical and biological model to explore a physiological process and pinpoint potential problems (i.e., "problem finding"), it is possible to filter out unimportant variables that can be ignored, retaining only those essential to generating the most realistic model possible. Such modelling efforts are multidisciplinary by necessity. It is also helpful downstream because model results can be compared with observed data, via parameter estimation using maximum likelihood and statistical testing using model residuals. The process in its entirety yields a coherent, robust, and thus repeatable, model.
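For context, the empirical standard the authors set out to replace, Wood's 1967 model, expresses daily yield y at day t of lactation as a gamma-type curve:

```latex
% Wood's (1967) lactation curve: y(t) is daily milk yield at day t;
% a scales overall yield, b controls the rise to peak, c the decline.
y(t) = a\, t^{b} e^{-c t}
```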
Information system life-cycle and documentation standards, volume 1
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group, and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols, and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Model logic was used extensively to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei
2017-05-01
This study develops a meta-modeling based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, a satisfaction level associated with the flexible standard indicates the degree to which the environmental standard is met. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate, and a stringent risk standard implies a high total pumping rate.
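One common way to formalize such a flexible environmental standard, sketched here generically rather than as the authors' exact formulation, is a fuzzy membership (satisfaction) function for a contaminant concentration c with target c⁻ and tolerance Δ:

```latex
% Degree of satisfaction \mu for a flexible standard: full credit at or
% below the target c^{-}, none beyond the relaxed limit c^{-} + \Delta.
\mu(c) =
\begin{cases}
1, & c \le c^{-} \\[2pt]
\dfrac{c^{-} + \Delta - c}{\Delta}, & c^{-} < c \le c^{-} + \Delta \\[2pt]
0, & c > c^{-} + \Delta
\end{cases}
```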
Using a logical information model-driven design process in healthcare.
Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen
2011-01-01
A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.
Mincarone, Pierpaolo; Leo, Carlo Giacomo; Trujillo-Martín, Maria Del Mar; Manson, Jan; Guarino, Roberto; Ponzini, Giuseppe; Sabina, Saverio
2018-04-01
The importance of working toward quality improvement in healthcare implies an increasing interest in analysing, understanding, and optimizing process logic and the sequences of activities embedded in healthcare processes. Their graphical representation promotes faster learning, higher retention, and better compliance. The study identifies standardized graphical languages and notations applied to patient care processes and investigates their usefulness in the healthcare setting. Peer-reviewed literature up to 19 May 2016. Information complemented by a questionnaire sent to the authors of selected studies. Systematic review conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Five authors extracted results of selected studies. Ten articles met the inclusion criteria. One notation and one language for healthcare process modelling were identified with applications to patient care processes: Business Process Model and Notation and the Unified Modeling Language™. One of the authors of every selected study completed the questionnaire. In the completed questionnaires, comprehensibility for users and facilitation of inter-professional analysis of processes were recognized as major strengths of process modelling in healthcare. Both the notation and the language could increase clarity of presentation thanks to their visual properties, their capacity to manage macro and micro scenarios easily, and their ability to represent process logic clearly and precisely. Both could increase the applicability of guidelines/pathways by representing complex scenarios through charts and algorithms, hence contributing to reducing unjustified practice variations, which negatively impact quality of care and patient safety.
Top quark rare decays via loop-induced FCNC interactions in extended mirror fermion model
NASA Astrophysics Data System (ADS)
Hung, P. Q.; Lin, Yu-Xiang; Nugroho, Chrisna Setyo; Yuan, Tzu-Chiang
2018-02-01
Flavor-changing neutral current (FCNC) interactions for a top quark t decaying into Xq, where X represents a neutral gauge or Higgs boson and q an up or charm quark, are highly suppressed in the Standard Model (SM) due to the Glashow-Iliopoulos-Maiani mechanism. Whilst current limits on the branching ratios of these processes have been established at the order of 10⁻⁴ from the Large Hadron Collider experiments, SM predictions are at least nine orders of magnitude below. In this work, we study some of these FCNC processes in the context of an extended mirror fermion model, originally proposed to implement the electroweak-scale seesaw mechanism for non-sterile right-handed neutrinos. We show that one can probe the process t → Zc for a wide range of parameter space, with branching ratios varying from 10⁻⁶ to 10⁻⁸, comparable with various new physics models including the general two Higgs doublet model with or without flavor violations at tree level, the minimal supersymmetric standard model with or without R-parity, and the extra dimension model.
Towards a framework for business model innovation in health care delivery in developing countries.
Castano, Ramon
2014-12-02
Uncertainty and information asymmetries in health care are the basis for a supply-sided mindset in the health care industry and for a business model for hospitals and doctors' practices; these two models have to be challenged with business model innovation. The three elements which ensure this are standardizability, separability, and patient-centeredness. As scientific evidence advances and outcomes become more predictable, standardization is more feasible. If a standardized process can also be separated from the hospital or doctor's practice, it is more likely that innovative business models will emerge. Patient-centeredness has to go beyond the oversimplified approach of patient satisfaction with amenities and the interpersonal skills of staff, to include the design of structure and processes starting from patients' needs, expectations, and preferences. Six business models are proposed in this article, including those of hospitals and doctors' practices. Unravelling standardized and separable processes from the traditional hospital setting will increase hospital expenditure; however, the new business models would reduce expenses, and the net effect on efficiency could be argued to be positive. Regarding equity in access to high-quality care, most of the innovations described along these business models have emerged in developing countries; it is therefore reasonable to be optimistic regarding their impact on access by the poor. These models provide a promising route to achieve sustainable universal access to high-quality care by the poor. Business model innovation is a necessary step to guarantee the sustainability of health care systems; standardizability, separability, and patient-centeredness are key elements underlying the six business model innovations proposed in this article.
Multi-core processing and scheduling performance in CMS
NASA Astrophysics Data System (ADS)
Hernández, J. M.; Evans, D.; Foulkes, S.
2012-12-01
Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry, and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation per job. The experiment job management system needs to have control over a larger quantum of resource, since multi-core-aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to standard single-core processing workflows.
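The memory saving from shared read-only data can be illustrated with a toy fork-based worker pool. This shows a generic operating-system mechanism (copy-on-write sharing after fork), not CMS's actual event-processing framework.

```python
# Toy illustration of why forked multi-core jobs share memory: large
# read-only data loaded before fork() is shared copy-on-write by workers.
# Generic OS mechanism only; not the CMS framework itself.
import multiprocessing as mp

CONDITIONS = [float(i) for i in range(2_000_000)]  # stand-in for shared
                                                   # geometry/conditions data

def process_events(worker_id: int) -> float:
    # Reads CONDITIONS without copying it; each worker adds only its own
    # small private memory on top of the shared pages.
    return sum(CONDITIONS[worker_id::8])

if __name__ == "__main__":
    ctx = mp.get_context("fork")        # fork start method: POSIX only
    with ctx.Pool(processes=8) as pool:
        partial_sums = pool.map(process_events, range(8))
    print("total:", sum(partial_sums))
```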
The Common Core State Standards Initiative: an Overview
ERIC Educational Resources Information Center
Watt, Michael G.
2011-01-01
The purpose of this study was to evaluate decision making in the Common Core State Standards Initiative as the change process moved from research, development and diffusion activities to adoption of the Common Core State Standards by the states. A decision-oriented evaluation model was used to describe the four stages of planning, structuring,…
The accuracy of ultrashort echo time MRI sequences for medical additive manufacturing
Rijkhorst, Erik-Jan; Hofman, Mark; Forouzanfar, Tymour; Wolff, Jan
2016-01-01
Objectives: Additively manufactured bone models, implants and drill guides are becoming increasingly popular amongst maxillofacial surgeons and dentists. To date, such constructs are commonly manufactured using CT technology that induces ionizing radiation. Recently, ultrashort echo time (UTE) MRI sequences have been developed that allow radiation-free imaging of facial bones. The aim of the present study was to assess the feasibility of UTE MRI sequences for medical additive manufacturing (AM). Methods: Three morphologically different dry human mandibles were scanned using a CT and MRI scanner. Additionally, optical scans of all three mandibles were made to acquire a “gold standard”. All CT and MRI scans were converted into Standard Tessellation Language (STL) models and geometrically compared with the gold standard. To quantify the accuracy of the AM process, the CT, MRI and gold-standard STL models of one of the mandibles were additively manufactured, optically scanned and compared with the original gold-standard STL model. Results: Geometric differences between all three CT-derived STL models and the gold standard were <1.0 mm. All three MRI-derived STL models generally presented deviations <1.5 mm in the symphyseal and mandibular area. The AM process introduced minor deviations of <0.5 mm. Conclusions: This study demonstrates that MRI using UTE sequences is a feasible alternative to CT in generating STL models of the mandible and would therefore be suitable for surgical planning and AM. Further in vivo studies are necessary to assess the usability of UTE MRI sequences in clinical settings. PMID:26943179
NASA Technical Reports Server (NTRS)
Estefan, J. A.; Sovers, O. J.
1994-01-01
The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970's. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi, and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
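Mapping functions of this family share a continued-fraction form that projects a zenith delay to an arbitrary elevation angle E. The shape below is the Chao-style truncation; the coefficients shown are the commonly quoted values for Chao's dry mapping and should be treated as indicative only.

```latex
% Chao-style mapping function: projects a zenith delay \Delta L_z to
% elevation E. Coefficients are the commonly quoted dry values
% (indicative only, not taken from this report).
\Delta L(E) = \Delta L_z \cdot m(E), \qquad
m(E) = \frac{1}{\sin E + \dfrac{0.00143}{\tan E + 0.0445}}
```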
Standardization Process for Space Radiation Models Used for Space System Design
NASA Technical Reports Server (NTRS)
Barth, Janet; Daly, Eamonn; Brautigam, Donald
2005-01-01
The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development, based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision, and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to comprehensively consider the benefits and risks of the alternatives. It was an effective and practical tool for the optimization of reference standards during method development.
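The core AHP computation is small enough to sketch: derive priority weights from a pairwise comparison matrix via its principal eigenvector, then check consistency. The 3x3 matrix below is illustrative, not data from the study.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix
# via the principal eigenvector, plus Saaty's consistency ratio.
# The 3x3 matrix is illustrative, not data from the paper.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],     # criterion 1 vs. criteria 1, 2, 3 (Saaty 1-9 scale)
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("weights:", np.round(w, 3), " CR:", round(ci / ri, 3))  # CR < 0.1 is OK
```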
NASA Astrophysics Data System (ADS)
Wirtz, Stefan
2014-05-01
In soil erosion research, rills are believed to be one of the most efficient forms. They act as preferential flow paths for overland flow and hence become the most efficient sediment sources in a catchment. However, their fraction of the overall detachment in a given area, compared to other soil erosion processes, is contentious. A prerequisite for addressing this subject is the standardization of the measurement methods used for rill erosion quantification. Only by using a standardized method do the results of different studies become comparable and synthesizable into one overall statement. In rill erosion research, such a standardized field method was missing until now. Hence, the first aim of this study is to present an experimental setup that enables us to obtain comparable data about process dynamics in eroding rills under standardized conditions in the field. Using this rill experiment, the runoff efficiency of rills (second aim) and the fraction of rill erosion in total soil loss (third aim) in a catchment are quantified. The erosion rate [g m⁻²] in the rills is between twenty and sixty times higher than on the interrill areas, and the specific discharge [L s⁻¹ m⁻²] in the rills is about 2000 times higher. The identification and quantification of different rill erosion processes are the fourth aim of this project. Gravity-driven processes such as sidewall failure and headcut and knickpoint retreat provide up to 94% of the detached sediment quantity. In soil erosion models, only the incision into the rill's bottom is considered; hence, the modelled results are unsatisfactory. Due to the low quality of soil erosion model results, the fifth aim of the study is to review two basic physical assumptions using the rill experiments. In contrast with the model assumptions, there is no clear linear correlation between any hydraulic parameter and the detachment rate, and the transport rate is capable of exceeding the transport capacity. In conclusion, the results clearly show the need for experimental field data obtained under conditions as close as possible to reality. This is the only way to improve the fundamental knowledge about the function and the impact of the different processes in rill erosion. A better understanding of the process combinations is a fundamental requirement for developing a truly functional soil erosion model. In such a model, spatial and temporal variability as well as the combination of different sub-processes must be considered. Regarding the experimental results of this study, the simulation of natural processes using simple, static mathematical equations appears not to be possible.
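The "linear correlation" assumption under review is typically the excess-shear-stress detachment law used in process-based erosion models; a generic form (e.g., as in WEPP-type models, not necessarily the exact formulation tested here) is:

```latex
% Excess-shear-stress detachment law commonly assumed in process-based
% soil erosion models: D_c is detachment capacity, K_d an erodibility
% coefficient, \tau the flow shear stress, and \tau_c the critical shear
% stress below which no detachment occurs.
D_c = K_d\,(\tau - \tau_c)
```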
The Mediating Relation between Symbolic and Nonsymbolic Foundations of Math Competence
Price, Gavin R.; Fuchs, Lynn S.
2016-01-01
This study investigated the relation between symbolic and nonsymbolic magnitude processing abilities and 2 standardized measures of math competence (WRAT Arithmetic and KeyMath Numeration) in 150 3rd-grade children (mean age 9.01 years). Participants compared sets of dots and pairs of Arabic digits with numerosities 1–9 for relative numerical magnitude. In line with previous studies, performance on both symbolic and nonsymbolic magnitude processing was related to math ability. Performance metrics combining reaction time and accuracy, as well as Weber fractions, were entered into mediation models with standardized math test scores. Results showed that symbolic magnitude processing ability fully mediates the relation between nonsymbolic magnitude processing and math ability, regardless of the performance metric or standardized test. PMID:26859564
Anomalous single production of the fourth generation quarks at the CERN LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciftci, R.
Possible anomalous single production of the fourth standard model generation up- and down-type quarks at the CERN Large Hadron Collider is studied. Namely, pp → u₄(d₄)X with the subsequent u₄ → bW⁺ process followed by the leptonic decay of the W boson, and the d₄ → bγ (and its H.c.) decay channel, are considered. Signatures of these processes and the corresponding standard model backgrounds are discussed in detail. Discovery limits for the quark mass and achievable values of the anomalous coupling strength are determined.
National standards in science education: Teacher perceptions regarding utilization
NASA Astrophysics Data System (ADS)
Fletcher, Carol Louise Parsons
The purpose of this naturalistic study was to determine what factors most influence middle school science teachers' intentions to utilize or ignore national standards as a tool for reform in their classrooms, schools, or districts. Results indicate that teachers with minimal training were unlikely to use national standards documents due to their perceptions of a lack of support from peers and administrators and of a high-stakes state accountability system. Teachers with more extensive training were more inclined to use national standards documents as philosophical guides for reform because they believed in the validity of the recommendations. Implications are discussed, chief among them that short-term professional development may actually do more harm than good if teachers retain or develop unexamined misconceptions about national standards recommendations as a result. In addition, due to the concerns expressed by teachers regarding state curriculum mandates and standardized testing, this study indicates that changes in these external factors must be instituted before teachers will commit themselves to standards-based reforms. It is suggested that staff development focus on opportunities for reflection and application, which will promote conceptual change in teachers. A model predicated on the notion that the process of implementing reform is essentially an issue of promoting conceptual change in teachers is proposed. This model, termed Reform Implementation as Conceptual Change, or RICC, focuses specifically on the cognitive processes teachers may go through when they are exposed to an innovation such as national standards. Stages such as integrated application, accommodation, assimilation, disconnection, and false accommodation are described. The impact that professional development and training may have on the likelihood that teachers will experience these various stages is also discussed. This model serves as a theoretical framework for explaining why some teachers are unlikely to embrace national standards while others choose to utilize them as a tool for reforming science education in their classrooms, schools, or districts. As such, it can be used by reformers to design and diagnostically evaluate the implementation process and its related staff development.
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
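Semantic queries of the kind the mentioned SPARQL service would answer can be sketched with rdflib. The namespace IRI, class, and property names below are hypothetical stand-ins, since the abstract does not give the HP ontology's vocabulary.

```python
# Sketch of a semantic query against a hydrologic-process ontology using
# rdflib. Namespace, class, and property names are hypothetical stand-ins;
# the abstract does not specify the HP ontology's actual vocabulary.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

HP = Namespace("http://example.org/hydrologic-process#")  # assumed IRI

g = Graph()
g.parse("hp_ontology.owl", format="xml")  # local copy (assumed file name)

query = """
    SELECT ?method ?equation WHERE {
        ?method rdfs:subClassOf hp:InfiltrationProcess .
        ?method hp:hasEquation ?equation .
    }
"""
for method, equation in g.query(query, initNs={"rdfs": RDFS, "hp": HP}):
    print(method, "uses", equation)
```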
CFD Modeling of Flow, Temperature, and Concentration Fields in a Pilot-Scale Rotary Hearth Furnace
NASA Astrophysics Data System (ADS)
Liu, Ying; Su, Fu-Yong; Wen, Zhi; Li, Zhi; Yong, Hai-Quan; Feng, Xiao-Hong
2014-01-01
A three-dimensional mathematical model for simulation of flow, temperature, and concentration fields in a pilot-scale rotary hearth furnace (RHF) has been developed using a commercial computational fluid dynamics software, FLUENT. The layer of composite pellets on the hearth is assumed to be a porous media layer with a CO source and an energy sink calculated by an independent mathematical model. User-defined functions are developed and linked to FLUENT to handle the reduction process of the layer of composite pellets. The standard k-ɛ turbulence model in combination with standard wall functions is used for modeling of gas flow. Turbulence-chemistry interaction is taken into account through the eddy-dissipation model. The discrete ordinates model is used for modeling of radiative heat transfer. A comparison is made between the predictions of the present model and the data from a test of the pilot-scale RHF, and reasonable agreement is found. Finally, the flow field, temperature, and CO concentration fields in the furnace are investigated by the model.
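For reference, the standard k-ε model closes the turbulence problem with two transport equations; the compact form with the standard coefficients is shown below (textbook formulation, not specific to this paper's setup):

```latex
% Standard k-epsilon model: transport of turbulent kinetic energy k and
% its dissipation rate \varepsilon; P_k is shear production and
% \mu_t = \rho C_\mu k^2/\varepsilon the eddy viscosity.
\begin{aligned}
\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k \mathbf{u})
  &= \nabla\cdot\Big[\Big(\mu + \frac{\mu_t}{\sigma_k}\Big)\nabla k\Big]
     + P_k - \rho\varepsilon, \\
\frac{\partial(\rho\varepsilon)}{\partial t}
  + \nabla\cdot(\rho\varepsilon\,\mathbf{u})
  &= \nabla\cdot\Big[\Big(\mu + \frac{\mu_t}{\sigma_\varepsilon}\Big)
     \nabla\varepsilon\Big]
     + \frac{\varepsilon}{k}\big(C_{1\varepsilon}P_k
     - C_{2\varepsilon}\rho\varepsilon\big),
\end{aligned}
% with C_\mu = 0.09, C_{1\varepsilon} = 1.44, C_{2\varepsilon} = 1.92,
% \sigma_k = 1.0, \sigma_\varepsilon = 1.3.
```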
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 10, October 2008
2008-10-01
proprietary modeling offerings, there is considerable convergence around Business Process Modeling Notation (BPMN). The research also found strong support across vendors for the Business Process Execution Language standard, though there is also emerging support for direct execution of BPMN through the use of the XML Process Definition Language, an XML serialization of BPMN. Many vendors also provide the needed monitoring of those processes at
Dynamical Evolution of Planetary Embryos
NASA Technical Reports Server (NTRS)
Wetherill, George W.
2002-01-01
During the past decade, progress has been made by relating the 'standard model' for the formation of planetary systems to computational and observational advances. A significant contribution to this has been provided by this grant. The consequence of this is that the rigor of the physical modeling has improved considerably. This has identified discrepancies between the predictions of the standard model and recent observations of extrasolar planets. In some cases, the discrepancies can be resolved by recognition of the stochastic nature of the planetary formation process, leading to variations in the final state of a planetary system. In other cases, it seems more likely that there are major deficiencies in the standard model, requiring our identifying variations to the model that are not so strongly constrained to our Solar System.
A model-driven approach to information security compliance
NASA Astrophysics Data System (ADS)
Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena
2017-06-01
The availability, integrity, and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers, and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in it the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the base for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.
Introducing Python tools for magnetotellurics: MTpy
NASA Astrophysics Data System (ADS)
Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.
2013-12-01
Within the framework of geophysical exploration techniques, the magnetotelluric method (MT) is relatively immature: it is still not as widely used as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency-dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open-source. The setup of this package follows the modular approach of successful software packages like GMT or ObsPy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms. We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia.
[Figure: workflow of MT data processing, showing the MTpy sub-packages for time-series processing; EDI file and impedance tensor handling; connections to modelling/inversion algorithms; impedance tensor interpretation (e.g. phase tensor calculations); and visualisation (e.g. pseudosections or resistivity models).]
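A usage sketch of the package is given below. The module path and call signatures are assumptions for illustration only; MTpy's API has varied across versions, so check the documentation of the installed release.

```python
# Hypothetical MTpy usage sketch. Module paths and call signatures below
# are assumptions for illustration; consult the MTpy documentation for
# the actual API of the installed version.
from mtpy.core.mt import MT            # assumed entry point for one station

mt_obj = MT("example_station.edi")     # read a standard EDI transfer-function
                                       # file (assumed constructor signature)
z = mt_obj.Z                           # impedance tensor object (assumed)
print(mt_obj.station, z.resistivity.shape, z.phase.shape)
```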
Mission Assurance in a Distributed Environment
2009-06-01
• Business Process Modeling Notation (BPMN) – graphical representation of business processes in a workflow
• Unified Modeling Language (UML) – use standard UML diagrams to model the system – component, sequence, and activity diagrams
ERIC Educational Resources Information Center
Weaver, Kim M.
2005-01-01
In this unit, elementary students design and build a lunar plant growth chamber using the Engineering Design Process. The purpose of the unit is to help students understand and apply the design process as it relates to plant growth on the moon. This guide includes six lessons, which meet a number of national standards and benchmarks in…
Ishii, Lisa; Pronovost, Peter J; Demski, Renee; Wylie, Gill; Zenilman, Michael
2016-06-01
An increasing volume of ambulatory surgeries has led to an increase in the number of ambulatory surgery centers (ASCs). Some academic health systems have aligned with ASCs to create a more integrated care delivery system. Yet, these centers are diverse in many areas, including specialty types, ownership models, management, physician employment, and regulatory oversight. Academic health systems then face challenges in integrating these ASCs into their organizations. Johns Hopkins Medicine created the Ambulatory Surgery Coordinating Council in 2014 to manage, standardize, and promote peer learning among its eight ASCs. The Armstrong Institute for Patient Safety and Quality provided support and a model for this organization through its quality management infrastructure. The physician-led council defined a mission and created goals to identify best practices, uniformly provide the highest-quality patient-centered care, and continuously improve patient outcomes and experience across ASCs. Council members built trust and agreed on a standardized patient safety and quality dashboard to report measures that include regulatory, care process, patient experience, and outcomes data. The council addressed unintentional outcomes and process variation across the system and agreed to standard approaches to optimize quality. Council members also developed a process for identifying future goals, standardizing care practices and electronic medical record documentation, and creating quality and safety policies. The early success of the council supports the continuation of the Armstrong Institute model for physician-led quality management. Other academic health systems can learn from this model as they integrate ASCs into their complex organizations.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
The coming commoditization of processes.
Davenport, Thomas H
2005-06-01
Despite the much-ballyhooed increase in outsourcing, most companies are in do-it-yourself mode for the bulk of their processes, in large part because there's no way to compare outside organizations' capabilities with those of internal functions. Given the lack of comparability, it's almost surprising that anyone outsources today. But it's not surprising that cost is by far companies' primary criterion for evaluating outsourcers or that many companies are dissatisfied with their outsourcing relationships. A new world is coming, says the author, and it will lead to dramatic changes in the shape and structure of corporations. A broad set of process standards will soon make it easy to determine whether a business capability can be improved by outsourcing it. Such standards will also help businesses compare service providers and evaluate the costs versus the benefits of outsourcing. Eventually these costs and benefits will be so visible to buyers that outsourced processes will become a commodity, and prices will drop significantly. The low costs and low risk of outsourcing will accelerate the flow of jobs offshore, force companies to reassess their strategies, and change the basis of competition. The speed with which some businesses have already adopted process standards suggests that many previously unscrutinized areas are ripe for change. In the field of technology, for instance, the Carnegie Mellon Software Engineering Institute has developed a global standard for software development processes, called the Capability Maturity Model (CMM). For companies that don't have process standards in place, it makes sense for them to create standards by working with customers, competitors, software providers, businesses that processes may be outsourced to, and objective researchers and standard-setters. Setting standards is likely to lead to the improvement of both internal and outsourced processes.
RECOLA2: REcursive Computation of One-Loop Amplitudes 2
NASA Astrophysics Data System (ADS)
Denner, Ansgar; Lang, Jean-Nicolas; Uccirati, Sandro
2018-03-01
We present the Fortran95 program RECOLA2 for the perturbative computation of next-to-leading-order transition amplitudes in the Standard Model of particle physics and extended Higgs sectors. New theories are implemented via model files in the 't Hooft-Feynman gauge in the conventional formulation of quantum field theory and in the Background-Field method. The present version includes model files for the Two-Higgs-Doublet Model and the Higgs-Singlet Extension of the Standard Model. We support standard renormalization schemes for the Standard Model as well as many commonly used renormalization schemes in extended Higgs sectors. Within these models the computation of next-to-leading-order polarized amplitudes and squared amplitudes, optionally summed over spin and colour, is fully automated for any process. RECOLA2 allows the computation of colour- and spin-correlated leading-order squared amplitudes that are needed in the dipole subtraction formalism. RECOLA2 is publicly available for download at http://recola.hepforge.org.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2017-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated – from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry. PMID:28691120
Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam
2014-01-01
Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The process of curve fitting employed with the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high-resolution scans. The current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed up the process of curve fitting for obtaining the pharmacokinetic parameters. The results show that, using the frequency-domain approach, the process of curve fitting is computationally more efficient than the time-domain approach.
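The speed-up rests on the fact that the standard Tofts model is a convolution, C_t(t) = K^trans (C_p * e^(−k_ep t))(t), which becomes a pointwise product in the frequency domain. The sketch below verifies this equivalence with a synthetic arterial input function and illustrative parameters; it is not the authors' implementation.

```python
# Minimal sketch: the standard Tofts model evaluated in the time domain
# (direct convolution) versus the frequency domain (product of FFTs).
# The arterial input function (AIF) and parameters are synthetic.
import numpy as np

dt = 1.0                                   # sampling interval (s)
t = np.arange(0, 300, dt)                  # 5-minute acquisition
cp = (t / 30.0) * np.exp(1 - t / 30.0)     # toy AIF
ktrans, kep = 0.25 / 60, 0.5 / 60          # illustrative values (1/s)

def tofts_time(cp, ktrans, kep):
    """Direct discrete convolution in the time domain."""
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

def tofts_freq(cp, ktrans, kep):
    """Same model as a pointwise product of FFTs (zero-padded, linear conv)."""
    n = 2 * len(t)
    prod = np.fft.rfft(cp, n) * np.fft.rfft(np.exp(-kep * t), n)
    return ktrans * np.fft.irfft(prod, n)[: len(t)] * dt

assert np.allclose(tofts_time(cp, ktrans, kep),
                   tofts_freq(cp, ktrans, kep), atol=1e-10)
```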
ERIC Educational Resources Information Center
Starns, Jeffrey J.; Rotello, Caren M.; Hautus, Michael J.
2014-01-01
We tested the dual process and unequal variance signal detection models by jointly modeling recognition and source confidence ratings. The 2 approaches make unique predictions for the slope of the recognition memory zROC function for items with correct versus incorrect source decisions. The standard bivariate Gaussian version of the unequal…
Effect of Time Varying Gravity on DORIS processing for ITRF2013
NASA Astrophysics Data System (ADS)
Zelensky, N. P.; Lemoine, F. G.; Chinn, D. S.; Beall, J. W.; Melachroinos, S. A.; Beckley, B. D.; Pavlis, D.; Wimert, J.
2013-12-01
Computations are under way to develop a new time series of DORIS SINEX solutions to contribute to the development of the new realization of the terrestrial reference frame (cf. ITRF2013). One of the improvements envisaged is the application of improved models of time-variable gravity in the background orbit modeling. At GSFC we have developed a time series of spherical harmonics to degree and order 5 (using the GOCO02S model as a base), based on the processing of SLR and DORIS data to 14 satellites from 1993 to 2013. This is compared with the standard approach used in ITRF2008, based on the static model EIGEN-GL04S1, which included secular variations in only a few select coefficients. Previous work on altimeter satellite POD (cf. TOPEX/Poseidon, Jason-1, Jason-2) has shown that the standard model is not adequate and that orbit improvements are observed with the application of more detailed models of time-variable gravity. In this study, we quantify the impact of TVG modeling on DORIS satellite POD and ascertain the impact on DORIS station positions estimated weekly from 1993 to 2013. The numerous recent improvements to SLR and DORIS processing at GSFC include more complete compliance with IERS 2010 standards, improvements to SLR/DORIS measurement modeling, and improved non-conservative force modeling for DORIS satellites. These improvements will affect gravity coefficient estimates, POD, and the station solutions. Tests evaluate the impact of time-varying gravity on tracking data residuals, station consistency, and the geocenter and scale reference frame parameters.
Lattice Gauge Theories Within and Beyond the Standard Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelzer, Zechariah John
The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$ mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$ mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$ mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) …
Common Core Standards for High School Mathematics: A Quick-Start Guide
ERIC Educational Resources Information Center
Dempsey, Kathleen; Schwols, Armitra
2012-01-01
Shifting your high school's math program to new Common Core standards is much easier when teachers and leaders have this handy guide. Getting a copy for every staff member involved in the process ensures everyone knows: (1) How the six conceptual categories throughout the math standards are connected and reinforced; (2) How the modeling standards…
Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L
2016-02-01
Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined; each risk adjustment model is then developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and it is then applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.
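The refit-then-evaluate cycle described above can be pictured with a small survival-model sketch. This is not SRTR code: the covariates, column names, and data below are hypothetical, and the lifelines library stands in for whatever risk-adjustment software the registry actually uses.

```python
# Illustrative sketch of "refit on a recent cohort, apply to the evaluation
# cohort": a Cox proportional-hazards risk-adjustment model on toy data.
import pandas as pd
from lifelines import CoxPHFitter

recent = pd.DataFrame({               # hypothetical recent cohort
    "months_to_event": [12, 30, 7, 24, 18, 40, 3, 15],
    "graft_failure":   [1, 0, 1, 0, 1, 0, 1, 0],
    "donor_age":       [45, 30, 60, 28, 52, 35, 66, 41],
    "recipient_age":   [50, 42, 63, 37, 58, 44, 70, 49],
})

cph = CoxPHFitter()
cph.fit(recent, duration_col="months_to_event", event_col="graft_failure")

# Risk-adjusted expectations for an evaluation cohort (reused here for
# brevity); observed-vs-expected comparisons would flag outlier programs.
expected = cph.predict_partial_hazard(recent)
print(expected.head())
```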
Gaussian Process Regression (GPR) Representation in Predictive Model Markup Language (PMML).
Park, J; Lechevalier, D; Ak, R; Ferguson, M; Law, K H; Lee, Y-T T; Rachuri, S
2017-01-01
This paper describes Gaussian process regression (GPR) models represented in the Predictive Model Markup Language (PMML). PMML is an extensible-markup-language (XML)-based standard language used to represent data-mining and predictive analytic models, as well as pre- and post-processed data. The previous PMML version, PMML 4.2, did not provide capabilities for representing probabilistic (stochastic) machine-learning algorithms that are widely used for constructing predictive models taking the associated uncertainties into consideration. The newly released PMML version 4.3, which includes the GPR model, provides new features: confidence bounds and distribution for the predictive estimations. Both features are needed to establish the foundation for uncertainty quantification analysis. Among various probabilistic machine-learning algorithms, GPR has been widely used for approximating a target function because of its capability of representing complex input and output relationships without predefining a set of basis functions, and of predicting a target output with uncertainty quantification. GPR is being applied in various manufacturing data-analytics applications, which necessitates representing this model in a standardized form for easy and rapid deployment. In this paper, we present a GPR model and its representation in PMML. Furthermore, we demonstrate a prototype using a real data set in the manufacturing domain.
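The two PMML 4.3 features highlighted here, a predictive mean plus confidence information, are easy to mirror with a generic GPR library. The sketch below uses scikit-learn rather than PMML tooling, and the inputs and outputs are synthetic stand-ins for manufacturing measurements.

```python
# GPR with predictive mean and confidence bounds, mirroring the concept
# the PMML 4.3 GPR element is said to expose (not PMML itself).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, 25).reshape(-1, 1)          # e.g., a process setting
y = np.sin(X).ravel() + rng.normal(0, 0.1, 25)     # e.g., a measured response

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

X_new = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)    # mean and predictive std
lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% confidence bounds
print(np.c_[mean, lower, upper])
```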
Flavour-changing neutral currents making and breaking the standard model.
Archilli, F; Bettler, M-O; Owen, P; Petridis, K A
2017-06-07
The standard model of particle physics is our best description yet of fundamental particles and their interactions, but it is known to be incomplete. As yet undiscovered particles and interactions might exist. One of the most powerful ways to search for new particles is by studying processes known as flavour-changing neutral current decays, whereby a quark changes its flavour without altering its electric charge. One example of such a transition is the decay of a beauty quark into a strange quark. Here we review some intriguing anomalies in these decays, which have revealed potential cracks in the standard model, hinting at the existence of new phenomena.
Diffusion models of the flanker task: Discrete versus gradual attentional selection
White, Corey N.; Ratcliff, Roger; Starns, Jeffrey S.
2011-01-01
The present study tested diffusion models of processing in the flanker task, in which participants identify a target that is flanked by items that indicate the same (congruent) or opposite response (incongruent). Single- and dual-process flanker models were implemented in a diffusion-model framework and tested against data from experiments that manipulated response bias, speed/accuracy tradeoffs, attentional focus, and stimulus configuration. There was strong mimicry among the models, and each captured the main trends in the data for the standard conditions. However, when more complex conditions were used, a single-process spotlight model captured qualitative and quantitative patterns that the dual-process models could not. Since the single-process model provided the best balance of fit quality and parsimony, the results indicate that processing in the simple versions of the flanker task is better described by gradual rather than discrete narrowing of attention.
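A minimal sketch of the winning single-process idea, under simplified assumptions: drift is a time-varying mixture of target and flanker evidence, with the flanker weight decaying as the attentional spotlight narrows. Parameter values are illustrative, not the paper's fitted values.

```python
# Toy "shrinking spotlight" diffusion process for the flanker task.
import numpy as np

def simulate_trial(congruent, a=0.12, z=0.06, dt=0.001, s=0.1,
                   v_target=0.3, v_flanker=0.3, tau=0.15, rng=None):
    rng = rng or np.random.default_rng()
    x, t = z, 0.0
    while 0 < x < a:
        w = np.exp(-t / tau)              # flanker weight decays over time
        sign = 1.0 if congruent else -1.0
        v = (1 - w) * v_target + w * sign * v_flanker
        x += v * dt + s * np.sqrt(dt) * rng.normal()
        t += dt
    return t, x >= a                       # RT and correctness

rng = np.random.default_rng(2)
for cond in (True, False):
    trials = [simulate_trial(cond, rng=rng) for _ in range(500)]
    rts, acc = zip(*trials)
    print("congruent" if cond else "incongruent",
          f"mean RT={np.mean(rts):.3f}s acc={np.mean(acc):.2f}")
```

Incongruent trials start with drift pulled toward the wrong boundary, so the model reproduces the classic slower, less accurate incongruent pattern with a single gradually narrowing process.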
Gratacós, Jordi; Luelmo, Jesús; Rodríguez, Jesús; Notario, Jaume; Marco, Teresa Navío; de la Cueva, Pablo; Busquets, Manel Pujol; Font, Mercè García; Joven, Beatriz; Rivera, Raquel; Vega, Jose Luis Alvarez; Álvarez, Antonio Javier Chaves; Parera, Ricardo Sánchez; Carrascosa, Jose Carlos Ruiz; Martínez, Fernando José Rodríguez; Sánchez, José Pardo; Olmos, Carlos Feced; Pujol, Conrad; Galindez, Eva; Barrio, Silvia Pérez; Arana, Ana Urruticoechea; Hergueta, Mercedes; Coto, Pablo; Queiro, Rubén
2018-06-01
To define and give priority to standards of care and quality indicators of multidisciplinary care for patients with psoriatic arthritis (PsA). A systematic literature review on PsA standards of care and quality indicators was performed. An expert panel of rheumatologists and dermatologists who provide multidisciplinary care was established. In a consensus meeting, the expert group discussed and developed the standards of care and quality indicators and graded their priority, agreement and also feasibility (only for quality indicators), following qualitative methodology and a Delphi process. Afterwards, these results were discussed with 2 focus groups, one with patients and another with health managers. A descriptive analysis is presented. We obtained 25 standards of care (9 of structure, 9 of process, 7 of results) and 24 quality indicators (2 of structure, 5 of process, 17 of results). Standards of care include relevant aspects of the multidisciplinary care of PsA patients, such as an appropriate physical infrastructure and technical equipment, access to nursing care, labs and imaging techniques, other health professionals and treatments, and the development of care plans. Regarding quality indicators, the definition of multidisciplinary care model objectives and referral criteria, the establishment of responsibilities and coordination among professionals, and the active evaluation of patients and data collection were given a high priority. Patients considered all of them important. This set of standards of care and quality indicators for the multidisciplinary care of patients with PsA should help improve quality of care in these patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Kathleen; Lopez, Hugo; Cairns, Julie
An overview of the main North American codes and standards associated with hydrogen safety sensors is provided. The distinction between a code and a standard is defined, and the relationship between standards and codes is clarified, especially for those circumstances where a standard or a certification requirement is explicitly referenced within a code. The report identifies three main types of standards commonly applied to hydrogen sensors (interface and controls standards, shock and hazard standards, and performance-based standards). The certification process and a list and description of the main standards and model codes associated with the use of hydrogen safety sensors in hydrogen infrastructure are presented.
Chen, Elizabeth S; Maloney, Francine L; Shilmayster, Eugene; Goldberg, Howard S
2009-11-14
A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs.
Artifact-Based Transformation of IBM Global Financing
NASA Astrophysics Data System (ADS)
Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.
IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.
ERIC Educational Resources Information Center
McGaughy, Charis; de Gonzalez, Alicia
2012-01-01
The California Department of Education is in the process of revising the Career and Technical Education (CTE) Model Curriculum Standards. The Educational Policy Improvement Center (EPIC) conducted an investigation of the draft version of the Health Sciences and Medical Technology Standards (Health Science). The purpose of the study is to…
NASA Astrophysics Data System (ADS)
Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.
2016-04-01
In the present paper the authors discuss ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one way to address this problem is the integrated modernization of the power-engineering infrastructure of mechanical engineering companies, aimed at more efficient energy supply control and improved commercial metering of electric energy. The authors propose the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we built a mathematical model of the data-exchange process between the measuring transformers and a universal SCADA system. The modeling results show that the discussed equipment meets the requirements of the Standard and that using a universal SCADA system for these purposes is preferable and economically reasonable. The modeling used the following software: MasterScada, Master OPC_DI_61850, OPNET.
The Evolution of PLA's Planning Model.
ERIC Educational Resources Information Center
Elsner, Edward J.
2002-01-01
Explores the movement toward community-centered standards in public libraries. Tracks the changes of the Public Library Association's (PLA's) planning model through four incarnations, summarizes each model, and examines trends and suggests a way to use the various models together for an easier planning process. (Author/LRW)
NASA Astrophysics Data System (ADS)
Baumgartel, Darin C.
Since the formulation of the Standard Model of particle physics, numerous experiments have sought to observe the signatures of the subatomic particles by examining the outcomes of charged particle collisions. Over time, advances in detector technology and scientific computing have allowed for unprecedented precision measurements of Standard Model phenomena and particle properties. Although the Standard Model has displayed remarkable predictive power, extensions to the Standard Model have been formulated to account for unexplained phenomena, and these extensions often imply the existence of additional subatomic particles. Consequently, experiments at particle colliders often endeavor to search for signatures of physics beyond the Standard Model. These searches and measurements are often complementary pursuits, as searches are often limited by the precision of estimations of the Standard Model backgrounds. At the forefront of present-day collider experiments is the Large Hadron Collider at CERN, which delivers proton-proton collisions with unprecedented energy and luminosity. Collisions are recorded with detectors located at interaction points along the ring of the Large Hadron Collider. The CMS detector is one of two general-purpose detectors at the Large Hadron Collider, and its high-precision detection of particles from collision events makes it a powerful tool for both Standard Model measurements and searches for new physics. The Standard Model is characterized by three generations of quarks and leptons. This correspondence between the generations of quarks and leptons is necessary to allow for the renormalizability of the Standard Model, but it is not an inherent property of the Standard Model. Motivated by this compelling symmetry, many theories and models propose the existence of leptoquark bosons, which mediate transitions between quarks and leptons. Experimental constraints indicate that leptoquarks would couple to a single generation, and this thesis describes searches for leptoquarks produced in pairs and decaying to final states containing either two muons and two jets, or one muon, one muon-neutrino, and two jets. Searches are conducted with collision data at center-of-mass energies of both 7 TeV and 8 TeV. No compelling evidence for the existence of leptoquarks is found, and upper limits on the leptoquark mass and cross section are placed at the 95% confidence level. These limits are the most stringent to date, and are several times larger than limits placed previously at hadron collider experiments. While the pair production of massive leptoquark bosons yields final states with strong kinematic differences from the Standard Model processes, the ability to exploit these differences is limited by how accurately the backgrounds can be modeled. The most notable of these backgrounds is the production of a W boson in association with one or more jets. Since the W+jets process has a very large cross section and a final state containing missing energy, its contribution to the total Standard Model background is both nominally large and more difficult to discriminate against than backgrounds with only visible final-state objects. Furthermore, estimates of this background are not easily improved by comparisons with data in control regions, and simulations of the background are often limited to leading-order predictions.
To improve the understanding and modeling of this background for future endeavors, this thesis also presents measurements of the W+jets process differentially as a function of several variables, including the jet multiplicity, the individual jet transverse momenta and pseudorapidities, the angular separation between the jets and the muon, and the scalar sum of the transverse momenta of all jets. The agreement of these measurements with predictions from leading-order event generators and next-to-leading-order calculations is assessed.
The Use of Regulatory Air Quality Models to Develop Successful Ozone Attainment Strategies
NASA Astrophysics Data System (ADS)
Canty, T. P.; Salawitch, R. J.; Dickerson, R. R.; Ring, A.; Goldberg, D. L.; He, H.; Anderson, D. C.; Vinciguerra, T.
2015-12-01
The Environmental Protection Agency (EPA) recently proposed lowering the 8-hr ozone standard to a level between 65 and 70 ppb. Not all regions of the U.S. are in attainment of the current 75 ppb standard, and it is expected that many regions currently in attainment will not meet the future, lower surface ozone standard. Ozone production is a nonlinear function of emissions, biological processes, and weather. Federal and state agencies rely on regulatory air quality models such as the Community Multi-Scale Air Quality (CMAQ) model and the Comprehensive Air Quality Model with Extensions (CAMx) to test ozone precursor emission reduction strategies that will bring states into compliance with the National Ambient Air Quality Standards (NAAQS). We will describe various model scenarios that simulate how future limits on the emission of ozone precursors (i.e., NOx and VOCs) from sources such as power plants and vehicles will affect air quality. These scenarios are currently being developed by states required to submit a State Implementation Plan to the EPA. Projections from these future-case scenarios suggest that strategies intended to control local ozone may also bring upwind states into attainment of the new NAAQS. Ground-based, aircraft, and satellite observations are used to ensure that air quality models accurately represent photochemical processes within the troposphere. We will highlight some of the improvements made to the CMAQ and CAMx model framework based on our analysis of NASA observations obtained by the OMI instrument on the Aura satellite and by the DISCOVER-AQ field campaign.
Hammes, Florian; Hille, Thomas; Kissel, Thomas
2014-02-01
A process analytical method using reflectance infrared spectrometry was developed for the in-line monitoring of the amount of the active pharmaceutical ingredient (API) nicotine during a coating process for an oral thin film (OTF). In-line measurements were made using a reflectance infrared (RI) sensor positioned after the last drying zone of the coating line. Real-time spectra from the coating process were used for modelling the nicotine content. Partial least squares (PLS1) calibration models with different data pre-treatments were generated. The calibration model with the most comparable standard error of calibration (SEC) and standard error of cross validation (SECV) was selected for an external validation run on the production coating line with an independent laminate. Good correlations were obtained between values estimated from the reflectance infrared data and the reference HPLC test method. With in-line measurements it was possible to make real-time adjustments during the production process to keep product specifications within predefined limits, hence avoiding loss of material and batches.
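The SEC/SECV comparison used for model selection can be sketched with a generic PLS implementation. The spectra and nicotine values below are synthetic placeholders, and scikit-learn stands in for the chemometrics software actually used.

```python
# Hedged sketch of the calibration statistics described above: SEC from
# fitting the training spectra, SECV from cross-validation.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 200))                       # 40 spectra, 200 wavelengths
true_coef = rng.normal(size=200)
y = X @ true_coef * 0.01 + rng.normal(0, 0.05, 40)   # toy API content

pls = PLSRegression(n_components=5)
pls.fit(X, y)
sec = np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2))

y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
secv = np.sqrt(np.mean((y_cv - y) ** 2))
print(f"SEC={sec:.4f}  SECV={secv:.4f}")  # comparable values -> chosen model
```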
Modeling Standards of Care for an Online Environment
Jones-Schenk, Jan; Rossi, Julia
1998-01-01
At Intermountain Health Care in Salt Lake City a team was created to develop core standards for clinical practice that would enhance consistency of care across the care continuum. The newly developed Standards of Care had to meet the following criteria: electronic delivery, research-based, and support an interdisciplinary care environment along with an exception-based documentation system. The process has slowly evolved and the team has grown to include clinicians from multiple sites and disciplines who have met on a regular basis for over a year. The first challenge was to develop a model for the standards of care that would be suitable for an online environment.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Minding Impacting Events in a Model of Stochastic Variance
Duarte Queirós, Sílvio M.; Curado, Evaldo M. F.; Nobre, Fernando D.
2011-01-01
We introduce a generalization of the well-known ARCH process, widely used for generating uncorrelated stochastic time series with long-term non-Gaussian distributions and long-lasting correlations in the (instantaneous) standard deviation exhibiting a clustering profile. Specifically, inspired by the fact that in a variety of systems impacting events are hardly forgotten, we split the process into two different regimes: a first one for regular periods, where the average volatility of the fluctuations within a certain period of time is below a certain threshold, and another one for when the local standard deviation exceeds that threshold. In the former situation we use standard rules for heteroscedastic processes, whereas in the latter case the system starts recalling past values that surpassed the threshold. Our results show that for appropriate parameter values the model is able to provide fat-tailed probability density functions and strong persistence of the instantaneous variance characterized by large values of the Hurst exponent, which are ubiquitous features in complex systems.
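A rough simulation of the two-regime idea is sketched below. The switching rule and recall mechanism are guesses at the spirit of the model, not the paper's exact specification, and all parameter values are invented.

```python
# Two-regime ARCH-like process: a standard ARCH(1) recursion in "regular"
# periods, switching to recall of remembered above-threshold shocks when
# local volatility exceeds the threshold.
import numpy as np

rng = np.random.default_rng(4)
n, a0, a1, thresh = 10_000, 0.05, 0.9, 1.5
x = np.zeros(n)
sigma2 = np.full(n, a0)
shocks = []                                    # remembered impacting events

for t in range(1, n):
    if np.sqrt(sigma2[t - 1]) <= thresh or not shocks:
        sigma2[t] = a0 + a1 * x[t - 1] ** 2    # regular ARCH(1) regime
    else:
        past = shocks[rng.integers(len(shocks))]  # recall a past large event
        sigma2[t] = a0 + a1 * past ** 2
    x[t] = np.sqrt(sigma2[t]) * rng.normal()
    if abs(x[t]) > thresh:
        shocks.append(x[t])

print("kurtosis proxy:", np.mean(x**4) / np.mean(x**2) ** 2)  # >3 => fat tails
```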
DOT National Transportation Integrated Search
2013-08-01
The Texas Department of Transportation (TxDOT) created a standardized trip-based modeling approach for travel demand modeling called the Texas Package Suite of Travel Demand Models (referred to as the Texas Package) to oversee the travel de...
Multi-core processing and scheduling performance in CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, J. M.; Evans, D.; Foulkes, S.
2012-01-01
Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware not sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model in computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resource since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging) but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present the evaluation of the performance scheduling and executing multi-core workflows in whole-node queues compared to the standard single-core processing workflows.
On Alternative Formulations for Linearised Miss Distance Analysis
2013-05-01
…is traditionally employed by analysts as part of the solution process. To gain further insight into the nature of the missile-target engagement… a constant. Thus, following this process, the revised block diagram model for the linearised equations is presented in Figure 13. This model is… This process is known as reducing the block to its fundamental closed loop form and has been achieved here using standard block diagram algebra.
Ensemble-type numerical uncertainty information from single model integrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rauser, Florian, E-mail: florian.rauser@mpimet.mpg.de; Marotzke, Jochem; Korn, Peter
2015-07-01
We suggest an algorithm that quantifies the discretization error of time-dependent physical quantities of interest (goals) for numerical models of geophysical fluid dynamics. The goal discretization error is estimated using a sum of weighted local discretization errors. The key feature of our algorithm is that these local discretization errors are interpreted as realizations of a random process. The random process is determined by the model and the flow state. From a class of local error random processes we select a suitable specific random process by integrating the model over a short time interval at different resolutions. The weights of the influences of the local discretization errors on the goal are modeled as goal sensitivities, which are calculated via automatic differentiation. The integration of the weighted realizations of local error random processes yields a posterior ensemble of goal approximations from a single run of the numerical model. From the posterior ensemble we derive the uncertainty information of the goal discretization error. This algorithm bypasses the requirement of detailed knowledge about the model's discretization to generate numerical error estimates. The algorithm is evaluated for the spherical shallow-water equations. For two standard test cases we successfully estimate the error of regional potential energy, track its evolution, and compare it to standard ensemble techniques. The posterior ensemble shares linear-error-growth properties with ensembles of multiple model integrations when comparably perturbed. The posterior ensemble numerical error estimates are of comparable size to those of a stochastic physics ensemble.
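The core of the algorithm, estimating local-error statistics from a short run at two resolutions and then propagating sampled errors to build a posterior goal ensemble, can be caricatured on a scalar ODE. Everything below (the ODE, the error proxy, unit sensitivity weights) is a simplification for illustration, not the paper's method.

```python
# Caricature: (1) calibrate a local-error "random process" from a short
# dual-resolution run, (2) build a posterior ensemble of the goal by adding
# sampled local errors along a single coarse integration.
import numpy as np

def step(y, dt):                       # forward Euler for y' = -y
    return y + dt * (-y)

dt, n = 0.1, 100
# 1) Short calibration interval: local error proxied by the difference
#    between one coarse step and two half steps.
y, samples = 1.0, []
for _ in range(10):
    coarse = step(y, dt)
    fine = step(step(y, dt / 2), dt / 2)
    samples.append(coarse - fine)
    y = fine
sigma = np.std(samples)

# 2) Posterior ensemble from a single coarse run, perturbed by sampled errors.
rng = np.random.default_rng(5)
ensemble = []
for _ in range(200):
    y = 1.0
    for _ in range(n):
        y = step(y, dt) + rng.normal(0, sigma)
    ensemble.append(y)                 # "goal" = final state
print("goal mean/std:", np.mean(ensemble), np.std(ensemble))
```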
Towards Semantic Modelling of Business Processes for Networked Enterprises
NASA Astrophysics Data System (ADS)
Furdík, Karol; Mach, Marián; Sabol, Tomáš
The paper presents an approach to the semantic modelling and annotation of business processes and information resources, as it was designed within the FP7 ICT EU project SPIKE to support creation and maintenance of short-term business alliances and networked enterprises. A methodology for the development of the resource ontology, as a shareable knowledge model for semantic description of business processes, is proposed. Systematically collected user requirements, conceptual models implied by the selected implementation platform as well as available ontology resources and standards are employed in the ontology creation. The process of semantic annotation is described and illustrated using an example taken from a real application case.
A model for a knowledge-based system's life cycle
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a Committee on Standards for Artificial Intelligence. Presented here are the initial efforts of one of the working groups of that committee. The purpose here is to present a candidate model for the development life cycle of Knowledge Based Systems (KBS). The intent is for the model to be used by the Aerospace Community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
Artificial Intelligence Software Engineering (AISE) model
NASA Technical Reports Server (NTRS)
Kiss, Peter A.
1990-01-01
The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed, as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.
FPGA-based firmware model for extended measurement systems with data quality monitoring
NASA Astrophysics Data System (ADS)
Wojenski, A.; Pozniak, K. T.; Mazon, D.; Chernyshova, M.
2017-08-01
Modern physics experiments require the construction of advanced, modular measurement systems for data processing and registration purposes. Components are often designed in one of the common mechanical and electrical standards, e.g. VME or uTCA. The paper focuses on measurement systems using FPGAs as data-processing blocks, especially for plasma diagnostics using GEM detectors, with attention to data-quality monitoring. The article proposes a standardized model of HDL FPGA firmware implementation for use in a wide range of different measurement systems. The effort was made in terms of flexible implementation of data-quality monitoring along with dynamic selection of source data. The paper discusses a standard measurement-system model, followed by a detailed model of FPGA firmware for modular measurement systems. Both functional blocks and data buses are considered. In the summary, the necessary blocks and signal lines are described. Firmware implemented following the presented rules should yield a modular design in which different parts can be changed with ease. The key benefit is the construction of a universal, modular HDL design that can be applied in different measurement systems with simple adjustments.
Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...
NASA Astrophysics Data System (ADS)
Laarits, Toomas; O'Gorman, Bryan; Crescimanno, Michael
2008-03-01
We describe and solve quantum optics models for multiphoton interrogation of an electromagnetically induced transparency (EIT) resonance. Multiphoton EIT, like its well-studied Lambda-system EIT progenitor, is a generalization of the N-resonance process recently studied for atomic timekeeping. The solution of these models allows a preliminary determination of this process's utility as the basis of a frequency standard.
NASA Astrophysics Data System (ADS)
Liang, J.; Sédillot, S.; Traverson, B.
1997-09-01
This paper addresses federation of a transactional object standard - Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are somewhat complementary. OTS defines a set of external interfaces without a specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS-compliant system, which, by leveraging the extensibility and openness strengths of OSI TP, is able to provide interoperability between the X/Open DTP and OMG OTS models.
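The two-phase commit propagation rules the three models share can be sketched as a toy in-memory coordinator. This is illustrative pseudo-protocol code, not the OTS or OSI TP interfaces themselves.

```python
# Toy two-phase commit: a prepare (voting) phase, then a commit or rollback
# decision pushed to every participant in the transaction tree.

class Participant:
    def __init__(self, name, will_commit=True):
        self.name, self.will_commit, self.state = name, will_commit, "active"

    def prepare(self):                 # phase 1: vote
        self.state = "prepared" if self.will_commit else "vote-abort"
        return self.will_commit

    def commit(self):                  # phase 2: apply the decision
        self.state = "committed"

    def rollback(self):
        self.state = "rolled-back"

def two_phase_commit(participants):
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:             # any "no" vote aborts the whole tree
        p.rollback()
    return "rolled-back"

tree = [Participant("A"), Participant("B"), Participant("C", will_commit=False)]
print(two_phase_commit(tree), [(p.name, p.state) for p in tree])
```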
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loose model coupling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) for commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
Architectural Improvements and New Processing Tools for the Open XAL Online Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M
The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have recently been made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data, while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.
Decreasing patient identification band errors by standardizing processes.
Walley, Susan Chu; Berger, Stephanie; Harris, Yolanda; Gallizzi, Gina; Hayes, Leslie
2013-04-01
Patient identification (ID) bands are an essential component in patient ID. Quality improvement methodology has been applied as a model to reduce ID band errors although previous studies have not addressed standardization of ID bands. Our specific aim was to decrease ID band errors by 50% in a 12-month period. The Six Sigma DMAIC (define, measure, analyze, improve, and control) quality improvement model was the framework for this study. ID bands at a tertiary care pediatric hospital were audited from January 2011 to January 2012 with continued audits to June 2012 to confirm the new process was in control. After analysis, the major improvement strategy implemented was standardization of styles of ID bands and labels. Additional interventions included educational initiatives regarding the new ID band processes and disseminating institutional and nursing unit data. A total of 4556 ID bands were audited with a preimprovement ID band error average rate of 9.2%. Significant variation in the ID band process was observed, including styles of ID bands. Interventions were focused on standardization of the ID band and labels. The ID band error rate improved to 5.2% in 9 months (95% confidence interval: 2.5-5.5; P < .001) and was maintained for 8 months. Standardization of ID bands and labels in conjunction with other interventions resulted in a statistical decrease in ID band error rates. This decrease in ID band error rates was maintained over the subsequent 8 months.
Measurement of the inclusive radiative B-meson decay B → X_s γ
NASA Astrophysics Data System (ADS)
Ozcan, Veysi Erkcan
Radiative decays of the B meson, B → X_s γ, proceed via virtual flavor-changing neutral-current processes that are sensitive to contributions from high mass scales, either within the Standard Model of electroweak interactions or beyond. In the Standard Model, these transitions are sensitive to the weak interactions of the top quark, and relatively robust predictions of the inclusive decay rate exist. Significant deviation from these predictions could be interpreted as an indication of processes not included in the minimal Standard Model, like interactions of charged Higgs or SUSY particles. The analysis of the inclusive photon spectrum from B → X_s γ decays is rather challenging due to high backgrounds from photons emitted in the decay of mesons in B decays as well as e⁺e⁻ annihilation to low-mass quark and lepton pairs. Based on 88.5 million B B̄ events collected by the BABAR detector, the photon spectrum above 1.9 GeV is presented. By comparison of the first and second moments of the photon spectrum with QCD predictions (calculated in the kinetic scheme), QCD parameters describing the bound state of the b quark in the B meson are extracted: m_b = 4.45 ± 0.16 GeV/c² and μ_π² = 0.65 ± 0.29 GeV². These parameters are useful input to non-perturbative QCD corrections to the semileptonic B decay rate and the determination of the CKM parameter V_ub. Based on these parameters and the heavy quark expansion, the full branching fraction is obtained as BR(B → X_s γ; E_γ > 1.6 GeV) = (4.05 ± 0.32(stat) ± 0.38(syst) ± 0.29(model)) × 10⁻⁴. This result is in good agreement with previous measurements; the statistical and systematic errors are comparable. It is also in good agreement with the theoretical Standard Model predictions, and thus within the present errors there is no indication of any interactions not accounted for in the Standard Model. This finding implies strong constraints on physics beyond the Standard Model.
NASA Astrophysics Data System (ADS)
Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic
2002-11-01
This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numeric model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes), and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. The ASIC was designed to interface the sensor and to control the vibration amplitude. This electronics was simulated and designed to work up to 200°C and implemented in a standard 0.6 μm CMOS technology. Characterizations of sensor prototypes were done both mechanically and electrostatically. These measurements showed good agreement with theory and FEM simulations.
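The lumped mechanical model mentioned above can be sketched as a mass-spring-damper driven by a parallel-plate electrostatic force. All parameter values below are illustrative orders of magnitude, not the fabricated device's values.

```python
# Toy electrostatically driven resonator: m*x'' + c*x' + k*x = F(V, x),
# with the parallel-plate force F = 0.5 * eps0 * A * V^2 / (g - x)^2.
import numpy as np
from scipy.integrate import solve_ivp

eps0, A, g = 8.854e-12, 1e-8, 2e-6    # permittivity, electrode area (m^2), gap (m)
m, c, k = 1e-10, 1e-6, 100.0          # effective mass (kg), damping, stiffness (N/m)
f0 = np.sqrt(k / m) / (2 * np.pi)     # small-signal resonance, ~160 kHz here

def rhs(t, s):
    x, v = s
    V = 5.0 + 0.5 * np.cos(2 * np.pi * f0 * t)   # DC bias + small AC at resonance
    F = 0.5 * eps0 * A * V**2 / (g - x) ** 2     # electrostatic drive
    return [v, (F - c * v - k * x) / m]

sol = solve_ivp(rhs, (0, 2e-3), [0.0, 0.0], max_step=1e-7)
print("peak displacement (m):", np.abs(sol.y[0]).max())
```

The DC-bias-plus-AC drive is used so the V² force has a component at the resonance frequency; driving with a pure AC tone at f0 would place the force at 2f0 and DC instead.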
Johns, Brendan T; Taler, Vanessa; Pisoni, David B; Farlow, Martin R; Hake, Ann Marie; Kareken, David A; Unverzagt, Frederick W; Jones, Michael N
2018-06-01
Mild cognitive impairment (MCI) is characterised by subjective and objective memory impairment in the absence of dementia. MCI is a strong predictor for the development of Alzheimer's disease, and may represent an early stage in the disease course in many cases. A standard task used in the diagnosis of MCI is verbal fluency, where participants produce as many items from a specific category (e.g., animals) as possible. Verbal fluency performance is typically analysed by counting the number of items produced. However, analysis of the semantic path of the items produced can provide valuable additional information. We introduce a cognitive model that uses multiple types of lexical information in conjunction with a standard memory search process. The model used a semantic representation derived from a standard semantic space model in conjunction with a memory searching mechanism derived from the Luce choice rule (Luce, 1977). The model was able to detect differences in the memory searching process of patients who were developing MCI, suggesting that the formal analysis of verbal fluency data is a promising avenue to examine the underlying changes occurring in the development of cognitive impairment.
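A schematic of the described search process, under the assumption that retrieval probabilities follow the Luce choice rule over semantic similarity to the last item produced. The similarity matrix below is a random stand-in for a real semantic-space representation.

```python
# Toy verbal-fluency search: next item chosen by the Luce choice rule,
# P(i | current) proportional to sim(current, i)**beta, without repetition.
import numpy as np

rng = np.random.default_rng(6)
items = ["dog", "cat", "horse", "cow", "lion", "tiger", "whale", "shark"]
emb = rng.normal(size=(len(items), 50))            # toy semantic vectors
sim = emb @ emb.T
sim = (sim - sim.min()) / (sim.max() - sim.min())  # similarities in [0, 1]

def fluency_run(beta=3.0, n_out=6):
    out = [int(rng.integers(len(items)))]
    while len(out) < n_out:
        weights = sim[out[-1]] ** beta + 1e-9      # floor avoids zero-sum edge case
        weights[out] = 0.0                         # no repetitions
        probs = weights / weights.sum()
        out.append(int(rng.choice(len(items), p=probs)))
    return [items[i] for i in out]

print(fluency_run())   # e.g., a semantically clustered retrieval path
```

Lowering beta flattens the choice distribution, one plausible way a model like this could express a degraded, less semantically guided search.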
Composable Framework Support for Software-FMEA Through Model Execution
NASA Astrophysics Data System (ADS)
Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco
2016-08-01
Performing Failure Modes and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early-design-phase model execution, classic SW-FMEA approaches carry significant risks and are human-effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
NASA Technical Reports Server (NTRS)
Nashman, Marilyn; Chaconas, Karen J.
1988-01-01
The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, with particular attention to the image processing hardware and software used to extract features at low levels of sensory processing for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.
NASA Technical Reports Server (NTRS)
Hartman, Brian Davis
1995-01-01
A key drawback to estimating geodetic and geodynamic parameters over time based on satellite laser ranging (SLR) observations is the inability to accurately model all the forces acting on the satellite. Errors associated with the observations and the measurement model can detract from the estimates as well. These 'model errors' corrupt the solutions obtained from the satellite orbit determination process. Dynamical models for satellite motion utilize known geophysical parameters to mathematically detail the forces acting on the satellite. However, these parameters, while estimated as constants, vary over time. These temporal variations must be accounted for in some fashion to maintain meaningful solutions. The primary goal of this study is to analyze the feasibility of using a sequential process noise filter for estimating geodynamic parameters over time from the Laser Geodynamics Satellite (LAGEOS) SLR data. This evaluation is achieved by first simulating a sequence of realistic LAGEOS laser ranging observations. These observations are generated using models with known temporal variations in several geodynamic parameters (along-track drag and the J_2, J_3, J_4, and J_5 geopotential coefficients). A standard (non-stochastic) filter and a stochastic process noise filter are then utilized to estimate the model parameters from the simulated observations. The standard non-stochastic filter estimates these parameters as constants over consecutive fixed time intervals. Thus, the resulting solutions contain constant estimates of parameters that vary in time, which limits the temporal resolution and accuracy of the solution. The stochastic process noise filter estimates these parameters as correlated process noise variables. As a result, the stochastic process noise filter has the potential to estimate the temporal variations more accurately, since the constraint of estimating the parameters as constants is eliminated. A comparison of the temporal resolution of solutions obtained from standard sequential filtering methods and process noise sequential filtering methods shows that the accuracy is significantly improved using process noise. The results show that the positional accuracy of the orbit is improved as well. The temporal resolution of the resulting solutions is detailed, and conclusions are drawn about the results. Benefits and drawbacks of using process noise filtering in this type of scenario are also identified.
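The contrast between the two filters can be illustrated with a toy scalar example (not the study's orbit-determination software): a Kalman filter with random-walk process noise tracks a slowly varying parameter that a constant fit necessarily smears out.

```python
# Constant estimate vs. random-walk Kalman filter for a drifting parameter.
import numpy as np

rng = np.random.default_rng(7)
n = 200
truth = 1.0 + 0.3 * np.sin(np.linspace(0, 4 * np.pi, n))  # time-varying parameter
obs = truth + rng.normal(0, 0.2, n)                        # noisy observations

# Non-stochastic approach: one constant over the whole interval.
const_est = np.full(n, obs.mean())

# Process-noise filter: scalar Kalman filter with random-walk dynamics.
q, r = 1e-3, 0.2**2              # process-noise and measurement variances
x, p = obs[0], 1.0
kalman_est = np.empty(n)
for k in range(n):
    p += q                       # predict (random walk)
    gain = p / (p + r)           # update
    x += gain * (obs[k] - x)
    p *= 1 - gain
    kalman_est[k] = x

print("RMS error, constant:     ", np.sqrt(np.mean((const_est - truth) ** 2)))
print("RMS error, process noise:", np.sqrt(np.mean((kalman_est - truth) ** 2)))
```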
Evidence for the associated production of a W boson and a top quark at ATLAS
NASA Astrophysics Data System (ADS)
Koll, James
This thesis discusses a search for the Standard Model single-top Wt-channel process. An analysis has been performed searching for the Wt-channel process using 4.7 fb⁻¹ of integrated luminosity collected with the ATLAS detector at the Large Hadron Collider. A boosted decision tree is trained using machine learning techniques to increase the separation between signal and background. A profile likelihood fit is used to measure the cross-section of the Wt-channel process as σ(pp → Wt + X) = 16.8 ± 2.9 (stat) ± 4.9 (syst) pb, consistent with the Standard Model prediction. This fit is also used to generate pseudoexperiments to calculate the significance, finding an observed (expected) 3.3σ (3.4σ) excess over background.
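The signal/background separation step can be sketched with a generic gradient-boosted classifier. This is not the ATLAS analysis code, and the three features are synthetic stand-ins for kinematic variables.

```python
# Toy BDT separating "signal" from "background" events on synthetic features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(8)
n = 5000
signal = rng.normal([1.0, 0.5, 2.0], 1.0, (n, 3))     # stand-ins for kinematics
background = rng.normal([0.0, 0.0, 1.0], 1.0, (n, 3))
X = np.vstack([signal, background])
y = np.r_[np.ones(n), np.zeros(n)]

bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X, y)
score = bdt.predict_proba(X)[:, 1]                    # BDT output per event
print("mean BDT score: signal %.2f, background %.2f"
      % (score[y == 1].mean(), score[y == 0].mean()))
```

In an analysis of this kind, the distribution of the classifier output (rather than a single cut) would feed the profile likelihood fit.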
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Samaniego, Luis; Clark, Martyn; Wulfmeyer, Volker; Branch, Oliver; Attinger, Sabine; Thober, Stephan
2016-09-01
Land surface models incorporate a large number of process descriptions containing a multitude of parameters. These parameters are typically read from tabulated input files. Some of these parameters, however, might be fixed numbers in the computer code, which hinders model agility during calibration. Here we identified 139 hard-coded parameters in the model code of the Noah land surface model with multiple process options (Noah-MP). We performed a Sobol' global sensitivity analysis of Noah-MP for a specific set of process options, which includes 42 of the 71 standard parameters and 75 of the 139 hard-coded parameters. The sensitivities of the hydrologic output fluxes latent heat and total runoff, as well as their component fluxes, were evaluated at 12 catchments within the United States with very different hydrometeorological regimes. Noah-MP's hydrologic output fluxes are sensitive to two thirds of its applicable standard parameters (i.e., Sobol' indexes above 1%). The most sensitive parameter is, however, a hard-coded value in the formulation of soil surface resistance for direct evaporation, which has proved to be oversensitive in other land surface models as well. Surface runoff is sensitive to almost all hard-coded parameters of the snow processes and the meteorological inputs. These parameter sensitivities diminish in total runoff. Assessing these parameters in model calibration would require detailed snow observations or the calculation of hydrologic signatures of the runoff data. Latent heat and total runoff exhibit very similar sensitivities because of their tight coupling via the water balance. A calibration of Noah-MP against either of these fluxes should therefore give comparable results. Moreover, these fluxes are sensitive to both plant and soil parameters. Calibrating, for example, only soil parameters hence limits the ability to derive realistic model parameters. It is thus recommended to include the most sensitive hard-coded model parameters exposed in this study when calibrating Noah-MP.
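A Sobol' analysis in the spirit of the study can be sketched with the SALib package on a toy response function. The parameter names, bounds, and "flux" below are placeholders, not Noah-MP quantities.

```python
# Sobol' sensitivity indices for a toy latent-heat-like response.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["soil_resistance", "snow_albedo", "root_depth"],  # placeholders
    "bounds": [[1.0, 100.0], [0.5, 0.9], [0.1, 2.0]],
}

X = saltelli.sample(problem, 1024)           # Saltelli sampling design
# Toy response standing in for a simulated flux:
Y = 200.0 / (1.0 + X[:, 0] / 30.0) * (1.0 - X[:, 1]) + 5.0 * X[:, 2]

Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:16s} first-order={s1:.2f} total={st:.2f}")
```

In the study's setting, hard-coded constants would simply be promoted to entries of such a problem definition so their indices can be compared with those of the standard tabulated parameters.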
Predicting Student Performance in a Collaborative Learning Environment
ERIC Educational Resources Information Center
Olsen, Jennifer K.; Aleven, Vincent; Rummel, Nikol
2015-01-01
Student models for adaptive systems may not model collaborative learning optimally. Past research has either focused on modeling individual learning or, for collaboration, has focused on group dynamics or group processes without predicting learning. In the current paper, we adjust the Additive Factors Model (AFM), a standard logistic regression…
Managing public health in the Army through a standard community health promotion council model.
Courie, Anna F; Rivera, Moira Shaw; Pompey, Allison
2014-01-01
Public health processes in the US Army remain uncoordinated due to competing lines of command, funding streams, and multiple subject matter experts in overlapping public health concerns. The US Army Public Health Command (USAPHC) has identified a standard model for community health promotion councils (CHPCs) as an effective framework for synchronizing and integrating these overlapping systems to ensure a coordinated approach to managing the public health process. The purpose of this study is to test a foundational assumption of the CHPC effectiveness theory: that the 3 features of a standard CHPC model - a CHPC chaired by a strong leader, i.e., the senior commander; a full-time health promotion team dedicated to the process; and centralized management through the USAPHC - will lead to high quality health promotion councils capable of providing a coordinated approach to addressing public health on Army installations. The study employed 2 evaluation questions: (1) Do CHPCs with centralized management through the USAPHC, alignment with the senior commander, and a health promotion operations team adhere more closely to the evidence-based CHPC program framework than CHPCs without these 3 features? (2) Do members of standard CHPCs report that participation in the CHPC leads to a well-coordinated approach to public health at the installation? The results revealed that both time (F(5,76)=25.02, P<.0001) and the 3 critical features of the standard CHPC model (F(1,76)=28.40, P<.0001) independently predicted program adherence. Evaluation evidence supports the USAPHC's approach to CHPC implementation as part of public health management on Army installations. Preliminary evidence suggests that the standard CHPC model may lead to a more coordinated approach to public health and may assure that CHPCs follow an evidence-informed design. This is consistent with past research demonstrating that community coalitions and public health systems that have strong leadership; dedicated staff time and expertise; influence over policy, governance, and oversight; and formalized rules and regulations function more effectively than those without. It also demonstrates the feasibility of implementing an evidence-informed approach to community coalitions in an Army environment.
Renewal processes based on generalized Mittag-Leffler waiting times
NASA Astrophysics Data System (ADS)
Cahoy, Dexter O.; Polito, Federico
2013-03-01
The fractional Poisson process has recently attracted experts from several fields of study. Its natural generalization of the ordinary Poisson process made the model more appealing for real-world applications. In this paper, we generalized the standard and fractional Poisson processes through the waiting time distribution, and showed their relations to an integral operator with a generalized Mittag-Leffler function in the kernel. The waiting times of the proposed renewal processes have the generalized Mittag-Leffler and stretched-squashed Mittag-Leffler distributions. Note that the generalizations naturally provide greater flexibility in modeling real-life renewal processes. Algorithms to simulate sample paths and to estimate the model parameters are derived. Note also that these procedures are necessary to make these models more usable in practice. State probabilities and other qualitative or quantitative features of the models are also discussed.
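
The waiting times of such renewal processes can be drawn with a known inversion recipe for Mittag-Leffler random variables (the Kozubowski-Pagano form used in the fractional Poisson literature); a sketch, with beta = 1 recovering the standard exponential case:

    import numpy as np

    rng = np.random.default_rng(42)

    def ml_waiting_times(beta, gamma, size):
        """Mittag-Leffler waiting times via
        T = -gamma * ln(U) * (sin(b)/tan(b*V) - cos(b))**(1/beta),  b = beta*pi,
        with U, V independent Uniform(0,1). For beta = 1 this reduces to the
        exponential waiting times of the ordinary Poisson process."""
        u, v = rng.uniform(size=size), rng.uniform(size=size)
        if beta == 1.0:
            return -gamma * np.log(u)
        b = beta * np.pi
        return -gamma * np.log(u) * (np.sin(b) / np.tan(b * v) - np.cos(b)) ** (1.0 / beta)

    def count_arrivals(beta, gamma, t_max):
        t, n = 0.0, 0
        while True:
            t += ml_waiting_times(beta, gamma, 1)[0]
            if t >= t_max:
                return n
            n += 1

    print("fractional (beta=0.8):", count_arrivals(0.8, 1.0, 100.0), "arrivals")
    print("standard   (beta=1.0):", count_arrivals(1.0, 1.0, 100.0), "arrivals")

The heavy-tailed waiting times at beta < 1 typically produce fewer, burstier arrivals than the standard process over the same horizon.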
NASA Astrophysics Data System (ADS)
Lingel, Karen; Skwarnicki, Tomasz; Smith, James G.
Penguin, or loop, decays of B mesons induce effective flavor-changing neutral currents, which are forbidden at tree level in the standard model. These decays give special insight into the CKM matrix and are sensitive to non-standard-model effects. In this review, we give a historical and theoretical introduction to penguins and a description of the various types of penguin processes: electromagnetic, electroweak, and gluonic. We review the experimental searches for penguin decays, including the measurements of the electromagnetic penguins b → sγ and B → K*γ and the gluonic penguins B → Kπ, B+ → ωK+, and B → η'K, and their implications for the standard model and new physics. We conclude by exploring the future prospects for penguin physics.
Search for the standard model Higgs boson in tau final states.
Abazov, V M; Abbott, B; Abolins, M; Acharya, B S; Adams, M; Adams, T; Aguilo, E; Ahsan, M; Alexeev, G D; Alkhazov, G; Alton, A; Alverson, G; Alves, G A; Ancu, L S; Andeen, T; Anzelc, M S; Aoki, M; Arnoud, Y; Arov, M; Arthaud, M; Askew, A; Asman, B; Atramentov, O; Avila, C; Backusmayes, J; Badaud, F; Bagby, L; Baldin, B; Bandurin, D V; Banerjee, S; Barberis, E; Barfuss, A-F; Bargassa, P; Baringer, P; Barreto, J; Bartlett, J F; Bassler, U; Bauer, D; Beale, S; Bean, A; Begalli, M; Begel, M; Belanger-Champagne, C; Bellantoni, L; Bellavance, A; Benitez, J A; Beri, S B; Bernardi, G; Bernhard, R; Bertram, I; Besançon, M; Beuselinck, R; Bezzubov, V A; Bhat, P C; Bhatnagar, V; Blazey, G; Blessing, S; Bloom, K; Boehnlein, A; Boline, D; Bolton, T A; Boos, E E; Borissov, G; Bose, T; Brandt, A; Brock, R; Brooijmans, G; Bross, A; Brown, D; Bu, X B; Buchholz, D; Buehler, M; Buescher, V; Bunichev, V; Burdin, S; Burnett, T H; Buszello, C P; Calfayan, P; Calpas, B; Calvet, S; Cammin, J; Carrasco-Lizarraga, M A; Carrera, E; Carvalho, W; Casey, B C K; Castilla-Valdez, H; Chakrabarti, S; Chakraborty, D; Chan, K M; Chandra, A; Cheu, E; Cho, D K; Choi, S; Choudhary, B; Christoudias, T; Cihangir, S; Claes, D; Clutter, J; Cooke, M; Cooper, W E; Corcoran, M; Couderc, F; Cousinou, M-C; Crépé-Renaudin, S; Cuplov, V; Cutts, D; Cwiok, M; Das, A; Davies, G; De, K; de Jong, S J; De La Cruz-Burelo, E; DeVaughan, K; Déliot, F; Demarteau, M; Demina, R; Denisov, D; Denisov, S P; Desai, S; Diehl, H T; Diesburg, M; Dominguez, A; Dorland, T; Dubey, A; Dudko, L V; Duflot, L; Duggan, D; Duperrin, A; Dutt, S; Dyshkant, A; Eads, M; Edmunds, D; Ellison, J; Elvira, V D; Enari, Y; Eno, S; Ermolov, P; Escalier, M; Evans, H; Evdokimov, A; Evdokimov, V N; Facini, G; Ferapontov, A V; Ferbel, T; Fiedler, F; Filthaut, F; Fisher, W; Fisk, H E; Fortner, M; Fox, H; Fu, S; Fuess, S; Gadfort, T; Galea, C F; Garcia-Bellido, A; Gavrilov, V; Gay, P; Geist, W; Geng, W; Gerber, C E; Gershtein, Y; Gillberg, D; Ginther, G; Gómez, B; Goussiou, A; Grannis, P D; Greder, S; Greenlee, H; Greenwood, Z D; Gregores, E M; Grenier, G; Gris, Ph; Grivaz, J-F; Grohsjean, A; Grünendahl, S; Grünewald, M W; Guo, F; Guo, J; Gutierrez, G; Gutierrez, P; Haas, A; Hadley, N J; Haefner, P; Hagopian, S; Haley, J; Hall, I; Hall, R E; Han, L; Harder, K; Harel, A; Hauptman, J M; Hays, J; Hebbeker, T; Hedin, D; Hegeman, J G; Heinson, A P; Heintz, U; Hensel, C; Heredia-De La Cruz, I; Herner, K; Hesketh, G; Hildreth, M D; Hirosky, R; Hoang, T; Hobbs, J D; Hoeneisen, B; Hohlfeld, M; Hossain, S; Houben, P; Hu, Y; Hubacek, Z; Huske, N; Hynek, V; Iashvili, I; Illingworth, R; Ito, A S; Jabeen, S; Jaffré, M; Jain, S; Jakobs, K; Jamin, D; Jarvis, C; Jesik, R; Johns, K; Johnson, C; Johnson, M; Johnston, D; Jonckheere, A; Jonsson, P; Juste, A; Kajfasz, E; Karmanov, D; Kasper, P A; Katsanos, I; Kaushik, V; Kehoe, R; Kermiche, S; Khalatyan, N; Khanov, A; Kharchilava, A; Kharzheev, Y N; Khatidze, D; Kim, T J; Kirby, M H; Kirsch, M; Klima, B; Kohli, J M; Konrath, J-P; Kozelov, A V; Kraus, J; Kuhl, T; Kumar, A; Kupco, A; Kurca, T; Kuzmin, V A; Kvita, J; Lacroix, F; Lam, D; Lammers, S; Landsberg, G; Lebrun, P; Lee, W M; Leflat, A; Lellouch, J; Li, J; Li, L; Li, Q Z; Lietti, S M; Lim, J K; Lincoln, D; Linnemann, J; Lipaev, V V; Lipton, R; Liu, Y; Liu, Z; Lobodenko, A; Lokajicek, M; Love, P; Lubatti, H J; Luna-Garcia, R; Lyon, A L; Maciel, A K A; Mackin, D; Mättig, P; Magerkurth, A; Mal, P K; Malbouisson, H B; Malik, S; Malyshev, V L; Maravin, Y; Martin, B; McCarthy, R; McGivern, C L; Meijer, M M; 
Melnitchouk, A; Mendoza, L; Menezes, D; Mercadante, P G; Merkin, M; Merritt, K W; Meyer, A; Meyer, J; Mitrevski, J; Mommsen, R K; Mondal, N K; Moore, R W; Moulik, T; Muanza, G S; Mulhearn, M; Mundal, O; Mundim, L; Nagy, E; Naimuddin, M; Narain, M; Neal, H A; Negret, J P; Neustroev, P; Nilsen, H; Nogima, H; Novaes, S F; Nunnemann, T; Obrant, G; Ochando, C; Onoprienko, D; Orduna, J; Oshima, N; Osman, N; Osta, J; Otec, R; Otero Y Garzón, G J; Owen, M; Padilla, M; Padley, P; Pangilinan, M; Parashar, N; Park, S-J; Park, S K; Parsons, J; Partridge, R; Parua, N; Patwa, A; Pawloski, G; Penning, B; Perfilov, M; Peters, K; Peters, Y; Pétroff, P; Piegaia, R; Piper, J; Pleier, M-A; Podesta-Lerma, P L M; Podstavkov, V M; Pogorelov, Y; Pol, M-E; Polozov, P; Popov, A V; Potter, C; Prado da Silva, W L; Protopopescu, S; Qian, J; Quadt, A; Quinn, B; Rakitine, A; Rangel, M S; Ranjan, K; Ratoff, P N; Renkel, P; Rich, P; Rijssenbeek, M; Ripp-Baudot, I; Rizatdinova, F; Robinson, S; Rodrigues, R F; Rominsky, M; Royon, C; Rubinov, P; Ruchti, R; Safronov, G; Sajot, G; Sánchez-Hernández, A; Sanders, M P; Sanghi, B; Savage, G; Sawyer, L; Scanlon, T; Schaile, D; Schamberger, R D; Scheglov, Y; Schellman, H; Schliephake, T; Schlobohm, S; Schwanenberger, C; Schwienhorst, R; Sekaric, J; Severini, H; Shabalina, E; Shamim, M; Shary, V; Shchukin, A A; Shivpuri, R K; Siccardi, V; Simak, V; Sirotenko, V; Skubic, P; Slattery, P; Smirnov, D; Snow, G R; Snow, J; Snyder, S; Söldner-Rembold, S; Sonnenschein, L; Sopczak, A; Sosebee, M; Soustruznik, K; Spurlock, B; Stark, J; Stolin, V; Stoyanova, D A; Strandberg, J; Strandberg, S; Strang, M A; Strauss, E; Strauss, M; Ströhmer, R; Strom, D; Stutte, L; Sumowidagdo, S; Svoisky, P; Takahashi, M; Tanasijczuk, A; Taylor, W; Tiller, B; Tissandier, F; Titov, M; Tokmenin, V V; Torchiani, I; Tsybychev, D; Tuchming, B; Tully, C; Tuts, P M; Unalan, R; Uvarov, L; Uvarov, S; Uzunyan, S; Vachon, B; van den Berg, P J; Van Kooten, R; van Leeuwen, W M; Varelas, N; Varnes, E W; Vasilyev, I A; Verdier, P; Vertogradov, L S; Verzocchi, M; Vilanova, D; Vint, P; Vokac, P; Voutilainen, M; Wagner, R; Wahl, H D; Wang, M H L S; Warchol, J; Watts, G; Wayne, M; Weber, G; Weber, M; Welty-Rieger, L; Wenger, A; Wetstein, M; White, A; Wicke, D; Williams, M R J; Wilson, G W; Wimpenny, S J; Wobisch, M; Wood, D R; Wyatt, T R; Xie, Y; Xu, C; Yacoob, S; Yamada, R; Yang, W-C; Yasuda, T; Yatsunenko, Y A; Ye, Z; Yin, H; Yip, K; Yoo, H D; Youn, S W; Yu, J; Zeitnitz, C; Zelitch, S; Zhao, T; Zhou, B; Zhu, J; Zielinski, M; Zieminska, D; Zivkovic, L; Zutshi, V; Zverev, E G
2009-06-26
We present a search for the standard model Higgs boson using hadronically decaying tau leptons, in 1 fb-1 of data collected with the D0 detector at the Fermilab Tevatron pp̄ collider. We select two final states: tau+/- plus missing transverse energy and b jets, and tau+ tau- plus jets. These final states are sensitive to a combination of associated W/Z boson plus Higgs boson, vector boson fusion, and gluon-gluon fusion production processes. The observed ratio of the combined limit on the Higgs production cross section at the 95% C.L. to the standard model expectation is 29 for a Higgs boson mass of 115 GeV.
NASA Astrophysics Data System (ADS)
Sőrés, László
2013-04-01
INSPIRE is a European directive to harmonize spatial data in Europe. Its aim is to establish a transparent, multidisciplinary network of environmental information by using international standards and OGC web services. Spatial data themes defined in the annex of the directive cover 34 domains that are closely bundled to environment and spatial information. According to the INSPIRE roadmap, all data providers must set up discovery, viewing, and download services and restructure data stores to provide spatial data as defined by the underlying specifications by 1 December 2014. More than 3000 institutions are going to be involved in the process. During the data specification process, geophysics, as an indispensable source of geo-information, was introduced into Annex II Geology. Within the Geology theme, Geophysics is divided into a core and an extended model. The core model contains specifications for legally binding data provisioning and is going to be part of the Implementation Rules of the INSPIRE directives. To minimize the work load of obligatory data transformations, the scope of the core model is very limited and simple. It covers the most essential geophysical feature types that are relevant in economic and environmental contexts. To fully support the use cases identified by the stakeholders, the extended model was developed. It contains a wide range of spatial object types for geophysical measurements, processed and interpreted results, and wrapper classes to help data providers in using the Observations and Measurements (O&M) standard for geophysical data exchange. Instead of introducing the traditional concept of "geophysical methods" at a high structural level, the data model classifies measurements and geophysical models based on their spatial characteristics. Measurements are classified as geophysical station (point), geophysical profile (curve), and geophysical swath (surface). Generic classes for processing results and interpretation models are curve model (1D), surface model (2D), and solid model (3D). Both measurements and models are derived from O&M sampling features that may be linked to sampling procedures and observation results. Geophysical products are the output of complex procedures and can precisely be described as chains of consecutive O&M observations. For describing geophysical processes and results, the data model supports both OGC standard XML encodings (SensorML, SWE, GML) and traditional industry standards (SPS, UKOOA, SEG formats). To control the scope of the model and to harmonize terminology, an initial set of extendable code lists was developed. The attempt to create a hierarchical SKOS vocabulary of terms for geophysical methods, resource types, processes, properties, and technical parameters was partly based on the work done in the eContentPlus GEOMIND project. The result is far from complete, and the work must be continued in the future.
ERIC Educational Resources Information Center
Nelson, Catherine J.
2012-01-01
The author is a strong proponent of incorporating the Content and Process Standards (NCTM 2000) into the teaching of mathematics. For candidates in her methods course, she models research-based best practices anchored in the Standards. Her students use manipulatives, engage in problem-solving activities, listen to children's literature, and use…
Model-independent determination of the triple Higgs coupling at e+e- colliders
NASA Astrophysics Data System (ADS)
Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Peskin, Michael E.; Tian, Junping
2018-03-01
The observation of Higgs pair production at high-energy colliders can give evidence for the presence of a triple Higgs coupling. However, the actual determination of the value of this coupling is more difficult. In the context of general models for new physics, double Higgs production processes can receive contributions from many possible beyond-Standard-Model effects. This dependence must be understood if one is to make a definite statement about the deviation of the Higgs field potential from the Standard Model. In this paper, we study the extraction of the triple Higgs coupling from the process e+e- → Zhh. We show that, by combining the measurement of this process with other measurements available at a 500 GeV e+e- collider, it is possible to quote model-independent limits on the effective field theory parameter c6 that parametrizes modifications of the Higgs potential. We present precise error estimates based on the anticipated International Linear Collider physics program, studied with full simulation. Our analysis also gives new insight into the model-independent extraction of the Higgs boson coupling constants and total width from e+e- data.
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1990-01-01
While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.
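
A toy illustration of the phase-volume idea, assuming a one-parameter AR filter family (the published algorithm is more general): among candidate deconvolutions, the residual occupying the smallest region of its delay plane recovers the chaotic innovation.

    import numpy as np

    # Chaotic innovation: the logistic map, an uncorrelated chaotic sequence.
    n = 2000
    e = np.empty(n)
    e[0] = 0.3
    for i in range(1, n):
        e[i] = 4.0 * e[i - 1] * (1.0 - e[i - 1])

    # Observed series: the innovation passed through a linear (AR(1)) filter.
    a_true = 0.7
    x = np.empty(n)
    x[0] = e[0]
    for i in range(1, n):
        x[i] = a_true * x[i - 1] + e[i]

    def phase_volume(res, bins=30):
        """Proxy for phase-portrait volume: fraction of occupied cells in the
        (res_t, res_{t+1}) delay plane. The chaotic innovation lies on a thin
        curve (few cells); a mis-deconvolved residual fills the plane."""
        h, _, _ = np.histogram2d(res[:-1], res[1:], bins=bins)
        return np.count_nonzero(h) / bins**2

    # Grid-search the filter parameter that minimizes the phase volume.
    grid = np.linspace(0.0, 0.95, 96)
    vols = [phase_volume(x[1:] - a * x[:-1]) for a in grid]
    print("true a = 0.7, recovered a =", grid[int(np.argmin(vols))])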
Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-01-01
Objectives: To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods: Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results: Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations excluded were due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion: The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria using the protocol, and identified differences in patient characteristics and coding practices across databases. Conclusion: Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
Data preprocessing methods of FT-NIR spectral data for the classification of cooking oil
NASA Astrophysics Data System (ADS)
Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli
2014-12-01
This recent work describes the data pre-processing methods applied to FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometrics modelling. Hence, this work investigates the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling, and single scaling with Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra in absorbance mode over the range 4000-14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, keeping the number in each class equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable selection method to determine which variables are significant for the classification models. The data pre-processing was evaluated using the modified silhouette width (mSW), PCA, and the percentage correctly classified (%CC). The results show that different data processing strategies lead to substantially different model performance. The effects of the several pre-processing methods, i.e. row scaling, column standardisation, and single scaling with Standard Normal Variate, are indicated by mSW and %CC. With a two-PC model, all five classifiers gave a high %CC except quadratic distance analysis.
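
A minimal sketch of the scaling operations named above (row scaling, column standardisation, and SNV) as they are conventionally defined in chemometrics; the toy spectra are placeholders:

    import numpy as np

    def snv(spectra):
        """Standard Normal Variate: centre and scale each spectrum (row) by its
        own mean and standard deviation, removing multiplicative scatter."""
        s = np.asarray(spectra, dtype=float)
        return (s - s.mean(axis=1, keepdims=True)) / s.std(axis=1, ddof=1, keepdims=True)

    def row_scale(spectra):
        """Row scaling: normalise each spectrum to unit total intensity."""
        s = np.asarray(spectra, dtype=float)
        return s / s.sum(axis=1, keepdims=True)

    def column_standardise(spectra):
        """Column scaling: centre/scale each wavenumber across all samples."""
        s = np.asarray(spectra, dtype=float)
        return (s - s.mean(axis=0)) / s.std(axis=0, ddof=1)

    X = np.random.default_rng(7).gamma(2.0, 1.0, size=(5, 100))  # 5 toy spectra
    print(np.round(snv(X).mean(axis=1), 12))  # ~0 for every spectrum after SNV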
Deposition on disordered substrates with precursor layer diffusion
NASA Astrophysics Data System (ADS)
Filipe, J. A. N.; Rodgers, G. J.; Tavassoli, Z.
1998-09-01
Recently we introduced a one-dimensional accelerated random sequential adsorption process as a model for chemisorption with precursor layer diffusion. In this paper we consider this deposition process on disordered or impure substrates. The problem is solved exactly on both the lattice and continuum and for various impurity distributions. The results are compared with those from the standard random sequential adsorption model.
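
A sketch of the baseline process, standard random sequential adsorption of dimers on a one-dimensional lattice with quenched impurities, without the precursor-layer diffusion acceleration the paper adds:

    import numpy as np

    rng = np.random.default_rng(3)

    # A fraction p_imp of sites is blocked in advance (the disorder); deposition
    # attempts pick a random site and stick only if both sites of the dimer are
    # empty and unblocked. No further relaxation: pure RSA.
    L, p_imp = 10_000, 0.1
    blocked = rng.uniform(size=L) < p_imp
    occupied = np.zeros(L, dtype=bool)

    for _ in range(20 * L):  # long enough to be near the jamming limit
        i = rng.integers(0, L - 1)
        if not (blocked[i] or blocked[i + 1] or occupied[i] or occupied[i + 1]):
            occupied[i] = occupied[i + 1] = True

    print("impurity fraction:", p_imp)
    print("jamming coverage :", occupied.mean())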
Academic Outcome Measures of a Dedicated Education Unit Over Time: Help or Hinder?
Smyer, Tish; Gatlin, Tricia; Tan, Rhigel; Tejada, Marianne; Feng, Du
2015-01-01
Critical thinking, nursing process, quality and safety measures, and standardized RN exit examination scores were compared between students (n = 144) placed in a dedicated education unit (DEU) and those in a traditional clinical model. Standardized test scores showed that differences between the clinical groups were not statistically significant. This study shows that the DEU model is 1 approach to clinical education that can enhance students' academic outcomes.
The SGML standardization framework and the introduction of XML.
Fierz, W; Grütter, R
2000-01-01
Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future.
Discovering Implicit Networks from Point Process Data
2013-08-03
Slide extract: point process models for discovering implicit networks, with applications in seismology, epidemiology, and economics; modeling dependence "beyond Poisson" via Strauss, Gibbs, and determinantal processes (cf. Szell et al., Nature 2012, on social network analysis).
ERIC Educational Resources Information Center
Pawlowski, Jan M.
2007-01-01
In 2005, the new quality standard for learning, education, and training, ISO/IEC 19796-1, was published. Its purpose is to help educational organizations to develop quality systems and to improve the quality of their processes, products, and services. In this article, the standard is presented and compared to existing approaches, showing the…
[HL7 standard--features, principles, and methodology].
Koncar, Miroslav
2005-01-01
The mission of the non-profit organization HL7 Inc. is to provide standards for the exchange, management, and integration of data that support clinical patient care, and the management, delivery, and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues but also demonstrated the size and complexity of the health information sharing problem. As a solution, a completely new methodology was adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was made to go directly to Version 3. The target scope of work includes clinical, financial, and administrative data management in the domain of healthcare processes. Using the HL7v3 standardized methodology, we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.
NASA Astrophysics Data System (ADS)
Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok
2013-08-01
The conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Modelling Platform) standard, which addresses portability and reuse of simulation models by various model users. KARI has not only first-hand experience in the development of an SMP-compatible simulation environment but also an ongoing study applying the SMP2 simulation model development process to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion; in reality, a demonstrator prototype was built and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the target hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models, from the system down to parts at lower levels, was established. Finally, SMP documents such as the Catalogue, Assembly, and Schedule were generated using an XML (eXtensible Markup Language) converter. To obtain the benefits of the approaches and design mechanisms suggested in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly followed throughout the whole model development process.
NASA Standard for Models and Simulations (M and S): Development Process and Rationale
NASA Technical Reports Server (NTRS)
Zang, Thomas A.; Blattnig, Steve R.; Green, Lawrence L.; Hemsch, Michael J.; Luckring, James M.; Morison, Joseph H.; Tripathi, Ram K.
2009-01-01
After the Columbia Accident Investigation Board (CAIB) report, the NASA Administrator at that time chartered an executive team (known as the Diaz Team) to identify the CAIB report elements with Agency-wide applicability, and to develop corrective measures to address each element. This report documents the chronological development and release of an Agency-wide Standard for Models and Simulations (M&S) (NASA Standard 7009) in response to Action #4 from the report, "A Renewed Commitment to Excellence: An Assessment of the NASA Agency-wide Applicability of the Columbia Accident Investigation Board Report, January 30, 2004".
Stable Kalman filters for processing clock measurement data
NASA Technical Reports Server (NTRS)
Clements, P. A.; Gibbs, B. P.; Vandergraft, J. S.
1989-01-01
Kalman filters have been used for some time to process clock measurement data. Due to instabilities in the standard Kalman filter algorithms, the results have been unreliable and difficult to obtain. During the past several years, stable forms of the Kalman filter have been developed, implemented, and used in many diverse applications. These algorithms, while algebraically equivalent to the standard Kalman filter, exhibit excellent numerical properties. Two of these stable algorithms, the Upper triangular-Diagonal (UD) filter and the Square Root Information Filter (SRIF), have been implemented to replace the standard Kalman filter used to process data from the Deep Space Network (DSN) hydrogen maser clocks. The data are time offsets between the clocks in the DSN, the timescale at the National Institute of Standards and Technology (NIST), and two geographically intermediate clocks. The measurements are made by using the GPS navigation satellites in mutual view between clocks. The filter programs allow the user to easily modify the clock models, the GPS satellite dependent biases, and the random noise levels in order to compare different modeling assumptions. The results of this study show the usefulness of such software for processing clock data. The UD filter is indeed a stable, efficient, and flexible method for obtaining optimal estimates of clock offsets, offset rates, and drift rates. A brief overview of the UD filter is also given.
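
A sketch of the underlying estimation problem: a two-state clock model (time offset and frequency offset) run through the standard covariance-form Kalman filter. Noise levels are illustrative, not DSN values; the UD and SRIF algorithms propagate factors of the covariance P rather than P itself but are algebraically equivalent to the update below.

    import numpy as np

    rng = np.random.default_rng(5)

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])  # the offset integrates the rate
    Q = np.diag([1e-6, 1e-10])             # process noise (illustrative)
    H = np.array([[1.0, 0.0]])             # common-view measures the offset only
    R = np.array([[1e-4]])

    # Simulate truth and measurements.
    x_true, zs = np.zeros(2), []
    for _ in range(200):
        x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Q)
        zs.append(H @ x_true + rng.multivariate_normal(np.zeros(1), R))

    x, P = np.zeros(2), np.eye(2)
    for z in zs:
        x, P = F @ x, F @ P @ F.T + Q                 # time update
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x = x + K @ (z - H @ x)                       # measurement update
        P = (np.eye(2) - K @ H) @ P

    print("estimated offset/rate:", x)
    print("true      offset/rate:", x_true)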
Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa
2016-01-01
Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT-purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC25000 standard is shown as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.
Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)
NASA Technical Reports Server (NTRS)
Gray, Justin S.; Briggs, Jeffery L.
2008-01-01
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
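
A minimal sketch of the idea of divorcing execution from the model: a standard process (here a parameter study) drives any model through a small, fixed API. Class and method names are illustrative, not the actual ROSE API.

    from itertools import product

    class Model:
        """Anything that maps named inputs to named outputs."""
        def set_inputs(self, **kwargs): ...
        def execute(self): ...
        def get_outputs(self): ...

    class ParameterStudy:
        """A reusable execution process; it knows nothing about what the
        model computes internally, only the Model API."""
        def __init__(self, grid):
            self.grid = grid

        def run(self, model):
            names, results = list(self.grid), []
            for values in product(*self.grid.values()):
                model.set_inputs(**dict(zip(names, values)))
                model.execute()
                results.append(model.get_outputs())
            return results

    class NozzleModel(Model):
        def set_inputs(self, area_ratio=1.0, pressure=1.0):
            self.ar, self.p = area_ratio, pressure
        def execute(self):
            self.thrust = self.p * self.ar ** 0.5  # stand-in physics
        def get_outputs(self):
            return {"thrust": self.thrust}

    study = ParameterStudy({"area_ratio": [1, 2, 4], "pressure": [1.0, 2.0]})
    print(study.run(NozzleModel()))  # the same study works on any Model

The same ParameterStudy object could equally drive a Design of Experiments, an optimizer wrapper, or a sensitivity study, which is the library-of-processes reuse the framework is after.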
Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.
Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J
2015-08-21
In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
NASA Astrophysics Data System (ADS)
Nguyen, Thi Hoai Thu; Chen, Jyh-Chen; Hu, Chieh; Chen, Chun-Hung; Huang, Yen-Hao; Lin, Huang-Wei; Yu, Andy; Hsu, Bruce
2017-06-01
In this study, a global transient numerical simulation of silicon growth, from the beginning of the solidification process until the end of the cooling process, is carried out for an 800 kg ingot in an industrial seeded directional solidification furnace. The standard furnace is modified by the addition of insulating blocks in the hot zone. The simulation results show that there is a significant decrease in the thermal stress and dislocation density in the modified model as compared to the standard one (a maximal decrease of 23% and 75% along the center line of the ingot for thermal stress and dislocation density, respectively). This modification reduces the heating power consumption for solidification of the silicon melt by about 17% and shortens the growth time by about 2.5 h. Moreover, it is found that adjusting the operating conditions of the modified model to obtain a lower growth rate during the early stages of the solidification process can lower the dislocation density and total heater power.
Vanilla technicolor at linear colliders
NASA Astrophysics Data System (ADS)
Frandsen, Mads T.; Järvinen, Matti; Sannino, Francesco
2011-08-01
We analyze the reach of linear colliders for models of dynamical electroweak symmetry breaking. We show that linear colliders can efficiently test the compositeness scale, identified with the mass of the new spin-one resonances, up to the maximum energy in the center of mass of the colliding leptons. In particular, we analyze the Drell-Yan processes involving spin-one intermediate heavy bosons decaying either leptonically or into two standard model gauge bosons. We also analyze light Higgs production in association with a standard model gauge boson, also stemming from an intermediate spin-one heavy vector.
Aaltonen, T.
2015-03-17
Production of the Υ(1S) meson in association with a vector boson is a rare process in the standard model, with a cross section predicted to be below the sensitivity of the Tevatron. Observation of this process could signify contributions not described by the standard model or reveal limitations with the current nonrelativistic quantum-chromodynamic models used to calculate the cross section. We perform a search for this process using the full Run II data set collected by the CDF II detector, corresponding to an integrated luminosity of 9.4 fb-1. Our search considers the Υ→μμ decay and the decay of the W and Z bosons into muons and electrons. Furthermore, in these purely leptonic decay channels, we observe one ΥW candidate with an expected background of 1.2±0.5 events, and one ΥZ candidate with an expected background of 0.1±0.1 events. Both observations are consistent with the predicted background contributions. The resulting upper limits on the cross section for Υ+W/Z production are the most sensitive reported from a single experiment and place restrictions on potential contributions from non-standard-model physics.
Ecological models supporting environmental decision making: a strategy for the future
Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L.; Grimm, Volker
2010-01-01
Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent.
ISO 9000 Quality Systems: Application to Higher Education.
ERIC Educational Resources Information Center
Clery, Roger G.
This paper describes and explains the 20 elements of the International Organization for Standardization (ISO) 9000 series, a model for quality assurance in the business processes of design/development, production, installation, and servicing. The standards were designed in 1987 to provide a common denominator for business quality, particularly to…
2014-06-01
27000 series, COBIT, the British Standards Institution's BS 25999, and ISO 24762 include quantitative process measurements that can be used to...the NIST special publications 800 series, the International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC
Internal audit in a microbiology laboratory.
Mifsud, A J; Shafi, M S
1995-01-01
AIM--To set up a programme of internal laboratory audit in a medical microbiology laboratory. METHODS--A model of laboratory based process audit is described. Laboratory activities were examined in turn by specimen type. Standards were set using laboratory standard operating procedures; practice was observed using a purpose designed questionnaire and the data were analysed by computer; performance was assessed at laboratory audit meetings; and the audit circle was closed by re-auditing topics after an interval. RESULTS--Improvements in performance scores (objective measures) and in staff morale (subjective impression) were observed. CONCLUSIONS--This model of process audit could be applied, with amendments to take local practice into account, in any microbiology laboratory. PMID:7665701
Diffusion Decision Model: Current Issues and History
Ratcliff, Roger; Smith, Philip L.; Brown, Scott D.; McKoon, Gail
2016-01-01
There is growing interest in diffusion models to represent the cognitive and neural processes of speeded decision making. Sequential-sampling models like the diffusion model have a long history in psychology. They view decision making as a process of noisy accumulation of evidence from a stimulus. The standard model assumes that evidence accumulates at a constant rate during the second or two it takes to make a decision. This process can be linked to the behaviors of populations of neurons and to theories of optimality. Diffusion models have been used successfully in a range of cognitive tasks and as psychometric tools in clinical research to examine individual differences. In this article, we relate the models to both earlier and more recent research in psychology. PMID:26952739
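
A minimal simulation of the standard model's core assumption (noisy evidence accumulating at a constant rate between two absorbing bounds, Euler-Maruyama discretization; parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(11)

    def diffusion_trial(v=0.3, a=1.0, z=0.5, s=1.0, dt=1e-3, t_max=5.0):
        """One trial: evidence starts at z*a, drifts at rate v with Gaussian
        noise of scale s, and responds on hitting a (correct) or 0 (error)."""
        x, t = z * a, 0.0
        while 0.0 < x < a and t < t_max:
            x += v * dt + s * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return t, x >= a

    trials = [diffusion_trial() for _ in range(2000)]
    rts = np.array([t for t, _ in trials])
    correct = np.array([c for _, c in trials])
    print(f"accuracy: {correct.mean():.2f}")
    print(f"correct RTs: mean {rts[correct].mean():.2f}s, "
          f"median {np.median(rts[correct]):.2f}s (right-skewed)")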
The tangled bank of amino acids
Pollock, David D.
2016-01-01
The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices and the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we can demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution. PMID:27028523
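
A sketch of how a standard substitution rate matrix is used in such models: transition probabilities over a branch of length t follow P(t) = expm(Qt), and site independence makes the sequence likelihood a product over sites. The 4-state Q below is a toy, not an empirical amino acid matrix.

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(2)

    # Toy generator: random off-diagonal rates, diagonal set so rows sum to zero.
    Q = rng.uniform(0.1, 1.0, size=(4, 4))
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))

    for t in (0.1, 1.0, 10.0):
        P = expm(Q * t)  # substitution probabilities over branch length t
        print(f"t={t}: row sums {P.sum(axis=1).round(6)}")  # each row is 1

Epistasis, as discussed above, makes the effective rates at one site depend on the states of others, which is exactly what a single averaged matrix cannot express.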
ERIC Educational Resources Information Center
Chan, Kit Yu Karen; Yang, Sylvia; Maliska, Max E.; Grunbaum, Daniel
2012-01-01
The National Science Education Standards have highlighted the importance of active learning and reflection for contemporary scientific methods in K-12 classrooms, including the use of models. Computer modeling and visualization are tools that researchers employ in their scientific inquiry process, and often computer models are used in…
Mathematical form models of tree trunks
Rudolfs Ozolins
2000-01-01
Assortment structure analysis of tree trunks is a characteristic problem that can be solved using mathematical modeling and standard computer programs. A mathematical form model of a tree trunk consists of tapering curve equations and their parameters. Parameters for nine species were obtained by processing measurements of 2,794 model trees and studying the...
Bayesian model selection validates a biokinetic model for zirconium processing in humans
2012-01-01
Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
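
A minimal sketch of a Bayes factor estimated by thermodynamic integration, on a toy conjugate model where each power posterior can be sampled exactly; the zirconium models require an MCMC sampler per temperature rung (e.g., the copula-based Metropolis-Hastings sampler mentioned above).

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(8)

    # Toy data and models: M0 says mean 0; M1 has a free mean mu ~ N(0, 1).
    y = rng.normal(0.4, 1.0, size=50)
    N = len(y)

    def loglik(mu):
        return norm.logpdf(y[:, None], loc=mu, scale=1.0).sum(axis=0)

    # Thermodynamic integration: log Z = integral over t of E_t[log L],
    # where the power posterior p_t(mu) is proportional to L(mu)**t * p(mu).
    temps = np.linspace(0.0, 1.0, 31)
    means = []
    for t in temps:
        var = 1.0 / (1.0 + t * N)          # conjugate power posterior
        mu_t = t * y.sum() * var
        samples = rng.normal(mu_t, np.sqrt(var), size=4000)
        means.append(loglik(samples).mean())

    log_z1 = np.trapz(means, temps)        # log evidence of M1
    log_z0 = loglik(np.array([0.0]))[0]    # M0 has no free parameter
    print("log Bayes factor (M1 vs M0):", log_z1 - log_z0)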
ERIC Educational Resources Information Center
Powell, Justin J. W.; Bernhard, Nadine; Graf, Lukas
2012-01-01
Proposing an alternative to the American model, intergovernmental reform initiatives in Europe have developed and promote a comprehensive European model of skill formation. What ideals, standards, and governance are proposed in this new pan-European model? This model responds to heightened global competition among "knowledge societies"…
Modelling daily water temperature from air temperature for the Missouri River.
Zhu, Senlin; Nyarko, Emmanuel Karlo; Hadzima-Nyarko, Marijana
2018-01-01
The bio-chemical and physical characteristics of a river are directly affected by water temperature, which thereby affects the overall health of aquatic ecosystems. Accurately estimating water temperature is a complex problem. Modelling of river water temperature is usually based on a suitable mathematical model and field measurements of various atmospheric factors. In this article, the air-water temperature relationship of the Missouri River is investigated by developing three different machine learning models (Artificial Neural Network (ANN), Gaussian Process Regression (GPR), and Bootstrap Aggregated Decision Trees (BA-DT)). Standard models (linear regression, non-linear regression, and stochastic models) are also developed and compared to the machine learning models. Among the three standard models, the stochastic model clearly outperforms the standard linear and nonlinear models. All three machine learning models have comparable results and outperform the stochastic model, with GPR having slightly better results for stations No. 2 and 3, while BA-DT has slightly better results for station No. 1. The machine learning models are very effective tools that can be used for the prediction of daily river temperature.
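
A minimal sketch of the GPR variant with scikit-learn, on synthetic air/water temperature pairs rather than the Missouri River data:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(4)

    # Synthetic daily pairs: water temperature responds nonlinearly to air
    # temperature and cannot drop below freezing.
    air = rng.uniform(-10, 35, size=300)
    water = (np.clip(4.0 + 0.8 * air - 0.006 * air**2, 0.0, None)
             + rng.normal(0.0, 0.8, size=300))

    X = air.reshape(-1, 1)
    gpr = GaussianProcessRegressor(
        kernel=RBF(length_scale=10.0) + WhiteKernel(noise_level=0.5),
        normalize_y=True,
    )
    gpr.fit(X[:250], water[:250])

    pred, std = gpr.predict(X[250:], return_std=True)
    rmse = np.sqrt(np.mean((pred - water[250:]) ** 2))
    print(f"GPR test RMSE: {rmse:.2f} deg C (mean predictive std {std.mean():.2f})")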
Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J
2002-01-01
A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134
Billieux, Joël; Philippot, Pierre; Schmid, Cécile; Maurage, Pierre; De Mol, Jan; Van der Linden, Martial
2015-01-01
Dysfunctional use of the mobile phone has often been conceptualized as a 'behavioural addiction' that shares most features with drug addictions. In the current article, we challenge the clinical utility of the addiction model as applied to mobile phone overuse. We describe the case of a woman who overuses her mobile phone from two distinct approaches: (1) a symptom-based categorical approach inspired from the addiction model of dysfunctional mobile phone use and (2) a process-based approach resulting from an idiosyncratic clinical case conceptualization. In the case depicted here, the addiction model was shown to lead to standardized and non-relevant treatment, whereas the clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific, empirically based psychological interventions. This finding highlights that conceptualizing excessive behaviours (e.g., gambling and sex) within the addiction model can be a simplification of an individual's psychological functioning, offering only limited clinical relevance. The addiction model, applied to excessive behaviours (e.g., gambling, sex and Internet-related activities) may lead to non-relevant standardized treatments. Clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific empirically based psychological interventions. The biomedical model might lead to the simplification of an individual's psychological functioning with limited clinical relevance. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In the Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow what-if scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues; these results will now be deployed in pilot applications. The testbed also identified areas where additional development is needed to guide the scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.
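
A sketch of how a client might pull a coverage subset through a WCS 2.0 key-value-pair GetCoverage request, as in the testbed's data services; the endpoint URL and coverage identifier below are placeholders, not actual Testbed 11 services.

    import requests

    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "GetCoverage",
        "coverageId": "sea_level_rise_2100",  # hypothetical coverage id
        "subset": ["Lat(37.4,38.0)", "Long(-122.6,-121.8)"],  # SF Bay window
        "format": "application/x-netcdf",     # NetCDF encoding, as in the testbed
    }
    resp = requests.get("https://example.org/wcs", params=params, timeout=60)
    resp.raise_for_status()
    with open("flood_inputs.nc", "wb") as f:
        f.write(resp.content)  # hand off to the flood model / WPS chain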
Electroweak standard model with very special relativity
NASA Astrophysics Data System (ADS)
Alfaro, Jorge; González, Pablo; Ávila, Ricardo
2015-05-01
The very special relativity electroweak Standard Model (VSR EW SM) is a theory with SU(2)_L × U(1)_R symmetry, with the same number of leptons and gauge fields as in the usual Weinberg-Salam model. No new particles are introduced. The model is renormalizable and unitarity is preserved. However, photons obtain mass, and the massive bosons obtain different masses for different polarizations. In addition, neutrino masses are generated. A VSR-invariant term produces neutrino oscillations, and new processes become allowed. In particular, we compute the rate of the decay μ → e + γ. All these processes, which are forbidden in the electroweak Standard Model, put stringent bounds on the parameters of our model and measure the violation of Lorentz invariance. We investigate the canonical quantization of this nonlocal model. Second quantization is carried out, and we obtain a well-defined particle content. Additionally, we count the degrees of freedom associated with the gauge bosons involved in this work after spontaneous symmetry breaking has been realized. Violations of Lorentz invariance have been predicted by several theories of quantum gravity [J. Alfaro, H. Morales-Tecotl, and L. F. Urrutia, Phys. Rev. Lett. 84, 2318 (2000); Phys. Rev. D 65, 103509 (2002)]. It is a remarkable possibility that the low-energy effects of Lorentz violation induced by quantum gravity could be contained in the nonlocal terms of the VSR EW SM.
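For orientation, the neutrino-mass mechanism referred to above is usually written with a nonlocal term built from a fixed null vector n^μ; a commonly quoted form of the VSR fermion Lagrangian (following Cohen and Glashow; generic notation, not necessarily the authors' exact conventions) is

\mathcal{L} = \bar{\psi}\left( i\slashed{\partial} + \frac{i}{2}\, m^{2}\, \frac{\slashed{n}}{n\cdot\partial} - M \right)\psi , \qquad n^{\mu} = (1,0,0,1),

which leads to the dispersion relation p^{2} = M^{2} + m^{2}: the nonlocal piece shifts the physical mass, so a nonzero neutrino mass m appears without introducing any new particle.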
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats to achieve data interoperability. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is assessed by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved an accuracy level of over 90% in the conversion between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
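The pattern-oriented transformation can be pictured as a mapping table driving a record rewrite. The sketch below is a deliberately simplified, hypothetical illustration: the field paths and the mapping dictionary are invented for the example and do not reproduce actual MBO content or the CDA/vMR schemas.

# Hypothetical sketch of mapping-driven transformation between a
# CDA-like source record and a vMR-like target record. The mapping
# table stands in for ontology-derived mappings such as those the MBO
# stores; every name here is invented.
CDA_TO_VMR = {
    'patient/name/given':  'demographics/givenName',
    'patient/name/family': 'demographics/familyName',
    'observation/code':    'observationEvent/code',
}

def transform(record, mapping):
    return {dst: record[src] for src, dst in mapping.items() if src in record}

cda_record = {'patient/name/given': 'Ada', 'observation/code': '8867-4'}
print(transform(cda_record, CDA_TO_VMR))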
Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S
2017-11-01
Electroencephalography (EEG), the direct recording of the electrical activity of populations of neurons, is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy: recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
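As one example of the signal-conditioning step the report reviews, the sketch below applies a zero-phase Butterworth band-pass filter to a simulated trace using SciPy. The sampling rate and cutoff frequencies are arbitrary illustrative values, not recommendations of the Task Force.

# Sketch: zero-phase band-pass filtering of an EEG-like signal (SciPy).
# Sampling rate and cutoffs are arbitrary illustrative values.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)

b, a = butter(4, [1.0, 70.0], btype='bandpass', fs=fs)   # 1-70 Hz passband
filtered = filtfilt(b, a, trace)              # forward-backward: no phase lag
print(filtered[:5])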
Integrated control system for electron beam processes
NASA Astrophysics Data System (ADS)
Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.
2018-03-01
The ISO/IEC 62264 standard is widely used for integrating the business systems of a manufacturer with the corresponding manufacturing control systems, based on hierarchical equipment models, functional data, and manufacturing operations activity models. To achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models that coordinate the integration between different levels of control. In this article, the development of an integrated control system for the electron beam welding process is presented as part of a fully integrated control system for an electron beam plant that also covers additional processes: surface modification, electron beam evaporation, selective melting, and electron beam diagnostics.
State resolved vibrational relaxation modeling for strongly nonequilibrium flows
NASA Astrophysics Data System (ADS)
Boyd, Iain D.; Josyula, Eswar
2011-05-01
Vibrational relaxation is an important physical process in hypersonic flows. Activation of the vibrational mode affects the fundamental thermodynamic properties, and finite-rate relaxation can reduce the degree of dissociation of a gas. Low-fidelity models of vibrational activation employ a relaxation time to capture the process at a macroscopic level. High-fidelity, state-resolved models have been developed for use in continuum gas dynamics simulations based on computational fluid dynamics (CFD). By comparison, such models are not as common for use with the direct simulation Monte Carlo (DSMC) method. In this study, a high-fidelity, state-resolved vibrational relaxation model is developed for the DSMC technique. The model is based on the forced harmonic oscillator (FHO) approach, in which multi-quantum transitions may become dominant at high temperature. Integrated rate coefficients obtained from the DSMC model are consistent with the corresponding CFD model. Relaxation results obtained with the high-fidelity DSMC model show significantly less excitation of the upper vibrational levels than the standard, lower-fidelity DSMC vibrational relaxation model. Application of the new DSMC model to a Mach 7 normal shock wave in carbon monoxide provides better agreement with experimental measurements than the standard DSMC relaxation model.
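To make the state-resolved idea concrete, the elementary operation inside such a DSMC model is drawing a post-collision vibrational level from a row of transition probabilities; the sketch below shows that sampling step. The probability values are invented placeholders; a real implementation would evaluate FHO transition probabilities for the collision pair instead.

# Sketch: sampling a post-collision vibrational level in a state-resolved
# DSMC step. The probability row is invented; a real model would compute
# FHO transition probabilities for the collision pair.
import numpy as np

rng = np.random.default_rng(0)

def sample_transition(prob_row):
    """Draw the next vibrational level j from the row P(i -> j)."""
    prob_row = np.asarray(prob_row, dtype=float)
    prob_row /= prob_row.sum()                # normalize defensively
    return int(rng.choice(prob_row.size, p=prob_row))

# Invented row for level i = 2 of a 6-level molecule; the entries at
# j = 0 and j = 4 are multi-quantum jumps, the regime where FHO matters.
p_from_2 = [0.05, 0.20, 0.50, 0.18, 0.06, 0.01]
print([sample_transition(p_from_2) for _ in range(5)])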
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design, and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.
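As a toy illustration of what such a translation produces (invented rules and a syntax shorthand, not the authors' actual translation), each BPMN sequence flow can be rendered as a guarded Event-B event that moves a control token between nodes:

# Toy BPMN -> Event-B translation: each sequence flow becomes a guarded
# event moving a control token. Invented simplification for illustration.
FLOWS = [('start', 'check_order'), ('check_order', 'ship'), ('ship', 'end')]

def to_eventb(flows):
    events = []
    for i, (src, dst) in enumerate(flows):
        events.append(
            f'event flow{i}\n'
            f'  where @grd1 token = {src!r}\n'
            f'  then  @act1 token := {dst!r}\n'
            f'end')
    return '\n'.join(events)

print(to_eventb(FLOWS))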
Timing Interactions in Social Simulations: The Voter Model
NASA Astrophysics Data System (ADS)
Fernández-Gracia, Juan; Eguíluz, Víctor M.; Miguel, Maxi San
The recent availability of huge, high-resolution datasets on human activities has revealed the heavy-tailed nature of interevent time distributions. In social simulations of interacting agents, the standard approach has been to use Poisson processes to update the state of the agents, which gives rise to very homogeneous activity patterns with a well-defined characteristic interevent time. Taking the voter model as a paradigmatic opinion model, we review the standard update rules and propose two new update rules that are able to account for heterogeneous activity patterns. Under the new update rules, each node is updated with a probability that depends on the time since the last event of the node, where an event can be an update attempt (exogenous update) or a change of state (endogenous update). We find that both update rules can give rise to power-law interevent time distributions, although the endogenous one does so more robustly. Furthermore, for the exogenous update rule and the standard update rules, the voter model does not reach consensus in the infinite-size limit, while for the endogenous update there exists a coarsening process that drives the system toward consensus configurations.
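A minimal sketch of the exogenous rule on a ring network (the network and the 1/τ aging form are illustrative choices, not necessarily the paper's exact setup) shows the mechanism: each node updates with a probability that decays with the time τ since its last update attempt.

# Sketch: voter model on a ring with an aging (exogenous) update rule.
# p_update = 1/tau is one simple choice of decaying update probability;
# the paper's exact functional form may differ.
import random

random.seed(1)
N, STEPS = 100, 2000
state = [random.choice([0, 1]) for _ in range(N)]
last = [0] * N                                # time of each node's last event

for t in range(1, STEPS + 1):
    for i in range(N):
        tau = t - last[i]
        if random.random() < 1.0 / tau:       # aging update probability
            j = random.choice([(i - 1) % N, (i + 1) % N])
            state[i] = state[j]               # voter rule: copy a neighbour
            last[i] = t                       # exogenous: any attempt counts
print('fraction in state 1:', sum(state) / N)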
Penguin-like diagrams from the standard model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ping, Chia Swee
2015-04-24
The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher-order effects in electroweak interactions. One specific class of such processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at tree level in the Standard Model; they are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription of Chia and Chong, which gives an expression for the counterterm identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of the various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to flavour-changing processes are estimated.
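For orientation, the photon-penguin vertex for a quark transition q_i → q_j with photon momentum q is conventionally parametrized by monopole and dipole form factors; a standard textbook parametrization (generic notation, not necessarily the authors' exact conventions) is

\Gamma^{\mu} = \bar{u}_{j}(p')\left[ \left(q^{2}\gamma^{\mu} - q^{\mu}\slashed{q}\right)\left(F_{L}P_{L} + F_{R}P_{R}\right) + i\sigma^{\mu\nu}q_{\nu}\left(F_{M}P_{L} + F_{E}P_{R}\right) \right] u_{i}(p), \qquad P_{L,R} = \tfrac{1}{2}(1 \mp \gamma_{5}).

The first (monopole) structure vanishes for an on-shell photon (q² = 0, q·ε = 0), so radiative flavour-changing decays are controlled by the dipole form factors F_M and F_E.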
An assessment model for quality management
NASA Astrophysics Data System (ADS)
Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.
2002-07-01
SYNSPACE, together with InterSPICE and Alenia Spazio, is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE 9000 for SPACE). S9kS is based on ISO 9001:2000, with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and from ISO 15504 (Process Assessment). The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and from ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products, and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to any organisation that intends to improve its quality management system based on ISO 9001.
McKoon, Gail; Ratcliff, Roger
2016-01-01
Millions of adults in the United States lack the literacy skills necessary for most living-wage jobs. For students from adult learning classes, we used a lexical decision task to measure their knowledge of words, and we used a decision-making model (Ratcliff's 1978 diffusion model) to abstract the mechanisms underlying their performance from their RTs and accuracy. We also collected scores for each participant on standardized IQ tests and on standardized reading tests used commonly in the education literature. We found significant correlations between the model's estimates of the strengths with which words are represented in memory and the scores for some of the standardized tests but not others. The findings point to the feasibility and utility of combining a test of word knowledge (lexical decision) that is well established in psycholinguistic research, a decision-making model that supplies information about underlying mechanisms, and standardized tests. The goal for future research is to use this combination of approaches to better understand how basic processes relate to standardized tests, with the eventual aim of understanding what these tests are measuring and what the specific difficulties are for individual low-literacy adults. Copyright © 2015. Published by Elsevier B.V.
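A bare-bones simulation of the model's core mechanism (noisy evidence accumulation between two absorbing boundaries; all parameter values invented for illustration) is shown below. Stronger memory representations correspond to larger drift rates v, which produce faster and more accurate "word" responses.

# Sketch: one trial of Ratcliff's diffusion model. Evidence starts at z
# and drifts at rate v with Gaussian noise until it is absorbed at the
# upper boundary a ("word") or at 0 ("nonword"). Parameters are invented.
import math
import random

def diffusion_trial(v=0.25, a=1.0, z=0.5, s=1.0, dt=0.001):
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + s * math.sqrt(dt) * random.gauss(0.0, 1.0)
        t += dt
    return ('word' if x >= a else 'nonword', round(t, 3))

random.seed(2)
print([diffusion_trial() for _ in range(3)])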
Humor Facilitates Text Comprehension: Evidence from Eye Movements
ERIC Educational Resources Information Center
Ferstl, Evelyn C.; Israel, Laura; Putzar, Lisa
2017-01-01
One crucial property of verbal jokes is that the punchline usually contains an incongruency that has to be resolved by updating the situation model representation. In the standard pragmatic model, these processes are considered to require cognitive effort. However, only a few studies have compared jokes to texts requiring a situation model revision…
Proposing an Educational Scaling-and-Diffusion Model for Inquiry-Based Learning Designs
ERIC Educational Resources Information Center
Hung, David; Lee, Shu-Shing
2015-01-01
Education cannot adopt the linear model of scaling used by the medical sciences. "Gold standards" cannot be replicated without considering process-in-learning, diversity, and student-variedness in classrooms. This article proposes a nuanced model of educational scaling-and-diffusion, describing the scaling (top-down supports) and…
Exploring Term Dependences in Probabilistic Information Retrieval Model.
ERIC Educational Resources Information Center
Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae
2003-01-01
Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…
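For reference, the Bahadur-Lazarsfeld expansion writes the joint distribution of binary term-occurrence variables x_1, …, x_n as an independence baseline times correlation corrections; in the standard form,

P(x_1,\dots,x_n) = \prod_{i} p_i^{x_i}(1-p_i)^{1-x_i}\left( 1 + \sum_{i<j}\rho_{ij} z_i z_j + \sum_{i<j<k}\rho_{ijk} z_i z_j z_k + \cdots \right), \qquad z_i = \frac{x_i - p_i}{\sqrt{p_i(1-p_i)}},

where p_i = P(x_i = 1) and the ρ's are correlation parameters. Truncating the expansion after the pairwise terms is what lets a probabilistic IR model incorporate term dependences while remaining tractable.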
Program Assessment: Getting to a Practical How-To Model
ERIC Educational Resources Information Center
Gardiner, Lorraine R.; Corbitt, Gail; Adams, Steven J.
2010-01-01
The Association to Advance Collegiate Schools of Business (AACSB) International's assurance of learning (AoL) standards require that schools develop a sophisticated continuous-improvement process. The authors review various assessment models and develop a practical, 6-step AoL model based on the literature and the authors' AoL-implementation…
NASA Technical Reports Server (NTRS)
Stovall, John R.; Wray, Richard B.
1994-01-01
This paper presents a description of a model for a space vehicle operational scenario and the commands for avionics. This model will be used in developing a dynamic architecture simulation model using the Statemate CASE tool for validation of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA has been proposed as an avionics architecture standard to NASA through its Strategic Avionics Technology Working Group (SATWG) and has been accepted by the Society of Automotive Engineers (SAE) for conversion into an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division (FDSD) of the NASA Johnson Space Center (JSC) by the Lockheed Engineering and Sciences Company (LESC), Houston, Texas. This SGOAA includes a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, and a nine class model of interfaces. The SGOAA is both scalable and recursive and can be applied to any hierarchical level of hardware/software processing systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatrchyan, S.; Khachatryan, V.; Sirunyan, A. M.
A search for physics beyond the standard model involving events with one or more photons, jets, and missing transverse energy has been performed by the CMS experiment. The data sample corresponds to an integrated luminosity of 4.93 fb⁻¹ of proton-proton collisions at √s = 7 TeV, produced at the Large Hadron Collider. No excess of events with large missing transverse energy is observed beyond expectations from standard model processes, and upper limits on the signal production cross sections for new physics processes are set at the 95% confidence level. The results of this search are interpreted in the context of three models of new physics: a general model of gauge-mediated supersymmetry breaking, Simplified Models, and a theory involving universal extra dimensions. In the absence of evidence for new physics, exclusion regions are derived in the parameter spaces of the respective models.
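A drastically simplified counting-experiment version of such a 95% CL limit can be computed with the CLs construction; the toy below ignores systematic uncertainties and the full shape analysis, and the numbers are invented.

# Toy sketch: 95% CL upper limit on the signal yield s for observed
# count n_obs and expected background b, via a simplified CLs scan.
# Systematic uncertainties and shape information are ignored here.
from scipy.stats import poisson

def cls_upper_limit(n_obs, b, alpha=0.05, step=0.01):
    s = 0.0
    while True:
        cls = poisson.cdf(n_obs, s + b) / poisson.cdf(n_obs, b)
        if cls < alpha:                       # s is excluded at 95% CL
            return round(s, 2)
        s += step

print(cls_upper_limit(n_obs=5, b=4.2))        # invented example numbers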
Cataloging, Processing, Administering AV Materials. A Model for Wisconsin Schools. Revised, 1974.
ERIC Educational Resources Information Center
Little, Robert David, Ed.; And Others
The Wisconsin Association of School Librarians has produced a manual for standardized processing of all nonprint media, based on two principles: (1) the media should be centralized, organized, and administered for maximum access; and (2) content is more important than form. Definitions, cataloging, processing, housing, circulation, and care are…
NASA Astrophysics Data System (ADS)
Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.
2018-05-01
A comparative assessment is made of the channel capacity of different variants of the structural organization of automated information processing systems. A model is developed for assessing the information processing time as a function of the type of standard elements and their structural organization.
Opposite Effects of Context on Immediate Structural and Lexical Processing.
ERIC Educational Resources Information Center
Harris, John W.
The testing of a number of hypotheses about the effect of hearing a prior context sentence on the immediate processing of a subsequent target sentence is described. According to the standard deep structure model, higher-level processing (e.g., semantic interpretation, integration of context-target information) does not occur immediately as speech is…
Barbagallo, Simone; Corradi, Luca; de Ville de Goyet, Jean; Iannucci, Marina; Porro, Ivan; Rosso, Nicola; Tanfani, Elena; Testi, Angela
2015-05-17
The Operating Room (OR) is a key resource of all major hospitals, but it also accounts for up to 40% of resource costs. Improving cost effectiveness while maintaining quality of care is a universal objective. These goals imply optimizing the planning and scheduling of the activities involved, which is highly challenging due to the inherently variable and unpredictable nature of surgery. Business Process Modeling Notation (BPMN 2.0) was used to represent the "OR process" (defined as the sequence of all of the elementary steps between "patient ready for surgery" and "patient operated upon") as a general pathway ("path"). The path was standardized as much as possible while keeping all of the key elements needed to address the other steps of planning and the wide, inherent variability in patient specificity. The path was used to schedule OR activity, room by room and day by day, feeding the process from a waiting-list database and using a mathematical optimization model with the objective of producing an optimized plan. The OR process was defined with special attention paid to flows, timing, and resource involvement. Standardization addressed the dynamics of each operation and defined an expected operating time for each one. The optimization model has been implemented and tested on real clinical data. Comparison with real data shows that the optimization model allows about 30% more patients to be scheduled than in actual practice and better exploits OR capacity, increasing the average operating-room utilization rate by up to 20%. The optimization of OR activity planning is essential for managing a hospital's waiting list. Optimal planning is facilitated by defining the operation as a standard pathway in which all variables are taken into account. By allowing precise scheduling, it feeds the planning process and, further upstream, the management of the waiting list in an interactive, bi-directional, dynamic process.
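The planning step can be pictured with a much-simplified stand-in: greedily packing waiting-list cases, each with its standardized expected duration, into OR sessions of fixed capacity. All numbers below are invented, and the paper solves this with a mathematical optimization model rather than this heuristic.

# Much-simplified stand-in for the OR planning step: pack waiting-list
# cases (standardized expected durations, in minutes) into OR sessions.
# Invented numbers; the paper uses a mathematical optimization model.
SESSION_MIN = 480                             # one OR day, minutes
waiting_list = [('p1', 120), ('p2', 90), ('p3', 240),
                ('p4', 60), ('p5', 180), ('p6', 45)]

def greedy_schedule(cases, n_rooms):
    rooms = [[] for _ in range(n_rooms)]
    load = [0] * n_rooms
    for pid, dur in sorted(cases, key=lambda c: -c[1]):   # longest first
        k = min(range(n_rooms), key=load.__getitem__)     # emptiest room
        if load[k] + dur <= SESSION_MIN:                  # skip if it overflows
            rooms[k].append(pid)
            load[k] += dur
    return rooms, load

print(greedy_schedule(waiting_list, n_rooms=2))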
Evidence for the H → bb̄ decay with the ATLAS detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaboud, M.; Aad, G.; Abbott, B.
A search for the decay of the Standard Model Higgs boson into a bb̄ pair when produced in association with a W or Z boson is performed with the ATLAS detector. The analysed data, corresponding to an integrated luminosity of 36.1 fb⁻¹, were collected in proton-proton collisions in Run 2 of the Large Hadron Collider at a centre-of-mass energy of 13 TeV. Final states containing zero, one, and two charged leptons (electrons or muons) are considered, targeting the decays Z → νν, W → ℓν, and Z → ℓℓ. For a Higgs boson mass of 125 GeV, an excess of events over the expected background from other Standard Model processes is found with an observed significance of 3.5 standard deviations, compared to an expectation of 3.0 standard deviations. This excess provides evidence for the Higgs boson decay into b-quarks and for its production in association with a vector boson. Furthermore, the combination of this result with that of the Run 1 analysis yields a ratio of the measured signal events to the Standard Model expectation equal to 0.90 ± 0.18 (stat.) +0.21 −0.19 (syst.). Assuming the Standard Model production cross-section, the results are consistent with the value of the Yukawa coupling to b-quarks in the Standard Model.
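The quoted significances can be read as one-sided Gaussian tail probabilities; the standard conversion (a generic statistical identity, not ATLAS code) is:

# Standard conversion between one-sided Gaussian significance Z and
# p-value, to read statements like "3.5 standard deviations".
from scipy.stats import norm

for z in (3.0, 3.5):
    print(f'Z = {z}: p = {norm.sf(z):.2e}')   # sf(z) = 1 - cdf(z)
# Z = 3.5 corresponds to p of about 2.3e-04.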
Evidence for the H → bb̄ decay with the ATLAS detector
NASA Astrophysics Data System (ADS)
Aaboud, M.; Aad, G.; Abbott, B.; et al. (ATLAS Collaboration)
S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M. D.; Werner, P.; Wessels, M.; Weston, T. D.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A. S.; White, A.; White, M. J.; White, R.; Whiteson, D.; Whitmore, B. W.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winkels, E.; Winklmeier, F.; Winston, O. J.; Winter, B. T.; Wittgen, M.; Wobisch, M.; Wolf, A.; Wolf, T. M. H.; Wolff, R.; Wolter, M. W.; Wolters, H.; Wong, V. W. S.; Woods, N. L.; Worm, S. D.; Wosiek, B. K.; Wotschack, J.; Wozniak, K. W.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xi, Z.; Xia, L.; Xu, D.; Xu, L.; Xu, T.; Xu, W.; Yabsley, B.; Yacoob, S.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamane, F.; Yamatani, M.; Yamazaki, T.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. C.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yeletskikh, I.; Yigitbasi, E.; Yildirim, E.; Yorita, K.; Yoshihara, K.; Young, C.; Young, C. J. S.; Yu, J.; Yu, J.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zacharis, G.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanzi, D.; Zeitnitz, C.; Zemaityte, G.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, L.; Zhang, M.; Zhang, P.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Y.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, M.; Zhou, M.; Zhou, N.; Zhou, Y.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; Zou, R.; zur Nedden, M.; Zwalinski, L.
2017-12-01
A search for the decay of the Standard Model Higgs boson into a bb̄ pair when produced in association with a W or Z boson is performed with the ATLAS detector. The analysed data, corresponding to an integrated luminosity of 36.1 fb⁻¹, were collected in proton-proton collisions in Run 2 of the Large Hadron Collider at a centre-of-mass energy of 13 TeV. Final states containing zero, one and two charged leptons (electrons or muons) are considered, targeting the decays Z → νν, W → ℓν and Z → ℓℓ. For a Higgs boson mass of 125 GeV, an excess of events over the expected background from other Standard Model processes is found with an observed significance of 3.5 standard deviations, compared to an expectation of 3.0 standard deviations. This excess provides evidence for the Higgs boson decay into b-quarks and for its production in association with a vector boson. The combination of this result with that of the Run 1 analysis yields a ratio of the measured signal events to the Standard Model expectation equal to 0.90 ± 0.18 (stat.) +0.21/−0.19 (syst.). Assuming the Standard Model production cross-section, the results are consistent with the value of the Yukawa coupling to b-quarks in the Standard Model. [Figure not available: see fulltext.]
Search for WZ+ZZ Production with Missing Transverse Energy and b Jets at CDF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poprocki, Stephen
Observation of diboson processes at hadron colliders is an important milestone on the road to discovery or exclusion of the standard model Higgs boson. Since the decay processes are closely related, methods, tools, and insights obtained through the more common diboson decays can be incorporated into low-mass standard model Higgs searches. The combined WW + WZ + ZZ diboson cross section has been measured at the Tevatron in hadronic decay modes. In this thesis we take this one step closer to the Higgs by measuring just the WZ + ZZ cross section, exploiting a novel artificial neural network based b-jet tagger to separate the WW background. The number of signal events is extracted from data events with large missing transverse energy using a simultaneous fit in events with and without two jets consistent with B hadron decays. Using 5.2 fb⁻¹ of data from the CDF II detector, we measure a cross section of σ(pp̄ → WZ, ZZ) = 5.8 +3.6/−3.0 pb, in agreement with the standard model.
A spatial haplotype copying model with applications to genotype imputation.
Yang, Wen-Yun; Hormozdiari, Farhad; Eskin, Eleazar; Pasaniuc, Bogdan
2015-05-01
Ever since its introduction, the haplotype copy model has proven to be one of the most successful approaches for modeling genetic variation in human populations, with applications ranging from ancestry inference to genotype phasing and imputation. Motivated by coalescent theory, this approach assumes that any chromosome (haplotype) can be modeled as a mosaic of segments copied from a set of chromosomes sampled from the same population. At the core of the model is the assumption that any chromosome from the sample is equally likely to contribute a priori to the copying process. Motivated by recent works that model genetic variation in a geographic continuum, we propose a new spatial-aware haplotype copy model that jointly models geography and the haplotype copying process. We extend hidden Markov models of haplotype diversity such that at any given location, haplotypes that are closest in the genetic-geographic continuum map are a priori more likely to contribute to the copying process than distant ones. Through simulations starting from the 1000 Genomes data, we show that our model achieves superior accuracy in genotype imputation over the standard spatial-unaware haplotype copy model. In addition, we show the utility of our model in selecting a small personalized reference panel for imputation that leads to both improved accuracy as well as to a lower computational runtime than the standard approach. Finally, we show our proposed model can be used to localize individuals on the genetic-geographical map on the basis of their genotype data.
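To make the spatial copying prior concrete, here is a minimal single-site Python sketch (illustrative only, not the authors' software), assuming the copying weight decays exponentially with genetic-geographic distance and a simple emission model with a small mismatch rate:

import numpy as np

def spatial_copying_prior(distances, tau=1.0):
    # Prior probability that each reference haplotype is copied,
    # decaying with genetic-geographic distance (illustrative form).
    w = np.exp(-np.asarray(distances) / tau)
    return w / w.sum()

def imputation_step(ref_alleles, prior, mismatch=0.01):
    # One-site posterior for the unobserved allele, mixing over which
    # reference haplotype is being copied (no recombination modeled here).
    ref_alleles = np.asarray(ref_alleles)
    p_alt = np.where(ref_alleles == 1, 1 - mismatch, mismatch)
    return float(np.dot(prior, p_alt))

# Toy example: 4 reference haplotypes; the two nearby ones dominate the prior.
prior = spatial_copying_prior([0.1, 0.2, 5.0, 8.0])
print(imputation_step([1, 1, 0, 0], prior))  # close to 1: near neighbours carry the ALT allele

Nearby reference haplotypes dominate the mixture, which is exactly the prior shift the spatial-aware model introduces relative to the standard, uniformly weighted copying model.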
NASA Astrophysics Data System (ADS)
Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.
2014-12-01
In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
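To illustrate the "framework-agnostic interface plus thin adapter" idea, here is a schematic Python toy. The method names follow the published BMI convention (initialize/update/get_value/finalize); the model itself and its numbers are invented, and this is not CSDMS or ESMF code:

class ToyBmiModel:
    def initialize(self, config=None):
        self.time, self.dt, self.storage = 0.0, 1.0, 0.0

    def update(self):
        self.storage += 0.5 * self.dt   # trivial "process": accumulate water
        self.time += self.dt

    def get_value(self, name):
        return {"water_storage": self.storage}[name]

    def get_current_time(self):
        return self.time

    def finalize(self):
        pass

model = ToyBmiModel()
model.initialize()
for _ in range(3):
    model.update()
print(model.get_current_time(), model.get_value("water_storage"))

Because every model exposes the same small set of calls, a universal adapter only has to map these calls onto a given framework's native component interface once, rather than once per model.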
Proposed standards for peer-reviewed publication of computer code
USDA-ARS?s Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-05-01
To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations excluded were due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
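As a rough illustration of the source-to-standard mapping step reported above, a minimal Python sketch. The mapping table, concept identifiers, and record layout are invented for the example and are not taken from the OMOP vocabulary itself:

source_to_standard = {"ICD9:250.00": 201826, "ICD9:401.9": 320128}  # hypothetical concept ids

records = [
    {"patient": 1, "source_code": "ICD9:250.00"},
    {"patient": 2, "source_code": "ICD9:401.9"},
    {"patient": 3, "source_code": "LOCAL:XYZ"},   # unmappable local code
]

mapped, unmapped = [], []
for r in records:
    cid = source_to_standard.get(r["source_code"])
    (mapped if cid else unmapped).append({**r, "concept_id": cid})

print(f"mapped {len(mapped)}/{len(records)} records "
      f"({100 * len(mapped) / len(records):.0f}%)")

The fraction of successfully mapped records is the kind of figure the study reports (96%-99% for conditions, 90%-99% for drugs).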
Contextualizing Learning Scenarios According to Different Learning Management Systems
ERIC Educational Resources Information Center
Drira, R.; Laroussi, M.; Le Pallec, X.; Warin, B.
2012-01-01
In this paper, we first demonstrate that an instructional design process of Technology Enhanced Learning (TEL) systems based on a Model Driven Approach (MDA) addresses the limits of Learning Technology Standards (LTS), such as SCORM and IMS-LD. Although these standards ensure the interoperability of TEL systems across different Learning Management…
40 CFR 60.5110 - How do I comply with the increment of progress for submittal of a control plan?
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Existing Sewage Sludge Incineration Units Model Rule... pollution control and process changes that you will use to comply with the emission limits and standards and...
ERIC Educational Resources Information Center
Kultur, Can; Oytun, Erden; Cagiltay, Kursat; Ozden, M. Yasar; Kucuk, Mehmet Emin
2004-01-01
The Shareable Content Object Reference Model (SCORM) aims to standardize electronic course content, its packaging and delivery. Instructional designers and e-learning material producer organizations accept SCORM?s significant impact on instructional design/delivery process, however not much known about how such standards will be implemented to…
Fracking: Drilling into Math and Social Justice
ERIC Educational Resources Information Center
Hendrickson, Katie A.
2015-01-01
Mathematical modeling, a focus of the Common Core State Standards for School Mathematics (CCSSI 2010) and one of the Standards for Mathematical Practice, is generally considered to be the process of exploring a real-world situation and making sense of it using mathematics (Lesh and Zawojewski 2007). Teachers need to create opportunities for…
Connecting dark matter annihilation to the vertex functions of Standard Model fermions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Jason; Light, Christopher, E-mail: jkumar@hawaii.edu, E-mail: lightc@hawaii.edu
We consider scenarios in which dark matter is a Majorana fermion which couples to Standard Model fermions through the exchange of charged mediating particles. The matrix elements for various dark matter annihilation processes are then related to one-loop corrections to the fermion-photon vertex, where dark matter and the charged mediators run in the loop. In particular, in the limit where Standard Model fermion helicity mixing is suppressed, the cross section for dark matter annihilation to various final states is related to corrections to the Standard Model fermion charge form factor. These corrections can be extracted in a gauge-invariant manner from collider cross sections. Although current measurements from colliders are not precise enough to provide useful constraints on dark matter annihilation, improved measurements at future experiments, such as the International Linear Collider, could improve these constraints by several orders of magnitude, allowing them to surpass the limits obtainable by direct observation.
A data types profile suitable for use with ISO EN 13606.
Sun, Shanghua; Austin, Tony; Kalra, Dipak
2012-12-01
ISO EN 13606 is a five part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast, and duplicates some of the functions already handled by ISO EN 13606 part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specially tailored set of data types to implement and avoid the risk of providing more than one modelling option for representing the information properties. This paper describes the process and design decisions made for developing a data types profile for EHR interoperability.
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the correspondent functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D-brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then, the standard 3D-brain model that shows well-defined brain regions was used, replacing the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score to within 3% error on average. In summary, the method obtains precise VOI information automatically from the well-defined standard 3D-brain model, sparing the traditional procedure's manual, slice-by-slice drawing of ROIs on structural medical images. That is, the method not only provides precise analysis results, but also improves the processing rate for large volumes of medical images in clinical use.
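For reference, mutual information, the similarity measure maximized in the registration step above, can be computed from a joint intensity histogram. A minimal NumPy sketch (bin count and test images are arbitrary choices, not the paper's settings):

import numpy as np

def mutual_information(img_a, img_b, bins=32):
    # MI from a joint intensity histogram -- the similarity measure
    # typically maximized in multimodal (e.g. MR-to-SPECT) registration.
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
a = rng.random((64, 64))
print(mutual_information(a, a))                      # high: identical images
print(mutual_information(a, rng.random((64, 64))))   # near zero: unrelated images

A registration routine would repeatedly transform one image and keep the transform that maximizes this quantity.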
NASA Technical Reports Server (NTRS)
Avila, Arturo
2011-01-01
The Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst case fashion to yield the most hot- or cold-biased temperature. Thus, these simulations would represent the upper and lower bounds. This, effectively, represents JPL thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes along with any temperature requirement violations are documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of the Modeling and Simulation (M&S) credibility, and the reporting of the M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study determining whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.
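A minimal sketch of the worst-case stacking idea, assuming a toy sun-facing radiator energy balance and invented parameter ranges (this is not the MER thermal model or any JPL tool):

import itertools

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiator_temp(q_internal, alpha, eps, solar=1361.0, area=1.0):
    # Equilibrium temperature of a sun-facing radiator plate:
    # absorbed solar + internal dissipation = emitted IR (illustrative model).
    return ((q_internal + alpha * solar * area) / (eps * SIGMA * area)) ** 0.25

# Uncertain parameters stacked at their extremes (illustrative ranges):
ranges = {"q_internal": (40.0, 60.0), "alpha": (0.15, 0.35), "eps": (0.75, 0.85)}

temps = [radiator_temp(q, a, e)
         for q, a, e in itertools.product(*ranges.values())]
print(f"cold-biased bound: {min(temps):.1f} K, hot-biased bound: {max(temps):.1f} K")

Stacking every uncertain parameter at its hot- or cold-biasing extreme, as here, yields the bounding temperatures that the JPL margin philosophy treats as the design envelope.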
Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8
DOE Office of Scientific and Technical Information (OSTI.GOV)
First, M.W.
1991-02-01
Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)
The Effects of Autocorrelation on the Curve-of-Factors Growth Model
ERIC Educational Resources Information Center
Murphy, Daniel L.; Beretvas, S. Natasha; Pituch, Keenan A.
2011-01-01
This simulation study examined the performance of the curve-of-factors model (COFM) when autocorrelation and growth processes were present in the first-level factor structure. In addition to the standard curve-of-factors growth model, 2 new models were examined: one COFM that included a first-order autoregressive autocorrelation parameter, and a…
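For clarity, a first-order autoregressive autocorrelation parameter on the first-level residuals has the standard form (notation ours, not the authors'):

\epsilon_{it} = \rho\,\epsilon_{i,t-1} + \nu_{it}, \qquad \nu_{it} \sim \mathcal{N}(0,\sigma_{\nu}^{2}), \qquad \mathrm{Corr}(\epsilon_{it},\epsilon_{i,t-k}) = \rho^{k},

so the correlation between an individual's residuals decays geometrically with the lag k between measurement occasions.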
Taming Many-Parameter BSM Models with Bayesian Neural Networks
NASA Astrophysics Data System (ADS)
Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.
2017-09-01
The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.
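The talk's analysis code is not reproduced here, but the underlying idea, sampling the posterior over a small network's weights and averaging predictions, can be sketched in a few lines of Python (toy data, random-walk Metropolis, all settings invented):

import numpy as np

rng = np.random.default_rng(1)
# Toy "cross-section" data: one model parameter in, one observable out.
x = np.linspace(-2, 2, 40)
y = np.tanh(1.5 * x) + 0.1 * rng.standard_normal(40)

def net(w, x):
    # One hidden layer with 3 tanh units; w packs all 10 weights.
    w1, b1, w2, b2 = w[0:3], w[3:6], w[6:9], w[9]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def log_post(w):
    # Gaussian likelihood (sigma = 0.1) plus standard-normal prior on weights.
    resid = y - net(w, x)
    return -0.5 * np.sum(resid ** 2) / 0.1 ** 2 - 0.5 * np.sum(w ** 2)

w, lp = np.zeros(10), -np.inf
samples = []
for step in range(20000):          # random-walk Metropolis over the weights
    w_new = w + 0.05 * rng.standard_normal(10)
    lp_new = log_post(w_new)
    if np.log(rng.random()) < lp_new - lp:
        w, lp = w_new, lp_new
    if step > 10000 and step % 100 == 0:
        samples.append(w.copy())   # thinned posterior draws after burn-in

preds = np.array([net(s, x) for s in samples])
print("posterior mean prediction near x = 0:", preds.mean(axis=0)[20])

Averaging over weight samples gives predictions with built-in uncertainty, which is what makes the approach attractive as a fast surrogate over many-parameter BSM spaces.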
Diffusion Decision Model: Current Issues and History.
Ratcliff, Roger; Smith, Philip L; Brown, Scott D; McKoon, Gail
2016-04-01
There is growing interest in diffusion models to represent the cognitive and neural processes of speeded decision making. Sequential-sampling models like the diffusion model have a long history in psychology. They view decision making as a process of noisy accumulation of evidence from a stimulus. The standard model assumes that evidence accumulates at a constant rate during the second or two it takes to make a decision. This process can be linked to the behaviors of populations of neurons and to theories of optimality. Diffusion models have been used successfully in a range of cognitive tasks and as psychometric tools in clinical research to examine individual differences. In this review, we relate the models to both earlier and more recent research in psychology. Copyright © 2016. Published by Elsevier Ltd.
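The standard (constant-drift) model referred to above is usually written as a Wiener process with absorbing boundaries; in the usual notation,

\mathrm{d}X(t) = v\,\mathrm{d}t + s\,\mathrm{d}W(t), \qquad X(0) = z \in (0, a),

where a response is made at the first passage of X through a (one response) or 0 (the other), v is the drift rate set by stimulus quality, s the within-trial noise, a the boundary separation (speed-accuracy trade-off), and z the starting point (response bias).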
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds, where we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
Interacting damage models mapped onto ising and percolation models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toussaint, Renaud; Pride, Steven R.
The authors introduce a class of damage models on regular lattices with isotropic interactions between the broken cells of the lattice. Quasistatic fiber bundles are an example. The interactions are assumed to be weak, in the sense that the stress perturbation from a broken cell is much smaller than the mean stress in the system. The system starts intact with a surface-energy threshold required to break any cell sampled from an uncorrelated quenched-disorder distribution. The evolution of this heterogeneous system is ruled by Griffith's principle, which states that a cell breaks when the release in potential (elastic) energy in the system exceeds the surface-energy barrier necessary to break the cell. By direct integration over all possible realizations of the quenched disorder, they obtain the probability distribution of each damage configuration at any level of the imposed external deformation. They demonstrate an isomorphism between the distributions so obtained and standard generalized Ising models, in which the coupling constants and effective temperature in the Ising model are functions of the nature of the quenched-disorder distribution and the extent of accumulated damage. In particular, they show that damage models with global load sharing are isomorphic to standard percolation theory and that damage models with a local load-sharing rule are isomorphic to the standard Ising model, and draw consequences thereof for the universality class and behavior of the autocorrelation length of the breakdown transitions corresponding to these models. They also treat damage models having more general power-law interactions, and classify the breakdown process as a function of the power-law interaction exponent. Last, they show that the probability distribution over configurations is a maximum of Shannon's entropy under specific constraints related to the energetic balance of the fracture process, which firmly relates this type of quenched-disorder-based damage model to standard statistical mechanics.
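Schematically, the isomorphism described above amounts to writing the probability of a damage configuration in generalized-Ising form (notation ours, not the authors'):

P[\{\varphi\}] \;\propto\; \exp\!\Big(-\beta_{\mathrm{eff}}\Big[\sum_i h_i\,\varphi_i + \sum_{i<j} J_{ij}\,\varphi_i\varphi_j\Big]\Big), \qquad \varphi_i \in \{0,1\},

where φ_i indicates whether cell i is broken, the local fields h_i encode the quenched surface-energy disorder, the couplings J_{ij} encode the load-sharing stress interactions, and β_eff is the effective temperature fixed by the disorder distribution and the accumulated damage.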
Process gg → h₀ → γγ in the Lee-Wick standard model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krauss, F.; Underwood, T. E. J.; Zwicky, R.
2008-01-01
The process gg → h₀ → γγ is studied in the Lee-Wick extension of the standard model (LWSM) proposed by Grinstein, O'Connell, and Wise. In this model, negative-norm partners for each SM field are introduced with the aim to cancel quadratic divergences in the Higgs mass. All sectors of the model relevant to gg → h₀ → γγ are diagonalized and results are commented on from the perspective of both the Lee-Wick and higher-derivative formalisms. Deviations from the SM rate for gg → h₀ are found to be of the order of 15%-5% for Lee-Wick masses in the range 500-1000 GeV. Effects on the rate for h₀ → γγ are smaller, of the order of 5%-1% for Lee-Wick masses in the same range. These comparatively small changes may well provide a means of distinguishing the LWSM from other models such as universal extra dimensions where same-spin partners to standard model fields also appear. Corrections to determinations of Cabibbo-Kobayashi-Maskawa (CKM) elements |V_{t(b,s,d)}| are also considered and are shown to be positive, allowing the possibility of measuring a CKM element larger than unity, a characteristic signature of the ghostlike nature of the Lee-Wick fields.
The CMMI Product Suite and International Standards
2006-07-01
standards: “2.3 Reference Documents 2.3.1 Applicable ISO/IEC documents, including ISO/IEC 12207 and ISO/IEC 15504.” “3.1 Development User Requirements...related international standards such as ISO 9001:2000, 12207, 15288...Key Supplements Needed...the Measurement Framework in ISO/IEC 15504; and the Process Reference Model included in ISO/IEC 12207. A possible approach has been developed for
Doctors or technicians: assessing quality of medical education
Hasan, Tayyab
2010-01-01
Medical education institutions usually adapt industrial quality management models that measure the quality of the process of a program but not the quality of the product. The purpose of this paper is to analyze the impact of industrial quality management models on medical education and students, and to highlight the importance of introducing a proper educational quality management model. Industrial quality management models can measure the training component in terms of competencies, but they lack the educational component measurement. These models use performance indicators to assess their process improvement efforts. Researchers suggest that the performance indicators used in educational institutions may only measure their fiscal efficiency without measuring the quality of the educational experience of the students. In most of the institutions, where industrial models are used for quality assurance, students are considered as customers and are provided with the maximum services and facilities possible. Institutions are required to fulfill a list of recommendations from the quality control agencies in order to enhance student satisfaction and to guarantee standard services. Quality of medical education should be assessed by measuring the impact of the educational program and quality improvement procedures in terms of knowledge base development, behavioral change, and patient care. Industrial quality models may focus on academic support services and processes, but educational quality models should be introduced in parallel to focus on educational standards and products. PMID:23745059
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidhu, K.S.
1991-06-01
The primary objective of a standard setting process is to arrive at a drinking water concentration at which exposure to a contaminant would result in no known or potential adverse effect on human health. The drinking water standards also serve as guidelines to prevent pollution of water sources and may be applicable in some cases as regulatory remediation levels. Risk assessment methods along with various decision-making parameters are used to establish drinking water standards. For carcinogens classified in Groups A and B by the United States Environmental Protection Agency (USEPA), the standards are set by using nonthreshold cancer risk models. The linearized multistage model is commonly used for computation of potency factors for carcinogenic contaminants. The acceptable excess risk level may vary from 10⁻⁶ to 10⁻⁴. For noncarcinogens, a threshold model approach based on application of an uncertainty factor is used to arrive at a reference dose (RfD). The RfD approach may also be used for carcinogens classified in Group C by the USEPA. The RfD approach with an additional uncertainty factor of 10 for carcinogenicity has been applied in the formulation of risk assessment for Group C carcinogens. The assumptions commonly used in arriving at drinking water standards are human life expectancy, 70 years; average human body weight, 70 kg; human daily drinking water consumption, 2 liters; and contribution of exposure to the contaminant from drinking water (expressed as a part of the total environmental exposure), 20%. Currently, there are over 80 USEPA existing or proposed primary standards for organic and inorganic contaminants in drinking water. Some of the state versus federal needs and viewpoints are discussed.
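Combining the stated assumptions gives the usual back-of-the-envelope form of a noncarcinogen standard; for a hypothetical contaminant with RfD = 0.005 mg/kg/day:

\text{Standard} = \frac{\mathrm{RfD} \times BW \times RSC}{DWI} = \frac{0.005\ \mathrm{mg/(kg\cdot day)} \times 70\ \mathrm{kg} \times 0.20}{2\ \mathrm{L/day}} = 0.035\ \mathrm{mg/L},

where BW is body weight, RSC the relative source contribution from drinking water, and DWI the daily drinking water intake.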
Beyond standard model calculations with Sherpa
Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; ...
2015-03-24
We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.
A New Activity-Based Financial Cost Management Method
NASA Astrophysics Data System (ADS)
Qingge, Zhang
The standard activity-based financial cost management model is a new model of financial cost management that builds on the standard cost system and activity-based costing, integrating the advantages of both. By taking R&D expenses as the accounting starting point and after-sale service expenses as the end point, it covers the whole production and operating process, the whole activity chain and the value chain, and yields more accurate and more adequate cost information for internal management and decision making.
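A toy worked example of the activity-chain roll-up described above (all figures invented): pool costs by activity from R&D through after-sale service, compute driver rates, and combine them into a standard unit product cost.

activities = {            # activity: (pooled cost, driver volume, driver units per product unit)
    "R&D":        (120_000, 10, 0.002),     # design hours
    "production": (300_000, 50_000, 1.0),   # machine hours
    "after-sale": (60_000, 3_000, 0.05),    # service calls
}

unit_cost = sum(cost / volume * per_unit
                for cost, volume, per_unit in activities.values())
print(f"standard unit cost: {unit_cost:.2f}")   # 24.00 + 6.00 + 1.00 = 31.00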
NASA Astrophysics Data System (ADS)
Hamada, Yuta; Yamada, Masatoshi
2017-09-01
The null result at the LHC may indicate that the standard model is not drastically modified up to very high scales, such as the GUT/string scale. With this in mind, we suggest a novel leptogenesis scenario realized in the false vacuum of the Higgs field. If the Higgs field develops a large vacuum expectation value in the early universe, a lepton number violating process is enhanced, which we use for baryogenesis. To demonstrate the scenario, several models are discussed. For example, we show that the observed baryon asymmetry is successfully generated in the standard model with higher-dimensional operators.
From conceptual modeling to a map
NASA Astrophysics Data System (ADS)
Gotlib, Dariusz; Olszewski, Robert
2018-05-01
Nowadays almost every map is a component of an information system. The design and production of maps requires the use of specific rules for modeling information systems: conceptual, application and data modeling. While analyzing the various stages of cartographic modeling, the authors ask at what stage of this process a map comes into being. Can we say that the "life of the map" begins even before someone defines its form of presentation? This question is particularly important at a time when the number of new geoinformation products is growing exponentially. Analyzing the theory of cartography and the discipline's relations to other fields of knowledge, the authors attempt to define a few properties of cartographic modeling which distinguish the process from other methods of spatial modeling. Assuming that the map is a model of reality (created in the process of cartographic modeling supported by domain modeling), the article proposes an analogy between the process of cartographic modeling and the scheme of conceptual modeling presented in the ISO 19101 standard.
Memory processes during sleep: beyond the standard consolidation theory.
Axmacher, Nikolai; Draguhn, Andreas; Elger, Christian E; Fell, Juergen
2009-07-01
Two-step theories of memory formation suggest that an initial encoding stage, during which transient neural assemblies are formed in the hippocampus, is followed by a second step called consolidation, which involves re-processing of activity patterns and is associated with an increasing involvement of the neocortex. Several studies in human subjects as well as in animals suggest that memory consolidation occurs predominantly during sleep (standard consolidation model). Alternatively, it has been suggested that consolidation may occur during waking state as well and that the role of sleep is rather to restore encoding capabilities of synaptic connections (synaptic downscaling theory). Here, we review the experimental evidence favoring and challenging these two views and suggest an integrative model of memory consolidation.
Bayesian non-parametric inference for stochastic epidemic models using Gaussian Processes.
Xu, Xiaoguang; Kypraios, Theodore; O'Neill, Philip D
2016-10-01
This paper considers novel Bayesian non-parametric methods for stochastic epidemic models. Many standard modeling and data analysis methods use underlying assumptions (e.g. concerning the rate at which new cases of disease will occur) which are rarely challenged or tested in practice. To relax these assumptions, we develop a Bayesian non-parametric approach using Gaussian Processes, specifically to estimate the infection process. The methods are illustrated with both simulated and real data sets, the former illustrating that the methods can recover the true infection process quite well in practice, and the latter illustrating that the methods can be successfully applied in different settings. © The Author 2016. Published by Oxford University Press.
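A minimal sketch of the modeling idea, placing a Gaussian-process prior on the time-varying (log) infection rate instead of assuming a fixed parametric form (the kernel and scales here are illustrative assumptions, not the paper's choices):

import numpy as np

def rbf_kernel(t, length=5.0, var=1.0):
    # Squared-exponential covariance between all pairs of time points.
    d = t[:, None] - t[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(2)
t = np.linspace(0, 30, 61)                    # days since the outbreak began
K = rbf_kernel(t) + 1e-8 * np.eye(t.size)     # jitter for numerical stability
log_beta = rng.multivariate_normal(np.full(t.size, np.log(0.3)), K)
beta = np.exp(log_beta)                       # one prior draw of the infection-rate path
print(round(beta.min(), 3), round(beta.max(), 3))

Inference then updates such smooth rate paths against the observed case data rather than fitting a single constant or parametric rate.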
Toward the First Data Acquisition Standard in Synthetic Biology.
Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I
2016-08-19
This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.
Higgs boson decays to neutralinos in low-scale gauge mediation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mason, John D.; Poland, David; Morrissey, David E.
2009-12-01
We study the decays of a standard model-like minimal supersymmetric standard model Higgs boson to pairs of neutralinos, each of which subsequently decays promptly to a photon and a gravitino. Such decays can arise in supersymmetric scenarios where supersymmetry breaking is mediated to us by gauge interactions with a relatively light gauge messenger sector (M_mess ≲ 100 TeV). This process gives rise to a collider signal consisting of a pair of photons and missing energy. In the present work we investigate the bounds on this scenario within the minimal supersymmetric standard model from existing collider data. We also study the prospects for discovering the Higgs boson through this decay mode with upcoming data from the Tevatron and the LHC.
Search for ZH → ℓ⁺ℓ⁻bb̄ production in 4.2 fb⁻¹ of pp̄ collisions at √s = 1.96 TeV.
Abazov, V M; Abbott, B; Abolins, M; Acharya, B S; Adams, M; Adams, T; Alexeev, G D; Alkhazov, G; Alton, A; Alverson, G; Alves, G A; Ancu, L S; Aoki, M; Arnoud, Y; Arov, M; Askew, A; Asman, B; Atramentov, O; Avila, C; BackusMayes, J; Badaud, F; Bagby, L; Baldin, B; Bandurin, D V; Banerjee, S; Barberis, E; Baringer, P; Barreto, J; Bartlett, J F; Bassler, U; Beale, S; Bean, A; Begalli, M; Begel, M; Belanger-Champagne, C; Bellantoni, L; Benitez, J A; Beri, S B; Bernardi, G; Bernhard, R; Bertram, I; Besançon, M; Beuselinck, R; Bezzubov, V A; Bhat, P C; Bhatnagar, V; Blazey, G; Blessing, S; Bloom, K; Boehnlein, A; Boline, D; Bolton, T A; Boos, E E; Borissov, G; Bose, T; Brandt, A; Brandt, O; Brock, R; Brooijmans, G; Bross, A; Brown, D; Brown, J; Bu, X B; Buchholz, D; Buehler, M; Buescher, V; Bunichev, V; Burdin, S; Burnett, T H; Buszello, C P; Calpas, B; Calvet, S; Camacho-Pérez, E; Carrasco-Lizarraga, M A; Carrera, E; Casey, B C K; Castilla-Valdez, H; Chakrabarti, S; Chakraborty, D; Chan, K M; Chandra, A; Chen, G; Chevalier-Théry, S; Cho, D K; Cho, S W; Choi, S; Choudhary, B; Christoudias, T; Cihangir, S; Claes, D; Clutter, J; Cooke, M; Cooper, W E; Corcoran, M; Couderc, F; Cousinou, M-C; Croc, A; Cutts, D; Cwiok, M; Das, A; Davies, G; De, K; de Jong, S J; De La Cruz-Burelo, E; Déliot, F; Demarteau, M; Demina, R; Denisov, D; Denisov, S P; Desai, S; DeVaughan, K; Diehl, H T; Diesburg, M; Dominguez, A; Dorland, T; Dubey, A; Dudko, L V; Duggan, D; Duperrin, A; Dutt, S; Dyshkant, A; Eads, M; Edmunds, D; Ellison, J; Elvira, V D; Enari, Y; Eno, S; Evans, H; Evdokimov, A; Evdokimov, V N; Facini, G; Ferapontov, A V; Ferbel, T; Fiedler, F; Filthaut, F; Fisher, W; Fisk, H E; Fortner, M; Fox, H; Fuess, S; Gadfort, T; Garcia-Bellido, A; Gavrilov, V; Gay, P; Geist, W; Geng, W; Gerbaudo, D; Gerber, C E; Gershtein, Y; Ginther, G; Golovanov, G; Goussiou, A; Grannis, P D; Greder, S; Greenlee, H; Greenwood, Z D; Gregores, E M; Grenier, G; Gris, Ph; Grivaz, J-F; Grohsjean, A; Grünendahl, S; Grünewald, M W; Guo, F; Guo, J; Gutierrez, G; Gutierrez, P; Haas, A; Hagopian, S; Haley, J; Han, L; Harder, K; Harel, A; Hauptman, J M; Hays, J; Hebbeker, T; Hedin, D; Hegab, H; Heinson, A P; Heintz, U; Hensel, C; Heredia-De La Cruz, I; Herner, K; Hesketh, G; Hildreth, M D; Hirosky, R; Hoang, T; Hobbs, J D; Hoeneisen, B; Hohlfeld, M; Hossain, S; Hubacek, Z; Huske, N; Hynek, V; Iashvili, I; Illingworth, R; Ito, A S; Jabeen, S; Jaffré, M; Jain, S; Jamin, D; Jesik, R; Johns, K; Johnson, M; Johnston, D; Jonckheere, A; Jonsson, P; Joshi, J; Juste, A; Kaadze, K; Kajfasz, E; Karmanov, D; Kasper, P A; Katsanos, I; Kehoe, R; Kermiche, S; Khalatyan, N; Khanov, A; Kharchilava, A; Kharzheev, Y N; Khatidze, D; Kirby, M H; Kohli, J M; Kozelov, A V; Kraus, J; Kumar, A; Kupco, A; Kurča, T; Kuzmin, V A; Kvita, J; Lammers, S; Landsberg, G; Lebrun, P; Lee, H S; Lee, S W; Lee, W M; Lellouch, J; Li, L; Li, Q Z; Lietti, S M; Lim, J K; Lincoln, D; Linnemann, J; Lipaev, V V; Lipton, R; Liu, Y; Liu, Z; Lobodenko, A; Lokajicek, M; Love, P; Lubatti, H J; Luna-Garcia, R; Lyon, A L; Maciel, A K A; Mackin, D; Madar, R; Magaña-Villalba, R; Malik, S; Malyshev, V L; Maravin, Y; Martínez-Ortega, J; McCarthy, R; McGivern, C L; Meijer, M M; Melnitchouk, A; Menezes, D; Mercadante, P G; Merkin, M; Meyer, A; Meyer, J; Mondal, N K; Muanza, G S; Mulhearn, M; Nagy, E; Naimuddin, M; Narain, M; Nayyar, R; Neal, H A; Negret, J P; Neustroev, P; Nilsen, H; Novaes, S F; Nunnemann, T; Obrant, G; Onoprienko, D; Orduna, J; Osman, N; Osta, J; Otero y Garzón, G J; Owen, M; 
Padilla, M; Pangilinan, M; Parashar, N; Parihar, V; Park, S K; Parsons, J; Partridge, R; Parua, N; Patwa, A; Penning, B; Perfilov, M; Peters, K; Peters, Y; Petrillo, G; Pétroff, P; Piegaia, R; Piper, J; Pleier, M-A; Podesta-Lerma, P L M; Podstavkov, V M; Pol, M-E; Polozov, P; Popov, A V; Prewitt, M; Price, D; Protopopescu, S; Qian, J; Quadt, A; Quinn, B; Rangel, M S; Ranjan, K; Ratoff, P N; Razumov, I; Renkel, P; Rich, P; Rijssenbeek, M; Ripp-Baudot, I; Rizatdinova, F; Rominsky, M; Royon, C; Rubinov, P; Ruchti, R; Safronov, G; Sajot, G; Sánchez-Hernández, A; Sanders, M P; Sanghi, B; Santos, A S; Savage, G; Sawyer, L; Scanlon, T; Schamberger, R D; Scheglov, Y; Schellman, H; Schliephake, T; Schlobohm, S; Schwanenberger, C; Schwienhorst, R; Sekaric, J; Severini, H; Shabalina, E; Shary, V; Shchukin, A A; Shivpuri, R K; Simak, V; Sirotenko, V; Skubic, P; Slattery, P; Smirnov, D; Smith, K J; Snow, G R; Snow, J; Snyder, S; Söldner-Rembold, S; Sonnenschein, L; Sopczak, A; Sosebee, M; Soustruznik, K; Spurlock, B; Stark, J; Stolin, V; Stoyanova, D A; Strauss, E; Strauss, M; Strom, D; Stutte, L; Svoisky, P; Takahashi, M; Tanasijczuk, A; Taylor, W; Titov, M; Tokmenin, V V; Tsybychev, D; Tuchming, B; Tully, C; Tuts, P M; Uvarov, L; Uvarov, S; Uzunyan, S; Van Kooten, R; van Leeuwen, W M; Varelas, N; Varnes, E W; Vasilyev, I A; Verdier, P; Vertogradov, L S; Verzocchi, M; Vesterinen, M; Vilanova, D; Vint, P; Vokac, P; Wahl, H D; Wang, M H L S; Warchol, J; Watts, G; Wayne, M; Weber, M; Wetstein, M; White, A; Wicke, D; Williams, M R J; Wilson, G W; Wimpenny, S J; Wobisch, M; Wood, D R; Wyatt, T R; Xie, Y; Xu, C; Yacoob, S; Yamada, R; Yang, W-C; Yasuda, T; Yatsunenko, Y A; Ye, Z; Yin, H; Yip, K; Yoo, H D; Youn, S W; Yu, J; Zelitch, S; Zhao, T; Zhou, B; Zhu, J; Zielinski, M; Zieminska, D; Zivkovic, L
2010-12-17
We present a search for the standard model Higgs boson produced in association with a Z boson in 4.2 fb⁻¹ of pp̄ collisions, collected with the D0 detector at the Fermilab Tevatron at √s = 1.96 TeV. Selected events contain one reconstructed Z → e+ e- or Z → μ+ μ- candidate and at least two jets, including at least one b-tagged jet. In the absence of an excess over the background expected from other standard model processes, limits on the ZH cross section multiplied by the branching ratios are set. The limit at M(H) = 115 GeV is a factor of 5.9 larger than the standard model prediction.
A reference model for space data system interconnection services
NASA Astrophysics Data System (ADS)
Pietras, John; Theis, Gerhard
1993-03-01
The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).
Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David
2015-08-01
Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff. © 2015 American Society for Parenteral and Enteral Nutrition.
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler's type II model rainfall, are no longer sufficient. A methodology for standardized rainfall hyetographs that takes into account the specifics of local storm rainfall temporal dynamics is urgently needed. The aim of the paper is to propose an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
Systems Architecture for a Nationwide Healthcare System.
Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio
2015-01-01
To provide information technology support at the national level, the Nationwide Integrated Healthcare System in Uruguay requires a model of Information Systems Architecture. This system comprises multiple healthcare providers (public and private) and a strong component of supplementary services. Thus, the data processing system should have an architecture that accounts for this fact while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, and technical advice provided by AGESIC. It is the outcome of the research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as the research done by the Salud.uy team since 2013.
Compressive behavior of laminated neoprene bridge bearing pads under thermal aging condition
NASA Astrophysics Data System (ADS)
Jun, Xie; Zhang, Yannian; Shan, Chunhong
2017-10-01
The present study was conducted to obtain a better understanding of how the mechanical properties of laminated neoprene bridge bearing pads vary under thermal aging, using compression tests. A total of 5 specimens were processed in a high-temperature chamber and then tested under axial load. The main parameter considered was the duration of thermal aging. The compression tests show that thermally aged specimens are more prone to brittle failure than the standard specimen. Moreover, exposure of the steel plates, cracking and other failure phenomena are more severe than in the standard specimen. The compressive capacity, ultimate compressive strength and compressive elastic modulus of the laminated neoprene bridge bearing pads decreased dramatically with increasing aging time. The attenuation trends of ultimate compressive strength and compressive elastic modulus under thermal aging follow a power function. The attenuation models were obtained by regressing the experimental data with the least squares method. The models fit reality well, which shows that this approach is applicable and has broad prospects for assessing the performance of laminated neoprene bridge bearing pads under thermal aging conditions.
Zhang, Zhongqi; Zhang, Aming; Xiao, Gang
2012-06-05
Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
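The forward model described above can be illustrated with a minimal sketch: each backbone amide exchanges at its intrinsic rate divided by a protection factor, and the observed uptake is reduced by a back-exchange correction. The Python below is only a schematic reading of that idea; the function name, the uniform intrinsic rates, and the single back-exchange fraction are illustrative assumptions, and the actual MassAnalyzer model is far more comprehensive.

    import numpy as np

    def deuterium_uptake(t, k_int, protection, back_exchange=0.0):
        # Effective exchange rate of each amide: intrinsic rate / protection factor
        k = np.asarray(k_int, dtype=float) / np.asarray(protection, dtype=float)
        exchanged = np.sum(1.0 - np.exp(-k * t))   # deuterons incorporated at time t
        return exchanged * (1.0 - back_exchange)   # crude back-exchange correction

    # Hypothetical peptide: 5 amides, unit intrinsic rates, PFs spanning 1 to 1e4
    pf = np.array([1.0, 10.0, 100.0, 1e3, 1e4])
    for t in (10.0, 100.0, 1000.0):
        print(t, deuterium_uptake(t, np.ones(5), pf, back_exchange=0.3))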
Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study
2007-06-01
…Business Process Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in… enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that…
A General Model for Food Purchasing in Captive Food Service Institutions.
1979-08-28
[Excerpt contains fragments of purchase-quantity tables (turnip greens, canned asparagus, chicken, Canadian bacon, turkey breast, veal, hamburger, rump roast) and list-of-tables entries for raw turnip and processed chicken prices.] …planning, edited by Birchfield (10), emphasizes the importance of standard recipes in determining quantities of food required for menu items. Standard…
Toward a Qualitative Analysis of Standardized Tests Using an Information Processing Model.
ERIC Educational Resources Information Center
Armour-Thomas, Eleanor
The use of standardized tests and test data to detect and address differences in cognitive styles is advocated here. To this end, the paper describes the componential theory of intelligence of Sternberg et al. This theory defines the components of intelligence by function and level of generality, including: (1) metacomponents: higher…
ERIC Educational Resources Information Center
Klinker, JoAnn Franklin; Hackmann, Donald G.
High school principals confront ethical dilemmas daily. This report describes a study that examined how MetLife/NASSP secondary principals of the year made ethical decisions conforming to three dispositions from Standard 5 of the ISLLC standards and whether they could identify processes used to reach those decisions through Rest's Four Component…
NASA Astrophysics Data System (ADS)
Sharonov, M. A.; Sharonova, O. V.; Sharonova, V. P.
2018-03-01
The article attempts to create a model, built using Euler circles (Venn diagrams), to illustrate the methodological impact of the recent Federal Law 283-FZ "On the independent evaluation of qualifications" and the new Federal State Educational Standards of higher education of generation 3++ on the educational process in Russia. In modern economic conditions, the ability to correctly assess the role of professional standards, viewed in essence as a set, and the degree of their intersection with the approximate basic educational program and the Federal State Educational Standards, becomes an important factor on which will depend not only the demand for graduates in the labor market, but also the possibility of the proposed program passing professional and public accreditation.
Prediction by regression and intrarange data scatter in surface-process studies
Toy, T.J.; Osterkamp, W.R.; Renard, K.G.
1993-01-01
Modeling is a major component of contemporary earth science, and regression analysis occupies a central position in the parameterization, calibration, and validation of geomorphic and hydrologic models. Although this methodology can be used in many ways, we are primarily concerned with the prediction of values for one variable from another variable. Examination of the literature reveals considerable inconsistency in the presentation of the results of regression analysis and the occurrence of patterns in the scatter of data points about the regression line. Both circumstances confound utilization and evaluation of the models. Statisticians are well aware of various problems associated with the use of regression analysis and offer improved practices; often, however, their guidelines are not followed. After a review of the aforementioned circumstances and until standard criteria for model evaluation become established, we recommend, as a minimum, inclusion of scatter diagrams, the standard error of the estimate, and sample size in reporting the results of regression analyses for most surface-process studies. © 1993 Springer-Verlag.
Repeatability Modeling for Wind-Tunnel Measurements: Results for Three Langley Facilities
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Houlden, Heather P.
2014-01-01
Data from extensive check standard tests of seven measurement processes in three NASA Langley Research Center wind tunnels are statistically analyzed to test a simple model previously presented in 2000 for characterizing short-term, within-test and across-test repeatability. The analysis is intended to support process improvement and development of uncertainty models for the measurements. The analysis suggests that the repeatability can be estimated adequately as a function of only the test section dynamic pressure over a two-orders-of-magnitude dynamic pressure range. As expected for low instrument loading, short-term coefficient repeatability is determined by the resolution of the instrument alone (air off). However, as previously pointed out, for the highest dynamic pressure range the coefficient repeatability appears to be independent of dynamic pressure, thus presenting a lower floor for the standard deviation for all three time frames. The simple repeatability model is shown to be adequate for all of the cases presented and for all three time frames.
Multivariate regression model for predicting lumber grade volumes of northern red oak sawlogs
Daniel A. Yaussy; Robert L. Brisbin
1983-01-01
A multivariate regression model was developed to predict green board-foot yields for the seven common factory lumber grades processed from northern red oak (Quercus rubra L.) factory grade logs. The model uses the standard log measurements of grade, scaling diameter, length, and percent defect. It was validated with an independent data set. The model...
Stochastic nonlinear dynamics pattern formation and growth models
Yaroslavsky, Leonid P
2007-01-01
Stochastic evolutionary growth and pattern formation models are treated in a unified way in terms of algorithmic models of nonlinear dynamic systems with feedback, built from a standard set of signal processing units. A number of concrete models are described and illustrated by numerous examples of artificially generated patterns that closely imitate the wide variety of patterns found in nature. PMID:17908341
Single Top Production at Next-to-Leading Order in the Standard Model Effective Field Theory.
Zhang, Cen
2016-04-22
Single top production processes at hadron colliders provide information on the relation between the top quark and the electroweak sector of the standard model. We compute the next-to-leading order QCD corrections to the three main production channels: t-channel, s-channel, and tW associated production, in the standard model including operators up to dimension six. The calculation can be matched to parton shower programs and can therefore be directly used in experimental analyses. The QCD corrections are found to significantly impact the extraction of the current limits on the operators, because of both improved accuracy and better precision of the theoretical predictions. In addition, the distributions of some of the key discriminating observables are modified in a nontrivial way, which could change the interpretation of measurements in terms of UV complete models.
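In such analyses, each channel's cross section is commonly expanded in the dimension-six Wilson coefficients. Schematically (this generic form is standard in SMEFT studies; the paper's exact operator basis and conventions may differ):

    \sigma = \sigma_{\mathrm{SM}}
           + \sum_i \frac{C_i}{\Lambda^2}\,\sigma_i^{(1)}
           + \sum_{i \le j} \frac{C_i C_j}{\Lambda^4}\,\sigma_{ij}^{(2)}

Here \Lambda is the new-physics scale, the \sigma_i^{(1)} encode SM-operator interference, and the \sigma_{ij}^{(2)} the purely new-physics squared terms; NLO QCD corrections shift all three classes of coefficients.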
The tangled bank of amino acids.
Goldstein, Richard A; Pollock, David D
2016-07-01
The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices and the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we can demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution. © 2016 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
ERIC Educational Resources Information Center
St. John, Edward P.; Loescher, Siri; Jacob, Stacy; Cekic, Osman; Kupersmith, Leigh; Musoba, Glenda Droogsma
A growing number of schools are exploring the prospect of applying for funding to implement a Comprehensive School Reform (CSR) model. But the process of selecting a CSR model can be complicated because it frequently involves self-study and a review of models to determine which models best meet the needs of the school. This study guide is intended…
Regulatory ozone modeling: status, directions, and research needs.
Georgopoulos, P G
1995-01-01
The Clean Air Act Amendments (CAAA) of 1990 have established selected comprehensive, three-dimensional, Photochemical Air Quality Simulation Models (PAQSMs) as the required regulatory tools for analyzing the urban and regional problem of high ambient ozone levels across the United States. These models are currently applied to study and establish strategies for meeting the National Ambient Air Quality Standard (NAAQS) for ozone in nonattainment areas; State Implementation Plans (SIPs) resulting from these efforts must be submitted to the U.S. Environmental Protection Agency (U.S. EPA) in November 1994. The following presentation provides an overview and discussion of the regulatory ozone modeling process and its implications. First, the PAQSM-based ozone attainment demonstration process is summarized in the framework of the 1994 SIPs. Then, following a brief overview of the representation of physical and chemical processes in PAQSMs, the essential attributes of standard modeling systems currently in regulatory use are presented in a nonmathematical, self-contained format, intended to provide a basic understanding of both model capabilities and limitations. The types of air quality, emission, and meteorological data needed for applying and evaluating PAQSMs are discussed, as well as the sources, availability, and limitations of existing databases. The issue of evaluating a model's performance in order to accept it as a tool for policy making is discussed, and various methodologies for implementing this objective are summarized. Selected interim results from diagnostic analyses, which are performed as a component of the regulatory ozone modeling process for the Philadelphia-New Jersey region, are also presented to provide some specific examples related to the general issues discussed in this work. Finally, research needs related to a) the evaluation and refinement of regulatory ozone modeling, b) the characterization of uncertainty in photochemical modeling, and c) the improvement of the model-based ozone-attainment demonstration process are presented to identify future directions in this area. PMID:7614934
Cost minimizing of cutting process for CNC thermal and water-jet machines
NASA Astrophysics Data System (ADS)
Tavaeva, Anastasia; Kurennov, Dmitry
2015-11-01
This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy of calculating the objective function parameters for the optimization problem is investigated. The paper shows that the working tool path speed is not a constant value; it depends on several parameters described here. Relations for the working tool path speed as a function of the number of NC program frames, the length of straight cuts, and the part configuration are presented. Based on the results obtained, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the additional restrictions of thermal cutting (choice of piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard techniques, and evaluates the effectiveness of their application. Future research directions are indicated at the end of the paper.
Evolutionary inference via the Poisson Indel Process.
Bouchard-Côté, Alexandre; Jordan, Michael I
2013-01-22
We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
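The key property exploited above, that a Poisson process over the tree makes event counts on disjoint branches independent, can be illustrated with a toy simulation. This is a generic sketch, not the PIP inference machinery: the tree, the intensity value, and the variable names are hypothetical, and deletions plus the special root edge of the full model are omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy phylogeny as (branch, length) pairs
    branches = [("root->A", 0.4), ("root->B", 0.4), ("A->leaf1", 0.7),
                ("A->leaf2", 0.3), ("B->leaf3", 1.1)]
    lam = 2.0  # insertion intensity per unit branch length (hypothetical)

    # Poisson-process property: the insertion count on each branch is
    # Poisson(lam * length), independently across disjoint branches.
    for name, length in branches:
        n_ins = rng.poisson(lam * length)
        times = np.sort(rng.uniform(0.0, length, size=n_ins))
        print(f"{name}: {n_ins} insertion(s) at {np.round(times, 2)}")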
Production of τ τ jj final states at the LHC and the TauSpinner algorithm: the spin-2 case
NASA Astrophysics Data System (ADS)
Bahmani, M.; Kalinowski, J.; Kotlarski, W.; Richter-Wąs, E.; Wąs, Z.
2018-01-01
The TauSpinner algorithm is a tool that allows one to modify the physics model of Monte Carlo generated samples under changed assumptions about event production dynamics, without the need to re-generate events. With the help of weights, τ-lepton production or decay processes can be modified according to a new physics model. In a recent paper, a new version, TauSpinner ver.2.0.0, was presented, which includes a provision for introducing non-standard states and couplings and studying their effects in vector-boson-fusion processes by exploiting the spin correlations of τ-lepton pair decay products in processes whose final states also include two hard jets. In the present paper we document how this can be achieved, taking as an example a non-standard spin-2 state that couples to Standard Model particles, with tree-level matrix elements carrying complete helicity information for the parton-parton scattering amplitudes into a τ-lepton pair and two outgoing partons. This implementation is prepared as an external (user-provided) routine for the TauSpinner algorithm. It exploits amplitudes generated by MadGraph5 and adapted to the TauSpinner algorithm format. Consistency tests of the implemented matrix elements, the re-weighting algorithm and numerical results for observables sensitive to τ polarisation are presented.
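The reweighting idea itself is simple: each generated event receives a weight equal to the ratio of squared matrix elements under the new and original hypotheses, so histograms can be re-filled without regenerating events. The sketch below uses deliberately toy angular shapes as stand-ins for the amplitudes (both "matrix element" functions are invented for illustration; the real algorithm uses full parton-level amplitudes, e.g. from MadGraph5):

    import numpy as np

    rng = np.random.default_rng(1)

    def me2_sm(c):
        return 1.0 + c**2                 # toy SM-like angular shape

    def me2_spin2(c):
        return 1.0 + 6.0 * c**2 + c**4    # toy "non-standard spin-2" shape

    # Generate events under the SM hypothesis by accept-reject sampling
    c = rng.uniform(-1.0, 1.0, size=100_000)
    events = c[rng.uniform(0.0, 3.0, size=c.size) < me2_sm(c)]

    # Per-event weight: ratio of squared matrix elements, no regeneration needed
    weights = me2_spin2(events) / me2_sm(events)
    hist_sm, edges = np.histogram(events, bins=20, range=(-1, 1))
    hist_new, _ = np.histogram(events, bins=20, range=(-1, 1), weights=weights)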
ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goel, Supriya; Rosenberg, Michael I.
This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document is created independently from ASHRAE and SSPC 90.1 and is not sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers and implementers of "beyond code" energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond-code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers developing tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.
[Standardization and modeling of surgical processes].
Strauss, G; Schmitz, P
2016-12-01
Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state of the art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, detrimental consequences, such as loss of skills and placing too much faith in technology, must be avoided by adapted training concepts.
Development of a standardized, citywide process for managing smart-pump drug libraries.
Walroth, Todd A; Smallwood, Shannon; Arthur, Karen; Vance, Betsy; Washington, Alana; Staublin, Therese; Haslar, Tammy; Reddan, Jennifer G; Fuller, James
2018-06-15
Development and implementation of an interprofessional consensus-driven process for review and optimization of smart-pump drug libraries and dosing limits are described. The Indianapolis Coalition for Patient Safety (ICPS), which represents 6 Indianapolis-area health systems, identified an opportunity to reduce clinically insignificant alerts that smart infusion pumps present to end users. Through a consensus-driven process, ICPS aimed to identify best practices to implement at individual hospitals in order to establish specific action items for smart-pump drug library optimization. A work group of pharmacists, nurses, and industrial engineers met to evaluate variability within and lack of scrutiny of smart-pump drug libraries. The work group used Lean Six Sigma methodologies to generate a list of key needs and barriers to be addressed in process standardization. The group reviewed targets for smart-pump drug library optimization, including dosing limits, types of alerts reviewed, policies, and safety best practices. The work group also analyzed existing processes at each site to develop a final consensus statement outlining a model process for reviewing alerts and managing smart-pump data. Analysis of the total number of alerts per device across ICPS-affiliated health systems over a 4-year period indicated a 50% decrease (from 7.2 to 3.6 alerts per device per month) after implementation of the model by ICPS member organizations. Through implementation of a standardized, consensus-driven process for smart-pump drug library optimization, ICPS member health systems reduced clinically insignificant smart-pump alerts. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Jutebring Sterte, Elin; Johansson, Emma; Sjöberg, Ylva; Huseby Karlsen, Reinert; Laudon, Hjalmar
2018-05-01
Groundwater and surface-water interactions are regulated by catchment characteristics and complex inter- and intra-annual variations in climatic conditions that are not yet fully understood. Our objective was to investigate the influence of catchment characteristics and freeze-thaw processes on surface and groundwater interactions in a boreal landscape, the Krycklan catchment in Sweden. We used a numerical modelling approach and sub-catchment evaluation method to identify and evaluate fundamental catchment characteristics and processes. The model reproduced observed stream discharge patterns of the 14 sub-catchments and the dynamics of the 15 groundwater wells with an average accumulated discharge error of 1% (15% standard deviation) and an average groundwater-level mean error of 0.1 m (0.23 m standard deviation). We show how peatland characteristics dampen the effect of intense rain, and how soil freeze-thaw processes regulate surface and groundwater partitioning during snowmelt. With these results, we demonstrate the importance of defining, understanding and quantifying the role of landscape heterogeneity and sub-catchment characteristics for accurately representing catchment hydrological functioning.
Kärki, Anne; Sävel, Jaana; Sallinen, Merja; Kuusinen, Jere
2013-01-01
ICT innovations are constantly being developed, and there is no lack of elderly customers, as the number of elderly people is increasing dramatically. The elderly are willing to use ICT to increase their own safety and social activity, but they need to trust the reliability, accessibility and other ethical aspects of ICT, including the maintenance of privacy and self-determination. Ethical standards for ICT are usually not considered. "Ethicted" characterizes an ICT service or product as ethically evaluated. As a standardized procedure, such evaluation would not only increase the acceptability of ICT, but also provide services for ICT developers. In the envisioned scenario, ICT under development would be evaluated using a process model built specifically to find gaps in ethical aspects. The model would then be tested by end-users and by formal and informal caregivers to receive direct feedback for redeveloping solutions. As final outcomes, there should be standards for ICT in elderly care and a service through which ICT developers can utilize the evaluation model. This future-scenario work included partners from 6 EU member countries. The combination of academic research and the industrial/commercial interest of ICT developers should, and can, bring new value to assistive ICT for elderly care.
A UML model for the description of different brain-computer interface systems.
Quitadamo, Lucia Rita; Abbafati, Manuel; Saggio, Giovanni; Marciani, Maria Grazia; Cardarilli, Gian Carlo; Bianchi, Luigi
2008-01-01
BCI research lacks a universal descriptive language among labs and a unique standard model for the description of BCI systems. This results in a serious problem in comparing the performance of different BCI processes and in unifying tools and resources. In view of this, we implemented a Unified Modeling Language (UML) model for the description of virtually any BCI protocol and demonstrated that it can be successfully applied to the most common ones, such as P300, mu-rhythms, SCP, SSVEP and fMRI. Finally, we illustrate the advantages of utilizing a standard terminology for BCIs and how the same basic structure can be successfully adopted for the implementation of new systems.
Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization
NASA Astrophysics Data System (ADS)
Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad
2017-02-01
Malay Twitter messages present a marked deviation from the standard language. Malay Tweets are widely used, especially in the Malay archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Research in natural language processing has mainly focused on normalizing English Twitter messages, while few studies have addressed Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words and interjections, into standard Malay. The research uses a language model and an n-gram model.
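A hybrid dictionary-plus-n-gram normalizer of this general shape can be sketched in a few lines: a lookup table proposes candidate standard forms for each noisy token, and a bigram language model scores the candidate sentences. All tokens, counts, and table contents below are invented for illustration and are not the authors' resources.

    from itertools import product

    # Hypothetical lookup table: noisy token -> candidate standard-Malay forms
    lexicon = {"x": ["tidak"], "sgt": ["sangat"], "mkn": ["makan", "mungkin"]}

    # Hypothetical bigram counts from a standard-Malay corpus
    bigram = {("saya", "tidak"): 40, ("tidak", "makan"): 25, ("tidak", "mungkin"): 5}

    def score(seq):
        # Product of add-one-smoothed bigram counts over the candidate sentence
        s = 1.0
        for a, b in zip(seq, seq[1:]):
            s *= bigram.get((a, b), 0) + 1
        return s

    def normalize(tokens):
        # Dictionary lookup proposes candidates; the n-gram model picks the best
        options = [lexicon.get(t, [t]) for t in tokens]
        return max(product(*options), key=score)

    print(normalize(["saya", "x", "mkn"]))  # -> ('saya', 'tidak', 'makan')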
Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin
2017-10-01
In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.
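The link between lure-evidence variability and z-ROC slope can be checked numerically. Under a Gaussian signal-detection model with targets distributed N(d, σ_target) and lures N(0, σ_lure), the theoretical z-ROC is a line with slope σ_lure/σ_target, so widening the lure distribution steepens the slope, which is the pattern the priming produced. The parameter values in this sketch are arbitrary, not estimates from the study:

    import numpy as np
    from scipy.stats import norm

    def zroc_slope(d, sd_target, sd_lure, criteria=np.linspace(-1.0, 2.0, 7)):
        hit = 1.0 - norm.cdf(criteria, loc=d, scale=sd_target)   # hit rates
        fa = 1.0 - norm.cdf(criteria, loc=0.0, scale=sd_lure)    # false-alarm rates
        slope, _ = np.polyfit(norm.ppf(fa), norm.ppf(hit), 1)    # fit z-ROC line
        return slope

    print(zroc_slope(1.0, 1.25, 1.00))  # baseline lures: slope = 1.00/1.25 = 0.80
    print(zroc_slope(1.0, 1.25, 1.15))  # wider lure distribution -> steeper slope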
Government Open Systems Interconnection Profile (GOSIP) transition strategy
NASA Astrophysics Data System (ADS)
Laxen, Mark R.
1993-09-01
This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The result of this thesis is a transition strategy based on an extended and coordinated period of coexistence, necessitated by extensive legacy systems and the unavailability of GOSIP products. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.
Extra dimensions hypothesis in high energy physics
NASA Astrophysics Data System (ADS)
Volobuev, Igor; Boos, Eduard; Bunichev, Viacheslav; Perfilov, Maxim; Smolyakov, Mikhail
2017-10-01
We discuss the history of the extra dimensions hypothesis and the physics and phenomenology of models with large extra dimensions, with an emphasis on the Randall-Sundrum (RS) model with two branes. We argue that the Standard Model extension based on the RS model with two branes is phenomenologically acceptable only if the inter-brane distance is stabilized. Within such an extension of the Standard Model, we study the influence of the infinite Kaluza-Klein (KK) towers of the bulk fields on collider processes. In particular, we discuss the modification of the scalar sector of the theory, the Higgs-radion mixing due to the coupling of the Higgs boson to the radion and its KK tower, and the experimental restrictions on the mass of the radion-dominated states.
An EM Algorithm for Maximum Likelihood Estimation of Process Factor Analysis Models
ERIC Educational Resources Information Center
Lee, Taehun
2010-01-01
In this dissertation, an Expectation-Maximization (EM) algorithm is developed and implemented to obtain maximum likelihood estimates of the parameters and the associated standard error estimates characterizing temporal flows for the latent variable time series following stationary vector ARMA processes, as well as the parameters defining the…
This document summarizes the process followed to utilize the fuel consumption map of a Ricardo-modeled engine and vehicle fuel consumption data to generate a full engine fuel consumption map that can be used by EPA's ALPHA vehicle simulations.
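A full map of this kind is typically built by interpolating sparse fuel-flow points over a regular speed/torque grid. The sketch below shows one generic way to do this with SciPy; the data values, grid, and nearest-neighbour fallback are illustrative assumptions, not the EPA/Ricardo procedure.

    import numpy as np
    from scipy.interpolate import griddata

    # Hypothetical sparse points: (speed [rpm], torque [Nm]) -> fuel rate [g/s]
    pts = np.array([[1000, 50], [1000, 150], [2000, 50], [2000, 150], [3000, 100]])
    fuel = np.array([0.55, 1.40, 1.10, 2.90, 2.30])

    # Dense grid for the full map consumed by a vehicle simulation
    speed, torque = np.meshgrid(np.arange(1000, 3001, 250), np.arange(50, 151, 25))
    full_map = griddata(pts, fuel, (speed, torque), method="linear")

    # Grid cells outside the convex hull of the data come back NaN; fall back
    # to nearest-neighbour values there instead of extrapolating blindly.
    nearest = griddata(pts, fuel, (speed, torque), method="nearest")
    full_map = np.where(np.isnan(full_map), nearest, full_map)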
NASA Astrophysics Data System (ADS)
Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan
2018-03-01
Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining certification to SNI (Indonesian National Standard). This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification, whether for products or processes. In this research, a readiness assessment model for obtaining SNI certification is designed for SMEs. The stages of model development use the innovation-adoption approach of Rogers (2003). Variables that affect the readiness of SMEs are obtained from the product certification requirements established by BSN (National Standardization Agency) and LSPro (certification bodies). The model will be used for mapping the readiness of SMEs' products for SNI certification. The level of readiness of an SME is determined by the percentage of compliance with those requirements. Based on the results of this study, five variables are identified as the main aspects for assessing SME readiness. For model validation, trials were conducted on Batik SMEs in Laweyan, Surakarta.
Solubility and dissolution thermodynamics of phthalic anhydride in organic solvents at 283-313 K
NASA Astrophysics Data System (ADS)
Wang, Long; Zhang, Fang; Gao, Xiaoqiang; Luo, Tingliang; Xu, Li; Liu, Guoji
2017-08-01
The solubility of phthalic anhydride was measured at 283-313 K under atmospheric pressure in ethyl acetate, n-propyl acetate, methyl acetate, acetone, 1,4-dioxane, n-hexane, n-butyl acetate, cyclohexane, and dichloromethane. The solubility of phthalic anhydride in all solvents increased with increasing temperature. The van't Hoff equation, the modified Apelblat equation, the λh equation, and the Wilson model were used to correlate the experimental solubility data. The standard dissolution enthalpy, standard entropy, and standard Gibbs energy were evaluated based on the van't Hoff analysis. The experimental data and model parameters would be useful for optimizing separation processes involving phthalic anhydride.
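For reference, the correlating equations named above are commonly written as follows, with x the mole-fraction solubility, T the absolute temperature, A, B, C and λ, h fitted parameters, and T_m the melting temperature (the exact parameterizations used in the paper may differ slightly):

    % van't Hoff:
    \ln x = -\frac{\Delta H_d^{\circ}}{RT} + \frac{\Delta S_d^{\circ}}{R}
    % modified Apelblat:
    \ln x = A + \frac{B}{T} + C \ln T
    % Buchowski lambda-h:
    \ln\left[ 1 + \frac{\lambda (1 - x)}{x} \right] = \lambda h \left( \frac{1}{T} - \frac{1}{T_m} \right)
    % standard Gibbs energy of dissolution:
    \Delta G^{\circ} = \Delta H^{\circ} - T \Delta S^{\circ}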
NASA Astrophysics Data System (ADS)
Michalicek, M. Adrian; Comtois, John H.; Schriner, Heather K.
1998-04-01
This paper describes the design and characterization of several types of micromirror devices, including process capabilities, device modeling, and test data resulting in deflection versus applied potential curves and surface contour measurements. These devices are the first to be fabricated in the state-of-the-art four-level planarized polysilicon process available at Sandia National Laboratories, known as the Sandia Ultra-planar Multi-level MEMS Technology. This enabling process permits the development of micromirror devices with near-ideal characteristics which have previously been unrealizable in standard three-layer polysilicon processes, such as elevated address electrodes, various address wiring techniques, planarized mirror surfaces using Chemical Mechanical Polishing, unique post-process metallization, and the best active surface area to date.
Farnell, D J J; Popat, H; Richmond, S
2016-06-01
Methods used in image processing should reflect any multilevel structures inherent in the image dataset or they run the risk of functioning inadequately. We wish to test the feasibility of multilevel principal components analysis (PCA) to build active shape models (ASMs) for cases relevant to medical and dental imaging. Multilevel PCA was used to carry out model fitting to sets of landmark points and it was compared to the results of "standard" (single-level) PCA. Proof of principle was tested by applying mPCA to model basic peri-oral expressions (happy, neutral, sad) approximated to the junction between the mouth/lips. Monte Carlo simulations were used to create this data which allowed exploration of practical implementation issues such as the number of landmark points, number of images, and number of groups (i.e., "expressions" for this example). To further test the robustness of the method, mPCA was subsequently applied to a dental imaging dataset utilising landmark points (placed by different clinicians) along the boundary of mandibular cortical bone in panoramic radiographs of the face. Changes of expression that varied between groups were modelled correctly at one level of the model and changes in lip width that varied within groups at another for the Monte Carlo dataset. Extreme cases in the test dataset were modelled adequately by mPCA but not by standard PCA. Similarly, variations in the shape of the cortical bone were modelled by one level of mPCA and variations between the experts at another for the panoramic radiographs dataset. Results for mPCA were found to be comparable to those of standard PCA for point-to-point errors via miss-one-out testing for this dataset. These errors reduce with increasing number of eigenvectors/values retained, as expected. We have shown that mPCA can be used in shape models for dental and medical image processing. mPCA was found to provide more control and flexibility when compared to standard "single-level" PCA. Specifically, mPCA is preferable to "standard" PCA when multiple levels occur naturally in the dataset. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
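The core idea, separate PCA decompositions for between-group and within-group variation, can be sketched in a few lines. This is a generic two-level decomposition under simplifying assumptions (simple mean-centering, equal treatment of groups), not the authors' exact formulation; all names and toy data are illustrative.

    import numpy as np

    def two_level_pca(X, groups):
        # Level 1: PCA of group means (between-group variation, e.g. expression);
        # Level 2: PCA of residuals about each group mean (within-group variation).
        labels = np.unique(groups)
        means = np.array([X[groups == g].mean(axis=0) for g in labels])
        within = np.vstack([X[groups == g] - X[groups == g].mean(axis=0)
                            for g in labels])

        def pca(M):
            M = M - M.mean(axis=0)
            _, s, vt = np.linalg.svd(M, full_matrices=False)
            return s**2 / max(len(M) - 1, 1), vt   # eigenvalues, eigenvectors

        return pca(means), pca(within)

    # Toy data: 3 groups of 10 shapes, 8 landmark coordinates each
    rng = np.random.default_rng(2)
    X = np.concatenate([rng.normal(loc=m, size=(10, 8)) for m in (0.0, 1.0, 2.0)])
    g = np.repeat([0, 1, 2], 10)
    (between_eigvals, _), (within_eigvals, _) = two_level_pca(X, g)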
An administrative claims model for profiling hospital 30-day mortality rates for pneumonia patients.
Bratzler, Dale W; Normand, Sharon-Lise T; Wang, Yun; O'Donnell, Walter J; Metersky, Mark; Han, Lein F; Rapp, Michael T; Krumholz, Harlan M
2011-04-12
Outcome measures for patients hospitalized with pneumonia may complement process measures in characterizing quality of care. We sought to develop and validate a hierarchical regression model using Medicare claims data that produces hospital-level, risk-standardized 30-day mortality rates useful for public reporting for patients hospitalized with pneumonia. Retrospective study of fee-for-service Medicare beneficiaries age 66 years and older with a principal discharge diagnosis of pneumonia. Candidate risk-adjustment variables included patient demographics, administrative diagnosis codes from the index hospitalization, and all inpatient and outpatient encounters from the year before admission. The model derivation cohort included 224,608 pneumonia cases admitted to 4,664 hospitals in 2000, and validation cohorts included cases from each of years 1998-2003. We compared model-derived state-level standardized mortality estimates with medical record-derived state-level standardized mortality estimates using data from the Medicare National Pneumonia Project on 50,858 patients hospitalized from 1998-2001. The final model included 31 variables and had an area under the Receiver Operating Characteristic curve of 0.72. In each administrative claims validation cohort, model fit was similar to the derivation cohort. The distribution of standardized mortality rates among hospitals ranged from 13.0% to 23.7%, with 25th, 50th, and 75th percentiles of 16.5%, 17.4%, and 18.3%, respectively. Comparing model-derived risk-standardized state mortality rates with medical record-derived estimates, the correlation coefficient was 0.86 (Standard Error = 0.032). An administrative claims-based model for profiling hospitals for pneumonia mortality performs consistently over several years and produces hospital estimates close to those using a medical record model.
Berner, Logan T; Law, Beverly E
2016-01-19
Plant trait measurements are needed for evaluating ecological responses to environmental conditions and for ecosystem process model development, parameterization, and testing. We present a standardized dataset integrating measurements from projects conducted by the Terrestrial Ecosystem Research and Regional Analysis-Pacific Northwest (TERRA-PNW) research group between 1999 and 2014 across Oregon and Northern California, where measurements were collected for scaling and modeling regional terrestrial carbon processes with models such as Biome-BGC and the Community Land Model. The dataset contains measurements of specific leaf area, leaf longevity, leaf carbon and nitrogen for 35 tree and shrub species derived from more than 1,200 branch samples collected from over 200 forest plots, including several AmeriFlux sites. The dataset also contains plot-level measurements of forest composition, structure (e.g., tree biomass), and productivity, as well as measurements of soil structure (e.g., bulk density) and chemistry (e.g., carbon). Publicly archiving regional datasets of standardized, co-located, and geo-referenced plant trait measurements will advance the ability of earth system models to capture species-level climate sensitivity at regional to global scales.
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models must be monitored in context all along the design phases to build confidence in the achievement of the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling framework optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during extreme hydrologic events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.
Exotic Leptons. Higgs, Flavor and Collider Phenomenology
Altmannshofer, Wolfgang; Bauer, Martin; Carena, Marcela
2014-01-15
We study extensions of the standard model by one generation of vector-like leptons with non-standard hypercharges, which allow for a sizable modification of the h → γγ decay rate for new lepton masses in the 300 GeV-1 TeV range. We also analyze vacuum stability implications for different hypercharges. Effects in h → Zγ are typically much smaller than in h → γγ, but distinct among the considered hypercharge assignments. Non-standard hypercharges constrain or entirely forbid possible mixing operators with standard model leptons. As a consequence, the leading contributions to the experimentally strongly constrained electric dipole moments of standard model fermions are only generated at the two-loop level by the new CP-violating sources of the considered setups. Furthermore, we derive the bounds from dipole moments, electro-weak precision observables and lepton flavor violating processes, and discuss their implications. Finally, we examine the production and decay channels of the vector-like leptons at the LHC, and find that signatures with multiple light leptons or taus are already probing interesting regions of parameter space.
Robust geographically weighted regression of modeling the Air Polluter Standard Index (APSI)
NASA Astrophysics Data System (ADS)
Warsito, Budi; Yasin, Hasbi; Ispriyanti, Dwi; Hoyyi, Abdul
2018-05-01
The Geographically Weighted Regression (GWR) model has been widely applied in many practical fields for exploring the spatial heterogeneity of a regression model. However, the method is inherently not robust to outliers. Outliers commonly exist in data sets and may lead to a distorted estimate of the underlying regression model. One solution for handling outliers in a regression model is to use robust estimation; the resulting model is called Robust Geographically Weighted Regression (RGWR). This research aims to aid the government in the policy-making process related to air pollution mitigation by developing a standard index model for air pollution (the Air Polluter Standard Index, APSI) based on the RGWR approach. We consider seven variables that are directly related to the air pollution level: traffic velocity, population density, the business-center aspect, air humidity, wind velocity, air temperature, and the area of urban forest. The best model is determined by the smallest AIC value. There are significant differences between ordinary regression and RGWR in this case, but basic GWR using the Gaussian kernel is the best model for the APSI because it has the smallest AIC.
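The core GWR computation is a locally weighted least-squares fit. The sketch below shows that building block with a Gaussian kernel; the data, bandwidth, and variables are invented for illustration, and the robust (RGWR) variant would additionally downweight observations with outlying residuals.

```python
# Local weighted least-squares fit at one location, the GWR building block.
import numpy as np

def gwr_local_fit(X, y, coords, site, bandwidth=2.0):
    """Return local coefficients (intercept + slopes) at `site`."""
    d = np.linalg.norm(coords - site, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)       # Gaussian kernel weights
    Xd = np.column_stack([np.ones(len(y)), X])    # design matrix w/ intercept
    sw = np.sqrt(w)[:, None]                      # sqrt-weights for lstsq
    beta, *_ = np.linalg.lstsq(sw * Xd, sw[:, 0] * y, rcond=None)
    return beta

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))         # monitoring-site locations
X = rng.normal(size=(50, 2))                      # e.g. traffic, humidity
y = 1.0 + X @ np.array([0.5, -0.3]) + rng.normal(scale=0.1, size=50)
print(gwr_local_fit(X, y, coords, site=np.array([5.0, 5.0])))
```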
Monte Carlo based toy model for fission process
NASA Astrophysics Data System (ADS)
Kurniadi, R.; Waris, A.; Viridi, S.
2014-09-01
There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely a macroscopic approach and a microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model; hence, the model does not completely represent the real fission process in nature. The toy model is formed by a Gaussian distribution of random numbers that randomizes distances, such as the distance between a particle and a central point. The scission process is started by smashing the compound nucleus central point into two parts, a left central point and a right central point. These three points have different Gaussian distribution parameters, namely the means (μCN, μL, μR) and standard deviations (σCN, σL, σR). By overlaying the three distributions, the numbers of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant. The smashing process is then repeated, changing σL and σR randomly.
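A minimal sketch of the toy model as described: Gaussian-distributed nucleons around the compound-nucleus centre, left and right centres that trap nearby particles, and repeated smashing with randomly changed widths. All parameter values are illustrative, not those used in the paper.

```python
# Monte Carlo toy fission: count particles trapped by left/right centres.
import random

def trap_counts(particles, mu_l, mu_r, sigma_l, sigma_r):
    """Count particles within one width of the left/right centres."""
    n_l = sum(abs(x - mu_l) < sigma_l for x in particles)
    n_r = sum(abs(x - mu_r) < sigma_r for x in particles)
    return n_l, n_r

def toy_fission(n_events=1000, n_particles=236, sigma_cn=3.0,
                mu_l=-2.0, mu_r=2.0):
    yields = []
    for _ in range(n_events):                 # repeated smashing ...
        sigma_l = random.uniform(1.0, 2.5)    # ... with random widths
        sigma_r = random.uniform(1.0, 2.5)
        particles = [random.gauss(0.0, sigma_cn) for _ in range(n_particles)]
        yields.append(trap_counts(particles, mu_l, mu_r, sigma_l, sigma_r))
    return yields

random.seed(0)
pairs = toy_fission()
print("mean (N_L, N_R):",
      tuple(round(sum(p[i] for p in pairs) / len(pairs), 1) for i in (0, 1)))
```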
Experimental constraints from flavour changing processes and physics beyond the Standard Model.
Gersabeck, M; Gligorov, V V; Serra, N
Flavour physics has a long tradition of paving the way for direct discoveries of new particles and interactions. Results over the last decade have placed stringent bounds on the parameter space of physics beyond the Standard Model. Early results from the LHC, and its dedicated flavour factory LHCb, have further tightened these constraints and reiterate the ongoing relevance of flavour studies. The experimental status of flavour observables in the charm and beauty sectors is reviewed, covering measurements of CP violation, neutral meson mixing, and rare decays.
McKoon, Gail; Ratcliff, Roger
2016-01-01
Millions of adults in the United States lack the literacy skills necessary for most living-wage jobs. For students from adult learning classes, we used a lexical decision task to measure their knowledge of words, and we used a decision-making model (Ratcliff's (1978) diffusion model) to abstract the mechanisms underlying their performance from their RTs and accuracy. We also collected scores for each participant on standardized IQ tests and standardized reading tests commonly used in the education literature. We found significant correlations between the model's estimates of the strengths with which words are represented in memory and scores on some of the standardized tests but not others. The findings point to the feasibility and utility of combining a test of word knowledge (lexical decision) that is well established in psycholinguistic research, a decision-making model that supplies information about underlying mechanisms, and standardized tests. The goal for future research is to use this combination of approaches to better understand how basic processes relate to standardized tests, with the eventual aim of understanding what these tests are measuring and what the specific difficulties are for individual low-literacy adults. PMID:26550803
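For readers unfamiliar with the diffusion model, the sketch below simulates its core mechanism: noisy evidence accumulation between two boundaries jointly produces a response and an RT. Parameter values are illustrative, not the values fitted to the adult learners' data.

```python
# Drift-diffusion trial simulator: evidence drifts at rate v with Gaussian
# noise until it crosses 0 or a; crossing time plus non-decision time is RT.
import random

def diffusion_trial(v=0.3, a=1.0, z=0.5, dt=0.001, sigma=1.0, t0=0.3):
    """Return (response, rt): 1 = upper boundary hit, 0 = lower."""
    x, t = z * a, 0.0
    while 0.0 < x < a:
        x += v * dt + sigma * random.gauss(0.0, dt ** 0.5)
        t += dt
    return (1 if x >= a else 0), t + t0       # add non-decision time

random.seed(1)
trials = [diffusion_trial() for _ in range(2000)]
acc = sum(r for r, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"accuracy={acc:.2f}, mean RT={mean_rt:.3f}s")
```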
ERIC Educational Resources Information Center
Ryan, David L.
2010-01-01
While research in academic and professional information technology (IT) journals addresses the need for strategic alignment and defined IT processes, there is little research on what factors should be considered when implementing specific IT hardware standards in an organization. The purpose of this study was to develop a set of factors for…
ERIC Educational Resources Information Center
Howard, Kathryn M.
2009-01-01
This paper examines how Northern Thai (Muang) children are socialized into the discourses and practices of respect in school, a process that indexically links Standard Thai to images of polite and respectful Thai citizenship. Focusing on the socialization of politeness particles, the paper examines how cultural models of conduct are taken up,…
On the Higgs-like boson in the minimal supersymmetric 3-3-1 model
NASA Astrophysics Data System (ADS)
Ferreira, J. G.; Pires, C. A. de S.; da Silva, P. S. Rodrigues; Siqueira, Clarissa
2018-03-01
It is imperative that any proposal of new physics beyond the standard model possess a Higgs-like boson with a mass of 125 GeV and couplings with the standard particles that recover the branching ratios and signal strengths measured by CMS and ATLAS. We address this issue within the supersymmetric version of the minimal 3-3-1 model. For this we develop the Higgs potential, with focus on the lightest Higgs provided by the model. Our aim is to verify whether it recovers the properties of the Standard Model Higgs. With respect to its mass, we calculate it up to one-loop level, taking into account all contributions provided by the model. In regard to its couplings, we restrict our investigation to couplings of the Higgs-like boson with the standard particles only. We then calculate the dominant branching ratios and the respective signal strengths and confront our results with the recent measurements of CMS and ATLAS. As distinctive aspects, we remark that our Higgs-like boson mediates flavor-changing neutral processes and has the decay t → h+c as a signature. We calculate its branching ratio and compare it with current bounds. We also show that the Higgs potential of the model is stable for the region of parameter space employed in our calculations.
A UML approach to process modelling of clinical practice guidelines for enactment.
Knape, T; Hederman, L; Wade, V P; Gargan, M; Harris, C; Rahman, Y
2003-01-01
Although clinical practice guidelines (CPGs) have been suggested as a means of encapsulating best practice in evidence-based medical treatment, their usage in clinical environments has been disappointing. Criticisms of guideline representations have been that they are predominantly narrative and are difficult to incorporate into clinical information systems. This paper analyses the use of UML process modelling techniques for guideline representation and proposes the automated generation of executable guidelines using XMI. This hybrid UML-XMI approach provides flexible authoring of guideline decision and control structures whilst integrating appropriate data flow. It also uses the open XMI standard interface to allow the use of authoring tools and process control systems from multiple vendors. The paper first surveys CPG modelling formalisms, followed by a brief introduction to process modelling in UML. The modelling of CPGs in UML is then presented, leading to a case study encoding a diabetes mellitus CPG in UML.
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the most common way to deal with geographical information and processes still appears to be consuming local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards for exchanging data on the World Wide Web, reinforced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in the environmental sciences) use and manage spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, the SDI set up for researchers' needs in our department. It is based on the existing open-source, modular and interoperable spatial data architecture geOrchestra.
Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef
2010-01-01
It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards: 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. Taken together, these standards have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products and climatological analyses, and they are gaining rapid acceptance for ocean observations. We provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast this with the path by which the World Meteorological Organization (WMO) has advanced the Global Telecommunications System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom up" standards process, whereas GTS is "top down". Both are tales of success at achieving specific purposes, yet each is hampered by technical limitations that sometimes lead to controversy over whether alternative technological directions should be pursued. Finally, we draw general conclusions regarding the factors that affect the success of a standards development effort - the likelihood that an IT standard will meet its design goals and achieve community-wide acceptance. We believe that a more thoughtful awareness among scientists, program managers and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community reach our interoperability goals faster.
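As a concrete illustration of what the trio buys in practice, the hedged sketch below opens a remote dataset by URL with the netCDF4-python library (which speaks OPeNDAP when built with DAP support) and subsets it server-side; the URL and variable name are hypothetical placeholders, not a real endpoint.

```python
# Remote, lazy access to a gridded dataset over OPeNDAP; CF attributes
# (units, long_name) travel with the data. URL and names are hypothetical.
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/sst_analysis.nc"  # placeholder
ds = Dataset(url)                       # opens remotely, no bulk download
sst = ds.variables["sst"]               # hypothetical CF-described variable
print(sst.units, sst.long_name)         # CF metadata, e.g. degC, SST
slab = sst[0, 100:110, 200:210]         # only this slab crosses the network
print(slab.mean())
ds.close()
```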
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalicek, M.A.; Comtois, J.H.; Barron, C.C.
This paper describes the design and characterization of several types of micromirror devices to include process capabilities, device modeling, and test data resulting in deflection versus applied potential curves. These micromirror devices are the first to be fabricated in the state-of-the-art four-level planarized polysilicon process available at Sandia National Laboratories known as the Sandia Ultra-planar Multi-level MEMS Technology (SUMMiT). This enabling process permits the development of micromirror devices with near-ideal characteristics which have previously been unrealizable in standard three-layer polysilicon processes. This paper describes such characteristics as elevated address electrodes, individual address wiring beneath the device, planarized mirror surfaces using Chemical Mechanical Polishing (CMP), unique post-process metallization, and the best active surface area to date. This paper presents the design, fabrication, modeling, and characterization of several variations of Flexure-Beam (FBMD) and Axial-Rotation Micromirror Devices (ARMD). The released devices are first metallized using a standard sputtering technique relying on metallization guards and masks that are fabricated next to the devices. Such guards are shown to enable the sharing of bond pads between numerous arrays of micromirrors in order to maximize the number of on-chip test arrays. The devices are modeled and then empirically characterized using a laser interferometer setup located at the Air Force Institute of Technology (AFIT) at Wright-Patterson AFB in Dayton, Ohio. Unique design considerations for these devices and the process are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmes, Steve; Alber, Russ; Asner, David
2013-06-23
Particle physics has made enormous progress in understanding the nature of matter and forces at a fundamental level and has unlocked many mysteries of our world. The development of the Standard Model of particle physics has been a magnificent achievement of the field. Many deep and important questions have been answered and yet many mysteries remain. The discovery of neutrino oscillations, discrepancies in some precision measurements of Standard-Model processes, observation of matter-antimatter asymmetry, the evidence for the existence of dark matter and dark energy, all point to new physics beyond the Standard Model. The pivotal developments of our field, including the latest discovery of the Higgs Boson, have progressed within three interlocking frontiers of research – the Energy, Intensity and Cosmic frontiers – where discoveries and insights in one frontier powerfully advance the other frontiers as well.
Khachatryan, Vardan
2015-04-22
A search is presented for physics beyond the standard model in final states with two opposite-sign same-flavor leptons, jets, and missing transverse momentum. The data sample corresponds to an integrated luminosity of 19.4 fb-1 of proton-proton collisions at √s = 8 TeV collected with the CMS detector at the CERN LHC in 2012. The analysis focuses on searches for a kinematic edge in the invariant mass distribution of the opposite-sign same-flavor lepton pair and for final states with an on-shell Z boson. The observations are consistent with expectations from standard model processes and are interpreted in terms of upper limits on the production of supersymmetric particles.
Structuring Legacy Pathology Reports by openEHR Archetypes to Enable Semantic Querying.
Kropf, Stefan; Krücken, Peter; Mueller, Wolf; Denecke, Kerstin
2017-05-18
Clinical information is often stored as free text, e.g. in discharge summaries or pathology reports. These documents are semi-structured using section headers, numbered lists, items and classification strings. However, it is still challenging to retrieve relevant documents since keyword searches applied on complete unstructured documents result in many false positive retrieval results. We are concentrating on the processing of pathology reports as an example for unstructured clinical documents. The objective is to transform reports semi-automatically into an information structure that enables an improved access and retrieval of relevant data. The data is expected to be stored in a standardized, structured way to make it accessible for queries that are applied to specific sections of a document (section-sensitive queries) and for information reuse. Our processing pipeline comprises information modelling, section boundary detection and section-sensitive queries. For enabling a focused search in unstructured data, documents are automatically structured and transformed into a patient information model specified through openEHR archetypes. The resulting XML-based pathology electronic health records (PEHRs) are queried by XQuery and visualized by XSLT in HTML. Pathology reports (PRs) can be reliably structured into sections by a keyword-based approach. The information modelling using openEHR allows saving time in the modelling process since many archetypes can be reused. The resulting standardized, structured PEHRs allow accessing relevant data by retrieving data matching user queries. Mapping unstructured reports into a standardized information model is a practical solution for a better access to data. Archetype-based XML enables section-sensitive retrieval and visualisation by well-established XML techniques. Focussing the retrieval to particular sections has the potential of saving retrieval time and improving the accuracy of the retrieval.
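A minimal sketch of the keyword-based section boundary detection the pipeline relies on, assuming a small illustrative lexicon of pathology section headers rather than the authors' exact keyword list:

```python
# Keyword-based section splitter: header lines open sections, other lines
# accrue to the current section. Header names are illustrative only.
import re

HEADERS = re.compile(r"^(clinical (history|information)|gross description|"
                     r"microscopic (description|examination)|diagnosis)\s*:",
                     re.IGNORECASE)

def split_sections(report):
    sections, current = {}, "preamble"
    for line in report.splitlines():
        m = HEADERS.match(line.strip())
        if m:
            current = m.group(1).lower()
            sections[current] = line.split(":", 1)[1].strip()
        else:
            sections[current] = (sections.get(current, "") + " " + line).strip()
    return sections

demo = "Clinical history: melanoma follow-up.\nDiagnosis: benign nevus."
print(split_sections(demo))
```

A section-sensitive query then simply restricts its search to one key of the resulting mapping, which is the behaviour the XQuery layer provides over the archetype-based XML.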
Modeling treatment of ischemic heart disease with partially observable Markov decision processes.
Hauskrecht, M; Fraser, H
1998-01-01
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to treatment, and the varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs), developed and used in the operations research, control theory and artificial intelligence communities, is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease, and point out the modeling advantages of the framework over standard decision formalisms.
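A minimal sketch of the POMDP machinery referred to above: the physician never observes the true disease state, only test results, and maintains a belief distribution updated by Bayes' rule after each action/observation pair. The states, action, and probabilities here are invented for illustration.

```python
# Bayesian belief update, the core POMDP step:
#   b'(s') ∝ O[a][s'][o] * Σ_s T[a][s][s'] * b(s)
def belief_update(belief, action, obs, T, O):
    states = range(len(belief))
    new_b = [O[action][s2][obs] * sum(T[action][s][s2] * belief[s]
                                      for s in states) for s2 in states]
    total = sum(new_b)
    return [p / total for p in new_b]

# two states (0 = no ischemia, 1 = ischemia), one action ("stress test"),
# two observations (0 = negative, 1 = positive); all numbers illustrative
T = {"test": [[1.0, 0.0], [0.0, 1.0]]}   # testing does not change the state
O = {"test": [[0.9, 0.1], [0.2, 0.8]]}   # toy sensitivity/specificity
b = [0.7, 0.3]
print(belief_update(b, "test", 1, T, O))  # positive test raises the belief
```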
Bulley, Catherine; Donaghy, Marie
2008-01-01
In a world of rapidly developing knowledge it is important that professions describe their roles and capabilities. The need for a thorough description of sports physiotherapy was addressed through collaboration between the International Federation of Sports Physiotherapy (IFSP) and five European higher education institutions. This resulted in the Sports Physiotherapy for All Project, which has been successful in developing internationally accepted competencies and standards for sports physiotherapists. This article describes and reflects on the process to communicate useful lessons. A competency model was chosen to facilitate differentiation and communication of aspects of sports physiotherapy practice. Documentation relating to sports physiotherapy practice was collected from 16 countries and analysed thematically. A cut-and-paste method was used by a panel of experts to allocate themes to areas of practice within the competency model. Theme groups were used to select areas of practice for description in competency form. Standards were derived from competencies following in-depth discussion with the expert panel and triangulation with themes derived from international documentation. A rigorous process of international review and revision led to the final list of 11 competencies and related standards, both accepted by the IFSP. This work provides a foundation for the development of an audit toolkit to guide the demonstration and evaluation of competencies and standards, which in turn supports targeted career development activities, appropriate provision of training opportunities, and quality enhancement. The experiences gained during this project can inform other health professions and their specialisms when embarking on a similar journey.
Guideline validation in multiple trauma care through business process modeling.
Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen
2003-07-01
Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected across seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process modeling tools that check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.
Software Engineering Laboratory (SEL) cleanroom process model
NASA Technical Reports Server (NTRS)
Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon
1991-01-01
The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.
In the United States, regional-scale photochemical models are being used to design emission control strategies needed to meet the relevant National Ambient Air Quality Standards (NAAQS) within the framework of the attainment demonstration process. Previous studies have shown that...
Modelling rollover behaviour of excavator-based forest machines
M.W. Veal; S.E. Taylor; Robert B. Rummer
2003-01-01
This poster presentation provides results from analytical and computer simulation models of rollover behaviour of hydraulic excavators. These results are being used as input to the operator protective structure standards development process. Results from rigid body mechanics and computer simulation methods agree well with field rollover test data. These results show...
Predictive and Prognostic Models: Implications for Healthcare Decision-Making in a Modern Recession
Vogenberg, F. Randy
2009-01-01
Various modeling tools have been developed to address the lack of standardized processes that incorporate the perspectives of all healthcare stakeholders. Such models can assist in the decision-making process aimed at achieving specific clinical outcomes, as well as guide the allocation of healthcare resources and reduce costs. The current efforts in Congress to change the way healthcare is financed, reimbursed, and delivered have rendered the incorporation of modeling tools into clinical decision-making all the more important. Prognostic and predictive models are particularly relevant to healthcare, especially in clinical decision-making, with implications for payers, patients, and providers. The use of these models is likely to increase as providers and patients seek to improve their clinical decision process to achieve better outcomes while reducing overall healthcare costs. PMID:25126292
A Stochastic Evolutionary Model for Protein Structure Alignment and Phylogeny
Challis, Christopher J.; Schmidler, Scott C.
2012-01-01
We present a stochastic process model for the joint evolution of protein primary and tertiary structure, suitable for use in alignment and estimation of phylogeny. Indels arise from a classic Links model, and mutations follow a standard substitution matrix, whereas backbone atoms diffuse in three-dimensional space according to an Ornstein–Uhlenbeck process. The model allows for simultaneous estimation of evolutionary distances, indel rates, structural drift rates, and alignments, while fully accounting for uncertainty. The inclusion of structural information enables phylogenetic inference on time scales not previously attainable with sequence evolution models. The model also provides a tool for testing evolutionary hypotheses and improving our understanding of protein structural evolution. PMID:22723302
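A minimal sketch of the structural-drift component: one coordinate diffusing under an Ornstein-Uhlenbeck process, simulated with its exact discrete-time transition. Parameters are illustrative rather than estimates from protein data.

```python
# Exact OU transition: X_{t+dt} | X_t ~ N(mu + a(X_t - mu), sd^2),
# with a = exp(-theta*dt) and sd^2 = sigma^2 (1 - a^2) / (2 theta).
import math, random

def ou_path(x0, theta=0.5, mu=0.0, sigma=1.0, dt=0.1, n=100):
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))
    xs = [x0]
    for _ in range(n):
        xs.append(mu + a * (xs[-1] - mu) + sd * random.gauss(0, 1))
    return xs

random.seed(2)
path = ou_path(5.0)
print(f"start={path[0]:.2f}, end={path[-1]:.2f} (mean-reverts toward mu=0)")
```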
NASA Technical Reports Server (NTRS)
Nelson, Emily S.; Mulugeta, Lealem; Walton, Marlei; Myers, Jerry G.
2014-01-01
In the wake of the Columbia accident, the NASA-STD-7009 [1] credibility assessment was developed as a unifying platform for describing model credibility and the uncertainties in modeling predictions. This standard is now being adapted by NASA's Human Research Program to cover a wide range of numerical models for human research. When used properly, the standard can improve the process of code development by encouraging the use of best practices. It can also give management more insight for making informed decisions through a better understanding of the model's capabilities and limitations. To a newcomer, the abstractions presented in NASA-STD-7009 and the sheer volume of information that must be absorbed can be overwhelming. This talk is aimed at describing the credibility assessment, which is the heart of the standard, in plain terms. It will outline how to develop a credibility assessment under the standard. It will also show how to quickly interpret the graphs and tables that result from the assessment and how to drill down from the top-level view to the foundation of the assessment. Finally, it will highlight some of the resources that are available for further study.
Linking Goal-Oriented Requirements and Model-Driven Development
NASA Astrophysics Data System (ADS)
Pastor, Oscar; Giachetti, Giovanni
In the context of Goal-Oriented Requirements Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtaining and representing the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still performed manually. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i* framework with an industrially applied MDD approach. The proposed linking approach is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application to different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure correct model transformations.
Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node
NASA Astrophysics Data System (ADS)
Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten
2016-04-01
The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. An ESGF user may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continuously increase during the next 5 years. IPSL holds replicas of the output of different global and regional climate models, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. In order to let scientists perform analysis of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC) is used within the framework of the birdhouse software. Processes can be run remotely by the user through a web-based WPS client or a command-line tool. All calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they are downloaded and cached by the WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files. We present the architecture of WPS at IPSL along with the processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.:
- regridding/interpolation/aggregation
- ocgis (OpenClimateGIS) based polygon subsetting of the data
- average seasonal cycle, multimodel mean, multimodel mean bias
- calculation of climate indices with the icclim library (CERFACS)
- atmospheric modes of variability
In order to evaluate the performance of any new model once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node following the needs of the scientific community.
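To make the WPS idea concrete, the sketch below outlines a server-side process in the style of the PyWPS 4 API used by the birdhouse stack; the process identifier and the dummy computation are hypothetical stand-ins for the icclim/ESMValTool processes named above, and details of the API may differ across PyWPS versions.

```python
# A server-side process sketch in the style of PyWPS 4 (birdhouse stack).
# Identifier and computation are hypothetical placeholders.
from pywps import Process, LiteralInput, LiteralOutput

class SummerDays(Process):
    def __init__(self):
        super().__init__(
            self._handler,
            identifier="summer_days",            # hypothetical process id
            title="Count of days with tasmax > 25C",
            inputs=[LiteralInput("dataset", "ESGF dataset id",
                                 data_type="string")],
            outputs=[LiteralOutput("count", "Number of summer days",
                                   data_type="integer")])

    def _handler(self, request, response):
        dataset = request.inputs["dataset"][0].data
        # a real process would fetch/cache the file via synda and open it
        # with netCDF4 here; a dummy value keeps this sketch self-contained
        response.outputs["count"].data = 42
        return response
```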
Radar Unix: a complete package for GPR data processing
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Durand, Herve
1999-03-01
A complete package for ground-penetrating radar data interpretation, including data processing, forward modeling and consultation of a case-history database, is presented. Running on a Unix operating system, its architecture consists of a graphical user interface generating batch files that are passed to a library of processing routines. This design allows better software maintenance and gives the user the possibility to run processing or modeling batch files directly, deferred in time. A case-history database is available; it consists of a hypertext document that can be consulted using a standard HTML browser. All the software specifications are presented through a realistic example.
D'Andrade, Roy G; Romney, A Kimball
2003-05-13
This article presents a computational model of the process through which the human visual system transforms reflectance spectra into perceptions of color. Using physical reflectance spectra data and standard human cone sensitivity functions we describe the transformations necessary for predicting the location of colors in the Munsell color space. These transformations include quantitative estimates of the opponent process weights needed to transform cone activations into Munsell color space coordinates. Using these opponent process weights, the Munsell position of specific colors can be predicted from their physical spectra with a mean correlation of 0.989.
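A minimal sketch of the pipeline described: project a reflectance spectrum onto cone sensitivity curves, then apply a linear opponent transform. The toy Gaussian sensitivity curves and the opponent weight matrix are stand-ins, not the fitted values behind the reported 0.989 correlation.

```python
# Reflectance spectrum -> cone activations -> opponent channels.
import numpy as np

wavelengths = np.linspace(400, 700, 31)                 # nm
def toy_cone(peak, width=60):                           # toy sensitivity curve
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

S = np.stack([toy_cone(560), toy_cone(530), toy_cone(420)])  # L, M, S cones

def to_opponent(reflectance, illuminant=None):
    light = reflectance if illuminant is None else reflectance * illuminant
    lms = S @ light                                     # cone activations
    W = np.array([[ 1.0,  1.0,  0.0],                   # value-like channel
                  [ 1.0, -2.0,  0.0],                   # red-green channel
                  [ 1.0,  1.0, -2.0]])                  # yellow-blue channel
    return W @ lms

gray = np.full_like(wavelengths, 0.5)                   # flat gray surface
print(to_opponent(gray))
```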
Sabia, Gianpaolo; Ferraris, Marco; Spagni, Alessandro
2016-01-01
This study proposes a model-based evaluation of the effect of different operating conditions, with and without pre-denitrification treatment and applying three different solids retention times, on the fouling mechanisms involved in membrane bioreactors (MBRs). A total of 11 fouling models obtained from the literature were used to fit the transmembrane pressure variations measured in a pilot-scale MBR treating real wastewater for more than 1 year. The results showed that all the models represent reasonable descriptions of the fouling processes in the MBR tested. The model-based analysis confirmed that membrane fouling starts with pore blocking (complete blocking model) and a reduction of the pore diameter (standard blocking), while cake filtration becomes the dominant fouling mechanism over long-term operation. However, the different fouling mechanisms occur almost simultaneously, making it rather difficult to identify each one. The membrane "history" (i.e. age, lifespan, etc.) seems to be a more important factor affecting the fouling mechanism than the applied operating conditions. Nonlinear regression of the most complex models (combined models) evaluated in this study sometimes produced unreliable parameter estimates, suggesting that the four basic fouling models (complete, standard and intermediate blocking, and cake filtration) contain enough detail to represent a reasonable description of the main fouling processes occurring in MBRs.
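A minimal sketch of the model-fitting step: candidate fouling laws are fit to a transmembrane-pressure (TMP) series by nonlinear regression and compared by residual error. The two TMP expressions used here (exponential rise for a blocking-type law, linear rise for cake filtration at constant flux) are simplified stand-ins for the eleven literature models, and the data are synthetic.

```python
# Fit two simplified fouling laws to a synthetic TMP series and compare.
import numpy as np
from scipy.optimize import curve_fit

def blocking(t, p0, k):  return p0 * np.exp(k * t)     # blocking-type law
def cake(t, p0, r):      return p0 + r * t             # cake-filtration law

t = np.linspace(0, 30, 60)                             # days
tmp = 5 + 0.4 * t + np.random.default_rng(1).normal(0, 0.3, t.size)  # kPa

for name, model, start in [("blocking", blocking, (5.0, 0.01)),
                           ("cake", cake, (5.0, 0.5))]:
    popt, _ = curve_fit(model, t, tmp, p0=start)
    sse = np.sum((model(t, *popt) - tmp) ** 2)
    print(f"{name}: params={np.round(popt, 3)}, SSE={sse:.1f}")
```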
Ontological simulation for educational process organisation in a higher educational institution
NASA Astrophysics Data System (ADS)
Berestneva, O. G.; Marukhina, O. V.; Bahvalov, S. V.; Fisochenko, O. N.; Berestneva, E. V.
2017-01-01
Compliance with the new-generation standards requires forming a task list for planning and organising the academic process and for shaping the structure and content of degree programmes. When planning the structure and content of an academic process, one encounters problems such as the need to assess how well degree programmes match the demands of educational and professional standards, and to take into account the demands of today's job market and of students. The paper presents examples of ontological simulation for solving problems of organising the educational process in a higher educational institution and describes the development of the models. Two examples are presented: ontological simulation for planning an educational process in a higher educational institution, and ontological simulation for describing the competences of an IT specialist. The paper concludes that ontologies are a promising approach to formalising the organisation of the educational process in a higher educational institution.
An intraorganizational model for developing and spreading quality improvement innovations
Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.
2017-01-01
Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788
Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G
2013-01-01
Research objective: To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods: Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems—Mayo Clinic and Intermountain Healthcare—were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results: Using CEMs and open-source natural language processing and terminology services engines—namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2)—we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. Conclusions: End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931
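Once records are normalized, the executed quality measure reduces to a cohort filter. The sketch below shows that logic on a hypothetical record layout standing in for CEM-normalized data; it is not the platform's actual API.

```python
# Denominator: patients 18-75 with diabetes; numerator: those whose most
# recent LDL result in the measurement year is < 100 mg/dL.
from datetime import date

patients = [  # hypothetical, already-normalized records
    {"age": 64, "diabetes": True,
     "ldl": [(date(2012, 3, 1), 130.0), (date(2012, 11, 2), 92.0)]},
    {"age": 58, "diabetes": True,
     "ldl": [(date(2012, 6, 5), 118.0)]},
    {"age": 41, "diabetes": False, "ldl": []},
]

denominator = [p for p in patients if p["diabetes"] and 18 <= p["age"] <= 75]
numerator = [p for p in denominator
             if p["ldl"] and max(p["ldl"])[1] < 100.0]  # latest result by date
print(len(denominator), len(numerator))                  # -> 2 1
```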
Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D
2012-08-01
An increasing need for collaboration and resource sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications.
Model-based evaluation of two BNR processes--UCT and A2N.
Hao, X; Van Loosdrecht, M C; Meijer, S C; Qian, Y
2001-08-01
The activity of denitrifying P-accumulating bacteria (DPB) has been verified in most WWTPs with biological nutrient removal (BNR). The modified UCT process has a high content of DPB. A new BNR process with a two-sludge system, named A2N, was developed especially to exploit denitrifying dephosphatation. With identical inflow and effluent standards, an existing full-scale UCT-type WWTP and a designed A2N process were evaluated by simulation. The model used is based on the Delft metabolic model for bio-P removal and the ASM2d model for COD and N removal. Both processes accommodate denitrifying dephosphatation, but the A2N process has a more stable performance in N removal. Although excess sludge is increased by 6%, the A2N process leads to savings of 35, 85 and 30% in aeration energy, mixed liquor internal recirculation and land occupation respectively, as compared to the UCT process. Low temperature has a negative effect on the growth of poly-P bacteria, an effect that appears especially in the A2N process.
ERIC Educational Resources Information Center
Landolfi, Adrienne M.
2016-01-01
As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent to which the communication process between evaluators and teachers impacts teacher performance…
On a more rigorous gravity field processing for future LL-SST type gravity satellite missions
NASA Astrophysics Data System (ADS)
Daras, I.; Pail, R.; Murböck, M.
2013-12-01
In order to meet the growing demands of the user community concerning the accuracy of temporal gravity field models, future gravity missions of low-low satellite-to-satellite tracking (LL-SST) type are planned to carry more precise sensors than their predecessors. A breakthrough is planned with the improved LL-SST measurement link, where the traditional K-band microwave instrument of 1 μm accuracy will be complemented by an inter-satellite ranging instrument of several nm accuracy. This study investigates the potential performance of the new sensors and their impact on gravity field solutions. The processing methods for gravity field recovery have to meet the new sensor standards and be able to take full advantage of the new accuracies they provide. We use full-scale simulations in a realistic environment to investigate whether standard processing techniques suffice to fully exploit the new sensor standards. We do this by performing full numerical closed-loop simulations based on the integral equation approach. In our simulation scheme, we simulate dynamic orbits in a conventional tracking analysis to compute pseudo inter-satellite ranges or range-rates that serve as observables. Each part of the processing is validated separately, with special emphasis on numerical errors and their impact on gravity field solutions. We demonstrate that processing with standard precision may be a limiting factor in taking full advantage of the new-generation sensors that future satellite missions will carry. We have therefore created versions of our simulator with enhanced processing precision, with the primary aim of minimizing round-off errors. Results using the enhanced precision show a large reduction of the system errors that were present in standard-precision processing, even for the error-free scenario, and reveal the improvements the new sensors will bring to gravity field solutions. As a next step, we analyze the contribution of individual error sources to the system's error budget. More specifically, we analyze sensor noise from the laser interferometer and the accelerometers, errors in the kinematic orbits and the background fields, as well as temporal and spatial aliasing errors. We take special care in assessing error sources with stochastic behavior, such as the laser interferometer and the accelerometers, and in their consistent stochastic modeling within the adjustment process.
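One way to picture the round-off issue: naively accumulating millions of nanometre-level increments in double precision loses digits that a compensated (Kahan) summation retains. This is only an illustration of the error class involved; the simulator's actual enhanced-precision scheme is not specified here.

```python
# Compensated summation vs naive accumulation of tiny increments.
def kahan_sum(values):
    """Track and re-add the low-order bits lost by each addition."""
    total = comp = 0.0
    for v in values:
        y = v - comp
        t = total + y
        comp = (t - total) - y
        total = t
    return total

increments = [1e-10] * 10_000_000       # ten million tiny range increments
naive = 0.0
for v in increments:                    # plain accumulation
    naive += v
print(abs(naive - 1e-3), abs(kahan_sum(increments) - 1e-3))
```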
Stanford, Robert E
2004-05-01
This paper uses a non-parametric frontier model and adaptations of the concepts of cross-efficiency and peer-appraisal to develop a formal methodology for benchmarking provider performance in the treatment of Acute Myocardial Infarction (AMI). Parameters used in the benchmarking process are the rates of proper recognition of indications of six standard treatment processes for AMI; the decision making units (DMUs) to be compared are the Medicare eligible hospitals of a particular state; the analysis produces an ordinal ranking of individual hospital performance scores. The cross-efficiency/peer-appraisal calculation process is constructed to accommodate DMUs that experience no patients in some of the treatment categories. While continuing to rate highly the performances of DMUs which are efficient in the Pareto-optimal sense, our model produces individual DMU performance scores that correlate significantly with good overall performance, as determined by a comparison of the sums of the individual DMU recognition rates for the six standard treatment processes. The methodology is applied to data collected from 107 state Medicare hospitals.
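A minimal sketch of the cross-efficiency/peer-appraisal computation for the simplest case of a single unit input and the six recognition rates as outputs: each hospital chooses the output weights most favourable to itself, then every hospital is re-scored under every other hospital's weights. The data are invented, and the paper's handling of DMUs with empty treatment categories is omitted.

```python
# Cross-efficiency with a single unit input: maximize u . y_k subject to
# Y u <= 1, u >= 0, then score all DMUs under each DMU's weights.
import numpy as np
from scipy.optimize import linprog

Y = np.array([[0.9, 0.8, 0.7, 0.95, 0.6, 0.85],    # hospital A rates
              [0.7, 0.9, 0.8, 0.60, 0.9, 0.75],    # hospital B
              [0.5, 0.6, 0.4, 0.70, 0.5, 0.65]])   # hospital C

def self_weights(k):
    # maximize u . Y[k]  <=>  minimize -u . Y[k]
    res = linprog(c=-Y[k], A_ub=Y, b_ub=np.ones(len(Y)),
                  bounds=[(0, None)] * Y.shape[1], method="highs")
    return res.x

cross = np.array([Y @ self_weights(k) for k in range(len(Y))])
print(cross.mean(axis=0))        # peer-appraised score per hospital
```

Averaging down the columns of the cross-efficiency matrix is what turns self-appraisal into the peer-appraisal ranking the paper describes.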
Standardization of domestic frying processes by an engineering approach.
Franke, K; Strijowski, U
2011-05-01
An approach was developed to enable better standardization of the domestic frying of potato products. For this purpose, 5 domestic fryers differing in heating power and oil capacity were used. A precisely defined frying process using a highly standardized model product and a broad range of frying conditions was carried out in these fryers, and the development of browning, an important quality parameter, was measured. Product-to-oil ratio, oil temperature, and frying time were varied. Quite different color changes were measured in the different fryers although the same frying process parameters were applied. The specific energy consumption for water evaporation (spECWE) during frying, relative to product amount, was determined for all frying processes to define an engineering parameter characterizing the frying process. A quasi-linear regression approach was applied to calculate this parameter from frying process settings and fryer properties. The high significance of the regression coefficients and a coefficient of determination close to unity confirmed the suitability of this approach. Based on this regression equation, curves for standard frying conditions (SFC curves) were calculated, which describe the frying conditions required to obtain the same level of spECWE in the different domestic fryers. Comparison of browning results from the different fryers operated at conditions near the SFC curves confirmed the applicability of the approach.
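A minimal sketch of the quasi-linear regression step on synthetic data: spECWE is regressed on frying settings and a fryer property by least squares. The predictors and coefficients are invented for illustration, not the paper's fitted equation.

```python
# Least-squares fit of spECWE on frying settings and a fryer property.
import numpy as np

rng = np.random.default_rng(2)
n = 40
oil_temp = rng.uniform(150, 180, n)        # degC
fry_time = rng.uniform(3, 8, n)            # min
ratio = rng.uniform(0.05, 0.20, n)         # product-to-oil ratio
heating_kw = rng.uniform(1.5, 2.5, n)      # fryer heating power
spECWE = (0.8 + 0.004 * oil_temp + 0.05 * fry_time
          - 1.2 * ratio + 0.1 * heating_kw + rng.normal(0, 0.02, n))

X = np.column_stack([np.ones(n), oil_temp, fry_time, ratio, heating_kw])
coef, *_ = np.linalg.lstsq(X, spECWE, rcond=None)
pred = X @ coef
r2 = 1 - ((spECWE - pred) ** 2).sum() / ((spECWE - spECWE.mean()) ** 2).sum()
print(np.round(coef, 4), f"R^2={r2:.3f}")
```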
Modelling the molecular mechanisms of aging
Mc Auley, Mark T.; Guimera, Alvaro Martinez; Hodgson, David; Mcdonald, Neil; Mooney, Kathleen M.; Morgan, Amy E.
2017-01-01
The aging process is driven at the cellular level by random molecular damage that slowly accumulates with age. Although cells possess mechanisms to repair or remove damage, they are not 100% efficient and their efficiency declines with age. There are many molecular mechanisms involved and exogenous factors such as stress also contribute to the aging process. The complexity of the aging process has stimulated the use of computational modelling in order to increase our understanding of the system, test hypotheses and make testable predictions. As many different mechanisms are involved, a wide range of models have been developed. This paper gives an overview of the types of models that have been developed, the range of tools used, modelling standards and discusses many specific examples of models that have been grouped according to the main mechanisms that they address. We conclude by discussing the opportunities and challenges for future modelling in this field. PMID:28096317
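A minimal sketch of the verbal model in the opening sentences: damage arrives at random, repair removes most but not all of it, and repair efficiency itself declines with age, so damage accumulates nonlinearly. All rates are invented for illustration.

```python
# Stochastic damage accumulation with age-declining repair efficiency.
import random

def simulate_cell(years=100, damage_rate=50.0, repair_eff0=0.99,
                  eff_decline=0.003):
    damage = 0.0
    for year in range(years):
        hits = random.gauss(damage_rate, damage_rate ** 0.5)  # random damage
        efficiency = max(0.0, repair_eff0 - eff_decline * year)
        damage += hits * (1.0 - efficiency)   # unrepaired fraction remains
        if year % 20 == 0:
            print(f"age {year:3d}: accumulated damage = {damage:7.1f}")
    return damage

random.seed(3)
simulate_cell()
```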
An Integrated High Resolution Hydrometeorological Modeling Testbed using LIS and WRF
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Eastman, Joseph L.; Tao, Wei-Kuo
2007-01-01
Scientists have made great strides in modeling the physical processes that underlie various weather and climate phenomena. Many modeling systems that represent the major earth system components (the atmosphere, land surface, and ocean) have been developed over the years. However, developing advanced Earth system applications that integrate these independently developed modeling systems has remained a daunting task due to limitations in computer hardware and software. Recently, efforts such as the Earth System Modeling Framework (ESMF) and Assistance for Land Modeling Activities (ALMA) have focused on developing standards, guidelines, and computational support for coupling earth system model components. In this article, the development of a coupled land-atmosphere hydrometeorological modeling system that adopts these community interoperability standards is described. The land component is represented by the Land Information System (LIS), developed by scientists at the NASA Goddard Space Flight Center. The Weather Research and Forecasting (WRF) model, a mesoscale numerical weather prediction system, is used as the atmospheric component. LIS includes several community land surface models that can be executed at spatial scales as fine as 1 km. The data management capabilities in LIS enable the direct use of high-resolution satellite and observation data for modeling. Similarly, WRF includes several parameterizations and schemes for modeling radiation, microphysics, the PBL and other processes. The integrated LIS-WRF system thus facilitates multi-model studies of land-atmosphere coupling that can be used to advance earth system studies.
Ar+ and CuBr laser-assisted chemical bleaching of teeth: estimation of whiteness degree
NASA Astrophysics Data System (ADS)
Dimitrov, S.; Todorovska, Roumyana; Gizbrecht, Alexander I.; Raychev, L.; Petrov, Lyubomir P.
2003-11-01
This work presents the results of adapting objective methods of color determination to develop techniques for estimating the degree of human tooth whiteness that are handy enough for common use in clinical practice. To test and illustrate the techniques, tooth color standards were used, as well as model and naturally discolored human teeth treated with two chemical bleaching compositions, each activated by three light sources: an Ar+ laser, a CuBr laser, and a standard halogen photopolymerization lamp. Typical reflection and fluorescence spectra of some samples are presented; the sample colors were estimated by standard computer processing in RGB and B coordinates. The results of the applied spectral and colorimetric techniques are in good agreement with those of standard computer processing of the corresponding digital photographs and comply with the visually estimated degree of tooth whiteness judged according to the standard reference scale commonly used in aesthetic dentistry.
A review and validation of the IMPLAN model for Pennsylvania's solid hardwood product industries
Bruce E. Lord; Charles H. Strauss
1993-01-01
The IMPLAN model for Pennsylvania was reviewed with respect to the industries processing the state's solid hardwood resources. Several sectors were found to be underrepresented in the standard sources of industrial activity. Further problems were attributed to the lack of distinction between hardwoods and softwoods in the national model. A further set of changes...
Multivariate regression model for predicting yields of grade lumber from yellow birch sawlogs
Andrew F. Howard; Daniel A. Yaussy
1986-01-01
A multivariate regression model was developed to predict green board-foot yields for the common grades of factory lumber processed from yellow birch factory-grade logs. The model incorporates the standard log measurements of scaling diameter, length, proportion of scalable defects, and the assigned USDA Forest Service log grade. Differences in yields between band and...
Comparative Analysis of InSAR Digital Surface Models for Test Area Bucharest
NASA Astrophysics Data System (ADS)
Dana, Iulia; Poncos, Valentin; Teleaga, Delia
2010-03-01
This paper presents the results of the interferometric processing of ERS Tandem, ENVISAT and TerraSAR-X data for digital surface model (DSM) generation. The selected test site is Bucharest (Romania), a built-up area characterized by the usual complex urban pattern: a mixture of buildings of different heights, paved roads, vegetation, and water bodies. First, the DSMs were generated following the standard interferometric processing chain. Then, the accuracy of the DSMs was analyzed against the SPOT HRS model (30 m resolution at the equator). A DSM derived by optical stereoscopic processing of SPOT 5 HRG data and the SRTM DSM (3 arc seconds resolution at the equator) were also included in the comparative analysis.
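Accuracy assessments of this kind typically reduce to per-pixel elevation differences against the reference surface. A minimal sketch, assuming co-registered DSM arrays on the same grid:

    import numpy as np

    def dsm_accuracy(dsm, reference):
        """Mean error, RMSE and standard deviation of DSM minus reference."""
        d = dsm - reference
        d = d[np.isfinite(d)]               # ignore no-data cells
        return d.mean(), np.sqrt(np.mean(d ** 2)), d.std()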
Santos-Moreno, Pedro; Galarza-Maldonado, Claudio; Caballero-Uribe, Carlo V.; Cardiel, Mario H.; Massardo, Loreto; Soriano, Enrique R.; Olano, José Aguilar; Díaz Coto, José F.; Durán Pozo, Gabriel R.; da Silveira, Inês Guimarães; de Castrejón, Vianna J. Khoury; Pérez, Leticia Lino; Méndez Justo, Carlos A.; Montufar Guardado, Rubén A.; Muños, Rafael; Elvir, Sergio Murillo; Paredes Domínguez, Ernesto R.; Pons-Estel, Bernardo; Ríos Acosta, Carlos R.; Sandino, Sayonara; Toro Gutiérrez, Carlos E.; Villegas de Morales, Sol María; Pineda, Carlos
2015-01-01
Objective A consensus meeting of representatives of 16 Latin American and Caribbean countries and the REAL-PANLAR group met in the city of Bogota to provide recommendations for improving quality of care of patients with rheumatoid arthritis (RA) in Latin America, defining minimum standards of care and the concept of center of excellence in RA. Methods Twenty-two rheumatologists from 16 Latin American countries with a special interest in quality of care in RA participated in the consensus meeting. Two Colombian RA patients and 2 health care excellence advisors were also invited to the meeting. A RAND-modified Delphi procedure of 5 steps was applied to define categories of centers of excellence. During a 1-day meeting, working groups were created in order to discuss and validate the minimum quality-of-care standards for the 3 proposed types of centers of excellence in RA. Positive votes from at least 60% of the attending leaders were required for the approval of each standard. Results Twenty-two opinion leaders from the PANLAR countries and the REAL-PANLAR group participated in the discussion and definition of the standards. One hundred percent of the participants agreed with setting up centers of excellence in RA throughout Latin America. Three types of centers of excellence and their criteria were defined, according to indicators of structure, processes, and outcomes: standard, optimal, and model. The standard level should have basic structure and process indicators, the intermediate or optimal level should accomplish more structure and process indicators, and the model level should also fulfill outcome indicators and patient experience. Conclusions This is the first Latin American effort to standardize and harmonize the treatment provided to RA patients and to establish centers of excellence that would offer to RA patients acceptable clinical results and high levels of safety. PMID:26010179
Aeromedical Disposition and Waiver Consideration for ISS Crewmembers
NASA Technical Reports Server (NTRS)
Taddeo, Terrance
2012-01-01
Aeromedical certification of astronauts and cosmonauts traveling to the International Space Station is a multi-tiered process that involves standards agreed to by the partner agencies, and participation by the individual agency aeromedical boards and a multilateral space medicine board. Medical standards are updated continually by a multilateral working group. The boards operate by consensus and strive to achieve effective decision making through experience, medical judgment, medical evidence and risk modeling. The aim of the certification process is to minimize the risk to the ISS program of loss of mission objectives due to human health issues.
Ghafoor, Virginia L; Silus, Lauren S
2011-03-15
The development of a policy, evidence-based standard orders, and monitoring for palliative sedation therapy (PST) is described. Concerns regarding PST at the University of Minnesota Medical Center (UMMC) arose and needed to be addressed in a formal process. A multidisciplinary group consisting of palliative care physicians, nurse practitioners, clinical nurse specialists, and clinical pharmacy specialists reached consensus on the practice model and medications to be used for PST. Major elements of the plan included the development and implementation of an institutional policy for palliative sedation; standard orders for patient care, sedation, and monitoring; education for staff, patients, and patients' family members; and quality-assurance monitoring. A literature review was performed to identify research and guidelines defining the practice of PST. Policy content includes the use of a standard order set linking patient care, medication administration, the monitoring of sedation, and symptom management. Approval of the policy involved several UMMC committees. An evaluation matrix was used to determine critical areas for PST monitoring and to guide development of a form to monitor quality. A retrospective chart audit using the quality-assurance monitoring form assessed baseline sedation medication and patient outcomes. Assessment of compliance began in the fall of 2008, after the policy and standard orders were approved by the UMMC medical executive committee. In 2008, two cases of PST were monitored using the standardized form. PST cases will be continually monitored and analyzed. Development of policy, standard orders, and quality-assurance monitoring for PST required a formal multidisciplinary process. A formal process-improvement effort is critical to defining institutional policy, educational goals, and outcome metrics for PST.
[Quality process control system of Chinese medicine preparation based on "holistic view"].
Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming
2018-01-01
"High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on the design and development, process control and standardized management. The quality depends on the process control level. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the development model of international drug quality control and the misunderstanding of quality control of TCM preparations, the reasons for impacting the homogeneity of TCM preparations are analyzed and summarized. According to TCM characteristics, efforts were made to control the diversity of TCM, make "unstable" TCM into "stable" Chinese patent medicines, put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create the "holistic, modular, data, standardized" model as the core of TCM preparation quality process control model. Scientific studies shall conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrade, thoroughly applying the scientific research achievements in Chinese patent medicines, and promoting the cluster application and transformation application of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize the green development. Copyright© by the Chinese Pharmaceutical Association.
Bashir, Mohammed J K; Mau Han, Tham; Jun Wei, Lim; Choon Aun, Ng; Abu Amr, Salem S
2016-01-01
As the ponding system used to treat palm oil mill effluent (POME) frequently fails to satisfy the discharge standard in Malaysia, the present study aimed to resolve this problem using an optimized electrocoagulation process. A central composite design (CCD) module in response surface methodology was employed to optimize the interactions of the process variables, namely current density, contact time and initial pH, targeting maximum removal of chemical oxygen demand (COD), colour and turbidity with a satisfactory pH of the discharged POME. The batch study was initially designed by CCD, and statistical models of the responses were subsequently derived to indicate the significant terms of the interacting process variables. All models were verified by analysis of variance, showing model significance with Prob > F values below 0.01. The optimum performance was obtained at a current density of 56 mA/cm2, a contact time of 65 min and an initial pH of 4.5, rendering complete removal of colour and turbidity with COD removal of 75.4%. The pH of the post-treated POME was 7.6, which is suitable for direct discharge. These predicted outputs were subsequently confirmed by small deviations between predicted and actual values. The optimum condition also permitted the simultaneous removal of NH3-N and various metal ions, signifying the superiority of the electrocoagulation process optimized by CCD.
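The CCD/response-surface step amounts to fitting a second-order polynomial in the three process variables (current density, contact time, initial pH) and locating its optimum. A minimal sketch of such a fit, with illustrative array shapes rather than the study's data:

    import numpy as np

    def quadratic_design_matrix(X):
        """Second-order RSM terms for 3 factors: intercept, linear terms,
        squared terms and pairwise interactions (X has shape (runs, 3))."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2,
                                x1*x2, x1*x3, x2*x3])

    def fit_response_surface(X, y):
        """Least-squares coefficients of the quadratic response surface."""
        return np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)[0]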
Leonelli, Sabina; Ankeny, Rachel A.; Nelson, Nicole C.; Ramsden, Edmund
2014-01-01
Argument: We examine the criteria used to validate the use of nonhuman organisms in North American alcohol addiction research from the 1950s to the present day. We argue that this field, where the similarities between behaviors in humans and non-humans are particularly difficult to assess, has addressed questions of model validity by transforming the situatedness of non-human organisms into an experimental tool. We demonstrate that model validity does not hinge on the standardization of one type of organism in isolation, as is often the case with genetic model organisms. Rather, organisms are viewed as necessarily situated: they cannot be understood as a model for human behavior in isolation from their environmental conditions. Hence the environment itself is standardized as part of the modeling process, and model validity is assessed with reference to the environmental conditions under which organisms are studied. PMID:25233743
Simulation of springback and microstructural analysis of dual phase steels
NASA Astrophysics Data System (ADS)
Kalyan, T. Sri.; Wei, Xing; Mendiguren, Joseba; Rolfe, Bernard
2013-12-01
With increasing demand for weight reduction and better crashworthiness in car development, advanced high strength Dual Phase (DP) steels have been progressively used in automotive parts. Higher strength steels exhibit greater springback and lower dimensional accuracy after stamping. This has necessitated simulating each stamped component prior to production to estimate the part's dimensional accuracy. Understanding the micro-mechanical behaviour of AHSS sheet may bring more accuracy to stamping simulations. This work divides into two parts: first, modelling a standard channel forming process; second, modelling the microstructure of the process. The standard top-hat channel forming process, benchmark NUMISHEET'93, is used for investigating the springback of WISCO Dual Phase steels. The second part includes finite element analysis of microstructures to understand the behaviour of the multi-phase steel at a more fundamental level. The outcomes of this work will help in the dimensional control of steels during the manufacturing stage based on the material's microstructure.
Improving atomic displacement and replacement calculations with physically realistic damage models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordlund, Kai; Zinkle, Steven J.; Sand, Andrea E.
Atomic collision processes are fundamental to numerous advanced materials technologies such as electron microscopy, semiconductor processing and nuclear power generation. Extensive experimental and computer simulation studies over the past several decades provide the physical basis for understanding the atomic-scale processes occurring during primary displacement events. The current international standard for quantifying this energetic particle damage, the Norgett-Robinson-Torrens displacements per atom (NRT-dpa) model, has several well-known limitations. In particular, the number of radiation defects produced in energetic cascades in metals is only ~1/3 the NRT-dpa prediction, while the number of atoms involved in atomic mixing is about a factor of 30 larger than the dpa value. Here we propose two new complementary functions, a displacement production estimator (athermal recombination corrected dpa, arc-dpa) and an atomic mixing estimator (replacements per atom, rpa), that extend the NRT-dpa by providing more physically realistic descriptions of primary defect creation in materials and may become additional standard measures for radiation damage quantification.
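For reference, a sketch of the two-regime NRT form and the arc-dpa correction it motivates. The functional form follows the arc-dpa proposal described above; the default b and c values below are only illustrative, of the order reported for iron:

    def nrt_dpa(T_dam, E_d):
        """NRT displacements for damage energy T_dam and threshold E_d (both eV)."""
        if T_dam < E_d:
            return 0.0
        if T_dam < 2.0 * E_d / 0.8:
            return 1.0
        return 0.8 * T_dam / (2.0 * E_d)

    def arc_dpa(T_dam, E_d, b=-0.57, c=0.3):
        """NRT count scaled by a recombination efficiency xi(T_dam) in the cascade regime."""
        if T_dam < 2.0 * E_d / 0.8:
            return nrt_dpa(T_dam, E_d)
        xi = (1.0 - c) / (2.0 * E_d / 0.8) ** b * T_dam ** b + c
        return nrt_dpa(T_dam, E_d) * xi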
Gravitational leptogenesis, reheating, and models of neutrino mass
NASA Astrophysics Data System (ADS)
Adshead, Peter; Long, Andrew J.; Sfakianakis, Evangelos I.
2018-02-01
Gravitational leptogenesis refers to a class of baryogenesis models in which the matter-antimatter asymmetry of the Universe arises through the standard model lepton-number gravitational anomaly. In these models chiral gravitational waves source a lepton asymmetry in standard model neutrinos during the inflationary epoch. We point out that gravitational leptogenesis can be successful in either the Dirac or Majorana neutrino mass scenario. In the Dirac mass scenario, gravitational leptogenesis predicts a relic abundance of sterile neutrinos that remain out of equilibrium, and the lepton asymmetry carried by the standard model sector is unchanged. In the Majorana mass scenario, the neutrinos participate in lepton-number-violating interactions that threaten to wash out the lepton asymmetry during postinflationary reheating. However, we show that a complete (exponential) washout of the lepton asymmetry is prevented if the lepton-number-violating interactions go out of equilibrium before all of the standard model Yukawa interactions come into equilibrium. The baryon and lepton asymmetries carried by right-chiral quarks and leptons are sequestered from the lepton-number violation, and the washout processes only suppress the predicted baryon asymmetry by a factor of ε_w.o. = ±O(0.1). The sign of ε_w.o. depends on the model parameters in such a way that a future measurement of the primordial gravitational wave chirality would constrain the scale of lepton-number violation (the heavy Majorana neutrino mass).
Information risk and security modeling
NASA Astrophysics Data System (ADS)
Zivic, Predrag
2005-03-01
This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, the Centre for Internet Security guidelines, the NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the presentation will explore the appropriate usage of these standards. The paper will discuss the standards' approaches to conducting risk and security metrics. The research findings demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed 3D visual presentation approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measurement.
Defense Logistics Standard Systems Functional Requirements.
1987-03-01
Artificial Intelligence - the development of a machine capability to perform functions normally associated with human intelligence, such as learning and adapting. ... Basic Data Base Machine Configurations ... PART I: MODELS - DEFENSE LOGISTICS STANDARD SYSTEMS FUNCTIONAL REQUIREMENTS ... On-line, Interactive Access - integrating user input and machine output in a dynamic, real-time, give-and-take process is considered the optimum mode.
Calibration and Propagation of Uncertainty for Independence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham
This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process modeling framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented within the CCSI standard basic-data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.
NASA Astrophysics Data System (ADS)
Zbiciak, M.; Grabowik, C.; Janik, W.
2015-11-01
Nowadays the design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities are routine in nature, and these routine design tasks are highly susceptible to automation. Design automation is usually implemented with API tools which allow building original software for aiding different engineering activities. In this paper, original software developed to automate engineering tasks at the stage of designing a product's geometrical shape is presented. The software works exclusively in the Siemens NX CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software allows designing and modelling spur and helical involute gears, and it can also estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This stems from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 points located on and between the base and addendum circles, so the 3D gear wheel models are highly accurate. The Generator module also makes the modelling process very rapid: gear wheel modelling time is reduced to several seconds. During the research, the differences between the standard 3-point and the 11-point involutes were analyzed; the results and conclusions drawn from the analysis are presented in detail.
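The involute sampling itself is a short parametric computation. A language-neutral sketch (Python here, rather than the paper's .NET/SNAP code) of generating n points from the base circle out to a given outer radius:

    import math

    def involute_points(r_base, r_max, n=11):
        """n (x, y) points on an involute of the base circle, from the base
        circle (roll angle 0) out to radius r_max."""
        t_max = math.sqrt((r_max / r_base) ** 2 - 1.0)  # roll angle at r_max
        pts = []
        for i in range(n):
            t = t_max * i / (n - 1)
            x = r_base * (math.cos(t) + t * math.sin(t))
            y = r_base * (math.sin(t) - t * math.cos(t))
            pts.append((x, y))
        return pts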
Exploring the Processes of Generating LOD (0-2) Citygml Models in Greater Municipality of Istanbul
NASA Astrophysics Data System (ADS)
Buyuksalih, I.; Isikdag, U.; Zlatanova, S.
2013-08-01
3D models of cities, visualised and explored in 3D virtual environments, have been available for several years. A large number of impressive, realistic 3D models are now regularly presented at scientific, professional and commercial events. One of the most promising developments is the OGC standard CityGML. CityGML is an object-oriented model that supports 3D geometry and thematic semantics, attributes and relationships, and offers advanced options for realistic visualization. One of its most attractive characteristics is the support of 5 levels of detail (LOD), starting from a coarse 2.5D model (LOD0) and ending with a very detailed indoor model (LOD4). Different local government offices and municipalities have different needs when utilizing CityGML models, and the process of model generation depends on local and domain-specific needs. Although the processes (i.e. the tasks and activities) for generating the models differ depending on the utilization purpose, there are also common tasks (i.e. common-denominator processes) in the generation of CityGML models. This paper focuses on defining the common tasks in the generation of LOD (0-2) CityGML models and representing them formally with process modeling diagrams.
Performance of Transit Model Fitting in Processing Four Years of Kepler Science Data
NASA Astrophysics Data System (ADS)
Li, Jie; Burke, Christopher J.; Jenkins, Jon Michael; Quintana, Elisa V.; Rowe, Jason; Seader, Shawn; Tenenbaum, Peter; Twicken, Joseph D.
2014-06-01
We present the transit model fitting performance of the Kepler Science Operations Center (SOC) Pipeline in processing four years of science data, collected by the Kepler spacecraft from May 13, 2009 to May 12, 2013. Threshold Crossing Events (TCEs), which represent transiting planet detections, are generated by the Transiting Planet Search (TPS) component of the pipeline and subsequently processed in the Data Validation (DV) component. The transit model is used in DV to fit TCEs and derive parameters that are used in various diagnostic tests to validate planetary candidates. The standard transit model includes five fit parameters: transit epoch time (i.e. central time of first transit), orbital period, impact parameter, ratio of planet radius to star radius, and ratio of semi-major axis to star radius. In the latest Kepler SOC pipeline codebase, the light curve of the target for which a TCE is generated is initially fitted by a trapezoidal model with four parameters: transit epoch time, depth, duration and ingress time. The trapezoidal model fit, implemented with repeated Levenberg-Marquardt minimization, provides a quick and high-fidelity assessment of the transit signal. The fit parameters of the trapezoidal model with the minimum chi-square metric are used to set initial values of the fit parameters of the standard transit model. Additional parameters, such as the equilibrium temperature and effective stellar flux of the planet candidate, are derived from the fit parameters of the standard transit model to characterize pipeline candidates in the search for Earth-size planets in the Habitable Zone. The uncertainties of all derived parameters are updated in the latest codebase to account for the propagated errors of the fit parameters as well as the uncertainties in stellar parameters. The results of the transit model fitting of the TCEs identified by the Kepler SOC Pipeline, including fitted and derived parameters, fit goodness metrics and diagnostic figures, are included in the DV report and one-page report summary, which are accessible to the science community at the NASA Exoplanet Archive. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
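A sketch of the four-parameter trapezoidal model and a Levenberg-Marquardt-style fit is below; the parametrization is assumed from the description above, and scipy stands in for the pipeline's own implementation:

    import numpy as np
    from scipy.optimize import least_squares

    def trapezoid(t, t0, depth, duration, tau):
        """Unit-baseline light curve: flat bottom of the given depth,
        linear ingress/egress of length tau, total duration centered on t0."""
        dt = np.abs(t - t0)
        half = duration / 2.0
        flux = np.ones_like(t, dtype=float)
        flux[dt <= half - tau] -= depth
        ramp = (dt > half - tau) & (dt < half)
        flux[ramp] -= depth * (half - dt[ramp]) / tau
        return flux

    def fit_trapezoid(t, flux_obs, p0):
        """Least-squares fit of (t0, depth, duration, tau) from initial guess p0."""
        return least_squares(lambda p: trapezoid(t, *p) - flux_obs, p0, method="lm")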
NASA Astrophysics Data System (ADS)
Holzmann, Hubert; Massmann, Carolina
2015-04-01
Many types of hydrological models have been developed during the past decades. Most of them use a fixed design to describe the variable hydrological processes, assumed to be representative over the whole range of spatial and temporal scales. This assumption is questionable, as it is evident that the runoff formation process is driven by dominant processes which can vary among basins. Furthermore, model application and the interpretation of results are limited by the data available to identify the particular sub-processes, since most models are calibrated and validated only against discharge data. It can therefore be hypothesized that simpler model designs, focusing only on the dominant processes, can achieve comparable results with the benefit of fewer parameters. In this contribution a modular model concept is introduced, which allows hydrological sub-processes to be included or neglected depending on catchment characteristics and data availability; a sketch of the idea follows below. Key elements of the process modules refer to (1) storage effects (interception, soil), (2) transfer processes (routing), (3) threshold processes (percolation, saturation overland flow) and (4) split processes (rainfall excess). Based on hydro-meteorological observations in an experimental catchment in the Slovak Carpathians, a comparison of several model realizations with different degrees of complexity is discussed. A special focus is given to model parameter sensitivity, estimated with a Markov chain Monte Carlo approach; the identification of dominant processes by means of Sobol's method is also introduced. It could be shown that a flexible model design, even a simple one, can reach performance comparable to that of the standard model type (HBV-type). The main benefit of the modular concept is the individual adaptation of the model structure with respect to data and process availability, and the option of parsimonious model design.
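To make the modular idea concrete, here is a sketch of two interchangeable process modules; the names and interfaces are hypothetical, not taken from the paper:

    def linear_reservoir(storage, inflow, k):
        """Storage/transfer module: outflow proportional to storage (time constant k)."""
        outflow = storage / k
        return storage + inflow - outflow, outflow

    def saturation_overflow(storage, capacity):
        """Threshold module: water above capacity leaves as saturation overland flow."""
        excess = max(0.0, storage - capacity)
        return storage - excess, excess

    # A model variant is then just the subset of such modules wired together
    # per time step; dropping a module removes the corresponding sub-process.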
Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon
2017-12-01
Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on a Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to standard Bayesian inference, which suffers from a serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real studies demonstrate that our method produces results similar to those obtainable by standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
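The basis-representation step reduces each noisy curve to a low-dimensional coefficient vector on which posterior inference can run. A minimal sketch of that reduction, assuming a cubic B-spline basis and using a simple least-squares projection in place of the full hierarchical model:

    import numpy as np
    from scipy.interpolate import BSpline

    def basis_matrix(grid, knots, degree=3):
        """Evaluate each B-spline basis function at the observation grid."""
        n_basis = len(knots) - degree - 1
        B = np.empty((len(grid), n_basis))
        for j in range(n_basis):
            coef = np.zeros(n_basis)
            coef[j] = 1.0
            B[:, j] = BSpline(knots, coef, degree)(grid)
        return B

    def basis_coefficients(y, B):
        """Least-squares coefficients: the induced low-dimensional data."""
        return np.linalg.lstsq(B, y, rcond=None)[0]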
Elsner, Peter; Seyfarth, Florian; Sonsmann, Flora; Strunk, Meike; John, Swen-Malte; Diepgen, Thomas; Schliemann, Sibylle
2013-10-01
In order to assess the cleaning efficacy of occupational skin cleansers, standardized test dirts mimicking the spectrum of skin soiling at dirty workplaces are necessary. The aim was to validate newly developed standardized test dirts (compliant with the EU Cosmetics Directive) for their occupational relevance. In this single-blinded, monocentric, questionnaire-based clinical trial, 87 apprentices of three trades (household management; house painting and varnishing; and metal processing) evaluated the cleanability of six standardized test dirts in relation to their workplace dirts. In addition, they judged the similarity of the test dirts to actual dirts encountered in their working environments. Most of the household management participants assessed the hydrophilic model dirt ('mascara'), the lipophilic model dirt ('W/O cream') and a film-forming model dirt ('disperse paint') as best resembling the dirts found at their workplaces. Most of the painters and varnishers judged the filmogenic model dirts ('disperse paint' and 'acrylic paint') as best resembling the dirts found at their workplaces. For the metal workers, the lipophilic and paste-like model dirts were most similar to their workplace dirts. The spectrum of standardized test dirts developed represents the dirts encountered at various workplaces well. The test dirts may be useful in the development and in vivo efficacy testing of occupational skin cleansers. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Search for charged lepton flavor violation of vector mesons in the BLMSSM model
NASA Astrophysics Data System (ADS)
Dong, Xing-Xing; Zhao, Shu-Min; Feng, Jing-Jing; Ning, Guo-Zhu; Chen, Jian-Bin; Zhang, Hai-Bin; Feng, Tai-Fu
2018-03-01
We analyze the charged lepton flavor violating (CLFV) decays of vector mesons V → l_i^± l_j^∓ with V ∈ {φ, J/Ψ, ϒ, ρ^0, ω} in the BLMSSM model. This new model is introduced as a supersymmetric extension of the Standard Model (SM) in which local gauged baryon number B and lepton number L are considered. The numerical results indicate that the BLMSSM model can produce significant contributions to such two-body CLFV decays, and the branching ratios of these CLFV processes can easily reach the present experimental upper bounds. Therefore, searching for CLFV processes of vector mesons may be an effective channel for studying new physics.
Arnold, Suzanne V.; Masoudi, Frederick A.; Rumsfeld, John S.; Li, Yan; Jones, Philip G.; Spertus, John A.
2014-01-01
Background: Before outcomes-based measures of quality can be used to compare and improve care, they must be risk-standardized to account for variations in patient characteristics. Despite the importance of health-related quality of life (HRQL) outcomes among patients with acute myocardial infarction (AMI), no risk-standardized models have been developed. Methods and Results: We assessed disease-specific HRQL using the Seattle Angina Questionnaire at baseline and 1 year later in 2693 unselected AMI patients from 24 hospitals enrolled in the TRIUMPH registry. Using 57 candidate sociodemographic, economic, and clinical variables present on admission, we developed a parsimonious, hierarchical linear regression model to predict HRQL. Eleven variables were independently associated with poor HRQL after AMI, including younger age, prior CABG, depressive symptoms, and financial difficulties (R2=20%). The model demonstrated excellent internal calibration and reasonable calibration in an independent sample of 1890 AMI patients in a separate registry, although it slightly over-predicted HRQL scores in the higher deciles. Among the 24 TRIUMPH hospitals, 1-year unadjusted HRQL scores ranged from 67 to 89. After risk-standardization, the variability of HRQL scores narrowed substantially (range 79–83), and the hospital performance group (bottom 20%/middle 60%/top 20%) changed for 14 of the 24 hospitals (58% reclassification with risk-standardization). Conclusions: In this predictive model for HRQL after AMI, we identified risk factors, including economic and psychological characteristics, associated with HRQL outcomes. Adjusting for these factors substantially altered the rankings of hospitals compared with unadjusted comparisons. Using this model to compare risk-standardized HRQL outcomes across hospitals may identify processes of care that maximize this important patient-centered outcome. PMID:24163068
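Risk standardization of this kind is commonly implemented as a hierarchical (mixed-effects) regression with a hospital random effect. A schematic sketch with statsmodels; the column names are hypothetical placeholders for a few of the 11 predictors reported:

    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_risk_model(df: pd.DataFrame):
        """Hierarchical linear model: 1-year HRQL on patient covariates,
        with a random intercept per hospital."""
        model = smf.mixedlm(
            "hrql_1yr ~ age + prior_cabg + depressive_sx + financial_strain",
            data=df, groups=df["hospital"])
        return model.fit()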
A standard protocol for describing individual-based and agent-based models
Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.
2006-01-01
Simulation models that describe autonomous individual organisms (individual-based models, IBMs) or agents (agent-based models, ABMs) have become a widely used tool, not only in ecology but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. The protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD a first step towards establishing a more detailed common format for the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
The ISO Edi Conceptual Model Activity and Its Relationship to OSI.
ERIC Educational Resources Information Center
Fincher, Judith A.
1990-01-01
The edi conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to edi because of its potential to help enable global interoperability across Electronic Data Interchange (EDI) functional groups. A…
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvements for the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. The ZOM plane prediction model was improved for installation into the new standardized DER computer program. An approach to developing an intra-element mixing model was recommended for gas/liquid coaxial injection elements, for possible future incorporation into the CICM computer program.
Fitting the Mixed Rasch Model to a Reading Comprehension Test: Identifying Reader Types
ERIC Educational Resources Information Center
Baghaei, Purya; Carstensen, Claus H.
2013-01-01
Standard unidimensional Rasch models assume that persons with the same ability parameters are comparable. That is, the same interpretation applies to persons with identical ability estimates as regards the underlying mental processes triggered by the test. However, research in cognitive psychology shows that persons at the same trait level may…
USDA-ARS?s Scientific Manuscript database
Effects of hydraulic redistribution (HR) on hydrological, biogeochemical, and ecological processes have been demonstrated in the field, but the current generation of standard earth system models does not include a representation of HR. Though recent studies have examined the effect of incorporating ...
Finding One's Voice: The Pacesetter Model for More Equitable Assessment.
ERIC Educational Resources Information Center
Badger, Elizabeth
1996-01-01
Describes the College Board's Pacesetter Program, high school courses developed using principles of ongoing performance testing and portfolios, standards, and curriculum. The model is illustrated in a description of the Voices of Modern Culture language arts course. Argues that this assessment process has systemic validity and is more relevant to…
Neutral model analysis of landscape patterns from mathematical morphology
Kurt H. Riitters; Peter Vogt; Pierre Soille; Jacek Kozak; Christine Estreguil
2007-01-01
Mathematical morphology encompasses methods for characterizing land-cover patterns in ecological research and biodiversity assessments. This paper reports a neutral model analysis of patterns in the absence of a structuring ecological process, to help set standards for comparing and interpreting patterns identified by mathematical morphology on real land-cover maps. We...
The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis
NASA Astrophysics Data System (ADS)
Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.
2017-12-01
The vast majority of data analyzed by climate researchers are repeated observations of physical processes, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine the trends and variability (e.g., seasonality) of repeated observations, and the same techniques and models can often be applied to a wide variety of different time series. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to the hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
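The pre-processing mentioned (regularizing, aggregating and gap-filling a series before trend and seasonality analysis) is a few lines in standard tooling. A sketch with pandas, assuming a datetime-indexed series:

    import pandas as pd

    def preprocess(series: pd.Series, freq: str = "D", max_gap: int = 3) -> pd.Series:
        """Aggregate an irregular series to a fixed frequency and linearly
        interpolate gaps of up to max_gap steps."""
        return series.resample(freq).mean().interpolate(limit=max_gap)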
An Integrated Product Environment
NASA Technical Reports Server (NTRS)
Higgins, Chuck
1997-01-01
Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. In a traditional engineering environment, an engineer starts with two things: performance goals and design rules. The intent is to have a product perform specific functions within a designated environment. Geometry should be a simple byproduct of that engineering process, not the controller of it. Mechanical Advantage is a performance modeler that lets engineers weigh all of these criteria in making decisions, by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. Should an industry-standard solid model be desired, an ACIS-based solid model can be produced; an ANSI/ISO standard drawing can likewise be produced with a virtual push of the button. For more information on this and other Advantage Series products, please contact the author.
Liepe, Juliane; Holzhütter, Hermann-Georg; Bellavista, Elena; Kloetzel, Peter M; Stumpf, Michael PH; Mishto, Michele
2015-01-01
Proteasomal protein degradation is a key determinant of protein half-life and hence of cellular processes ranging from basic metabolism to a host of immunological processes. Despite its importance, the mechanisms regulating proteasome activity are only incompletely understood. Here we use an iterative and tightly integrated experimental and modelling approach to develop, explore and validate mechanistic models of proteasomal peptide-hydrolysis dynamics. The 20S proteasome is a dynamic enzyme whose activity varies over time because of interactions between substrates and products and the proteolytic and regulatory sites; the locations of these sites and the interactions between them are predicted by the model and supported experimentally. The analysis suggests that the rate-limiting step of hydrolysis is the transport of substrates into the proteasome. Transport efficiency varies between human standard- and immuno-proteasomes, thereby impinging upon the total degradation rate and substrate cleavage-site usage. DOI: http://dx.doi.org/10.7554/eLife.07545.001 PMID:26393687
Ares Upper Stage Processes to Implement Model Based Design - Going Paperless
NASA Technical Reports Server (NTRS)
Gregory, Melanie
2012-01-01
Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only, to paper drawings based on CAD models, to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. Without a full understanding of the implications of MBD and proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information is controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparing standardized parts and processes, as well as maintaining strong communication between all parties, in order to maximize the benefits of MBD.
Cotton, Stephen J.; Miller, William H.
2016-10-14
Previous work has shown how a symmetrical quasi-classical (SQC) windowing procedure can be used to quantize the initial and final electronic degrees of freedom in the Meyer-Miller (MM) classical vibronic (i.e., nuclear + electronic) Hamiltonian, and that the approach provides a very good description of electronically non-adiabatic processes within a standard classical molecular dynamics framework for a number of benchmark problems. This study explores application of the SQC/MM approach to the case of very weak non-adiabatic coupling between the electronic states, showing (as anticipated) how the standard SQC/MM approach used to date fails in this limit, and then devises a new SQC windowing scheme to deal with it. Finally, application of this new SQC model to a variety of realistic benchmark systems shows that the new model not only treats the weak coupling case extremely well, but also describes the "normal" regime (of electronic transition probabilities ≳ 0.1) even more accurately than the previous "standard" model.
Incorporating Experience Curves in Appliance Standards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery
2011-10-31
The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when experience is incorporated. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
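The single-factor experience curve referred to here is conventionally written P(X) = P0 (X/X0)^(-b), where X is cumulative production and the learning rate is the fractional price drop per doubling. A small sketch of this standard form, with illustrative numbers:

    import math

    def experience_price(p0, x, x0, learning_rate):
        """Price after cumulative production x, given price p0 at x0 and a
        fractional price drop per doubling of cumulative production."""
        b = -math.log2(1.0 - learning_rate)  # experience exponent
        return p0 * (x / x0) ** (-b)

    # One doubling at a 15% learning rate: 500 -> 425 (illustrative values).
    # experience_price(500.0, 2.0e6, 1.0e6, 0.15)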
Kırbıyık, Çisem; Pütün, Ayşe Eren; Pütün, Ersan
2016-01-01
In this study, Fe(III) and Cr(III) metal ion adsorption was carried out with three adsorbents in batch experiments and their adsorption performance was compared. The adsorbents were sesame stalk without pretreatment, bio-char derived from thermal decomposition of the biomass, and activated carbon obtained by chemical activation of the biomass. Scanning electron microscopy and Fourier transform infrared techniques were used to characterize the adsorbents. The optimum conditions for the adsorption process were obtained by observing the influence of solution pH, adsorbent dosage, initial solution concentration, contact time and temperature. The optimum adsorption efficiencies were determined at pH 2.8 and pH 4.0 for the Fe(III) and Cr(III) solutions, respectively. The experimental data were modelled with different isotherm models, and the equilibria were well described by the Langmuir adsorption isotherm. The pseudo-first-order and pseudo-second-order kinetic, intra-particle diffusion and Elovich models were applied to analyze the kinetic data and evaluate rate constants; the pseudo-second-order kinetic model gave a better fit than the others. The thermodynamic parameters, the Gibbs free energy change ΔG°, standard enthalpy change ΔH° and standard entropy change ΔS°, were evaluated. The thermodynamic study showed that the adsorption was a spontaneous endothermic process.
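For reference, the two best-fitting models here have simple closed forms; these are the standard textbook expressions, not parameter values from this study:

    def langmuir(Ce, qmax, K):
        """Langmuir isotherm: equilibrium uptake q_e at equilibrium concentration C_e."""
        return qmax * K * Ce / (1.0 + K * Ce)

    def pseudo_second_order(t, qe, k2):
        """Pseudo-second-order kinetics, integrated form: uptake q_t at time t."""
        return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)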
A Hilbert Space Representation of Generalized Observables and Measurement Processes in the ESR Model
NASA Astrophysics Data System (ADS)
Sozzo, Sandro; Garola, Claudio
2010-12-01
The extended semantic realism (ESR) model recently worked out by one of the authors embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here a Hilbert space representation of the generalized observables introduced by the ESR model that satisfy a simple physical condition, propose a generalization of the projection postulate, and suggest a possible mathematical description of the measurement process in terms of the evolution of the compound system made up of the measured system and the measuring apparatus.
Hydrological models as web services: Experiences from the Environmental Virtual Observatory project
NASA Astrophysics Data System (ADS)
Buytaert, W.; Vitolo, C.; Reaney, S. M.; Beven, K.
2012-12-01
Data availability in environmental sciences is expanding at a rapid pace. From the constant stream of high-resolution satellite images to the local efforts of citizen scientists, there is an increasing need to process the growing stream of heterogeneous data and turn it into useful information for decision-making. Environmental models, ranging from simple rainfall-runoff relations to complex climate models, can be very useful tools to process data, identify patterns, and help predict the potential impact of management scenarios. Recent technological innovations in networking, computing and standardization may bring a new generation of interactive models plugged into virtual environments closer to the end-user. They are the driver of major funding initiatives such as the UK's Virtual Observatory program and the U.S. National Science Foundation's Earth Cube. In this study we explore how hydrological models, an important subset of environmental models, have to be adapted in order to function within a broader environment of web services and user interactions. Historically, hydrological models have been developed for very different purposes. Typically they have a rigid model structure, requiring a very specific set of input data and parameters. As such, the process of implementing a model for a specific catchment requires careful collection and preparation of the input data, extensive calibration and subsequent validation. This procedure seems incompatible with a web environment, where data availability is highly variable, heterogeneous and constantly changing in time, and where the requirements of end-users may not necessarily align with the original intention of the model developer. We present prototypes of models that are web-enabled using the web standards of the Open Geospatial Consortium and implemented in online decision-support systems. We identify issues related to (1) optimal use of available data; (2) the need for flexible and adaptive model structures; (3) quantification and communication of uncertainties. Lastly, we present road maps to address these issues and discuss them in the broader context of web-based data processing and "big data" science.
The Effect of Improved Sub-Daily Earth Rotation Models on Global GPS Data Processing
NASA Astrophysics Data System (ADS)
Yoon, S.; Choi, K. K.
2017-12-01
Throughout the various International GNSS Service (IGS) products, strong periodic signals have been observed around the 14-day period. This signal is clearly visible in all IGS time series, such as those related to orbit ephemerides, Earth rotation parameters (ERPs) and ground station coordinates. Recent studies show that errors in the sub-daily Earth rotation models are the main factor inducing such noise. The current IGS orbit processing standards adopted the IERS 2010 Conventions and their sub-daily Earth rotation model. Since the IERS Conventions were published, advances in VLBI analysis have contributed updates to the sub-daily Earth rotation models. We have compared several proposed sub-daily Earth rotation models and show the effect of using those models on the orbit ephemerides, Earth rotation parameters and ground station coordinates generated by the NGS global GPS data processing strategy.
Insulation Cork Boards-Environmental Life Cycle Assessment of an Organic Construction Material.
Silvestre, José D; Pargana, Nuno; de Brito, Jorge; Pinheiro, Manuel D; Durão, Vera
2016-05-20
Envelope insulation is a relevant technical solution to cut energy consumption and reduce environmental impacts in buildings. Insulation Cork Boards (ICB) are a natural thermal insulation material whose production promotes the recycling of agricultural waste. The aim of this paper is to determine and evaluate the environmental impacts of the production, use, and end-of-life processing of ICB. A "cradle-to-cradle" environmental Life Cycle Assessment (LCA) was performed according to International LCA standards and the European standards on the environmental evaluation of buildings. These results were based on site-specific data and resulted from a consistent methodology, fully described in the paper for each life cycle stage: for cork oak tree growth, ICB production, and end-of-life processing, the modeling of the carbon flows (i.e., uptakes and emissions), including a sensitivity analysis of this procedure; at the production stage, the modeling of energy processes and a sensitivity analysis of the allocation procedures; during building operation, the expected service life of ICB; and an analysis concerning the need to consider the thermal diffusivity of ICB in the comparison of the performance of insulation materials. This paper presents the up-to-date "cradle-to-cradle" environmental performance of ICB for the environmental categories and life-cycle stages defined in European standards.
NASA Astrophysics Data System (ADS)
Rohmanu, Ajar; Everhard, Yan
2017-04-01
Technological development, especially in the field of electronics, is very fast. One development in electronics hardware is the Flexible Flat Cable (FFC), which serves as a connection medium between the main board and other hardware parts. Production of Flexible Flat Cable (FFC) includes testing and measurement of FFC quality. Currently, this testing and measurement is still done manually by an operator observing a Light Emitting Diode (LED), which causes many problems. In this study, a computational FFC quality test is built on an open-source embedded system. The method used is a Short Open Test based on Ohm's law with 4-wire (Kelvin) measurement, with fuzzy logic as the decision maker for the measurement results, implemented on an open-source Arduino data logger. The system uses an INA219 current sensor to read the voltage value, from which the resistance of the FFC is obtained. To validate the system, black-box testing was performed, together with accuracy and precision tests using the standard deviation method. In tests with three sample models, the standard deviations obtained were 1.921 for the first model, 4.567 for the second, and 6.300 for the third, with Standard Error of the Mean (SEM) values of 0.304, 0.736, and 0.996, respectively. The average measurement tolerances of the resistance values were -3.50%, 4.45%, and 5.18% for the three models relative to the standard resistance measurement, and productivity improved to 118.33%. These test results are expected to improve quality and productivity in the FFC testing process.
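The host-side arithmetic described above (Ohm's law for a 4-wire measurement, plus the standard deviation and Standard Error of the Mean, SEM = sd/sqrt(n)) can be sketched as follows; the sensor readings are simulated here, whereas on the real system they would come from the INA219 via the Arduino data logger, and all values are illustrative.

# Sketch of the FFC test arithmetic: resistance from Ohm's law
# (R = V / I, as in a 4-wire Kelvin measurement) and dispersion
# statistics. Readings are simulated; values are illustrative.

import math
import random

def resistance_ohms(voltage_v, current_a):
    return voltage_v / current_a

def mean_sd_sem(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    sd = math.sqrt(var)
    return mean, sd, sd / math.sqrt(n)  # SEM = sd / sqrt(n)

if __name__ == "__main__":
    random.seed(1)
    # Simulated repeated measurements of one FFC conductor.
    readings = [resistance_ohms(0.100 + random.gauss(0, 0.002), 0.010)
                for _ in range(40)]
    mean, sd, sem = mean_sd_sem(readings)
    print(f"mean={mean:.3f} ohm  sd={sd:.3f}  SEM={sem:.3f}")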
Evaluating Model-Driven Development for large-scale EHRs through the openEHR approach.
Christensen, Bente; Ellingsen, Gunnar
2016-05-01
In healthcare, the openEHR standard is a promising Model-Driven Development (MDD) approach for electronic healthcare records. This paper aims to identify key socio-technical challenges when the openEHR approach is put to use in Norwegian hospitals. More specifically, key fundamental assumptions are investigated empirically. These assumptions promise a clear separation of technical and domain concerns, users being in control of the modelling process, and widespread user commitment. Finally, these assumptions promise an easy way to model and map complex organizations. This longitudinal case study is based on an interpretive approach, whereby data were gathered through 440h of participant observation, 22 semi-structured interviews and extensive document studies over 4 years. The separation of clinical and technical concerns seemed to be aspirational, because both designing the technical system and modelling the domain required technical and clinical competence. Hence developers and clinicians found themselves working together in both arenas. User control and user commitment seemed not to apply in large-scale projects, as modelling the domain turned out to be too complicated and hence to appeal only to especially interested users worldwide, not the local end-users. Modelling proved to be a complex standardization process that shaped both the actual modelling and healthcare practice itself. A broad assemblage of contributors seems to be needed for developing an archetype-based system, in which roles, responsibilities and contributions cannot be clearly defined and delimited. The way MDD occurs has implications for medical practice per se in the form of the need to standardize practices to ensure that medical concepts are uniform across practices. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirsch, Matthias
2009-06-29
At particle accelerators the Standard Model has been tested and will be tested further to great precision. The data analyzed in this thesis were collected at the world's highest-energy collider, the Tevatron, located at the Fermi National Accelerator Laboratory (FNAL) in the vicinity of Chicago, IL, USA. There, protons and antiprotons are collided at a center-of-mass energy of √s = 1.96 TeV. The discovery of the top quark was one of the remarkable results not only for the CDF and D0 experiments at the Tevatron collider, but also for the Standard Model, which had predicted the existence of the top quark on the basis of symmetry arguments long before. Still, the Tevatron is the only facility able to produce top quarks. The predominant production mechanism of top quarks is the production of a top-antitop quark pair via the strong force. However, the Standard Model also allows the production of single top quarks via the electroweak interaction. This process offers the unique opportunity to measure the |Vtb| matrix element of the Cabibbo-Kobayashi-Maskawa (CKM) matrix directly, without assuming unitarity of the matrix or assuming that the number of quark generations is three. Hence, the measurement of the cross section of electroweak top quark production is more than the technical challenge of extracting a physics process that occurs in only one out of ten billion collisions. It is also an important test of the V-A structure of the electroweak interaction and a potential window to physics beyond the Standard Model in the case where the measurement of |Vtb| would result in a value significantly different from 1, the value predicted by the Standard Model. At the Tevatron two production processes contribute significantly to the production of single top quarks: production via the t-channel, also called W-gluon fusion, and production via the s-channel, also known as the W* process. This analysis searches for the combined s+t channel production cross section, assuming the ratio of s-channel to t-channel production is realized in nature as predicted by the Standard Model. A data set of approximately 1 fb⁻¹ is analyzed, the data set used by the D0 collaboration to claim evidence for single top quark production. Events with two, three, and four jets are used in the analysis if they contain one or two jets tagged as originating from the decay of a b hadron, an isolated muon or electron, and a significant amount of missing transverse energy. This selection of events follows the signature that single top quark events are expected to show in the detector. In the meantime, both the D0 and CDF collaborations have analyzed a larger data set and have celebrated the joint observation of single top quark production. The novelty of the analysis presented here is the way discriminating observables are determined. A so-called Multi-Process Factory evaluates each event under several hypotheses. A common analysis technique, for example in top quark properties studies, is to reconstruct the intermediate particles in the decay chain of the signal process from the final state objects measured in the various subdetectors. An essential part of such a method is to resolve the ambiguities that arise in the assignment of the final state objects to the partons of the decay chain. In a Multi-Process Factory this approach is extended, and not only the decay chain of the signal process is reconstructed, but also the decay chains of the most important background processes.
From the numerous possible event configurations for each of the signal and background decay chains the most probable configuration is selected based on a likelihood measure. Properties of this configuration, such as the mass of the reconstructed top quark, are then used in a multivariate analysis technique to separate the expected signal contribution from the background processes. The technique used is called Boosted Decision Trees and has only recently been introduced in high energy physics analyses. A Bayesian approach is used to finally extract the cross section from the discriminant output of the Boosted Decision Trees.
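As a minimal, hedged illustration of the Boosted Decision Tree technique named above (not the thesis's actual analysis code), the following sketch trains a gradient-boosted classifier in scikit-learn on synthetic features standing in for properties of the most probable event configuration, such as the reconstructed top quark mass.

# Illustration of the BDT technique on synthetic data; features and
# distributions are invented for the example, not taken from the thesis.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Synthetic "signal": reconstructed mass near 175; "background": broad.
sig = np.column_stack([rng.normal(175, 15, n), rng.normal(1.0, 0.5, n)])
bkg = np.column_stack([rng.normal(150, 40, n), rng.normal(0.0, 0.5, n)])
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
print("test accuracy:", bdt.score(X_te, y_te))
# The discriminant output bdt.predict_proba(X_te)[:, 1] is the kind of
# quantity a Bayesian cross-section extraction would then be run on.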
Schoppe, Oliver; King, Andrew J.; Schnupp, Jan W.H.; Harper, Nicol S.
2016-01-01
Adaptation to stimulus statistics, such as the mean level and contrast of recently heard sounds, has been demonstrated at various levels of the auditory pathway. It allows the nervous system to operate over the wide range of intensities and contrasts found in the natural world. Yet current standard models of the response properties of auditory neurons do not incorporate such adaptation. Here we present a model of neural responses in the ferret auditory cortex (the IC Adaptation model), which takes into account adaptation to mean sound level at a lower level of processing: the inferior colliculus (IC). The model performs high-pass filtering with frequency-dependent time constants on the sound spectrogram, followed by half-wave rectification, and passes the output to a standard linear–nonlinear (LN) model. We find that the IC Adaptation model consistently predicts cortical responses better than the standard LN model for a range of synthetic and natural stimuli. The IC Adaptation model introduces no extra free parameters, so it improves predictions without sacrificing parsimony. Furthermore, the time constants of adaptation in the IC appear to be matched to the statistics of natural sounds, suggesting that neurons in the auditory midbrain predict the mean level of future sounds and adapt their responses appropriately. SIGNIFICANCE STATEMENT An ability to accurately predict how sensory neurons respond to novel stimuli is critical if we are to fully characterize their response properties. Attempts to model these responses have had a distinguished history, but it has proven difficult to improve their predictive power significantly beyond that of simple, mostly linear receptive field models. Here we show that auditory cortex receptive field models benefit from a nonlinear preprocessing stage that replicates known adaptation properties of the auditory midbrain. This improves their predictive power across a wide range of stimuli but keeps model complexity low as it introduces no new free parameters. Incorporating the adaptive coding properties of neurons will likely improve receptive field models in other sensory modalities too. PMID:26758822
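A schematic of the preprocessing stage described above, per-channel high-pass filtering of the spectrogram with frequency-dependent time constants followed by half-wave rectification, might look as follows; the filter form and time constants are illustrative assumptions, not the paper's exact implementation.

import numpy as np

def ic_adaptation_stage(spec, dt=0.005, taus=None):
    """Schematic IC adaptation stage: per-channel high-pass filtering
    (subtracting a running exponential estimate of the mean level,
    one time constant tau per frequency channel) followed by half-wave
    rectification. spec has shape (n_freq, n_time)."""
    n_freq, n_time = spec.shape
    if taus is None:
        taus = np.linspace(0.1, 0.5, n_freq)  # illustrative values only
    alphas = dt / taus
    mean_est = spec[:, 0].astype(float).copy()
    out = np.zeros_like(spec, dtype=float)
    for t in range(n_time):
        mean_est += alphas * (spec[:, t] - mean_est)  # running mean level
        out[:, t] = np.maximum(spec[:, t] - mean_est, 0.0)  # rectify
    return out  # would then feed a standard linear-nonlinear (LN) model

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_spec = rng.random((16, 200)) + np.linspace(0, 2, 200)  # rising level
    print(ic_adaptation_stage(toy_spec).mean())

Because this stage is fixed rather than fitted, it adds no free parameters, consistent with the parsimony point made in the abstract.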
Soneson, Joshua E
2017-04-01
Wide-angle parabolic models are commonly used in geophysics and underwater acoustics but have seen little application in medical ultrasound. Here, a wide-angle model for continuous-wave high-intensity ultrasound beams is derived, which approximates the diffraction process more accurately than the commonly used Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation without increasing implementation complexity or computing time. A method for preventing the high spatial frequencies often present in source boundary conditions from corrupting the solution is presented. Simulations of shallowly focused axisymmetric beams using both the wide-angle and standard parabolic models are compared to assess the accuracy with which they model diffraction effects. The wide-angle model proposed here offers improved focusing accuracy and less error throughout the computational domain than the standard parabolic model, offering a facile method for extending the utility of existing KZK codes.
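The contrast between the two models can be stated through the approximation of the one-way square-root propagation operator; the Padé form below is the usual wide-angle device in geophysics and underwater acoustics, shown here as a hedged illustration rather than the paper's exact operator:

\partial_z p = i k_0 \sqrt{1+X}\, p, \qquad X = k_0^{-2} \nabla_\perp^2
\sqrt{1+X} \approx 1 + \tfrac{1}{2}X \quad \text{(standard parabolic, KZK-type)}
\sqrt{1+X} \approx \frac{1 + \tfrac{3}{4}X}{1 + \tfrac{1}{4}X} \quad \text{(wide-angle, Pad\'e (1,1))}

The rational approximation remains accurate at steeper propagation angles than the linear one, which is consistent with the improved focusing accuracy reported for shallowly focused beams.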
Assessment of DoD Enterprise Resource Planning Business Systems
2011-02-01
activities, and processes to the organizational units that execute them
• Architecture standards, such as application of the BPMN
• Recommended... Recommendation:
• Create and use style guides for the development of BPMN-based process models. The style guide would probably include specifications such as:
o All processes must have 'entry points' or 'triggers' in the form of BPMN Events
o All processes must have 'outcomes' also in the form of BPMN
Process Definition and Modeling Guidebook. Version 01.00.02
1992-12-01
material (and throughout the guidebook) process definition is considered to be the act of representing the important characteristics of a process in a... characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test support tools... paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from
Chou, Wei-Lung; Wang, Chih-Ta; Chang, Wen-Chun; Chang, Shih-Yu
2010-08-15
In this study, metal hydroxides generated during electrocoagulation (EC) were used to remove the chemical oxygen demand (COD) of oxide chemical mechanical polishing (oxide-CMP) wastewater from a semiconductor manufacturing plant. Adsorption studies were conducted in a batch system for various current densities and temperatures. The COD concentration in the oxide-CMP wastewater was effectively removed and decreased by more than 90%, resulting in a final wastewater COD concentration below the Taiwan discharge standard (100 mg L⁻¹). Since the processed wastewater quality exceeded the direct discharge standard, the effluent could be considered for reuse. The adsorption kinetic studies showed that the EC process was best described by the pseudo-second-order kinetic model at the various current densities and temperatures. The experimental data were also tested against different adsorption isotherm models to describe the EC process. The Freundlich adsorption isotherm model predictions matched the experimental observations satisfactorily. Thermodynamic parameters, including the Gibbs free energy, enthalpy, and entropy, indicated that the COD adsorption of oxide-CMP wastewater on metal hydroxides was feasible, spontaneous and endothermic in the temperature range of 288-318 K. Copyright 2010 Elsevier B.V. All rights reserved.
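The two models named above can be fitted with simple linearizations; the sketch below does so on synthetic data (not the paper's measurements), using the pseudo-second-order form t/q_t = 1/(k2·qe²) + t/qe and the Freundlich form qe = KF·Ce^(1/n).

# Illustrative fits of the two adsorption models on synthetic data.
# Pseudo-second-order kinetics: t/q_t = 1/(k2*qe**2) + t/qe (linear in t).
# Freundlich isotherm: qe = KF * Ce**(1/n) -> ln qe = ln KF + (1/n) ln Ce.

import numpy as np

# Synthetic kinetic data: uptake q_t (mg/g) at times t (min).
t = np.array([5, 10, 20, 40, 60, 90, 120.0])
q_t = np.array([12, 20, 30, 38, 42, 44, 45.0])

slope, intercept = np.polyfit(t, t / q_t, 1)
qe = 1.0 / slope
k2 = slope**2 / intercept  # since intercept = 1/(k2*qe**2)
print(f"pseudo-second-order: qe={qe:.1f} mg/g, k2={k2:.4f} g/(mg min)")

# Synthetic equilibrium data: Ce (mg/L) vs qe (mg/g).
Ce = np.array([5, 10, 25, 50, 100.0])
qe_eq = np.array([18, 25, 38, 52, 70.0])
inv_n, ln_KF = np.polyfit(np.log(Ce), np.log(qe_eq), 1)
print(f"Freundlich: KF={np.exp(ln_KF):.2f}, 1/n={inv_n:.2f}")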
Past-tense generation from form versus meaning: Behavioural data and simulation evidence
Woollams, Anna M.; Joanisse, Marc; Patterson, Karalyn
2009-01-01
The standard task used to study inflectional processing of verbs involves presentation of the stem form from which the participant is asked to generate the past tense. This task reveals a processing disadvantage for irregular relative to regular English verbs, more pronounced for lower-frequency items. Dual- and single-mechanism theories of inflectional morphology are both able to account for this pattern; but the models diverge in their predictions concerning the magnitude of the regularity effect expected when the task involves past-tense generation from meaning. In this study, we asked normal speakers to generate the past tense from either form (verb stem) or meaning (action picture). The robust regularity effect observed in the standard form condition was no longer reliable when participants were required to generate the past tense from meaning. This outcome would appear problematic for dual-mechanism theories to the extent that they assume the process of inflection requires stem retrieval. By contrast, it supports single-mechanism models that consider stem retrieval to be task-dependent. We present a single-mechanism model of verb inflection incorporating distributed phonological and semantic representations that reproduces this task-dependent pattern. PMID:20161125
Coastal Algorithms and On-Demand Processing: The Lessons Learnt from CoastColour for Sentinel 3
NASA Astrophysics Data System (ADS)
Brockmann, Carsten; Doerffer, Roland; Boettcher, Martin; Kramer, Uwe; Zuhlke, Marco; Pinnock, Simon
2015-12-01
The ESA DUE CoastColour project was initiated to provide water quality products for important coastal zones globally. A new 5-component bio-optical model was developed and used in a 3-step approach for regional processing of ocean colour data. The L1P step consists of radiometric and geometric system corrections, and top-of-atmosphere pixel classification including cloud screening, sun glint risk masking and detection of floating vegetation. The second step includes the atmospheric correction and provides the L2R product, which comprises marine reflectances with error characterisation and normalisation. The third step is the in-water processing, which produces IOPs, the attenuation coefficient and water constituent concentrations. Each of these steps will benefit from the additional bands on OLCI. The 5-component bio-optical model will already be used in the standard ESA processing of OLCI, and part of the pixel classification methods will also be included in the standard products. Other algorithm adaptations are in preparation. Another important advantage of the CoastColour approach is the highly configurable processing chain, which allows adaptation to the individual characteristics of the area of interest, temporal window, algorithm parametrisation and processing chain configuration. This flexibility is made available to data users through the CoastColour on-demand processing service. The complete global MERIS Full and Reduced Resolution data archive is accessible, covering the time range from 17 May 2002 until 08 April 2012, which is almost 200 TB of input data available online. The CoastColour on-demand processing service can serve as a model for hosted processing, where the software is moved to the data instead of moving the data to the users; the latter will be a challenge with the large amount of data coming from Sentinel 3.
Van Haute, S; López-Gálvez, F; Gómez-López, V M; Eriksson, Markus; Devlieghere, F; Allende, Ana; Sampers, I
2015-09-02
A methodology to (i) assess the feasibility of water disinfection in fresh-cut leafy greens wash water and (ii) compare the efficiency of water disinfectants was defined and applied to a combination of peracetic acid (PAA) and lactic acid (LA), with a comparison against free chlorine. Standardized process water, a watery suspension of iceberg lettuce, was used for the experiments. First, the combination of PAA+LA was evaluated for water recycling. In this case, disinfectant was added to standardized process water inoculated with Escherichia coli (E. coli) O157 (6 log CFU/mL). Regression models were constructed based on the batch inactivation data and validated in industrial process water obtained from fresh-cut leafy green processing plants. The UV254(F) was the best indicator for PAA decay and, as such, for the E. coli O157 inactivation with PAA+LA. The disinfection efficiency of PAA+LA increased with decreasing pH. Furthermore, PAA+LA efficacy was assessed as a process water disinfectant to be used within the washing tank, using a dynamic washing process with continuous influx of E. coli O157 and organic matter into the washing tank. The process water contamination in the dynamic process was adequately estimated by the developed model, which assumed that knowledge of the disinfectant residual was sufficient to estimate the microbial contamination, regardless of the physicochemical load. Based on the obtained results, PAA+LA seems better suited than chlorine for disinfecting process wash water with a high organic load, but a higher disinfectant residual is necessary due to the slower E. coli O157 inactivation kinetics compared to chlorine. Copyright © 2015 Elsevier B.V. All rights reserved.
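The paper's regression models are not reproduced here, but a generic Chick-Watson-type sketch illustrates the kind of residual-driven inactivation model described, log10(N/N0) = -k·C·t with C the disinfectant residual; the data below are synthetic.

# Generic Chick-Watson-type inactivation sketch (synthetic data), to
# illustrate a model in which the disinfectant residual C drives the
# log-reduction; this is not the paper's actual regression model.

import numpy as np

C_t = np.array([(1.0, 0.5), (1.0, 1.0), (2.0, 0.5),
                (2.0, 1.0), (3.0, 1.0)])            # (mg/L, min) pairs
log_red = np.array([-0.9, -2.1, -2.0, -3.9, -6.1])  # synthetic data

ct = C_t[:, 0] * C_t[:, 1]                 # exposure C*t for each trial
k = -np.sum(ct * log_red) / np.sum(ct**2)  # least squares through origin
print(f"fitted k = {k:.2f} L/(mg min)")
print("predicted log-reduction at C=2 mg/L, t=2 min:", -k * 4.0)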
Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook
NASA Technical Reports Server (NTRS)
Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.
1986-01-01
The EASY5 macro component models developed for the spacecraft power system simulation are described. A brief explanation of how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional groups: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.
Sharing environmental models: An Approach using GitHub repositories and Web Processing Services
NASA Astrophysics Data System (ADS)
Stasch, Christoph; Nuest, Daniel; Pross, Benjamin
2016-04-01
The GLUES (Global Assessment of Land Use Dynamics, Greenhouse Gas Emissions and Ecosystem Services) project established a spatial data infrastructure for scientific geospatial data and metadata (http://geoportal-glues.ufz.de), where different regional collaborative projects researching the impacts of climate and socio-economic changes on sustainable land management can share their underlying base scenarios and datasets. One goal of the project is to ease the sharing of computational models between institutions and to make them easily executable in Web-based infrastructures. In this work, we present such an approach for sharing computational models relying on GitHub repositories (http://github.com) and Web Processing Services. First, model providers upload their model implementations to GitHub repositories in order to share them with others. The GitHub platform allows users to submit changes to the model code; the changes can be discussed and reviewed before merging them. However, while GitHub allows sharing of and collaboration on model source code, it does not actually allow running these models, which requires effort to transfer the implementation to a model execution framework. We have thus extended an existing implementation of the OGC Web Processing Service standard (http://www.opengeospatial.org/standards/wps), the 52°North Web Processing Service (http://52north.org/wps) platform, to retrieve model implementations from a git (http://git-scm.com) repository and add them to the collection of published geoprocesses. The current implementation is restricted to models implemented as R scripts using WPS4R annotations (Hinz et al.) and to Java algorithms using the 52°North WPS Java API. The models hence become executable through a standardized Web API by multiple clients such as desktop or browser GIS and modelling frameworks. If the model code is changed on the GitHub platform, the changes are retrieved by the service and the processes are updated accordingly. The admin tool of the 52°North WPS was extended to support automated retrieval and deployment of computational models from GitHub repositories. Once the R code is available in the GitHub repository, the contained process can be deployed and executed by simply defining the GitHub repository URL in the WPS admin tool. We illustrate the usage of the approach by sharing and running a model for land use system archetypes developed by the Helmholtz Centre for Environmental Research (UFZ, see Vaclavik et al.). The original R code was extended and published in the 52°North WPS using both public and non-public datasets (Nüst et al., see also https://github.com/52North/glues-wps). Hosting the analysis in a Git repository now allows WPS administrators, client developers, and modellers to easily work together on new versions or completely new web processes using the powerful GitHub collaboration platform. References: Hinz, M. et al. (2013): Spatial Statistics on the Geospatial Web. In: The 16th AGILE International Conference on Geographic Information Science, Short Papers. http://www.agile-online.org/Conference_Paper/CDs/agile_2013/Short_Papers/SP_S3.1_Hinz.pdf Nüst, D. et al. (2015): Open and reproducible global land use classification. In: EGU General Assembly Conference Abstracts. Vol. 17. European Geophysical Union, 2015, p. 9125, http://meetingorganizer.copernicus.org/EGU2015/EGU2015-9125.pdf Vaclavik, T. et al. (2013): Mapping global land system archetypes. Global Environmental Change 23(6): 1637-1647.
Available online: October 9, 2013. DOI: 10.1016/j.gloenvcha.2013.09.004
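Under the assumption of an OWSLib-style client (the service URL, process identifier, and input name below are hypothetical placeholders, not the project's actual endpoints), invoking a model published this way might look as follows.

# Sketch of invoking a published model through the WPS interface with
# OWSLib; URL, process identifier, and input name are hypothetical.

from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("http://example.org/wps")  # hypothetical URL
wps.getcapabilities()
print([p.identifier for p in wps.processes])

# Execute a (hypothetical) land-use-archetypes process published from
# the GitHub repository, passing one literal input.
execution = wps.execute("org.example.landuse.archetypes",
                        inputs=[("year", "2010")],
                        output="result")
monitorExecution(execution)  # poll until the asynchronous job finishes
print(execution.status)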
Congsheng Fu; Guiling Wang; Michael L. Goulden; Russell L. Scott; Kenneth Bible; Zoe G. Cardon
2016-01-01
Effects of hydraulic redistribution (HR) on hydrological, biogeochemical, and ecological processes have been demonstrated in the field, but the current generation of standard earth system models does not include a representation of HR. Though recent studies have examined the effect of incorporating HR into land surface models, few (if any) have done cross-site...
ERIC Educational Resources Information Center
Schmitz, Bernhard; Wiese, Bettina S.
2006-01-01
The present study combines a standardized diary approach with time-series analysis methods to investigate the process of self-regulated learning. Based on a process-focused adaptation of Zimmerman's (2000) learning model, an intervention (consisting of four weekly training sessions) to increase self-regulated learning was developed. The diaries…
Semiconductor technology program. Progress briefs
NASA Technical Reports Server (NTRS)
Bullis, W. M.
1980-01-01
Measurement technology for semiconductor materials, process control, and devices is reviewed. Activities include: optical linewidth and thermal resistance measurements; device modeling; dopant density profiles; resonance ionization spectroscopy; and deep level measurements. Standardized oxide charge terminology is also described.
[Study on the effect of 3 types of emergency drinking water disinfection models in flood/waterlogged areas].
Ban, Haiqun; Li, Jin; Li, Xinwu; Zhang, Liubo
2010-09-01
To establish three emergency drinking water disinfection models, separate disinfectant dispensing, dedicated-duty disinfectant dispensing, and centralized filtering, in flood/waterlogged areas, and to compare the effects of these three models on drinking water disinfection. From October to December 2008, 18 villages were selected as the trial field in Yanglinwei town, Xiantao city, Hubei province, and were divided into three groups: separate disinfectant dispensing, dedicated-duty disinfectant dispensing, and centralized filtering. Every 2 weeks, drinking water source water, the output water of the emergency central filtrate water equipment (ECFWE), and container water in the kitchen were sampled, and the microbial indices of the water samples (standard plate-count bacteria, total coliforms, thermotolerant coliform bacteria, and Escherichia coli) were measured. The microbial pollution of the source water in all three groups was heavy; all samples failed the standard. The elimination rate of standard plate-count bacteria by the emergency centralized processing equipment was 99.95%; the rates for separate dispensing, dedicated-duty dispensing, and centralized filtering were 81.93%, 99.67%, and 98.28%, respectively. The passing rates of the microbial indices of residents' stored water were 13.33%, 70.00%, and 43.33%, respectively; the differences were statistically significant. In flood/waterlogged areas, the drinking water disinfection effects of the centralized filtering and dedicated-duty dispensing models were better than that of the separate dispensing model.
Berner, Logan T.; Law, Beverly E.
2016-01-01
Plant trait measurements are needed for evaluating ecological responses to environmental conditions and for ecosystem process model development, parameterization, and testing. We present a standardized dataset integrating measurements from projects conducted by the Terrestrial Ecosystem Research and Regional Analysis-Pacific Northwest (TERRA-PNW) research group between 1999 and 2014 across Oregon and Northern California, where measurements were collected for scaling and modeling regional terrestrial carbon processes with models such as Biome-BGC and the Community Land Model. The dataset contains measurements of specific leaf area, leaf longevity, and leaf carbon and nitrogen for 35 tree and shrub species derived from more than 1,200 branch samples collected from over 200 forest plots, including several AmeriFlux sites. The dataset also contains plot-level measurements of forest composition, structure (e.g., tree biomass), and productivity, as well as measurements of soil structure (e.g., bulk density) and chemistry (e.g., carbon). Publicly archiving regional datasets of standardized, co-located, and geo-referenced plant trait measurements will advance the ability of earth system models to capture species-level climate sensitivity at regional to global scales. PMID:26784559
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Stanley T.
2007-01-01
This thesis describes the first search for Standard Model Higgs boson production in association with a top-antitop quark pair in proton-antiproton collisions at a centre of mass energy of 1.96 TeV. The integrated luminosity for this search corresponds to 319 pb⁻¹ of data recorded by the Collider Detector at Fermilab. We outline the event selection criteria, evaluate the event acceptance and estimate backgrounds from Standard Model sources. Three events are observed that satisfy our event selection, while 2.16 ± 0.66 events are expected from background processes. No significant excess of events above background is thus observed, and we set 95% confidence level upper limits on the production cross section for this process as a function of the Higgs mass. For a Higgs boson mass of 115 GeV/c², we find that σ($t\bar{t}H$) × BR(H → bb) < 690 fb at 95% C.L. These are the first limits set for $t\bar{t}H$ production. This search also allows us to anticipate the challenges and necessary strategies needed for future searches for $t\bar{t}H$ production.
NASA Astrophysics Data System (ADS)
Bernatowicz, P.; Czerski, I.; Jaźwiński, J.; Szymański, S.
2004-08-01
In the standard NMR spectra, the lineshape patterns produced by a molecular rate process are often poorly structured. When alternative theoretical models of such a process are to be compared, even quantitative lineshape fits may then give inconclusive results. A detailed description is presented of an approach involving fits of the competing models to series of Carr-Purcell echo spectra. Its high discriminative power has already been exploited in a number of cases of practical significance. An explanation is given of why it can be superior to methods based on the standard spectra. Its applicability in practice is now illustrated with the example of the methyl proton spectra in 1,2,3,4-tetrachloro-9,10-dimethyltriptycene (TCDMT). It is shown that, in the echo spectra, the recently discovered effect of nonclassical stochastic reorientation of the methyl group can be identified clearly, while it is practically nondiscernible in the standard spectra of TCDMT. This is the first detection of the effect at temperatures above 200 K. It is also shown that in computer-assisted interpretation of exchange-broadened echo spectra, the usual description of the stimulating radiofrequency pulses in terms of rotation operators ought to be replaced by a more realistic pulse model.
A Model for Joint Software Reviews
1998-10-01
CEPMAN 1, 1996; Gabb, 1997], and with the growing popularity of outsourcing, they are becoming more important in the commercial sector [ISO/IEC 12207... technical and management reviews [MIL-STD-498, 1996; ISO/IEC 12207, 1995]. Management reviews occur after technical reviews, and are focused on the cost... characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology Software Life Cycle Processes, Standard (No. ISO/IEC 12207
USDA-ARS?s Scientific Manuscript database
About 2 in 3 U.S. adults have pre-hypertension or hypertension increasing their risk of cardiovascular disease. Reducing sodium intake can decrease blood pressure and prevent hypertension. About 9 in 10 Americans consume excess sodium, >2300 mg daily. Voluntary sodium reduction standards for commerc...
NASA Astrophysics Data System (ADS)
Alameh, N.; Bambacus, M.; Cole, M.
2006-12-01
NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via:
• A modular and extensible architecture
• Consensus and community-based standards (e.g., ISO and OGC standards)
• A variety of clients and visualization techniques, including WorldWind and Google Earth
• A variety of services (including catalogs) with standard interfaces
• Data integration and interoperability
• Mechanisms for user involvement and collaboration
• Mechanisms for supporting interdisciplinary and domain-specific applications
ESG has played a key role in recent GEOSS Service Network (GSN) demos and workshops, acting not only as a service and data catalog and discovery client, but also as a portrayal and visualization client to distributed data.
Data Mining of Macromolecular Structures.
van Beusekom, Bart; Perrakis, Anastassis; Joosten, Robbie P
2016-01-01
The use of macromolecular structures is widespread for a variety of applications, from teaching protein structure principles all the way to ligand optimization in drug development. Applying data mining techniques to these experimentally determined structures requires a highly uniform, standardized structural data source. The Protein Data Bank (PDB) has evolved over the years toward becoming the standard resource for macromolecular structures. However, the process of selecting the data most suitable for specific applications is still very much based on personal preferences and understanding of the experimental techniques used to obtain these models. In this chapter, we first explain the challenges with data standardization, annotation, and uniformity in the PDB entries determined by X-ray crystallography. We then discuss the specific effect that crystallographic data quality and model optimization methods have on structural models, and how validation tools can be used to make informed choices. We also discuss specific advantages of using the PDB_REDO databank as a resource for structural data. Finally, we provide guidelines on how to select the most suitable protein structure models for detailed analysis and how to select a set of structure models suitable for data mining.
NASA Technical Reports Server (NTRS)
Goad, Clyde C.; Chadwell, C. David
1993-01-01
GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several-meter accuracy. However, random physical processes drive the errors whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm is suitable, such as a sequential filter/smoother. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now complete. It contains a correlated double-difference range processing capability, first-order Gauss-Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double-differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. A schematic of the development is shown. The observational data are edited in the preprocessor and passed to GEODYNII as one of its standard data types. A reference orbit is determined using GEODYNII as a batch least-squares processor, and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files, along with a control statement file and a satellite identification and mass file, are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
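The stochastic parameter models mentioned above have compact discrete-time forms; the sketch below shows a first-order Gauss-Markov step, x_{k+1} = exp(-Δt/τ)·x_k + w_k with Var(w_k) = σ²(1 - exp(-2Δt/τ)), and a random-walk step, with purely illustrative values rather than the processor's actual settings.

# Minimal sketch of the stochastic parameter models mentioned above:
# a first-order Gauss-Markov (FOGM) process and a random walk (the
# tau -> infinity limit of FOGM). All values are illustrative.

import math
import random

def fogm_step(x, dt, tau, sigma, rng):
    phi = math.exp(-dt / tau)
    q = sigma**2 * (1.0 - phi**2)  # keeps steady-state variance sigma^2
    return phi * x + rng.gauss(0.0, math.sqrt(q))

def random_walk_step(x, dt, q_rate, rng):
    return x + rng.gauss(0.0, math.sqrt(q_rate * dt))

if __name__ == "__main__":
    rng = random.Random(0)
    srp_scale, tropo = 1.0, 2.4   # illustrative parameter states
    for _ in range(10):           # ten 30-second epochs
        srp_scale = fogm_step(srp_scale, dt=30.0, tau=86400.0,
                              sigma=0.05, rng=rng)
        tropo = random_walk_step(tropo, dt=30.0, q_rate=1e-8, rng=rng)
    print(srp_scale, tropo)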
NASA Astrophysics Data System (ADS)
Polichtchouk, Yuri; Tokareva, Olga; Bulgakova, Irina V.
2003-03-01
Methodological problems of processing space images to assess the impact of atmospheric pollution on forest ecosystems using geoinformation systems are developed. An approach to the quantitative assessment of atmospheric pollution impact on forest ecosystems is based on calculating the relative areas of forest landscapes that lie inside atmospheric pollution zones. The landscape structure of forested territories in the southern part of Western Siberia is determined by processing medium-resolution space images from the Resource-O spacecraft. Particular features of modeling the atmospheric pollution zones caused by gas flaring on the territories of oil fields are considered. Pollution zones were revealed by modeling contaminant dispersal in the atmosphere with standard models. The areas of polluted landscapes are calculated as a function of the atmospheric pollution level.
FMEA of manual and automated methods for commissioning a radiotherapy treatment planning system.
Wexler, Amy; Gu, Bruce; Goddu, Sreekrishna; Mutic, Maya; Yaddanapudi, Sridhar; Olsen, Lindsey; Harry, Taylor; Noel, Camille; Pawlicki, Todd; Mutic, Sasa; Cai, Bin
2017-09-01
To evaluate the level of risk involved in treatment planning system (TPS) commissioning using a manual test procedure, and to compare the associated process-based risk to that of an automated commissioning process (ACP) by performing an in-depth failure modes and effects analysis (FMEA). The authors collaborated to determine the potential failure modes of the TPS commissioning process using (a) approaches involving manual data measurement, modeling, and validation tests and (b) an automated process utilizing application programming interface (API) scripting, preloaded and premodeled standard radiation beam data, a digital heterogeneous phantom, and an automated commissioning test suite (ACTS). The severity (S), occurrence (O), and detectability (D) were scored for each failure mode, and the risk priority numbers (RPN) were derived based on the TG-100 scale. Failure modes were then analyzed and ranked based on RPN. The total number of failure modes, the RPN scores, and the ten failure modes with the highest risk are described and cross-compared between the two approaches. An RPN reduction analysis is also presented and used as another quantifiable metric to evaluate the proposed approach. The FMEA of the manual test procedure resulted in 47 failure modes with an average RPN of 161 and an average severity of 6.7; the highest-risk process, "Measurement Equipment Selection", had a maximum RPN of 640. The FMEA of the ACP resulted in 36 failure modes with an average RPN of 73 and an average severity of 6.7; the highest-risk process, "EPID Calibration", had a maximum RPN of 576. An FMEA of treatment planning commissioning tests using automation and standardization via API scripting, preloaded and premodeled standard beam data, and digital phantoms suggests that errors and risks may be reduced through the use of an ACP. © 2017 American Association of Physicists in Medicine.
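The RPN arithmetic underlying the comparison is simply RPN = S × O × D, with each factor scored on the TG-100 scale; the sketch below ranks a few illustrative failure modes (the scores are invented for illustration, not the study's worksheet values).

# RPN = S * O * D ranking sketch; failure modes and scores illustrative.

failure_modes = [
    ("Measurement equipment selection", 8, 8, 10),
    ("Beam data transcription error",   7, 5,  6),
    ("EPID calibration",                8, 9,  8),
]

ranked = sorted(failure_modes,
                key=lambda fm: fm[1] * fm[2] * fm[3],  # RPN = S*O*D
                reverse=True)
for name, s, o, d in ranked:
    print(f"{name:35s} S={s} O={o} D={d} RPN={s*o*d}")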
Search for third generation squarks in pp collisions at 13 TeV with CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strobbe, Nadja C.
2016-11-14
Three searches for third generation squarks using proton-proton collision data at a center-of-mass energy of 13 TeV recorded by the CMS experiment at the LHC are presented. The data correspond to an integrated luminosity of 12.9 fb⁻¹. The analyses define exclusive search regions and estimate contributions from standard model processes to these regions by using control samples in the data. No significant deviation from the standard model expectation is observed in the data. The results are interpreted in simplified SUSY models of direct and gluino-mediated top squark production. Depending on the model, top squark masses up to 910 GeV and gluino masses up to 1780 GeV are excluded.
Search for a Heavy Neutrino and Right-Handed W of the Left-Right Symmetric Model with Cms Detector
NASA Astrophysics Data System (ADS)
Tlisov, Danila
2013-11-01
This work describes the first search for signals from the production of right-handed WR bosons and heavy neutrinos Nℓ (ℓ = e, μ), which arise naturally in the left-right symmetric extension of the Standard Model, with the CMS experiment at the LHC, using the 7 TeV pp collision data collected in 2010 and 2011 corresponding to an integrated luminosity of 240 pb⁻¹. No excess over expectations from Standard Model processes is observed. For models with exact left-right symmetry (the same coupling in the left and right sectors) we exclude the region in the two-dimensional parameter space that extends to (M
Dark Matter "Collider" from Inelastic Boosted Dark Matter.
Kim, Doojin; Park, Jong-Chul; Shin, Seodong
2017-10-20
We propose a novel dark matter (DM) detection strategy for models with a nonminimal dark sector. The main ingredients in the underlying DM scenario are a boosted DM particle and a heavier dark sector state. The relativistic DM impinging on the target material scatters off inelastically to the heavier state, which subsequently decays into DM along with lighter states including visible (standard model) particles. The expected signal event therefore accompanies a visible signature from the secondary cascade process associated with a recoil of the target particle, differing from the typical neutrino signal, which does not involve the secondary signature. We then discuss various kinematic features, followed by DM detection prospects at large-volume neutrino detectors, within a model framework where a dark gauge boson is the mediator between the standard model particles and DM.
Application of overlay modeling and control with Zernike polynomials in an HVM environment
NASA Astrophysics Data System (ADS)
Ju, JaeWuk; Kim, MinGyu; Lee, JuHan; Nabeth, Jeremy; Jeon, Sanghuck; Heo, Hoyoung; Robinson, John C.; Pierson, Bill
2016-03-01
Shrinking technology nodes and smaller process margins require improved photolithography overlay control. Generally, overlay measurement results are modeled with Cartesian polynomial functions for both intra-field and inter-field models, and the model coefficients are sent to an advanced process control (APC) system operating in an XY Cartesian basis. Dampened overlay corrections, typically via an exponentially or linearly weighted moving average in time, are then retrieved from the APC system to apply on the scanner in XY Cartesian form for subsequent lot exposure. The goal of the above method is to process lots with corrections that target the least possible overlay misregistration in steady state as well as in change-point situations. In this study, we model overlay errors on product using Zernike polynomials, with the same fitting capability as the process of reference (POR), to represent the wafer-level terms, and use the standard Cartesian polynomials to represent the field-level terms. APC calculations for wafer-level correction are performed in the Zernike basis, while field-level calculations use the standard XY Cartesian basis. Finally, weighted wafer-level correction terms are converted to XY Cartesian space in order to be applied on the scanner, along with field-level corrections, for future wafer exposures. Since Zernike polynomials have the property of being orthogonal in the unit disk, we are able to reduce the amount of collinearity between terms and improve overlay stability. Our real-time Zernike modeling and feedback evaluation was performed on a 20-lot dataset in a high volume manufacturing (HVM) environment. The measured on-product results were compared to POR and showed a 7% reduction in overlay variation, including a 22% reduction in term variation. This led to an on-product raw overlay Mean + 3Sigma X&Y improvement of 5% and resulted in a 0.1% yield improvement.
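A minimal sketch of the wafer-level step, least-squares fitting a few low-order Zernike terms on the unit disk to measured overlay, is given below; the term selection, coefficients, and noise level are illustrative assumptions, not the study's production model.

# Least-squares fit of low-order Zernike terms to overlay data on the
# unit disk (normalized wafer coordinates); data and terms illustrative.

import numpy as np

def zernike_terms(x, y):
    """A few low-order Zernike polynomials evaluated at (x, y),
    with x**2 + y**2 <= 1 (normalized wafer radius)."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),   # piston
        x,                 # tilt x
        y,                 # tilt y
        2.0 * r2 - 1.0,    # defocus-like
        x**2 - y**2,       # astigmatism 0 deg
        2.0 * x * y,       # astigmatism 45 deg
    ])

rng = np.random.default_rng(0)
x, y = rng.uniform(-0.9, 0.9, 300), rng.uniform(-0.9, 0.9, 300)
mask = x**2 + y**2 <= 1.0
x, y = x[mask], y[mask]
A = zernike_terms(x, y)
true_c = np.array([0.5, 1.2, -0.8, 0.3, 0.1, -0.2])
dx = A @ true_c + rng.normal(0, 0.05, len(x))  # "measured" overlay X

coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
print("fitted wafer-level Zernike coefficients:", np.round(coef, 2))
# These coefficients would be dampened (e.g., via EWMA) in the APC loop
# and converted back to XY Cartesian corrections for the scanner.

The orthogonality of the basis over the disk is what limits collinearity between the fitted terms, the stability benefit the abstract reports.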
Karvounis, E C; Exarchos, T P; Fotiou, E; Sakellarios, A I; Iliopoulou, D; Koutsouris, D; Fotiadis, D I
2013-01-01
With an ever-increasing number of biological models available on the internet, a standardized modelling framework is required to allow information to be accessed and visualized. In this paper we propose a novel Extensible Markup Language (XML) based format called ART-ML that aims at supporting the interoperability and the reuse of models of geometry, blood flow, plaque progression and stent modelling, exported by any cardiovascular disease modelling software. ART-ML has been developed and tested using ARTool. ARTool is a platform for the automatic processing of various image modalities of coronary and carotid arteries. The images and their content are fused to develop morphological models of the arteries in 3D representations. All the above-described procedures integrate disparate data formats, protocols and tools. ART-ML proposes a representation, extending ARTool, for the interoperability of the individual resources, creating a standard unified model for the description of data and, consequently, a format for their exchange and representation that is machine independent. More specifically, the ARTool platform incorporates efficient algorithms which are able to perform blood flow simulations and atherosclerotic plaque evolution modelling. Integration of data layers between different modules within ARTool is based upon the interchange of information included in the ART-ML model repository. ART-ML provides a markup representation that enables the representation and management of embedded models within the cardiovascular disease modelling platform, and the storage and interchange of well-defined information. The corresponding ART-ML model incorporates all relevant information regarding geometry, blood flow, plaque progression and stent modelling procedures. All created models are stored in a model repository database which is accessible to the research community through efficient web interfaces, enabling the interoperability of any cardiovascular disease modelling software models. ART-ML can be used as a reference ML model in multiscale simulations of plaque formation and progression, incorporating all scales of the biological processes.
BioModels.net Web Services, a free and integrated toolkit for computational modelling software.
Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille
2010-05-01
Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further toward simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models in their own tools, combine queries in workflows and efficiently analyse models.
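A client sketch in the SOAP/WSDL style described might look as follows; the WSDL URL and operation names are hypothetical placeholders rather than the service's documented interface.

# Generic SOAP client sketch with zeep; the WSDL URL and operation
# names below are hypothetical, not the documented BioModels interface.

from zeep import Client

client = Client("http://example.org/biomodels?wsdl")  # hypothetical WSDL
# Operations are discovered from the WSDL; a typical query might fetch
# a curated model by identifier and then resolve its annotations.
sbml = client.service.getModelById("BIOMD0000000012")     # hypothetical op
terms = client.service.getAnnotations("BIOMD0000000012")  # hypothetical op
print(len(sbml), terms[:3])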
Phase-I monitoring of standard deviations in multistage linear profiles
NASA Astrophysics Data System (ADS)
Kalaei, Mahdiyeh; Soleimani, Paria; Niaki, Seyed Taghi Akhavan; Atashgar, Karim
2018-03-01
In most modern manufacturing systems, products are often the output of multistage processes. In these processes the stages are dependent on each other, and the output quality of each stage depends on the output quality of the previous stages; this property is called the cascade property. Although there are many studies on multistage process monitoring, there are fewer works on profile monitoring in multistage processes, especially on the variability monitoring of a multistage profile in Phase I, for which no research is found in the literature. In this paper, a new methodology is proposed to monitor, in Phase I, the standard deviation of a simple linear profile in multistage processes with the cascade property. To this aim, an autoregressive correlation model between the stages is considered first. Then, the effect of the cascade property on the performance of three types of T² control charts in Phase I with shifts in the standard deviation is investigated. As we show that this effect is significant, a U statistic is then used to remove the cascade effect, based on which the investigated control charts are modified. Simulation studies reveal good performance of the modified control charts.
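The cascade-removal idea can be illustrated schematically: if stage 2 depends on stage 1 through an autoregressive link, regressing out the estimated link leaves an adjusted statistic that is approximately uncorrelated with the upstream stage. The sketch below is a schematic of the U-statistic idea, not the paper's exact formulation.

# Schematic cascade adjustment: y2 = phi*y1 + e, monitor u2 = y2 -
# phi_hat*y1 instead of y2. Data and parameters are synthetic.

import numpy as np

rng = np.random.default_rng(1)
m, phi = 50, 0.7                      # Phase-I samples, cascade strength
y1 = rng.normal(0.0, 1.0, m)          # stage-1 quality characteristic
y2 = phi * y1 + rng.normal(0.0, 0.5, m)

phi_hat = np.sum(y1 * y2) / np.sum(y1**2)  # least-squares link estimate
u2 = y2 - phi_hat * y1                     # cascade-adjusted statistic

print(f"phi_hat={phi_hat:.2f}")
print(f"corr(y1, y2)={np.corrcoef(y1, y2)[0, 1]:.2f}, "
      f"corr(y1, u2)={np.corrcoef(y1, u2)[0, 1]:.2f}")  # ~0 after adjustment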
Qualitative dynamics semantics for SBGN process description.
Rougny, Adrien; Froidevaux, Christine; Calzone, Laurence; Paulevé, Loïc
2016-06-16
Qualitative dynamics semantics provide a coarse-grained modeling of network dynamics by abstracting away kinetic parameters. They capture general features of system dynamics, such as attractors or reachability properties, for which scalable analyses exist. The Systems Biology Graphical Notation Process Description language (SBGN-PD) has become a standard for representing reaction networks. However, no qualitative dynamics semantics taking into account all the main features available in SBGN-PD had been proposed so far. We propose two qualitative dynamics semantics for SBGN-PD reaction networks, namely the general semantics and the stories semantics, which we formalize using asynchronous automata networks. While the general semantics extends the standard Boolean semantics of reaction networks by taking into account all the main features of SBGN-PD, the stories semantics makes it possible to model several molecules of a network with a unique variable. The resulting qualitative models can be checked against dynamical properties and therefore validated with respect to biological knowledge. We apply our framework to reason on the qualitative dynamics of a large network (more than 200 nodes) modeling the regulation of the cell cycle by RB/E2F. The proposed semantics provide a direct formalization of SBGN-PD networks in qualitative dynamical models that can be further analyzed using standard tools for discrete models. The dynamics in the stories semantics have a lower dimension than those of the general semantics and prune multiple behaviors (which can be considered spurious) by enforcing mutual exclusiveness between the activity of different nodes of a same story. Overall, the qualitative semantics for SBGN-PD make it possible to capture efficiently important dynamical features of reaction network models, which can be exploited to further refine them.
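A minimal asynchronous Boolean update scheme of the automata-network kind used for the formalization is sketched below; the three-node network is a toy example, not the RB/E2F model.

# Toy asynchronous Boolean network: one enabled update fires at a time,
# which is the asynchronous semantics referred to above.

rules = {                        # next value of each node given state s
    "A": lambda s: not s["C"],   # A is inhibited by C
    "B": lambda s: s["A"],       # B is activated by A
    "C": lambda s: s["B"],       # C is activated by B
}

def successors(state):
    """Yield all one-step asynchronous successors of a state."""
    for node, f in rules.items():
        nxt = bool(f(state))
        if nxt != state[node]:
            new = dict(state)
            new[node] = nxt
            yield new

state = {"A": True, "B": False, "C": False}
for succ in successors(state):
    print(succ)
# Exhaustive exploration over all 2**3 states would expose attractors
# and reachability properties checkable with standard discrete tools.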
Enriching the Web Processing Service
NASA Astrophysics Data System (ADS)
Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer
2014-05-01
The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it specifies the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-grained, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand specifications for lifecycle management and service orchestration. Orchestrating smaller sub-processes is a step towards interoperability; comprehensive documentation through appropriate metadata is also required. Though different approaches have been tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use, and considerable integration effort. Our toolset RichWPS aims at providing a better overall experience through two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested on the RichWPS Server, a composition can be deployed there for production use. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated into Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance, and publishing them as common processes. The server is therefore oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes through a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified essential requirements for the components of our toolset by applying two use cases. The first enables the simplified comparison of modelled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water-level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans, and an evaluation of the ecological situation in the marine environment, using information technologies adapted to those of INSPIRE. One of the parameters monitored and evaluated for the MSD is the extent and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex seagrass evaluation process into reusable steps and implement those packages as configurable WPS processes.
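A minimal sketch of the standard WPS 1.0.0 key-value-pair requests named above, using Python's requests library; the endpoint, process identifier and input are hypothetical placeholders, not part of RichWPS:

```python
# Sketch: WPS 1.0.0 GetCapabilities and Execute requests via HTTP KVP.
# The endpoint, process identifier and input are hypothetical placeholders.
import requests

ENDPOINT = "https://example.org/wps"  # assumed endpoint

# Discover the processes a server offers.
caps = requests.get(ENDPOINT, params={
    "service": "WPS",
    "version": "1.0.0",
    "request": "GetCapabilities",
})
print(caps.status_code)

# Execute a (hypothetical) process with one literal input.
resp = requests.get(ENDPOINT, params={
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "compareDatasets",   # assumed process id
    "datainputs": "tolerance=0.1",     # assumed literal input
})
print(resp.text[:200])
```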
HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry
2014-01-01
Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans that have flown on long-duration missions and beyond low Earth orbit, the amount of research and clinical data necessary to predict and mitigate these health and performance risks is limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have a moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial effort to adapt the processes established in this standard to biological M&S, which is more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated substantial interest from the broader medical community, through institutions like the National Institutes of Health (NIH) and the Food and Drug Administration (FDA), in developing similar standards and guidelines applicable to the larger medical operations and research community. DISCUSSION: Similar to NASA, many leading government agencies, health institutions and medical product developers around the world are recognizing the potential of computational M&S to support clinical research and decision making. In this light, substantial investments are being made in computational medicine and notable discoveries are being realized [8]. However, there is a lack of broadly applicable practice guidance for the development and implementation of M&S in clinical care and research in a manner that instills confidence among medical practitioners and biological researchers [9,10]. In this presentation, we will give an overview of how HRP is working with the NIH's Interagency Modeling and Analysis Group (IMAG), the FDA and the American Society of Mechanical Engineers (ASME) to leverage NASA's biomedical VV&C processes to establish a new regulatory standard for Verification and Validation in Computational Modeling of Medical Devices, and Guidelines for Credible Practice of Computational Modeling and Simulation in Healthcare.
Egan, Sarah J; Piek, Jan P; Dyck, Murray J; Rees, Clare S; Hagger, Martin S
2013-10-01
Clinical perfectionism is a transdiagnostic process that has been found to maintain eating disorders, anxiety disorders and depression. Cognitive behavioural models explaining the maintenance of clinical perfectionism emphasize the contribution of dichotomous thinking and of resetting standards higher following both success and failure in meeting goals. There has been a paucity of research examining the predictions of these models and motivation to change perfectionism. Motivation to change is important, as individuals with clinical perfectionism often report many perceived benefits of their perfectionism; they are, therefore, likely to be ambivalent about changing it. The aim was to compare qualitative responses to questions about motivation to change standards, and cognitions regarding failure to meet a personal standard, in two contrasting groups with high and low negative perfectionism. Negative perfectionism refers to concern over not meeting personal standards. A clinical group with a range of axis 1 diagnoses who were elevated on negative perfectionism was compared to a group of athletes who were low on negative perfectionism. Results indicated that the clinical group perceived many negative consequences of their perfectionism. They also, however, reported numerous benefits, and the majority stated that they would prefer not to change their perfectionism. The clinical group also reported dichotomous thinking and preferring either to keep standards the same or to reset standards higher following failure, whilst the athlete group reported they would keep standards the same or set them lower. The findings support predictions of the cognitive behavioural model of clinical perfectionism.
Tabrizi, Jafar-Sadegh; Farahbakhsh, Mostafa; Shahgoli, Javad; Rahbar, Mohammad Reza; Naghavi-Behzad, Mohammad; Ahadi, Hamid-Reza; Azami-Aghdash, Saber
2015-10-01
Excellence and quality models are comprehensive methods for improving the quality of healthcare. The aim of this study was to design an excellence and quality model for training centers of primary health care using the Delphi method. First, comprehensive information was collected through a literature review, in which 39 models from 34 countries were identified; related sub-criteria and standards were extracted from 34 of these models. A primary model including 8 criteria, 55 sub-criteria, and 236 standards was then developed as a Delphi questionnaire and evaluated in four stages by 9 specialists of the health care system in Tabriz and 50 specialists from around the country. After the four stages of evaluation, the primary model (8 criteria, 55 sub-criteria, and 236 standards) was reduced to 8 criteria, 45 sub-criteria, and 192 standards. The major criteria of the model are leadership, strategic and operational planning, resource management, information analysis, human resources management, process management, customer results, and functional results; the specialists assigned a total score of 1000 across the criteria. Functional results had the maximum score of 195, whereas planning had the minimum score of 60. Furthermore, leadership had the most sub-criteria (10) and strategic planning the fewest (3). The model introduced in this research was designed following 34 reference models from around the world. It could provide a proper frame for managers of the health system in improving quality.
Modified SPC for short run test and measurement process in multi-stations
NASA Astrophysics Data System (ADS)
Koh, C. K.; Chin, J. F.; Kamaruddin, S.
2018-03-01
Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band based on measurement uncertainty to reduce the width of the acceptance limits, as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band and control charts (Z̄ chart and W chart) for short runs in T&M processes at multiple stations. The proposed model standardizes the observed value with the measurement target (T) and the rationed measurement uncertainty (U). An S-factor (S_f) is introduced into the control limits to improve the sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with an industrial case study.
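A minimal sketch of the standardization step and of an S-factor-tightened chart limit, under assumed definitions (Z = (x − T)/U, limits at ±3·S_f·σ_Z); the paper's exact Z̄/W chart construction is not reproduced:

```python
# Sketch: standardizing T&M observations against the measurement target T and
# rationed measurement uncertainty U, then charting with an S-factor-tightened
# limit. Definitions here are assumptions, not the paper's exact model.
import numpy as np

def z_statistics(x, target, uncertainty):
    """Standardize observed values: Z = (x - T) / U."""
    return (np.asarray(x) - target) / uncertainty

def control_limits(z_history, s_factor=0.8):
    """Tighten conventional 3-sigma limits by a sensitivity factor S_f."""
    sigma = np.std(z_history, ddof=1)
    return -3 * s_factor * sigma, 3 * s_factor * sigma

x = [10.02, 9.98, 10.05, 10.11, 9.95]
z = z_statistics(x, target=10.0, uncertainty=0.05)
lcl, ucl = control_limits(z)
print("flags:", [(zi < lcl) or (zi > ucl) for zi in z])
```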
Human Language Technology: Opportunities and Challenges
2005-01-01
because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ... to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using ... maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with
ERIC Educational Resources Information Center
Huang, Xiaoxia; Cribbs, Jennifer
2017-01-01
This study examined mathematics and science teachers' perceptions and use of four types of examples, including typical textbook examples (standard worked examples) and erroneous worked examples in the written form as well as mastery modelling examples and peer modelling examples involving the verbalization of the problem-solving process. Data…
Backus-Gilbert inversion of travel time data
NASA Technical Reports Server (NTRS)
Johnson, L. E.
1972-01-01
Application of the Backus-Gilbert theory for geophysical inverse problems to the seismic body-wave travel-time problem is described. In particular, it is shown how to generate earth models that fit travel-time data to within one standard error and, having generated such models, how to describe their degree of uniqueness. An example is given to illustrate the process.
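A minimal discretized sketch of the Backus-Gilbert construction of a localized averaging kernel, using toy data kernels (the kernels, target depth and spread measure are illustrative assumptions, not the paper's Earth models):

```python
# Sketch: discretized Backus-Gilbert averaging kernel at one target depth.
# Toy data kernels only; illustrates the method, not the paper's models.
import numpy as np

r = np.linspace(0, 1, 200)           # normalized depth
dr = r[1] - r[0]
# Hypothetical data kernels G_i(r): smooth bumps centred at various depths.
centers = np.linspace(0.1, 0.9, 9)
G = np.exp(-((r[None, :] - centers[:, None]) / 0.15) ** 2)

r0 = 0.5                             # target depth for the local average
# Spread matrix S_ij = int (r - r0)^2 G_i G_j dr; constraint u_i = int G_i dr.
S = (G * (r - r0) ** 2) @ G.T * dr
u = G.sum(axis=1) * dr

a = np.linalg.solve(S, u)
a /= u @ a                           # enforce unimodularity: int A(r) dr = 1
A = a @ G                            # averaging kernel A(r) = sum_i a_i G_i(r)
print("rough kernel spread:", np.sqrt((A**2 * (r - r0)**2).sum() * dr))
```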
ERIC Educational Resources Information Center
Powell, Kristin; Wells, Marcella
2002-01-01
Compares the effects of three experiential science lessons in meeting the objectives of the Colorado model content science standards. Uses Kolb's (1984) experiential learning model as a framework for understanding the process by which students engage in learning when participating in experiential learning activities. Uses classroom exams and…
ERIC Educational Resources Information Center
Alonso, Fernando; Manrique, Daniel; Martínez, Loïc; Viñes, José M.
2015-01-01
The main objective of higher education institutions is to educate students to high standards to proficiently perform their role in society. Elsewhere we presented empirical evidence illustrating that the use of a blended learning approach to the learning process that applies a moderate constructivist e-learning instructional model improves…
Dependency-based Siamese long short-term memory network for learning sentence representations
Zhu, Wenhao; Ni, Jianyue; Wei, Baogang; Lu, Zhiguo
2018-01-01
Textual representations play an important role in the field of natural language processing (NLP). The efficiency of NLP tasks, such as text comprehension and information extraction, can be significantly improved with proper textual representations. As neural networks are gradually applied to learn the representation of words and phrases, fairly efficient models of learning short text representations have been developed, such as the continuous bag of words (CBOW) and skip-gram models, and they have been extensively employed in a variety of NLP tasks. Because of the more complex structure of longer texts, such as sentences, algorithms appropriate for learning short textual representations are not applicable to learning long textual representations. One method of learning long textual representations is the Long Short-Term Memory (LSTM) network, which is suitable for processing sequences. However, the standard LSTM does not adequately address the primary sentence structure (subject, predicate and object), which is an important factor for producing appropriate sentence representations. To resolve this issue, this paper proposes the dependency-based LSTM model (D-LSTM). The D-LSTM divides a sentence representation into two parts: a basic component and a supporting component. The D-LSTM uses a pre-trained dependency parser to obtain the primary sentence information and generate the supporting components, and it uses a standard LSTM model to generate the basic sentence components. A weight factor that can adjust the ratio of the basic and supporting components in a sentence is introduced to generate the sentence representation. Compared with the representation learned by the standard LSTM, the sentence representation learned by the D-LSTM contains a greater amount of useful information. The experimental results show that the D-LSTM is superior to the standard LSTM on the Sentences Involving Compositional Knowledge (SICK) data. PMID:29513748
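A minimal PyTorch sketch of the weighted combination described, with the dependency-derived supporting component stubbed out; the class name, weighting scheme and shapes are assumptions based on the abstract, not the paper's implementation:

```python
# Sketch: combining a basic LSTM sentence component with a dependency-derived
# supporting component via a weight factor, as described in the abstract.
# The supporting-component extraction is stubbed; details are assumptions.
import torch
import torch.nn as nn

class DLSTMSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, weight=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.weight = weight  # ratio of supporting to basic component

    def forward(self, token_ids, core_ids):
        # Basic component: final hidden state over the whole sentence.
        _, (h_basic, _) = self.lstm(self.embed(token_ids))
        # Supporting component: same encoder over dependency-selected core
        # tokens (e.g. subject, predicate, object from a parser).
        _, (h_supp, _) = self.lstm(self.embed(core_ids))
        return h_basic[-1] + self.weight * h_supp[-1]

model = DLSTMSketch(vocab_size=1000)
sent = torch.randint(0, 1000, (2, 12))   # batch of token ids
core = torch.randint(0, 1000, (2, 3))    # parser-selected core tokens (stub)
print(model(sent, core).shape)           # torch.Size([2, 128])
```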
Experiment Analysis and Modelling of Compaction Behaviour of Ag60Cu30Sn10 Mixed Metal Powders
NASA Astrophysics Data System (ADS)
Zhou, Mengcheng; Huang, Shangyu; Liu, Wei; Lei, Yu; Yan, Shiwei
2018-03-01
A novel process method combining powder compaction and sintering was employed to fabricate thin sheets of cadmium-free silver-based filler metals, and the compaction densification behaviour of Ag60Cu30Sn10 mixed metal powders was investigated experimentally. Based on the equivalent density method, the density-dependent Drucker-Prager Cap (DPC) model was introduced to model the powder compaction behaviour. Various experimental procedures were completed to determine the model parameters, and the friction coefficients in lubricated and unlubricated dies were experimentally determined. The determined material parameters were validated by experiments and by numerical simulation of the powder compaction process using a user subroutine (USDFLD) in ABAQUS/Standard. The good agreement between the simulated and experimental results indicates that the determined model parameters are able to describe the compaction behaviour of the multicomponent mixed metal powders and can be further used for process optimization simulations.
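A minimal sketch of the Drucker-Prager Cap yield surfaces in (p, q) space with hypothetical, density-dependent parameters, illustrating the model family calibrated in the paper (all parameter values are placeholders, and the cap is the simplified elliptical form):

```python
# Sketch: Drucker-Prager Cap model surfaces in (p, q) space with placeholder,
# density-dependent parameters. Illustrates the model family only.
import numpy as np

def dpc_shear_failure(p, d, beta_deg):
    """Shear failure line: q = d + p * tan(beta)."""
    return d + p * np.tan(np.radians(beta_deg))

def dpc_cap(p, d, beta_deg, pa, R):
    """Simplified elliptical cap between pa and pb = pa + R*(d + pa*tan(beta))."""
    qa = d + pa * np.tan(np.radians(beta_deg))
    return qa * np.sqrt(np.maximum(0.0, 1 - ((p - pa) / (R * qa)) ** 2))

# Hypothetical parameters increasing with relative density (placeholders).
for rel_density, d, beta, pa, R in [(0.6, 2.0, 55, 40, 0.55),
                                    (0.8, 5.0, 60, 120, 0.45)]:
    pb = pa + R * (d + pa * np.tan(np.radians(beta)))   # cap apex pressure
    p = np.linspace(pa, pb, 5)
    print(rel_density,
          "shear q at pa:", round(dpc_shear_failure(pa, d, beta), 2),
          "cap q:", dpc_cap(p, d, beta, pa, R).round(2))
```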
Constraints of beyond Standard Model parameters from the study of neutrinoless double beta decay
NASA Astrophysics Data System (ADS)
Stoica, Sabin
2017-12-01
Neutrinoless double beta (0νββ) decay is a beyond-Standard-Model (BSM) process whose discovery would clarify whether lepton number is conserved, decide on the neutrinos' character (are they Dirac or Majorana particles?) and give a hint about the scale of their absolute masses. Also, from the study of 0νββ decay one can constrain other BSM parameters related to the different scenarios by which this process can occur. In this paper I first give a short review of the current challenges in precisely calculating the phase space factors and nuclear matrix elements entering the 0νββ decay lifetimes, and I report our group's results for these quantities. Then, taking advantage of the most recent experimental limits on 0νββ lifetimes, I present new constraints on the neutrino mass parameters associated with different mechanisms by which the 0νββ decay mode can occur.
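For orientation, in the standard light-neutrino-exchange mechanism the quantities named above enter the half-life through the well-known factorization (a textbook relation, not specific to this paper):

```latex
% Half-life factorization for 0vbb via light Majorana neutrino exchange:
% G is the phase space factor, M the nuclear matrix element,
% <m_bb> the effective Majorana neutrino mass, m_e the electron mass.
\left[ T_{1/2}^{0\nu} \right]^{-1}
  = G^{0\nu} \, \left| M^{0\nu} \right|^{2}
    \left( \frac{\langle m_{\beta\beta} \rangle}{m_e} \right)^{2}
```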
Standardization as an Arena for Open Innovation
NASA Astrophysics Data System (ADS)
Grøtnes, Endre
This paper argues that anticipatory standardization can be viewed as an arena for open innovation and shows this through two cases from mobile telecommunication standardization. One case is the Android initiative by Google and the Open Handset Alliance, while the second case is the general standardization work of the Open Mobile Alliance. The paper shows how anticipatory standardization intentionally uses inbound and outbound streams of research and intellectual property to create new innovations. This is at the heart of the open innovation model. The standardization activities use both pooling of R&D and the distribution of freely available toolkits to create products and architectures that can be utilized by the participants and third parties to leverage their innovation. The paper shows that the technology being standardized needs to have a systemic nature to be part of an open innovation process.
On-line identification of fermentation processes for ethanol production.
Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C
2017-07-01
A strategy for monitoring fermentation processes, specifically the simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covers the development and use of a first-principles, semimechanistic and unstructured process model based on the major kinetic phenomena, along with mass and energy balances. The model is then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model through the estimation of corrective parameters for certain reaction rates using the most recent process measurements. The strategy makes use of standard laboratory measurements for sugar quantification and of in situ temperature and liquid-level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., the state variables and their dynamics, and key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides great support for decision-making and opens a new range of opportunities for industrial optimization.
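A minimal sketch of the on-line identification idea: re-estimating a corrective factor on a reaction rate from recent measurements against a simple reference model. The Monod-type model structure, parameter values and data are assumptions, not the paper's model:

```python
# Sketch: on-line re-identification of a corrective factor on a reaction rate,
# using recent measurements and a simple Monod-type reference model.
# Model structure and parameter values are assumptions, not the paper's model.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def fermentation(t, y, k_corr):
    S, P = y                                  # sugar, ethanol
    mu = k_corr * 0.5 * S / (0.2 + S)         # corrected Monod-type rate
    return [-mu, 0.45 * mu]                   # consumption, yield to product

def residuals(k_corr, t_meas, S_meas):
    sol = solve_ivp(fermentation, (t_meas[0], t_meas[-1]), [100.0, 0.0],
                    t_eval=t_meas, args=(k_corr[0],))
    return sol.y[0] - S_meas

# Recent (synthetic) sugar measurements; in practice, lab assay data.
t_meas = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
S_meas = np.array([100.0, 96.2, 92.6, 89.1, 85.8])

fit = least_squares(residuals, x0=[1.0], args=(t_meas, S_meas))
print("corrective factor:", fit.x[0].round(3))
```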
Bayesian sensitivity analysis of bifurcating nonlinear models
NASA Astrophysics Data System (ADS)
Becker, W.; Worden, K.; Rowson, J.
2013-01-01
Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially bifurcating models, which cannot be dealt with using a single GP, although how to manage bifurcation boundaries that are not parallel to coordinate axes remains an open problem.
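A minimal sketch of a one-split "treed" Gaussian process: separate GPs fitted on the two sides of a partition so that a discontinuity at the boundary can be captured. The toy response and the fixed split are illustrative assumptions, not the paper's Duffing or heart-valve models:

```python
# Sketch: a one-split "treed" Gaussian process. Separate GPs on the two sides
# of a partition capture a jump (bifurcation) at x = 0.5. Toy response only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (80, 1))
y = np.where(X[:, 0] < 0.5, np.sin(6 * X[:, 0]), 2 + np.cos(6 * X[:, 0]))
y += rng.normal(0, 0.05, 80)

split = 0.5                      # in practice chosen by the tree-fitting step
left, right = X[:, 0] < split, X[:, 0] >= split
gp_left = GaussianProcessRegressor(RBF(0.2), alpha=1e-2).fit(X[left], y[left])
gp_right = GaussianProcessRegressor(RBF(0.2), alpha=1e-2).fit(X[right], y[right])

def predict(x):
    """Route each query point to the GP of its region."""
    x = np.atleast_2d(x)
    out = np.empty(len(x))
    mask = x[:, 0] < split
    out[mask] = gp_left.predict(x[mask])
    out[~mask] = gp_right.predict(x[~mask])
    return out

print(predict([[0.49], [0.51]]).round(2))   # jump across the boundary
```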
Hospitals' strategies for orchestrating selection of physician preference items.
Montgomery, Kathleen; Schneller, Eugene S
2007-06-01
This article analyzes hospitals' strategies to shape physicians' behavior and counter suppliers' power in purchasing physician preference items. Two models of standardization are limitations on the range of manufacturers or products (the "formulary" model) and price ceilings for particular item categories (the "payment-cap" model), both requiring processes to define product equivalencies, often with inadequate product comparison data. The formulary model is more difficult to implement because of physicians' resistance to top-down dictates. The payment-cap model is more feasible because it preserves physicians' choice while also restraining manufacturers' power. Hospitals may influence physicians' involvement through a process of orchestration that includes committing to improve clinical facilities, scheduling, and training and fostering a culture of mutual trust and respect. PMID:17517118
Processing on weak electric signals by the autoregressive model
NASA Astrophysics Data System (ADS)
Ding, Jinli; Zhao, Jiayin; Wang, Lanzhou; Li, Qiao
2008-10-01
An autoregressive (AR) model of the weak electric signals in two plants was set up for the first time. The AR model forecasts 10 values of the weak electric signals well. This work will construct a standard set of AR model coefficients relating plant electric signals to environmental factors, which can be used as reference settings for an intelligent auto-control system based on the adaptive characteristics of plants, to achieve energy savings in agricultural production.
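A minimal sketch of fitting an AR model and producing a 10-step forecast, using statsmodels; the model order and the synthetic series are placeholders for the plants' measured signals:

```python
# Sketch: fitting an AR model to a (synthetic) weak electric signal and
# forecasting 10 future values, as described in the abstract. The lag order
# and synthetic data are placeholders for the measured plant signals.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(42)
# Synthetic stand-in for a measured plant electric signal.
signal = np.cumsum(rng.normal(0, 1e-3, 500))

model = AutoReg(signal, lags=5).fit()
forecast = model.forecast(steps=10)      # 10-step-ahead prediction
print(model.params.round(4))             # fitted AR coefficients
print(forecast.round(5))
```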