Sample records for three-valued abstraction refinement

  1. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    As the complexity of software systems continues to grow, engineers face two serious problems: state space explosion and the difficulty of debugging systems. In this paper, we propose a game-theoretic approach to full branching-time model checking over three-valued semantics. Three-valued models and logics provide an abstraction that overcomes the state space explosion problem. The game-style model checking generates counter-examples that can guide refinement or identify validated formulas, which addresses the debugging problem. Furthermore, the output of our game-style method gives engineers significant information for locating errors and identifying their causes.
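
    For readers unfamiliar with the underlying semantics, three-valued model checking evaluates formulas over a logic with a third, indefinite truth value; an indefinite verdict on the abstract model is exactly what signals that refinement is needed. The following minimal Python sketch of Kleene three-valued connectives is illustrative only and is not the authors' game-based algorithm.

```python
# Kleene three-valued logic, the basis of three-valued abstraction:
# TRUE/FALSE are definite verdicts that carry over to the concrete system;
# UNKNOWN means the abstraction is too coarse and refinement is needed.
TRUE, FALSE, UNKNOWN = 1, 0, None

def t_not(a):
    return None if a is None else 1 - a

def t_and(a, b):
    if a == FALSE or b == FALSE:
        return FALSE           # a definite 'false' dominates
    if a == TRUE and b == TRUE:
        return TRUE
    return UNKNOWN             # otherwise the verdict is indefinite

def t_or(a, b):
    return t_not(t_and(t_not(a), t_not(b)))

if __name__ == "__main__":
    verdict = t_and(TRUE, UNKNOWN)
    if verdict is UNKNOWN:
        print("indefinite verdict -> refine the abstraction")
    else:
        print("definite verdict:", bool(verdict))
```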

  2. Thinking Inside the Box: The Health Cube Paradigm for Health and Wellness Program Evaluation and Design

    PubMed Central

    Harris, Sharon

    2013-01-01

    Appropriately constructed health promotions can improve population health. The authors developed a practical model for designing, evaluating, and improving initiatives to provide optimal value. Three independent model dimensions (impact, engagement, and sustainability) and the resultant three-dimensional paradigm were described using hypothetical case studies, including a walking challenge, a health risk assessment survey, and an individual condition management program. The 3-dimensional model is illustrated and the dimensions are defined. Calculation of a 3-dimensional score for program comparisons, refinements, and measurement is explained. Program 1, the walking challenge, had high engagement and impact, but limited sustainability. Program 2, the health risk assessment survey, had high engagement and sustainability but limited impact. Program 3, the on-site condition management program, had measurable impact and sustainability but limited engagement, because of a lack of program capacity. Each initiative, though successful in 2 dimensions, lacked sufficient evolution along the third axis for optimal value. Calculation of a 3-dimensional score is useful for health promotion program development comparison and refinements, and overall measurement of program success. (Population Health Management 2013;16:291–295) PMID:23869538

  3. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.

  4. The magnetic structure of Co(NCNH)₂ as determined by (spin-polarized) neutron diffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, Philipp; Houben, Andreas; Senyshyn, Anatoliy

    The magnetic structure of Co(NCNH)₂ has been studied using neutron diffraction data collected below 10 K on the SPODI and DNS instruments at FRM II, Munich. There is an intensity change in the (1 1 0) and (0 2 0) reflections around 4 K, attributed to the onset of magnetic ordering of the Co²⁺ spins. Four different spin orientations have been evaluated on the basis of Rietveld refinements, comprising antiferromagnetic as well as ferromagnetic ordering along all three crystallographic axes. Both the residual values and supplementary susceptibility measurements evidence that only a ferromagnetic ordering with all Co²⁺ spins parallel to the c axis is a suitable description of the low-temperature magnetic ground state of Co(NCNH)₂. The deviation of the magnetic moment derived by the Rietveld refinement from the expected value may be explained either by incomplete saturation of the moment at temperatures slightly below the Curie temperature or by a small Jahn–Teller distortion. - Graphical abstract: The magnetic ground state of Co(NCNH)₂ has been clarified by (spin-polarized) neutron diffraction data at low temperatures. Intensity changes below 4 K arise from the onset of ferromagnetic ordering of the Co²⁺ spins parallel to the c axis, corroborated by various (magnetic) Rietveld refinements. Highlights: • Powdered Co(NCNH)₂ has been subjected to (spin-polarized) neutron diffraction. • Magnetic susceptibility data of Co(NCNH)₂ have been collected. • Below 4 K, the magnetic moments align ferromagnetically with all Co²⁺ spins parallel to the c axis. • The magnetic susceptibility data yield an effective magnetic moment of 4.68 and a Weiss constant of -13(2) K. • The ferromagnetic Rietveld refinement leads to a magnetic moment of 2.6, close to the expected value of 3.

  5. Convergence of an hp-Adaptive Finite Element Strategy in Two and Three Space-Dimensions

    NASA Astrophysics Data System (ADS)

    Bürg, Markus; Dörfler, Willy

    2010-09-01

    We show convergence of an automatic hp-adaptive refinement strategy for the finite element method on the elliptic boundary value problem. The strategy is a generalization of a refinement strategy proposed for one-dimensional situations to problems in two and three space-dimensions.

  6. Refining a taxonomy for guideline implementation: results of an exercise in abstract classification

    PubMed Central

    2013-01-01

    Background To better understand the efficacy of various implementation strategies, improved methods for describing and classifying the nature of these strategies are urgently required. The aim of this study was to develop and pilot the feasibility of a taxonomy to classify the nature and content of implementation strategies. Methods A draft implementation taxonomy was developed based on the Cochrane Effective Practice and Organisation of Care (EPOC) data collection checklist. The draft taxonomy had four domains (professional, financial, organisational and regulatory) covering 49 distinct strategies. We piloted the draft taxonomy by using it to classify the implementation strategies described in the conference abstracts of the implementation stream of the 2010 Guideline International Network Conference. Five authors classified the strategies in each abstract individually. Final categorisation was then carried out in a face-to-face consensus meeting involving three authors. Results The implementation strategies described in 71 conference abstracts were classified. Approximately 15.5% of abstracts utilised strategies that could not be categorised using the draft taxonomy. Of those strategies that could be categorised, the majority were professionally focused (57%). A total of 41% of projects used only one implementation strategy, with 29% using two and 31% three or more. The three most commonly used strategies were changes in quality assurance, quality improvement and/or performance measurement systems, changes in information and communication technology, and distribution of guideline materials (via hard-copy, audio-visual and/or electronic means). Conclusions Further refinement of the draft taxonomy is required to provide hierarchical dimensions and granularity, particularly in the areas of patient-focused interventions, those concerned with audit and feedback and quality improvement, and electronic forms of implementation, including electronic decision support. PMID:23497520

  7. Refining a taxonomy for guideline implementation: results of an exercise in abstract classification.

    PubMed

    Mazza, Danielle; Bairstow, Phillip; Buchan, Heather; Chakraborty, Samantha Paubrey; Van Hecke, Oliver; Grech, Cathy; Kunnamo, Ilkka

    2013-03-15

    To better understand the efficacy of various implementation strategies, improved methods for describing and classifying the nature of these strategies are urgently required. The aim of this study was to develop and pilot the feasibility of a taxonomy to classify the nature and content of implementation strategies. A draft implementation taxonomy was developed based on the Cochrane Effective Practice and Organisation of Care (EPOC) data collection checklist. The draft taxonomy had four domains (professional, financial, organisational and regulatory) covering 49 distinct strategies. We piloted the draft taxonomy by using it to classify the implementation strategies described in the conference abstracts of the implementation stream of the 2010 Guideline International Network Conference. Five authors classified the strategies in each abstract individually. Final categorisation was then carried out in a face-to-face consensus meeting involving three authors. The implementation strategies described in 71 conference abstracts were classified. Approximately 15.5% of abstracts utilised strategies that could not be categorised using the draft taxonomy. Of those strategies that could be categorised, the majority were professionally focused (57%). A total of 41% of projects used only one implementation strategy, with 29% using two and 31% three or more. The three most commonly used strategies were changes in quality assurance, quality improvement and/or performance measurement systems, changes in information and communication technology, and distribution of guideline materials (via hard-copy, audio-visual and/or electronic means). Further refinement of the draft taxonomy is required to provide hierarchical dimensions and granularity, particularly in the areas of patient-focused interventions, those concerned with audit and feedback and quality improvement, and electronic forms of implementation, including electronic decision support.

  8. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction refinement in the context of hybrid automata.

  9. Grid-size dependence of Cauchy boundary conditions used to simulate stream-aquifer interactions

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2010-01-01

    This work examines the simulation of stream–aquifer interactions as grids are refined vertically and horizontally and suggests that traditional methods for calculating conductance can produce inappropriate values when the grid size is changed. Instead, different grid resolutions require different estimated values. Grid refinement strategies considered include global refinement of the entire model and local refinement of part of the stream. Three methods of calculating the conductance of the Cauchy boundary conditions are investigated. Single- and multi-layer models with narrow and wide streams produced stream leakages that differ by as much as 122% as the grid is refined. Similar results occur for globally and locally refined grids, but the latter required as little as one-quarter the computer execution time and memory and thus are useful for addressing some scale issues of stream–aquifer interactions. Results suggest that existing grid-size criteria for simulating stream–aquifer interactions are useful for one-layer models, but inadequate for three-dimensional models. The grid dependence of the conductance terms suggests that values for refined models using, for example, finite difference or finite-element methods, cannot be determined from previous coarse-grid models or field measurements. Our examples demonstrate the need for a method of obtaining conductances that can be translated to different grid resolutions and provide definitive test cases for investigating alternative conductance formulations.
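
    For orientation, a widely used streambed-conductance convention for head-dependent flux (Cauchy) boundaries is shown below, with K the streambed hydraulic conductivity, L and W the stream length and width within a model cell, and M the streambed thickness; whether this is one of the three formulations compared in the study is not stated here, and the paper's point is precisely that such values do not transfer directly between grid resolutions.

```latex
% Conventional streambed conductance (illustrative form only):
% the boundary flux is Q = C\,(h_{stream} - h_{aquifer}), with
C = \frac{K \, L \, W}{M}
```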

  10. Concrete Model Checking with Abstract Matching and Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem

    2005-01-01

    We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
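
    A toy Python sketch of the "execute concretely, match abstractly" idea follows; it omits the theorem-prover precision checks and predicate generation described above, and the example system and predicates are hypothetical.

```python
from collections import deque

def abstract(state, predicates):
    """Abstract version of a concrete state: the tuple of predicate truth values."""
    return tuple(bool(p(state)) for p in predicates)

def search(initial, successors, is_error, predicates):
    """Execute concrete transitions, but store (match on) abstract states only."""
    seen = {abstract(initial, predicates)}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_error(state):
            return state              # feasible by construction: no false alarms
        for nxt in successors(state):
            a = abstract(nxt, predicates)
            if a not in seen:         # prune states that match a stored abstraction
                seen.add(a)
                queue.append(nxt)
    return None

if __name__ == "__main__":
    # Toy system: a counter modulo 7; the predicates are hypothetical and chosen
    # by hand (in the method above, new predicates come from refinement).
    preds = [lambda s: s % 2 == 0, lambda s: s > 3]
    result = search(0, lambda s: [(s + 1) % 7], lambda s: s == 5, preds)
    # With these coarse predicates the reachable error state 5 is missed
    # (result is None); detecting that precision loss is what triggers refinement.
    print(result)
```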

  11. An adaptive embedded mesh procedure for leading-edge vortex flows

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.; Beer, Michael A.; Law, Glenn W.

    1989-01-01

    A procedure for solving the conical Euler equations on an adaptively refined mesh is presented, along with a method for determining which cells to refine. The solution procedure is a central-difference cell-vertex scheme. The adaptation procedure is made up of a parameter on which the refinement decision is based, and a method for choosing a threshold value of the parameter. The refinement parameter is a measure of mesh-convergence, constructed by comparison of locally coarse- and fine-grid solutions. The threshold for the refinement parameter is based on the curvature of the curve relating the number of cells flagged for refinement to the value of the refinement threshold. Results for three test cases are presented. The test problem is that of a delta wing at angle of attack in a supersonic free-stream. The resulting vortices and shocks are captured efficiently by the adaptive code.
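
    The threshold-selection idea can be sketched as follows (a rough illustration only; the actual refinement parameter is built from comparisons of locally coarse- and fine-grid solutions, which are replaced here by synthetic per-cell values): sweep candidate thresholds, record how many cells each would flag, and choose the threshold where that curve bends most sharply.

```python
import numpy as np

def choose_threshold(cell_parameter, n_candidates=50):
    """Pick a refinement threshold at the point of maximum curvature of the
    'number of flagged cells vs. threshold' curve (toy version of the idea)."""
    thresholds = np.linspace(cell_parameter.min(), cell_parameter.max(), n_candidates)
    n_flagged = np.array([(cell_parameter > t).sum() for t in thresholds], dtype=float)
    # discrete curvature of the curve n_flagged(threshold)
    d1 = np.gradient(n_flagged, thresholds)
    d2 = np.gradient(d1, thresholds)
    curvature = np.abs(d2) / (1.0 + d1 ** 2) ** 1.5
    return thresholds[np.argmax(curvature)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # hypothetical per-cell refinement parameter: mostly smooth flow plus a few
    # cells with large values (e.g., near vortices or shocks)
    param = np.concatenate([rng.exponential(0.01, 5000), rng.exponential(0.5, 200)])
    tau = choose_threshold(param)
    print(f"threshold={tau:.3f}, cells flagged={np.sum(param > tau)}")
```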

  12. An object-oriented approach for parallel self adaptive mesh refinement on block structured grids

    NASA Technical Reports Server (NTRS)

    Lemke, Max; Witsch, Kristian; Quinlan, Daniel

    1993-01-01

    Self-adaptive mesh refinement dynamically matches the computational demands of a solver for partial differential equations to the activity in the application's domain. In this paper we present two C++ class libraries, P++ and AMR++, which significantly simplify the development of sophisticated adaptive mesh refinement codes on (massively) parallel distributed memory architectures. The development is based on our previous research in this area. The C++ class libraries provide abstractions to separate the issues of developing parallel adaptive mesh refinement applications into those of parallelism, abstracted by P++, and adaptive mesh refinement, abstracted by AMR++. P++ is a parallel array class library to permit efficient development of architecture independent codes for structured grid applications, and AMR++ provides support for self-adaptive mesh refinement on block-structured grids of rectangular non-overlapping blocks. Using these libraries, the application programmers' work is greatly simplified to primarily specifying the serial single grid application and obtaining the parallel and self-adaptive mesh refinement code with minimal effort. Initial results for simple singular perturbation problems solved by self-adaptive multilevel techniques (FAC, AFAC), being implemented on the basis of prototypes of the P++/AMR++ environment, are presented. Singular perturbation problems frequently arise in large applications, e.g. in the area of computational fluid dynamics. They usually have solutions with layers which require adaptive mesh refinement and fast basic solvers in order to be resolved efficiently.

  13. Characterization and Evaluation of Re-Refined Engine Lubricating Oil.

    DTIC Science & Technology

    1981-12-01

    performance of re-refined and virgin oils and to investigate the potential substantial equivalence of re-refined and virgin lubricating oils. ... (1) engine deposits derived from virgin and re-refined engine oils, (2) the effects of virgin and re-refined oils on engine blowby composition and engine deposit generation, determined using a spark ignition engine, and (3) virgin and re-refined basestock production

  14. A Mathematical Model for Railway Control Systems

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.

    1996-01-01

    We present a general method for modeling safety aspects of railway control systems. Using our modeling method, one can progressively refine an abstract railway safety model, sucessively adding layers of detail about how a real system actually operates, while maintaining a safety property that refines the original abstract safety property. This method supports a top-down approach to specification of railway control systems and to proof of a variety of safety-related properties. We demonstrate our method by proving safety of the classical block control system.

  15. Separation of Cs and Sr from LiCl-KCl eutectic salt via a zone-refining process for pyroprocessing waste salt minimization

    NASA Astrophysics Data System (ADS)

    Shim, Moonsoo; Choi, Ho-Gil; Choi, Jeong-Hun; Yi, Kyung-Woo; Lee, Jong-Hyeon

    2017-08-01

    The purification of a LiCl-KCl salt mixture was carried out by a zone-refining process. To improve the throughput of zone refining, three heaters were installed in the zone refiner. The zone-refining method was used to grow pure LiCl-KCl salt ingots from a LiCl-KCl-CsCl-SrCl2 salt mixture. The main investigated parameters were the heater speed and the number of passes. From each zone-refined salt ingot, samples were collected axially along the salt ingot and the concentrations of Sr and Cs were determined. Experimental results show that the Sr and Cs concentrations at the initial region of the ingot were low and increased to a maximum at the final freezing region of the salt ingot. Concentration results of the zone-refined salt were compared with theoretical results furnished by the proposed model to validate its predictions. The keff values for Sr and Cs were 0.55 and 0.47, respectively. The correlation between the salt composition and separation behavior was also investigated. The keff values of the Sr in LiCl-KCl-SrCl2 and the Cs in LiCl-KCl-CsCl were found to be 0.53 and 0.44, respectively, by fitting the experimental data into the proposed model.
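
    For reference, single-pass zone-refining models of this kind are commonly built on the classical Pfann relation below, where C(x) is the solute concentration at position x after one pass of a molten zone of length l over a charge of uniform initial concentration C0; it is an assumption here that the paper's proposed model takes exactly this form.

```latex
% Single-pass zone refining (Pfann), valid away from the final zone length:
\frac{C(x)}{C_0} = 1 - \left(1 - k_{\mathrm{eff}}\right) e^{-k_{\mathrm{eff}}\, x / l}
```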

  16. A tribal abstraction network for SNOMED CT target hierarchies without attribute relationships.

    PubMed

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Chen, Yan; Agrawal, Ankur; Case, James T; Hripcsak, George

    2015-05-01

    Large and complex terminologies, such as Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT), are prone to errors and inconsistencies. Abstraction networks are compact summarizations of the content and structure of a terminology. Abstraction networks have been shown to support terminology quality assurance. In this paper, we introduce an abstraction network derivation methodology which can be applied to SNOMED CT target hierarchies whose classes are defined using only hierarchical relationships (i.e., without attribute relationships) and similar description-logic-based terminologies. We introduce the tribal abstraction network (TAN), based on the notion of a tribe: a subhierarchy rooted at a child of a hierarchy root, assuming only the existence of concepts with multiple parents. The TAN summarizes a hierarchy that does not have attribute relationships using sets of concepts, called tribal units, whose members belong to exactly the same multiple tribes. Tribal units are further divided into refined tribal units, which contain closely related concepts. A quality assurance methodology that utilizes TAN summarizations is introduced. A TAN is derived for the Observable entity hierarchy of SNOMED CT, summarizing its content. A TAN-based quality assurance review of the concepts of the hierarchy is performed, and erroneous concepts are shown to appear more frequently in large refined tribal units than in small refined tribal units. Furthermore, more erroneous concepts appear in large refined tribal units spanning more tribes than in those spanning fewer tribes. In this paper we introduce the TAN for summarizing SNOMED CT target hierarchies. A TAN was derived for the Observable entity hierarchy of SNOMED CT. A quality assurance methodology utilizing the TAN was introduced and demonstrated. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  17. Characterization of ultra-fine grained aluminum produced by accumulative back extrusion (ABE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alihosseini, H.; Faraji, G.

    2012-06-15

    In the present work, the microstructural evolution and microhardness of AA1050 subjected to one, two, and three passes of accumulative back extrusion (ABE) were investigated. The microstructural evolution was characterized using transmission electron microscopy. The results revealed that applying three passes of accumulative back extrusion led to significant grain refinement: the initial grain size of 47 μm was refined to grains of 500 nm after three passes of ABE. Increasing the number of passes resulted in a further decrease in grain size, better microstructural homogeneity, and an increase in microhardness. The cross-section of the ABEed specimen consisted of two different zones: (i) a shear deformation zone and (ii) a normal deformation zone. The microhardness measurements indicated that the hardness increased from the initial value of 31 Hv to 67 Hv, verifying the significant microstructural refinement via accumulative back extrusion. - Highlights: • A significant grain refinement can be achieved in the AA1050 Al alloy by applying ABE. • Microstructural homogeneity of ABEed samples increased with the number of ABE cycles. • A substantial increase in hardness, from 31 Hv to 67 Hv, was recorded.

  18. Developing stressor-watershed function relationships to refine the national maps of watershed integrity

    EPA Science Inventory

    Abstract ESA 2017. Developing stressor-watershed function relationships to refine the national maps of watershed integrity. Johnson, Z.C., S.G. Leibowitz, and R.A. Hill. To be submitted to the Ecological Society of America Annual Meeting, Portland, OR, August 2017. Human-induced ecolo...

  19. Dual-level direct dynamics studies for the hydrogen abstraction reaction of 1,1-difluoroethane with O(³P)

    NASA Astrophysics Data System (ADS)

    Liu, Jing-yao; Li, Ze-sheng; Dai, Zhen-wen; Zhang, Gang; Sun, Chia-chung

    2004-01-01

    We present dual-level direct dynamics calculations for the CH₃CHF₂ + O(³P) hydrogen abstraction reaction over a wide temperature range, based on canonical variational transition-state theory including small-curvature tunneling corrections. For this reaction, three distinct transition states, one for α-abstraction and two for β-abstraction, have been located. The potential energy surface information is obtained at the MP2(full)/6-311G(d,p) level of theory, and higher-level single-point calculations for the stationary points are performed at several levels, namely QCISD(T)/6-311+G(3df,3pd), G2, and G3 using the MP2 geometries, as well as at the G3//MP4SDQ/6-311G(d,p) level. The energy profiles are further refined with the interpolated single-point energies method at the G3//MP2(full)/6-311G(d,p) level. The total rate constants match the experimental data reasonably well in the measured temperature range 1110-1340 K. It is shown that at low temperature α-abstraction may be the major reaction channel, while β-abstraction contributes more to the overall reaction rate as the temperature increases.
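
    For context, conventional transition-state theory gives the bimolecular rate constant below, with κ(T) the tunneling transmission coefficient, Q the partition functions, and V‡ the barrier height; canonical variational TST, as used in the study, additionally varies the dividing surface to minimize this rate, and the specific levels of theory quoted above supply the energetics. This is a textbook form, not an equation reproduced from the paper.

```latex
% Conventional TST rate constant for A + B -> products (illustrative form):
k(T) = \kappa(T)\, \frac{k_B T}{h}\,
       \frac{Q^{\ddagger}(T)}{Q_A(T)\, Q_B(T)}\,
       \exp\!\left(-\frac{V^{\ddagger}}{k_B T}\right)
```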

  20. 75 FR 5854 - Proposed Collection; Comment Request for NOT-141440-08

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ...- 08, Production Tax Credit for Refined Coal. DATES: Written comments should be received on or before...: Production Tax Credit for Refined Coal. OMB Number: 1545-2158. Notice Number: NOT-141440-08. Abstract: This notice sets forth interim guidance pending the issuance of regulations relating to the tax credit under...

  1. Cerebellar input configuration toward object model abstraction in manipulation tasks.

    PubMed

    Luque, Niceto R; Garrido, Jesus A; Carrillo, Richard R; Coenen, Olivier J-M D; Ros, Eduardo

    2011-08-01

    It is widely assumed that the cerebellum is one of the main nervous centers involved in correcting and refining planned movement and accounting for disturbances occurring during movement, for instance, due to the manipulation of objects which affect the kinematics and dynamics of the robot-arm plant model. In this brief, we evaluate a way in which a cerebellar-like structure can store a model in the granular and molecular layers. Furthermore, we study how its microstructure and input representations (context labels and sensorimotor signals) can efficiently support model abstraction toward delivering accurate corrective torque values for increasing precision during different-object manipulation. We also describe how the explicit (object-related input labels) and implicit state input representations (sensorimotor signals) complement each other to better handle different models and allow interpolation between two already stored models. This facilitates accurate corrections during manipulations of new objects taking advantage of already stored models.

  2. Scientific knowledge and modern prospecting

    USGS Publications Warehouse

    Neuerburg, G.J.

    1985-01-01

    Modern prospecting is the systematic search for specified and generally ill-exposed components of the Earth's crust known as ore. This prospecting depends entirely on reliable, or scientific, knowledge for guidance and for recognition of the search objects. Improvement in prospecting results from additions and refinements to scientific knowledge. Scientific knowledge is an ordered distillation of observations too numerous and too complex in themselves for easy understanding and for effective management. The ordering of these observations is accomplished by an evolutionary hierarchy of abstractions. These abstractions employ simplified descriptions consisting of characterization by selected properties, sampling to represent much larger parts of a phenomenon, generalized mappings of patterns of geometrical and numerical relations among properties, and explanation (theory) of these patterns as functional relations among the selected properties. Each abstraction is predicated on the mode of abstraction anticipated for the next higher level, so that research is a deductive process in which the highest level, theory, is indispensable for the growth and refinement of scientific knowledge, and therefore of prospecting methodology. © 1985 Springer-Verlag.

  3. The role of social message using norm abstraction level and ecological value orientation to achieve sustainable consumption

    NASA Astrophysics Data System (ADS)

    Ekasari, A.

    2018-01-01

    Pro-environmental behavior is one of the human activities needed to achieve sustainability. Encouraging people to behave this way requires contributions from the marketing discipline in the form of social messages. This research investigates the effect of social messages framed by norm abstraction level and ecological value orientation on attitudes and intentions toward pro-environmental behavior in the context of littering. The study implemented a 3 (message framing: biospheric/altruistic/egoistic) × 2 (norm abstraction level: abstract/concrete) between-subjects experimental design to collect the data. An independent-samples t test was used to analyze the data. The results indicate that a social message using a concrete norm combined with any of the three ecological value orientations gains a more positive response than an abstract norm combined with the same ecological value orientations. The findings are expected to help government and other institutions create appropriate social messages for anti-littering campaigns and motivate people to change their behavior toward sustainable consumption.

  4. GRID REFINEMENT TESTS OF A LEAST-SQUARES FINITE ELEMENT METHOD FOR ADVECTIVE TRANSPORT OF REACTIVE SPECIES. (R825200)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  5. Value-Based Requirements Traceability: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Egyed, Alexander; Grünbacher, Paul; Heindl, Matthias; Biffl, Stefan

    Traceability from requirements to code is mandated by numerous software development standards. These standards, however, are not explicit about the appropriate level of quality of trace links. From a technical perspective, trace quality should meet the needs of the intended trace utilizations. Unfortunately, long-term trace utilizations are typically unknown at the time of trace acquisition which represents a dilemma for many companies. This chapter suggests ways to balance the cost and benefits of requirements traceability. We present data from three case studies demonstrating that trace acquisition requires broad coverage but can tolerate imprecision. With this trade-off our lessons learned suggest a traceability strategy that (1) provides trace links more quickly, (2) refines trace links according to user-defined value considerations, and (3) supports the later refinement of trace links in case the initial value consideration has changed over time. The scope of our work considers the entire life cycle of traceability instead of just the creation of trace links.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leimkuhler, B.; Hermans, J.; Skeel, R.D.

    A workshop was held on algorithms and parallel implementations for macromolecular dynamics, protein folding, and structural refinement. This document contains abstracts and brief reports from that workshop.

  7. Evaluating lubricating capacity of vegetal oils using Abbott-Firestone curve

    NASA Astrophysics Data System (ADS)

    Georgescu, C.; Cristea, G. C.; Dima, C.; Deleanu, L.

    2017-02-01

    The paper presents the changes in functional parameters defined on the Abbott-Firestone curve in order to evaluate the surface quality of the balls from the four-ball tester after tests with several vegetable oils. The tests were done using two grades of rapeseed oil (degummed and refined), two grades of soybean oil (coarse and degummed), and a common transmission oil (T90). Test parameters were 200 N and 0.576 m/s (1500 rpm) for 60 minutes. For the refined rapeseed oil, the changes in the shape of the Abbott-Firestone curves are more dramatic, being characterized by high values of Spk (averaged over the wear scars on the three balls), amounting to 40% of the sum Svk + Sk + Spk; a similar percentage was obtained for the soybean oil, although with a lower Spk value. For the degummed soybean oil, the profile heights of the wear scars are greater than those obtained after testing the coarse soybean oil, meaning that the degumming process has a negative influence on the worn surface quality and the lubricating capacity of this oil. Comparing the surface quality of the wear scars on the fixed tested balls is a reliable way to assess the lubricating properties of the vegetable oils, especially when they are compared to a “classical” lubricant such as the non-additivated transmission mineral oil T90. The best surface after testing was obtained for the soybean oil, followed by the T90 oil and the degummed grades of soybean and rapeseed oil (these three giving very close values for the functional parameters), whereas the refined rapeseed oil generated the poorest quality of the wear scars on the balls under the same testing conditions.

  8. Agents of Change in Promoting Reflective Abstraction: A Quasi-Experimental, Study on Limits in College Calculus

    ERIC Educational Resources Information Center

    Cappetta, Robert W.; Zollman, Alan

    2013-01-01

    We measured student performance on the concept of limit by promoting reflection through four agents of change: instructor, peer, curriculum and individual. It is based on Piaget's four constructs of reflective abstraction: interiorization, coordination, encapsulation, and generalization, and includes the notion of reversal, as refined into a…

  9. Reasoning abstractly about resources

    NASA Technical Reports Server (NTRS)

    Clement, B.; Barrett, A.

    2001-01-01

    This paper describes a way to schedule high-level activities before distributing them across multiple rovers in order to coordinate the resultant use of shared resources, regardless of how each rover decides to perform its activities. We present an algorithm for summarizing the metric resource requirements of an abstract activity based on the resource usages of its potential refinements.
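
    A minimal Python sketch of the summarization idea follows (the data representation and the activity names are hypothetical; the actual algorithm handles timed metric resource profiles in more detail): an abstract activity's requirement is summarized by bounds that hold no matter which refinement a rover later chooses.

```python
def summarize(refinements):
    """Summarize the metric resource needs of an abstract activity.

    Each refinement is a dict mapping resource name -> units consumed.
    The summary gives, per resource, the range (min, max) over all possible
    refinements, so a coordinator can reason about the abstract activity
    without knowing which refinement a rover will pick.
    """
    resources = {r for ref in refinements for r in ref}
    return {r: (min(ref.get(r, 0) for ref in refinements),
                max(ref.get(r, 0) for ref in refinements))
            for r in resources}

if __name__ == "__main__":
    # hypothetical refinements of an abstract "sample rock" activity
    drive_and_drill = {"power_Wh": 40, "bandwidth_MB": 5}
    image_only      = {"power_Wh": 12, "bandwidth_MB": 20}
    print(summarize([drive_and_drill, image_only]))
    # e.g. {'power_Wh': (12, 40), 'bandwidth_MB': (5, 20)} (key order may vary)
```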

  10. A locally refined rectangular grid finite element method - Application to computational fluid dynamics and computational physics

    NASA Technical Reports Server (NTRS)

    Young, David P.; Melvin, Robin G.; Bieterman, Michael B.; Johnson, Forrester T.; Samant, Satish S.

    1991-01-01

    The present FEM technique addresses both linear and nonlinear boundary value problems encountered in computational physics by handling general three-dimensional regions, boundary conditions, and material properties. The box finite elements used are defined by a Cartesian grid independent of the boundary definition, and local refinements proceed by dividing a given box element into eight subelements. Discretization employs trilinear approximations on the box elements; special element stiffness matrices are included for boxes cut by any boundary surface. Illustrative results are presented for representative aerodynamics problems involving up to 400,000 elements.

  11. Comparative study of the physicochemical, nutritional, and antioxidant properties of some commercial refined and non-centrifugal sugars.

    PubMed

    Lee, Jong Suk; Ramalingam, Srinivasan; Jo, Il Guk; Kwon, Ye Som; Bahuguna, Ashutosh; Oh, Young Sook; Kwon, O-Jun; Kim, Myunghee

    2018-07-01

    Three refined and four unrefined branded commercial sugars available in Korea were investigated in terms of pH, soluble solids, moisture, ash content, turbidity, color values, microbial profile, reducing power, 2,2-diphenyl-1-picrylhydrazyl and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) radical scavenging activities, cellular antioxidant activity, and total phytoconstituent (i.e. phenolic, flavonoid, mineral, sucrose, glucose, and fructose) contents using standard analytical protocols such as high-performance liquid chromatography, gas chromatography-flame ionization detector/mass spectrometry, and inductively coupled plasma atomic emission spectroscopy. All tested physicochemical parameters were within the recommended standard levels. Significantly high nutritional and antioxidant properties were observed for the unrefined sugars, especially AUNO® sugar, whereas a high sucrose content was detected for the refined sugars. Hence, this study revealed that the degree of purification affects the nutritional values and antioxidant potentials of sugars. The present findings also indicate that unrefined sugars can be used as sweeteners in sugar-based cuisine to obtain nutritional and antioxidant-rich foodstuff. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Comparison of lab, pilot, and industrial scale low consistency mechanical refining for improvements in enzymatic digestibility of pretreated hardwood.

    PubMed

    Jones, Brandon W; Venditti, Richard; Park, Sunkyu; Jameel, Hasan

    2014-09-01

    Mechanical refining has been shown to improve biomass enzymatic digestibility. In this study industrial high-yield sodium carbonate hardwood pulp was subjected to lab, pilot and industrial refining to determine if the mechanical refining improves the enzymatic hydrolysis sugar conversion efficiency differently at different refining scales. Lab, pilot and industrial refining increased the biomass digestibility for lignocellulosic biomass relative to the unrefined material. The sugar conversion was increased from 36% to 65% at 5 FPU/g of biomass with industrial refining at 67.0 kWh/t, which was more energy efficient than lab and pilot scale refining. There is a maximum in the sugar conversion with respect to the amount of refining energy. Water retention value is a good predictor of improvements in sugar conversion for a given fiber source and composition. Improvements in biomass digestibility with refining due to lab, pilot plant and industrial refining were similar with respect to water retention value. Published by Elsevier Ltd.

  13. Mesh Generation via Local Bisection Refinement of Triangulated Grids

    DTIC Science & Technology

    2015-06-01

    DSTO–TR–3095. This report provides a comprehensive implementation of an unstructured mesh generation method ... their behaviour is critically linked to Maubach's method and the data structures N and T. The top-level mesh refinement algorithm is also presented.

  14. Dinosaurs can fly -- High performance refining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treat, J.E.

    1995-09-01

    High performance refining requires that one develop a winning strategy based on a clear understanding of one's position in one's company's value chain; one's competitive position in the product markets one serves; and the most likely drivers and direction of future market forces. The author discussed all three points, then described how to measure the company's performance. Becoming a true high performance refiner often involves redesigning the organization as well as the business processes, and the author discusses such redesign. The paper summarizes ten rules to follow to achieve high performance: listen to the market; optimize; organize around asset or area teams; trust the operators; stay flexible; source strategically; all maintenance is not equal; energy is not free; build project discipline; and measure and reward performance. The paper then discusses the constraints to the implementation of change.

  15. Development of the Contact Lens User Experience: CLUE Scales

    PubMed Central

    Wirth, R. J.; Edwards, Michael C.; Henderson, Michael; Henderson, Terri; Olivares, Giovanna; Houts, Carrie R.

    2016-01-01

    Purpose The field of optometry has become increasingly interested in patient-reported outcomes, reflecting a common trend occurring across the spectrum of healthcare. This article reviews the development of the Contact Lens User Experience: CLUE system designed to assess patient evaluations of contact lenses. CLUE was built using modern psychometric methods such as factor analysis and item response theory. Methods The qualitative process through which relevant domains were identified is outlined, as well as the process of creating initial item banks. Psychometric analyses were conducted on the initial item banks and refinements were made to the domains and items. Following this data-driven refinement phase, a second round of data was collected to further refine the items and obtain final item response theory item parameter estimates. Results Extensive qualitative work identified three key areas patients consider important when describing their experience with contact lenses. Based on item content and psychometric dimensionality assessments, the developing CLUE instruments were ultimately focused around four domains: comfort, vision, handling, and packaging. Item response theory parameters were estimated for the CLUE item banks (377 items), and the resulting scales were found to provide precise and reliable assignment of scores detailing users’ subjective experiences with contact lenses. Conclusions The CLUE family of instruments, as it currently exists, exhibits excellent psychometric properties. PMID:27383257
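
    The abstract does not state which item response theory model was fitted; for dichotomous items a common choice is the two-parameter logistic model shown below (polytomous CLUE items would more likely use a graded-response generalization), where θ is the latent trait and a_i, b_i are the item discrimination and difficulty parameters. This is offered only as background notation, not as the CLUE specification.

```latex
% Two-parameter logistic IRT model (illustrative; the CLUE scales may use a
% graded-response generalization for polytomous items):
P_i(\theta) = \frac{1}{1 + \exp\!\big(-a_i(\theta - b_i)\big)}
```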

  16. [French norms of imagery for pictures, for concrete and abstract words].

    PubMed

    Robin, Frédérique

    2006-09-01

    This paper deals with French norms for mental image versus picture agreement for 138 pictures and the imagery value for 138 concrete words and 69 abstract words. The pictures were selected from Snodgrass et Vanderwart's norms (1980). The concrete words correspond to the dominant naming response to the pictorial stimuli. The abstract words were taken from verbal associative norms published by Ferrand (2001). The norms were established according to two variables: 1) mental image vs. picture agreement, and 2) imagery value of words. Three other variables were controlled: 1) picture naming agreement; 2) familiarity of objects referred to in the pictures and the concrete words, and 3) subjective verbal frequency of words. The originality of this work is to provide French imagery norms for the three kinds of stimuli usually compared in research on dual coding. Moreover, these studies focus on figurative and verbal stimuli variations in visual imagery processes.

  17. Designing effective animations for computer science instruction

    NASA Astrophysics Data System (ADS)

    Grillmeyer, Oliver

    This study investigated the potential for animations of Scheme functions to help novice computer science students understand difficult programming concepts. These animations used an instructional framework inspired by theories of constructivism and knowledge integration. The framework had students make predictions, reflect, and specify examples to animate to promote autonomous learning and result in more integrated knowledge. The framework used animated pivotal cases to help integrate disconnected ideas and restructure students' incomplete ideas by illustrating weaknesses in their existing models. The animations scaffolded learners, making the thought processes of experts more visible by modeling complex and tacit information. The animation design was guided by prior research and a methodology of design and refinement. Analysis of pilot studies led to the development of four design concerns to aid animation designers: clearly illustrate the mapping between objects in animations with the actual objects they represent, show causal connections between elements, draw attention to the salient features of the modeled system, and create animations that reduce complexity. Refined animations based on these design concerns were compared to computer-based tools, text-based instruction, and simpler animations that do not embody the design concerns. Four studies comprised this dissertation work. Two sets of animated presentations of list creation functions were compared to control groups. No significant differences were found in support of animations. Three different animated models of traces of recursive functions ranging from concrete to abstract representations were compared. No differences in learning gains were found between the three models in test performance. Three models of animations of applicative operators were compared with students using the replacement modeler and the Scheme interpreter. Significant differences were found favoring animations that addressed causality and salience in their design. Lastly, two binary tree search algorithm animations designed to reduce complexity were compared with hand-tracing of calls. Students made fewer mistakes in predicting the tree traversal when guided by the animations. However, the posttest findings were inconsistent. In summary, animations designed based on the design concerns did not consistently add value to instruction in the form investigated in this research.

  18. Improved parallel image reconstruction using feature refinement.

    PubMed

    Cheng, Jing; Jia, Sen; Ying, Leslie; Liu, Yuanyuan; Wang, Shanshan; Zhu, Yanjie; Li, Ye; Zou, Chao; Liu, Xin; Liang, Dong

    2018-07-01

    The aim of this study was to develop a novel feature refinement MR reconstruction method for highly undersampled multichannel acquisitions that improves image quality and preserves more detailed information. The feature refinement technique, which uses a feature descriptor to pick up useful features from the residual image discarded by sparsity constraints, is applied to preserve the details of the image in compressed sensing and parallel imaging in MRI (CS-pMRI). A texture descriptor and a structure descriptor, recognizing different types of features, are combined to form the feature descriptor. Feasibility of the feature refinement was validated using three different multicoil reconstruction methods on in vivo data. Experimental results show that reconstruction methods with feature refinement improve the quality of the reconstructed image and restore the image details more accurately than the original methods, which is also verified by the lower values of the root mean square error and high frequency error norm. A simple and effective way to preserve more useful detailed information in CS-pMRI is proposed. This technique can effectively improve the reconstruction quality and has superior performance in terms of detail preservation compared with the original version without feature refinement. Magn Reson Med 80:211-223, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
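
    A highly simplified numpy sketch of the general idea follows; the local-variance "texture descriptor", the threshold, and the random inputs are all stand-ins, and the paper's actual descriptors, sparsity solver, and multichannel handling are not reproduced here.

```python
import numpy as np

def local_variance(img, win=5):
    """Toy texture descriptor: local variance over a small window (an assumption,
    not the descriptor defined in the paper)."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

def feature_refine(sparse_recon, residual, threshold=0.01):
    """Add back residual content where the descriptor flags 'feature, not noise':
    the feature-refinement step in very rough outline."""
    descriptor = local_variance(residual)
    mask = descriptor > threshold * descriptor.max()
    return sparse_recon + mask * residual

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    recon = rng.random((64, 64))           # stand-in for a CS-pMRI reconstruction
    residual = rng.random((64, 64)) * 0.1  # stand-in for the discarded residual image
    print(feature_refine(recon, residual).shape)
```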

  19. Reconstruing intolerance: abstract thinking reduces conservatives' prejudice against nonnormative groups.

    PubMed

    Luguri, Jamie B; Napier, Jaime L; Dovidio, John F

    2012-07-01

    Myrdal (1944) described the "American dilemma" as the conflict between abstract national values ("liberty and justice for all") and more concrete, everyday prejudices. We leveraged construal-level theory to empirically test Myrdal's proposition that construal level (abstract vs. concrete) can influence prejudice. We measured individual differences in construal level (Study 1) and manipulated construal level (Studies 2 and 3); across these three studies, we found that adopting an abstract mind-set heightened conservatives' tolerance for groups that are perceived as deviating from Judeo-Christian values (gay men, lesbians, Muslims, and atheists). Among participants who adopted a concrete mind-set, conservatives were less tolerant of these nonnormative groups than liberals were, but political orientation did not have a reliable effect on tolerance among participants who adopted an abstract mind-set. Attitudes toward racial out-groups and dominant groups (e.g., Whites, Christians) were unaffected by construal level. In Study 3, we found that the effect of abstract thinking on prejudice was mediated by an increase in concerns about fairness.

  20. A Systematic Review and Aggregated Analysis on the Impact of Amyloid PET Brain Imaging on the Diagnosis, Diagnostic Confidence, and Management of Patients being Evaluated for Alzheimer's Disease.

    PubMed

    Fantoni, Enrico R; Chalkidou, Anastasia; O' Brien, John T; Farrar, Gill; Hammers, Alexander

    2018-01-01

    Amyloid PET (aPET) imaging could improve patient outcomes in clinical practice, but the extent of impact needs quantification. To provide an aggregated quantitative analysis of the value added by aPET in cognitively impaired subjects. Systematic literature searches were performed in Embase and Medline until January 2017. 1,531 cases over 12 studies were included (1,142 cases over seven studies in the primary analysis where aPET was the key biomarker; the remaining cases included as defined groups in the secondary analysis). Data was abstracted by consensus among two observers and assessed for bias. Clinical utility was measured by diagnostic change, diagnostic confidence, and patient management before and after aPET. Three groups were further analyzed: control patients for whom feedback of aPET scan results was delayed; aPET Appropriate Use Criteria (AUC+) cases; and patients undergoing additional FDG/CSF testing. For 1,142 cases with only aPET, 31.3% of diagnoses were revised, whereas 3.2% of diagnoses changed in the delayed aPET control group (p < 0.0001). Increased diagnostic confidence following aPET was found for 62.1% of 870 patients. Management changes with aPET were found in 72.2% of 740 cases and in 55.5% of 299 cases in the control group (p < 0.0001). The diagnostic value of aPET in AUC+ patients or when FDG/CSF were additionally available did not substantially differ from the value of aPET alone in the wider population. Amyloid PET contributed to diagnostic revision in almost a third of cases and demonstrated value in increasing diagnostic confidence and refining management plans.

  1. A Systematic Review and Aggregated Analysis on the Impact of Amyloid PET Brain Imaging on the Diagnosis, Diagnostic Confidence, and Management of Patients being Evaluated for Alzheimer’s Disease

    PubMed Central

    Fantoni, Enrico R.; Chalkidou, Anastasia; O’ Brien, John T.; Farrar, Gill; Hammers, Alexander

    2018-01-01

    Background: Amyloid PET (aPET) imaging could improve patient outcomes in clinical practice, but the extent of impact needs quantification. Objective: To provide an aggregated quantitative analysis of the value added by aPET in cognitively impaired subjects. Methods: Systematic literature searches were performed in Embase and Medline until January 2017. 1,531 cases over 12 studies were included (1,142 cases over seven studies in the primary analysis where aPET was the key biomarker; the remaining cases included as defined groups in the secondary analysis). Data was abstracted by consensus among two observers and assessed for bias. Clinical utility was measured by diagnostic change, diagnostic confidence, and patient management before and after aPET. Three groups were further analyzed: control patients for whom feedback of aPET scan results was delayed; aPET Appropriate Use Criteria (AUC+) cases; and patients undergoing additional FDG/CSF testing. Results: For 1,142 cases with only aPET, 31.3% of diagnoses were revised, whereas 3.2% of diagnoses changed in the delayed aPET control group (p < 0.0001). Increased diagnostic confidence following aPET was found for 62.1% of 870 patients. Management changes with aPET were found in 72.2% of 740 cases and in 55.5% of 299 cases in the control group (p < 0.0001). The diagnostic value of aPET in AUC+ patients or when FDG/CSF were additionally available did not substantially differ from the value of aPET alone in the wider population. Conclusions: Amyloid PET contributed to diagnostic revision in almost a third of cases and demonstrated value in increasing diagnostic confidence and refining management plans. PMID:29689725

  2. Influence of Trabecular Bone on Peri-Implant Stress and Strain Based on Micro-CT Finite Element Modeling of Beagle Dog

    PubMed Central

    Liao, Sheng-hui; Zhu, Xing-hao; Xie, Jing; Sohodeb, Vikesh Kumar; Ding, Xi

    2016-01-01

    The objective of this investigation is to analyze the influence of trabecular microstructure modeling on the biomechanical distribution at the implant-bone interface. Two three-dimensional finite element mandible models, one with trabecular microstructure (a refined model) and one with macrostructure (a simplified model), were built. The values of equivalent stress at the implant-bone interface in the refined model increased compared with those of the simplified model, whereas the strain values decreased. The distributions of stress and strain were more uniform in the refined model of trabecular microstructure, in which stress and strain were mainly concentrated in trabecular bone. It was concluded that simulation of trabecular bone microstructure had a significant effect on the distribution of stress and strain at the implant-bone interface. These results suggest that trabecular structures could disperse stress and strain and serve as load buffers. PMID:27403424

  3. Influence of Trabecular Bone on Peri-Implant Stress and Strain Based on Micro-CT Finite Element Modeling of Beagle Dog.

    PubMed

    Liao, Sheng-Hui; Zhu, Xing-Hao; Xie, Jing; Sohodeb, Vikesh Kumar; Ding, Xi

    2016-01-01

    The objective of this investigation is to analyze the influence of trabecular microstructure modeling on the biomechanical distribution at the implant-bone interface. Two three-dimensional finite element mandible models, one with trabecular microstructure (a refined model) and one with macrostructure (a simplified model), were built. The values of equivalent stress at the implant-bone interface in the refined model increased compared with those of the simplified model, whereas the strain values decreased. The distributions of stress and strain were more uniform in the refined model of trabecular microstructure, in which stress and strain were mainly concentrated in trabecular bone. It was concluded that simulation of trabecular bone microstructure had a significant effect on the distribution of stress and strain at the implant-bone interface. These results suggest that trabecular structures could disperse stress and strain and serve as load buffers.

  4. Embedding Language Support in Developmental Mathematics Lessons: Exploring the Value of Design as Professional Development for Community College Mathematics Instructors

    ERIC Educational Resources Information Center

    Gomez, Kimberley; Gomez, Louis M.; Rodela, Katherine C.; Horton, Emily S.; Cunningham, Jahneille; Ambrocio, Rocio

    2015-01-01

    Three community college faculty members used improvement science techniques to design, develop, and refine contextualized developmental mathematics lessons, where language and literacy pedagogy and related supports figured prominently in these instructional materials. This article reports on the role that their design experiences played in…

  5. Evaluation of the dermal carcinogenicity of lubricant base oils by the mouse skin painting bioassay and other proposed methods.

    PubMed

    Chasey, K L; McKee, R H

    1993-01-01

    Lubricant base oils are petroleum products that are predominantly derived from the vacuum distillation of crude oil. Various types of refinement can be employed during the manufacturing process, and evidence suggests that certain of the associated process streams produce skin cancer. Polycyclic aromatic compounds (PACs), some of which are considered to be the causative agents, are removed, concentrated, or chemically converted during the refinement process. In order to understand the effects of various types of refinement processes on carcinogenic potential, 94 oils were evaluated in the mouse epidermal cancer bioassay. This Exxon database is unique because of the wide range of crude oils and processing histories represented. Seven processing history classifications are described, and conclusions concerning the impacts of each refinement process on dermal carcinogenicity are discussed. This research also included an evaluation of selected biological and chemical test methods for predicting carcinogenic potential. These included a modified version of the Ames test for mutagenicity, as well as analytical characterizations of the polycyclic aromatic structures in the oils. For classification purposes, a sample was considered to be carcinogenic if it resulted in the production of two or more tumor-bearing animals (in test groups of either 40 or 50 animals). The modified Ames test was considered to be positive if the mutagenicity index was ≥ 2.0, and PAC analyses were similarly designated as positive or negative according to proposed guidelines. All of the alternative test methods showed similar agreement with dermal carcinogenicity bioassay data; concordance values were ≥ 80%. However, each test was incorrect in ca. 10%-20% of the cases evaluated. (ABSTRACT TRUNCATED AT 250 WORDS)

  6. Reconstruction of three-dimensional grain structure in polycrystalline iron via an interactive segmentation method

    NASA Astrophysics Data System (ADS)

    Feng, Min-nan; Wang, Yu-cong; Wang, Hao; Liu, Guo-quan; Xue, Wei-hua

    2017-03-01

    Using a total of 297 segmented sections, we reconstructed the three-dimensional (3D) structure of pure iron and obtained the largest dataset reported to date, comprising 16,254 complete 3D grains. The mean values of equivalent sphere radius and face number of pure iron were observed to be consistent with those of Monte Carlo simulated grains, phase-field simulated grains, Ti-alloy grains, and Ni-based superalloy grains. In this work, by finding a balance between automatic methods and manual refinement, we developed an interactive segmentation method to segment serial sections accurately in the reconstruction of the 3D microstructure; this approach saves time and substantially reduces errors. The segmentation process comprises four operations: image preprocessing, breakpoint detection based on mathematical morphology analysis, optimized automatic connection of the breakpoints, and manual refinement based on human evaluation.
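
    The breakpoint-detection step described above lends itself to a simple morphological treatment. The sketch below is only an illustration of that idea, not the authors' implementation: it flags endpoints of broken grain-boundary traces in a binary boundary image and pairs nearby endpoints as candidate breakpoints; the function and parameter names are hypothetical.

    ```python
    # Illustrative sketch (not the published code): morphology-based breakpoint detection.
    import numpy as np
    from scipy import ndimage

    def find_breakpoints(boundary_mask, max_gap=5):
        """boundary_mask: 2D boolean array of grain-boundary pixels from a segmented section."""
        # An endpoint of a broken boundary is a boundary pixel with exactly one boundary neighbour.
        neighbours = ndimage.convolve(boundary_mask.astype(int),
                                      np.ones((3, 3), dtype=int),
                                      mode="constant") - boundary_mask
        endpoints = np.argwhere(boundary_mask & (neighbours == 1))
        # Pair endpoints that lie close together as candidates for automatic reconnection;
        # ambiguous pairs would then go to the manual-refinement stage.
        pairs = []
        for i in range(len(endpoints)):
            for j in range(i + 1, len(endpoints)):
                if np.linalg.norm(endpoints[i] - endpoints[j]) <= max_gap:
                    pairs.append((tuple(endpoints[i]), tuple(endpoints[j])))
        return endpoints, pairs
    ```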

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarocki, John Charles; Zage, David John; Fisher, Andrew N.

    LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command-line, web, and application programming interfaces (APIs) for the input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, the ontology, and the derived linkograph. Finally, the tool allows the user to perform statistical measurements on the linkograph and to refine the ontology through direct manipulation of the linkograph.
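
    As a rough illustration of how a linkograph can be derived from time-sequence data and an ontology, the sketch below links two events whenever their tokens map to a shared abstract class. This is a hypothetical toy model, not LinkShop's actual API or data format.

    ```python
    # Hypothetical toy linkograph builder; names and data format are illustrative only.
    from itertools import combinations

    def build_linkograph(events, ontology):
        """events: ordered list of (event_id, set_of_tokens); ontology: token -> abstract class."""
        abstracted = [(eid, {ontology.get(tok, tok) for tok in toks}) for eid, toks in events]
        links = []
        for (e1, c1), (e2, c2) in combinations(abstracted, 2):
            shared = c1 & c2
            if shared:                      # two events are linked if they share an abstract class
                links.append((e1, e2, sorted(shared)))
        return abstracted, links

    events = [("e1", {"nmap"}), ("e2", {"ssh"}), ("e3", {"nmap", "ssh"})]
    ontology = {"nmap": "recon", "ssh": "access"}
    _, links = build_linkograph(events, ontology)
    print(links)   # [('e1', 'e3', ['recon']), ('e2', 'e3', ['access'])]
    ```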

  8. Advances in Exoplanet Observing by Amateur Astronomers (Abstract)

    NASA Astrophysics Data System (ADS)

    Conti, D. M.

    2017-06-01

    (Abstract only) This past year has seen a marked increase in amateur astronomer participation in exoplanet research. This has ranged from amateur astronomers helping professional astronomers confirm candidate exoplanets, to helping refine the ephemeris of known exoplanets. In addition, amateur astronomers have been involved in characterizing such exotic objects as disintegrating planetesimals. However, the involvement in such pro/am collaborations has also required that amateur astronomers follow a more disciplined approach to exoplanet observing.

  9. THEMIS Surface-Atmosphere Separation Strategy and Preliminary Results

    NASA Technical Reports Server (NTRS)

    Bandfield, J. L.; Smith, M. D.; Christensen, P. R.

    2002-01-01

    Methods refined and adapted from the TES investigation are used to develop a surface-atmosphere separation strategy for THEMIS image analysis and atmospheric temperature and opacity retrievals. Additional information is contained in the original extended abstract.

  10. The identification of clinically important elements within medical journal abstracts: Patient-Population-Problem, Exposure-Intervention, Comparison, Outcome, Duration and Results (PECODR).

    PubMed

    Dawes, Martin; Pluye, Pierre; Shea, Laura; Grad, Roland; Greenberg, Arlene; Nie, Jian-Yun

    2007-01-01

    Information retrieval in primary care is becoming more difficult as the volume of medical information held in electronic databases expands. The lexical structure of this information might permit automatic indexing and improved retrieval. To determine the possibility of identifying the key elements of clinical studies, namely Patient-Population-Problem, Exposure-Intervention, Comparison, Outcome, Duration and Results (PECODR), from abstracts of medical journals. We used a convenience sample of 20 synopses from the journal Evidence-Based Medicine (EBM) and their matching original journal article abstracts obtained from PubMed. Three independent primary care professionals identified PECODR-related extracts of text. Rules were developed to define each PECODR element and the selection process of characters, words, phrases and sentences. From the extracts of text related to PECODR elements, potential lexical patterns that might help identify those elements were proposed and assessed using NVivo software. A total of 835 PECODR-related text extracts containing 41,263 individual text characters were identified from 20 EBM journal synopses. There were 759 extracts in the corresponding PubMed abstracts containing 31,947 characters. PECODR elements were found in nearly all abstracts and synopses with the exception of duration. There was agreement on 86.6% of the extracts from the 20 EBM synopses and 85.0% on the corresponding PubMed abstracts. After consensus this rose to 98.4% and 96.9% respectively. We found potential text patterns in the Comparison, Outcome and Results elements of both EBM synopses and PubMed abstracts. Some phrases and words are used frequently and are specific for these elements in both synopses and abstracts. Results suggest a PECODR-related structure exists in medical abstracts and that there might be lexical patterns specific to these elements. More sophisticated computer-assisted lexical-semantic analysis might refine these results, and pave the way to automating PECODR indexing, and improve information retrieval in primary care.
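
    The kind of lexical-pattern matching described above can be pictured with a few regular expressions. The patterns below are invented for illustration only; they are not the patterns proposed in the study.

    ```python
    # Illustrative (hypothetical) lexical patterns for a few PECODR elements.
    import re

    PATTERNS = {
        "Comparison": re.compile(r"\b(versus|vs\.?|compared with|placebo|control group)\b", re.I),
        "Outcome":    re.compile(r"\b(primary|secondary)\s+(outcome|end\s*point)s?\b", re.I),
        "Results":    re.compile(r"\b(odds ratio|relative risk|95% ci|p\s*[<=>]\s*0?\.\d+)\b", re.I),
    }

    def tag_sentences(abstract_text):
        """Return (element, sentence) pairs where a candidate lexical pattern fires."""
        hits = []
        for sentence in re.split(r"(?<=[.!?])\s+", abstract_text):
            for element, pattern in PATTERNS.items():
                if pattern.search(sentence):
                    hits.append((element, sentence.strip()))
        return hits
    ```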

  11. Optimization of Refining Craft for Vegetable Insulating Oil

    NASA Astrophysics Data System (ADS)

    Zhou, Zhu-Jun; Hu, Ting; Cheng, Lin; Tian, Kai; Wang, Xuan; Yang, Jun; Kong, Hai-Yang; Fang, Fu-Xin; Qian, Hang; Fu, Guang-Pan

    2016-05-01

    Vegetable insulating oil because of its environmental friendliness are considered as ideal material instead of mineral oil used for the insulation and the cooling of the transformer. The main steps of traditional refining process included alkali refining, bleaching and distillation. This kind of refining process used in small doses of insulating oil refining can get satisfactory effect, but can't be applied to the large capacity reaction kettle. This paper using rapeseed oil as crude oil, and the refining process has been optimized for large capacity reaction kettle. The optimized refining process increases the acid degumming process. The alkali compound adds the sodium silicate composition in the alkali refining process, and the ratio of each component is optimized. Add the amount of activated clay and activated carbon according to 10:1 proportion in the de-colorization process, which can effectively reduce the oil acid value and dielectric loss. Using vacuum pumping gas instead of distillation process can further reduce the acid value. Compared some part of the performance parameters of refined oil products with mineral insulating oil, the dielectric loss of vegetable insulating oil is still high and some measures are needed to take to further optimize in the future.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voßwinkel, Daniel; Heletta, Lukas; Hoffmann, Rolf-Dieter

    The YIrGe{sub 2} type ternary germanides RERhGe{sub 2} (RE = Y, Gd-Ho) were synthesized from the elements by arc-melting and characterized by powder X-ray diffraction. The structure of DyRhGe{sub 2} was refined from single crystal X-ray diffractometer data: Immm, a = 426.49(9), b = 885.0(2), c = 1577.4(3) pm, wR2 = 0.0533, 637 F{sup 2} values, 30 variables (300 K data). The structure contains two crystallographically independent dysprosium atoms in pentagonal prismatic and hexagonal prismatic coordination. The three-dimensional [RhGe{sub 2}] polyanion is stabilized through covalent Rh–Ge (243–261 pm) and Ge–Ge (245–251 pm) bonding. The close structural relationship with the slightlymore » rhodium-poorer germanides RE{sub 5}Rh{sub 4}Ge{sub 10} (≡ RERh{sub 0.8}Ge{sub 2}) is discussed. Temperature-dependent magnetic susceptibility measurements reveal Pauli paramagnetism for YRhGe{sub 2} and Curie-Weiss paramagnetism for RERhGe{sub 2} with RE = Gd, Tb, Dy and Ho. These germanides order antiferromagnetically at T{sub N} = 7.2(5), 10.6(5), 8.1(5), and 6.4(5) K, respectively. - Graphical abstract: The germanides RERhGe{sub 2} (RE = Y, Gd-Ho) are new representatives of the YIrGe{sub 2} type.« less

  13. Describing Chinese hospital activity with diagnosis related groups (DRGs). A case study in Chengdu.

    PubMed

    Gong, Zhiping; Duckett, Stephen J; Legge, David G; Pei, Likun

    2004-07-01

    To examine the applicability of an Australian casemix classification system to the description of Chinese hospital activity. A total of 161,478 inpatient episodes from three Chengdu hospitals with demographic, diagnosis, procedure and billing data for the years 1998/1999, 1999/2000 and 2000/2001 were grouped using the Australian refined-diagnosis related groups (AR-DRGs) (version 4.0) grouper. Reduction in variance (R2) and coefficient of variation (CV) were the outcome measures. The untrimmed reduction in variance (R2) was 0.12 and 0.17 for length of stay (LOS) and cost, respectively. After trimming, R2 values were 0.45 and 0.59 for length of stay and cost, respectively. The Australian refined DRGs provide a good basis for developing a Chinese grouper.
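
    For readers unfamiliar with the statistic, the reduction in variance reported above is simply the share of total variance in LOS (or cost) explained by the DRG grouping. A minimal pandas sketch, with hypothetical column names, is:

    ```python
    # Sketch of the reduction-in-variance (R2) statistic for LOS grouped by DRG;
    # the column names ('drg', 'los') are hypothetical.
    import pandas as pd

    def reduction_in_variance(df, group_col="drg", value_col="los"):
        total_ss = ((df[value_col] - df[value_col].mean()) ** 2).sum()
        within_ss = sum(((g[value_col] - g[value_col].mean()) ** 2).sum()
                        for _, g in df.groupby(group_col))
        return 1.0 - within_ss / total_ss   # share of variance explained by the grouping
    ```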

  14. An approach for using AVHRR data to monitor U.S. great plains grasslands

    USGS Publications Warehouse

    Reed, B.C.; Loveland, Thomas R.; Tieszen, L.L.

    1996-01-01

    Environmental monitoring requires regular observations regarding the status of the landscape. The concept behind most monitoring efforts using satellite data involves deriving normalized difference vegetation index (NDVI) values or accumulating the NDVI over a specified time period. These efforts attempt to estimate the continuous growth of green biomass by using continuous additions of NDVI as a surrogate measure. To build upon this concept, this study proposes three refinements: 1) use an objective definition of the current growing season to adjust the time window during which the NDVI is accumulated, 2) accumulate only the NDVI values that are affected by green vegetation, and 3) base monitoring units upon land cover type. These refinements improve the sensitivity of detecting interannual vegetation variability, reduce the need for extensive and detailed knowledge of ground conditions and crop calendars, provide a framework in which several types of monitoring can take place over diverse land cover types, and provide an objective time frame during which monitoring takes place.
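
    A minimal sketch of the three refinements, with illustrative threshold and window values rather than those used in the study, might look like this:

    ```python
    # Hedged sketch of the three proposed refinements; thresholds and indexing are illustrative.
    import numpy as np

    def seasonal_ndvi_accumulation(ndvi_series, season_start, season_end,
                                   cover_map, green_threshold=0.1):
        """ndvi_series: array (time, rows, cols); cover_map: integer land-cover classes."""
        window = ndvi_series[season_start:season_end + 1]          # refinement 1: season window
        green = np.where(window > green_threshold, window, 0.0)    # refinement 2: green NDVI only
        accumulated = green.sum(axis=0)
        # refinement 3: report the accumulation per land-cover class
        return {int(c): float(accumulated[cover_map == c].mean())
                for c in np.unique(cover_map)}
    ```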

  15. Refining the maintenance techniques for Interlocking Concrete Paver GIs - abstract

    EPA Science Inventory

    Surface clogging adversely affects the performance of Interlocking Concrete Pavements (ICP) by reducing their ability to infiltrate stormwater runoff. Determining the correct methods for remedial maintenance is crucial to recovering and maintaining efficient ICP performance. T...

  16. Are P values and statistical assessments in poster abstracts presented at annual meetings of Taiwan Society of Anesthesiologists relative to the characteristics of hospitals?

    PubMed

    Lee, Fu-Jung; Wu, Chih-Cheng; Peng, Shih-Yen; Fan, Kuo-Tung

    2007-09-01

    Many anesthesiologists in medical centers (MC) or in anesthesiologist-training hospitals (ATH) are accustomed to presenting their research data as poster abstracts at the annual meetings of the Taiwan Society of Anesthesiologists (TSA) to represent their academic output over a designated period of time. However, orphaned P values that do not mention the related statistical test have frequently been found in these articles. This article explores the difference between MC/ATH and non-MC/non-ATH institutions in reporting the statistical test alongside the P value in three recent consecutive TSA annual meetings. We collected the proceedings handbooks of the TSA annual meetings over a 3-year period (2003 to 2005) and analyzed the hospital characteristics of the first institutional byline of each poster abstract. Data were analyzed with Fisher's exact test, and statistical significance was assumed if P < 0.05. A total of 101 poster abstracts with bylines from 20 hospitals were included. Only 2 of the 20 hospitals were accredited as non-ATH and 4 as non-MC. There were 64 (63%) abstracts without a specified statistical test after the P value, and no significant difference was found between categories (P = 0.47 for ATH vs. non-ATH and P = 0.07 for MC vs. non-MC). The basic practice of reporting the P value together with the specified statistical test was not applied comprehensively in poster abstracts at these annual conferences. We suggest that the anesthesia administrators and senior anesthesiologists at ATH or MC, and the members of the committee responsible for academic affairs in the TSA, should pay attention to this phenomenon and work together to improve basic statistical reporting in poster presentations.

  17. Effect of Solutes on Grain Refinement of As-Cast Fe-4Si Alloy

    NASA Astrophysics Data System (ADS)

    Li, Ming; Li, Jian-Min; Zheng, Qing; Wang, Geoff; Zhang, Ming-Xing

    2018-06-01

    Grain size is one of the key microstructural factors that control the mechanical properties of steels. The present work aims to extend the theories of grain refinement which were established for cast light alloys to steel systems. Using a designed Fe-4 wt pct Si alloy (all-ferrite structure during whole solidification process), the solute effect on grain refinement/grain coarsening in ferritic systems was comprehensively investigated. Experimental results showed that boron (B), which is associated with the highest Q value (growth restriction factor) in ferrite, significantly refined the as-cast structure of the Fe-4 wt pct Si alloy. Cu and Mo with low Q values had no effect on grain refinement. However, although Y and Zr have relatively high Q values, addition of these two solutes led to grain coarsening in the Fe-4Si alloy. Understanding the results in regards to the growth restriction factor and the driving force for the solidification led to the conclusion that in addition to the grain growth restriction effect, the changes of thermodynamic driving force for solidification due to the solute addition also played a key role in grain refinement in ferritic alloys.
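
    For reference, the growth restriction factor Q mentioned above is commonly written, for a dilute binary addition, as shown below; the paper's full thermodynamic treatment may go beyond this simple form.

    ```latex
    % Growth restriction factor for a dilute binary solute addition (standard form):
    %   m = liquidus slope, c_0 = solute concentration, k = equilibrium partition coefficient
    Q = m\,c_{0}\,(k - 1)
    ```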

  18. Effect of Solutes on Grain Refinement of As-Cast Fe-4Si Alloy

    NASA Astrophysics Data System (ADS)

    Li, Ming; Li, Jian-Min; Zheng, Qing; Wang, Geoff; Zhang, Ming-Xing

    2018-03-01

    Grain size is one of the key microstructural factors that control the mechanical properties of steels. The present work aims to extend the theories of grain refinement which were established for cast light alloys to steel systems. Using a designed Fe-4 wt pct Si alloy (all-ferrite structure during whole solidification process), the solute effect on grain refinement/grain coarsening in ferritic systems was comprehensively investigated. Experimental results showed that boron (B), which is associated with the highest Q value (growth restriction factor) in ferrite, significantly refined the as-cast structure of the Fe-4 wt pct Si alloy. Cu and Mo with low Q values had no effect on grain refinement. However, although Y and Zr have relatively high Q values, addition of these two solutes led to grain coarsening in the Fe-4Si alloy. Understanding the results in regards to the growth restriction factor and the driving force for the solidification led to the conclusion that in addition to the grain growth restriction effect, the changes of thermodynamic driving force for solidification due to the solute addition also played a key role in grain refinement in ferritic alloys.

  19. Geographic variation in lumbar diskectomy: a protocol for evaluation.

    PubMed

    Barron, M; Kazandjian, V A

    1992-03-01

    In 1989 the Maryland Hospital Association (MHA) began developing a protocol related to lumbar diskectomy, a procedure with widely reported geographic variation in its use. The MHA's Laminectomy Advisory Committee drafted three criteria for performance of lumbar diskectomy and also developed a data-collection instrument with which the eight hospitals participating in a pilot study could abstract the necessary data from medical records. Both individual hospital and aggregate results showed wide variation in compliance with the criteria. These findings suggest research and development activities such as refinement of the data-collection instrument, use of the protocol for bench-marking, further investigation of clinical and other determinants of rate variation, and study of the effect of new diagnostic technology on utilization rates for this procedure.

  20. Multiscale visual quality assessment for cluster analysis with self-organizing maps

    NASA Astrophysics Data System (ADS)

    Bernard, Jürgen; von Landesberger, Tatiana; Bremm, Sebastian; Schreck, Tobias

    2011-01-01

    Cluster analysis is an important data mining technique for analyzing large amounts of data, reducing many objects to a limited number of clusters. Cluster visualization techniques aim at supporting the user in better understanding the characteristics and relationships among the found clusters. While promising approaches to visual cluster analysis already exist, these usually fall short of incorporating the quality of the obtained clustering results. However, due to the nature of the clustering process, quality plays an important role, as for most practical data sets, typically many different clusterings are possible. Being aware of clustering quality is important to judge the expressiveness of a given cluster visualization, or to adjust the clustering process with refined parameters, among others. In this work, we present an encompassing suite of visual tools for quality assessment of an important algorithm for visual cluster analysis, namely, the Self-Organizing Map (SOM) technique. We define, measure, and visualize the notion of SOM cluster quality along a hierarchy of cluster abstractions. The quality abstractions range from simple scalar-valued quality scores up to the structural comparison of a given SOM clustering with the output of additional supportive clustering methods. The suite of methods allows the user to assess the SOM quality on the appropriate abstraction level, and arrive at improved clustering results. We implement our tools in an integrated system, apply it to experimental data sets, and show its applicability.
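
    One of the simplest scalar-valued quality scores in such a hierarchy is the mean quantization error of the SOM. The snippet below is a bare-bones illustration of that single score, not the paper's full visual-quality suite.

    ```python
    # Minimal sketch of a scalar SOM quality score (mean quantization error).
    import numpy as np

    def quantization_error(data, codebook):
        """data: (n_samples, n_features); codebook: (n_units, n_features) SOM prototypes."""
        # Distance from every sample to every map unit, then to its best-matching unit (BMU).
        d = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
        return d.min(axis=1).mean()
    ```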

  1. Improved Estimates of Thermodynamic Parameters

    NASA Technical Reports Server (NTRS)

    Lawson, D. D.

    1982-01-01

    Techniques refined for estimating heat of vaporization and other parameters from molecular structure. Using parabolic equation with three adjustable parameters, heat of vaporization can be used to estimate boiling point, and vice versa. Boiling points and vapor pressures for some nonpolar liquids were estimated by improved method and compared with previously reported values. Technique for estimating thermodynamic parameters should make it easier for engineers to choose among candidate heat-exchange fluids for thermochemical cycles.

  2. Improving the accuracy of macromolecular structure refinement at 7 Å resolution.

    PubMed

    Brunger, Axel T; Adams, Paul D; Fromme, Petra; Fromme, Raimund; Levitt, Michael; Schröder, Gunnar F

    2012-06-06

    In X-ray crystallography, molecular replacement and subsequent refinement is challenging at low resolution. We compared refinement methods using synchrotron diffraction data of photosystem I at 7.4 Å resolution, starting from different initial models with increasing deviations from the known high-resolution structure. Standard refinement spoiled the initial models, moving them further away from the true structure and leading to high R(free)-values. In contrast, DEN refinement improved even the most distant starting model as judged by R(free), atomic root-mean-square differences to the true structure, significance of features not included in the initial model, and connectivity of electron density. The best protocol was DEN refinement with initial segmented rigid-body refinement. For the most distant initial model, the fraction of atoms within 2 Å of the true structure improved from 24% to 60%. We also found a significant correlation between R(free) values and the accuracy of the model, suggesting that R(free) is useful even at low resolution. Copyright © 2012 Elsevier Ltd. All rights reserved.
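
    For reference, the R-factor and R(free) quoted above follow the conventional crystallographic definition; R(free) is the same expression evaluated only over a test set of reflections excluded from refinement.

    ```latex
    % Conventional crystallographic R-factor (R_free uses a reserved test set of reflections):
    R = \frac{\sum_{hkl} \left| \, |F_{\mathrm{obs}}(hkl)| - k\,|F_{\mathrm{calc}}(hkl)| \, \right|}
             {\sum_{hkl} |F_{\mathrm{obs}}(hkl)|}
    ```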

  3. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
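
    The simplest non-circular assume-guarantee rule used in this line of work has the form below; as the abstract notes, the framework can be instantiated with other rules as well.

    ```latex
    % Basic (non-circular) assume-guarantee rule for components M1, M2, assumption A, property P:
    \frac{\langle A \rangle \; M_{1} \; \langle P \rangle \qquad
          \langle \mathit{true} \rangle \; M_{2} \; \langle A \rangle}
         {\langle \mathit{true} \rangle \; M_{1} \parallel M_{2} \; \langle P \rangle}
    ```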

  4. Potential Functional Byproducts from Guava Purée Processing.

    PubMed

    Lim, Si Yi; Tham, Paik Yean; Lim, Hilary Yi Ler; Heng, Wooi Shin; Chang, Ying Ping

    2018-05-10

    The valorization of guava waste requires compositional and functional studies. We tested three byproducts of guava purée processing, namely refiner, siever, and decanter. We analyzed the chemical composition and quantified the prebiotic activity score and selected carbohydrates; we also determined the water holding (WHC), oil holding (OHC), cation exchange capacities, bile acid binding, and glucose dialysis retardation (GDR) of the solid fraction and the antioxidative and α-amylase inhibitory capacities (AIC) of the ethanolic extract. Refiner contained 7.7% lipid, 7.08% protein and a relatively high phytate content; it had a high prebiotic activity score and possessed the highest binding capacity with deoxycholic acid. Siever contained high levels of low molecular weight carbohydrates and total tannin but relatively low crude fiber and cellulose contents. It had the highest binding with chenodeoxycholic acid (74.8%), and exhibited the highest 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging capacity. Decanter was rich in cellulose and had a high prebiotic activity score. The WHC and OHC values of decanter were within a narrow range and also exhibited the highest binding with cholic acid (86.6%), and the highest values of GDR and AIC. The refiner waste could be included in animal feed but requires further processing to reduce the high phytate levels. All three guava byproducts had the potential to be a source of antioxidant dietary fiber (DF), a finding that warrants further in vivo study. To differing extents, the guava byproducts exhibited useful physicochemical binding properties and so possessed the potential for health-promoting activity. These byproducts could also be upgraded to other marketable products so the manufacturers of processed guava might be able to develop their businesses sustainably by making better use of them. © 2018 Institute of Food Technologists®.

  5. Comparison Between Self-Guided Langevin Dynamics and Molecular Dynamics Simulations for Structure Refinement of Protein Loop Conformations

    DTIC Science & Technology

    2011-01-01

    [Report documentation page (Standard Form 298) boilerplate omitted; only fragments of the abstract were extracted. The recoverable fragments mention the all-atom distance-scaled ideal-gas reference state (DFIRE-AA) statistical potential function [18] and the Rosetta all-atom energy function among the scoring approaches used in the loop-refinement comparison.]

  6. Fine-mapping of lipid regions in global populations discovers ethnic-specific signals and refines previously identified lipid loci

    PubMed Central

    Zubair, Niha; Luis Ambite, Jose; Bush, William S.; Kichaev, Gleb; Lu, Yingchang; Manichaikul, Ani; Sheu, Wayne H-H.; Absher, Devin; Assimes, Themistocles L.; Bielinski, Suzette J.; Bottinger, Erwin P.; Buzkova, Petra; Chuang, Lee-Ming; Chung, Ren-Hua; Cochran, Barbara; Dumitrescu, Logan; Gottesman, Omri; Haessler, Jeffrey W.; Haiman, Christopher; Heiss, Gerardo; Hsiung, Chao A.; Hung, Yi-Jen; Hwu, Chii-Min; Juang, Jyh-Ming J.; Le Marchand, Loic; Lee, I-Te; Lee, Wen-Jane; Lin, Li-An; Lin, Danyu; Lin, Shih-Yi; Mackey, Rachel H.; Martin, Lisa W.; Pasaniuc, Bogdan; Peters, Ulrike; Predazzi, Irene; Quertermous, Thomas; Reiner, Alex P.; Robinson, Jennifer; Rotter, Jerome I.; Ryckman, Kelli K.; Schreiner, Pamela J.; Stahl, Eli; Tao, Ran; Tsai, Michael Y.; Waite, Lindsay L.; Wang, Tzung-Dau; Buyske, Steven; Ida Chen, Yii-Der; Cheng, Iona; Crawford, Dana C.; Loos, Ruth J.F.; Rich, Stephen S.; Fornage, Myriam; North, Kari E.; Kooperberg, Charles; Carty, Cara L.

    2016-01-01

    Genome-wide association studies have identified over 150 loci associated with lipid traits; however, no large-scale studies exist for Hispanics and other minority populations. Additionally, the genetic architecture of lipid-influencing loci remains largely unknown. We performed one of the most racially/ethnically diverse fine-mapping genetic studies of HDL-C, LDL-C, and triglycerides to date using SNPs on the MetaboChip array on 54,119 individuals: 21,304 African Americans, 19,829 Hispanic Americans, 12,456 Asians, and 530 American Indians. The majority of signals found in these groups generalize to European Americans. While we uncovered signals unique to racial/ethnic populations, we also observed systematically consistent lipid associations across these groups. In African Americans, we identified three novel signals associated with HDL-C (LPL, APOA5, LCAT) and two associated with LDL-C (ABCG8, DHODH). In addition, using this population, we refined the location for 16 out of the 58 known MetaboChip lipid loci. These results can guide tailored screening efforts, reveal population-specific responses to lipid-lowering medications, and aid in the development of new targeted drug therapies. PMID:28426890

  7. Probing the accuracy and precision of Hirshfeld atom refinement with HARt interfaced with Olex2.

    PubMed

    Fugel, Malte; Jayatilaka, Dylan; Hupf, Emanuel; Overgaard, Jacob; Hathwar, Venkatesha R; Macchi, Piero; Turner, Michael J; Howard, Judith A K; Dolomanov, Oleg V; Puschmann, Horst; Iversen, Bo B; Bürgi, Hans-Beat; Grabowsky, Simon

    2018-01-01

    Hirshfeld atom refinement (HAR) is a novel X-ray structure refinement technique that employs aspherical atomic scattering factors obtained from stockholder partitioning of a theoretically determined tailor-made static electron density. HAR overcomes many of the known limitations of independent atom modelling (IAM), such as too short element-hydrogen distances, r(X-H), or too large atomic displacement parameters (ADPs). This study probes the accuracy and precision of anisotropic hydrogen and non-hydrogen ADPs and of r(X-H) values obtained from HAR. These quantities are compared and found to agree with those obtained from (i) accurate neutron diffraction data measured at the same temperatures as the X-ray data and (ii) multipole modelling (MM), an established alternative method for interpreting X-ray diffraction data with the help of aspherical atomic scattering factors. Results are presented for three chemically different systems: the aromatic hydrocarbon rubrene (orthorhombic 5,6,11,12-tetraphenyltetracene), a co-crystal of zwitterionic betaine, imidazolium cations and picrate anions (BIPa), and the salt potassium hydrogen oxalate (KHOx). The non-hydrogen HAR-ADPs are as accurate and precise as the MM-ADPs. Both show excellent agreement with the neutron-based values and are superior to IAM-ADPs. The anisotropic hydrogen HAR-ADPs show a somewhat larger deviation from neutron-based values than the hydrogen SHADE-ADPs used in MM. Element-hydrogen bond lengths from HAR are in excellent agreement with those obtained from neutron diffraction experiments, although they are somewhat less precise. The residual density contour maps after HAR show fewer features than those after MM. Calculating the static electron density with the def2-TZVP basis set instead of the simpler def2-SVP one does not improve the refinement results significantly. All HARs were performed within the recently introduced HARt option implemented in the Olex2 program. They are easily launched inside its graphical user interface following a conventional IAM.

  8. Probing the accuracy and precision of Hirshfeld atom refinement with HARt interfaced with Olex2

    PubMed Central

    Fugel, Malte; Hathwar, Venkatesha R.; Turner, Michael J.; Howard, Judith A. K.

    2018-01-01

    Hirshfeld atom refinement (HAR) is a novel X-ray structure refinement technique that employs aspherical atomic scattering factors obtained from stockholder partitioning of a theoretically determined tailor-made static electron density. HAR overcomes many of the known limitations of independent atom modelling (IAM), such as too short element–hydrogen distances, r(X—H), or too large atomic displacement parameters (ADPs). This study probes the accuracy and precision of anisotropic hydrogen and non-hydrogen ADPs and of r(X—H) values obtained from HAR. These quantities are compared and found to agree with those obtained from (i) accurate neutron diffraction data measured at the same temperatures as the X-ray data and (ii) multipole modelling (MM), an established alternative method for interpreting X-ray diffraction data with the help of aspherical atomic scattering factors. Results are presented for three chemically different systems: the aromatic hydro­carbon rubrene (orthorhombic 5,6,11,12-tetra­phenyl­tetracene), a co-crystal of zwitterionic betaine, imidazolium cations and picrate anions (BIPa), and the salt potassium hydrogen oxalate (KHOx). The non-hydrogen HAR-ADPs are as accurate and precise as the MM-ADPs. Both show excellent agreement with the neutron-based values and are superior to IAM-ADPs. The anisotropic hydrogen HAR-ADPs show a somewhat larger deviation from neutron-based values than the hydrogen SHADE-ADPs used in MM. Element–hydrogen bond lengths from HAR are in excellent agreement with those obtained from neutron diffraction experiments, although they are somewhat less precise. The residual density contour maps after HAR show fewer features than those after MM. Calculating the static electron density with the def2-TZVP basis set instead of the simpler def2-SVP one does not improve the refinement results significantly. All HARs were performed within the recently introduced HARt option implemented in the Olex2 program. They are easily launched inside its graphical user interface following a conventional IAM. PMID:29354269

  9. USSR and Eastern Europe Scientific Abstracts, Geophysics, Astronomy and Space, Number 402

    DTIC Science & Technology

    1977-07-27

    …for the stations Arosa, Oxford and Hradec Kralove, whereas a negative (dropping) trend was found for the stations Belsk and Potsdam. The highest of the three positive values is that of Oxford, being +2.15% for the 10-year period. The positive values of the stations Arosa and Hradec Kralove are…

  10. Microstructural characteristics of adiabatic shear localization in a metastable beta titanium alloy deformed at high strain rate and elevated temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Hongyi, E-mail: h.zhan@uq.edu.au; Zeng, Weidong; Wang, Gui

    2015-04-15

    The microstructural evolution and grain refinement within adiabatic shear bands in the Ti6554 alloy deformed at high strain rates and elevated temperatures have been characterized using transmission electron microscopy. No stress drops were observed in the corresponding stress–strain curve, indicating that the initiation of adiabatic shear bands does not lead to the loss of load capacity for the Ti6554 alloy. The outer region of the shear bands mainly consists of cell structures bounded by dislocation clusters. Equiaxed subgrains in the core area of the shear band can be evolved from the subdivision of cell structures or reconstruction and transverse segmentation of dislocation clusters. It is proposed that dislocation activity dominates the grain refinement process. The rotational recrystallization mechanism may operate as the kinetic requirements for it are fulfilled. The coexistence of different substructures across the shear bands implies that the microstructural evolution inside the shear bands is not homogeneous and different grain refinement mechanisms may operate simultaneously to refine the structure. - Highlights: • The microstructure within the adiabatic shear band was characterized by TEM. • No stress drops were observed in the corresponding stress–strain curve. • Dislocation activity dominated the grain refinement process. • The kinetic requirements for rotational recrystallization mechanism were fulfilled. • Different grain refinement mechanisms operated simultaneously to refine the structure.

  11. Exploiting distant homologues for phasing through the generation of compact fragments, local fold refinement and partial solution combination.

    PubMed

    Millán, Claudia; Sammito, Massimo Domenico; McCoy, Airlie J; Nascimento, Andrey F Ziem; Petrillo, Giovanna; Oeffner, Robert D; Domínguez-Gil, Teresa; Hermoso, Juan A; Read, Randy J; Usón, Isabel

    2018-04-01

    Macromolecular structures can be solved by molecular replacement provided that suitable search models are available. Models from distant homologues may deviate too much from the target structure to succeed, notwithstanding an overall similar fold or even their featuring areas of very close geometry. Successful methods to make the most of such templates usually rely on the degree of conservation to select and improve search models. ARCIMBOLDO_SHREDDER uses fragments derived from distant homologues in a brute-force approach driven by the experimental data, instead of by sequence similarity. The new algorithms implemented in ARCIMBOLDO_SHREDDER are described in detail, illustrating its characteristic aspects in the solution of new and test structures. In an advance from the previously published algorithm, which was based on omitting or extracting contiguous polypeptide spans, model generation now uses three-dimensional volumes respecting structural units. The optimal fragment size is estimated from the expected log-likelihood gain (LLG) values computed assuming that a substructure can be found with a level of accuracy near that required for successful extension of the structure, typically below 0.6 Å root-mean-square deviation (r.m.s.d.) from the target. Better sampling is attempted through model trimming or decomposition into rigid groups and optimization through Phaser's gyre refinement. Also, after model translation, packing filtering and refinement, models are either disassembled into predetermined rigid groups and refined (gimble refinement) or Phaser's LLG-guided pruning is used to trim the model of residues that are not contributing signal to the LLG at the target r.m.s.d. value. Phase combination among consistent partial solutions is performed in reciprocal space with ALIXE. Finally, density modification and main-chain autotracing in SHELXE serve to expand to the full structure and identify successful solutions. The performance on test data and the solution of new structures are described.

  12. Knowledge synthesis methods for generating or refining theory: a scoping review reveals that little guidance is available.

    PubMed

    Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Kastner, Monika; Cogo, Elise; MacDonald, Heather; D'Souza, Jennifer; Hui, Wing; Straus, Sharon E

    2016-05-01

    To describe and compare, through a scoping review, emerging knowledge synthesis methods for generating and refining theory, in terms of expertise required, similarities, differences, strengths, limitations, and steps involved in using the methods. Electronic databases (e.g., MEDLINE) were searched, and two reviewers independently selected studies and abstracted data for qualitative analysis. In total, 287 articles reporting nine knowledge synthesis methods (concept synthesis, critical interpretive synthesis, integrative review, meta-ethnography, meta-interpretation, meta-study, meta-synthesis, narrative synthesis, and realist review) were included after screening of 17,962 citations and 1,010 full-text articles. Strengths of the methods included comprehensive synthesis providing rich contextual data and suitability for identifying gaps in the literature, informing policy, aiding in clinical decisions, addressing complex research questions, and synthesizing patient preferences, beliefs, and values. However, many of the methods were highly subjective and not reproducible. For integrative review, meta-ethnography, and realist review, guidance was provided on all steps of the review process, whereas meta-synthesis had guidance on the fewest number of steps. Guidance for conducting the steps was often vague and sometimes absent. Further work is needed to provide direction on operationalizing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. US refining sector still a whipping-boy: what will it take

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-01-27

    The fast moving US product markets are exerting a powerful pull on crude oil prices. This has meant unalleviated downward pressure on refining margins for most of the past year. Downstream of refining, product marketers want the lower rack and spot prices from refineries. Upstream, independent and major-integrated producers want the highest crude prices they can obtain, with the latter producers also wanting the highest product value realizations. Refiners, especially the major-integrated ones, are rooting for OPEC discipline louder than anybody else. This issue also contains the following: (1) weighted dollar values by product for total product barrel at various sites around the globe; (2) ED refining netback data for the US Gulf and West Coasts, Rotterdam, and Singapore for late January 1988; and (3) ED fuel price/tax series for both the Western and Eastern Hemispheres, Jan. 1988 edition. 5 figures, 18 tables.

  14. Rietveld refinement and ionic conductivity of Ca{sub 8.4}Bi{sub 1.6}(PO{sub 4}){sub 6}O{sub 1.8}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tmar Trabelsi, I., E-mail: ilhem_tmar@yahoo.fr; Madani, A.; Mercier, A.M.

    2013-01-15

    The structure of Ca{sub 8.4}Bi{sub 1.6}(PO{sub 4}){sub 6}O{sub 1.8}, isostructural with Fluoroapatite, was determined by X-ray powder diffraction methods. The results of Rietveld refinement revealed that the formula of this compound is [Ca{sub 4}]{sup 4f}[Ca{sub 4.4}Bi{sub 1.6}]{sup 6h}(PO{sub 4}){sub 6}[O{sub 1.8}]{sup 2a}, space group P63/m (a=9.468 (3) A, c=6.957 (3) A). A total substitution of Bi{sup 3+} ions in the (6h) sites was related particularly to the high polarizability of the Bi{sup 3+} ion compared to Ca{sup 2+}. The observed frequencies in the Raman and infrared spectra were explained and discussed on the basis of unit-cell group analyses and in comparison with Fluoroapatite and other oxyapatites. The ionic conductivity over a wide range of temperature was investigated according to the complex impedance method. The highest overall conductivity values were found at σ{sub 700 °C} = 5.03 × 10{sup -7} S cm{sup -1} and E{sub a} = 0.50 eV. - Graphical abstract: The final Rietveld refinement plot of the Ca{sub 8.4}Bi{sub 1.6}(PO{sub 4}){sub 6}O{sub 1.8}. Highlights: • The Rietveld refinement revealed that the formula of this compound is Ca{sub 8.4}Bi{sub 1.6}(PO{sub 4}){sub 6}O{sub 1.8}. • Vibrational spectroscopy supports the high symmetry P63/m space group for this apatite. • This apatite contained channels where oxygen ions were located in 2a sites. • The possibility of anionic conduction along these channels was considered.
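
    The activation energy quoted above comes from an Arrhenius analysis of the conductivity data; a commonly used form (the paper may fit the sigma·T variant or plain sigma) is:

    ```latex
    % Arrhenius expression for ionic conductivity; E_a = 0.50 eV is the activation energy quoted above.
    \sigma T = A \exp\!\left(-\frac{E_{a}}{k_{B} T}\right)
    ```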

  15. Assessment of land use impact on water-related ecosystem services capturing the integrated terrestrial-aquatic system.

    PubMed

    Maes, Wouter H; Heuvelmans, Griet; Muys, Bart

    2009-10-01

    Although the importance of green (evaporative) water flows in delivering ecosystem services has been recognized, most operational impact assessment methods still focus only on blue water flows. In this paper, we present a new model to evaluate the effect of land use occupation and transformation on water quantity. Conceptually based on the supply of ecosystem services by terrestrial and aquatic ecosystems, the model is developed for, but not limited to, land use impact assessment in life cycle assessment (LCA) and requires a minimum amount of input data. Impact is minimal when evapotranspiration is equal to that of the potential natural vegetation, and maximal when evapotranspiration is zero or when it exceeds a threshold value derived from the concept of environmental water requirement. Three refinements to the model, requiring more input data, are proposed. The first refinement considers a minimal impact over a certain range based on the boundary evapotranspiration of the potential natural vegetation. In the second refinement the effects of evaporation and transpiration are accounted for separately, and in the third refinement a more correct estimate of evaporation from a fully sealed surface is incorporated. The simplicity and user friendliness of the proposed impact assessment method are illustrated with two examples.
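
    The scoring idea (zero impact when evapotranspiration matches the potential natural vegetation, maximal impact at zero evapotranspiration or beyond the environmental-water-requirement threshold) can be sketched as follows; the function and its linear scaling are purely illustrative and are not the published model.

    ```python
    # Purely illustrative sketch of the core scoring idea; the published model's exact
    # functional form, boundary range, and thresholds are not reproduced here.
    def water_impact(et_actual, et_pnv, et_threshold):
        """0 = no impact (ET equals that of potential natural vegetation), 1 = maximal impact."""
        if et_actual <= 0 or et_actual >= et_threshold:
            return 1.0
        if et_actual == et_pnv:
            return 0.0
        if et_actual < et_pnv:                                   # drier than the natural reference
            return (et_pnv - et_actual) / et_pnv
        return (et_actual - et_pnv) / (et_threshold - et_pnv)    # wetter than the reference
    ```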

  16. An Integrated Planning Representation Using Macros, Abstractions, and Cases

    NASA Technical Reports Server (NTRS)

    Baltes, Jacky; MacDonald, Bruce

    1992-01-01

    Planning will be an essential part of future autonomous robots and integrated intelligent systems. This paper focuses on learning problem solving knowledge in planning systems. The system is based on a common representation for macros, abstractions, and cases. Therefore, it is able to exploit both classical and case-based techniques. The general operators in a successful plan derivation would be assessed for their potential usefulness, and some stored. The feasibility of this approach was studied through the implementation of a learning system for abstraction. New macros are motivated by trying to improve the operator set. One heuristic used to improve the operator set is generating operators with more general preconditions than existing ones. This heuristic leads naturally to abstraction hierarchies. This investigation showed promising results on the Towers of Hanoi problem. The paper concludes by describing methods for learning other problem solving knowledge. This knowledge can be represented by allowing operators at different levels of abstraction in a refinement.

  17. Abstraction of complex concepts with a refined partial-area taxonomy of SNOMED

    PubMed Central

    Wang, Yue; Halper, Michael; Wei, Duo; Perl, Yehoshua; Geller, James

    2012-01-01

    An algorithmically-derived abstraction network, called the partial-area taxonomy, for a SNOMED hierarchy has led to the identification of concepts considered complex. The designation “complex” is arrived at automatically on the basis of structural analyses of overlap among the constituent concept groups of the partial-area taxonomy. Such complex concepts, called overlapping concepts, constitute a tangled portion of a hierarchy and can be obstacles to users trying to gain an understanding of the hierarchy’s content. A new methodology for partitioning the entire collection of overlapping concepts into singly-rooted groups, that are more manageable to work with and comprehend, is presented. Different kinds of overlapping concepts with varying degrees of complexity are identified. This leads to an abstract model of the overlapping concepts called the disjoint partial-area taxonomy, which serves as a vehicle for enhanced, high-level display. The methodology is demonstrated with an application to SNOMED’s Specimen hierarchy. Overall, the resulting disjoint partial-area taxonomy offers a refined view of the hierarchy’s structural organization and conceptual content that can aid users, such as maintenance personnel, working with SNOMED. The utility of the disjoint partial-area taxonomy as the basis for a SNOMED auditing regimen is presented in a companion paper. PMID:21878396

  18. Intent Specifications

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1995-01-01

    We have been investigating the implications of using abstractions based on intent rather than the aggregation and information-hiding abstractions commonly used in software engineering. Cognitive psychologists have shown that intent abstraction is consistent with human problem-solving processes. We believe that new types of specifications and designs based on this concept can assist in understanding and specifying requirements, capturing the most important design rationale information in an efficient and economical way, and supporting the process of identifying and analyzing required changes to minimize the introduction of errors. The goal of hierarchical abstraction is to allow both top-down and bottom-up reasoning about a complex system. In computer science, we have made much use of (1) part-whole abstractions, where each level of a hierarchy represents an aggregation of the components at a lower level, and of (2) information-hiding abstractions, where each level contains the same conceptual information but hides some details about the concepts, that is, each level is a refinement of the information at a higher level.

  19. Value-added potential of expeller-pressed canola oil refining: characterization of sinapic acid derivatives and tocopherols from byproducts.

    PubMed

    Chen, Yougui; Thiyam-Hollander, Usha; Barthet, Veronique J; Aachary, Ayyappan A

    2014-10-08

    Valuable phenolic antioxidants are lost during oil refining, but evaluation of their occurrence in refining byproducts is lacking. Rapeseed and canola oil are both rich sources of sinapic acid derivatives and tocopherols. The retention and loss of sinapic acid derivatives and tocopherols in commercially produced expeller-pressed canola oils subjected to various refining steps and the respective byproducts were investigated. Loss of canolol (3) and tocopherols were observed during bleaching (84.9%) and deodorization (37.6%), respectively. Sinapic acid (2) (42.9 μg/g), sinapine (1) (199 μg/g), and canolol (344 μg/g) were found in the refining byproducts, namely, soap stock, spent bleaching clay, and wash water, for the first time. Tocopherols (3.75 mg/g) and other nonidentified phenolic compounds (2.7 mg sinapic acid equivalent/g) were found in deodistillates, a byproduct of deodorization. DPPH radical scavenging confirmed the antioxidant potential of the byproducts. This study confirms the value-added potential of byproducts of refining as sources of endogenous phenolics.

  20. Encounters in an online brand community: development and validation of a metric for value co-creation by customers.

    PubMed

    Hsieh, Pei-Ling

    2015-05-01

    Recent developments in service marketing have demonstrated the potential value co-creation by customers who participate in online brand communities (OBCs). Therefore, this study forecasts the co-created value by understanding the participation/behavior of customers in a multi-stakeholder OBC. This six-phase qualitative and quantitative investigation conceptualizes, constructs, refines, and tests a 12-item three-dimensional scale for measuring key factors that are related to the experience, interpersonal interactions, and social relationships that affect the value co-creation by customers in an OBC. The scale captures stable psychometric properties, measured using various reliability and validity tests, and can be applied across various industries. Finally, the utility implications and limitations of the proposed scale are discussed, and potential future research directions considered.

  1. Fully automatic hp-adaptivity for acoustic and electromagnetic scattering in three dimensions

    NASA Astrophysics Data System (ADS)

    Kurtz, Jason Patrick

    We present an algorithm for fully automatic hp-adaptivity for finite element approximations of elliptic and Maxwell boundary value problems in three dimensions. The algorithm automatically generates a sequence of coarse grids, and a corresponding sequence of fine grids, such that the energy norm of the error decreases exponentially with respect to the number of degrees of freedom in either sequence. At each step, we employ a discrete optimization algorithm to determine the refinements for the current coarse grid such that the projection-based interpolation error for the current fine grid solution decreases with an optimal rate with respect to the number of degrees of freedom added by the refinement. The refinements are restricted only by the requirement that the resulting mesh is at most 1-irregular, but they may be anisotropic in both element size h and order of approximation p. While we cannot prove that our method converges at all, we present numerical evidence of exponential convergence for a diverse suite of model problems from acoustic and electromagnetic scattering. In particular we show that our method is well suited to the automatic resolution of exterior problems truncated by the introduction of a perfectly matched layer. To enable and accelerate the solution of these problems on commodity hardware, we include a detailed account of three critical aspects of our implementation, namely an efficient implementation of sum factorization, several efficient interfaces to the direct multi-frontal solver MUMPS, and some fast direct solvers for the computation of a sequence of nested projections.

  2. Functional analysis of iodotyrosine deiodinase from drosophila melanogaster

    PubMed Central

    Phatarphekar, Abhishek

    2016-01-01

    The flavoprotein iodotyrosine deiodinase (IYD) was first discovered in mammals through its ability to salvage iodide from mono‐ and diiodotyrosine, the by‐products of thyroid hormone synthesis. Genomic information indicates that invertebrates contain homologous enzymes although their iodide requirements are unknown. The catalytic domain of IYD from Drosophila melanogaster has now been cloned, expressed and characterized to determine the scope of its potential catalytic function as a model for organisms that are not associated with thyroid hormone production. Little discrimination between iodo‐, bromo‐, and chlorotyrosine was detected. Their affinity for IYD ranges from 0.46 to 0.62 μM (Kd) and their efficiency of dehalogenation ranges from 2.4–9 × 10³ M⁻¹ s⁻¹ (kcat/Km). These values fall within the variations described for IYDs from other organisms for which a physiological function has been confirmed. The relative contribution of three active site residues that coordinate to the amino acid substrates was subsequently determined by mutagenesis of IYD from Drosophila to refine future annotations of genomic and meta‐genomic data for dehalogenation of halotyrosines. Substitution of the active site glutamate to glutamine was most detrimental to catalysis. Alternative substitution of an active site lysine to glutamine affected substrate affinity to the greatest extent but only moderately affected catalytic turnover. Substitution of phenylalanine for an active site tyrosine was least perturbing for binding and catalysis. PMID:27643701
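
    The kinetic constants quoted above have their usual Michaelis-Menten meaning; kcat/Km is the specificity constant that governs turnover at substrate concentrations well below Km:

    ```latex
    % Michaelis-Menten rate law and the low-substrate limit set by the specificity constant:
    v = \frac{k_{\mathrm{cat}}\,[\mathrm{E}]_{0}\,[\mathrm{S}]}{K_{m} + [\mathrm{S}]},
    \qquad
    v \approx \frac{k_{\mathrm{cat}}}{K_{m}}\,[\mathrm{E}]_{0}\,[\mathrm{S}]
    \quad \text{for } [\mathrm{S}] \ll K_{m}
    ```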

  3. Refinement of elastic, poroelastic, and osmotic tissue properties of intervertebral disks to analyze behavior in compression.

    PubMed

    Stokes, Ian A F; Laible, Jeffrey P; Gardner-Morse, Mack G; Costi, John J; Iatridis, James C

    2011-01-01

    Intervertebral disks support compressive forces because of their elastic stiffness as well as the fluid pressures resulting from poroelasticity and the osmotic (swelling) effects. Analytical methods can quantify the relative contributions, but only if correct material properties are used. To identify appropriate tissue properties, an experimental study and finite element analytical simulation of poroelastic and osmotic behavior of intervertebral disks were combined to refine published values of disk and endplate properties to optimize model fit to experimental data. Experimentally, nine human intervertebral disks with adjacent hemi-vertebrae were immersed sequentially in saline baths having concentrations of 0.015, 0.15, and 1.5 M and the loss of compressive force at constant height (force relaxation) was recorded over several hours after equilibration to a 300-N compressive force. Amplitude and time constant terms in exponential force-time curve-fits for experimental and finite element analytical simulations were compared. These experiments and finite element analyses provided data dependent on poroelastic and osmotic properties of the disk tissues. The sensitivities of the model to alterations in tissue material properties were used to obtain refined values of five key material parameters. The relaxation of the force in the three bath concentrations was exponential in form, expressed as mean compressive force loss of 48.7, 55.0, and 140 N, respectively, with time constants of 1.73, 2.78, and 3.40 h. This behavior was analytically well represented by a model having poroelastic and osmotic tissue properties with published tissue properties adjusted by multiplying factors between 0.55 and 2.6. Force relaxation and time constants from the analytical simulations were most sensitive to values of fixed charge density and endplate porosity.
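
    The amplitude and time-constant terms described above correspond to an exponential force-time model. A single-exponential fit with synthetic example data is sketched below; the study's actual fitting protocol (number of terms, software) may differ.

    ```python
    # Sketch of a single-exponential force-relaxation fit of the kind described above;
    # the time points and forces are synthetic example values, not the study's data.
    import numpy as np
    from scipy.optimize import curve_fit

    def relaxation(t, f_eq, delta_f, tau):
        """Force at constant height: equilibrium force plus a decaying exponential."""
        return f_eq + delta_f * np.exp(-t / tau)

    t_hours = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])              # example time points (h)
    force_n = np.array([300.0, 278.0, 265.0, 252.0, 246.0, 245.0])  # example forces (N)
    params, _ = curve_fit(relaxation, t_hours, force_n, p0=(245.0, 55.0, 2.0))
    f_eq, delta_f, tau = params
    print(f"amplitude ~ {delta_f:.1f} N, time constant ~ {tau:.2f} h")
    ```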

  4. Refinement of Elastic, Poroelastic, and Osmotic Tissue Properties of Intervertebral Disks to Analyze Behavior in Compression

    PubMed Central

    Stokes, Ian A. F.; Laible, Jeffrey P.; Gardner-Morse, Mack G.; Costi, John J.; Iatridis, James C.

    2011-01-01

    Intervertebral disks support compressive forces because of their elastic stiffness as well as the fluid pressures resulting from poroelasticity and the osmotic (swelling) effects. Analytical methods can quantify the relative contributions, but only if correct material properties are used. To identify appropriate tissue properties, an experimental study and finite element analytical simulation of poroelastic and osmotic behavior of intervertebral disks were combined to refine published values of disk and endplate properties to optimize model fit to experimental data. Experimentally, nine human intervertebral disks with adjacent hemi-vertebrae were immersed sequentially in saline baths having concentrations of 0.015, 0.15, and 1.5 M and the loss of compressive force at constant height (force relaxation) was recorded over several hours after equilibration to a 300-N compressive force. Amplitude and time constant terms in exponential force–time curve-fits for experimental and finite element analytical simulations were compared. These experiments and finite element analyses provided data dependent on poroelastic and osmotic properties of the disk tissues. The sensitivities of the model to alterations in tissue material properties were used to obtain refined values of five key material parameters. The relaxation of the force in the three bath concentrations was exponential in form, expressed as mean compressive force loss of 48.7, 55.0, and 140 N, respectively, with time constants of 1.73, 2.78, and 3.40 h. This behavior was analytically well represented by a model having poroelastic and osmotic tissue properties with published tissue properties adjusted by multiplying factors between 0.55 and 2.6. Force relaxation and time constants from the analytical simulations were most sensitive to values of fixed charge density and endplate porosity. PMID:20711754

  5. 2.2 Å resolution structure analysis of two refined N-acetylneuraminyl-lactose–wheat germ agglutinin isolectin complexes.

    PubMed

    Wright, C S

    1990-10-20

    The crystal structures of complexes of isolectins 1 and 2 of wheat germ agglutinin (WGA1 and WGA2) with N-acetylneuraminyl-lactose (NeuNAc-alpha(2-3)-Gal-beta(1-4)-Glc) have been refined on the basis of data in the 8 to 2.2 Å resolution range to final crystallographic R-factors of 17.2% and 15.3% (Fo > 1σ), respectively. Specific binding interactions and water association, as well as changes in conformation and mobility of the structure upon ligand binding, were compared in the two complexes. The temperature factors (B = 16.3 Å² and 18.4 Å²) were found to be much lower compared with those of their respective native structures (19 to 22 Å²). Residues involved in sugar binding, dimerization and in lattice contacts exhibit the largest decreases in B-value, suggesting that sugar binding reduces the overall mobility of the protein molecules in the crystal lattice. The binding mode of this sialyl-trisaccharide, an important cell receptor analogue, has been compared in the two isolectins. Only one of the two unique binding sites (4 per dimer), located in the subunit/subunit interface, is occupied in the crystals. This site, termed the "primary" binding site, contains one of the five amino acid substitutions that differentiate WGA1 and WGA2. Superposition of the refined models in each of the independent crystallographic environments indicates a close match only of the terminal non-reducing NeuNAc residue (root-mean-square delta r of 0.5 to 0.6 Å). The Gal-Glc portion was found to superimpose poorly, lack electron density, and possess high atomic thermal factors. In both complexes NeuNAc is stabilized through contact with six amino acid side-chains (Ser114 and Glu115 of subunit 1 and Ser62, Tyr64, Tyr(His)66 and Tyr73 of subunit 2), involving all NeuNAc ring substituents. Refinement has allowed accurate assessment of the contact distances for four hydrogen bonds, a strong buried non-polar contact with the acetamido CH3 group and a large number of van der Waals' interactions with the three aromatic side-chains. The higher affinity of N-acetylneuraminyl-lactose observed by nuclear magnetic resonance studies for WGA1 can be explained by the more favorable binding interactions that occur when residue 66 is a Tyr. The tyrosyl side-chain provides a larger surface for van der Waals' stacking against the NeuNAc pyranose ring than His66 and a hydrogen bond contact with Gal (C2-OH), not possible in WGA2. (ABSTRACT TRUNCATED AT 400 WORDS)

  6. Design of Raft Foundations for High-Rise Buildings on Jointed Rock

    NASA Astrophysics Data System (ADS)

    Justo, J. L.; García-Núñez, J.-C.; Vázquez-Boza, M.; Justo, E.; Durand, P.; Azañón, J. M.

    2014-07-01

    This paper presents calculations of displacements and bending moments in a 2-m-thick reinforced-concrete foundation slab using three-dimensional finite-element software. A preliminary paper was presented by Justo et al. (Rock Mech Rock Eng 43:287-304, 2010). The slab is the base of a tower of 137 m height above foundation, supported on jointed and partly weathered basalt and scoria. Installation of rod extensometers at different depths below foundation allowed comparison between measured displacements and displacements calculated using moduli obtained from rock classification systems and three material models: elastic, Mohr-Coulomb and hardening (H). Although all three material models can provide acceptable results, the H model is preferable when there are unloading processes. Acceptable values of settlement may be achieved with medium meshing and an approximate distribution of loads. The absolute values of negative bending moments (tensions below) increase as the rock mass modulus decreases or when the mesh is refined. The paper stresses the importance of adequately representing the details of the distribution of loads and the necessity for fine meshing to obtain acceptable values of bending moments.

  7. Computation and visualization of geometric partial differential equations

    NASA Astrophysics Data System (ADS)

    Tiee, Christopher L.

    The chief goal of this work is to explore a modern framework for the study and approximation of partial differential equations, recast common partial differential equations into this framework, and prove theorems about such equations and their approximations. A central motivation is to recognize and respect the essential geometric nature of such problems, and take it into consideration when approximating. The hope is that this process will lead to the discovery of more refined algorithms and processes and apply them to new problems. In the first part, we introduce our quantities of interest and reformulate traditional boundary value problems in the modern framework. We see how Hilbert complexes capture and abstract the most important properties of such boundary value problems, leading to generalizations of important classical results such as the Hodge decomposition theorem. They also provide the proper setting for numerical approximations. We also provide an abstract framework for evolution problems in these spaces: Bochner spaces. We next turn to approximation. We build layers of abstraction, progressing from functions, to differential forms, and finally, to Hilbert complexes. We explore finite element exterior calculus (FEEC), which allows us to approximate solutions involving differential forms, and analyze the approximation error. In the second part, we prove our central results. We first prove an extension of current error estimates for the elliptic problem in Hilbert complexes. This extension handles solutions with nonzero harmonic part. Next, we consider evolution problems in Hilbert complexes and prove abstract error estimates. We apply these estimates to the problem for Riemannian hypersurfaces in R^{n+1}, generalizing current results for open subsets of R^n. Finally, we apply some of the concepts to a nonlinear problem, the Ricci flow on surfaces, and use tools from nonlinear analysis to help develop and analyze the equations. In the appendices, we detail some additional motivation and a source for further examples: canonical geometries that are realized as steady-state solutions to parabolic equations similar to that of Ricci flow. An eventual goal is to compute such solutions using the methods of the previous chapters.
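
    For reference only (a standard textbook statement, not a result quoted from this thesis), the classical Hodge decomposition that the Hilbert-complex framework generalizes reads:

```latex
% Classical Hodge decomposition on a compact, oriented Riemannian manifold M
% without boundary; the direct sum is orthogonal in L^2.
\[
  \Omega^k(M) \;=\; d\,\Omega^{k-1}(M) \;\oplus\; \delta\,\Omega^{k+1}(M) \;\oplus\; \mathcal{H}^k(M),
  \qquad
  \mathcal{H}^k(M) = \{\,\omega \in \Omega^k(M) : d\omega = 0,\ \delta\omega = 0\,\}.
\]
```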

  8. Three-dimensional local grid refinement for block-centered finite-difference groundwater models using iteratively coupled shared nodes: A new method of interpolation and analysis of errors

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2004-01-01

    This paper describes work that extends to three dimensions the two-dimensional local-grid refinement method for block-centered finite-difference groundwater models of Mehl and Hill [Development and evaluation of a local grid refinement method for block-centered finite-difference groundwater models using shared nodes. Adv Water Resour 2002;25(5):497-511]. In this approach, the (parent) finite-difference grid is discretized more finely within a (child) sub-region. The grid refinement method sequentially solves each grid and uses specified flux (parent) and specified head (child) boundary conditions to couple the grids. Iteration achieves convergence between heads and fluxes of both grids. Of most concern is how to interpolate heads onto the boundary of the child grid such that the physics of the parent-grid flow is retained in three dimensions. We develop a new two-step, "cage-shell" interpolation method based on the solution of the flow equation on the boundary of the child between nodes shared with the parent grid. Error analysis using a test case indicates that the shared-node local grid refinement method with cage-shell boundary head interpolation is accurate and robust, and the resulting code is used to investigate three-dimensional local grid refinement of stream-aquifer interactions. Results reveal that (1) the parent and child grids interact to shift the true head and flux solution to a different solution where the heads and fluxes of both grids are in equilibrium, (2) the locally refined model provided a solution for both heads and fluxes in the region of the refinement that was more accurate than a model without refinement only if iterations are performed so that both heads and fluxes are in equilibrium, and (3) the accuracy of the coupling is limited by the parent-grid size - A coarse parent grid limits correct representation of the hydraulics in the feedback from the child grid.

  9. Refinement Of Hexahedral Cells In Euler Flow Computations

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Cappuccio, Gelsomina; Thomas, Scott D.

    1996-01-01

    Topologically Independent Grid, Euler Refinement (TIGER) computer program solves Euler equations of three-dimensional, unsteady flow of inviscid, compressible fluid by numerical integration on unstructured hexahedral coordinate grid refined where necessary to resolve shocks and other details. Hexahedral cells subdivided, each into eight smaller cells, as needed to refine computational grid in regions of high flow gradients. Grid Interactive Refinement and Flow-Field Examination (GIRAFFE) computer program written in conjunction with TIGER program to display computed flow-field data and to assist researcher in verifying specified boundary conditions and refining grid.
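
    The sketch below is a hedged illustration of the kind of octree-style subdivision the record describes: each flagged hexahedral cell is split into eight children when a user-supplied gradient indicator exceeds a threshold. It is an illustration only, not the TIGER code, and all names are hypothetical.

```python
# Illustrative octree-style refinement of axis-aligned hexahedral cells: a cell is
# replaced by its eight octant children wherever the indicator exceeds a threshold.
from dataclasses import dataclass

@dataclass
class Hex:
    lo: tuple  # (x, y, z) lower corner
    hi: tuple  # (x, y, z) upper corner

def split_into_eight(cell):
    """Subdivide one hexahedral cell into its eight octant children."""
    mid = tuple((l + h) / 2.0 for l, h in zip(cell.lo, cell.hi))
    children = []
    for ix in (0, 1):
        for iy in (0, 1):
            for iz in (0, 1):
                lo = tuple(cell.lo[d] if b == 0 else mid[d]
                           for d, b in enumerate((ix, iy, iz)))
                hi = tuple(mid[d] if b == 0 else cell.hi[d]
                           for d, b in enumerate((ix, iy, iz)))
                children.append(Hex(lo, hi))
    return children

def refine(cells, indicator, threshold):
    """One refinement sweep: replace flagged cells by their eight children."""
    out = []
    for cell in cells:
        if indicator(cell) > threshold:
            out.extend(split_into_eight(cell))
        else:
            out.append(cell)
    return out

# Example: a unit cell refined once by a toy size-based indicator.
root = Hex((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
cells = refine([root], indicator=lambda c: c.hi[0] - c.lo[0], threshold=0.5)
print(len(cells))  # 8
```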

  10. Crystal structure and magnetic properties of 'α″-Fe16N2' containing residual α-Fe prepared by low-temperature ammonia nitridation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, S.; Masubuchi, Y.; Nakazawa, Y.

    2012-10-15

    Slight enhancement of saturation magnetization to 219 A m² kg⁻¹ was observed, from 199 A m² kg⁻¹ for the original α-Fe, on the intermediate nitrided mixture of 'α″-Fe16N2' with residual α-Fe among the low-temperature ammonia nitridation products, under a 5 T magnetic field at room temperature. The value did not change linearly with the yield, as had been expected. Crystal structure refinement indicated that the phase similar to α″-Fe16N2 had deviations in its lattice constants and positional parameters compared to previously reported values for α″-Fe16N2. Spin-polarized total energy calculations were performed using the projector-augmented wave method as implemented in the Vienna ab-initio simulation package (VASP) to calculate the magnetic moment on the refined crystal structure of the intermediate 'α″-Fe16N2'. The calculations supported the observed magnetization enhancement in the intermediate nitridation product. - Graphical abstract: Crystal structural parameters slightly change in the intermediate nitrided 'α″-Fe16N2' from those in α″-Fe16N2 to show the magnetization maxima in the mixture of 'α″-Fe16N2' and the residual α-Fe. Highlights: • Larger magnetization was observed than the value of Fe16N2 on its intermediate nitrided mixture with residual α-Fe. • The enhancement was related to the crystal structural deviation from Fe16N2 on the intermediate nitride. • It was supported by spin-polarized total energy calculation using the deviated structure.

  11. 76 FR 10669 - Proposed Collection; Comment Request for Form 8896

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-25

    ... 8896, Low Sulfur Diesel Fuel Production Credit. DATES: Written comments should be received on or before...: Title: Low Sulfur Diesel Fuel Production Credit. OMB Number: 1545-1914. Form Number: 8896. Abstract: IRC section 45H allows small business refiners to claim a credit for the production of low sulfur diesel fuel...

  12. Continuous turbidity monitoring in streams of northwestern California

    Treesearch

    Rand Eads; Jack Lewis

    2002-01-01

    Abstract - Redwood Sciences Laboratory, a field office of the USDA Forest Service, Pacific Southwest Research Station has developed and refined methods and instrumentation to monitor turbidity and suspended sediment in streams of northern California since 1996. Currently we operate 21 stations and have provided assistance in the installation of 6 gaging stations for...

  13. Increasing conclusiveness of clinical breath analysis by improved baseline correction of multi capillary column - ion mobility spectrometry (MCC-IMS) data.

    PubMed

    Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C

    2016-08-05

    Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
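
    As a hedged illustration of a Top-hat style baseline correction (the general morphological technique named in the abstract, not the authors' implementation), a one-dimensional spectrum can be corrected by subtracting a grey-scale morphological opening from the signal; the window size below is an assumption.

```python
# Top-hat baseline correction sketch: estimate the baseline with a morphological
# opening whose structuring element is wider than the peaks, then subtract it.
import numpy as np
from scipy.ndimage import grey_opening

def tophat_baseline_correction(spectrum, window=51):
    """Return the spectrum with its slowly varying baseline removed."""
    baseline = grey_opening(spectrum, size=window)
    return spectrum - baseline

# Illustrative synthetic spectrum: two narrow peaks on a drifting baseline.
x = np.linspace(0.0, 10.0, 1000)
peaks = np.exp(-((x - 3.0) ** 2) / 0.01) + 0.5 * np.exp(-((x - 7.0) ** 2) / 0.02)
baseline = 0.2 + 0.05 * x
corrected = tophat_baseline_correction(peaks + baseline, window=101)
print(round(float(corrected.min()), 3))  # residual baseline is close to zero
```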

  14. How do Medical Societies Select Science for Conference Presentation? How Should They?

    PubMed Central

    Kuczmarski, Thomas M.; Raja, Ali S.; Pallin, Daniel J.

    2015-01-01

    Introduction Nothing has been published to describe the practices of medical societies in choosing abstracts for presentations at their annual meetings. We surveyed medical societies to determine their practices, and also present a theoretical analysis of the topic. Methods We contacted a convenience sample of large U.S. medical conferences, and determined their approach to choosing abstracts. We obtained information from web sites, telephone, and email. Our theoretical analysis compares values-based and empirical approaches for scoring system development. Results We contacted 32 societies and obtained data on 28 (response rate 88%). We excluded one upon learning that research was not presented at its annual meeting, leaving 27 for analysis. Only 2 (7%) made their abstract scoring process available to submitters. Reviews were blinded in most societies (21;78%), and all but one asked reviewers to recuse themselves for conflict of interest (96%). All required ≥3 reviewers. Of the 24 providing information on how scores were generated, 21 (88%) reported using a single gestalt score, and three used a combined score created from pooled domain-specific sub-scores. We present a framework for societies to use in choosing abstracts, and demonstrate its application in the development of a new scoring system. Conclusions Most medical societies use subjective, gestalt methods to select research for presentation at their annual meetings and do not disclose to submitters the details of how abstracts are chosen. We present a new scoring system that is transparent to submitters and reviewers alike with an accompanying statement of values and ground rules. We discuss the challenges faced in selecting abstracts for a large scientific meeting and share the values and practical considerations that undergird the new system. PMID:26265966

  15. How do Medical Societies Select Science for Conference Presentation? How Should They?

    PubMed

    Kuczmarski, Thomas M; Raja, Ali S; Pallin, Daniel J

    2015-07-01

    Nothing has been published to describe the practices of medical societies in choosing abstracts for presentations at their annual meetings. We surveyed medical societies to determine their practices, and also present a theoretical analysis of the topic. We contacted a convenience sample of large U.S. medical conferences, and determined their approach to choosing abstracts. We obtained information from web sites, telephone, and email. Our theoretical analysis compares values-based and empirical approaches for scoring system development. We contacted 32 societies and obtained data on 28 (response rate 88%). We excluded one upon learning that research was not presented at its annual meeting, leaving 27 for analysis. Only 2 (7%) made their abstract scoring process available to submitters. Reviews were blinded in most societies (21;78%), and all but one asked reviewers to recuse themselves for conflict of interest (96%). All required ≥3 reviewers. Of the 24 providing information on how scores were generated, 21 (88%) reported using a single gestalt score, and three used a combined score created from pooled domain-specific sub-scores. We present a framework for societies to use in choosing abstracts, and demonstrate its application in the development of a new scoring system. Most medical societies use subjective, gestalt methods to select research for presentation at their annual meetings and do not disclose to submitters the details of how abstracts are chosen. We present a new scoring system that is transparent to submitters and reviewers alike with an accompanying statement of values and ground rules. We discuss the challenges faced in selecting abstracts for a large scientific meeting and share the values and practical considerations that undergird the new system.
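
    To make the contrast concrete, a combined score built from pooled domain-specific sub-scores might be computed as in the sketch below; the domains, weights, and reviewer scores are invented for illustration and are not taken from any society's system.

```python
# Hypothetical combined abstract score: average each scoring domain across
# reviewers, then form a weighted sum, rather than using a single gestalt rating.
def combined_score(reviews, weights):
    """Pool each domain across reviewers, then weight and sum the pooled values."""
    pooled = {d: sum(r[d] for r in reviews) / len(reviews) for d in weights}
    return sum(weights[d] * pooled[d] for d in weights)

reviews = [
    {"novelty": 4, "methods": 3, "relevance": 5},
    {"novelty": 3, "methods": 4, "relevance": 4},
    {"novelty": 5, "methods": 3, "relevance": 4},
]
weights = {"novelty": 0.4, "methods": 0.4, "relevance": 0.2}
print(round(combined_score(reviews, weights), 2))  # 3.8
```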

  16. Effects of varying refiner pressure on the mechanical properties of loblolly pine fibres

    Treesearch

    Les Groom; Timothy Rials; Rebecca Snell

    2000-01-01

    Loblolly pine chips, separated into mature and juvenile portions, were refined at three pressures (4, 8, and 12 bar) in a single disc refiner at the BioComposites Centre. Fibres were dried in a flash drier to a moisture content of approximately 12 percent. The mechanical properties of single fibres from each refining pressure were determined using a tensile strength...

  17. Management of soybean oil refinery wastes through recycling them for producing biosurfactant using Pseudomonas aeruginosa MR01.

    PubMed

    Partovi, Maryam; Lotfabad, Tayebe Bagheri; Roostaazad, Reza; Bahmaei, Manochehr; Tayyebi, Shokoufe

    2013-06-01

    Biosurfactant production through a fermentation process involving the biodegradation of soybean oil refining wastes was studied. Pseudomonas aeruginosa MR01 was able to produce extracellular biosurfactant when it was cultured in three soybean oil refinement wastes: acid oil, deodorizer distillate and soapstock, at different carbon to nitrogen ratios. Subsequent fermentation kinetics in the three types of waste culture were also investigated and compared with kinetic behavior in soybean oil medium. Biodegradation of wastes, biosurfactant production, biomass growth, nitrate consumption and the number of colony forming units were detected in four proposed media, at specified time intervals. Unexpectedly, wastes could stimulate the biodegradation activity of MR01 bacterial cells and thus biosurfactant synthesis beyond that of the refined soybean oil. This is evident from higher yields of biodegradation and production, as revealed in the waste cultures (Y_deg(soybean oil) = 53.9% < Y_deg(wastes), and Y_P/S(wastes) > Y_P/S(soybean oil) = 0.31 g g⁻¹, respectively). Although production yields were approximately the same in the three waste cultures (Y_P/S(wastes) ≈ 0.5 g g⁻¹), microbial activity resulted in higher yields of biodegradation (96.5 ± 1.13%), maximum specific growth rate (μ_max = 0.26 ± 0.02 h⁻¹), and biosurfactant purity (89.6%), with a productivity of 14.55 ± 1.10 g l⁻¹, during the bioconversion of soapstock into biosurfactant. Consequently, applying soybean oil soapstock as a substrate for the production of biosurfactant with commercial value has the potential to provide a combination of economical production with environmental protection through the biosynthesis of an environmentally friendly (green) compound and reduction of waste load entering the environment. Moreover, this work showed spectrophotometry to be an easy method to detect rhamnolipids in the biosurfactant products.

  18. Pharmacophore Refinement Guides the Design of Nanomolar-Range Botulinum Neurotoxin Serotype A Light Chain Inhibitors

    PubMed Central

    2010-01-01

    Botulinum neurotoxins (BoNTs) are the deadliest of microbial toxins. The enzymes’ zinc(II) metalloprotease, referred to as the light chain (LC) component, inhibits acetylcholine release into neuromuscular junctions, resulting in the disease botulism. Currently, no therapies counter BoNT poisoning postneuronal intoxication; however, it is hypothesized that small molecules may be used to inhibit BoNT LC activity in the neuronal cytosol. Herein, we describe the pharmacophore-based design and chemical synthesis of potent [non-zinc(II) chelating] small molecule (nonpeptidic) inhibitors (SMNPIs) of the BoNT serotype A LC (the most toxic of the BoNT serotype LCs). Specifically, the three-dimensional superimpositions of 2-[4-(4-amidinephenoxy)phenyl]indole-6-amidine-based SMNPI regioisomers [Ki = 0.600 μM (±0.100 μM)], with a novel lead bis-[3-amide-5-(imidazolino)phenyl]terephthalamide (BAIPT)-based SMNPI [Ki = 8.52 μM (±0.53 μM)], resulted in a refined four-zone pharmacophore. The refined model guided the design of BAIPT-based SMNPIs possessing Ki values = 0.572 (±0.041 μM) and 0.900 μM (±0.078 μM). PMID:21116458

  19. Effects of different annealing atmospheres on the properties of cadmium sulfide thin films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yücel, E., E-mail: dr.ersinyucel@gmail.com; Kahraman, S.; Güder, H.S.

    2015-08-15

    Graphical abstract: The effects of different annealing atmospheres (air and sulfur) on the structural, morphological and optical properties of CdS thin films were studied at three different pH values. - Highlights: • Compactness and smoothness of the films were enhanced after sulfur annealing. • Micro-strain values of some films were improved after sulfur annealing. • Dislocation density values of some films were improved after sulfur annealing. • Band gap values of the films were improved after sulfur annealing. - Abstract: Cadmium sulfide (CdS) thin films were prepared on glass substrates by using the chemical bath deposition (CBD) technique. The effects of different annealing atmospheres (air and sulfur) on the structural, morphological and optical properties of CdS thin films were studied at three different pH values. Compactness and smoothness of the films (especially for pH 10.5 and 11) were enhanced after sulfur annealing. The pH value of the precursor solution remarkably affected the roughness, uniformity and particle sizes of the films. Based on the analysis of X-ray diffraction (XRD) patterns of the films, micro-strain and dislocation density values of the sulfur-annealed films (pH 10.5 and 11) were found to be lower than those of air-annealed films. Air-annealed films (pH 10.5, 11 and 11.5) exhibited higher transmittance than sulfur-annealed films in the wavelength region of 550–800 nm. Optical band gap values of the films were found between 2.31 eV and 2.36 eV.

  20. Nutrient and suspended solids removal from petrochemical wastewater via microalgal biofilm cultivation.

    PubMed

    Hodges, Alan; Fica, Zachary; Wanlass, Jordan; VanDarlin, Jessica; Sims, Ronald

    2017-05-01

    Wastewater derived from petroleum refining currently accounts for 33.6 million barrels per day globally. Few wastewater treatment strategies exist to produce value-added products from petroleum refining wastewater. In this study, mixed culture microalgal biofilm-based treatment of petroleum refining wastewater using rotating algae biofilm reactors (RABRs) was compared with suspended-growth open pond lagoon reactors for removal of nutrients and suspended solids. Triplicate reactors were operated for 12 weeks and were continuously fed with petroleum refining wastewater. Effluent wastewater was monitored for nitrogen, phosphorus, total suspended solids (TSS), and chemical oxygen demand (COD). RABR treatment demonstrated a statistically significant increase in removal of nutrients and suspended solids, and increase in biomass productivity, compared to the open pond lagoon treatment. These trends translate to a greater potential for the production of biomass-based fuels, feed, and fertilizer as value-added products. This study is the first demonstration of the cultivation of mixed culture biofilm microalgae on petroleum refining wastewater for the dual purposes of treatment and biomass production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Deployment of e-health services - a business model engineering strategy.

    PubMed

    Kijl, Björn; Nieuwenhuis, Lambert J M; Huis in 't Veld, Rianne M H A; Hermens, Hermie J; Vollenbroek-Hutten, Miriam M R

    2010-01-01

    We designed a business model for deploying a myofeedback-based teletreatment service. An iterative and combined qualitative and quantitative action design approach was used for developing the business model and the related value network. Insights from surveys, desk research, expert interviews, workshops and quantitative modelling were combined to produce the first business model and then to refine it in three design cycles. The business model engineering strategy provided important insights which led to an improved, more viable and feasible business model and related value network design. Based on this experience, we conclude that the process of early stage business model engineering reduces risk and produces substantial savings in costs and resources related to service deployment.

  2. Development of Phenol-Enriched Olive Oil with Phenolic Compounds Extracted from Wastewater Produced by Physical Refining.

    PubMed

    Venturi, Francesca; Sanmartin, Chiara; Taglieri, Isabella; Nari, Anita; Andrich, Gianpaolo; Terzuoli, Erika; Donnini, Sandra; Nicolella, Cristiano; Zinnai, Angela

    2017-08-22

    While in the last few years the use of olive cake and mill wastewater as natural sources of phenolic compounds has been widely considered and several studies have focused on the development of new extraction methods and on the production of functional foods enriched with natural antioxidants, no data has been available on the production of a phenol-enriched refined olive oil with its own phenolic compounds extracted from wastewater produced during physical refining. In this study, we aimed to: (i) verify the effectiveness of a multi-step extraction process to recover the high-added-value phenolic compounds contained in wastewater derived from the preliminary washing degumming step of the physical refining of vegetal oils; (ii) evaluate their potential application for the stabilization of olive oil obtained with refined olive oils; and (iii) evaluate their antioxidant activity in an in vitro model of endothelial cells. The results obtained demonstrate the potential of using the refining wastewater as a source of bioactive compounds to improve the nutraceutical value as well as the antioxidant capacity of commercial olive oils. In the conditions adopted, the phenolic content significantly increased in the prototypes of phenol-enriched olive oils when compared with the control oil.

  3. Development of Phenol-Enriched Olive Oil with Phenolic Compounds Extracted from Wastewater Produced by Physical Refining

    PubMed Central

    Taglieri, Isabella; Nari, Anita; Andrich, Gianpaolo; Terzuoli, Erika; Donnini, Sandra; Nicolella, Cristiano; Zinnai, Angela

    2017-01-01

    While in the last few years the use of olive cake and mill wastewater as natural sources of phenolic compounds has been widely considered and several studies have focused on the development of new extraction methods and on the production of functional foods enriched with natural antioxidants, no data has been available on the production of a phenol-enriched refined olive oil with its own phenolic compounds extracted from wastewater produced during physical refining. In this study, we aimed to: (i) verify the effectiveness of a multi-step extraction process to recover the high-added-value phenolic compounds contained in wastewater derived from the preliminary washing degumming step of the physical refining of vegetal oils; (ii) evaluate their potential application for the stabilization of olive oil obtained with refined olive oils; and (iii) evaluate their antioxidant activity in an in vitro model of endothelial cells. The results obtained demonstrate the potential of using the refining wastewater as a source of bioactive compounds to improve the nutraceutical value as well as the antioxidant capacity of commercial olive oils. In the conditions adopted, the phenolic content significantly increased in the prototypes of phenol-enriched olive oils when compared with the control oil. PMID:28829365

  4. Curves and Surfaces

    DTIC Science & Technology

    1990-01-01

    Morten Dohlen, Center for Industrial Research (SI), Box 124 Blindern, 0314 Oslo 3, Norway. Abstract. The combination of refinement and decomposition ... of Technology, Faculty of Industrial Design Engineering, Section Mechanical Engineering Design, Jaffalaan 9, NL-2628 BX Delft, The Netherlands ... OF A GIVEN SET OF POINTS, Leonardo Traversoni Dominguez, Division de Ciencias Basicas e Ingenieria, Universidad Autonoma Metropolitana (Iztapalapa), ap

  5. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the stiffness matrix trace (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.
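
    A minimal sketch of the kind of refinement decision such criteria drive is given below, assuming a per-element error indicator (for example, strain-energy based) is already available; the threshold rule and names are hypothetical, not the paper's procedure.

```python
# Hypothetical h-refinement selection: flag elements whose error indicator exceeds
# a fixed fraction of the largest indicator value in the mesh.
def select_elements_for_refinement(element_errors, fraction=0.5):
    """Return indices of elements whose indicator exceeds `fraction` * max indicator."""
    threshold = fraction * max(element_errors)
    return [i for i, err in enumerate(element_errors) if err > threshold]

# Example: per-element strain-energy error indicators from a previous solve.
errors = [0.02, 0.15, 0.40, 0.05, 0.38, 0.11]
print(select_elements_for_refinement(errors))  # -> [2, 4]
```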

  6. Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Abasolo, Daniel; Escudero, Javier

    2017-12-01

    We propose a novel complexity measure to overcome the deficiencies of the widespread and powerful multiscale entropy (MSE), including that MSE values may be undefined for short signals and that MSE is slow for real-time applications. We introduce multiscale dispersion entropy (MDE) as a very fast and powerful method to quantify the complexity of signals. MDE is based on our recently developed dispersion entropy (DisEn), which has a computation cost of O(N), compared with O(N²) for the sample entropy used in MSE. We also propose the refined composite MDE (RCMDE) to improve the stability of MDE. We evaluate MDE, RCMDE, and refined composite MSE (RCMSE) on synthetic signals and three biomedical datasets. The MDE, RCMDE, and RCMSE methods show similar results, although the MDE and RCMDE are faster, lead to more stable results, and discriminate different types of physiological signals better than MSE and RCMSE. For noisy short and long time series, MDE and RCMDE are noticeably more stable than MSE and RCMSE, respectively. For short signals, MDE and RCMDE, unlike MSE and RCMSE, do not lead to undefined values. The proposed MDE and RCMDE are significantly faster than MSE and RCMSE, especially for long signals, and lead to larger differences between physiological conditions known to alter the complexity of the physiological recordings. MDE and RCMDE are expected to be useful for the analysis of physiological signals thanks to their ability to distinguish different types of dynamics. The MATLAB codes used in this paper are freely available at http://dx.doi.org/10.7488/ds/1982.
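
    A simplified re-implementation of the refined composite coarse-graining idea is sketched below; the embedding dimension, number of classes, and unit delay are assumptions, and the authors' reference MATLAB code at the DOI above remains the authoritative version.

```python
# Hedged sketch of refined composite multiscale dispersion entropy (RCMDE): pattern
# counts are pooled over all shifted coarse-grained series at a given scale before
# the Shannon entropy of the pooled relative frequencies is taken.
import math
from collections import Counter

import numpy as np
from scipy.stats import norm

def dispersion_patterns(x, m, c):
    """Map a signal to c classes via the normal CDF and return its m-length patterns."""
    z = norm.cdf(x, loc=np.mean(x), scale=np.std(x))            # map samples to (0, 1)
    classes = np.minimum(np.maximum(np.round(c * z + 0.5), 1), c).astype(int)
    return [tuple(classes[i:i + m]) for i in range(len(classes) - m + 1)]

def rcmde(x, scale, m=2, c=6):
    """RCMDE at one scale: pool dispersion-pattern counts over the `scale` shifted
    coarse-grained series, then return the Shannon entropy of the pooled frequencies."""
    x = np.asarray(x, dtype=float)
    counts = Counter()
    for k in range(scale):
        n = (len(x) - k) // scale
        coarse = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        counts.update(dispersion_patterns(coarse, m, c))
    total = sum(counts.values())
    return -sum((v / total) * math.log(v / total) for v in counts.values())

rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
print([round(rcmde(noise, s), 3) for s in (1, 2, 5)])
```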

  7. Dynamic Model of Basic Oxygen Steelmaking Process Based on Multi-zone Reaction Kinetics: Model Derivation and Validation

    NASA Astrophysics Data System (ADS)

    Rout, Bapin Kumar; Brooks, Geoff; Rhamdhani, M. Akbar; Li, Zushu; Schrama, Frank N. H.; Sun, Jianjun

    2018-04-01

    A multi-zone kinetic model coupled with a dynamic slag generation model was developed for the simulation of hot metal and slag composition during the basic oxygen furnace (BOF) operation. The three reaction zones (i) jet impact zone, (ii) slag-bulk metal zone, (iii) slag-metal-gas emulsion zone were considered for the calculation of overall refining kinetics. In the rate equations, the transient rate parameters were mathematically described as a function of process variables. A micro and macroscopic rate calculation methodology (micro-kinetics and macro-kinetics) were developed to estimate the total refining contributed by the recirculating metal droplets through the slag-metal emulsion zone. The micro-kinetics involves developing the rate equation for individual droplets in the emulsion. The mathematical models for the size distribution of initial droplets, kinetics of simultaneous refining of elements, the residence time in the emulsion, and dynamic interfacial area change were established in the micro-kinetic model. In the macro-kinetics calculation, a droplet generation model was employed and the total amount of refining by emulsion was calculated by summing the refining from the entire population of returning droplets. A dynamic FetO generation model based on oxygen mass balance was developed and coupled with the multi-zone kinetic model. The effect of post-combustion on the evolution of slag and metal composition was investigated. The model was applied to a 200-ton top blowing converter and the simulated value of metal and slag was found to be in good agreement with the measured data. The post-combustion ratio was found to be an important factor in controlling FetO content in the slag and the kinetics of Mn and P in a BOF process.

  8. X-ray structure determination at low resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunger, Axel T., E-mail: brunger@stanford.edu; Department of Molecular and Cellular Physiology, Stanford University; Department of Neurology and Neurological Sciences, Stanford University

    2009-02-01

    Refinement is meaningful even at 4 Å or lower, but with present methodologies it should start from high-resolution crystal structures whenever possible. As an example of structure determination in the 3.5–4.5 Å resolution range, crystal structures of the ATPase p97/VCP, consisting of an N-terminal domain followed by a tandem pair of ATPase domains (D1 and D2), are discussed. The structures were originally solved by molecular replacement with the high-resolution structure of the N-D1 fragment of p97/VCP, whereas the D2 domain was manually built using its homology to the D1 domain as a guide. The structure of the D2 domain alone was subsequently solved at 3 Å resolution. The refined model of D2 and the high-resolution structure of the N-D1 fragment were then used as starting models for re-refinement against the low-resolution diffraction data for full-length p97. The re-refined full-length models showed significant improvement in both secondary structure and R values. The free R values dropped by as much as 5% compared with the original structure refinements, indicating that refinement is meaningful at low resolution and that there is information in the diffraction data even at ∼4 Å resolution that objectively assesses the quality of the model. It is concluded that de novo model building is problematic at low resolution and refinement should start from high-resolution crystal structures whenever possible.

  9. The role of FDG-PET/CT in gynaecological cancers

    PubMed Central

    Cross, Susan; Flanagan, Sean; Moore, Elizabeth; Avril, Norbert

    2012-01-01

    Abstract There is now a growing body of evidence supporting the use of fluorodeoxyglucose (FDG)-positron emission tomography (PET)/computed tomography (CT) in gynaecological malignancies. Although this molecular imaging technique is becoming increasingly available, PET/CT remains an expensive imaging tool. It is essential to be familiar with the circumstances in which FDG-PET/CT can add value and contribute to patient management and indeed to know when it is unlikely to be of benefit. It is also important to understand and recognize the potential pitfalls. FDG-PET/CT has been most widely adopted for staging patients with suspected advanced disease or in suspected recurrence, offering a whole-body imaging approach. However, there is great potential for this technique to act as a predictive biomarker of response to treatment, as well as a prognostic biomarker. In addition, FDG-PET images may now be incorporated into radiotherapy planning in order to refine the delineation of dose according to metabolically active sites of disease. This article reviews the literature that provides the evidence for the use of FDG-PET in gynaecological malignancies, identifies areas of real benefit and future potential, and highlights circumstances where there is limited value. PMID:22391444

  10. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water

    PubMed Central

    Wolf, Douglas C.; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I.; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P.; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R.

    2016-01-01

    ABSTRACT The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework’s roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization. PMID:26451723

  11. Naturalistic Decision Making For Power System Operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Podmore, Robin; Robinson, Marck

    2009-06-23

    Abstract: Motivation -- As indicated by the Blackout of 2003, the North American interconnected electric system is vulnerable to cascading outages and widespread blackouts. Investigations of large scale outages often attribute the causes to the three T's: Trees, Training and Tools. A systematic approach has been developed to document and understand the mental processes that an expert power system operator uses when making critical decisions. The approach has been developed and refined as part of a capability demonstration of a high-fidelity real-time power system simulator under normal and emergency conditions. To examine naturalistic decision making (NDM) processes, transcripts of operator-to-operator conversations are analyzed to reveal and assess NDM-based performance criteria. Findings/Design -- The results of the study indicate that we can map the Situation Awareness Level of the operators at each point in the scenario. We can also identify clearly what mental models and mental simulations are being performed at different points in the scenario. As a result of this research we expect that we can identify improved training methods and improved analytical and visualization tools for power system operators. Originality/Value -- The research applies, for the first time, the concepts of Recognition Primed Decision Making, Situation Awareness Levels and Cognitive Task Analysis to training of electric power system operators. Take away message -- The NDM approach provides an ideal framework for systematic training management and mitigation to accelerate learning in team-based training scenarios with high-fidelity power grid simulators.

  12. Synthesis and crystal structure determination of yttrium ultraphosphate YP5O14

    NASA Astrophysics Data System (ADS)

    Mbarek, A.; Graia, M.; Chadeyron, G.; Zambon, D.; Bouaziz, J.; Fourati, M.

    2009-03-01

    The crystal structure of monoclinic YP5O14 (space group C2/c, a = 12.919(2) Å, b = 12.796(4) Å, c = 12.457(2) Å, β = 91.30(1)°, Z = 8) has been refined from single-crystal X-ray diffraction data. Full-matrix least-squares refinement on F² using 2249 independent reflections for 183 refinable parameters results in a final R value of 0.027 (ωR = 0.069). The structure is isotypic with HoP5O14. This structure is built up from infinite layers of PO4 tetrahedra linked through isolated YO8 polyhedra. The three-dimensional cohesion of the framework results from Y-O-P bridges. This crystal structure refinement leads to the calculated X-ray diffraction powder pattern of this monoclinic polymorph, which has been the starting point of a thorough study of the solid-state synthesis of this ultraphosphate. This investigation further leads to a better understanding of features observed during the synthesis of powdered samples. The thermal behavior of this ultraphosphate has been studied by DTA and TGA analyses. Infrared and Raman spectroscopic characterizations have been carried out on polycrystalline samples. The luminescence properties of the Eu3+ ion, incorporated in the monoclinic C2/c polymorph of YP5O14 as a local structural probe, show that in the YP5O14:5% Eu3+ sample the Eu3+ ions are distributed over the two Y3+ crystallographic sites of C2 symmetry of this structure.

  13. Error estimation and adaptive mesh refinement for parallel analysis of shell structures

    NASA Technical Reports Server (NTRS)

    Keating, Scott C.; Felippa, Carlos A.; Park, K. C.

    1994-01-01

    The formulation and application of element-level, element-independent error indicators are investigated. This research culminates in the development of an error indicator formulation which is derived based on the projection of element deformation onto the intrinsic element displacement modes. The qualifier 'element-level' means that no information from adjacent elements is used for error estimation. This property is ideally suited for obtaining error values and driving adaptive mesh refinements on parallel computers, where access to neighboring elements residing on different processors may incur significant overhead. In addition, such estimators are insensitive to the presence of physical interfaces and junctures. An error indicator qualifies as 'element-independent' when only visible quantities such as element stiffness and nodal displacements are used to quantify error. Error evaluation at the element level and element independence for the error indicator are highly desired properties for computing error in production-level finite element codes. Four element-level error indicators have been constructed. Two of the indicators are based on a variational formulation of the element stiffness and are element-dependent. Their derivations are retained for developmental purposes. The second two indicators mimic and exceed the first two in performance but require no special formulation of the element stiffness; their use in driving adaptive mesh refinement is demonstrated for two-dimensional plane-stress problems. The parallelization of substructures and adaptive mesh refinement is discussed, and the final error indicator is demonstrated on two-dimensional plane-stress and three-dimensional shell problems.

  14. A Variable Resolution Atmospheric General Circulation Model for a Megasite at the North Slope of Alaska

    NASA Astrophysics Data System (ADS)

    Dennis, L.; Roesler, E. L.; Guba, O.; Hillman, B. R.; McChesney, M.

    2016-12-01

    The Atmospheric Radiation Measurement (ARM) climate research facility has three sites located on the North Slope of Alaska (NSA): Barrow, Oliktok, and Atqasuk. These sites, in combination with one other at Toolik Lake, have the potential to become a "megasite" which would combine observational data and high resolution modeling to produce high resolution data products for the climate community. Such a data product requires high resolution modeling over the area of the megasite. We present three variable resolution atmospheric general circulation model (AGCM) configurations as potential alternatives to stand-alone high-resolution regional models. Each configuration is based on a global cubed-sphere grid with effective resolution of 1 degree, with a refinement in resolution down to 1/8 degree over an area surrounding the ARM megasite. The three grids vary in the size of the refined area, with 13k, 9k, and 7k elements. SquadGen, NCL, and GIMP are used to create the grids. Grids vary based upon the selection of areas of refinement which capture climate and weather processes that may affect a proposed NSA megasite. A smaller area of high resolution may not fully resolve climate and weather processes before they reach the NSA; however, grids with smaller areas of refinement have a significantly reduced computational cost compared with grids with larger areas of refinement. The optimal size and shape of the area of refinement for a variable resolution model at the NSA is investigated.

  15. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
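
    The overall counterexample-guided abstraction refinement (CEGAR) loop that AIR builds on can be sketched schematically as follows; the helper functions are placeholders passed in by the caller, not the AIR implementation.

```python
# Schematic CEGAR loop: model check an abstraction, check whether any abstract
# counterexample is feasible in the concrete program, and otherwise refine the
# set of predicates used to build the abstraction.
def cegar(program, property_, build_abstraction, model_check,
          is_spurious, refine_predicates, max_iterations=100):
    """Alternate abstract model checking with refinement driven by spurious
    counterexamples until a verdict is reached (or the iteration budget runs out)."""
    predicates = set()
    for _ in range(max_iterations):
        abstraction = build_abstraction(program, predicates)
        counterexample = model_check(abstraction, property_)
        if counterexample is None:
            return "property holds"
        if not is_spurious(program, counterexample):
            return ("property violated", counterexample)
        # The abstract counterexample is infeasible in the concrete program:
        # learn new predicates that rule it out and try again.
        predicates |= refine_predicates(program, counterexample)
    return "inconclusive"
```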

  16. Modeling Turbulent Combustion for Variable Prandtl and Schmidt Number

    NASA Technical Reports Server (NTRS)

    Hassan, H. A.

    2004-01-01

    This report consists of two abstracts submitted for possible presentation at the AIAA Aerospace Science Meeting to be held in January 2005. Since the submittal of these abstracts we are continuing refinement of the model coefficients derived for the case of a variable Turbulent Prandtl number. The test cases being investigated are a Mach 9.2 flow over a degree ramp and a Mach 8.2 3-D calculation of crossing shocks. We have developed an axisymmetric code for treating axisymmetric flows. In addition the variable Schmidt number formulation was incorporated in the code and we are in the process of determining the model constants.

  17. [Effects of refined konjac meal on lipid metabolism and blood viscosity of rats fed by high fat forage].

    PubMed

    Wang, Zhongxia; Yang, Lili; Liu, Hong; Tan, Xiutao

    2002-04-01

    Male SD rats were fed a high-fat diet supplemented with refined konjac meal for 6 weeks, and the effect of refined konjac meal on serum lipid peroxides (LPO) and blood viscosity was observed. The results showed that refined konjac meal can markedly decrease the serum cholesterol, triglyceride and serum LPO of rats in comparison with those of rats fed only the high-fat forage, and can at the same time elevate the high density lipoprotein-cholesterol and the high density lipoprotein-cholesterol/triglyceride value. It is also shown that refined konjac meal can decrease blood viscosity, but has no effect on forage intake and weight gain of rats.

  18. The Classification, Detection and Handling of Imperfect Theory Problems.

    DTIC Science & Technology

    1987-04-20

    Explanation-Based Learning: Failure-Driven Schema Refinement." Proceedings of the Third IEEE Conference on Artificial Intelligence Applications, Orlando. ... A. Rajamoney, Gerald F. DeJong, Artificial Intelligence Research Group, Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL 61801, April 1987. ABSTRACT: This paper also appears in the Proceedings of the Tenth International Conference on Artificial Intelligence

  19. Unstructured Euler flow solutions using hexahedral cell refinement

    NASA Technical Reports Server (NTRS)

    Melton, John E.; Cappuccio, Gelsomina; Thomas, Scott D.

    1991-01-01

    An attempt is made to extend grid refinement into three dimensions by using unstructured hexahedral grids. The flow solver is developed using TIGER (Topologically Independent Grid, Euler Refinement) as the starting point. The program uses an unstructured hexahedral mesh and a modified version of the Jameson four-stage, finite-volume Runge-Kutta algorithm for integration of the Euler equations. The unstructured mesh allows for local refinement appropriate for each freestream condition, thereby concentrating mesh cells in the regions of greatest interest. This increases the computational efficiency because the refinement is not required to extend throughout the entire flow field.

  20. Discovering and visualizing indirect associations between biomedical concepts

    PubMed Central

    Tsuruoka, Yoshimasa; Miwa, Makoto; Hamamoto, Kaisei; Tsujii, Jun'ichi; Ananiadou, Sophia

    2011-01-01

    Motivation: Discovering useful associations between biomedical concepts has been one of the main goals in biomedical text-mining, and understanding their biomedical contexts is crucial in the discovery process. Hence, we need a text-mining system that helps users explore various types of (possibly hidden) associations in an easy and comprehensible manner. Results: This article describes FACTA+, a real-time text-mining system for finding and visualizing indirect associations between biomedical concepts from MEDLINE abstracts. The system can be used as a text search engine like PubMed with additional features to help users discover and visualize indirect associations between important biomedical concepts such as genes, diseases and chemical compounds. FACTA+ inherits all functionality from its predecessor, FACTA, and extends it by incorporating three new features: (i) detecting biomolecular events in text using a machine learning model, (ii) discovering hidden associations using co-occurrence statistics between concepts, and (iii) visualizing associations to improve the interpretability of the output. To the best of our knowledge, FACTA+ is the first real-time web application that offers the functionality of finding concepts involving biomolecular events and visualizing indirect associations of concepts with both their categories and importance. Availability: FACTA+ is available as a web application at http://refine1-nactem.mc.man.ac.uk/facta/, and its visualizer is available at http://refine1-nactem.mc.man.ac.uk/facta-visualizer/. Contact: tsuruoka@jaist.ac.jp PMID:21685059

  1. 40 CFR 80.101 - Standards applicable to refiners and importers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... are produced or imported over a period no longer than one month; (iii) Uses the total of the volumes... Administrator may grant an averaging period of two, three, four or five years upon petition of a refiner who: (A... for a two or three year averaging period must be submitted by June 1, 2003. Regardless of the...

  2. 40 CFR 80.101 - Standards applicable to refiners and importers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... are produced or imported over a period no longer than one month; (iii) Uses the total of the volumes... Administrator may grant an averaging period of two, three, four or five years upon petition of a refiner who: (A... for a two or three year averaging period must be submitted by June 1, 2003. Regardless of the...

  3. 40 CFR 80.101 - Standards applicable to refiners and importers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... are produced or imported over a period no longer than one month; (iii) Uses the total of the volumes... Administrator may grant an averaging period of two, three, four or five years upon petition of a refiner who: (A... for a two or three year averaging period must be submitted by June 1, 2003. Regardless of the...

  4. Three-dimensional unstructured grid refinement and optimization using edge-swapping

    NASA Technical Reports Server (NTRS)

    Gandhi, Amar; Barth, Timothy

    1993-01-01

    This paper presents a three-dimensional (3-D) edge-swapping method based on local transformations. This method extends Lawson's edge-swapping algorithm into 3-D. The 3-D edge-swapping algorithm is employed for the purpose of refining and optimizing unstructured meshes according to arbitrary mesh-quality measures. Several criteria including Delaunay triangulations are examined. Extensions from two to three dimensions of several known properties of Delaunay triangulations are also discussed.
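
    For orientation, the two-dimensional Lawson criterion that the paper extends to three dimensions can be sketched as below: an interior edge shared by two triangles is swapped when the opposite vertex lies strictly inside the circumcircle of the adjacent triangle. The point coordinates and function names are illustrative only.

```python
# 2-D Lawson flip test: edge (a, b) shared by triangles (a, b, c) and (a, b, d)
# is swapped for edge (c, d) when d lies inside the circumcircle of (a, b, c).
import numpy as np

def in_circumcircle(a, b, c, d):
    """True if d is strictly inside the circumcircle of triangle (a, b, c),
    where (a, b, c) is given in counter-clockwise order."""
    mat = np.array([
        [a[0] - d[0], a[1] - d[1], (a[0] - d[0]) ** 2 + (a[1] - d[1]) ** 2],
        [b[0] - d[0], b[1] - d[1], (b[0] - d[0]) ** 2 + (b[1] - d[1]) ** 2],
        [c[0] - d[0], c[1] - d[1], (c[0] - d[0]) ** 2 + (c[1] - d[1]) ** 2],
    ])
    return np.linalg.det(mat) > 0.0

def should_swap_edge(a, b, c, d):
    """Lawson criterion: swap if the pair {(a, b, c), (a, b, d)} is not locally Delaunay."""
    return in_circumcircle(a, b, c, d)

# Example: a thin pair of triangles where swapping the shared edge improves quality.
a, b, c, d = (0.0, 0.0), (4.0, 0.0), (2.0, 0.5), (2.0, -0.6)
print(should_swap_edge(a, b, c, d))  # True
```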

  5. Hierarchical specification of the SIFT fault tolerant flight control system

    NASA Technical Reports Server (NTRS)

    Melliar-Smith, P. M.; Schwartz, R. L.

    1981-01-01

    The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet the objective of NASA for the reliability of safety-critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can actually be measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications from very abstract descriptions of system function down to the actual implementation is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower level models refine these specifications to the level of the actual implementation, and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.

  6. Use of the RISK21 roadmap and matrix: human health risk assessment of the use of a pyrethroid in bed netting

    PubMed Central

    Doe, John E.; Lander, Deborah R.; Doerrer, Nancy G.; Heard, Nina; Hines, Ronald N.; Lowit, Anna B.; Pastoor, Timothy; Phillips, Richard D.; Sargent, Dana; Sherman, James H.; Young Tanir, Jennifer; Embry, Michelle R.

    2016-01-01

    Abstract The HESI-coordinated RISK21 roadmap and matrix are tools that provide a transparent method to compare exposure and toxicity information and assess whether additional refinement is required to obtain the necessary precision level for a decision regarding safety. A case study of the use of a pyrethroid, “pseudomethrin,” in bed netting to control malaria is presented to demonstrate the application of the roadmap and matrix. The evaluation began with a problem formulation step. The first assessment utilized existing information pertaining to the use and the class of chemistry. At each stage of the step-wise approach, the precision of the toxicity and exposure estimates were refined as necessary by obtaining key data which enabled a decision on safety to be made efficiently and with confidence. The evaluation demonstrated the concept of using existing information within the RISK21 matrix to drive the generation of additional data using a value-of-information approach. The use of the matrix highlighted whether exposure or toxicity required further investigation and emphasized the need to address the default uncertainty factor of 100 at the highest tier of the evaluation. It also showed how new methodology such as the use of in vitro studies and assays could be used to answer the specific questions which arise through the use of the matrix. The matrix also serves as a useful means to communicate progress to stakeholders during an assessment of chemical use. PMID:26517449

  7. Identification of refined petroleum products in contaminated soils using an identification index for GC chromatograms.

    PubMed

    Kwon, Dongwook; Ko, Myoung-Soo; Yang, Jung-Seok; Kwon, Man Jae; Lee, Seung-Woo; Lee, Seunghak

    2015-08-01

    Hydrocarbons found in the environment are typically characterized by gas chromatography (GC). The shape of the GC chromatogram has been used to identify the source of petroleum contamination. However, the conventional practice of simply comparing the peak patterns of source products to those of environmental samples is dependent on the subjective decisions of individual analysts. We have developed and verified a quantitative analytical method for interpreting GC chromatograms to distinguish refined petroleum products in contaminated soils. We found that chromatograms for gasoline, kerosene, and diesel could be divided into three ranges with boundaries at C6, C8, C16, and C26. In addition, the relative peak area (RPA(GC)) of each range, a dimensionless ratio of the peak area within each range to that of the total range (C6-C26), had a unique value for each petroleum product. An identification index for GC chromatograms (ID(GC)), defined as the ratio of RPA(GC) of C8-C16 to that of C16-C26, was able to identify diesel and kerosene sources in samples extracted from artificially contaminated soils even after weathering. Thus, the ID(GC) can be used to effectively distinguish between refined petroleum products in contaminated soils.
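
    A minimal sketch of the index described above, under the assumption that a chromatogram has been reduced to (carbon number, peak area) pairs. The range boundaries (C6, C8, C16, C26) and the ratio definition follow the abstract; the helper names and the sample data are invented.

    def relative_peak_area(peaks, lo, hi, total_lo=6, total_hi=26):
        """RPA: peak area within [lo, hi) divided by the area over the total C6-C26 range."""
        in_range = sum(area for c, area in peaks if lo <= c < hi)
        total = sum(area for c, area in peaks if total_lo <= c < total_hi)
        return in_range / total

    def identification_index(peaks):
        """ID index: RPA of the C8-C16 range divided by RPA of the C16-C26 range."""
        return relative_peak_area(peaks, 8, 16) / relative_peak_area(peaks, 16, 26)

    # Hypothetical chromatogram summarised as (carbon number, area) pairs.
    sample = [(7, 1.0), (10, 4.0), (12, 5.0), (15, 3.0), (18, 2.0), (22, 1.0)]
    print(round(identification_index(sample), 2))  # larger values point toward lighter products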

  8. A multigrid method for steady Euler equations on unstructured adaptive grids

    NASA Technical Reports Server (NTRS)

    Riemslagh, Kris; Dick, Erik

    1993-01-01

    A flux-difference splitting type algorithm is formulated for the steady Euler equations on unstructured grids. The polynomial flux-difference splitting technique is used. A vertex-centered finite volume method is employed on a triangular mesh. The multigrid method is in defect-correction form. A relaxation procedure with a first-order accurate inner iteration and a second-order correction performed only on the finest grid is used. A multi-stage Jacobi relaxation method is employed as a smoother. Since the grid is unstructured, a Jacobi-type smoother is chosen. The multi-staging is necessary to provide sufficient smoothing properties. The domain is discretized using a Delaunay triangular mesh generator. Three grids with a more or less uniform distribution of nodes but with different resolution are generated by successive refinement of the coarsest grid. Nodes of coarser grids appear in the finer grids. The multigrid method is started on these grids. As soon as the residual drops below a threshold value, an adaptive refinement is started. The solution on the adaptively refined grid is accelerated by a multigrid procedure. The coarser multigrid grids are generated by successive coarsening through point removal. The adaption cycle is repeated a few times. Results are given for the transonic flow over a NACA-0012 airfoil.
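
    For orientation, the sketch below shows the generic smoother-plus-coarse-grid-correction structure that multigrid methods share, using weighted Jacobi smoothing on a 1-D Poisson model problem. It is not the paper's scheme: the unstructured triangular grids, the polynomial flux-difference splitting, the defect-correction coupling of first- and second-order discretizations, and the multi-stage Jacobi smoother are all omitted.

    import numpy as np

    def laplacian(n, h):
        # standard second-order 1-D Laplacian with Dirichlet boundaries
        A = np.zeros((n, n))
        np.fill_diagonal(A, 2.0 / h**2)
        np.fill_diagonal(A[1:], -1.0 / h**2)
        np.fill_diagonal(A[:, 1:], -1.0 / h**2)
        return A

    def jacobi(A, b, x, sweeps=3, omega=2.0 / 3.0):
        # weighted Jacobi relaxation (the smoother)
        D = np.diag(A)
        for _ in range(sweeps):
            x = x + omega * (b - A @ x) / D
        return x

    def two_grid(f, n_fine=63):
        h = 1.0 / (n_fine + 1)
        A = laplacian(n_fine, h)
        x = np.zeros(n_fine)
        for _ in range(20):
            x = jacobi(A, f, x)                                  # pre-smoothing
            r = f - A @ x                                        # fine-grid residual (defect)
            rc = 0.25 * (r[0:-2:2] + 2 * r[1:-1:2] + r[2::2])    # full-weighting restriction
            Ac = laplacian(rc.size, 2 * h)
            ec = np.linalg.solve(Ac, rc)                         # exact coarse-grid correction
            e = np.zeros(n_fine)
            e[1:-1:2] = ec                                       # coincident nodes
            e[0:-2:2] += 0.5 * ec                                # linear interpolation to the
            e[2::2] += 0.5 * ec                                  # remaining fine nodes
            x = jacobi(A, f, x + e)                              # post-smoothing
        return x

    x = np.linspace(0, 1, 65)[1:-1]
    u = two_grid(np.pi**2 * np.sin(np.pi * x))                   # exact solution is sin(pi x)
    print(np.max(np.abs(u - np.sin(np.pi * x))))                 # small discretization-level error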

  9. Automatic Large-Scale 3D Building Shape Refinement Using Conditional Generative Adversarial Networks

    NASA Astrophysics Data System (ADS)

    Bittner, K.; d'Angelo, P.; Körner, M.; Reinartz, P.

    2018-05-01

    Three-dimensional building reconstruction from remote sensing imagery is one of the most difficult and important 3D modeling problems for complex urban environments. In remote sensing, the main data sources providing a digital representation of the Earth's surface and of the related natural, cultural, and man-made objects of urban areas are digital surface models (DSMs). The DSMs can be obtained either by light detection and ranging (LIDAR), by SAR interferometry, or from stereo images. Our approach relies on automatic global 3D building shape refinement from stereo DSMs using deep learning techniques. This refinement is necessary as the DSMs, which are extracted from image matching point clouds, suffer from occlusions, outliers, and noise. Though most previous works have shown promising results for building modeling, this topic remains an open research area. We present a new methodology which not only generates images with continuous values representing the elevation models but, at the same time, enhances the 3D object shapes, buildings in our case. Mainly, we train a conditional generative adversarial network (cGAN) to generate accurate LIDAR-like DSM height images from the noisy stereo DSM input. The obtained results demonstrate the strong potential of creating large-area remote sensing depth images where the buildings exhibit better-quality shapes and roof forms.
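
    A compact sketch of a pix2pix-style conditional GAN objective (adversarial loss plus an L1 term toward the target heights) on random stand-in tensors, assuming PyTorch. The tiny networks, the loss weighting, and the tensor shapes are placeholders; the paper's actual architecture and training pipeline are not specified here.

    import torch
    import torch.nn as nn

    # toy generator: refines a noisy DSM tile into a cleaner one (assumed 1-channel, 64x64)
    G = nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),
    )
    # conditional discriminator sees (input DSM, candidate DSM) stacked as 2 channels
    D = nn.Sequential(
        nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(64, 1, 4, stride=2, padding=1),   # patch-wise real/fake logits
    )
    bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

    stereo = torch.randn(4, 1, 64, 64)   # stand-in for noisy stereo DSM tiles
    lidar = torch.randn(4, 1, 64, 64)    # stand-in for LiDAR-like target tiles

    for _ in range(2):                   # placeholder loop; real training iterates over a dataset
        fake = G(stereo)
        # discriminator update
        d_real = D(torch.cat([stereo, lidar], 1))
        d_fake = D(torch.cat([stereo, fake.detach()], 1))
        loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # generator update: fool D and stay close to the target heights (L1 term)
        d_fake = D(torch.cat([stereo, fake], 1))
        loss_g = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, lidar)
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()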

  10. Biomedical image representation approach using visualness and spatial information in a concept feature space for interactive region-of-interest-based retrieval

    PubMed Central

    Rahman, Md. Mahmudur; Antani, Sameer K.; Demner-Fushman, Dina; Thoma, George R.

    2015-01-01

    Abstract. This article presents an approach to biomedical image retrieval by mapping image regions to local concepts where images are represented in a weighted entropy-based concept feature space. The term “concept” refers to perceptually distinguishable visual patches that are identified locally in image regions and can be mapped to a glossary of imaging terms. Further, the visual significance (e.g., visualness) of concepts is measured as the Shannon entropy of pixel values in image patches and is used to refine the feature vector. Moreover, the system can assist the user in interactively selecting a region-of-interest (ROI) and searching for similar image ROIs. Further, a spatial verification step is used as a postprocessing step to improve retrieval results based on location information. The hypothesis that such approaches would improve biomedical image retrieval is validated through experiments on two different data sets, which are collected from open access biomedical literature. PMID:26730398
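
    A small sketch of the "visualness" weighting idea: each local patch is scored by the Shannon entropy of its pixel values, and the scores weight a concept histogram. The patch size, bin count, and weighting rule are illustrative assumptions rather than the paper's exact formulation.

    import numpy as np

    def patch_entropy(patch, bins=32):
        # Shannon entropy of the pixel-value histogram of one patch
        hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def weighted_concept_vector(image, labels, n_concepts, patch=16):
        """labels[i, j] gives the concept index assigned to the patch at grid cell (i, j)."""
        v = np.zeros(n_concepts)
        for i in range(labels.shape[0]):
            for j in range(labels.shape[1]):
                block = image[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
                v[labels[i, j]] += patch_entropy(block)   # entropy-weighted concept count
        return v / (v.sum() + 1e-12)

    rng = np.random.default_rng(0)
    img = rng.random((64, 64))                  # stand-in image
    lab = rng.integers(0, 5, size=(4, 4))       # stand-in per-patch concept labels
    print(weighted_concept_vector(img, lab, n_concepts=5))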

  11. The distribution of probability values in medical abstracts: an observational study.

    PubMed

    Ginsel, Bastiaan; Aggarwal, Abhinav; Xuan, Wei; Harris, Ian

    2015-11-26

    A relatively high incidence of p values immediately below 0.05 (such as 0.047 or 0.04) compared to p values immediately above 0.05 (such as 0.051 or 0.06) has been noticed anecdotally in published medical abstracts. If p values immediately below 0.05 are over-represented, such a distribution may reflect the true underlying distribution of p values or may be due to error (a false distribution). If due to error, a consistent over-representation of p values immediately below 0.05 would be a systematic error due either to publication bias or (overt or inadvertent) bias within studies. We searched the Medline 2012 database to identify abstracts containing a p value. Two thousand abstracts out of 80,649 abstracts were randomly selected. Two independent researchers extracted all p values. The p values were plotted and compared to a predicted curve. A chi-square test was used to test assumptions, and significance was set at 0.05. In total, 2798 p value ranges and 3236 exact p values were reported. Of these, 4973 (82%) were significant (<0.05). There was an over-representation of p values immediately below 0.05 (between 0.01 and 0.049) compared to those immediately above 0.05 (between 0.05 and 0.1) (p = 0.001). The distribution of p values in reported medical abstracts provides evidence for systematic error in the reporting of p values. This may be due to publication bias, methodological errors (underpowering, selective reporting and selective analyses) or fraud.
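
    The core comparison can be illustrated with a chi-square goodness-of-fit test on two bins (just below vs. just above 0.05), assuming SciPy. The counts and the expected split below are invented for illustration; they are not the study's data.

    from scipy.stats import chisquare

    observed = [320, 110]                 # hypothetical counts just below / just above 0.05
    expected_fraction_below = 0.62        # hypothetical value from a fitted smooth curve
    total = sum(observed)
    expected = [total * expected_fraction_below, total * (1 - expected_fraction_below)]
    stat, p = chisquare(observed, f_exp=expected)
    print(f"chi2 = {stat:.2f}, p = {p:.4f}")  # a small p suggests over-representation below 0.05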

  12. Incremental triangulation by way of edge swapping and local optimization

    NASA Technical Reports Server (NTRS)

    Wiltberger, N. Lyn

    1994-01-01

    This document is intended to serve as an installation, usage, and basic theory guide for the two dimensional triangulation software 'HARLEY' written for the Silicon Graphics IRIS workstation. This code consists of an incremental triangulation algorithm based on point insertion and local edge swapping. Using this basic strategy, several types of triangulations can be produced depending on user selected options. For example, local edge swapping criteria can be chosen which minimizes the maximum interior angle (a MinMax triangulation) or which maximizes the minimum interior angle (a MaxMin or Delaunay triangulation). It should be noted that the MinMax triangulation is generally only locally optimal (not globally optimal) in this measure. The MaxMin triangulation, however, is both locally and globally optimal. In addition, Steiner triangulations can be constructed by inserting new sites at triangle circumcenters followed by edge swapping based on the MaxMin criteria. Incremental insertion of sites also provides flexibility in choosing cell refinement criteria. A dynamic heap structure has been implemented in the code so that once a refinement measure is specified (e.g., maximum aspect ratio or some measure of a solution gradient for solution adaptive grid generation) the cell with the largest value of this measure is continually removed from the top of the heap and refined. The heap refinement strategy allows the user to specify either the number of cells desired or to refine the mesh until all cell refinement measures satisfy a user specified tolerance level. Since the dynamic heap structure is constantly updated, the algorithm always refines the particular cell in the mesh with the largest refinement criteria value. The code allows the user to: triangulate a cloud of prespecified points (sites), triangulate a set of prespecified interior points constrained by prespecified boundary curve(s), Steiner triangulate the interior/exterior of prespecified boundary curve(s), refine existing triangulations based on solution error measures, and partition meshes based on the Cuthill-McKee, spectral, and coordinate bisection strategies.
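
    A sketch of the dynamic-heap refinement strategy described above, with placeholder measure, split rule, and tolerance; Python's heapq is a min-heap, so measures are negated to pop the worst cell first.

    import heapq

    def refine_worst_first(cells, measure, split, tol):
        heap = [(-measure(c), i, c) for i, c in enumerate(cells)]
        heapq.heapify(heap)
        counter = len(cells)
        done = []
        while heap:
            neg_m, _, cell = heapq.heappop(heap)
            if -neg_m <= tol:            # every remaining cell already satisfies the tolerance
                done.append(cell)
                done.extend(c for _, _, c in heap)
                break
            for child in split(cell):    # refine the single worst cell and re-insert its children
                heapq.heappush(heap, (-measure(child), counter, child))
                counter += 1
        return done

    # Toy example: "cells" are interval lengths, the measure is the length itself, and
    # refinement is bisection; refinement stops once no interval is longer than 0.25.
    cells = [1.0, 0.6, 0.3]
    result = refine_worst_first(cells, measure=lambda L: L, split=lambda L: [L / 2, L / 2], tol=0.25)
    print(sorted(result, reverse=True))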

  13. Towards Understanding How to Leverage Sense-Making, Induction and Refinement, and Fluency to Improve Robust Learning

    ERIC Educational Resources Information Center

    Doroudi, Shayan; Holstein, Kenneth; Aleven, Vincent; Brunskill, Emma

    2015-01-01

    The field of EDM has focused more on modeling student knowledge than on investigating what sequences of different activity types achieve good learning outcomes. In this paper we consider three activity types, targeting sense-making, induction and refinement, and fluency building. We investigate what mix of the three types might be most effective…

  14. Tocopherol composition of deodorization distillates and their antioxidative activity.

    PubMed

    Nogala-Kalucka, Malgorzata; Korczak, Jozef; Wagner, Karl-Heinz; Elmadfa, Ibrahim

    2004-02-01

    During the last stage of plant oil refining, deodorization distillates containing very important biological substances such as tocopherols, sterols, terpenoids or hydrocarbons are formed as by-products. This study aimed at evaluating the content and antioxidant capacity of tocopherol concentrates from deodorization distillates obtained after the refining of rapeseed, soybean and sunflower oil. The majority of the matrix substances were eliminated from deodorization distillates by freezing with an acetone solution at -70 degrees C. The tocopherol concentrates obtained in this way contained approximately fivefold more tocopherols than the quantity in condensates after deodorization. Antioxidant activity was investigated by observing the peroxide value at 25 degrees C and using the Oxidograph test. The test medium was lard enriched with the tocopherol concentrates of the three plant oils versus single, synthetic alpha-, gamma- and delta-tocopherols (-T), which served for comparison. In these model systems, all investigated tocopherol concentrates exhibited antioxidant capacity. Their antioxidant effect was significantly lower than that of single delta-T and gamma-T, but significantly higher than that of alpha-T. The results prove that natural tocopherol concentrates obtained from plant oils are valuable food antioxidants and they also increase the biological and nutritional value of food, especially when added to animal fats or food of animal origin. Tocopherol concentrates can fully replace synthetic antioxidants that have been used thus far.

  15. Effect of visual target blurring on accommodation under distance viewing

    NASA Astrophysics Data System (ADS)

    Iwata, Yo; Handa, Tomoya; Ishikawa, Hitoshi

    2018-04-01

    Abstract Purpose: To examine the effect of visual target blurring on accommodation. Methods: We evaluated the objective refraction values when the visual target (asterisk; 8°) was changed from the state without Gaussian blur (15 s) to the state with Gaussian blur adapted [0 (without blur) → 10, 0 → 50, 0 → 100; 15 s each]. Results: With Gaussian blur 10, the refraction value did not change significantly when blurring of the target occurred. With Gaussian blur 50 and 100, the refraction value became significantly myopic when blurring of the target occurred. Conclusion: Blurring of the distant visual target results in the intervention of accommodation.

  16. [Research on non-rigid registration of multi-modal medical image based on Demons algorithm].

    PubMed

    Hao, Peibo; Chen, Zhen; Jiang, Shaofeng; Wang, Yang

    2014-02-01

    Non-rigid medical image registration is an active research topic in medical imaging and has important clinical value. In this paper we put forward an improved Demons algorithm that combines a gray-level conservation model with a local structure tensor conservation model to construct a new energy function for the multi-modal registration problem. We then applied the L-BFGS algorithm to optimize the energy function and solve the complex three-dimensional data optimization problem. Finally, we used multi-scale hierarchical refinement to handle large-deformation registration. The experimental results showed that the proposed algorithm performed well for large-deformation and multi-modal three-dimensional medical image registration.
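
    As a point of reference, the sketch below implements the classical mono-modal Thirion demons update with Gaussian regularization on a toy 2-D example. The paper's contribution (a gray-level plus structure-tensor energy optimized with L-BFGS and a multi-scale hierarchy) is not reproduced; names and parameters here are illustrative.

    import numpy as np
    from scipy.ndimage import gaussian_filter, map_coordinates

    def demons_step(fixed, moving, u, v, sigma=1.5):
        gy, gx = np.gradient(fixed)
        yy, xx = np.meshgrid(np.arange(fixed.shape[0]), np.arange(fixed.shape[1]), indexing="ij")
        warped = map_coordinates(moving, [yy + v, xx + u], order=1, mode="nearest")
        diff = warped - fixed
        denom = gx**2 + gy**2 + diff**2
        denom[denom == 0] = 1.0
        u = gaussian_filter(u - diff * gx / denom, sigma)   # Gaussian-smoothed field update
        v = gaussian_filter(v - diff * gy / denom, sigma)
        return u, v

    rng = np.random.default_rng(0)
    fixed = gaussian_filter(rng.random((64, 64)), 3)
    moving = np.roll(fixed, shift=2, axis=1)                # moving image shifted by 2 pixels
    u = np.zeros_like(fixed)
    v = np.zeros_like(fixed)
    for _ in range(50):
        u, v = demons_step(fixed, moving, u, v)
    print(u.mean())   # rough displacement estimate along x (the true shift is 2 pixels)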

  17. Nano-Sized Grain Refinement Using Friction Stir Processing

    DTIC Science & Technology

    2013-03-01

    A key characteristic of a friction stir weld is a very fine grain microstructure produced as a result of dynamic recrystallization. Friction stir processing is a solid-state process developed on the basis of the friction stir welding (FSW) technique invented by The Welding Institute (TWI) in 1991. Keywords: friction stir processing, magnesium, nano-size grains.

  18. Francisco de Castro: Localizationism, intelligence and the frontal lobe

    PubMed Central

    Oliveira, Pedro Sudbrack; Engelhardt, Eliasz; Gomes, Marleide da Mota

    2017-01-01

    ABSTRACT This article addresses the largely unknown legacy of Francisco de Castro regarding the neurological sciences. His essay "Psychogenic Cortical Centers", written in 1881 for his admission to the Imperial Academy of Medicine in Rio de Janeiro, is a refined appreciation of the theory of localized cortical functions that was in evidence in Europe in the second half of the nineteenth century. PMID:29213528

  19. Development in corrosion resistance by microstructural refinement in Zr-16 SS 304 alloy using suction casting technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, N., E-mail: nirupamd@barc.gov.in; Sengupta, P.; Abraham, G.

    Highlights: • Grain refinement was achieved in Zr–16 wt.% SS alloy prepared by the suction casting process. • The distribution of the Laves phase, e.g., Zr₂(Fe, Cr), was raised in suction cast (SC) Zr–16 wt.% SS. • Corrosion resistance was improved in the SC alloy compared to that of the arc-melt-cast alloy. • Grain refinement in the SC alloy contributed to the increase in its corrosion resistance. - Abstract: Zirconium (Zr)-stainless steel (SS) hybrid alloys are being considered as baseline alloys for developing metallic waste forms (MWF), with the motivation of disposing of Zr- and SS-based nuclear metallic wastes. Zr–16 wt.% SS, a MWF alloy optimized in previous studies, exhibits significant grain refinement and changes in phase assemblages (soft phase: Zr₂(Fe, Cr)/α-Zr vs. hard phase: Zr₃(Fe, Ni)) when prepared by the suction casting (SC) technique in comparison to the arc-melt-cast (AMC) route. Variation in Cr distribution among the different phases is found to be low in the suction cast alloy, which along with grain refinement restricted Cr depletion at the Zr₂(Fe, Cr)/Zr interfaces, which are prone to localized attack. Hence, the SC alloy, compared to the AMC alloy, showed lower current density, higher potential at the breakdown of passivity, and higher corrosion potential during polarization experiments (carried out under possible geological repository environments, viz., pH 8, 5 and 1), indicating its superior corrosion resistance.

  20. Introducing a "Means-End" Approach to Human-Computer Interaction: Why Users Choose Particular Web Sites Over Others.

    ERIC Educational Resources Information Center

    Subramony, Deepak Prem

    Gutman's means-end theory, widely used in market research, identifies three levels of abstraction: attributes, consequences, and values--associated with the use of products, representing the process by which physical attributes of products gain personal meaning for users. The primary methodological manifestation of means-end theory is the…

  1. Active Thermochemical Tables: The Adiabatic Ionization Energy of Hydrogen Peroxide.

    PubMed

    Changala, P Bryan; Nguyen, T Lam; Baraban, Joshua H; Ellison, G Barney; Stanton, John F; Bross, David H; Ruscic, Branko

    2017-11-22

    The adiabatic ionization energy of hydrogen peroxide (HOOH) is investigated, both by means of theoretical calculations and theoretically assisted reanalysis of previous experimental data. Values obtained by three different approaches: 10.638 ± 0.012 eV (purely theoretical determination), 10.649 ± 0.005 eV (reanalysis of photoelectron spectrum), and 10.645 ± 0.010 eV (reanalysis of photoionization spectrum) are in excellent mutual agreement. Further refinement of the latter two values to account for asymmetry of the rotational profile of the photoionization origin band leads to a reduction of 0.007 ± 0.006 eV, which tends to bring them into even closer alignment with the purely theoretical value. Detailed analysis of this fundamental quantity by the Active Thermochemical Tables approach, using the present results and extant literature, gives a final estimate of 10.641 ± 0.006 eV.
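
    For illustration only, the three quoted determinations can be combined by a textbook inverse-variance weighted mean, as sketched below; the ATcT evaluation behind the final 10.641 ± 0.006 eV uses a far larger thermochemical network, so the numbers will not coincide exactly.

    values = [10.638, 10.649, 10.645]        # eV, from the abstract
    sigmas = [0.012, 0.005, 0.010]           # eV
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    print(f"{mean:.3f} +/- {sigma:.3f} eV")  # simple weighted mean, not the ATcT result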

  2. Active Thermochemical Tables: The Adiabatic Ionization Energy of Hydrogen Peroxide

    DOE PAGES

    Changala, P. Bryan; Nguyen, T. Lam; Baraban, Joshua H.; ...

    2017-09-07

    The adiabatic ionization energy of hydrogen peroxide (HOOH) is investigated, both by means of theoretical calculations and theoretically-assisted reanalysis of previous experimental data. Values obtained by three different approaches: 10.638 ± 0.012 eV (purely theoretical determination), 10.649 ± 0.005 eV (reanalysis of photoelectron spectrum) and 10.645 ± 0.010 eV (reanalysis of photoionization spectrum) are in excellent mutual agreement. Further refinement of the latter two values to account for asymmetry of the rotational profile of the photoionization origin band leads to a reduction of 0.007 ± 0.006 eV, which tends to bring them into even closer alignment with the purely theoretical value. As a result, detailed analysis of this fundamental quantity by the Active Thermochemical Tables (ATcT) approach, using the present results and extant literature, gives a final estimate of 10.641 ± 0.006 eV.

  3. Active Thermochemical Tables: The Adiabatic Ionization Energy of Hydrogen Peroxide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Changala, P. Bryan; Nguyen, T. Lam; Baraban, Joshua H.

    The adiabatic ionization energy of hydrogen peroxide (HOOH) is investigated, both by means of theoretical calculations and theoretically-assisted reanalysis of previous experimental data. Values obtained by three different approaches: 10.638 ± 0.012 eV (purely theoretical determination), 10.649 ± 0.005 eV (reanalysis of photoelectron spectrum) and 10.645 ± 0.010 eV (reanalysis of photoionization spectrum) are in excellent mutual agreement. Further refinement of the latter two values to account for asymmetry of the rotational profile of the photoionization origin band leads to a reduction of 0.007 ± 0.006 eV, which tends to bring them into even closer alignment with the purely theoretical value. As a result, detailed analysis of this fundamental quantity by the Active Thermochemical Tables (ATcT) approach, using the present results and extant literature, gives a final estimate of 10.641 ± 0.006 eV.

  4. Proving refinement transformations using extended denotational semantics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, V.L.; Boyle, J.M.

    1996-04-01

    TAMPR is a fully automatic transformation system based on syntactic rewrites. Our approach in a correctness proof is to map the transformation into an axiomatized mathematical domain where formal (and automated) reasoning can be performed. This mapping is accomplished via an extended denotational semantic paradigm. In this approach, the abstract notion of a program state is distributed between an environment function and a store function. Such a distribution introduces properties that go beyond the abstract state that is being modeled. The reasoning framework needs to be aware of these properties in order to successfully complete a correctness proof. This paper discusses some of our experiences in proving the correctness of TAMPR transformations.

  5. Auditing SNOMED Relationships Using a Converse Abstraction Network

    PubMed Central

    Wei, Duo; Halper, Michael; Elhanan, Gai; Chen, Yan; Perl, Yehoshua; Geller, James; Spackman, Kent A.

    2009-01-01

    In SNOMED CT, a given kind of attribute relationship is defined between two hierarchies, a source and a target. Certain hierarchies (or subhierarchies) serve only as targets, with no outgoing relationships of their own. However, converse relationships—those pointing in a direction opposite to the defined relationships—while not explicitly represented in SNOMED’s inferred view, can be utilized in forming an alternative view of a source. In particular, they can help shed light on a source hierarchy’s overall relationship structure. Toward this end, an abstraction network, called the converse abstraction network (CAN), derived automatically from a given SNOMED hierarchy is presented. An auditing methodology based on the CAN is formulated. The methodology is applied to SNOMED’s Device subhierarchy and the related device relationships of the Procedure hierarchy. The results indicate that the CAN is useful in finding opportunities for refining and improving SNOMED. PMID:20351941
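
    A minimal sketch of deriving converse relationships: defined attribute relationships are stored as (source, attribute, target) triples and regrouped by target, so a target hierarchy can be viewed through its incoming edges. The concept names are hypothetical, not SNOMED CT content.

    from collections import defaultdict

    # defined relationships point from a source hierarchy to a target hierarchy
    defined = [
        ("ProcedureA", "uses_device", "DeviceX"),
        ("ProcedureB", "uses_device", "DeviceX"),
        ("ProcedureB", "uses_device", "DeviceY"),
    ]

    converse = defaultdict(set)
    for source, attribute, target in defined:
        converse[(target, attribute)].add(source)     # reverse the direction of the edge

    for (target, attribute), sources in sorted(converse.items()):
        print(f"{target} <-{attribute}- {sorted(sources)}")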

  6. Structure refinement for tantalum nitrides nanocrystals with various morphologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Lianyun; School of Science, Beijing Jiaotong University, 3 Shang Yuan Cun, Haidian District, Beijing 100044; Huang, Kai

    2012-07-15

    Graphical abstract: Tantalum nitride nanocrystals with various phases and morphologies have for the first time been synthesized through homogeneous sodium reduction at low temperature with a subsequent annealing process under high vacuum. Highlights: ► Spherical TaN, cuboidal TaN0.83 and TaN0.5 nanocrystals have been synthesized through homogeneous sodium reduction at low temperature with a subsequent annealing process under high vacuum. ► The crystal structures of the different tantalum nitrides were determined by Rietveld refinement of the X-ray diffraction data and by electron microscopy examinations. ► The specific surface area of the tantalum nitride powders was around 10 m² g⁻¹. ► Tantalum nitride powders could be suitable for capacitors with high specific capacitance. -- Abstract: Tantalum nitride (TaNx) nanocrystals with different phases and morphologies have been synthesized through homogeneous sodium reduction at low temperature with a subsequent annealing process under high vacuum. The crystal structures of the tantalum nitrides were determined by Rietveld refinement based on the X-ray diffraction data. The morphologies of the various high-quality tantalum nitride nanocrystals were analyzed through electron microscopy examinations. Spherical TaN nanoparticles and cuboidal TaN0.83 and TaN0.5 nanocrystals have been selectively prepared at different annealing temperatures. In addition, the specific surface areas of the tantalum nitride nanocrystals measured by the BET method were around 9.87–11.64 m² g⁻¹, indicating that such nano-sized tantalum nitrides could be suitable for capacitors with high specific capacitance.

  7. Global classical solvability and stabilization in a two-dimensional chemotaxis-Navier-Stokes system modeling coral fertilization

    NASA Astrophysics Data System (ADS)

    Espejo, Elio; Winkler, Michael

    2018-04-01

    The interplay of chemotaxis, convection and reaction terms is studied in the particular framework of a refined model for coral broadcast spawning, consisting of three equations describing the population densities of unfertilized sperms and eggs and the concentration of a chemical released by the latter, coupled to the incompressible Navier-Stokes equations. Under mild assumptions on the initial data, global existence of classical solutions to an associated initial-boundary value problem in bounded planar domains is established. Moreover, all these solutions are shown to approach a spatially homogeneous equilibrium in the large time limit.
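
    For readers unfamiliar with this class of models, the generic structure of a three-component coral-fertilization system coupled to the incompressible Navier-Stokes equations can be written as below, with n and ρ the sperm and egg densities, c the egg-released chemical, u and P the fluid velocity and pressure, and φ a given gravitational potential; the precise coupling and coefficients of the refined model studied in the paper may differ.

    \begin{aligned}
      n_t + u\cdot\nabla n &= \Delta n - \nabla\cdot(n\nabla c) - n\rho, \\
      \rho_t + u\cdot\nabla\rho &= \Delta\rho - n\rho, \\
      c_t + u\cdot\nabla c &= \Delta c - c + \rho, \\
      u_t + (u\cdot\nabla)u &= \Delta u - \nabla P + (n+\rho)\nabla\phi, \qquad \nabla\cdot u = 0.
    \end{aligned}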

  8. Production and characterization of refined oils obtained from Indian oil sardine (Sardinella longiceps).

    PubMed

    Chakraborty, Kajal; Joseph, Deepu

    2015-01-28

    Crude Sardinella longiceps oil was refined in different stages such as degumming, neutralization, bleaching, and deodorization. The efficiency of these processes was evaluated on the basis of free fatty acid (FFA), peroxide (PV), p-anisidine (pAV), total oxidation (TOTOX), thiobarbituric acid reactive species (TBARS) values, Lovibond CIE-L*a*b* color analyses, and (1)H NMR or GC-MS experiments. The utilities of NMR-based proton signal characteristics as new analytical tools to understand the signature peaks and relative abundance of different fatty acids and monitoring the refining process of fish oil have been demonstrated. Phosphoric acid (1%) was found to be an effective degumming reagent to obtain oil with the lowest FFA, PV, pAV, TOTOX, and TBARS values and highest color reduction. Significant reduction in the contents of hydrocarbon functionalities as shown by the decrease in proton integral in the characteristic (1)H NMR region was demonstrated by using 1% H3PO4 during the course of the degumming process. A combination (1.25:3.75%) of activated charcoal and Fuller's earth at 3% concentration for a stirring time of 40 min was found to be effective in bleaching the sardine oil. This study demonstrated that unfavorable odor-causing components, particularly low molecular weight carbonyl compounds, could successfully be removed by the refining process. The alkane-dienals/alkanes, which cause unfavorable fishy odors, were successfully removed by distillation (100 °C) under vacuum with aqueous acetic acid solution (0.25 N) to obtain greater quality of refined sardine oil, a rich source of essential fatty acids and improved oxidative stability. The present study demonstrated that the four-stage refinement process of sardine oil resulted in a significant improvement in quality characteristics and nutritional values, particularly n-3 PUFAs, with improved fish oil characteristics for use in the pharmaceutical and functional food industries.

  9. Optimization of palm oil physical refining process for reduction of 3-monochloropropane-1,2-diol (3-MCPD) ester formation.

    PubMed

    Zulkurnain, Musfirah; Lai, Oi Ming; Tan, Soo Choon; Abdul Latip, Razam; Tan, Chin Ping

    2013-04-03

    The reduction of 3-monochloropropane-1,2-diol (3-MCPD) ester formation in refined palm oil was achieved by incorporation of additional processing steps in the physical refining process to remove chloroester precursors prior to the deodorization step. The modified refining process was optimized for the least 3-MCPD ester formation and acceptable refined palm oil quality using response surface methodology (RSM) with five processing parameters: water dosage, phosphoric acid dosage, degumming temperature, activated clay dosage, and deodorization temperature. The removal of chloroester precursors was largely accomplished by increasing the water dosage, while the reduction of 3-MCPD esters was a compromise in oxidative stability and color of the refined palm oil because some factors such as acid dosage, degumming temperature, and deodorization temperature showed contradictory effects. The optimization resulted in 87.2% reduction of 3-MCPD esters from 2.9 mg/kg in the conventional refining process to 0.4 mg/kg, with color and oil stability index values of 2.4 R and 14.3 h, respectively.

  10. ADAPTIVE TETRAHEDRAL GRID REFINEMENT AND COARSENING IN MESSAGE-PASSING ENVIRONMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallberg, J.; Stagg, A.

    2000-10-01

    A grid refinement and coarsening scheme has been developed for tetrahedral and triangular grid-based calculations in message-passing environments. The element adaption scheme is based on an edge bisection of elements marked for refinement by an appropriate error indicator. Hash-table/linked-list data structures are used to store nodal and element information. The grid along inter-processor boundaries is refined and coarsened consistently with the update of these data structures via MPI calls. The parallel adaption scheme has been applied to the solution of a transient, three-dimensional, nonlinear, groundwater flow problem. Timings indicate efficiency of the grid refinement process relative to the flow solver calculations.
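
    A serial, two-dimensional sketch of the edge-bisection bookkeeping described above: marked triangles are bisected across their longest edge, and a hash table keyed by the edge guarantees that neighbouring elements reuse the same midpoint node. The tetrahedral case and the MPI synchronization across processor boundaries are omitted; the mesh data are illustrative.

    import math

    nodes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    elements = [(0, 1, 2), (1, 3, 2)]
    edge_midpoints = {}                      # hash table: edge -> midpoint node index

    def midpoint(a, b):
        key = (min(a, b), max(a, b))
        if key not in edge_midpoints:        # create the node only once, even if the edge is shared
            xa, ya = nodes[a]
            xb, yb = nodes[b]
            nodes.append(((xa + xb) / 2.0, (ya + yb) / 2.0))
            edge_midpoints[key] = len(nodes) - 1
        return edge_midpoints[key]

    def bisect(tri):
        # split across the longest edge; the opposite vertex is kept in both children
        edges = [(tri[0], tri[1], tri[2]), (tri[1], tri[2], tri[0]), (tri[2], tri[0], tri[1])]
        a, b, opp = max(edges, key=lambda e: math.dist(nodes[e[0]], nodes[e[1]]))
        m = midpoint(a, b)
        return [(a, m, opp), (m, b, opp)]

    marked = [0, 1]                          # indices of elements flagged by an error indicator
    refined = []
    for i, tri in enumerate(elements):
        refined.extend(bisect(tri) if i in marked else [tri])
    elements = refined
    print(len(nodes), len(elements))         # one shared midpoint node, four triangles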

  11. Building a Middle-Range Theory of Adaptive Spirituality.

    PubMed

    Dobratz, Marjorie C

    2016-04-01

    The purpose of this article is to describe a Roy adaptation model-based research abstraction, the findings of which were synthesized into a middle-range theory (MRT) of adaptive spirituality. The published literature yielded 21 empirical studies that investigated religion/spirituality. Quantitative results supported the influence of spirituality on quality of life, psychosocial adjustment, well-being, adaptive coping, and the self-concept mode. Qualitative findings showed the importance of spiritual expressions, values, and beliefs in adapting to chronic illness, bereavement, death, and other life transitions. These findings were abstracted into six theoretical statements, a conceptual definition of adaptive spirituality, and three hypotheses for future testing.

  12. Water-refined solution structure of the human Grb7-SH2 domain in complex with the erbB2 receptor peptide pY1139.

    PubMed

    Pias, Sally C; Johnson, Dennis L; Smith, David E; Lyons, Barbara A

    2012-08-01

    We report a refinement in implicit water of the previously published solution structure of the Grb7-SH2 domain bound to the erbB2 receptor peptide pY1139. Structure quality measures indicate substantial improvement, with residues in the most favored regions of the Ramachandran plot increasing by 14 % and with WHAT IF statistics (Vriend, G. J. Mol. Graph., 1990, 8(1), 52-56) falling closer to expected values for well-refined structures.

  13. Ultra-high resolution X-ray structures of two forms of human recombinant insulin at 100 K.

    PubMed

    Lisgarten, David R; Palmer, Rex A; Lobley, Carina M C; Naylor, Claire E; Chowdhry, Babur Z; Al-Kurdi, Zakieh I; Badwan, Adnan A; Howlin, Brendan J; Gibbons, Nicholas C J; Saldanha, José W; Lisgarten, John N; Basak, Ajit K

    2017-08-01

    The crystal structure of a commercially available form of human recombinant (HR) insulin, Insugen (I), used in the treatment of diabetes has been determined to 0.92 Å resolution using low temperature, 100 K, synchrotron X-ray data collected at 16,000 keV (λ = 0.77 Å). Refinement carried out with anisotropic displacement parameters, removal of main-chain stereochemical restraints, inclusion of H atoms in calculated positions, and 220 water molecules, converged to a final value of R = 0.1112 and R free  = 0.1466. The structure includes what is thought to be an ordered propanol molecule (POL) only in chain D(4) and a solvated acetate molecule (ACT) coordinated to the Zn atom only in chain B(2). Possible origins and consequences of the propanol and acetate molecules are discussed. Three types of amino acid representation in the electron density are examined in detail: (i) sharp with very clearly resolved features; (ii) well resolved but clearly divided into two conformations which are well behaved in the refinement, both having high quality geometry; (iii) poor density and difficult or impossible to model. An example of type (ii) is observed for the intra-chain disulphide bridge in chain C(3) between Sγ6-Sγ11 which has two clear conformations with relative refined occupancies of 0.8 and 0.2, respectively. In contrast the corresponding S-S bridge in chain A(1) shows one clearly defined conformation. A molecular dynamics study has provided a rational explanation of this difference between chains A and C. More generally, differences in the electron density features between corresponding residues in chains A and C and chains B and D is a common observation in the Insugen (I) structure and these effects are discussed in detail. The crystal structure, also at 0.92 Å and 100 K, of a second commercially available form of human recombinant insulin, Intergen (II), deposited in the Protein Data Bank as 3W7Y which remains otherwise unpublished is compared here with the Insugen (I) structure. In the Intergen (II) structure there is no solvated propanol or acetate molecule. The electron density of Intergen (II), however, does also exhibit the three types of amino acid representations as in Insugen (I). These effects do not necessarily correspond between chains A and C or chains B and D in Intergen (II), or between corresponding residues in Insugen (I). The results of this comparison are reported. Graphical abstract Conformations of PheB25 and PheD25 in three insulin structures: implications for biological activity? Insulin residues PheB25 and PheD25 are considered to be important for insulin receptor binding and changes in biological activity occur when these residues are modified. In porcine insulin and Intergen (II) PheB25 adopts conformation B and PheD25 conformation D. However, unexpectedly PheB25 in Insugen (I) human recombinant insulin adopts two distinct conformations corresponding to B and D, Figure 1 and PheD25 adopts a single conformation corresponding to B not D, Figure 2. Conformations of this residue in the ultra-high resolution structure of Insugen (I) are therefore unique within this set. Figures were produced with Biovia, Discovery Studio 2016.

  14. NMRe: a web server for NMR protein structure refinement with high-quality structure validation scores.

    PubMed

    Ryu, Hyojung; Lim, GyuTae; Sung, Bong Hyun; Lee, Jinhyuk

    2016-02-15

    Protein structure refinement is a necessary step for the study of protein function. In particular, some nuclear magnetic resonance (NMR) structures are of lower quality than X-ray crystallographic structures. Here, we present NMRe, a web-based server for NMR structure refinement. The previously developed knowledge-based energy function STAP (Statistical Torsion Angle Potential) was used for NMRe refinement. With STAP, NMRe provides two refinement protocols using two types of distance restraints. If a user provides NOE (Nuclear Overhauser Effect) data, the refinement is performed with the NOE distance restraints as a conventional NMR structure refinement. Additionally, NMRe generates NOE-like distance restraints based on the inter-hydrogen distances derived from the input structure. The efficiency of NMRe refinement was validated on 20 NMR structures. Most of the quality assessment scores of the refined NMR structures were better than those of the original structures. The refinement results are provided as a three-dimensional structure view, a secondary structure scheme, and numerical and graphical structure validation scores. NMRe is available at http://psb.kobic.re.kr/nmre/.
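
    A small sketch of the idea of NOE-like restraints generated from an input structure: every hydrogen pair closer than a cutoff becomes an (i, j, lower, upper) distance restraint. The 5 Å cutoff, the slack, and the random stand-in coordinates are assumptions, not NMRe's actual protocol.

    import numpy as np

    def noe_like_restraints(h_coords, cutoff=5.0, slack=0.5):
        restraints = []
        for i in range(len(h_coords)):
            for j in range(i + 1, len(h_coords)):
                d = float(np.linalg.norm(h_coords[i] - h_coords[j]))
                if d <= cutoff:
                    restraints.append((i, j, max(d - slack, 1.8), d + slack))
        return restraints

    rng = np.random.default_rng(1)
    hydrogens = rng.uniform(0.0, 10.0, size=(20, 3))   # stand-in hydrogen coordinates (Angstrom)
    for i, j, lo, hi in noe_like_restraints(hydrogens)[:5]:
        print(f"H{i:02d}-H{j:02d}: {lo:.2f}-{hi:.2f} A")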

  15. Experimental Test Rig for Optimal Control of Flexible Space Robotic Arms

    DTIC Science & Technology

    2016-12-01

    ... was used to refine the test bed design and the experimental workflow. Three concepts incorporated various strategies to design a robust flexible link ... designed to perform the experimentation. The first and second concepts use traditional elastic springs in varying configurations while a third uses a ...

  16. A Counterexample Guided Abstraction Refinement Framework for Verifying Concurrent C Programs

    DTIC Science & Technology

    2005-05-24

    ... source code are routinely executed. The source code is written in languages ranging from C/C++/Java to ML/OCaml. These languages differ not only in ... from the difficulty to model computer programs—due to the complexity of programming languages as compared to hardware description languages—to ... an intermediate specification language lying between high-level Statechart-like formalisms and transition systems. Actions are encoded as changes in ...
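
    Since the report centers on counterexample-guided abstraction refinement, a toy, explicit-state version of the loop is sketched below: the "program" is a small finite transition system, the abstraction is a partition of its states, and refinement splits the abstract state at which a spurious counterexample breaks down. Real tools work symbolically over predicates extracted from the source code; this only illustrates the abstract-check-refine cycle.

    def cegar(states, transitions, init, error):
        partition = [set(states)]                              # coarsest abstraction
        while True:
            def block_of(s):
                return next(i for i, b in enumerate(partition) if s in b)
            # build the abstract transition relation existentially
            abs_succ = {i: set() for i in range(len(partition))}
            for s, t in transitions:
                abs_succ[block_of(s)].add(block_of(t))
            # search for an abstract path from an initial block to an error block
            start = {block_of(s) for s in init}
            bad = {block_of(s) for s in error}
            path = _bfs_path(abs_succ, start, bad)
            if path is None:
                return "safe", partition
            # replay the abstract path on the concrete system
            reach = set(init) & partition[path[0]]
            for k in range(1, len(path)):
                nxt = {t for s, t in transitions if s in reach} & partition[path[k]]
                if not nxt:
                    # spurious: split the block at step k-1 into the states the prefix
                    # can actually reach and the rest, then try again
                    block = partition[path[k - 1]]
                    partition[path[k - 1]] = block - reach
                    partition.append(set(reach))
                    break
                reach = nxt
            else:
                if reach & set(error):
                    return "counterexample", path
                # path ends in an error block but misses the error states: separate them
                block = partition[path[-1]]
                partition[path[-1]] = block - set(error)
                partition.append(block & set(error))

    def _bfs_path(succ, start, bad):
        frontier = [[b] for b in start]
        seen = set(start)
        while frontier:
            path = frontier.pop(0)
            if path[-1] in bad:
                return path
            for nxt in succ[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None

    # States 0-4; the error state 4 is unreachable from the initial state 0.
    states = [0, 1, 2, 3, 4]
    transitions = [(0, 1), (1, 2), (2, 0), (3, 4)]
    print(cegar(states, transitions, init=[0], error=[4]))     # ('safe', ...)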

  17. USSR and Eastern Europe Scientific Abstracts, Materials Science and Metallurgy, Number 56.

    DTIC Science & Technology

    1978-10-05

    Topics covered include metals and materials, coatings, composites, metal corrosion, extraction and refining, forming, instrumentation, lubricants, and mechanical and physical properties. Contents: Aluminum and Its Alloys; Analysis and Testing; Beryllium; Coatings; Composite Materials; Conferences; Corrosion; Graphite. ... alloys, consisting in changing the chemical composition of the surface layer, which plays an important role in corrosion processes. The content of ...

  18. Vehicle Concept Model Abstractions for Integrated Geometric, Inertial, Rigid Body, Powertrain, and FE Analysis

    DTIC Science & Technology

    2011-01-01

    ... refinement of the vehicle body structure through quantitative assessment of stiffness and modal parameter changes resulting from modifications to the beam ... differential placed on the axle, adjustment of the torque output to the opposite wheel may be required to obtain the correct solution. Thus ... represented by simple inertial components with appropriate model connectivity instead to determine the free modal response of powertrain-type ...

  19. Development of Encapsulated Dye for Surface Impact Damage Indicator System.

    DTIC Science & Technology

    1987-09-01

    Keywords: composites, ultrasonics, dye, impact, microcapsules, NDE, polyurethane, encapsulation, paint. ... encapsulation, microcapsule incorporation into the USAF polyurethane paint, and an initial correlation study of impact damage to impact coating indication. ... The objectives of the project were to: 1. Refine the microcapsule formulation to be compatible with MIL-C-83286 paint. 2. Fabricate composite panels from isotropic graphite ...

  20. Enabling Incremental Iterative Development at Scale: Quality Attribute Refinement and Allocation in Practice

    DTIC Science & Technology

    2015-06-01

    ... abstract constraints along six dimensions for expansion: user, actions, data, business rules, interfaces, and quality attributes [Gottesdiener 2010] ... relevant open source systems. For example, the CONNECT and HADOOP Distributed File System (HDFS) projects have many user stories that deal with ... Iteration Zero involves architecture planning before writing any code. An overly long Iteration Zero is equivalent to the dysfunctional “Big Up-Front ...

  1. Constrained-transport Magnetohydrodynamics with Adaptive Mesh Refinement in CHARM

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco; Martin, Daniel F.

    2011-07-01

    We present the implementation of a three-dimensional, second-order accurate Godunov-type algorithm for magnetohydrodynamics (MHD) in the adaptive-mesh-refinement (AMR) cosmological code CHARM. The algorithm is based on the full 12-solve spatially unsplit corner-transport-upwind (CTU) scheme. The fluid quantities are cell-centered and are updated using the piecewise-parabolic method (PPM), while the magnetic field variables are face-centered and are evolved through application of the Stokes theorem on cell edges via a constrained-transport (CT) method. The so-called multidimensional MHD source terms required in the predictor step for high-order accuracy are applied in a simplified form which reduces their complexity in three dimensions without loss of accuracy or robustness. The algorithm is implemented on an AMR framework which requires specific synchronization steps across refinement levels. These include face-centered restriction and prolongation operations and a reflux-curl operation, which maintains a solenoidal magnetic field across refinement boundaries. The code is tested against a large suite of test problems, including convergence tests in smooth flows, shock-tube tests, classical two- and three-dimensional MHD tests, a three-dimensional shock-cloud interaction problem, and the formation of a cluster of galaxies in a fully cosmological context. The magnetic field divergence is shown to remain negligible throughout.
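
    A small numerical check of the constrained-transport bookkeeping: with face-centred field components, the discrete divergence per cell is the net flux through the faces, and fields built from an edge-centred vector potential are solenoidal to round-off. Grid sizes and spacings below are arbitrary; this is not the CHARM implementation.

    import numpy as np

    nx, ny, nz, dx, dy, dz = 8, 8, 8, 0.1, 0.1, 0.1
    rng = np.random.default_rng(0)

    # face-centred components: bx lives on x-faces (nx+1, ny, nz), etc.
    bx = rng.standard_normal((nx + 1, ny, nz))
    by = rng.standard_normal((nx, ny + 1, nz))
    bz = rng.standard_normal((nx, ny, nz + 1))

    def div_b(bx, by, bz):
        return ((bx[1:, :, :] - bx[:-1, :, :]) / dx
                + (by[:, 1:, :] - by[:, :-1, :]) / dy
                + (bz[:, :, 1:] - bz[:, :, :-1]) / dz)

    print(np.abs(div_b(bx, by, bz)).max())        # random fields are far from solenoidal

    # a divergence-free initialisation: B = curl A with only an edge-centred Az,
    # so bx = dAz/dy, by = -dAz/dx, bz = 0
    az = rng.standard_normal((nx + 1, ny + 1))            # Az on z-directed cell edges
    bx = ((az[:, 1:] - az[:, :-1]) / dy)[:, :, None].repeat(nz, axis=2)
    by = (-(az[1:, :] - az[:-1, :]) / dx)[:, :, None].repeat(nz, axis=2)
    bz = np.zeros((nx, ny, nz + 1))
    print(np.abs(div_b(bx, by, bz)).max())        # ~1e-15: discretely divergence-free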

  2. A CTSA Agenda to Advance Methods for Comparative Effectiveness Research

    PubMed Central

    Helfand, Mark; Tunis, Sean; Whitlock, Evelyn P.; Pauker, Stephen G.; Basu, Anirban; Chilingerian, Jon; Harrell Jr., Frank E.; Meltzer, David O.; Montori, Victor M.; Shepard, Donald S.; Kent, David M.

    2011-01-01

    Abstract Clinical research needs to be more useful to patients, clinicians, and other decision makers. To meet this need, more research should focus on patient‐centered outcomes, compare viable alternatives, and be responsive to individual patients’ preferences, needs, pathobiology, settings, and values. These features, which make comparative effectiveness research (CER) fundamentally patient‐centered, challenge researchers to adopt or develop methods that improve the timeliness, relevance, and practical application of clinical studies. In this paper, we describe 10 priority areas that address 3 critical needs for research on patient‐centered outcomes (PCOR): (1) developing and testing trustworthy methods to identify and prioritize important questions for research; (2) improving the design, conduct, and analysis of clinical research studies; and (3) linking the process and outcomes of actual practice to priorities for research on patient‐centered outcomes. We argue that the National Institutes of Health, through its clinical and translational research program, should accelerate the development and refinement of methods for CER by linking a program of methods research to the broader portfolio of large, prospective clinical and health system studies it supports. Insights generated by this work should be of enormous value to PCORI and to the broad range of organizations that will be funding and implementing CER. Clin Trans Sci 2011; Volume 4: 188–198 PMID:21707950

  3. The stratigraphy and evolution of lower Mount Sharp from spectral, morphological, and thermophysical orbital data sets

    PubMed Central

    Ehlmann, B. L.; Arvidson, R. E.; Edwards, C. S.; Grotzinger, J. P.; Milliken, R. E.; Quinn, D. P.; Rice, M. S.

    2016-01-01

    Abstract We have developed a refined geologic map and stratigraphy for lower Mount Sharp using coordinated analyses of new spectral, thermophysical, and morphologic orbital data products. The Mount Sharp group consists of seven relatively planar units delineated by differences in texture, mineralogy, and thermophysical properties. These units are (1–3) three spatially adjacent units in the Murray formation which contain a variety of secondary phases and are distinguishable by thermal inertia and albedo differences, (4) a phyllosilicate‐bearing unit, (5) a hematite‐capped ridge unit, (6) a unit associated with material having a strongly sloped spectral signature at visible near‐infrared wavelengths, and (7) a layered sulfate unit. The Siccar Point group consists of the Stimson formation and two additional units that unconformably overlie the Mount Sharp group. All Siccar Point group units are distinguished by higher thermal inertia values and record a period of substantial deposition and exhumation that followed the deposition and exhumation of the Mount Sharp group. Several spatially extensive silica deposits associated with veins and fractures show that late‐stage silica enrichment within lower Mount Sharp was pervasive. At least two laterally extensive hematitic deposits are present at different stratigraphic intervals, and both are geometrically conformable with lower Mount Sharp strata. The occurrence of hematite at multiple stratigraphic horizons suggests redox interfaces were widespread in space and/or in time, and future measurements by the Mars Science Laboratory Curiosity rover will provide further insights into the depositional settings of these and other mineral phases. PMID:27867788

  4. Reconstructing a mid-Cretaceous landscape from paleosols in western Canada

    USGS Publications Warehouse

    Ufnar, David F.; Gonzalez, Luis A.; Ludvigson, Greg A.; Brenner, Richard L.; Witzke, B.J.; Leckie, D.

    2005-01-01

    The Albian Stage of the mid-Cretaceous was a time of equable climate conditions with high sea levels and broad shallow epeiric seas that may have had a moderating effect on continental climates. A Late Albian landscape surface that developed during a regression and subsequent sea-level rise in the Western Canada Foreland Basin is reconstructed on the basis of correlation of paleosols penetrated by cores through the Paddy Member of the Peace River Formation. Reconstruction of this landscape refines chronostratigraphic relationships and will benefit future paleoclimatological studies utilizing continental sphaerosiderite proxy records. The paleosols developed in estuarine sandstones and mudstones, and they exhibit evidence of a polygenetic history. Upon initial exposure and pedogenesis, the Paddy Member developed deeply weathered, well-drained cumulative soil profiles. Later stages of pedogenesis were characterized by hydromorphic soil conditions. The stages of soil development interpreted for the Paddy Member correlate with inferred stages of pedogenic development in time-equivalent formations located both basinward and downslope (upper Viking Formation), and landward and upslope (Boulder Creek Formation). On the basis of the genetic similarity among paleosols in these three correlative formations, the paleosols are interpreted as having formed along a single, continuous landscape surface. Results of this study indicate that the catena concept of pedogenesis along sloping landscapes is applicable to ancient successions. Sphaerosiderites in the Paddy Member paleosols are used to provide proxy values for meteoric δ18O values at 52° N paleolatitude in the Cretaceous Western Interior Basin. The meteoric δ18O values are used to refine existing interpretations of the mid-Cretaceous paleolatitudinal gradient in meteoric δ18O values and of the mid-Cretaceous hydrologic cycle. Copyright © 2005, SEPM (Society for Sedimentary Geology).

  5. Abstract-Reasoning Software for Coordinating Multiple Agents

    NASA Technical Reports Server (NTRS)

    Clement, Bradley; Barrett, Anthony; Rabideau, Gregg; Knight, Russell

    2003-01-01

    A computer program for scheduling the activities of multiple agents that share limited resources has been incorporated into the Automated Scheduling and Planning Environment (ASPEN) software system, aspects of which have been reported in several previous NASA Tech Briefs articles. In the original intended application, the agents would be multiple spacecraft and/or robotic vehicles engaged in scientific exploration of distant planets. The program could also be used on Earth in such diverse settings as production lines and military maneuvers. This program includes a planning/scheduling subprogram of the iterative repair type that reasons about the activities of multiple agents at abstract levels in order to greatly improve the scheduling of their use of shared resources. The program summarizes the information about the constraints on, and resource requirements of, abstract activities on the basis of the constraints and requirements that pertain to their potential refinements (decomposition into less-abstract and ultimately to primitive activities). The advantage of reasoning about summary information is that time needed to find consistent schedules is exponentially smaller than the time that would be needed for reasoning about the same tasks at the primitive level.
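
    A toy sketch of the summary-information idea: each abstract activity summarizes the resource requirements of its possible refinements, here simply as min/max bounds on total demand over the alternative decompositions. Activity names and numbers are made up; the actual summarization in the planner also tracks timing and state constraints.

    primitives = {"warm_up": 2, "drive": 5, "drill": 8, "image": 3}     # peak power units

    decompositions = {
        "collect_sample": [["warm_up", "drive", "drill"], ["warm_up", "drill"]],
        "survey_site": [["warm_up", "drive", "image"], ["image"]],
    }

    def summarize(activity):
        """Return (min, max) total demand over all refinements of an abstract activity."""
        if activity in primitives:
            return primitives[activity], primitives[activity]
        totals = []
        for option in decompositions[activity]:
            lo = sum(summarize(child)[0] for child in option)
            hi = sum(summarize(child)[1] for child in option)
            totals.append((lo, hi))
        return min(t[0] for t in totals), max(t[1] for t in totals)

    for act in decompositions:
        print(act, summarize(act))   # bounds usable for conflict checks before refining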

  6. On Correspondence of BRST-BFV, Dirac, and Refined Algebraic Quantizations of Constrained Systems

    NASA Astrophysics Data System (ADS)

    Shvedov, O. Yu.

    2002-11-01

    The correspondence between BRST-BFV, Dirac, and refined algebraic (group averaging, projection operator) approaches to quantizing constrained systems is analyzed. For the closed-algebra case, it is shown that the component of the BFV wave function corresponding to maximal (minimal) value of number of ghosts and antighosts in the Schrödinger representation may be viewed as a wave function in the refined algebraic (Dirac) quantization approach. The Giulini-Marolf group averaging formula for the inner product in the refined algebraic quantization approach is obtained from the Batalin-Marnelius prescription for the BRST-BFV inner product, which should be generally modified due to topological problems. The considered prescription for the correspondence of states is observed to be applicable to the open-algebra case. The refined algebraic quantization approach is generalized then to the case of nontrivial structure functions. A simple example is discussed. The correspondence of observables for different quantization methods is also investigated.
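
    For reference, the group-averaging (refined algebraic quantization) inner product mentioned above has the standard form shown below, where U(g) is the unitary representation of the gauge group G on the auxiliary Hilbert space and dg an invariant measure; the specific Batalin-Marnelius correspondence derived in the paper is not reproduced here.

    \langle \psi_1 \mid \psi_2 \rangle_{\mathrm{phys}} \;=\; \int_G dg \, \langle \psi_1 \mid U(g)\, \psi_2 \rangle_{\mathrm{aux}}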

  7. Statistical Analysis of CFD Solutions from 2nd Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, M. J.; Morrison, J. H.

    2004-01-01

    In June 2001, the first AIAA Drag Prediction Workshop was held to evaluate results obtained from extensive N-Version testing of a series of RANS CFD codes. The geometry used for the computations was the DLR-F4 wing-body combination which resembles a medium-range subsonic transport. The cases reported include the design cruise point, drag polars at eight Mach numbers, and drag rise at three values of lift. Although comparisons of the code-to-code medians with available experimental data were similar to those obtained in previous studies, the code-to-code scatter was more than an order-of-magnitude larger than expected and far larger than desired for design and for experimental validation. The second Drag Prediction Workshop was held in June 2003 with emphasis on the determination of installed pylon-nacelle drag increments and on grid refinement studies. The geometry used was the DLR-F6 wing-body-pylon-nacelle combination for which the design cruise point and the cases run were similar to the first workshop except for additional runs on coarse and fine grids to complement the runs on medium grids. The code-to-code scatter was significantly reduced for the wing-body configuration compared to the first workshop, although still much larger than desired. However, the grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement.

  8. Adaptive mesh refinement for time-domain electromagnetics using vector finite elements: a feasibility study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, C. David; Kotulski, Joseph Daniel; Pasik, Michael Francis

    This report investigates the feasibility of applying Adaptive Mesh Refinement (AMR) techniques to a vector finite element formulation for the wave equation in three dimensions. Possible error estimators are considered first. Next, approaches for refining tetrahedral elements are reviewed. AMR capabilities within the Nevada framework are then evaluated. We summarize our conclusions on the feasibility of AMR for time-domain vector finite elements and identify a path forward.

  9. Abstracting Sequences: Reasoning That Is a Key to Academic Achievement.

    PubMed

    Pasnak, Robert; Kidd, Julie K; Gadzichowski, K Marinka; Gallington, Debbie A; Schmerold, Katrina Lea; West, Heather

    2015-01-01

    The ability to understand sequences of items may be an important cognitive ability. To test this proposition, 8 first-grade children from each of 36 classes were randomly assigned to four conditions. Some were taught sequences that represented increasing or decreasing values, or were symmetrical, or were rotations of an object through 6 or 8 positions. Control children received equal numbers of sessions on mathematics, reading, or social studies. Instruction was conducted three times weekly in 15-min sessions for seven months. In May, the children who had been taught sequences applied their understanding to novel sequences, and scored as well as or better than the control children on three standardized reading tests. They outscored all children on tests of mathematics concepts, and scored better than control children on some mathematics scales. These findings indicate that developing an understanding of sequences is a form of abstraction, probably involving fluid reasoning, that provides a foundation for academic achievement in early education.

  10. Microarray data mining using Bioconductor packages.

    PubMed

    Nie, Haisheng; Neerincx, Pieter B T; van der Poel, Jan; Ferrari, Francesco; Bicciato, Silvio; Leunissen, Jack A M; Groenen, Martien A M

    2009-07-16

    This paper describes the results of a Gene Ontology (GO) term enrichment analysis of chicken microarray data using the Bioconductor packages. By checking the enriched GO terms in three contrasts, MM8-PM8, MM8-MA8, and MM8-MM24, of the provided microarray data during this workshop, this analysis aimed to investigate the host reactions in chickens occurring shortly after a secondary challenge with either a homologous or heterologous species of Eimeria. The results of GO enrichment analysis using GO terms annotated to chicken genes and GO terms annotated to chicken-human orthologous genes were also compared. Furthermore, a locally adaptive statistical procedure (LAP) was performed to test differentially expressed chromosomal regions, rather than individual genes, in the chicken genome after Eimeria challenge. GO enrichment analysis identified significant (raw p-value < 0.05) GO terms for all three contrasts included in the analysis. Some of the GO terms were linked to primary or secondary immune responses, indicating that GO enrichment analysis is a useful approach for analyzing microarray data. The comparisons of GO enrichment results using chicken gene information and chicken-human orthologous gene information showed more refined GO terms related to immune responses when using chicken-human orthologous gene information; this suggests that using chicken-human orthologous gene information has higher power to detect significant GO terms with more refined functionality. Furthermore, three chromosome regions were identified as significantly up-regulated in contrast MM8-PM8 (q-value < 0.01). Overall, this paper describes a practical approach to analyze microarray data in farm animals where the genome information is still incomplete. For farm animals, such as chicken, with currently limited gene annotation, borrowing gene annotation information from orthologous genes in well-annotated species, such as human, will help improve the pathway analysis results substantially. Furthermore, the LAP analysis approach is a relatively new and very useful method for microarray analysis.
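
    The core of such a GO term enrichment test is a hypergeometric (one-sided Fisher) calculation, sketched below in Python with invented counts; the Bioconductor packages used in the paper wrap this kind of test in R together with the GO annotation handling.

    from scipy.stats import hypergeom

    N, K = 12000, 150          # genes on the array / genes annotated with the GO term
    n, k = 400, 14             # differentially expressed genes / overlap with the GO term

    p = hypergeom.sf(k - 1, N, K, n)       # P(X >= k) under random sampling
    print(f"enrichment p-value = {p:.3g}")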

  11. Study of labor-negotiation productivity concerns in the petroleum-refining industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, J.E.

    The primary objective of this study was to identify productivity factors relative to negotiating future labor contracts with the Oil, Chemical and Atomic Workers International Union (OCAWIU). A Delphi research method was utilized to accomplish this purpose. The study utilized three rounds to obtain the stated objectives. Round one involved the use of an open instrument to solicit productivity factors that would be beneficial in future negotiations with the OCAWIU. In round two, two separate instruments were sent to the panel members who were asked to judge the value of each item on the first instrument, and to rank the ten most significant items on the second. The round three instruments were individualized for each panel member. The productivity items were rated by the panel members, and descriptive statistics were used to describe the combined order of listings and weights for determining the relative importance of each factor in the consensus model. Nonparametric statistics were used to examine the degree of consensus between the mean values on the first instrument and the ranked values on the second instrument. No significant differences were found. Twenty-five productivity items were identified and prioritized as viable negotiable items with the OCAWIU.

  12. Dietary macronutrients and food consumption as determinants of long-term weight change in adult populations: a systematic literature review

    PubMed Central

    Fogelholm, Mikael; Anderssen, Sigmund; Gunnarsdottir, Ingibjörg; Lahti-Koski, Marjaana

    2012-01-01

    This systematic literature review examined the role of dietary macronutrient composition, food consumption and dietary patterns in predicting weight or waist circumference (WC) change, with and without prior weight reduction. The literature search covered year 2000 and onwards. Prospective cohort studies, case–control studies and interventions were included. The studies had adult (18–70 y), mostly Caucasian participants. Out of a total of 1,517 abstracts, 119 full papers were identified as potentially relevant. After a careful scrutiny, 50 papers were quality graded as A (highest), B or C. Forty-three papers with grading A or B were included in evidence grading, which was done separately for all exposure-outcome combinations. The grade of evidence was classified as convincing, probable, suggestive or no conclusion. We found probable evidence for high intake of dietary fibre and nuts predicting less weight gain, and for high intake of meat in predicting more weight gain. Suggestive evidence was found for a protective role against increasing weight from whole grains, cereal fibre, high-fat dairy products and high scores in an index describing a prudent dietary pattern. Likewise, there was suggestive evidence for both fibre and fruit intake in protection against larger increases in WC. Also suggestive evidence was found for high intake of refined grains, and sweets and desserts in predicting more weight gain, and for refined (white) bread and high energy density in predicting larger increases in WC. The results suggested that the proportion of macronutrients in the diet was not important in predicting changes in weight or WC. In contrast, plenty of fibre-rich foods and dairy products, and less refined grains, meat and sugar-rich foods and drinks were associated with less weight gain in prospective cohort studies. The results on the role of dietary macronutrient composition in prevention of weight regain (after prior weight loss) were inconclusive. PMID:22893781

  13. Feasibility study: refinement of the TTC concept by additional rules based on in silico and experimental data.

    PubMed

    Hauge-Nilsen, Kristin; Keller, Detlef

    2015-01-01

    Starting from a single generic limit value, the threshold of toxicological concern (TTC) concept has been further developed over the years, e.g., by including differentiated structural classes according to the rules of Cramer et al. (Food Chem Toxicol 16: 255-276, 1978). In practice, the refined TTC concept of Munro et al. (Food Chem Toxicol 34: 829-867, 1996) is often applied. The purpose of this work was to explore the possibility of refining the concept by introducing additional structure-activity relationships and available toxicity data. Computer modeling was performed using the OECD Toolbox. No observed (adverse) effect level (NO(A)EL) data of 176 substances were collected in a basic data set. New subgroups were created applying the following criteria: extended Cramer rules, low bioavailability, low acute toxicity, no protein binding affinity, and consideration of predicted liver metabolism. The highest TTC limit value of 236 µg/kg/day was determined for a subgroup that combined the criteria "no protein binding affinity" and "predicted liver metabolism." This value was approximately eight times higher than the original Cramer class 1 limit value of 30 µg/kg/day. The results of this feasibility study indicate that inclusion of the proposed criteria may lead to improved TTC values. Thereby, the applicability of the TTC concept in risk assessment could be extended which could reduce the need to perform animal tests.

  14. Refined structures of three crystal forms of toxic shock syndrome toxin-1 and of a tetramutant with reduced activity.

    PubMed Central

    Prasad, G. S.; Radhakrishnan, R.; Mitchell, D. T.; Earhart, C. A.; Dinges, M. M.; Cook, W. J.; Schlievert, P. M.; Ohlendorf, D. H.

    1997-01-01

    The structure of toxic shock syndrome toxin-1 (TSST-1), the causative agent in toxic shock syndrome, has been determined in three crystal forms. The three structural models have been refined to R-factors of 0.154, 0.150, and 0.198 at resolutions of 2.05 Å, 2.90 Å, and 2.75 Å, respectively. One crystal form of TSST-1 contains a zinc ion bound between two symmetry-related molecules. Although not required for biological activity, zinc dramatically potentiates the mitogenicity of TSST-1 at very low concentrations. In addition, the structure of the tetramutant TSST-1H [T69I, Y80W, E132K, I140T], which is nonmitogenic and does not amplify endotoxin shock, has been determined and refined in a fourth crystal form (R-factor = 0.173 to 1.9 Å resolution). PMID:9194182
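
    The crystallographic R-factors quoted here follow the conventional definition as the normalized discrepancy between observed and calculated structure-factor amplitudes; the formula below is the standard textbook relation, not one restated in the paper:

      R = \frac{\sum_{hkl} \bigl|\, |F_{\mathrm{obs}}(hkl)| - |F_{\mathrm{calc}}(hkl)| \,\bigr|}{\sum_{hkl} |F_{\mathrm{obs}}(hkl)|}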

  15. Vertical Scan (V-SCAN) for 3-D Grid Adaptive Mesh Refinement for an atmospheric Model Dynamical Core

    NASA Astrophysics Data System (ADS)

    Andronova, N. G.; Vandenberg, D.; Oehmke, R.; Stout, Q. F.; Penner, J. E.

    2009-12-01

    One of the major building blocks of a rigorous representation of cloud evolution in global atmospheric models is a parallel adaptive grid MPI-based communication library (an Adaptive Blocks for Locally Cartesian Topologies library -- ABLCarT), which manages the block-structured data layout, handles ghost cell updates among neighboring blocks and splits a block as refinements occur. The library has several modules that provide a layer of abstraction for adaptive refinement: blocks, which contain individual cells of user data; shells - the global geometry for the problem, including a sphere, reduced sphere, and now a 3D sphere; a load balancer for placement of blocks onto processors; and a communication support layer which encapsulates all data movement. A major performance concern with adaptive mesh refinement is how to represent calculations that need to be sequenced in a particular order in a direction, such as calculating integrals along a specific path (e.g. atmospheric pressure or geopotential in the vertical dimension). This concern is compounded if the blocks have varying levels of refinement, or are scattered across different processors, as can be the case in parallel computing. In this paper we describe an implementation in ABLCarT of a vertical scan operation, which allows computing along vertical paths in the correct order across blocks, transparently to their resolution and processor location. We test this functionality on a 2D and a 3D advection problem, which tests the performance of the model’s dynamics (transport) and physics (sources and sinks) for different model resolutions needed for inclusion of cloud formation.
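
    As a rough illustration of the scan idea (not the ABLCarT interface, which the abstract does not expose), a vertical cumulative integral across blocks can be sketched by ordering blocks along the vertical coordinate and threading a carry value through them; the block layout and field names here are invented for the example.

      import numpy as np

      def vertical_scan(blocks):
          # blocks: list of dicts with 'z_top' (vertical position of the block's upper face)
          # and 'values' (1D array of layer increments inside the block, top to bottom).
          # Returns running integrals per block, in the original block order.
          # Illustrative only: a real AMR code must also handle horizontal neighbours,
          # differing refinement levels and blocks living on other processors.
          order = sorted(range(len(blocks)), key=lambda i: blocks[i]['z_top'], reverse=True)
          carry, out = 0.0, [None] * len(blocks)
          for i in order:                      # top of the column first
              partial = carry + np.cumsum(blocks[i]['values'])
              out[i] = partial
              carry = partial[-1]              # pass the column total downward
          return out

      # Hypothetical column split into two blocks of different resolution
      column = [{'z_top': 10.0, 'values': np.array([1.0, 1.0])},
                {'z_top':  8.0, 'values': np.array([0.5, 0.5, 0.5, 0.5])}]
      print(vertical_scan(column))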

  16. Valuing Exercises for the Middle School. Resource Monograph No. 11.

    ERIC Educational Resources Information Center

    Casteel, J. Doyle; And Others

    One of the major goals of the middle school is to help students gain and refine skills in the area of values clarification. One way of securing such value clarification is to plan and assign value sheets--carefully planned and written activities designed to elicit value clarification patterns of language usage from students. Six different formats…

  17. Stepwise construction of a metabolic network in Event-B: The heat shock response.

    PubMed

    Sanwal, Usman; Petre, Luigia; Petre, Ion

    2017-12-01

    There is high interest in constructing large, detailed computational models for biological processes. This is often done by putting together existing submodels and adding extra details/knowledge to them. The result of such approaches is usually a model that can only answer questions on a very specific level of detail, and thus, ultimately, is of limited use. We focus instead on an approach to systematically add details to a model, with formal verification of its consistency at each step. In this way, one obtains a set of reusable models, at different levels of abstraction, to be used for different purposes depending on the question to address. We demonstrate this approach using Event-B, a computational framework introduced to develop formal specifications of distributed software systems. We first describe how to model generic metabolic networks in Event-B. We then apply this method to model the biological heat shock response in eukaryotic cells, using Event-B refinement techniques. The advantage of using Event-B is that refinement is an intrinsic feature; the final result is not only a correct model but a chain of models automatically linked by refinement, each of which is provably correct and reusable. This serves as a proof of concept that refinement in Event-B is suitable for biomodeling and helps to master biological complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Homology‐based hydrogen bond information improves crystallographic structures in the PDB

    PubMed Central

    van Beusekom, Bart; Touw, Wouter G.; Tatineni, Mahidhar; Somani, Sandeep; Rajagopal, Gunaretnam; Luo, Jinquan; Gilliland, Gary L.; Perrakis, Anastassis

    2017-01-01

    The Protein Data Bank (PDB) is the global archive for structural information on macromolecules, and a popular resource for researchers, teachers, and students, amassing more than one million unique users each year. Crystallographic structure models in the PDB (more than 100,000 entries) are optimized against the crystal diffraction data and geometrical restraints. This process of crystallographic refinement typically ignored hydrogen bond (H‐bond) distances as a source of information. However, H‐bond restraints can improve structures at low resolution where diffraction data are limited. To improve low‐resolution structure refinement, we present methods for deriving H‐bond information either globally from well‐refined high‐resolution structures from the PDB‐REDO databank, or specifically from on‐the‐fly constructed sets of homologous high‐resolution structures. Refinement incorporating HOmology DErived Restraints (HODER) improves geometrical quality and the fit to the diffraction data for many low‐resolution structures. To make these improvements readily available to the general public, we applied our new algorithms to all crystallographic structures in the PDB: using massively parallel computing, we constructed a new instance of the PDB‐REDO databank (https://pdb-redo.eu). This resource is useful for researchers to gain insight on individual structures, on specific protein families (as we demonstrate with examples), and on general features of protein structure using data mining approaches on a uniformly treated dataset. PMID:29168245

  19. A Matlab-based finite-difference solver for the Poisson problem with mixed Dirichlet-Neumann boundary conditions

    NASA Astrophysics Data System (ADS)

    Reimer, Ashton S.; Cheviakov, Alexei F.

    2013-03-01

    A Matlab-based finite-difference numerical solver for the Poisson equation for a rectangle and a disk in two dimensions, and a spherical domain in three dimensions, is presented. The solver is optimized for handling an arbitrary combination of Dirichlet and Neumann boundary conditions, and allows for full user control of mesh refinement. The solver routines utilize effective and parallelized sparse vector and matrix operations. Computations exhibit high speeds, numerical stability with respect to mesh size and mesh refinement, and acceptable error values even on desktop computers.
    Catalogue identifier: AENQ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENQ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License v3.0
    No. of lines in distributed program, including test data, etc.: 102793
    No. of bytes in distributed program, including test data, etc.: 369378
    Distribution format: tar.gz
    Programming language: Matlab 2010a
    Computer: PC, Macintosh
    Operating system: Windows, OSX, Linux
    RAM: 8 GB (8,589,934,592 bytes)
    Classification: 4.3
    Nature of problem: To solve the Poisson problem in a standard domain with “patchy surface”-type (strongly heterogeneous) Neumann/Dirichlet boundary conditions.
    Solution method: Finite difference with mesh refinement.
    Restrictions: Spherical domain in 3D; rectangular domain or a disk in 2D.
    Unusual features: Choice between mldivide/iterative solver for the solution of the large system of linear algebraic equations that arise. Full user control of Neumann/Dirichlet boundary conditions and mesh refinement.
    Running time: Depending on the number of points taken and the geometry of the domain, the routine may take from less than a second to several hours to execute.
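
    The published solver is Matlab code; purely for orientation, the sketch below sets up the analogous five-point finite-difference system in Python for a rectangle with Dirichlet boundaries and solves it as a sparse linear system. The mixed Neumann conditions and mesh refinement handled by the actual program are deliberately omitted.

      import numpy as np
      from scipy.sparse import lil_matrix, csr_matrix
      from scipy.sparse.linalg import spsolve

      def poisson_dirichlet(nx, ny, lx, ly, f, g):
          # Solve -Laplace(u) = f on [0,lx] x [0,ly] with Dirichlet data g on the boundary.
          hx, hy = lx / (nx - 1), ly / (ny - 1)
          x, y = np.linspace(0, lx, nx), np.linspace(0, ly, ny)
          idx = lambda i, j: i * ny + j
          A = lil_matrix((nx * ny, nx * ny))
          b = np.zeros(nx * ny)
          for i in range(nx):
              for j in range(ny):
                  k = idx(i, j)
                  if i in (0, nx - 1) or j in (0, ny - 1):   # boundary node
                      A[k, k] = 1.0
                      b[k] = g(x[i], y[j])
                  else:                                      # interior 5-point stencil
                      A[k, k] = 2.0 / hx**2 + 2.0 / hy**2
                      A[k, idx(i - 1, j)] = -1.0 / hx**2
                      A[k, idx(i + 1, j)] = -1.0 / hx**2
                      A[k, idx(i, j - 1)] = -1.0 / hy**2
                      A[k, idx(i, j + 1)] = -1.0 / hy**2
                      b[k] = f(x[i], y[j])
          return spsolve(csr_matrix(A), b).reshape(nx, ny)

      # Manufactured solution u = sin(pi x) sin(pi y) on the unit square; maximum should be near 1
      u = poisson_dirichlet(41, 41, 1.0, 1.0,
                            f=lambda x, y: 2 * np.pi**2 * np.sin(np.pi * x) * np.sin(np.pi * y),
                            g=lambda x, y: 0.0)
      print(u.max())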

  20. Participation and environmental aspects in education and the ICF and the ICF-CY: findings from a systematic literature review.

    PubMed

    Maxwell, Gregor; Alves, Ines; Granlund, Mats

    2012-01-01

    This paper presents findings from a systematic review of the literature related to participation and the ICF/ICF-CY in educational research. The aim was to investigate how participation is applied in educational research, specifically how it relates to the environmental dimensions of availability, accessibility, affordability, accommodability and acceptability. A systematic literature review was conducted using database keyword searches and refinement protocols, with inclusion and exclusion criteria applied at the abstract, full-text and data-extraction stages. Four hundred and twenty-one initial works were found. Twenty-three met the inclusion criteria. Availability and accommodations are the most investigated dimensions. Operationalization of participation is not always consistent with the definitions used. Research is developing a holistic approach to investigating participation as, although all papers reference at least one environmental dimension, only four of the 11 empirical works reviewed present a fully balanced approach when theorizing and operationalizing participation; hopefully this balanced approach will continue and influence educational policy and school practice.

  1. Partitioning an object-oriented terminology schema.

    PubMed

    Gu, H; Perl, Y; Halper, M; Geller, J; Kuo, F; Cimino, J J

    2001-07-01

    Controlled medical terminologies are increasingly becoming strategic components of various healthcare enterprises. However, the typical medical terminology can be difficult to exploit due to its extensive size and high density. The schema of a medical terminology offered by an object-oriented representation is a valuable tool in providing an abstract view of the terminology, enhancing comprehensibility and making it more usable. However, schemas themselves can be large and unwieldy. We present a methodology for partitioning a medical terminology schema into manageably sized fragments that promote increased comprehension. Our methodology has a refinement process for the subclass hierarchy of the terminology schema. The methodology is carried out by a medical domain expert in conjunction with a computer. The expert is guided by a set of three modeling rules, which guarantee that the resulting partitioned schema consists of a forest of trees. This makes it easier to understand and consequently use the medical terminology. The application of our methodology to the schema of the Medical Entities Dictionary (MED) is presented.

  2. MODFLOW-2005, The U.S. Geological Survey Modular Ground-Water Model - Documentation of the Multiple-Refined-Areas Capability of Local Grid Refinement (LGR) and the Boundary Flow and Head (BFH) Package

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2007-01-01

    This report documents the addition of the multiple-refined-areas capability to shared node Local Grid Refinement (LGR) and Boundary Flow and Head (BFH) Package of MODFLOW-2005, the U.S. Geological Survey modular, three-dimensional, finite-difference ground-water flow model. LGR now provides the capability to simulate ground-water flow by using one or more block-shaped, higher resolution local grids (child model) within a coarser grid (parent model). LGR accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the shared interfacing boundaries. The ability to have multiple, nonoverlapping areas of refinement is important in situations where there is more than one area of concern within a regional model. In this circumstance, LGR can be used to simulate these distinct areas with higher resolution grids. LGR can be used in two-and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined ground-water systems. The BFH Package can be used to simulate these situations by using either the parent or child models independently.

  3. Investigation of Novel Glass Scintillators for Gamma Ray Detection

    DTIC Science & Technology

    2013-02-01

    February 2013; grant HDTRA1-07-1-0017; Mansoor Sheik-Bahae, University of New Mexico. Recoverable figure caption: Fig. 2: Effect of conduction band edge position on the relaxation dynamics of the Ce3+ excited state.

  4. Evaluating Between-Pathway Models with Expression Data

    PubMed Central

    Leiserson, M.D.M.; Cowen, L.J.; Slonim, D.K.

    2010-01-01

    Abstract Between-pathway models (BPMs) are network motifs consisting of pairs of putative redundant pathways. In this article, we show how adding another source of high-throughput data—microarray gene expression data from knockout experiments—allows us to identify a compensatory functional relationship between genes from the two BPM pathways. We evaluate the quality of the BPMs from four different studies, and we describe how our methods might be extended to refine pathways. PMID:20377458

  5. Developing an Experiential Definition of Recovery: Participatory Research with Recovering Substance Abusers from Multiple Pathways

    PubMed Central

    Borkman, Thomasina J.; Stunz, Aina; Kaskutas, Lee Ann

    2016-01-01

    Background: The What is Recovery? (WIR) study identified specific elements of a recovery definition that people in substance abuse recovery from multiple pathways would endorse. Objectives: To explain how participatory research contributed to the development of a comprehensive pool of items defining recovery; and to identify the commonality between the specific items endorsed by participants as defining recovery and the abstract components of recovery found in four important broad recovery definitions. Methods: A four-step, mixed-methods, iterative process was used to develop and pretest items (August 2010 to February 2012). Online survey recruitment (n=238) was done via email lists of individuals in recovery and electronic advertisements; 54 were selected for in-depth telephone interviews. Analyses using experientially-based and survey research criteria resulted in a revised item pool of 47 refined and specific items. The WIR items were matched with the components of four important definitions. Results: Recovering participants (1) proposed and validated new items; (2) developed an alternative response category to the Likert; (3) suggested criteria for eliminating items irrelevant to recovery. The matching of WIR items with the components of important abstract definitions revealed extensive commonality. Conclusions and importance: The WIR items define recovery as ways of being, as a growth and learning process involving internal values and self-awareness with moral dimensions. This is the first wide-scale research identifying specific items defining recovery, which can be used to guide service provision in Recovery-Oriented Systems of Care. PMID:27159851

  6. Developing an Experiential Definition of Recovery: Participatory Research With Recovering Substance Abusers From Multiple Pathways.

    PubMed

    Borkman, Thomasina Jo; Stunz, Aina; Kaskutas, Lee Ann

    2016-07-28

    The What is Recovery? (WIR) study identified specific elements of a recovery definition that people in substance abuse recovery from multiple pathways would endorse. To explain how participatory research contributed to the development of a comprehensive pool of items defining recovery; and to identify the commonality between the specific items endorsed by participants as defining recovery and the abstract components of recovery found in four important broad recovery definitions. A four-step, mixed-methods, iterative process was used to develop and pretest items (August 2010 to February 2012). Online survey recruitment (n = 238) was done via email lists of individuals in recovery and electronic advertisements; 54 were selected for in-depth telephone interviews. Analyses using experientially-based and survey research criteria resulted in a revised item pool of 47 refined and specific items. The WIR items were matched with the components of four important definitions. Recovering participants (1) proposed and validated new items; (2) developed an alternative response category to the Likert; (3) suggested criteria for eliminating items irrelevant to recovery. The matching of WIR items with the components of important abstract definitions revealed extensive commonality. The WIR items define recovery as ways of being, as a growth and learning process involving internal values and self-awareness with moral dimensions. This is the first wide-scale research identifying specific items defining recovery, which can be used to guide service provision in Recovery-Oriented Systems of Care.

  7. Value co-creation platform design within the context of technology-driven businesses

    NASA Astrophysics Data System (ADS)

    Tanev, Stoyan; Ruskov, Petko

    2010-02-01

    The article provides a discussion of value co-creation platforms within the context of technology-driven business. It emphasizes the need for a terminological refinement of the value co-creation paradigm as well as for an articulation of its implications for the design and reconfiguration of the company value network.

  8. Teaching professionalism in graduate medical education: What is the role of simulation?

    PubMed

    Wali, Eisha; Pinto, Jayant M; Cappaert, Melissa; Lambrix, Marcie; Blood, Angela D; Blair, Elizabeth A; Small, Stephen D

    2016-09-01

    We systematically reviewed the literature concerning simulation-based teaching and assessment of the Accreditation Council for Graduate Medical Education professionalism competencies to elucidate best practices and facilitate further research. A systematic review of English literature for "professionalism" and "simulation(s)" yielded 697 abstracts. Two independent raters chose abstracts that (1) focused on graduate medical education, (2) described the simulation method, and (3) used simulation to train or assess professionalism. Fifty abstracts met the criteria, and seven were excluded for lack of relevant information. The raters, 6 professionals with medical education, simulation, and clinical experience, discussed 5 of these articles as a group; they calibrated coding and applied further refinements, resulting in a final, iteratively developed evaluation form. The raters then divided into 2 teams to read and assess the remaining articles. Overall, 15 articles were eliminated, and 28 articles underwent final analysis. Papers addressed a heterogeneous range of professionalism content via multiple methods. Common specialties represented were surgery (46.4%), pediatrics (17.9%), and emergency medicine (14.3%). Sixteen articles (57%) referenced a professionalism framework; 14 (50%) incorporated an assessment tool; and 17 (60.7%) reported debriefing participants, though in limited detail. Twenty-three (82.1%) articles evaluated programs, mostly using subjective trainee reports. Despite early innovation, reporting of simulation-based professionalism training and assessment is nonstandardized in methods and terminology and lacks the details required for replication. We offer minimum standards for reporting of future professionalism-focused simulation training and assessment as well as a basic framework for better mapping proper simulation methods to the targeted domain of professionalism. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Impact of Operating Context on the Use of Structure in Air Traffic Controller Cognitive Processes

    NASA Technical Reports Server (NTRS)

    Davison, Hayley J.; Histon, Jonathan M.; Ragnarsdottir, Margret Dora; Major, Laura M.; Hansman, R. John

    2004-01-01

    This paper investigates the influence of structure on air traffic controllers' cognitive processes in the TRACON, En Route, and Oceanic environments. Radar data and voice command analyses were conducted to support hypotheses generated through observations and interviews conducted at the various facilities. Three general types of structure-based abstractions (standard flows, groupings, and critical points) have been identified as being used in each context, though the details of their application varied in accordance with the constraints of the particular operational environment. Projection emerged as a key cognitive process aided by the structure-based abstractions, and there appears to be a significant difference between how time-based versus spatial-based projection is performed by controllers. It is recommended that consideration be given to the value provided by the structure-based abstractions to the controller, as well as to maintaining consistency between the type (time or spatial) of information support provided to the controller.

  10. Multi-scale approach for 3D hydrostratigraphic and groundwater flow modelling of Milan (Northern Italy) urban aquifers.

    NASA Astrophysics Data System (ADS)

    De Caro, Mattia; Crosta, Giovanni; Frattini, Paolo; Perico, Roberta

    2017-04-01

    During the last century, urban groundwater was heavily exploited for public and industrial supply. As the water demands of industry have fallen, many cities are now experiencing rising groundwater levels with consequent concerns about localized flooding of basements, reduction of soil bearing capacities under foundations, soil internal erosion and the mobilization of contaminants. The city of Milan (Northern Italy) draws water for domestic and industrial purposes from aquifers beneath the urban area. The rate of abstraction has been varying during the last century, depending upon the number of inhabitants and the development of industrial activities. Groundwater abstraction rose to a maximum of about 350×10⁶ m³/yr in the mid-1970s and has since decreased to about 230×10⁶ m³/yr at present. This caused a water table rise at an average rate of about 1 m/yr, inducing infiltrations and flooding of deep constructions (e.g. building foundations and basements, underground subway excavations). Starting from a large hydrostratigraphic database (8628 borehole logs), a multi-scale approach for the reconstruction of the aquifers geometry (unconfined and semi-confined) at regional-scale has been used. First, a three-level hierarchical classification of the lithologies (lithofacies, hydrofacies, aquifer groups) has been adopted. Then, the interpretation of several 2D cross-sections was attained with Target for ArcGIS exploration software. The interpretation of cross-sections was based on the characteristics of depositional environments of the analysed aquifers (from meandering plain to proximal outwash deposits), on the position of quaternary deposits, and on the distribution of geochemical parameters (i.e. indicator contaminants and major ions). Finally, the aquifer boundary surfaces were interpolated with standard algorithms. The hydraulic properties of analysed aquifers were estimated through the analyses of available step-drawdown tests (Theis with the superposition solution) and use of empirical relationships from grain-size distribution data, respectively for semi-confined and unconfined aquifers. Finally, 3D Finite Element groundwater flow models have been developed both at regional and local "metropolitan" scale. The regional model covers an area of 3,135.15 km², while the local model comprises the Milan metropolitan area with an extension of 457 km². Both models were discretized into a triangular finite element mesh with local refinement in proximity of pumping wells. The element size ranges from 350 to 30 meters and from 200 to 2 meters, respectively for regional and local model. The calibration was done by the comparison with the available water level data for different years (from 1994 to 2016). The calibrated permeability values are consistent with the estimated ones and the sensitivity analysis on hydraulic parameters suggests a minor influence of the aquiclude layer separating the two aquifers. The challenge is to provide the basis for new types of applied outputs so that they may better inform strategic planning options, ground investigation, and abstraction strategies.
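
    The Theis solution mentioned for the step-drawdown analyses is the classical transient drawdown relation; it is quoted here in its standard form for orientation, not as a formula taken from the abstract:

      s(r, t) = \frac{Q}{4 \pi T} \, W(u), \qquad
      u = \frac{r^2 S}{4 T t}, \qquad
      W(u) = \int_u^{\infty} \frac{e^{-\xi}}{\xi} \, d\xi

    where s is the drawdown at distance r from the pumping well, Q the pumping rate, T the transmissivity and S the storativity.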

  11. Russian State Time and Earth Rotation Service: Observations, Eop Series, Prediction

    NASA Astrophysics Data System (ADS)

    Kaufman, M.; Pasynok, S.

    2010-01-01

    The Russian State Time, Frequency and Earth Rotation Service provides the official EOP data and time for use in scientific, technical and metrological work in Russia. Observations of GLONASS and GPS at 30 stations in Russia are currently used, together with Russian and worldwide VLBI (35 stations) and SLR (20 stations) observations. To these three EOP series, data calculated at two other Russian analysis centers are added: IAA (VLBI, GPS and SLR series) and MCC (SLR). Joint processing of these 7 series is carried out every day (operational EOP data for the last day and predicted values for 50 days). The EOP values are refined weekly and the systematic errors of every individual series are corrected. The combined results become accessible on the VNIIFTRI server (ftp.imvp.ru) daily at approximately 6h UT.

  12. Effect of the addition of fatty by-products from the refining of vegetable oil on methane production in co-digestion.

    PubMed

    Torrijos, M; Sousbie, P; Badey, L; Bosque, F; Steyer, J P

    2012-01-01

    The purpose of this work was to investigate the effects of the addition of by-products from the refining of vegetable oil on the behavior of co-digestion reactors treating a mixture of grass, cow dung and fruit and vegetable waste. Three by-products were used: one soapstock, one used winterization earth and one skimming of aeroflotation of the effluents. Three 15 l reactors were run in parallel and fed five times a week. In a first phase of 4 weeks, the three reactors were fed with the co-digestion substrates alone (grass, cow dung and fruit and vegetable waste) at an organic loading rate (OLR) of 1.5 g VS/kg d (VS: volatile solids). Then, a different by-product from the refining of oil was added to the feed of each reactor at an OLR of 0.5 g VS/kg d, generating a 33% increase in the OLR. The results show that the addition of by-products from the refining of oil is an efficient way of increasing the methane production of co-digestion reactors thanks to high methane yield of such by-products (0.69-0.77 l CH(4)/g VS loaded). In fact, in this work, it was possible to raise the methane production of the reactors by about 60% through a 33% increase in the OLR thanks to the addition of the by-products from the refining of vegetable oil.

  13. Content Abstract Classification Using Naive Bayes

    NASA Astrophysics Data System (ADS)

    Latif, Syukriyanto; Suwardoyo, Untung; Aldrin Wihelmus Sanadi, Edwin

    2018-03-01

    This study aims to classify abstracts according to the most frequent words in the abstract content of English-language journals. The research uses text mining, which extracts text data to retrieve information from a set of documents. Abstracts of 120 papers were downloaded from www.computer.org and grouped into three categories: DM (Data Mining), ITS (Intelligent Transport System) and MM (Multimedia). The system was built using the naive Bayes algorithm to classify the journal abstracts, with a feature selection step based on term weighting to assign a weight to each word. A dimension reduction technique was applied to remove words that rarely appear in the documents, with reduction parameters of 10%-90% of the 5,344 words tested. The performance of the classification system was evaluated with a confusion matrix based on the training and test data. The best classification results were obtained with 75% of the data used for training and 25% for testing. Accuracy rates for the DM, ITS and MM categories were 100%, 100% and 86%, respectively, with a dimension reduction parameter of 30% and a learning rate between 0.1 and 0.5.
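
    The pipeline described (term weighting, a naive Bayes classifier, and a confusion matrix on a 75/25 split) can be reproduced in outline with scikit-learn; the corpus, categories and split below are placeholders, not the authors' data or code.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import confusion_matrix, accuracy_score

      # Hypothetical abstracts and their categories (DM, ITS, MM)
      abstracts = ["frequent pattern mining in large databases",
                   "vehicle detection for intelligent transport systems",
                   "video streaming and multimedia compression",
                   "clustering algorithms for data mining tasks"]
      labels = ["DM", "ITS", "MM", "DM"]

      # Term weighting (tf-idf) stands in for the feature-selection step
      X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
      X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25, random_state=0)

      model = MultinomialNB().fit(X_train, y_train)
      pred = model.predict(X_test)
      print(confusion_matrix(y_test, pred, labels=["DM", "ITS", "MM"]))
      print("accuracy:", accuracy_score(y_test, pred))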

  14. Internal consistency of the self-reporting questionnaire-20 in occupational groups

    PubMed Central

    Santos, Kionna Oliveira Bernardes; Carvalho, Fernando Martins; de Araújo, Tânia Maria

    2016-01-01

    OBJECTIVE: To assess the internal consistency of the measurements of the Self-Reporting Questionnaire (SRQ-20) in different occupational groups. METHODS: A validation study was conducted with data from four surveys with groups of workers, using similar methods. A total of 9,959 workers were studied. In all surveys, the common mental disorders were assessed via the SRQ-20. Internal consistency was assessed for the items belonging to the dimensions extracted by tetrachoric factor analysis in each study. Item homogeneity assessment compared estimates of Cronbach’s alpha (KR-20), the alpha applied to a tetrachoric correlation matrix and stratified Cronbach’s alpha. RESULTS: The SRQ-20 dimensions showed adequate values, considering the reference parameters. The internal consistency of the instrument items, assessed by stratified Cronbach’s alpha, was high (> 0.80) in the four studies. CONCLUSIONS: The SRQ-20 showed good internal consistency in the professional categories evaluated. However, there is still a need for studies using alternative methods and additional information able to refine the accuracy of latent variable measurement instruments, as in the case of common mental disorders. PMID:27007682
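
    For reference, plain Cronbach's alpha, the starting point for the internal-consistency estimates compared in the study, takes only a few lines of numpy; the tetrachoric and stratified variants reported in the paper are not reproduced here, and the data below are toy values.

      import numpy as np

      def cronbach_alpha(items):
          # items: 2D array, rows = respondents, columns = questionnaire items (0/1 for the SRQ-20)
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      # Toy example: 5 respondents, 4 dichotomous items
      data = [[1, 1, 0, 1],
              [0, 1, 0, 0],
              [1, 1, 1, 1],
              [0, 0, 0, 1],
              [1, 0, 1, 1]]
      print(cronbach_alpha(data))   # about 0.70 for this toy matrix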

  15. [Can the local energy minimization refine the PDB structures of different resolution universally?].

    PubMed

    Godzi, M G; Gromova, A P; Oferkin, I V; Mironov, P V

    2009-01-01

    Local energy minimization was statistically evaluated as a refinement strategy for PDB structure pairs of different resolution. Thirteen pairs of structures differing only in resolution were extracted from the PDB, representing 11 identical proteins solved by different X-ray diffraction techniques. The distribution of RMSD values was calculated for these pairs before and after the local energy minimization of each structure. The MMFF94 force field was used for energy calculations, and the quasi-Newton method was used for local energy minimization. By comparing these two RMSD distributions, local energy minimization was shown to statistically increase the structural differences within pairs, so that it cannot be used for refinement purposes. To explore the prospects of complex refinement strategies based on energy minimization, randomized structures were obtained by moving the initial PDB structures as far as the minimized structures had been moved in the multidimensional space of atomic coordinates. For these randomized structures, the RMSD distribution was calculated and compared with that of the minimized structures. The significant differences in their mean values showed that the energy surface of the protein has only a few minima near the conformations of different resolution obtained by X-ray diffraction for the PDB. Some other results obtained by exploring the energy surface near these conformations are also presented. These results are expected to be very useful for the development of new protein refinement strategies based on energy minimization.
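
    The RMSD compared before and after minimization is the usual coordinate root-mean-square deviation over the N matched atoms (standard definition; the superposition details used in the paper are not restated):

      \mathrm{RMSD} = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} \left\| \mathbf{r}_i^{A} - \mathbf{r}_i^{B} \right\|^2 }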

  16. Analyzing Creative Products: Refinement and Test of a Judging Instrument.

    ERIC Educational Resources Information Center

    Besemer, Susan; O'Quin, Karen

    1986-01-01

    The Creative Product Analysis Matrix was evaluated and refined by asking 133 undergraduate students to apply the questionnaire items in three areas (novelty, resolution, elaboration/synthesis) to two T-shirts, only one of predetermined creative design. Results indicated the instrument reliably assessed overall perceptions of the product. (DB)

  17. Improved cryoEM-Guided Iterative Molecular Dynamics–Rosetta Protein Structure Refinement Protocol for High Precision Protein Structure Prediction

    PubMed Central

    2016-01-01

    Many excellent methods exist that incorporate cryo-electron microscopy (cryoEM) data to constrain computational protein structure prediction and refinement. Previously, it was shown that iteration of two such orthogonal sampling and scoring methods – Rosetta and molecular dynamics (MD) simulations – facilitated exploration of conformational space in principle. Here, we go beyond a proof-of-concept study and address significant remaining limitations of the iterative MD–Rosetta protein structure refinement protocol. Specifically, all parts of the iterative refinement protocol are now guided by medium-resolution cryoEM density maps, and previous knowledge about the native structure of the protein is no longer necessary. Models are identified solely based on score or simulation time. All four benchmark proteins showed substantial improvement through three rounds of the iterative refinement protocol. The best-scoring final models of two proteins had sub-Ångstrom RMSD to the native structure over residues in secondary structure elements. Molecular dynamics was most efficient in refining secondary structure elements and was thus highly complementary to the Rosetta refinement which is most powerful in refining side chains and loop regions. PMID:25883538

  18. Trends in P Value, Confidence Interval, and Power Analysis Reporting in Health Professions Education Research Reports: A Systematic Appraisal.

    PubMed

    Abbott, Eduardo F; Serrano, Valentina P; Rethlefsen, Melissa L; Pandian, T K; Naik, Nimesh D; West, Colin P; Pankratz, V Shane; Cook, David A

    2018-02-01

    To characterize reporting of P values, confidence intervals (CIs), and statistical power in health professions education research (HPER) through manual and computerized analysis of published research reports. The authors searched PubMed, Embase, and CINAHL in May 2016, for comparative research studies. For manual analysis of abstracts and main texts, they randomly sampled 250 HPER reports published in 1985, 1995, 2005, and 2015, and 100 biomedical research reports published in 1985 and 2015. Automated computerized analysis of abstracts included all HPER reports published 1970-2015. In the 2015 HPER sample, P values were reported in 69/100 abstracts and 94 main texts. CIs were reported in 6 abstracts and 22 main texts. Most P values (≥77%) were ≤.05. Across all years, 60/164 two-group HPER studies had ≥80% power to detect a between-group difference of 0.5 standard deviations. From 1985 to 2015, the proportion of HPER abstracts reporting a CI did not change significantly (odds ratio [OR] 2.87; 95% CI 1.04, 7.88) whereas that of main texts reporting a CI increased (OR 1.96; 95% CI 1.39, 2.78). Comparison with biomedical studies revealed similar reporting of P values, but more frequent use of CIs in biomedicine. Automated analysis of 56,440 HPER abstracts found 14,867 (26.3%) reporting a P value, 3,024 (5.4%) reporting a CI, and increased reporting of P values and CIs from 1970 to 2015. P values are ubiquitous in HPER, CIs are rarely reported, and most studies are underpowered. Most reported P values would be considered statistically significant.
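
    The odds ratios and 95% confidence intervals reported for trends of this kind are conventionally computed on the log-odds scale; the sketch below uses made-up 2x2 counts, not the study's data.

      import math

      def odds_ratio_ci(a, b, c, d, z=1.96):
          # 2x2 table: a, b = outcome present/absent in group 1; c, d = the same in group 2.
          or_ = (a * d) / (b * c)
          se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of ln(OR)
          lo = math.exp(math.log(or_) - z * se_log)
          hi = math.exp(math.log(or_) + z * se_log)
          return or_, lo, hi

      # Hypothetical counts: abstracts with/without a CI in two publication years
      print(odds_ratio_ci(a=12, b=88, c=5, d=95))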

  19. Compositional Abstraction and Refinement for Aspects (CARA)

    DTIC Science & Technology

    2004-03-01

    The SAL Language Manual by Leonardo de Moura, Sam Owre, and N. Shankar, available as [9]. The heart of the SAL system is its language, also...called SAL. The SAL language provides an attractive language for writing specifications, and it is also suitable as a target for translating...key part of the SAL framework is a language for describing transition systems. This language serves as a specification language and as the target for

  20. Development of the Hospital Ship Replacement (HSR) Concept - Maximizing Capability & Affordability

    DTIC Science & Technology

    2014-08-01

    be restricted by weight when it comes to passenger capacity. For verification, these patient capacity estimates were compared to seating ... The Center for Innovation in Ship Design (CISD) requested a design effort to refine and expand upon a previous development of a concept that...could serve as a replacement for the existing hospital ships, USNS Mercy (T-AHS 19) and USNS Comfort (T-AHS 20). These ships are over 35 years old and

  1. Summary of flat-plate solar array project documentation. Abstracts of published documents, 1975 to June 1982

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Technologies that will enable the private sector to manufacture and widely use photovoltaic systems for the generation of electricity in residential, commercial, industrial, and government applications at a cost per watt that is competitive with other means is investigated. Silicon refinement processes, advanced silicon sheet growth techniques, solar cell development, encapsulation, automated fabrication process technology, advanced module/array design, and module/array test and evaluation techniques are developed.

  2. US Army Attack Aviation in a Decisive Action Environment: History, Doctrine, and a Need for Doctrinal Refinement

    DTIC Science & Technology

    2015-05-23

    flight. The design used the engine, transmission, and rotor system of the UH-1 design. In doing so, Bell Helicopters publicly declared that the Cobra... The attack helicopter airframe and role evolved slowly, over time, to...attack helicopter doctrine, heavily influenced by the Global War on Terror and the 11th Attack Helicopter Regiment's disastrous deep attack during

  3. Energy conservation: Industry. Citations from the NTIS data base

    NASA Astrophysics Data System (ADS)

    Hundemann, A. S.

    1980-07-01

    The 335 citations, 37 of which are new entries, discuss potential methods of conserving energy. Many abstracts deal with reports that also cover processes used, amount of energy consumed, and environmental considerations of energy conserving options. Industries covered include food, paper, chemical, cement, metals, petroleum refining, contract construction, synthetic rubber, plastics, drug manufacturing, and stone, clay, and glass. Energy conservation through the use of waste heat is covered in a related Published Search entitled Waste Heat Utilization.

  4. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  5. Theoretical studies of the decomposition mechanisms of 1,2,4-butanetriol trinitrate.

    PubMed

    Pei, Liguan; Dong, Kehai; Tang, Yanhui; Zhang, Bo; Yu, Chang; Li, Wenzuo

    2017-12-06

    Density functional theory (DFT) and canonical variational transition-state theory combined with a small-curvature tunneling correction (CVT/SCT) were used to explore the decomposition mechanisms of 1,2,4-butanetriol trinitrate (BTTN) in detail. The results showed that the γ-H abstraction reaction is the initial pathway for autocatalytic BTTN decomposition. The three possible hydrogen atom abstraction reactions are all exothermic. The rate constants for autocatalytic BTTN decomposition are 3 to 10⁴⁰ times greater than the rate constants for the two unimolecular decomposition reactions (O-NO₂ cleavage and HONO elimination). The process of BTTN decomposition can be divided into two stages according to whether the NO₂ concentration is above a threshold value. HONO elimination is the main reaction channel during the first stage because autocatalytic decomposition requires NO₂ and the concentration of NO₂ is initially low. As the reaction proceeds, the concentration of NO₂ gradually increases; when it exceeds the threshold value, the second stage begins, with autocatalytic decomposition becoming the main reaction channel.
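
    Canonical variational TST moves the dividing surface to minimize recrossing, but the rate expression underneath is the familiar transition-state form with the SCT tunneling factor κ(T) as a multiplicative correction; this is a standard relation quoted for context, not a rederivation of the paper's values:

      k(T) = \kappa(T) \, \frac{k_B T}{h} \, \exp\!\left( - \frac{\Delta G^{\ddagger}(T)}{R T} \right)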

  6. Preservation of three-dimensional anatomy in phosphatized fossil arthropods enriches evolutionary inference.

    PubMed

    Schwermann, Achim H; Dos Santos Rolo, Tomy; Caterino, Michael S; Bechly, Günter; Schmied, Heiko; Baumbach, Tilo; van de Kamp, Thomas

    2016-02-05

    External and internal morphological characters of extant and fossil organisms are crucial to establishing their systematic position, ecological role and evolutionary trends. The lack of internal characters and soft-tissue preservation in many arthropod fossils, however, impedes comprehensive phylogenetic analyses and species descriptions according to taxonomic standards for Recent organisms. We found well-preserved three-dimensional anatomy in mineralized arthropods from Paleogene fissure fillings and demonstrate the value of these fossils by utilizing digitally reconstructed anatomical structure of a hister beetle. The new anatomical data facilitate a refinement of the species diagnosis and allowed us to reject a previous hypothesis of close phylogenetic relationship to an extant congeneric species. Our findings suggest that mineralized fossils, even those of macroscopically poor preservation, constitute a rich but yet largely unexploited source of anatomical data for fossil arthropods.

  7. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art

    PubMed Central

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole), 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time. PMID:29118692

  8. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art.

    PubMed

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole), 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time.
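
    Of the four measures, the first-order entropy of edge orientations is the easiest to make concrete: histogram the gradient orientations of a grayscale image and take the Shannon entropy of the normalized histogram. The sketch below is a rough approximation; the gradient operator, bin count and magnitude threshold are choices of this example, not necessarily those of the paper.

      import numpy as np

      def edge_orientation_entropy(gray, n_bins=48, min_mag=1e-3):
          # First-order Shannon entropy (in bits) of gradient orientations of a 2D grayscale image.
          gy, gx = np.gradient(gray.astype(float))
          mag = np.hypot(gx, gy)
          ang = np.arctan2(gy, gx)[mag > min_mag]        # keep orientations at actual edges
          hist, _ = np.histogram(ang, bins=n_bins, range=(-np.pi, np.pi))
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))                 # maximum is log2(n_bins)

      # A plain horizontal ramp has a single edge orientation, so the entropy is near zero
      ramp = np.tile(np.arange(64, dtype=float), (64, 1))
      print(edge_orientation_entropy(ramp))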

  9. Characteristics of ADC12/nano Al2O3 composites with Addition of Ti Produced By Stir Casting Method

    NASA Astrophysics Data System (ADS)

    Zulfia, A.; Krisiphala; Ferdian, D.; Utomo, B. W.; Dhaneswara, D.

    2018-03-01

    The mechanical properties and microstructure of ADC12/nano Al2O3 matrix composites have been studied in this work. The composites were produced by the stir casting method, with Mg and Ti added to the ADC12 matrix. The addition of Ti was varied from 0.02 to 0.08 wt-% as a grain refiner to improve mechanical properties such as tensile strength, hardness and wear resistance, while Mg was added to promote wetting between ADC12 and the nano Al2O3. The optimum tensile strength of 132.5 MPa was found at 0.04 wt-% Ti; adding more Ti caused a poisoning mechanism that hindered the grain refining process and reduced the tensile strength. The hardness and wear resistance of the composites also increased because of the refinement process and because the added magnesium forms hard Mg2Si primary phases.

  10. Severity of illness and ambulatory care-sensitive conditions.

    PubMed

    Yuen, Elaine J

    2004-09-01

    This study describes how severity of illness may refine the definition of ambulatory care-sensitive conditions, or ACSCs. Hospital discharge abstract data from Philadelphia were combined with census data to develop population-based adjusted rates of hospitalization for diabetes and asthma, two ACSCs. By stratifying ACSC hospitalization by severity of illness, variations were observed by age, race, and gender. Minority groups may be at higher risk for less access to outpatient primary care and were observed to have higher rates of more severely ill, Stage 3 hospitalization. Geographic map displays indicated wide ranges of age-sex-adjusted rates for high-severity hospitalization in the five-county Philadelphia region. This refined ACSC measure may help to identify specific groups and clinical conditions within a population to assist health care planners estimate health care resources such as facilities, manpower, and programs, as well as to evaluate their outcomes.

  11. Comparison of Antioxidant Properties of Refined and Whole Wheat Flour and Bread

    PubMed Central

    Yu, Lilei; Nanguet, Anne-Laure; Beta, Trust

    2013-01-01

    Antioxidant properties of refined and whole wheat flour and their resultant bread were investigated to document the effects of baking. Total phenolic content (TPC), 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activity and oxygen radical absorbance capacity (ORAC) were employed to determine the content of ethanol extractable phenolic compounds. HPLC was used to detect the presence of phenolic acids prior to their confirmation using LC-MS/MS. Whole wheat flour showed significantly higher antioxidant activity than refined flour (p < 0.05). There was a significant effect of the bread-making process with the TPC of whole wheat bread (1.50–1.65 mg/g) and white bread (0.79–1.03 mg/g) showing a respective reduction of 28% and 33% of the levels found in whole wheat and refined flour. Similarly, baking decreased DPPH radical scavenging capacity by 32% and 30%. ORAC values, however, indicated that baking increased the antioxidant activities of whole wheat and refined flour by 1.8 and 2.9 times, respectively. HPLC analysis showed an increase of 18% to 35% in ferulic acid after baking to obtain whole and refined wheat bread containing 330.1 and 25.3 µg/g (average), respectively. Whole wheat flour and bread were superior to refined flour and bread in in vitro antioxidant properties. PMID:26784470
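
    The DPPH figures in assays of this kind are conventionally expressed as the relative drop in absorbance of the radical solution; the relation below is standard assay arithmetic rather than a formula given in the abstract:

      \text{DPPH scavenging (\%)} = \frac{A_{\text{control}} - A_{\text{sample}}}{A_{\text{control}}} \times 100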

  12. Grain Refinement Efficiency in Commercial-Purity Aluminum Influenced by the Addition of Al-4Ti Master Alloys with Varying TiAl3 Particles

    PubMed Central

    Zhao, Jianhua; He, Jiansheng; Tang, Qi; Wang, Tao; Chen, Jing

    2016-01-01

    A series of Al-4Ti master alloys with various TiAl3 particles were prepared via pouring the pure aluminum added with K2TiF6 or sponge titanium into three different molds made of graphite, copper, and sand. The microstructure and morphology of TiAl3 particles were characterized and analyzed by scanning electron microscope (SEM) with energy dispersive spectroscopy (EDS). The microstructure of TiAl3 particles in Al-4Ti master alloys and their grain refinement efficiency in commercial-purity aluminum were investigated in this study. Results show that there were three different morphologies of TiAl3 particles in Al-4Ti master alloys: petal-like structures, blocky structures, and flaky structures. The Al-4Ti master alloy with blocky TiAl3 particles had better and more stable grain refinement efficiency than the master alloys with petal-like and flaky TiAl3 particles. The average grain size of the refined commercial-purity aluminum always hereditarily followed the size of the original TiAl3 particles. In addition, the grain refinement efficiency of Al-4Ti master alloys with the same morphology, size, and distribution of TiAl3 particles prepared through different processes was almost identical. PMID:28773987

  13. Grain Refinement Efficiency in Commercial-Purity Aluminum Influenced by the Addition of Al-4Ti Master Alloys with Varying TiAl₃ Particles.

    PubMed

    Zhao, Jianhua; He, Jiansheng; Tang, Qi; Wang, Tao; Chen, Jing

    2016-10-26

    A series of Al-4Ti master alloys with various TiAl₃ particles were prepared via pouring the pure aluminum added with K₂TiF₆ or sponge titanium into three different molds made of graphite, copper, and sand. The microstructure and morphology of TiAl₃ particles were characterized and analyzed by scanning electron microscope (SEM) with energy dispersive spectroscopy (EDS). The microstructure of TiAl₃ particles in Al-4Ti master alloys and their grain refinement efficiency in commercial-purity aluminum were investigated in this study. Results show that there were three different morphologies of TiAl₃ particles in Al-4Ti master alloys: petal-like structures, blocky structures, and flaky structures. The Al-4Ti master alloy with blocky TiAl₃ particles had better and more stable grain refinement efficiency than the master alloys with petal-like and flaky TiAl₃ particles. The average grain size of the refined commercial-purity aluminum always hereditarily followed the size of the original TiAl₃ particles. In addition, the grain refinement efficiency of Al-4Ti master alloys with the same morphology, size, and distribution of TiAl₃ particles prepared through different processes was almost identical.

  14. Segregation Coefficients of Impurities in Selenium by Zone Refining

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Sha, Yi-Gao

    1998-01-01

    The purification of Se by the zone refining process was studied. The impurity solute levels along the length of a zone-refined Se sample were measured by spark source mass spectrographic analysis. By comparing the experimental concentration levels with theoretical curves, the segregation coefficient, defined as the ratio of the equilibrium concentration of a given solute in the solid to that in the liquid, k = x_s/x_l, was found to be close to unity for most of the impurities in Se, i.e., between 0.85 and 1.15, with the k value for Si, Zn, Fe, Na and Al greater than 1 and that for S, Cl, Ca, P, As, Mn and Cr less than 1. This implies that a large number of passes is needed for the successful implementation of zone refining in the purification of Se.
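
    For illustration, the standard single-pass zone-refining relation underlying such segregation analyses (Pfann's equation, C_s(x) = C0 [1 - (1 - k) exp(-kx/L)] for an initially uniform bar) can be sketched in a few lines of Python; this is a generic textbook sketch, not the authors' spark-source analysis, and the k values and positions below are merely illustrative:

      import math

      def single_pass_profile(c0, k, zone_length, positions):
          # Pfann's single-pass zone-refining equation:
          #   C_s(x) = C0 * (1 - (1 - k) * exp(-k * x / L)),
          # valid ahead of the last zone length of the bar.
          return [c0 * (1.0 - (1.0 - k) * math.exp(-k * x / zone_length))
                  for x in positions]

      # A solute with k = 0.9 (close to unity, as reported for Se) is barely
      # redistributed in one pass, while k = 0.3 is swept strongly toward the end.
      xs = [0.5 * i for i in range(11)]      # positions, in units of the zone length
      for k in (0.9, 0.3):
          print(k, [round(c, 3) for c in single_pass_profile(1.0, k, 1.0, xs)])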

  15. Comparison of corrosion behavior between coarse grained and nano/ultrafine grained alloy 690

    NASA Astrophysics Data System (ADS)

    Jinlong, Lv; Tongxiang, Liang; Chen, Wang; Ting, Guo

    2016-01-01

    The effect of grain refinement on corrosion resistance of alloy 690 was investigated. The electron work function value of coarse grained alloy 690 was higher than that of nano/ultrafine grained one. The grain refinement reduced the electron work function of alloy 690. The passive films formed on coarse grained and nano/ultrafine grained alloy 690 in borate buffer solution were studied by potentiodynamic curves and electrochemical impedance spectroscopy and X-ray photoelectron spectroscopy. The results showed that the grain refinement improved corrosion resistance of alloy 690. This was attributed to the fact that grain refinement promoted the enrichment of Cr2O3 and inhibited Cr(OH)3 in the passive film. More Cr2O3 in passive film could significantly improve the corrosion resistance of the nano/ultrafine grained alloy 690.

  16. Comparison of protocols measuring diffusion and partition coefficients in the stratum corneum

    PubMed Central

    Rothe, H.; Obringer, C.; Manwaring, J.; Avci, C.; Wargniez, W.; Eilstein, J.; Hewitt, N.; Cubberley, R.; Duplan, H.; Lange, D.; Jacques‐Jamin, C.; Klaric, M.; Schepky, A.

    2017-01-01

    Partition (K) and diffusion (D) coefficients are important to measure for the modelling of skin penetration of chemicals through the stratum corneum (SC). We compared the feasibility of three protocols for the testing of 50 chemicals in our main studies, using three cosmetics-relevant model chemicals with a wide range of logP values. Protocol 1: SC concentration-depth profile using tape-stripping (measures KSC/v and DSC/HSC², where HSC is the SC thickness); Protocol 2A: incubation of isolated SC with chemical (direct measurement of KSC/v only) and Protocol 2B: diffusion through isolated SC mounted on a Franz cell (measures KSC/v and DSC/HSC², and is based on Fick's laws). KSC/v values for caffeine and resorcinol using Protocol 1 and 2B were within 30% of each other, values using Protocol 2A were ~two-fold higher, and all values were within 10-fold of each other. Only indirect determination of KSC/v by Protocol 2B was different from the direct measurement of KSC/v by Protocol 2A and Protocol 1 for 7-EC. The variability of KSC/v for all three chemicals using Protocol 2B was higher compared to Protocol 1 and 2A. DSC/HSC² values for the three chemicals were of the same order of magnitude using all three protocols. Additionally, using Protocol 1, there was very little difference between parameters measured in pig and human SC. In conclusion, KSC/v and DSC values were comparable using different methods. Pig skin might be a good surrogate for human skin for the three chemicals tested. Copyright © 2017 The Authors Journal of Applied Toxicology published by John Wiley & Sons Ltd. PMID:28139006
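
    As a minimal sketch of how such parameters follow from Fick's laws for a homogeneous membrane (the classical lag-time and steady-state-flux relations; this is a generic textbook approach, not the exact fitting procedure of Protocols 1, 2A or 2B, and all numbers are hypothetical):

      def fickian_estimates(t_lag_h, j_ss, c_vehicle, h_sc_cm):
          # Textbook Fick's-law relations for a homogeneous membrane:
          #   lag time:     t_lag = H^2 / (6 D)       ->  D = H^2 / (6 t_lag)
          #   steady flux:  J_ss  = K * D * C_v / H   ->  K = J_ss * H / (D * C_v)
          d_over_h2 = 1.0 / (6.0 * t_lag_h)            # D / H^2, in 1/h
          d = d_over_h2 * h_sc_cm ** 2                 # D, in cm^2/h
          k = j_ss * h_sc_cm / (d * c_vehicle)         # dimensionless partition coefficient
          return d_over_h2, d, k

      # Hypothetical values: 2 h lag time, flux 0.5 ug/cm^2/h, vehicle
      # concentration 100 ug/cm^3, assumed SC thickness of 15 um.
      print(fickian_estimates(t_lag_h=2.0, j_ss=0.5, c_vehicle=100.0, h_sc_cm=15e-4))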

  17. Validity of Three-Dimensional Photonic Scanning Technique for Estimating Percent Body Fat.

    PubMed

    Shitara, K; Kanehisa, H; Fukunaga, T; Yanai, T; Kawakami, Y

    2013-01-01

    Three-dimensional photonic scanning (3DPS) was recently developed to measure dimensions of a human body surface. The purpose of this study was to explore the validity of body volume measured by 3DPS for estimating the percent body fat (%fat). Design, setting, participants, and measurement: The body volumes were determined by 3DPS in 52 women. The body volume was corrected for residual lung volume. The %fat was estimated from body density and compared with the corresponding reference value determined by the dual-energy x-ray absorptiometry (DXA). No significant difference was found for the mean values of %fat obtained by 3DPS (22.2 ± 7.6%) and DXA (23.5 ± 4.9%). The root mean square error of %fat between 3DPS and reference technique was 6.0%. For each body segment, there was a significant positive correlation between 3DPS- and DXA-values, although the corresponding value for the head was slightly larger in 3DPS than in DXA. Residual lung volume was negatively correlated with the estimated error in %fat. The body volume determined with 3DPS is potentially useful for estimating %fat. A possible strategy for enhancing the measurement accuracy of %fat might be to refine the protocol for preparing the subject's hair prior to scanning and to improve the accuracy in the measurement of residual lung volume.
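
    A minimal sketch of the densitometric step described here (scanned body volume corrected for residual lung volume, then converted to %fat); the two-compartment Siri equation is assumed, since the record does not name the conversion used, and the subject values are hypothetical:

      def percent_fat_from_volume(body_mass_kg, scanned_volume_l, residual_lung_volume_l):
          # Correct the 3DPS body volume for residual lung volume, form body
          # density (kg/L), then apply Siri's two-compartment equation:
          #   %fat = 495 / Db - 450
          corrected_volume = scanned_volume_l - residual_lung_volume_l
          density = body_mass_kg / corrected_volume
          return 495.0 / density - 450.0

      # Hypothetical subject: 58 kg, 57.0 L scanned volume, 1.2 L residual volume.
      print(round(percent_fat_from_volume(58.0, 57.0, 1.2), 1))   # ~26 %fat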

  18. Value-Based Delivery of Education: MOOCs as Messengers

    ERIC Educational Resources Information Center

    Gilfoil, David M.; Focht, Jeffrey W.

    2015-01-01

    Value-based delivery of healthcare has been discussed in the literature for almost a decade. The concept focuses on the patient and defines value as the improvement of patient outcomes divided by healthcare costs. Further refinements, called the Triple Aim model, focus on improving patient outcomes, reducing treatment costs, and improving patient…

  19. Cost Perception and the Expectancy-Value Model of Achievement Motivation.

    ERIC Educational Resources Information Center

    Anderson, Patricia N.

    The expectancy-value model of achievement motivation, first described by J. Atkinson (1957) and refined by J. Eccles and her colleagues (1983, 1992, 1994) predicts achievement motivation based on expectancy for success and perceived task value. Cost has been explored very little. To explore the possibility that cost is different from expectancy…

  20. Value of Information Evaluation using Field Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainor-Guitton, W.

    2015-06-15

    Value of information (VOI) provides the ability to identify and prioritize useful information gathering for a geothermal prospect, either hydrothermal or for enhanced geothermal systems. Useful information provides a value greater than the cost of the information; wasteful information costs more than the expected value of the information. In this project we applied and refined VOI methodologies on selected geothermal prospects.
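
    The basic decision-analytic calculation behind VOI can be sketched as follows; this is a generic drill / don't-drill example using perfect information as an upper bound, not the project's geothermal model, and all numbers are hypothetical:

      def expected_value_of_perfect_information(p_success, payoff_if_success, well_cost):
          # Without information: take the better of 'never drill' (0) and
          # 'always drill' (p * payoff - cost).
          ev_prior = max(0.0, p_success * payoff_if_success - well_cost)
          # With perfect information: drill only when success is known in advance.
          ev_with_info = p_success * (payoff_if_success - well_cost)
          return ev_with_info - ev_prior

      # Hypothetical prospect: 30% chance of a 50 M$ resource, 20 M$ well cost.
      evpi = expected_value_of_perfect_information(0.3, 50.0, 20.0)
      print(evpi)   # any survey costing more than this upper bound is wasteful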

  1. Radiometric Calibration of the Earth Observing System's Imaging Sensors

    NASA Technical Reports Server (NTRS)

    Slater, Philip N. (Principal Investigator)

    1997-01-01

    The work on the grant was mainly directed towards developing new, accurate, redundant methods for the in-flight, absolute radiometric calibration of satellite multispectral imaging systems and refining the accuracy of methods already in use. Initially the work was in preparation for the calibration of MODIS and HIRIS (before the development of that sensor was canceled), with the realization it would be applicable to most imaging multi- or hyper-spectral sensors provided their spatial or spectral resolutions were not too coarse. The work on the grant involved three different ground-based, in-flight calibration methods: reflectance-based, radiance-based, and the diffuse-to-global irradiance ratio used with the reflectance-based method. This continuing research had the dual advantage of: (1) developing several independent methods to create the redundancy that is essential for the identification and hopefully the elimination of systematic errors; and (2) refining the measurement techniques and algorithms that can be used not only for improving calibration accuracy but also for the reverse process of retrieving ground reflectances from calibrated remote-sensing data. The grant also provided the support necessary for us to embark on other projects such as the ratioing radiometer approach to on-board calibration (this has been further developed by SBRS as the 'solar diffuser stability monitor' and is incorporated into the most important on-board calibration system for MODIS); another example of the work, which was a spin-off from the grant funding, was a study of solar diffuser materials. Journal citations, titles and abstracts of publications authored by faculty, staff, and students are also attached.

  2. Differentiation of whole grain and refined wheat (T. aestivum) flour using a fuzzy mass spectrometric fingerprinting and chemometric approaches

    USDA-ARS?s Scientific Manuscript database

    A fuzzy mass spectrometric (MS) fingerprinting method combined with chemometric analysis was established to provide rapid discrimination between whole grain and refined wheat flour. Twenty one samples, including thirteen samples from three cultivars and eight from local grocery store, were studied....

  3. 40 CFR 80.76 - Registration of refiners, importers or oxygenate blenders.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Reformulated Gasoline § 80.76... required for any refiner and importer that produces or imports any reformulated gasoline or RBOB, and any... November 1, 1994, or not later than three months in advance of the first date that such person will produce...

  4. 40 CFR 80.76 - Registration of refiners, importers or oxygenate blenders.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Reformulated Gasoline § 80.76... required for any refiner and importer that produces or imports any reformulated gasoline or RBOB, and any... November 1, 1994, or not later than three months in advance of the first date that such person will produce...

  5. Fossil energy biotechnology: A research needs assessment. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-11-01

    The Office of Program Analysis of the US Department of Energy commissioned this study to evaluate and prioritize research needs in fossil energy biotechnology. The objectives were to identify research initiatives in biotechnology that offer timely and strategic options for the more efficient and effective uses of the Nation's fossil resource base, particularly the early identification of new and novel applications of biotechnology for the use or conversion of domestic fossil fuels. Fossil energy biotechnology consists of a number of diverse and distinct technologies, all related by the common denominator -- biocatalysis. The expert panel organized 14 technical subjects into three interrelated biotechnology programs: (1) upgrading the fuel value of fossil fuels; (2) bioconversion of fossil feedstocks and refined products to added value chemicals; and, (3) the development of environmental management strategies to minimize and mitigate the release of toxic and hazardous petrochemical wastes.

  6. How can systems engineering inform the methods of programme evaluation in health professions education?

    PubMed

    Rojas, David; Grierson, Lawrence; Mylopoulos, Maria; Trbovich, Patricia; Bagli, Darius; Brydges, Ryan

    2018-04-01

    We evaluate programmes in health professions education (HPE) to determine their effectiveness and value. Programme evaluation has evolved from use of reductionist frameworks to those addressing the complex interactions between programme factors. Researchers in HPE have recently suggested a 'holistic programme evaluation' aiming to better describe and understand the implications of 'emergent processes and outcomes'. We propose a programme evaluation framework informed by principles and tools from systems engineering. Systems engineers conceptualise complexity and emergent elements in unique ways that may complement and extend contemporary programme evaluations in HPE. We demonstrate how the abstract decomposition space (ADS), an engineering knowledge elicitation tool, provides the foundation for a systems engineering informed programme evaluation designed to capture both planned and emergent programme elements. We translate the ADS tool to use education-oriented language, and describe how evaluators can use it to create a programme-specific ADS through iterative refinement. We provide a conceptualisation of emergent elements and an equation that evaluators can use to identify the emergent elements in their programme. Using our framework, evaluators can analyse programmes not as isolated units with planned processes and planned outcomes, but as unfolding, complex interactive systems that will exhibit emergent processes and emergent outcomes. Subsequent analysis of these emergent elements will inform the evaluator as they seek to optimise and improve the programme. Our proposed systems engineering informed programme evaluation framework provides principles and tools for analysing the implications of planned and emergent elements, as well as their potential interactions. We acknowledge that our framework is preliminary and will require application and constant refinement. We suggest that our framework will also advance our understanding of the construct of 'emergence' in HPE research. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  7. Some observations on mesh refinement schemes applied to shock wave phenomena

    NASA Technical Reports Server (NTRS)

    Quirk, James J.

    1995-01-01

    This workshop's double-wedge test problem is taken from one of a sequence of experiments which were performed in order to classify the various canonical interactions between a planar shock wave and a double wedge. Therefore to build up a reasonably broad picture of the performance of our mesh refinement algorithm we have simulated three of these experiments and not just the workshop case. Here, using the results from these simulations together with their experimental counterparts, we make some general observations concerning the development of mesh refinement schemes for shock wave phenomena.

  8. Correcting pervasive errors in RNA crystallography through enumerative structure prediction.

    PubMed

    Chou, Fang-Chieh; Sripakdeevong, Parin; Dibrov, Sergey M; Hermann, Thomas; Das, Rhiju

    2013-01-01

    Three-dimensional RNA models fitted into crystallographic density maps exhibit pervasive conformational ambiguities, geometric errors and steric clashes. To address these problems, we present enumerative real-space refinement assisted by electron density under Rosetta (ERRASER), coupled to Python-based hierarchical environment for integrated 'xtallography' (PHENIX) diffraction-based refinement. On 24 data sets, ERRASER automatically corrects the majority of MolProbity-assessed errors, improves the average R(free) factor, resolves functionally important discrepancies in noncanonical structure and refines low-resolution models to better match higher-resolution models.

  9. MODFLOW-LGR-Modifications to the streamflow-routing package (SFR2) to route streamflow through locally refined grids

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    This report documents modifications to the Streamflow-Routing Package (SFR2) to route streamflow through grids constructed using the multiple-refined-areas capability of shared node Local Grid Refinement (LGR) of MODFLOW-2005. MODFLOW-2005 is the U.S. Geological Survey modular, three-dimensional, finite-difference groundwater-flow model. LGR provides the capability to simulate groundwater flow by using one or more block-shaped, higher resolution local grids (child model) within a coarser grid (parent model). LGR accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the shared interfacing boundaries. Compatibility with SFR2 allows for streamflow routing across grids. LGR can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined groundwater systems.

  10. Deriving Lava Eruption Temperatures on Io Using Lava Tube Skylights

    NASA Astrophysics Data System (ADS)

    Davies, A. G.; Keszthelyi, L. P.; McEwen, A. S.

    2015-12-01

    The eruption temperature of Io's silicate lavas constrains Io's interior state and composition [1] but reliably measuring this temperature remotely is a challenge that has not yet been met. Previously, we established that eruption processes that expose large areas at the highest temperatures, such as roiling lava lakes or lava fountains, are suitable targets for this task [2]. In this study we investigate the thermal emission from lava tube skylights for basaltic and ultramafic composition lavas. Tube-fed lava flows are known on Io so skylights could be common. Unlike the surfaces of lava flows, lava lakes, and lava fountains which all cool very rapidly, skylights have steady thermal emission on a scale of days to months. The thermal emission from such a target, measured at multiple visible and NIR wavelengths, can provide a highly accurate diagnostic of eruption temperature. However, the small size of skylights means that close flybys of Io are necessary, requiring a dedicated Io mission [3]. We have modelled the thermal emission spectrum for different skylight sizes, lava flow stream velocities, end-member lava compositions, and skylight radiation shape factors, determining the flow surface cooling rates. We calculate the resulting thermal emission spectrum as a function of viewing angle. From the resulting 0.7:0.9 μm ratios, we see a clear distinction between basaltic and ultramafic compositions for skylights smaller than 20 m across, even if sub-pixel. If the skylight is not resolved, observations distributed over weeks that show a stationary and steady hot spot allow the presence of a skylight to be confidently inferred. This inference allows subsequent refining of observation design to improve viewing geometry of the target. Our analysis will be further refined as accurate high-temperature short-wavelength emissivity values become available [4]. This work was performed at the Jet Propulsion Laboratory-California Institute of Technology, under contract to NASA. We thank the NASA OPR Program for support. References: [1] Keszthelyi et al. (2007) Icarus, 192, 491-502. [2] Davies et al. (2012) GRL, 38, L21308. [3] McEwen et al. (2015) The Io Volcano Observer (IVO), LPSC-46, abstract 1627. [4] Ramsey and Harris (2015) IAVCEI-2015, Prague, Cz. Rep., abstract IUGG-3519.
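
    The sensitivity of the 0.7:0.9 μm ratio to eruption temperature can be illustrated with the Planck function alone (ignoring the emissivity and skylight geometry treated in the study); the two temperatures below are assumed, illustrative end-member values rather than results from the abstract:

      import math

      H_PLANCK = 6.626e-34   # J s
      C_LIGHT = 2.998e8      # m/s
      K_BOLTZ = 1.381e-23    # J/K

      def planck_radiance(wavelength_m, temperature_k):
          # Planck spectral radiance B(lambda, T), W per m^2 per sr per m of wavelength.
          a = 2.0 * H_PLANCK * C_LIGHT ** 2 / wavelength_m ** 5
          b = math.exp(H_PLANCK * C_LIGHT / (wavelength_m * K_BOLTZ * temperature_k)) - 1.0
          return a / b

      def band_ratio(temperature_k, lam1=0.7e-6, lam2=0.9e-6):
          return planck_radiance(lam1, temperature_k) / planck_radiance(lam2, temperature_k)

      # Illustrative end-member eruption temperatures (assumed values):
      for t_k in (1450.0, 1900.0):
          print(t_k, round(band_ratio(t_k), 3))   # the ratio roughly doubles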

  11. Orthogonal polynomials for refinable linear functionals

    NASA Astrophysics Data System (ADS)

    Laurie, Dirk; de Villiers, Johan

    2006-12-01

    A refinable linear functional is one that can be expressed as a convex combination, defined by a finite number of mask coefficients, of certain stretched and shifted replicas of itself. The notion generalizes an integral weighted by a refinable function. The key to calculating a Gaussian quadrature formula for such a functional is to find the three-term recursion coefficients for the polynomials orthogonal with respect to that functional. We show how to obtain the recursion coefficients by using only the mask coefficients, and without the aid of modified moments. Our result implies the existence of the corresponding refinable functional whenever the mask coefficients are nonnegative, even when the same mask does not define a refinable function. The algorithm requires O(n^2) rational operations and, thus, can in principle deliver exact results. Numerical evidence suggests that it is also effective in floating-point arithmetic.
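
    Once the three-term recursion coefficients are in hand, Gaussian nodes and weights follow from the standard Golub-Welsch eigenvalue step, sketched below in floating point (the paper's own algorithm works from the mask coefficients in rational arithmetic; the Legendre coefficients here are only a sanity check):

      import numpy as np

      def gauss_from_recursion(alpha, beta, mu0):
          # Golub-Welsch: for monic orthogonal polynomials with recursion
          #   p_{k+1}(x) = (x - alpha_k) p_k(x) - beta_k p_{k-1}(x),  beta_k > 0,
          # the Gauss nodes are the eigenvalues of the symmetric tridiagonal Jacobi
          # matrix and the weights are mu0 times the squared first components of
          # the normalized eigenvectors.
          n = len(alpha)
          off_diag = np.sqrt(beta[1:n])
          jacobi = np.diag(alpha) + np.diag(off_diag, 1) + np.diag(off_diag, -1)
          nodes, vectors = np.linalg.eigh(jacobi)
          weights = mu0 * vectors[0, :] ** 2
          return nodes, weights

      # Sanity check with the Legendre weight on [-1, 1] (mu0 = 2):
      # alpha_k = 0, beta_k = k^2 / (4 k^2 - 1).
      n = 5
      alpha = np.zeros(n)
      beta = np.array([0.0] + [k * k / (4.0 * k * k - 1.0) for k in range(1, n)])
      nodes, weights = gauss_from_recursion(alpha, beta, 2.0)
      print(np.sum(weights * nodes ** 4))   # integrates x^4 exactly: 2/5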

  12. Value of Collaboration With Standardized Patients and Patient Facilitators in Enhancing Reflection During the Process of Building a Simulation.

    PubMed

    Stanley, Claire; Lindsay, Sally; Parker, Kathryn; Kawamura, Anne; Samad Zubairi, Mohammad

    2018-05-09

    We previously reported that experienced clinicians find that the process of collectively building and participating in simulations provides (1) a unique reflective opportunity; (2) a venue to identify different perspectives through discussion and action in a group; and (3) a safe environment for learning. No studies have assessed the value of collaborating with standardized patients (SPs) and patient facilitators (PFs) in the process. In this work, we describe this collaboration in building a simulation and the key elements that facilitate reflection. Three simulation scenarios surrounding communication were built by teams of clinicians, a PF, and SPs. Six build sessions were audio recorded, transcribed, and thematically analyzed through an iterative process to (1) describe the steps of building a simulation scenario and (2) identify the key elements involved in the collaboration. The five main steps to build a simulation scenario were (1) storytelling and reflection; (2) defining objectives and brainstorming ideas; (3) building a stem and creating a template; (4) refining the scenario with feedback from SPs; and (5) mock run-throughs with follow-up discussion. During these steps, the PF shared personal insights, challenging participants to reflect deeper to better understand and consider the patient's perspective. The SPs provided a unique outside perspective to the group. In addition, the interaction between the SPs and the PF helped refine character roles. A collaborative approach incorporating feedback from PFs and SPs to create a simulation scenario is a valuable method to enhance reflective practice for clinicians.

  13. An adaptive mesh-moving and refinement procedure for one-dimensional conservation laws

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Flaherty, Joseph E.; Arney, David C.

    1993-01-01

    We examine the performance of an adaptive mesh-moving and/or local mesh refinement procedure for the finite difference solution of one-dimensional hyperbolic systems of conservation laws. Adaptive motion of a base mesh is designed to isolate spatially distinct phenomena, and recursive local refinement of the time step and cells of the stationary or moving base mesh is performed in regions where a refinement indicator exceeds a prescribed tolerance. These adaptive procedures are incorporated into a computer code that includes a MacCormack finite difference scheme with Davis' artificial viscosity model and a discretization error estimate based on Richardson's extrapolation. Experiments are conducted on three problems in order to qualify the advantages of adaptive techniques relative to uniform mesh computations and the relative benefits of mesh moving and refinement. Key results indicate that local mesh refinement, with and without mesh moving, can provide reliable solutions at much lower computational cost than possible on uniform meshes; that mesh motion can be used to improve the results of uniform mesh solutions for a modest computational effort; that the cost of managing the tree data structure associated with refinement is small; and that a combination of mesh motion and refinement reliably produces solutions for the least cost per unit accuracy.
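
    A minimal sketch of a Richardson-extrapolation-based refinement indicator of the kind described (a second-order scheme is assumed, so the fine-grid error estimate is (u_h - u_2h)/3); the toy data below are illustrative and not taken from the report:

      import numpy as np

      def richardson_error(u_fine, u_coarse_on_fine, order=2):
          # For a scheme of order p, err_h ~ (u_h - u_2h) / (2**p - 1),
          # where the coarse-grid solution has been interpolated to the fine grid.
          return (u_fine - u_coarse_on_fine) / (2.0 ** order - 1.0)

      def cells_to_refine(u_fine, u_coarse_on_fine, tol, order=2):
          # Flag fine-grid cells whose estimated error exceeds the tolerance.
          return np.nonzero(np.abs(richardson_error(u_fine, u_coarse_on_fine, order)) > tol)[0]

      # Toy example: a steep front near x = 0.5 is the only region flagged.
      x = np.linspace(0.0, 1.0, 101)
      u_fine = np.tanh(50.0 * (x - 0.5))
      u_coarse_on_fine = np.interp(x, x[::2], np.tanh(50.0 * (x[::2] - 0.5)))
      print(cells_to_refine(u_fine, u_coarse_on_fine, tol=1e-3))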

  14. Outcome of abstracts presented at the British Association of Paediatric Surgeons congresses (1999-2008).

    PubMed

    Macdonald, Alexander L; Parsons, Christopher; Davenport, Mark

    2012-02-01

    Abstracts presented at the British Association of Paediatric Surgeons annual congress have the potential to influence practice. However, it is not known what percentage of accepted abstracts actually go on to withstand peer review and be published in the literature. Abstract books were reviewed for the period 1999 to 2008. A MEDLINE search using keywords from title and authors' names was used to identify subsequent publication. Categorical analysis for variation and trend with P < .05 was accepted as significant. Data were expressed as median (interquartile range). During the 10-year period, 862 abstracts were presented orally and were derived from 36 countries, with a median of 18 (17-19) countries represented each year. Of these, 375 (43%) abstracts originated from 25 United Kingdom (UK) institutions with most (45%) from London and specifically the Institute of Child Health/Great Ormond Street Hospital (n = 118, 14%). The annual median number of presentations was 81 (74-97). This fell during the first half of the decade but is now rising with a significant increase in the UK proportion (P = .001). Thirty (27-35) abstracts per year (overall, n = 302) were subsequently published with the proportion (36% [33%-39%]) remaining remarkably consistent over the period. Abstracts were published in a range of 26 journals, but most (69%) were published in the Journal of Pediatric Surgery. The publication rate of the British Association of Paediatric Surgeons congress and hence entry into the "evidence base" as published material is consistent at just over one third of submissions. Whether this represents a waste of scientific endeavor or further refinement of quality is a moot point. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Beyond Captions: Linking Figures with Abstract Sentences in Biomedical Articles

    PubMed Central

    Bockhorst, Joseph P.; Conroy, John M.; Agarwal, Shashank; O’Leary, Dianne P.; Yu, Hong

    2012-01-01

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically only consider caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org. PMID:22815711

  16. Mining the protein data bank with CReF to predict approximate 3-D structures of polypeptides.

    PubMed

    Dorn, Márcio; de Souza, Osmar Norberto

    2010-01-01

    In this paper we describe CReF, a Central Residue Fragment-based method to predict approximate 3-D structures of polypeptides by mining the Protein Data Bank (PDB). The approximate predicted structures are good enough to be used as starting conformations in refinement procedures employing state-of-the-art molecular mechanics methods such as molecular dynamics simulations. CReF is very fast and we illustrate its efficacy in three case studies of polypeptides whose sizes vary from 34 to 70 amino acids. As indicated by the RMSD values, our initial results show that the predicted structures adopt the expected fold, similar to the experimental ones.

  17. A Conceptual Model of Career Development to Enhance Academic Motivation

    ERIC Educational Resources Information Center

    Collins, Nancy Creighton

    2010-01-01

    The purpose of this study was to develop, refine, and validate a conceptual model of career development to enhance the academic motivation of community college students. To achieve this end, a straw model was built from the theoretical and empirical research literature. The model was then refined and validated through three rounds of a Delphi…

  18. Associations of whole and refined grain intakes with adiposity-related cancer risk in the Framingham Offspring Cohort (1991-2013)

    USDA-ARS?s Scientific Manuscript database

    Objective: The objective of this prospective cohort study is to evaluate associations between whole and refined grains and their food sources in relation to risk of adiposity-related cancers combined and three of the most commonly diagnosed site-specific cancers in the US: breast, prostate, and colo...

  19. Medium density fiberboard from mixed southern hardwoods

    Treesearch

    George E. Woodson

    1977-01-01

    Medium-density fiberboards of acceptable quality were made from a mixture of barky chips from 14 southern hardwoods. Boards made from fiber refined at three different plate clearances did not vary significantly in bending, internal bond (IB), or linear expansion, but lack of replications and the fact that the refiner was not loaded to capacity caused these results to...

  20. Eating Behaviour in the General Population: An Analysis of the Factor Structure of the German Version of the Three-Factor-Eating-Questionnaire (TFEQ) and Its Association with the Body Mass Index

    PubMed Central

    Löffler, Antje; Luck, Tobias; Then, Francisca S.; Sikorski, Claudia; Kovacs, Peter; Böttcher, Yvonne; Breitfeld, Jana; Tönjes, Anke; Horstmann, Annette; Löffler, Markus; Engel, Christoph; Thiery, Joachim; Villringer, Arno; Stumvoll, Michael; Riedel-Heller, Steffi G.

    2015-01-01

    The Three-Factor-Eating-Questionnaire (TFEQ) is an established instrument to assess eating behaviour. Analysis of the TFEQ-factor structure was based on selected, convenient and clinical samples so far. Aims of this study were (I) to analyse the factor structure of the German version of the TFEQ and (II)—based on the refined factor structure—to examine the association between eating behaviour and the body mass index (BMI) in a general population sample of 3,144 middle-aged and older participants (40–79 years) of the ongoing population based cohort study of the Leipzig Research Center for Civilization Diseases (LIFE Health Study). The factor structure was examined in a split-half analysis with both explorative and confirmatory factor analysis. Associations between TFEQ-scores and BMI values were tested with multiple regression analyses controlled for age, gender, and education. We found a three factor solution for the TFEQ with an ‘uncontrolled eating’, a ‘cognitive restraint’ and an ‘emotional eating’ domain including 29 of the original 51 TFEQ-items. Scores of the ‘uncontrolled eating domain’ showed the strongest correlation with BMI values (partial r = 0.26). Subjects with scores above the median in both ‘uncontrolled eating’ and ‘emotional eating’ showed the highest BMI values (mean = 29.41 kg/m²), subjects with scores below the median in all three domains showed the lowest BMI values (mean = 25.68 kg/m²; F = 72.074, p<0.001). Our findings suggest that the TFEQ is suitable to identify subjects with specific patterns of eating behaviour that are associated with higher BMI values. Such information may help health care professionals to develop and implement more tailored interventions for overweight and obese individuals. PMID:26230264

  1. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  2. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  3. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  4. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  5. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  6. KoBaMIN: a knowledge-based minimization web server for protein structure refinement.

    PubMed

    Rodrigues, João P G L M; Levitt, Michael; Chopra, Gaurav

    2012-07-01

    The KoBaMIN web server provides an online interface to a simple, consistent and computationally efficient protein structure refinement protocol based on minimization of a knowledge-based potential of mean force. The server can be used to refine either a single protein structure or an ensemble of proteins starting from their unrefined coordinates in PDB format. The refinement method is particularly fast and accurate due to the underlying knowledge-based potential derived from structures deposited in the PDB; as such, the energy function implicitly includes the effects of solvent and the crystal environment. Our server allows for an optional but recommended step that optimizes stereochemistry using the MESHI software. The KoBaMIN server also allows comparison of the refined structures with a provided reference structure to assess the changes brought about by the refinement protocol. The performance of KoBaMIN has been benchmarked widely on a large set of decoys, all models generated at the seventh worldwide experiments on critical assessment of techniques for protein structure prediction (CASP7) and it was also shown to produce top-ranking predictions in the refinement category at both CASP8 and CASP9, yielding consistently good results across a broad range of model quality values. The web server is fully functional and freely available at http://csb.stanford.edu/kobamin.

  7. Summary of flat-plate solar array project documentation: Abstracts of published documents, 1975-1986, revision 1

    NASA Technical Reports Server (NTRS)

    Phillips, M. J.

    1986-01-01

    Abstracts of final reports, or of the latest quarterly or annual reports, of Flat-Plate Solar Array (FSA) Project contractors and of Jet Propulsion Laboratory (JPL) in-house activities are presented. Also presented is a list of proceedings and publications, by author, of work connected with the project. The aim of the program has been to stimulate the development of technology that will enable the private sector to manufacture and widely use photovoltaic systems for the generation of electricity in residential, commercial, industrial, and Government applications at a cost per watt that is competitive with utility-generated power. FSA Project activities have included the sponsoring of research and development efforts in silicon refinement processes, advanced silicon sheet growth techniques, higher efficiency solar cells, solar cell/module fabrication processes, encapsulation, module/array engineering and reliability, and economic analyses.

  8. Polymorphic Contracts

    NASA Astrophysics Data System (ADS)

    Belo, João Filipe; Greenberg, Michael; Igarashi, Atsushi; Pierce, Benjamin C.

    Manifest contracts track precise properties by refining types with predicates - e.g., {x : Int | x > 0} denotes the positive integers. Contracts and polymorphism make a natural combination: programmers can give strong contracts to abstract types, precisely stating pre- and post-conditions while hiding implementation details - for example, an abstract type of stacks might specify that the pop operation has input type {x : α Stack | not (empty x)}. We formalize this combination by defining FH, a polymorphic calculus with manifest contracts, and establishing fundamental properties including type soundness and relational parametricity. Our development relies on a significant technical improvement over earlier presentations of contracts: instead of introducing a denotational model to break a problematic circularity between typing, subtyping, and evaluation, we develop the metatheory of contracts in a completely syntactic fashion, omitting subtyping from the core system and recovering it post facto as a derived property.
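
    The flavour of such refinements can be conveyed with a small dynamic sketch in Python rather than the typed calculus FH of the abstract (the Refinement class and its names are invented for illustration): a refinement pairs a base type with a predicate and is checked at run time.

      from dataclasses import dataclass
      from typing import Any, Callable

      @dataclass
      class Refinement:
          # A manifest-contract-style refinement {x : base | pred(x)}:
          # membership is a base-type check plus a runtime predicate.
          base: type
          pred: Callable[[Any], bool]
          label: str

          def check(self, value):
              if not isinstance(value, self.base) or not self.pred(value):
                  raise ValueError(f"contract violation: {value!r} is not {self.label}")
              return value

      # {x : Int | x > 0} from the abstract, as a runtime contract:
      Pos = Refinement(int, lambda x: x > 0, "{x : Int | x > 0}")

      def pop(stack):
          # Pre-condition in the spirit of {x : a Stack | not (empty x)}.
          non_empty = Refinement(list, lambda s: len(s) > 0, "{x : Stack | not (empty x)}")
          non_empty.check(stack)
          return stack.pop()

      print(Pos.check(3))        # passes
      print(pop([1, 2, 3]))      # passes; pop([]) would raise a contract violation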

  9. Macromolecular refinement by model morphing using non-atomic parameterizations.

    PubMed

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  10. Numerical analysis of impurity separation from waste salt by investigating the change of concentration at the interface during zone refining process

    NASA Astrophysics Data System (ADS)

    Choi, Ho-Gil; Shim, Moonsoo; Lee, Jong-Hyeon; Yi, Kyung-Woo

    2017-09-01

    The waste salt treatment process is required for the reuse of purified salts and for the disposal of the fission products contained in waste salt during pyroprocessing. As an alternative to existing fission product separation methods, the horizontal zone refining process is used in this study for the purification of waste salt. In order to evaluate the purification ability of the process, three-dimensional simulation is conducted, considering heat transfer, melt flow, and mass transfer. Impurity distributions and decontamination factors are calculated as a function of the heater traverse rate, by applying a subroutine and the equilibrium segregation coefficient derived from the effective segregation coefficients. For multipass cases, one-dimensional solutions and the effective segregation coefficient obtained from the three-dimensional simulation are used. Although the present study does not deal with crystal growth, the numerical technique used is nearly the same, since zone refining was only recently introduced for the treatment of waste salt from the nuclear power industry because of its simplicity and refining ability. This study therefore demonstrates a new application of single-crystal growth techniques to other fields by taking advantage of the multipass capability of zone refining. The final goal is to achieve the same high degree of decontamination in the waste salt as in the zone freezing (or reverse Bridgman) method.

  11. Preservation of three-dimensional anatomy in phosphatized fossil arthropods enriches evolutionary inference

    PubMed Central

    Schwermann, Achim H; dos Santos Rolo, Tomy; Caterino, Michael S; Bechly, Günter; Schmied, Heiko; Baumbach, Tilo; van de Kamp, Thomas

    2016-01-01

    External and internal morphological characters of extant and fossil organisms are crucial to establishing their systematic position, ecological role and evolutionary trends. The lack of internal characters and soft-tissue preservation in many arthropod fossils, however, impedes comprehensive phylogenetic analyses and species descriptions according to taxonomic standards for Recent organisms. We found well-preserved three-dimensional anatomy in mineralized arthropods from Paleogene fissure fillings and demonstrate the value of these fossils by utilizing digitally reconstructed anatomical structure of a hister beetle. The new anatomical data facilitate a refinement of the species diagnosis and allowed us to reject a previous hypothesis of close phylogenetic relationship to an extant congeneric species. Our findings suggest that mineralized fossils, even those of macroscopically poor preservation, constitute a rich but yet largely unexploited source of anatomical data for fossil arthropods. DOI: http://dx.doi.org/10.7554/eLife.12129.001 PMID:26854367

  12. Refining Trait Resilience: Identifying Engineering, Ecological, and Adaptive Facets from Extant Measures of Resilience

    PubMed Central

    Maltby, John; Day, Liz; Hall, Sophie

    2015-01-01

    The current paper presents a new measure of trait resilience derived from three common mechanisms identified in ecological theory: Engineering, Ecological and Adaptive (EEA) resilience. Exploratory and confirmatory factor analyses of five existing resilience scales suggest that the three trait resilience facets emerge, and can be reduced to a 12-item scale. The conceptualization and value of EEA resilience within the wider trait and well-being psychology is illustrated in terms of differing relationships with adaptive expressions of the traits of the five-factor personality model and the contribution to well-being after controlling for personality and coping, or over time. The current findings suggest that EEA resilience is a useful and parsimonious model and measure of trait resilience that can readily be placed within wider trait psychology and that is found to contribute to individual well-being. PMID:26132197

  13. Clinically significant and practical! Enhancing precision does make a difference. Reply to McGlinchey and Jacobson, Hsu, and Speer.

    PubMed

    Hageman, W J; Arrindell, W A

    1999-12-01

    Based on a secondary analysis of the Jacobson and Truax [Jacobson, N.S. & Truax, P. (1991). a statistical approach to defining meaningful change in psychotherapy research. Journal of Consulting and Clinical Psychology, 59, 12-19.] data using both their own traditional approach and the refined method advanced by Hageman and Arrindell [Hageman, W.J.J.M., & Arrindell, W.A. (1999). Establishing clinically significant change: increment of precision and the distinction between individual and group level of analysis. Behaviour Research and Therapy, 37, 1169-1193], McGlinchey and Jacobson [McGlinchey, J. B., & Jacobson, N. S. (1999). Clinically significant but impractical? A response to Hageman and Arrindell. Behaviour Research and Therapy, 37, 1211-1217.] reported practically identical findings on reliable and clinically significant change across the two approaches. This led McGlinchey and Jacobson to conclude that there is little practical gain in utilizing the refined method over the traditional approach. Close inspection of the data used by McGlinchey and Jacobson however revealed a serious mistake with respect to the value of the standard error of measurement that was employed in their calculations. When the proper index value was utilised, further re-analysis by the present authors disclosed clear differences (i.e. different classifications of S's) across the two approaches. Importantly, these differences followed exactly the same pattern as depicted in Table 2 in Hageman and Arrindell (1999). The theoretical advantages of the refined method, i.e. enhanced precision, appropriate distinction between analysis at the individual and group levels, and maximal comparability of findings across studies, exceed those of the traditional method. Application of the refined method may be carried out within approximately half an hour, which not only supports its practical manageability, but also challenges the suggestion of McGlinchey and Jacobson (1999) that the relevant method would be too complex (impractical) for the average scientist. The reader is offered the opportunity of obtaining an SPSS setup in the form of an ASCII text file by means of which the relevant calculations can be carried out. The ways in which the valuable commentaries by Hsu [Hsu, L. M. (1999). A comparison of three methods of identifying reliable and clinically significant client changes: commentary on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1195-1202.] and Speer [Speer, D. C. (1999). What is the role of two-wave designs in clinical research? Comment on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1203-1210.) contribute to a better understanding of the technical/statistical backgrounds of the traditional and refined methods were also discussed.
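
    For reference, the traditional Jacobson-Truax reliable change index that both approaches start from can be computed as below (the refined Hageman-Arrindell indices are not reproduced here, and the client values are hypothetical); the second call also shows how an incorrect standard error of measurement changes the classification:

      import math

      def reliable_change_index(pre, post, sd_pretest, reliability):
          # Traditional Jacobson-Truax RCI:
          #   SE     = SD * sqrt(1 - r_xx)
          #   S_diff = sqrt(2) * SE
          #   RCI    = (post - pre) / S_diff,  |RCI| > 1.96 => reliable change
          se_measurement = sd_pretest * math.sqrt(1.0 - reliability)
          s_diff = math.sqrt(2.0) * se_measurement
          return (post - pre) / s_diff

      # Hypothetical client: score drops from 30 to 18; SD = 7.5, r_xx = 0.88.
      rci = reliable_change_index(30.0, 18.0, 7.5, 0.88)
      print(round(rci, 2), abs(rci) > 1.96)
      # With a mistakenly inflated SE (e.g. using r_xx = 0.60), the same client
      # is no longer classified as reliably changed.
      print(abs(reliable_change_index(30.0, 18.0, 7.5, 0.60)) > 1.96)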

  14. Interim Report on Heuristics about Inspection Parameters: Updates to Heuristics Resulting from Refinement on Projects

    NASA Technical Reports Server (NTRS)

    Shull, Forrest; Seaman, Carolyn; Feldman, Raimund; Haingaertner, Ralf; Regardie, Myrna

    2008-01-01

    In 2008, we have continued analyzing the inspection data in an effort to better understand the applicability and effect of the inspection heuristics on inspection outcomes. Our research goals during this period are: 1. Investigate the effect of anomalies in the dataset (e.g. the very large meeting length values for some inspections) on our results 2. Investigate the effect of the heuristics on other inspection outcome variables (e.g. effort) 3. Investigate whether the recommended ranges can be modified to give inspection planners more flexibility without sacrificing effectiveness 4. Investigate possible refinements or modifications to the heuristics for specific subdomains (partitioned, e.g., by size, domain, or Center) This memo reports our results to date towards addressing these goals. In the next section, the first goal is addressed by describing the types of anomalies we have found in our dataset, how we have addressed them, and the effect of these changes on our previously reported results. In the following section, on "methodology", we describe the analyses we have conducted to address the other three goals and the results of these analyses are described in the "results" section. Finally, we conclude with future plans for continuing our investigation.

  15. Effect of addition of semi refined carrageenan on mechanical characteristics of gum arabic edible film

    NASA Astrophysics Data System (ADS)

    Setyorini, D.; Nurcahyani, P. R.

    2016-04-01

    Currently, seaweed is processed into flour and semi-refined carrageenan (SRC). Although total production is small, both of these products have a high value and are used in a wide variety of products such as cosmetics, processed foods, medicines, and edible films. The aims of this study were (1) to determine the effect of SRC on the mechanical characteristics of edible film, and (2) to determine the best edible film among those with different SRC concentrations. The edible films were prepared with SRC flour at three concentrations (1.5%, 3%, and 4.5%), with 3% glycerol and 0.6% gum arabic added. The mechanical properties of the films were measured by a universal testing machine (Orientec Co. Ltd.), while the water vapor permeability was measured by a modified desiccant gravimetric method. The experimental design used was a completely randomized design with Duncan's test for further comparison. The results show that differences in SRC concentration affect the elongation at break and tensile strength, but have no significant effect on the thickness, yield strength, or modulus of elasticity. The best edible film was the one with 4.5% SRC.

  16. Parallel Adaptive Mesh Refinement Library

    NASA Technical Reports Server (NTRS)

    Mac-Neice, Peter; Olson, Kevin

    2005-01-01

    Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
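
    A minimal Python sketch of the quad-tree block hierarchy the package is built around (an illustration only, not PARAMESH's Fortran 90 interface; the block sizes, the indicator, and the refinement limit are invented for the example):

      class GridBlock:
          # A node in a quad-tree of sub-grid blocks: each block covers a square
          # patch with its own logically Cartesian mesh and is either a leaf or
          # the parent of four half-size children.
          def __init__(self, x0, y0, size, level=0):
              self.x0, self.y0, self.size, self.level = x0, y0, size, level
              self.children = []

          def refine(self):
              h = self.size / 2.0
              self.children = [GridBlock(self.x0 + i * h, self.y0 + j * h, h, self.level + 1)
                               for i in (0, 1) for j in (0, 1)]

          def leaves(self):
              if not self.children:
                  return [self]
              return [leaf for child in self.children for leaf in child.leaves()]

      def refine_where(block, needs_refinement, max_level):
          # Recursively refine leaf blocks flagged by the application's indicator.
          if block.children:
              for child in block.children:
                  refine_where(child, needs_refinement, max_level)
          elif block.level < max_level and needs_refinement(block):
              block.refine()
              for child in block.children:
                  refine_where(child, needs_refinement, max_level)

      # Toy indicator: resolve more finely near the line y = x.
      root = GridBlock(0.0, 0.0, 1.0)
      near_diagonal = lambda b: abs((b.x0 + b.size / 2) - (b.y0 + b.size / 2)) < b.size
      refine_where(root, near_diagonal, max_level=3)
      print(len(root.leaves()))   # leaf blocks cluster along the diagonal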

  17. Direct Reduction of Waste through Refining of DOE Metal Assets - 13632

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargett, Michael C.; Terekhov, Dimitri; Khozan, Kamran M.

    2013-07-01

    CVMR® presents a technology for refining nickel from the enrichment barrier materials of the DOE that is proven through 100 years of use by the metals industry. CVMR® applies modern controls, instrumentation for process and monitoring of the system, and innovative production methods to produce a wide spectrum of products that generate new technology applications and improvements to our society and economy. CVMR® will receive barrier materials as a secure operation and size-reduce the metal to a shred that is fed to a carbonylation reactor, where nickel is reacted with carbon monoxide to generate nickel carbonyl. The carbonyl will be filtered and decomposed with heat to form a variety of products that include high-value nano powders, coated substrates, net shapes and pure nickel. The residue from the reactor will retain radionuclides from enrichment activities. The carbon monoxide will only react and extract nickel under the operating conditions, leaving volumetric contamination in the unreacted residue. A demonstration plant was designed and built by CVMR®, and operated by BWXT, to demonstrate the system's capabilities to DOE in 2006. A pilot plant operation precedes the detailed design of the nickel refinery and provides essential data for design, safe work practices, waste characterizations and system kinetics, and confirms the project feasibility. CVMR® produces nickel products that are cleaner than the nickel in U.S. commerce and used by industry today. The CVMR® process and systems for nickel refining are well suited for DOE materials and will provide value through environmental stewardship, recovery of high-value assets, and support of the DOE environmental remediation programs, as the refined nickel generates additional long-term benefits to local communities. (authors)

  18. Natural flow and water consumption in the Milk River basin, Montana and Alberta, Canada

    USGS Publications Warehouse

    Thompson, R.E.

    1986-01-01

    A study was conducted to determine the differences between natural and nonnatural Milk River streamflow, to delineate and quantify the types and effects of water consumption on streamflow, and to refine the current computation procedure into one which computes and apportions natural flow. Water consumption consists principally of irrigated agriculture, municipal use, and evapotranspiration. Mean daily water consumption by irrigation ranged from 10 cu ft/sec to 26 cu ft/sec in the Canada part and from 6 cu ft/sec to 41 cu ft/sec in the US part. Two Canadian municipalities consume about 320 acre-ft and one US municipality consumes about 20 acre-ft yearly. Evaporation from the water surface comprises 80% to 90% of the flow reduction in the Milk River attributed to total evapotranspiration. The current water-budget approach for computing natural flow of the Milk River where it reenters the US was refined into an interim procedure which includes allowances for man-induced consumption and a method for apportioning computed natural flow between the US and Canada. The refined procedure is considered interim because further study of flow routing, tributary inflow, and man-induced consumption is needed before a more accurate procedure for computing natural flow can be developed. (Author's abstract)

  19. Refining of crude rubber seed oil as a feedstock for biofuel production.

    PubMed

    Gurdeep Singh, Haswin Kaur; Yusup, Suzana; Abdullah, Bawadi; Cheah, Kin Wai; Azmee, Fathin Nabilah; Lam, Hon Loong

    2017-12-01

    Crude rubber seed oil is a potential source for biofuel production. However, it contains undesirable impurities such as peroxides and highly oxidative components that not only affect the oil's stability, colour and shelf-life but also promote the formation of insoluble gums over time, which could cause deposits in combustion engines. Therefore, to overcome these problems, the crude rubber seed oil is refined through degumming and bleaching. The effects of bleaching earth dosage (15-40 wt%), phosphoric acid dosage (0.5-1.0 wt%) and reaction time (20-40 min) on the reduction of the peroxide value of the refined oil were studied. The analysis of variance shows that bleaching earth dosage was the most influential factor, followed by reaction time and phosphoric acid dosage. A minimum peroxide value of 0.1 milliequivalents/gram was achieved under optimized conditions of 40 wt% bleaching earth dosage, 1.0 wt% phosphoric acid dosage and 20 min reaction time, using a Response Surface Methodology design. Copyright © 2017 Elsevier Ltd. All rights reserved.
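    A minimal sketch of the Response Surface Methodology step, assuming a full quadratic response surface and using made-up design points within the factor ranges quoted above (the study's actual design and measurements are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical design points: bleaching earth (wt%), phosphoric acid (wt%), time (min) -> peroxide value
    X = np.array([[15, 0.5, 20], [40, 0.5, 20], [15, 1.0, 40], [40, 1.0, 40], [27.5, 0.75, 30],
                  [40, 1.0, 20], [15, 0.5, 40], [27.5, 1.0, 30], [40, 0.75, 30], [15, 0.75, 30],
                  [27.5, 0.5, 30], [27.5, 0.75, 20], [27.5, 0.75, 40]])
    y = np.array([2.1, 0.6, 1.8, 0.5, 1.0, 0.1, 2.5, 0.9, 0.4, 1.9, 1.2, 0.8, 1.1])  # made-up responses

    def quad_design(X):
        # full second-order model: intercept, linear, interaction and squared terms
        b, a, t = np.asarray(X, dtype=float).T
        return np.column_stack([np.ones(len(X)), b, a, t, b*a, b*t, a*t, b**2, a**2, t**2])

    coef, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)

    # locate the predicted minimum peroxide value on a grid inside the experimental ranges
    bb, aa, tt = np.meshgrid(np.linspace(15, 40, 26), np.linspace(0.5, 1.0, 11), np.linspace(20, 40, 21))
    grid = np.column_stack([bb.ravel(), aa.ravel(), tt.ravel()])
    best = grid[np.argmin(quad_design(grid) @ coef)]
    print("predicted optimum (bleaching earth wt%, acid wt%, time min):", best)
    ```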

  20. Catholic institutions: mirror or model for society?

    PubMed

    Fitzpatrick, A; Gaylor, C C

    1987-04-01

    Certain values and priorities help establish and shape an organization's identity. Catholic organizations--through the values by which they operate--can determine whether they function as a centrifugal force that shapes the values of the larger society or whether they are driven by the centripetal force of American values, thus accommodating their actions to succeed in self-serving, narrow ways. Catholic organizations can evaluate their practices against three "environments," each composed of value strata, that characterize workplaces: Subpersonal environment. Workers are alienated; the employee is seldom acknowledged as a person. Stratum one--base values. The operative values focus on mechanical working qualities such as punctuality and productivity, rather than human interaction. Stratum two--civil values. Some interaction takes place, but it is geared only to customer satisfaction. Relational environment. Relationships are important to organizational functioning. Stratum three--corporate values. Workers must have more interpersonal skills but are seen as a means to an end: benefit for the corporation. Stratum four--ethical culture values. The worker is recognized as a person to be respected. Operative values are fair play and improving the human condition. Religious environment. Workers affirm the existence of a Godhead, which creates a "community" workplace. Stratum five--Judeo-Christian values. A commitment to charity and mercy and serving others is evident. Stratum six--Catholic values. Persons are seen as the body of Christ; the organization challenges society's tenets when these ignore the human person.(ABSTRACT TRUNCATED AT 250 WORDS)

  1. Development of a problematic mobile phone use scale for Turkish adolescents.

    PubMed

    Güzeller, Cem Oktay; Coşguner, Tolga

    2012-04-01

    Abstract The aim of this study was to evaluate the psychometric properties of the Problematic Mobile Phone Use Scale (PMPUS) for Turkish Adolescents. The psychometric properties of PMPUS were tested in two separate sample groups that consisted of 950 Turkish high school students. The first sample group (n=309) was used to determine the factor structure of the scale. The second sample group (n=461) was used to test data conformity with the identified structure, discriminant validity and concurrent scale validity, internal consistency reliability calculations, and item statistics calculations. The results of exploratory factor analyses indicated that the scale had three factors: interference with negative effect, compulsion/persistence, and withdrawal/tolerance. The results showed that item and construct reliability values yielded satisfactory rates in general for the three-factor construct. On the other hand, the average variance extracted value remained below the scale value for three subscales. The scores for the scale significantly correlated with depression and loneliness. In addition, the discriminant validity value was above the scale in all sub-dimensions except one. Based on these data, the reliability of the PMPUS scale appears to be satisfactory and provides good internal consistency. Therefore, with limited exception, the PMPUS was found to be reliable and valid in the context of Turkish adolescents.
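    The internal-consistency results reported above rest on statistics such as Cronbach's alpha; a minimal sketch of that computation on simulated (non-PMPUS) item scores:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)          # per-item variances
        total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale score
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    scores = latent + 0.8 * rng.normal(size=(200, 6))  # six correlated, Likert-like items
    print(round(cronbach_alpha(scores), 3))
    ```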

  2. First-principles study on doping and temperature dependence of thermoelectric property of Bi{sub 2}S{sub 3} thermoelectric material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Donglin; Hu, Chenguo, E-mail: hucg@cqu.edu.cn; Zhang, Cuiling

    2013-05-15

    Graphical abstract: The direction-induced ZT is found. Along the ZZ direction and at n = 1.47 × 10{sup 19} cm{sup −3}, ZT can reach a maximal value of 0.36, which is three times the maximal laboratory value. This result matches well the analysis of the electron effective mass. Highlights: electrical transport in Bi{sub 2}S{sub 3} depends on carrier concentration and temperature; the direction-induced ZT is found; along the ZZ direction and at n = 1.47 × 10{sup 19} cm{sup −3}, ZT can reach a maximal value of 0.36; this maximal ZT value is three times the maximal laboratory value; by doping and temperature tuning, Bi{sub 2}S{sub 3} is a promising thermoelectric material. - Abstract: The electronic structure and thermoelectric properties of Bi{sub 2}S{sub 3} are investigated. The electron and hole effective masses of Bi{sub 2}S{sub 3} are analyzed in detail, from which we find that thermoelectric transport varies along different directions in the Bi{sub 2}S{sub 3} crystal. Along the ac plane a higher figure of merit (ZT) can be achieved. For n-type doped Bi{sub 2}S{sub 3}, the optimal doping concentration is found in the range of (1.0–5.0) × 10{sup 19} cm{sup −3}, in which the maximal ZT reaches 0.21 at 900 K, but along the ZZ direction the maximal ZT reaches 0.36. These findings provide a new understanding of the structural factors that govern thermoelectric performance and of ways to improve ZT. The donor concentration N increases as T increases at one bar of pressure under a suitable chemical potential μ, but above this chemical potential μ the donor concentration N remains constant.
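    The figure of merit quoted throughout this record follows the standard definition (not restated in the abstract):

    ```latex
    ZT = \frac{S^{2}\,\sigma\,T}{\kappa}, \qquad \kappa = \kappa_{e} + \kappa_{l}
    ```

    where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, and κ the total (electronic plus lattice) thermal conductivity.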

  3. Convergence characteristics of nonlinear vortex-lattice methods for configuration aerodynamics

    NASA Technical Reports Server (NTRS)

    Seginer, A.; Rusak, Z.; Wasserstrom, E.

    1983-01-01

    Nonlinear panel methods have no proof for the existence and uniqueness of their solutions. The convergence characteristics of an iterative, nonlinear vortex-lattice method are, therefore, carefully investigated. The effects of several parameters, including (1) the surface-paneling method, (2) an integration method of the trajectories of the wake vortices, (3) vortex-grid refinement, and (4) the initial conditions for the first iteration on the computed aerodynamic coefficients and on the flow-field details are presented. The convergence of the iterative-solution procedure is usually rapid. The solution converges with grid refinement to a constant value, but the final value is not unique and varies with the wing surface-paneling and wake-discretization methods within some range in the vicinity of the experimental result.

  4. Nursing interventions for rehabilitation in Parkinson's disease: cross mapping of terms

    PubMed Central

    Tosin, Michelle Hyczy de Siqueira; Campos, Débora Moraes; de Andrade, Leonardo Tadeu; de Oliveira, Beatriz Guitton Renaud Baptista; Santana, Rosimere Ferreira

    2016-01-01

    ABSTRACT Objective: to perform a cross-term mapping of nursing language in the patient record with the Nursing Interventions Classification system, in rehabilitation patients with Parkinson's disease. Method: a documentary research study to perform cross mapping. A probabilistic, simple random sample composed of 67 records of patients with Parkinson's disease who participated in a rehabilitation program between March of 2009 and April of 2013. The research was conducted in three stages, in which the nursing terms were mapped to natural language and crossed with the Nursing Interventions Classification. Results: a total of 1,077 standard interventions were identified; after crossing with the taxonomy and refinement performed by the experts, these resulted in 32 interventions equivalent to the Nursing Interventions Classification (NIC) system. The NICs "Education: The process of the disease", "Contract with the patient", and "Facilitation of Learning" were present in 100% of the records. For these interventions, 40 activities were described, representing 13 activities per intervention. Conclusion: the cross mapping allowed for the identification of terms corresponding to the nursing interventions used every day in rehabilitation nursing, and compared them to the Nursing Interventions Classification. PMID:27508903

  5. APPRIS 2017: principal isoforms for multiple gene sets

    PubMed Central

    Rodriguez-Rivas, Juan; Di Domenico, Tomás; Vázquez, Jesús; Valencia, Alfonso

    2018-01-01

    Abstract The APPRIS database (http://appris-tools.org) uses protein structural and functional features and information from cross-species conservation to annotate splice isoforms in protein-coding genes. APPRIS selects a single protein isoform, the ‘principal’ isoform, as the reference for each gene based on these annotations. A single main splice isoform reflects the biological reality for most protein coding genes and APPRIS principal isoforms are the best predictors of these main protein isoforms. Here, we present the updates to the database, new developments that include the addition of three new species (chimpanzee, Drosophila melanogaster and Caenorhabditis elegans), the expansion of APPRIS to cover the RefSeq gene set and the UniProtKB proteome for six species and refinements in the core methods that make up the annotation pipeline. In addition, APPRIS now provides a measure of reliability for individual principal isoforms and updates with each release of the GENCODE/Ensembl and RefSeq reference sets. The individual GENCODE/Ensembl, RefSeq and UniProtKB reference gene sets for six organisms have been merged to produce common sets of splice variants. PMID:29069475

  6. A refined definition improves the measurement reliability of the tip-apex distance.

    PubMed

    Sakagoshi, Daigo; Sawaguchi, Takeshi; Shima, Yosuke; Inoue, Daisuke; Oshima, Takeshi; Goldhahn, Sabine

    2016-07-01

    Tip-apex distance (TAD) is reported as a good predictor for cut-outs of lag screws and spiral blades in the treatment of intertrochanteric fractures, and surgeons are advised to strive for TAD within 20 mm. However, the femoral neck axis and the position of the lower limb in the lateral radiograph are not clearly defined and can lead to measurement errors. We propose a refined TAD by defining these factors. The objective of this study was to analyze the reliability of this refined TAD. The radiographs of 130 prospective cases with unstable trochanteric fractures were used for the analysis of the refined TAD. The refined TAD was independently measured by 2 raters with clinical experience of more than 10 years (raters 1 and 2) and 2 raters with much less clinical experience (raters 3 and 4) after they received training in the new measurement method. The intraclass correlation coefficient (ICC [2,4]) was calculated to assess the interrater reliability. The mean refined TADs were 18.2, 18.4, 18.2 and 18.2 mm for raters 1, 2, 3 and 4, respectively. There was a strong correlation among all four raters (ICC 0.998; 95% CI: 0.998-0.999). Regardless of the clinical experience of raters, the refined TAD is a reliable tool and can be used to develop new TAD recommendations for predicting failure of fixation. Future studies with larger samples are needed to evaluate the predictive value of the refined TAD. Copyright © 2016 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
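    The ICC (2,4) reported above is the two-way random-effects, average-measures intraclass correlation; a minimal sketch of that computation (Shrout-Fleiss form) on hypothetical ratings, not the study's data:

    ```python
    import numpy as np

    def icc_2k(ratings):
        """ICC(2,k): two-way random effects, absolute agreement, average of k raters.
        ratings: (n_subjects, k_raters) matrix."""
        Y = np.asarray(ratings, dtype=float)
        n, k = Y.shape
        grand = Y.mean()
        ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subject mean square
        ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-rater mean square
        ss_err = ((Y - Y.mean(axis=1, keepdims=True)
                     - Y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

    # hypothetical refined-TAD measurements (mm) by 4 raters on 6 radiographs
    tad = np.array([[18.1, 18.3, 18.0, 18.2],
                    [21.5, 21.7, 21.4, 21.6],
                    [15.9, 16.1, 15.8, 16.0],
                    [19.8, 20.0, 19.7, 19.9],
                    [17.2, 17.4, 17.1, 17.3],
                    [22.6, 22.8, 22.5, 22.7]])
    print(round(icc_2k(tad), 3))
    ```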

  7. Three-dimensional (3D) structure prediction of the American and African oil-palms β-ketoacyl-[ACP] synthase-II protein by comparative modelling

    PubMed Central

    Wang, Edina; Chinni, Suresh; Bhore, Subhash Janardhan

    2014-01-01

    Background: The fatty-acid profile of vegetable oils determines their properties and nutritional value. Palm-oil obtained from the African oil-palm [Elaeis guineensis Jacq. (Tenera)] contains 44% palmitic acid (C16:0), but palm-oil obtained from the American oil-palm [Elaeis oleifera] contains only 25% C16:0. In part, the β-ketoacyl-[ACP] synthase II (KASII) [EC: 2.3.1.179] protein is responsible for the high level of C16:0 in palm-oil derived from the African oil-palm. To understand more about the E. guineensis KASII (EgKASII) and E. oleifera KASII (EoKASII) proteins, it is essential to know their structures. Hence, this study was undertaken. Objective: The objective of this study was to predict the three-dimensional (3D) structures of the EgKASII and EoKASII proteins using molecular modelling tools. Materials and Methods: The amino-acid sequences for the KASII proteins were retrieved from the protein database of the National Center for Biotechnology Information (NCBI), USA. The 3D structures were predicted for both proteins using homology modelling and an ab initio approach to protein structure prediction. Molecular dynamics (MD) simulation was performed to refine the predicted structures. The predicted structure models were evaluated, and root mean square deviation (RMSD) and root mean square fluctuation (RMSF) values were calculated. Results: The homology modelling showed that the EgKASII and EoKASII proteins are 78% and 74% similar to Streptococcus pneumoniae KASII and Brucella melitensis KASII, respectively. The EgKASII and EoKASII structures predicted using the ab initio approach show 6% and 9% deviation, respectively, from the structures predicted by homology modelling. The structure refinement and validation confirmed that the predicted structures are accurate. Conclusion: The 3D structures of the EgKASII and EoKASII proteins were predicted. However, further research is essential to understand the interaction of the EgKASII and EoKASII proteins with their substrates. PMID:24748752

  8. Three-dimensional (3D) structure prediction of the American and African oil-palms β-ketoacyl-[ACP] synthase-II protein by comparative modelling.

    PubMed

    Wang, Edina; Chinni, Suresh; Bhore, Subhash Janardhan

    2014-01-01

    The fatty-acid profile of vegetable oils determines their properties and nutritional value. Palm-oil obtained from the African oil-palm [Elaeis guineensis Jacq. (Tenera)] contains 44% palmitic acid (C16:0), but palm-oil obtained from the American oil-palm [Elaeis oleifera] contains only 25% C16:0. In part, the β-ketoacyl-[ACP] synthase II (KASII) [EC: 2.3.1.179] protein is responsible for the high level of C16:0 in palm-oil derived from the African oil-palm. To understand more about the E. guineensis KASII (EgKASII) and E. oleifera KASII (EoKASII) proteins, it is essential to know their structures. Hence, this study was undertaken. The objective of this study was to predict the three-dimensional (3D) structures of the EgKASII and EoKASII proteins using molecular modelling tools. The amino-acid sequences for the KASII proteins were retrieved from the protein database of the National Center for Biotechnology Information (NCBI), USA. The 3D structures were predicted for both proteins using homology modelling and an ab initio approach to protein structure prediction. Molecular dynamics (MD) simulation was performed to refine the predicted structures. The predicted structure models were evaluated, and root mean square deviation (RMSD) and root mean square fluctuation (RMSF) values were calculated. The homology modelling showed that the EgKASII and EoKASII proteins are 78% and 74% similar to Streptococcus pneumoniae KASII and Brucella melitensis KASII, respectively. The EgKASII and EoKASII structures predicted using the ab initio approach show 6% and 9% deviation, respectively, from the structures predicted by homology modelling. The structure refinement and validation confirmed that the predicted structures are accurate. The 3D structures of the EgKASII and EoKASII proteins were predicted. However, further research is essential to understand the interaction of the EgKASII and EoKASII proteins with their substrates.
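    The RMSD used above for model evaluation is the standard root-mean-square deviation over corresponding atoms; a minimal sketch on hypothetical, already superposed coordinates (no fitting step shown):

    ```python
    import numpy as np

    def rmsd(coords_a, coords_b):
        """RMSD between two pre-superposed (N, 3) coordinate arrays, in the same units."""
        a, b = np.asarray(coords_a, dtype=float), np.asarray(coords_b, dtype=float)
        return np.sqrt(((a - b) ** 2).sum(axis=1).mean())

    # hypothetical C-alpha coordinates (angstroms) for a four-residue fragment
    model = np.array([[0.0, 0.0, 0.0], [3.8, 0.1, 0.0], [7.6, 0.0, 0.2], [11.4, -0.1, 0.1]])
    reference = np.array([[0.1, 0.0, 0.1], [3.7, 0.2, -0.1], [7.5, 0.1, 0.3], [11.6, 0.0, 0.0]])
    print(round(rmsd(model, reference), 3))
    ```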

  9. In the pursuit of a semantic similarity metric based on UMLS annotations for articles in PubMed Central Open Access.

    PubMed

    Garcia Castro, Leyla Jael; Berlanga, Rafael; Garcia, Alexander

    2015-10-01

    Although full-text articles are provided by the publishers in electronic formats, it remains a challenge to find related work beyond the title and abstract context. Identifying related articles based on their abstract is indeed a good starting point; this process is straightforward and does not consume as many resources as full-text based similarity would require. However, further analyses may require in-depth understanding of the full content. Two articles with highly related abstracts can be substantially different regarding the full content. How similarity differs when considering title-and-abstract versus full-text and which semantic similarity metric provides better results when dealing with full-text articles are the main issues addressed in this manuscript. We have benchmarked three similarity metrics (BM25, PMRA, and Cosine) to determine which one performs best when using concept-based annotations on full-text documents. We also evaluated variations in similarity values based on title-and-abstract against those relying on full-text. Our test dataset comprises the Genomics track article collection from the 2005 Text Retrieval Conference. Initially, we used entity recognition software to semantically annotate titles and abstracts as well as full text with concepts defined in the Unified Medical Language System (UMLS®). For each article, we created a document profile, i.e., a set of identified concepts, term frequency, and inverse document frequency; we then applied various similarity metrics to those document profiles. We considered correlation, precision, recall, and F1 in order to determine which similarity metric performs best with concept-based annotations. For those full-text articles available in PubMed Central Open Access (PMC-OA), we also performed dispersion analyses in order to understand how similarity varies when considering full-text articles. We have found that the PubMed Related Articles similarity metric is the most suitable for full-text articles annotated with UMLS concepts. For similarity values above 0.8, all metrics exhibited an F1 around 0.2 and a recall around 0.1; BM25 showed the highest precision close to 1; in all cases the concept-based metrics performed better than the word-stem-based one. Our experiments show that similarity values vary when considering only title-and-abstract versus full-text similarity. Therefore, analyses based on full-text become useful when a given research question requires going beyond title and abstract, particularly regarding connectivity across articles. Visualization available at ljgarcia.github.io/semsim.benchmark/, data available at http://dx.doi.org/10.5281/zenodo.13323. Copyright © 2015 Elsevier Inc. All rights reserved.
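    Of the three benchmarked metrics, cosine similarity over weighted concept profiles is the simplest to state; a minimal sketch using TF-IDF-style weights over hypothetical UMLS concept identifiers (illustrative only, not the study's annotation pipeline):

    ```python
    import math
    from collections import Counter

    def tfidf_profiles(docs):
        """docs: list of concept-ID lists, one per article. Returns one TF-IDF weight dict per article."""
        n = len(docs)
        df = Counter(c for doc in docs for c in set(doc))               # document frequency per concept
        return [{c: tf * math.log(n / df[c]) for c, tf in Counter(doc).items()} for doc in docs]

    def cosine(p, q):
        dot = sum(w * q.get(c, 0.0) for c, w in p.items())
        norm = math.sqrt(sum(w * w for w in p.values())) * math.sqrt(sum(w * w for w in q.values()))
        return dot / norm if norm else 0.0

    docs = [["C0027051", "C0018799", "C0032285"],                       # hypothetical UMLS CUIs
            ["C0027051", "C0018799", "C0018799", "C0004238"],
            ["C0032285", "C0032285", "C0007222"]]
    profiles = tfidf_profiles(docs)
    print(round(cosine(profiles[0], profiles[1]), 3))
    ```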

  10. Formulation and characterization of novel composite semi-refined iota carrageenan-based edible film incorporating palmitic acid

    NASA Astrophysics Data System (ADS)

    Praseptiangga, Danar; Giovani, Sarah; Manuhara, Godras Jati; Muhammad, Dimas Rahadian Aji

    2017-09-01

    Novel composite films based on semi-refined iota-carrageenan (SRIC) incorporating palmitic acid (PA) were prepared by an emulsification method. Palmitic acid, as a hydrophobic material, was incorporated into semi-refined iota-carrageenan edible films in order to improve their water vapor barrier properties. Composite SRIC-based films with varying concentrations of PA (10%, 20%, and 30% w/w) were obtained by a solvent casting method. Their mechanical and barrier properties were investigated. Results showed that the incorporation of PA in films caused a significant increase (p < 0.05) in thickness as the concentration of PA increased (from 10% to 30% w/w). The mechanical properties of semi-refined iota-carrageenan were also affected by PA incorporation; increasing the concentration of PA (from 10% to 30% w/w) in films improved the tensile strength (TS). Interestingly, the TS value increased to a peak at 20% w/w PA. However, the TS value showed a decrease when PA was added at 30% w/w. Elongation-at-break (EAB) was significantly (p < 0.05) decreased when the concentration of PA in films increased (from 10% to 30% w/w). Furthermore, the incorporation of PA also affected the water vapor barrier properties of the films. The water vapor transmission rate (WVTR) of the composite semi-refined iota-carrageenan-based edible film decreased significantly (p < 0.05) as the concentration of palmitic acid increased (from 10% to 30% w/w). The composite SRIC-based edible film incorporating 30% w/w of PA presented better water vapor barrier properties compared to the films with 10% and 20% w/w PA incorporation. Thus, the formulation containing 30% w/w palmitic acid produced films with substantially improved water vapor barrier properties and has potential for food packaging applications.

  11. Industrial garnet

    USGS Publications Warehouse

    Olson, D.W.

    2010-01-01

    In 2009, U.S. production of crude garnet concentrate for industrial use was estimated to be 56.5 kt (62,300 st), valued at about $8.85 million. This was a 10-percent decrease in quantity compared with 2008 production. Refined garnet material sold or used was 28 kt (31,000 st) valued at $7.96 million.

  12. Segmental Refinement: A Multigrid Technique for Data Locality

    DOE PAGES

    Adams, Mark F.; Brown, Jed; Knepley, Matt; ...

    2016-08-04

    In this paper, we investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. Finally, we present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.
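    For orientation only, a minimal V-cycle for the 1D Poisson model problem, showing the smoothing, restriction, coarse-grid correction and prolongation components that segmental refinement reorganizes for data locality; this is the textbook multigrid baseline, not the segmental refinement algorithm itself:

    ```python
    import numpy as np

    def residual(u, f, h):
        # r = f - A u for -u'' = f on [0, 1] with u(0) = u(1) = 0 (second-order central differences)
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / h**2
        return r

    def smooth(u, f, h, iters=3, omega=2.0 / 3.0):
        # weighted-Jacobi pre-/post-smoother
        for _ in range(iters):
            v = u.copy()
            v[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
            u = v
        return u

    def restrict(r):
        # full-weighting restriction onto every other point
        return np.concatenate(([0.0], 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2], [0.0]))

    def prolong(uc):
        # linear interpolation back to the fine grid
        uf = np.zeros(2 * (len(uc) - 1) + 1)
        uf[::2] = uc
        uf[1::2] = 0.5 * (uc[:-1] + uc[1:])
        return uf

    def v_cycle(u, f, h):
        if len(u) <= 3:                                   # coarsest grid: one unknown, solve exactly
            u[1] = 0.5 * h**2 * f[1]
            return u
        u = smooth(u, f, h)                               # pre-smooth
        rc = restrict(residual(u, f, h))                  # restrict the residual
        ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)      # recursive coarse-grid solve
        return smooth(u + prolong(ec), f, h)              # correction, then post-smooth

    n = 64
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x)                      # exact solution is sin(pi x)
    u = np.zeros(n + 1)
    for _ in range(10):
        u = v_cycle(u, f, 1.0 / n)
    print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
    ```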

  13. The Gravity Field, Orientation, and Ephemeris of Mercury from MESSENGER Observations After Three Years in Orbit

    NASA Technical Reports Server (NTRS)

    Mazarico, Erwan M.; Genova, Antonio; Goossens, Sander; Lemoine, Gregory; Neumann, Gregory A.; Zuber, Maria T.; Smith, David E.; Solomon, Sean C.

    2014-01-01

    We have analyzed three years of radio tracking data from the MESSENGER spacecraft in orbit around Mercury and determined the gravity field, planetary orientation, and ephemeris of the innermost planet. With improvements in spatial coverage, force modeling, and data weighting, we refined an earlier global gravity field both in quality and resolution, and we present here a spherical harmonic solution to degree and order 50. In this field, termed HgM005, uncertainties in low-degree coefficients are reduced by an order of magnitude relative to the earlier global field, and we obtained a preliminary value of the tidal Love number k(sub 2) of 0.451+/-0.014. We also estimated Mercury's pole position, and we obtained an obliquity value of 2.06 +/- 0.16 arcmin, in good agreement with analysis of Earth-based radar observations. From our updated rotation period (58.646146 +/- 0.000011 days) and Mercury ephemeris, we verified experimentally the planet's 3: 2 spin-orbit resonance to greater accuracy than previously possible. We present a detailed analysis of the HgM005 covariance matrix, and we describe some near-circular frozen orbits around Mercury that could be advantageous for future exploration.
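    The degree-and-order-50 field above refers to the standard spherical harmonic expansion of the gravitational potential (standard form, not quoted from the record):

    ```latex
    U(r,\phi,\lambda) = \frac{GM}{r}\left[\, 1 + \sum_{l=2}^{L}\sum_{m=0}^{l}
      \left(\frac{R}{r}\right)^{l} \bar{P}_{lm}(\sin\phi)
      \left( \bar{C}_{lm}\cos m\lambda + \bar{S}_{lm}\sin m\lambda \right) \right]
    ```

    with L = 50 for HgM005, R a reference radius, the P̄_lm the normalized associated Legendre functions, and C̄_lm, S̄_lm the normalized Stokes coefficients estimated from the tracking data.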

  14. Students' flexible use of ontologies and the value of tentative reasoning: Examples of conceptual understanding in three canonical topics of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Hoehn, Jessica R.; Finkelstein, Noah D.

    2018-06-01

    As part of a research study on student reasoning in quantum mechanics, we examine students' use of ontologies, or the way students categorically organize entities they are reasoning about. In analyzing three episodes of focus group discussions with modern physics students, we present evidence of the dynamic nature of ontologies, and refine prior theoretical frameworks for thinking about dynamic ontologies. We find that in a given reasoning episode ontologies can be dynamic in construction (referring to when the reasoner constructs the ontologies) or application (referring to which ontologies are applied in a given reasoning episode). In our data, we see instances of students flexibly switching back and forth between parallel stable structures as well as constructing and negotiating new ontologies in the moment. Methodologically, we use a collective conceptual blending framework as an analytic tool for capturing student reasoning in groups. In this research, we value the messiness of student reasoning and argue that reasoning in a tentative manner can be productive for students learning quantum mechanics. As such, we shift away from a binary view of student learning which sees students as either having the correct answer or not.

  15. Direct numerical simulation of particle alignment in viscoelastic fluids

    NASA Astrophysics Data System (ADS)

    Hulsen, Martien; Jaensson, Nick; Anderson, Patrick

    2016-11-01

    Rigid particles suspended in viscoelastic fluids under shear can align in string-like structures in flow direction. To unravel this phenomenon, we present 3D direct numerical simulations of the alignment of two and three rigid, non-Brownian particles in a shear flow of a viscoelastic fluid. The equations are solved on moving, boundary-fitted meshes, which are locally refined to accurately describe the polymer stresses around and in between the particles. A small minimal gap size between the particles is introduced. The Giesekus model is used and the effect of the Weissenberg number, shear thinning and solvent viscosity is investigated. Alignment of two and three particles is observed. Morphology plots have been created for various combinations of fluid parameters. Alignment is mainly governed by the value of the elasticity parameter S, defined as half of the ratio between the first normal stress difference and shear stress of the suspending fluid. Alignment appears to occur above a critical value of S, which decreases with increasing shear thinning. This result, together with simulations of a shear-thinning Carreau fluid, leads us to the conclusion that normal stress differences are essential for particle alignment to occur, but it is also strongly promoted by shear thinning.
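    Written out, the elasticity parameter defined in the abstract, together with the conventional form of the Weissenberg number (the latter is an assumption, not restated above), reads:

    ```latex
    S = \frac{N_{1}}{2\,\sigma_{xy}}, \qquad \mathrm{Wi} = \lambda\,\dot{\gamma}
    ```

    where N_1 is the first normal stress difference and σ_xy the shear stress of the suspending fluid, λ a relaxation time and γ̇ the shear rate; alignment is observed above a critical value of S.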

  16. Bulk Nanolaminated Nickel: Preparation, Microstructure, Mechanical Property, and Thermal Stability

    NASA Astrophysics Data System (ADS)

    Liu, Fan; Yuan, Hao; Goel, Sunkulp; Liu, Ying; Wang, Jing Tao

    2018-02-01

    A bulk nanolaminated (NL) structure with distinctive fractions of low- and high-angle grain boundaries (f_LAGB and f_HAGB) is produced in pure nickel, through a two-step process of primary grain refinement by equal-channel angular pressing (ECAP), followed by a secondary geometrical refinement via liquid nitrogen rolling (LNR). The lamellar boundary spacings of 2N and 4N nickel are refined to 40 and 70 nm, respectively, and the yield strength of the NL structure in 2N nickel reaches 1.5 GPa. The impacts of the deformation path, material purity, grain boundary (GB) misorientation, and energy on the microstructure, refinement ability, mechanical strength, and thermal stability are investigated to understand the inherent governing mechanisms. GB migration is the main restoration mechanism limiting the refinement of an NL structure in 4N nickel, while in 2N nickel, shear banding occurs and mediates one-fifth of the total true normal rolling strain at the mesoscale, restricting further refinement. Three typical structures [ultrafine grained (UFG), NL with low f_LAGB, and NL with high f_LAGB] obtained through three different combinations of ECAP and LNR were studied by isochronal annealing for 1 hour at temperatures ranging from 433 K to 973 K (160 °C to 700 °C). Higher thermal stability in the NL structure with high f_LAGB is shown by a 50 K (50 °C) delay in the initiation temperature of recrystallization. Based on calculations and analyses of the stored energies of deformed structures from strain distribution, as characterized by kernel average misorientation (KAM), and from GB misorientations, higher thermal stability is attributed to the high f_LAGB in this type of NL structure. This is confirmed by a slower change in the microstructure, as revealed by characterizing its annealing kinetics using KAM maps.

  17. A methodology for quadrilateral finite element mesh coarsening

    DOE PAGES

    Staten, Matthew L.; Benzley, Steven; Scott, Michael

    2008-03-27

    High fidelity finite element modeling of continuum mechanics problems often requires using all quadrilateral or all hexahedral meshes. The efficiency of such models is often dependent upon the ability to adapt a mesh to the physics of the phenomena. Adapting a mesh requires the ability to both refine and/or coarsen the mesh. The algorithms available to refine and coarsen triangular and tetrahedral meshes are very robust and efficient. However, the ability to locally and conformally refine or coarsen all quadrilateral and all hexahedral meshes presents many difficulties. Some research has been done on localized conformal refinement of quadrilateral and hexahedral meshes. However, little work has been done on localized conformal coarsening of quadrilateral and hexahedral meshes. A general method which provides both localized conformal coarsening and refinement for quadrilateral meshes is presented in this paper. This method is based on restructuring the mesh with simplex manipulations to the dual of the mesh. Finally, this method appears to be extensible to hexahedral meshes in three dimensions.
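    The dual-based coarsening described above starts from the element-adjacency (dual) graph of the quadrilateral mesh; an illustrative sketch (hypothetical connectivity, not the paper's simplex operations) of building that graph from shared edges:

    ```python
    from collections import defaultdict

    def dual_graph(quads):
        """quads: list of 4-tuples of node ids (one tuple per element).
        Returns {element index: set of neighbouring element indices}, where two
        elements are dual-adjacent if they share an edge."""
        edge_to_elems = defaultdict(list)
        for e, q in enumerate(quads):
            for i in range(4):
                edge = frozenset((q[i], q[(i + 1) % 4]))   # undirected edge
                edge_to_elems[edge].append(e)
        adj = defaultdict(set)
        for elems in edge_to_elems.values():
            for a in elems:
                for b in elems:
                    if a != b:
                        adj[a].add(b)
        return adj

    # a hypothetical 2x2 block of quads on a 3x3 node grid (nodes numbered row-major)
    quads = [(0, 1, 4, 3), (1, 2, 5, 4), (3, 4, 7, 6), (4, 5, 8, 7)]
    print(dict(dual_graph(quads)))
    ```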

  18. Grain Refinement of Al-Si Hypoeutectic Alloys by Al3Ti1B Master Alloy and Ultrasonic Treatment

    NASA Astrophysics Data System (ADS)

    Wang, Gui; Wang, Eric Qiang; Prasad, Arvind; Dargusch, Matthew; StJohn, David H.

    Al-Si alloys are widely used in automotive and aerospace industries due to their excellent castability, high strength to weight ratio and good corrosion resistance. However, Si poisoning severely limits the degree of grain refinement with the grain size becoming larger as the Si content increases. Generally the effect of Si poisoning is reduced by increasing the amount of master alloy added to the melt during casting. However, an alternative approach is physical grain refinement through the application of an external force (e.g. mechanical or electromagnetic stirring, intensive shearing and ultrasonic irradiation). This work compares the grain refining efficiency of three approaches to the grain refinement of a range of hypoeutectic Al-Si alloys by (i) the addition of Al3Ti1B master alloy, (ii) the application of Ultrasonic Treatment (UT) and (iii) the combined addition of Al3Ti1B master alloy and the application of UT.

  19. Second invitational well-testing symposium proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-01-01

    The symposium dealt with the state of the art of injection of fluids underground, and its application to geothermal systems in particular. Separate abstracts were prepared for fourteen papers and three abstracts of papers were listed by title. Three papers were previously abstracted for EDB.

  20. Refined Dummy Atom Model of Mg(2+) by Simple Parameter Screening Strategy with Revised Experimental Solvation Free Energy.

    PubMed

    Jiang, Yang; Zhang, Haiyang; Feng, Wei; Tan, Tianwei

    2015-12-28

    Metal ions play an important role in the catalysis of metalloenzymes. To investigate metalloenzymes via molecular modeling, a set of accurate force field parameters for metal ions is highly imperative. To extend its application range and improve the performance, the dummy atom model of metal ions was refined through a simple parameter screening strategy using the Mg(2+) ion as an example. Using the AMBER ff03 force field with the TIP3P model, the refined model accurately reproduced the experimental geometric and thermodynamic properties of Mg(2+). Compared with point charge models and previous dummy atom models, the refined dummy atom model yields an enhanced performance for producing reliable ATP/GTP-Mg(2+)-protein conformations in three metalloenzyme systems with single or double metal centers. Similar to other unbounded models, the refined model failed to reproduce the Mg-Mg distance and favored a monodentate binding of carboxylate groups, and these drawbacks needed to be considered with care. The outperformance of the refined model is mainly attributed to the use of a revised (more accurate) experimental solvation free energy and a suitable free energy correction protocol. This work provides a parameter screening strategy that can be readily applied to refine the dummy atom models for metal ions.

  1. Autohydrolysis Pretreatment of Lignocellulosic Biomass for Bioethanol Production

    NASA Astrophysics Data System (ADS)

    Han, Qiang

    Autohydrolysis, a simple and environmentally friendly process, has long been studied but often abandoned as a financially viable pretreatment for bioethanol production due to the low yields of fermentable sugars at economic enzyme dosages. The introduction of mechanical refining can generate substantial improvements for the autohydrolysis process, making it an attractive pretreatment technology for bioethanol commercialization. In this study, several lignocellulosic biomass feedstocks, including wheat straw, switchgrass, corn stover and waste wheat straw, have been subjected to autohydrolysis pretreatment followed by mechanical refining to evaluate the total sugar recovery at affordable enzyme dosages. Encouraging results have been found: using the autohydrolysis-plus-refining strategy, the total sugar recovery of most feedstocks can be as high as 76% at an enzyme dosage of 4 FPU/g. The mechanical refining contributed to an improvement of the enzymatic sugar yield by as much as 30%. Three non-woody biomass feedstocks (sugarcane bagasse, wheat straw, and switchgrass) and three woody biomass feedstocks (maple, sweet gum, and nitens) have been subjected to autohydrolysis pretreatment to acquire a fundamental understanding of the biomass characteristics that affect autohydrolysis and the subsequent enzymatic hydrolysis. It is of interest to note that the non-woody biomass went through substantial delignification during autohydrolysis compared to woody biomass, due to a significant amount of p-coumaric acid and ferulic acid. It has been found that hardwood with a higher S/V ratio in the lignin structure tends to have a higher total sugar recovery from autohydrolysis pretreatment. The economics of bioethanol production from autohydrolysis of different feedstocks have been investigated. Regardless of feedstock, in the conventional design, producing bioethanol and co-producing steam and power, the minimum ethanol revenues (MER) required to generate a 12% internal rate of return (IRR) are high enough to discourage investors, due to the high capital investment relative to the low US ethanol price. Nevertheless, the economics of autohydrolysis can be substantially improved by upgrading the value of the unhydrolyzed residues, for example as fuel pellets. Moreover, the utilization of proven technology and equipment renders autohydrolysis adaptable to the pulp and paper industry. Attractive economics have been found when an autohydrolysis-based bioethanol plant is co-located with a pulp and paper mill or when a distressed pulp and paper mill is repurposed to produce bioethanol. As an alternative to autohydrolysis combined with refining, a thermomechanical pulping (TMP) process has been evaluated using corn stover as the feedstock. A significantly low solids yield after the pretreatment process was observed, due to the harsh conditions used and the limitations of the lab equipment. However, the TMP process has great potential to be employed as a pretreatment for bioethanol production at industrial scale if the process is optimized.

  2. Influence of dentin pretreatment on bond strength of universal adhesives

    PubMed Central

    Poggio, Claudio; Beltrami, Riccardo; Colombo, Marco; Chiesa, Marco; Scribante, Andrea

    2017-01-01

    Abstract Objective: The purpose of the present study was to compare the bond strength of different universal adhesives under three different testing conditions: when no pretreatment was applied, after 37% phosphoric acid etching and after glycine application. Materials and methods: One hundred and fifty bovine permanent mandibular incisors were used as a substitute for human teeth. Five different universal adhesives were tested: Futurabond M+, Scotchbond Universal, Clearfil Universal Bond, G-Premio BOND, Peak Universal Bond. The adhesive systems were applied following each manufacturer’s instructions. The teeth were randomly assigned to three different dentin surface pretreatments: no pretreatment agent (control), 37% phosphoric acid etching, glycine pretreatment. The specimens were placed in a universal testing machine in order to measure and compare bond strength values. Results: The Kruskal–Wallis analysis of variance and the Mann–Whitney test were applied to assess significant differences among the groups. Dentin pretreatments provided different bond strength values for the adhesives tested, while similar values were registered in groups without dentin pretreatment. Conclusions: In the present report, dentin surface pretreatment did not provide significant differences in shear bond strength values for almost all groups. Acid pretreatment lowered the bond strength values of the Futurabond and Peak Universal adhesives, whereas glycine pretreatment increased the bond strength values of the G-Premio BOND adhesive system. PMID:28642929
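    The nonparametric comparisons named above can be reproduced with standard SciPy calls; the bond-strength values below are hypothetical, not the study's measurements:

    ```python
    from scipy.stats import kruskal, mannwhitneyu

    # hypothetical shear bond strengths (MPa) for one adhesive under the three pretreatments
    control = [22.1, 24.3, 21.8, 23.5, 22.9, 25.0]
    acid_etch = [18.4, 17.9, 19.2, 16.8, 18.8, 17.5]
    glycine = [23.0, 24.8, 22.5, 25.1, 23.7, 24.2]

    h_stat, p_global = kruskal(control, acid_etch, glycine)                       # omnibus test
    u_stat, p_pair = mannwhitneyu(control, acid_etch, alternative="two-sided")    # pairwise follow-up
    print(f"Kruskal-Wallis H={h_stat:.2f}, p={p_global:.4f}; Mann-Whitney U={u_stat:.1f}, p={p_pair:.4f}")
    ```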

  3. Abstraction and Consolidation

    ERIC Educational Resources Information Center

    Monaghan, John; Ozmantar, Mehmet Fatih

    2006-01-01

    The framework for this paper is a recently developed theory of abstraction in context. The paper reports on data collected from one student working on tasks concerned with absolute value functions. It examines the relationship between mathematical constructions and abstractions. It argues that an abstraction is a consolidated construction that can…

  4. The standard of healthcare accreditation standards: a review of empirical research underpinning their development and impact

    PubMed Central

    2012-01-01

    Background Healthcare accreditation standards are advocated as an important means of improving clinical practice and organisational performance. Standard development agencies have documented methodologies to promote open, transparent, inclusive development processes where standards are developed by members. They assert that their methodologies are effective and efficient at producing standards appropriate for the health industry. However, the evidence to support these claims requires scrutiny. The study’s purpose was to examine the empirical research that grounds the development methods and application of healthcare accreditation standards. Methods A multi-method strategy was employed over the period March 2010 to August 2011. Five academic health research databases (Medline, Psych INFO, Embase, Social work abstracts, and CINAHL) were interrogated, the websites of 36 agencies associated with the study topic were investigated, and a snowball search was undertaken. Search criteria included accreditation research studies, in English, addressing standards and their impact. Searching in stage 1 initially selected 9386 abstracts. In stage 2, this selection was refined against the inclusion criteria; empirical studies (n = 2111) were identified and refined to a selection of 140 papers with the exclusion of clinical or biomedical and commentary pieces. These were independently reviewed by two researchers and reduced to 13 articles that met the study criteria. Results The 13 articles were analysed according to four categories: overall findings; standards development; implementation issues; and impact of standards. Studies have only occurred in the acute care setting, predominately in 2003 (n = 5) and 2009 (n = 4), and in the United States (n = 8). A multidisciplinary focus (n = 9) and mixed method approach (n = 11) are common characteristics. Three interventional studies were identified, with the remaining 10 studies having research designs to investigate clinical or organisational impacts. No study directly examined standards development or other issues associated with their progression. Only one study noted implementation issues, identifying several enablers and barriers. Standards were reported to improve organisational efficiency and staff circumstances. However, the impact on clinical quality was mixed, with both improvements and a lack of measurable effects recorded. Conclusion Standards are ubiquitous within healthcare and are generally considered to be an important means by which to improve clinical practice and organisational performance. However, there is a lack of robust empirical evidence examining the development, writing, implementation and impacts of healthcare accreditation standards. PMID:22995152

  5. The standard of healthcare accreditation standards: a review of empirical research underpinning their development and impact.

    PubMed

    Greenfield, David; Pawsey, Marjorie; Hinchcliff, Reece; Moldovan, Max; Braithwaite, Jeffrey

    2012-09-20

    Healthcare accreditation standards are advocated as an important means of improving clinical practice and organisational performance. Standard development agencies have documented methodologies to promote open, transparent, inclusive development processes where standards are developed by members. They assert that their methodologies are effective and efficient at producing standards appropriate for the health industry. However, the evidence to support these claims requires scrutiny. The study's purpose was to examine the empirical research that grounds the development methods and application of healthcare accreditation standards. A multi-method strategy was employed over the period March 2010 to August 2011. Five academic health research databases (Medline, Psych INFO, Embase, Social work abstracts, and CINAHL) were interrogated, the websites of 36 agencies associated with the study topic were investigated, and a snowball search was undertaken. Search criteria included accreditation research studies, in English, addressing standards and their impact. Searching in stage 1 initially selected 9386 abstracts. In stage 2, this selection was refined against the inclusion criteria; empirical studies (n = 2111) were identified and refined to a selection of 140 papers with the exclusion of clinical or biomedical and commentary pieces. These were independently reviewed by two researchers and reduced to 13 articles that met the study criteria. The 13 articles were analysed according to four categories: overall findings; standards development; implementation issues; and impact of standards. Studies have only occurred in the acute care setting, predominately in 2003 (n = 5) and 2009 (n = 4), and in the United States (n = 8). A multidisciplinary focus (n = 9) and mixed method approach (n = 11) are common characteristics. Three interventional studies were identified, with the remaining 10 studies having research designs to investigate clinical or organisational impacts. No study directly examined standards development or other issues associated with their progression. Only one study noted implementation issues, identifying several enablers and barriers. Standards were reported to improve organisational efficiency and staff circumstances. However, the impact on clinical quality was mixed, with both improvements and a lack of measurable effects recorded. Standards are ubiquitous within healthcare and are generally considered to be an important means by which to improve clinical practice and organisational performance. However, there is a lack of robust empirical evidence examining the development, writing, implementation and impacts of healthcare accreditation standards.

  6. Kinetics studies of the reactions of main fourth-period monocations (Ga+, Ge+, As+, and Se+) with methyl fluoride.

    PubMed

    Barrientos, Carmen; Rayón, Víctor Manuel; Largo, Antonio; Sordo, José Ángel; Redondo, Pilar

    2013-08-22

    Theoretical studies of the thermodynamics and kinetics of the gas-phase reactions of fluoromethane with main fourth-period monocations (Ga(+), Ge(+), As(+), and Se(+)) have been carried out. Density functional theory (in particular the mPW1K functional) was employed to describe the potential energy surfaces, and refinement of the energies was done at the CCSD(T) level. The reaction rate constants were estimated using variational/conventional microcanonical transition state theory. From a thermodynamic viewpoint, the fluorine abstraction product is predicted for Ga(+) and Ge(+), whereas for As(+) and Se(+) the elimination product, MCH2(+) (M = As, Se) + HF, is the preferred one. Nevertheless, the most favorable channels for the reactions of CH3F with Ga(+) and Se(+) cations present a net activation barrier. In the case of Ga(+), the reaction proceeds via an addition channel forming the adduct complex, CH3FGa(+), whereas for Se(+) no reaction is found, in agreement with the experiments. The predicted reaction rate constants are in reasonably good agreement with the available experimental values. Apart from the harpoon-like mechanism, our results suggest that an oxidative addition mechanism seems to play a relevant role.
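    In its standard (RRKM-type) form, not restated in the abstract, microcanonical transition state theory gives the energy-resolved rate constant and its thermal average as:

    ```latex
    k(E) = \frac{\sigma\, N^{\ddagger}(E - E_{0})}{h\,\rho(E)}, \qquad
    k(T) = \frac{\int_{0}^{\infty} k(E)\,\rho(E)\, e^{-E/k_{B}T}\, dE}
                {\int_{0}^{\infty} \rho(E)\, e^{-E/k_{B}T}\, dE}
    ```

    where N‡(E − E0) is the sum of states of the transition state above the barrier E0, ρ(E) the density of states of the reactants, σ the reaction path degeneracy, and the thermal rate constant follows by Boltzmann averaging over the energy.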

  7. Finding pathway-modulating genes from a novel Ontology Fingerprint-derived gene network.

    PubMed

    Qin, Tingting; Matmati, Nabil; Tsoi, Lam C; Mohanty, Bidyut K; Gao, Nan; Tang, Jijun; Lawson, Andrew B; Hannun, Yusuf A; Zheng, W Jim

    2014-10-01

    To enhance our knowledge regarding biological pathway regulation, we took an integrated approach, using the biomedical literature, ontologies, network analyses and experimental investigation to infer novel genes that could modulate biological pathways. We first constructed a novel gene network via a pairwise comparison of all yeast genes' Ontology Fingerprints--a set of Gene Ontology terms overrepresented in the PubMed abstracts linked to a gene along with those terms' corresponding enrichment P-values. The network was further refined using a Bayesian hierarchical model to identify novel genes that could potentially influence the pathway activities. We applied this method to the sphingolipid pathway in yeast and found that many top-ranked genes indeed displayed altered sphingolipid pathway functions, initially measured by their sensitivity to myriocin, an inhibitor of de novo sphingolipid biosynthesis. Further experiments confirmed the modulation of the sphingolipid pathway by one of these genes, PFA4, encoding a palmitoyl transferase. Comparative analysis showed that few of these novel genes could be discovered by other existing methods. Our novel gene network provides a unique and comprehensive resource to study pathway modulations and systems biology in general. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
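    One plausible way to score the pairwise comparison of two genes' Ontology Fingerprints (illustrative only; not necessarily the published scoring) is to sum, over shared GO terms, weights derived from the enrichment P-values:

    ```python
    import math

    def fingerprint_similarity(fp_a, fp_b):
        """fp_*: dict mapping GO term -> enrichment P-value for one gene.
        Each shared term contributes the smaller of the two -log10(P) weights."""
        shared = fp_a.keys() & fp_b.keys()
        return sum(min(-math.log10(fp_a[t]), -math.log10(fp_b[t])) for t in shared)

    # hypothetical fingerprints for two yeast genes
    gene1 = {"GO:0006665": 1e-8, "GO:0008610": 1e-5, "GO:0016740": 1e-3}
    gene2 = {"GO:0006665": 1e-6, "GO:0016740": 1e-4, "GO:0005783": 1e-2}
    print(round(fingerprint_similarity(gene1, gene2), 2))
    ```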

  8. Finding pathway-modulating genes from a novel Ontology Fingerprint-derived gene network

    PubMed Central

    Qin, Tingting; Matmati, Nabil; Tsoi, Lam C.; Mohanty, Bidyut K.; Gao, Nan; Tang, Jijun; Lawson, Andrew B.; Hannun, Yusuf A.; Zheng, W. Jim

    2014-01-01

    To enhance our knowledge regarding biological pathway regulation, we took an integrated approach, using the biomedical literature, ontologies, network analyses and experimental investigation to infer novel genes that could modulate biological pathways. We first constructed a novel gene network via a pairwise comparison of all yeast genes’ Ontology Fingerprints—a set of Gene Ontology terms overrepresented in the PubMed abstracts linked to a gene along with those terms’ corresponding enrichment P-values. The network was further refined using a Bayesian hierarchical model to identify novel genes that could potentially influence the pathway activities. We applied this method to the sphingolipid pathway in yeast and found that many top-ranked genes indeed displayed altered sphingolipid pathway functions, initially measured by their sensitivity to myriocin, an inhibitor of de novo sphingolipid biosynthesis. Further experiments confirmed the modulation of the sphingolipid pathway by one of these genes, PFA4, encoding a palmitoyl transferase. Comparative analysis showed that few of these novel genes could be discovered by other existing methods. Our novel gene network provides a unique and comprehensive resource to study pathway modulations and systems biology in general. PMID:25063300

  9. Refinement of the karyological aspects of Psidium guineense (Swartz, 1788): a comparison with Psidium guajava (Linnaeus, 1753)

    PubMed Central

    Marques, Anelise Machado; Tuler, Amélia Carlos; Carvalho, Carlos Roberto; Carrijo, Tatiana Tavares; Ferreira, Marcia Flores da Silva; Clarindo, Wellington Ronildo

    2016-01-01

    Abstract Euploidy plays an important role in the evolution and diversification of Psidium Linnaeus, 1753. However, few data about the nuclear DNA content, chromosome characterization (morphometry and class) and molecular markers have been reported for this genus. In this context, the present study aims to shed light on the genome of Psidium guineense Swartz, 1788, comparing it with Psidium guajava Linnaeus, 1753. Using flow cytometry, the nuclear 2C value of Psidium guineense was 2C = 1.85 picograms (pg), and the karyotype showed 2n = 4x = 44 chromosomes. Thus, Psidium guineense has four chromosome sets, in accordance with the basic chromosome number of Psidium (x = 11). In addition, karyomorphometric analysis revealed morphologically identical chromosome groups in the karyotype of Psidium guineense. The high transferability of microsatellites (98.6%) further corroborates the phylogenetic relationship between Psidium guajava and Psidium guineense. Based on the data regarding nuclear genome size, karyotype morphometry and molecular markers of Psidium guineense and Psidium guajava (2C = 0.95 pg, 2n = 2x = 22 chromosomes), Psidium guineense is a tetraploid species. These data reveal the role of euploidy in the diversification of the genus Psidium. PMID:27186342
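    A small worked check of the ploidy inference above: with the basic chromosome number x = 11, the reported chromosome counts and 2C values give the ploidy level and the per-genome (1Cx) DNA amount directly:

    ```python
    def ploidy_level(chromosome_count_2n, base_number_x=11):
        return chromosome_count_2n // base_number_x

    def monoploid_genome_size_pg(c2_value_pg, chromosome_count_2n, base_number_x=11):
        """1Cx value: DNA amount of one basic chromosome set (2C divided by the ploidy level)."""
        return c2_value_pg / ploidy_level(chromosome_count_2n, base_number_x)

    # values reported in the abstract
    for species, c2, n2 in [("P. guineense", 1.85, 44), ("P. guajava", 0.95, 22)]:
        print(species, f"{ploidy_level(n2)}x, 1Cx = {monoploid_genome_size_pg(c2, n2):.3f} pg")
    ```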

  10. Strategies for carbohydrate model building, refinement and validation

    PubMed Central

    2017-01-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein–sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade. PMID:28177313

  11. Strategies for carbohydrate model building, refinement and validation.

    PubMed

    Agirre, Jon

    2017-02-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein-sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade.

  12. Moral concepts set decision strategies to abstract values.

    PubMed

    Caspers, Svenja; Heim, Stefan; Lucas, Marc G; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl

    2011-04-01

    Persons have different value preferences. Neuroimaging studies where value-based decisions in actual conflict situations were investigated suggest an important role of prefrontal and cingulate brain regions. General preferences, however, reflect a superordinate moral concept independent of actual situations as proposed in psychological and socioeconomic research. Here, the specific brain response would be influenced by abstract value systems and moral concepts. The neurobiological mechanisms underlying such responses are largely unknown. Using functional magnetic resonance imaging (fMRI) with a forced-choice paradigm on word pairs representing abstract values, we show that the brain handles such decisions depending on the person's superordinate moral concept. Persons with a predominant collectivistic (altruistic) value system applied a "balancing and weighing" strategy, recruiting brain regions of rostral inferior and intraparietal, and midcingulate and frontal cortex. Conversely, subjects with mainly individualistic (egocentric) value preferences applied a "fight-and-flight" strategy by recruiting the left amygdala. Finally, if subjects experience a value conflict when rejecting an alternative congruent to their own predominant value preference, comparable brain regions are activated as found in actual moral dilemma situations, i.e., midcingulate and dorsolateral prefrontal cortex. Our results demonstrate that superordinate moral concepts influence the strategy and the neural mechanisms in decision processes, independent of actual situations, showing that decisions are based on general neural principles. These findings provide a novel perspective to future sociological and economic research as well as to the analysis of social relations by focusing on abstract value systems as triggers of specific brain responses.

  13. Moral Concepts Set Decision Strategies to Abstract Values

    PubMed Central

    Caspers, Svenja; Heim, Stefan; Lucas, Marc G.; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl

    2011-01-01

    Persons have different value preferences. Neuroimaging studies where value-based decisions in actual conflict situations were investigated suggest an important role of prefrontal and cingulate brain regions. General preferences, however, reflect a superordinate moral concept independent of actual situations as proposed in psychological and socioeconomic research. Here, the specific brain response would be influenced by abstract value systems and moral concepts. The neurobiological mechanisms underlying such responses are largely unknown. Using functional magnetic resonance imaging (fMRI) with a forced-choice paradigm on word pairs representing abstract values, we show that the brain handles such decisions depending on the person's superordinate moral concept. Persons with a predominant collectivistic (altruistic) value system applied a “balancing and weighing” strategy, recruiting brain regions of rostral inferior and intraparietal, and midcingulate and frontal cortex. Conversely, subjects with mainly individualistic (egocentric) value preferences applied a “fight-and-flight” strategy by recruiting the left amygdala. Finally, if subjects experience a value conflict when rejecting an alternative congruent to their own predominant value preference, comparable brain regions are activated as found in actual moral dilemma situations, i.e., midcingulate and dorsolateral prefrontal cortex. Our results demonstrate that superordinate moral concepts influence the strategy and the neural mechanisms in decision processes, independent of actual situations, showing that decisions are based on general neural principles. These findings provide a novel perspective to future sociological and economic research as well as to the analysis of social relations by focusing on abstract value systems as triggers of specific brain responses. PMID:21483767

  14. The Role of Motion Concepts in Understanding Non-Motion Concepts

    PubMed Central

    Khatin-Zadeh, Omid; Banaruee, Hassan; Khoshsima, Hooshang; Marmolejo-Ramos, Fernando

    2017-01-01

    This article discusses a specific type of metaphor in which an abstract non-motion domain is described in terms of a motion event. Abstract non-motion domains are inherently different from concrete motion domains. However, motion domains are used to describe abstract non-motion domains in many metaphors. Three main reasons are suggested for the suitability of motion events in such metaphorical descriptions. Firstly, motion events usually have high degrees of concreteness. Secondly, motion events are highly imageable. Thirdly, components of any motion event can be imagined almost simultaneously within a three-dimensional space. These three characteristics make motion events suitable domains for describing abstract non-motion domains, and facilitate the process of online comprehension throughout language processing. Extending the main point into the field of mathematics, this article discusses the process of transforming abstract mathematical problems into imageable geometric representations within the three-dimensional space. This strategy is widely used by mathematicians to solve highly abstract and complex problems. PMID:29240715

  15. Crystal structure of the mutant D52S hen egg white lysozyme with an oligosaccharide product.

    PubMed

    Hadfield, A T; Harvey, D J; Archer, D B; MacKenzie, D A; Jeenes, D J; Radford, S E; Lowe, G; Dobson, C M; Johnson, L N

    1994-11-11

The crystal structure of a mutant hen egg white lysozyme, in which the key catalytic residue aspartic acid 52 has been changed to a serine residue (D52S HEWL), has been determined and refined to a crystallographic R value of 0.173 for all data F > 0 between 8 and 1.9 A resolution. The D52S HEWL structure is very similar to the native HEWL structure (r.m.s. deviation of main-chain atoms 0.20 A). Small shifts that result from the change in hydrogen bonding pattern on substitution of Asp by Ser were observed in the loop between beta-strands in the region of residues 46 to 49. D52S HEWL exhibits less than 1% activity against the bacterial cell wall substrate. Cocrystallisation experiments with the hexasaccharide substrate beta(1-4) polymer of N-acetyl-D-glucosamine (GlcNAc6) resulted in crystals between 5 days and 14 days after the initial mixing of enzyme and substrate. Analysis by laser absorption mass spectrometry of the oligosaccharides present after incubation with native and D52S HEWL under conditions similar to those used for crystal growth showed that after 14 days with native HEWL complete catalysis to GlcNAc3, GlcNAc2 and GlcNAc had occurred, but with D52S HEWL only partial catalysis to the major products GlcNAc4 and GlcNAc2 had occurred and at least 50% of the GlcNAc6 remained intact. X-ray analysis of the D52S-oligosaccharide complex crystals showed that they contained the product GlcNAc4. The structure of the D52S HEWL-GlcNAc4 complex has been determined and refined to an R value of 0.160 for data between 8 and 2 A resolution. GlcNAc4 occupies sites A to D in the active site cleft. Careful refinement and examination of 2Fo-Fc electron density maps showed that the sugar in site D has the sofa conformation, a conformation previously observed with the HEWL complex with the tetra-N-acetylglucosamine lactone transition state analogue, the HEWL complex with the cell wall trisaccharide and the phage T4 lysozyme complex with a cell wall product. The semi-axial C(5)-C(6) geometry of the sofa is stabilised by hydrogen bonds from the O-6 hydroxyl group to the main-chain N of Val109 and main-chain O of Ala107. The sugar in site D adopts the alpha configuration, seemingly in conflict with the observation that the hydrolysis of the beta(1-4) glycosidic linkage by HEWL proceeds with 99.9% retention of beta-configuration. (ABSTRACT TRUNCATED AT 400 WORDS)

  16. White Whole-Wheat Flour Can Be Partially Substituted for Refined-Wheat Flour in Pizza Crust in School Meals without Affecting Consumption

    ERIC Educational Resources Information Center

    Chan, Hing Wan; Burgess Champoux, Teri; Reicks, Marla; Vickers, Zata; Marquart, Len

    2008-01-01

    Objectives: Recent dietary guidance recommends that children consume at least three servings of whole-grains daily. This study examined whether white whole-wheat (WWW) flour can be partially substituted for refined-wheat (RW) flour in pizza crust without affecting consumption by children in a school cafeteria. Methods: Subjects included first to…

  17. Report of a Meeting on Contemporary Topics in Zebrafish Husbandry and Care

    PubMed Central

    Osborne, Nikki; Paull, Gregory; Grierson, Adam; Dunford, Karen; Busch-Nentwich, Elisabeth M.; Sneddon, Lynne U.; Wren, Natalie; Higgins, Joe

    2016-01-01

    Abstract A meeting on Contemporary Topics in Zebrafish Husbandry and Care was held in the United Kingdom in 2014, with the aim of providing a discussion forum for researchers, animal technologists, and veterinarians from academia and industry to share good practice and exchange ideas. Presentation topics included protocols for optimal larval rearing, implementing the 3Rs (replacement, reduction, and refinement) in large-scale colony management, and environmental enrichment. The audience also participated in a survey of current practice relating to practical husbandry, cryopreservation, and the provision of enrichment. PMID:27537782

  18. Principles and guidelines for good practice in Indigenous engagement in water planning

    NASA Astrophysics Data System (ADS)

    Jackson, Sue; Tan, Poh-Ling; Mooney, Carla; Hoverman, Suzanne; White, Ian

    2012-12-01

Indigenous rights, values and interests relating to water have been identified by Australia's National Water Commission as a national priority area, requiring greater understanding, research attention and government action. Yet Indigenous water values are rarely addressed in water planning, despite objectives in national policy requiring Indigenous participation and the identification of Indigenous social, spiritual and customary values in water plans. Water planners are presently equipped with a very limited number of engagement tools tailored to the water resource management context to redress the historical neglect of Indigenous interests. In an Australian research project focused on water planning, seven participatory planning tools were employed in three Australian case studies with different social and hydrological characteristics to improve the way in which Indigenous values are elicited and incorporated and to enhance the status of Indigenous knowledge in water planning. The results from the two Murray Darling Basin (MDB) case studies reveal the many ways in which Indigenous values have been adversely affected by recent water resource developments and concomitant water scarcity. In the third case on the Tiwi Islands in the Northern Territory, where land title to the entire water planning area is vested in Indigenous traditional owners, methods were refined to ensure engagement and generate capacity to manage the development of a solely Indigenous-owned, first-generation Water Management Strategy, in collaboration with a range of stakeholders. This paper describes the needs and aspirations of Indigenous people, the engagement strategies employed to elicit Indigenous knowledge, assess Indigenous values, and incorporate the results into three developing water plans. In addition, it outlines a set of general principles to guide water planning in other regions and thereby to improve Indigenous access to water.

  19. A reconstruction theorem for Connes-Landi deformations of commutative spectral triples

    NASA Astrophysics Data System (ADS)

    Ćaćić, Branimir

    2015-12-01

    We formulate and prove an extension of Connes's reconstruction theorem for commutative spectral triples to so-called Connes-Landi or isospectral deformations of commutative spectral triples along the action of a compact Abelian Lie group G, also known as toric noncommutative manifolds. In particular, we propose an abstract definition for such spectral triples, where noncommutativity is entirely governed by a deformation parameter sitting in the second group cohomology of the Pontryagin dual of G, and then show that such spectral triples are well-behaved under further Connes-Landi deformation, thereby allowing for both quantisation from and dequantisation to G-equivariant abstract commutative spectral triples. We then use a refinement of the Connes-Dubois-Violette splitting homomorphism to conclude that suitable Connes-Landi deformations of commutative spectral triples by a rational deformation parameter are almost-commutative in the general, topologically non-trivial sense.

  20. Refining Housing, Husbandry and Care for Animals Used in Studies Involving Biotelemetry

    PubMed Central

    Hawkins, Penny

    2014-01-01

    Simple Summary Biotelemetry, the remote detection and measurement of an animal function or activity, is widely used in animal research. Biotelemetry devices transmit physiological or behavioural data and may be surgically implanted into animals, or externally attached. This can help to reduce animal numbers and improve welfare, e.g., if animals can be group housed and move freely instead of being tethered to a recording device. However, biotelemetry can also cause pain and distress to animals due to surgery, attachment, single housing and long term laboratory housing. This article explains how welfare and science can be improved by avoiding or minimising these harms. Abstract Biotelemetry can contribute towards reducing animal numbers and suffering in disciplines including physiology, pharmacology and behavioural research. However, the technique can also cause harm to animals, making biotelemetry a ‘refinement that needs refining’. Current welfare issues relating to the housing and husbandry of animals used in biotelemetry studies are single vs. group housing, provision of environmental enrichment, long term laboratory housing and use of telemetered data to help assess welfare. Animals may be singly housed because more than one device transmits on the same wavelength; due to concerns regarding damage to surgical sites; because they are wearing exteriorised jackets; or if monitoring systems can only record from individually housed animals. Much of this can be overcome by thoughtful experimental design and surgery refinements. Similarly, if biotelemetry studies preclude certain enrichment items, husbandry refinement protocols can be adapted to permit some environmental stimulation. Nevertheless, long-term laboratory housing raises welfare concerns and maximum durations should be defined. Telemetered data can be used to help assess welfare, helping to determine endpoints and refine future studies. The above measures will help to improve data quality as well as welfare, because experimental confounds due to physiological and psychological stress will be minimised. PMID:26480045

  1. Discontinuous and continuous hardening processes in calcium and calcium—tin micro-alloyed lead: influence of 'secondary-lead' impurities

    NASA Astrophysics Data System (ADS)

    Bouirden, L.; Hilger, J. P.; Hertz, J.

Different transformations in lead-calcium and lead-calcium-tin alloys are observed with various complementary techniques such as anisothermal microcalorimetry, optical and electron microscopy, and hardness measurements. Three alloy states are studied: as-cast, rehomogenised/water-quenched, and rehomogenised/air-cooled. With binary lead-calcium alloys, three successive discontinuous transformations are observed, namely: an initial and complete discontinuous transformation with regular movement of the reaction front; a second and incomplete discontinuous transformation (puzzle-shaped); and a third and incomplete discontinuous transformation with precipitation of Pb₃Ca. The role of secondary-lead impurities is complex: Ag reduces and Bi accelerates the rate of the discontinuous reaction, while Al refines the grain size. Lead-calcium-tin alloys are characterized by the Sn/Ca ratio. For very small values of this ratio, the hardening is similar to that of lead-calcium alloys. For high ratio values, the hardening takes place after an incubation period and proceeds via a continuous micro-precipitation of the (Pb,Sn)₃Ca compound. For intermediate ratios, the different processes are able to operate separately in sequence. Ag increases the rate of the continuous precipitation and reduces the incubation time. No significant effects are observed with Bi or Al. The kinetic laws of the different transformations are presented and values for the activation energy are determined.

  2. Numerical assessment of bone remodeling around conventionally and early loaded titanium and titanium-zirconium alloy dental implants.

    PubMed

    Akça, Kıvanç; Eser, Atılım; Çavuşoğlu, Yeliz; Sağırkaya, Elçin; Çehreli, Murat Cavit

    2015-05-01

    The aim of this study was to investigate conventionally and early loaded titanium and titanium-zirconium alloy implants by three-dimensional finite element stress analysis. Three-dimensional model of a dental implant was created and a thread area was established as a region of interest in trabecular bone to study a localized part of the global model with a refined mesh. The peri-implant tissues around conventionally loaded (model 1) and early loaded (model 2) implants were implemented and were used to explore principal stresses, displacement values, and equivalent strains in the peri-implant region of titanium and titanium-zirconium implants under static load of 300 N with or without 30° inclination applied on top of the abutment surface. Under axial loading, principal stresses in both models were comparable for both implants and models. Under oblique loading, principal stresses around titanium-zirconium implants were slightly higher in both models. Comparable stress magnitudes were observed in both models. The displacement values and equivalent strain amplitudes around both implants and models were similar. Peri-implant bone around titanium and titanium-zirconium implants experiences similar stress magnitudes coupled with intraosseous implant displacement values under conventional loading and early loading simulations. Titanium-zirconium implants have biomechanical outcome comparable to conventional titanium implants under conventional loading and early loading.

  3. From deep TLS validation to ensembles of atomic models built from elemental motions. II. Analysis of TLS refinement results by explicit interpretation

    DOE PAGES

    Afonine, Pavel V.; Adams, Paul D.; Urzhumtsev, Alexandre

    2018-06-08

TLS modelling was developed by Schomaker and Trueblood to describe atomic displacement parameters through concerted (rigid-body) harmonic motions of an atomic group [Schomaker & Trueblood (1968), Acta Cryst. B 24, 63–76]. The results of a TLS refinement are T, L and S matrices that provide individual anisotropic atomic displacement parameters (ADPs) for all atoms belonging to the group. These ADPs can be calculated analytically using a formula that relates the elements of the TLS matrices to atomic parameters. Alternatively, ADPs can be obtained numerically from the parameters of concerted atomic motions corresponding to the TLS matrices. Both procedures are expected to produce the same ADP values and therefore can be used to assess the results of TLS refinement. Here, the implementation of this approach in PHENIX is described and several illustrations, including the use of all models from the PDB that have been subjected to TLS refinement, are provided.

  4. From deep TLS validation to ensembles of atomic models built from elemental motions. II. Analysis of TLS refinement results by explicit interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afonine, Pavel V.; Adams, Paul D.; Urzhumtsev, Alexandre

TLS modelling was developed by Schomaker and Trueblood to describe atomic displacement parameters through concerted (rigid-body) harmonic motions of an atomic group [Schomaker & Trueblood (1968), Acta Cryst. B 24, 63–76]. The results of a TLS refinement are T, L and S matrices that provide individual anisotropic atomic displacement parameters (ADPs) for all atoms belonging to the group. These ADPs can be calculated analytically using a formula that relates the elements of the TLS matrices to atomic parameters. Alternatively, ADPs can be obtained numerically from the parameters of concerted atomic motions corresponding to the TLS matrices. Both procedures are expected to produce the same ADP values and therefore can be used to assess the results of TLS refinement. Here, the implementation of this approach in PHENIX is described and several illustrations, including the use of all models from the PDB that have been subjected to TLS refinement, are provided.
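
    As a hedged illustration of the analytical route these two records describe, the sketch below evaluates the Schomaker-Trueblood relation U = T + A·L·Aᵀ + A·S + Sᵀ·Aᵀ for a single atom, where A is the antisymmetric matrix built from the atom's coordinates relative to the TLS origin. The matrices and coordinates are invented example values, not data from the paper, and sign conventions vary between references.

    ```python
    import numpy as np

    def adp_from_tls(T, L, S, xyz, origin=(0.0, 0.0, 0.0)):
        """Anisotropic ADP (3x3 U matrix) for one atom from TLS matrices.

        Evaluates U = T + A L A^T + A S + S^T A^T, with A the antisymmetric
        matrix of the atom's position relative to the TLS origin. The sign
        convention for A (and hence the S cross-terms) differs between
        references, so check against the target refinement program.
        """
        x, y, z = np.asarray(xyz, dtype=float) - np.asarray(origin, dtype=float)
        A = np.array([[0.0,   z,  -y],
                      [ -z, 0.0,   x],
                      [  y,  -x, 0.0]])
        return T + A @ L @ A.T + A @ S + S.T @ A.T

    # Made-up example matrices: T in A^2, L in rad^2, S in rad*A (not from the paper).
    T = np.diag([0.05, 0.04, 0.06])
    L = np.diag([0.002, 0.001, 0.003])
    S = np.zeros((3, 3))
    print(np.round(adp_from_tls(T, L, S, xyz=(5.0, 2.0, -1.0)), 4))
    ```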

  5. Interactional Metadiscourse in Research Article Abstracts

    ERIC Educational Resources Information Center

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  6. Adaptively-refined overlapping grids for the numerical solution of systems of hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Brislawn, Kristi D.; Brown, David L.; Chesshire, Geoffrey S.; Saltzman, Jeffrey S.

    1995-01-01

    Adaptive mesh refinement (AMR) in conjunction with higher-order upwind finite-difference methods have been used effectively on a variety of problems in two and three dimensions. In this paper we introduce an approach for resolving problems that involve complex geometries in which resolution of boundary geometry is important. The complex geometry is represented by using the method of overlapping grids, while local resolution is obtained by refining each component grid with the AMR algorithm, appropriately generalized for this situation. The CMPGRD algorithm introduced by Chesshire and Henshaw is used to automatically generate the overlapping grid structure for the underlying mesh.

  7. Localization, Localization, Localization

    NASA Technical Reports Server (NTRS)

    Parker, T.; Malin, M.; Golombek, M.; Duxbury, T.; Johnson, A.; Guinn, J.; McElrath, T.; Kirk, R.; Archinal, B.; Soderblom, L.

    2004-01-01

    Localization of the two Mars Exploration Rovers involved three independent approaches to place the landers with respect to the surface of Mars and to refine the location of those points on the surface with the Mars control net: 1) Track the spacecraft through entry, descent, and landing, then refine the final roll stop position by radio tracking and comparison to images taken during descent; 2) Locate features on the horizon imaged by the two rovers and compare them to the MOC and THEMIS VIS images, and the DIMES images on the two MER landers; and 3) 'Check' and refine locations by acquisition of MOC 1.5 meter and 50 cm/pixel images.

  8. Assessment of incidence of severe sepsis in Sweden using different ways of abstracting International Classification of Diseases codes: difficulties with methods and interpretation of results.

    PubMed

    Wilhelms, Susanne B; Huss, Fredrik R; Granath, Göran; Sjöberg, Folke

    2010-06-01

    To compare three International Classification of Diseases code abstraction strategies that have previously been reported to mirror severe sepsis by examining retrospective Swedish national data from 1987 to 2005 inclusive. Retrospective cohort study. Swedish hospital discharge database. All hospital admissions during the period 1987 to 2005 were extracted and these patients were screened for severe sepsis using the three International Classification of Diseases code abstraction strategies, which were adapted for the Swedish version of the International Classification of Diseases. Two code abstraction strategies included both International Classification of Diseases, Ninth Revision and International Classification of Diseases, Tenth Revision codes, whereas one included International Classification of Diseases, Tenth Revision codes alone. None. The three International Classification of Diseases code abstraction strategies identified 37,990, 27,655, and 12,512 patients, respectively, with severe sepsis. The incidence increased over the years, reaching 0.35 per 1000, 0.43 per 1000, and 0.13 per 1000 inhabitants, respectively. During the International Classification of Diseases, Ninth Revision period, we found 17,096 unique patients and of these, only 2789 patients (16%) met two of the code abstraction strategy lists and 14,307 (84%) met one list. The International Classification of Diseases, Tenth Revision period included 46,979 unique patients, of whom 8% met the criteria of all three International Classification of Diseases code abstraction strategies, 7% met two, and 84% met one only. The three different International Classification of Diseases code abstraction strategies generated three almost separate cohorts of patients with severe sepsis. Thus, the International Classification of Diseases code abstraction strategies for recording severe sepsis in use today provides an unsatisfactory way of estimating the true incidence of severe sepsis. Further studies relating International Classification of Diseases code abstraction strategies to the American College of Chest Physicians/Society of Critical Care Medicine scores are needed.
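
    To make the comparison concrete, here is a minimal, hypothetical sketch of how code-abstraction strategies of this kind can be applied to discharge records and their cohort overlap quantified. The code sets, record structure and patient identifiers below are invented placeholders, not the strategies or data used in the study.

    ```python
    # Screen discharge records against three ICD code lists ("abstraction strategies")
    # and count how many strategies each identified patient satisfies.
    strategies = {
        "A": {"R65.1", "A41.9", "R57.2"},   # invented placeholder code set
        "B": {"A41.9", "J96.0", "N17.9"},   # invented placeholder code set
        "C": {"R65.1", "R57.2"},            # invented placeholder code set
    }

    # Each record: (patient_id, set of ICD codes on the discharge abstract).
    records = [
        ("p1", {"A41.9", "N17.9"}),
        ("p2", {"R65.1"}),
        ("p3", {"J18.9"}),
        ("p4", {"R57.2", "J96.0"}),
    ]

    cohorts = {
        name: {pid for pid, codes in records if codes & code_set}
        for name, code_set in strategies.items()
    }

    identified = set().union(*cohorts.values())
    for pid in sorted(identified):
        hits = sum(pid in cohort for cohort in cohorts.values())
        print(pid, "matched", hits, "of", len(cohorts), "strategies")
    ```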

  9. Towards solution and refinement of organic crystal structures by fitting to the atomic pair distribution function.

    PubMed

    Prill, Dragica; Juhás, Pavol; Billinge, Simon J L; Schmidt, Martin U

    2016-01-01

    A method towards the solution and refinement of organic crystal structures by fitting to the atomic pair distribution function (PDF) is developed. Approximate lattice parameters and molecular geometry must be given as input. The molecule is generally treated as a rigid body. The positions and orientations of the molecules inside the unit cell are optimized starting from random values. The PDF is obtained from carefully measured X-ray powder diffraction data. The method resembles `real-space' methods for structure solution from powder data, but works with PDF data instead of the diffraction pattern itself. As such it may be used in situations where the organic compounds are not long-range-ordered, are poorly crystalline, or nanocrystalline. The procedure was applied to solve and refine the crystal structures of quinacridone (β phase), naphthalene and allopurinol. In the case of allopurinol it was even possible to successfully solve and refine the structure in P1 with four independent molecules. As an example of a flexible molecule, the crystal structure of paracetamol was refined using restraints for bond lengths, bond angles and selected torsion angles. In all cases, the resulting structures are in excellent agreement with structures from single-crystal data.
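
    The fitting loop described here can be sketched generically: treat each molecule as a rigid body, parametrise its position and orientation, and minimise the misfit between a calculated and the measured PDF. The sketch below is a simplified illustration of that optimisation structure, not the authors' implementation; the toy `pdf_from_structure()` helper stands in for a real PDF calculator, and the molecule and target data are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.spatial.transform import Rotation

    def pdf_from_structure(positions, r_grid):
        """Toy stand-in for a real PDF calculator: Gaussian peak per atom pair."""
        g = np.zeros_like(r_grid)
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                d = np.linalg.norm(positions[i] - positions[j])
                g += np.exp(-0.5 * ((r_grid - d) / 0.1) ** 2)
        return g

    def place_rigid_body(base_coords, params):
        """Apply translation (first 3 params) and rotation vector (last 3) to a rigid molecule."""
        t, rotvec = params[:3], params[3:6]
        return Rotation.from_rotvec(rotvec).apply(base_coords) + t

    def misfit(params, base_coords, r_grid, g_obs):
        g_calc = pdf_from_structure(place_rigid_body(base_coords, params), r_grid)
        return np.sum((g_obs - g_calc) ** 2)

    # Toy molecule and "observed" PDF; random starting position/orientation.
    rng = np.random.default_rng(0)
    mol = np.array([[0.0, 0.0, 0.0], [1.4, 0.0, 0.0], [0.7, 1.2, 0.0]])
    r = np.linspace(0.5, 10.0, 400)
    g_target = pdf_from_structure(mol + np.array([2.0, 1.0, 0.0]), r)

    start = np.concatenate([rng.uniform(-3, 3, 3), rng.uniform(-0.5, 0.5, 3)])
    result = minimize(misfit, start, args=(mol, r, g_target), method="Nelder-Mead")
    print("converged:", result.success, "residual:", round(result.fun, 4))
    ```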

  10. Accurate green water loads calculation using naval hydro pack

    NASA Astrophysics Data System (ADS)

    Jasak, H.; Gatin, I.; Vukčević, V.

    2017-12-01

An extensive verification and validation of the finite-volume-based CFD software Naval Hydro, built on foam-extend, is presented in this paper for green water loads. A two-phase numerical model with advanced methods for treating the free surface is employed. Pressure loads on the horizontal deck of a Floating Production Storage and Offloading (FPSO) vessel model are compared to experimental results from [1] for three incident regular waves. Pressure peaks and integrals of pressure in time are measured at ten different locations on deck for each case. Pressure peaks and integrals are evaluated as average values over the measured incident wave periods, and the periodic uncertainty is assessed for both numerical and experimental results. A spatial and temporal discretization refinement study is performed, providing numerical discretization uncertainties.

  11. Locally refined block-centred finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are: (a) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed, and (b) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods and the effect of the accuracy of sensitivity calculations are evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  12. Locally refined block-centered finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling and predictions

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are (1) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed and (2) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods and the effect of the accuracy of sensitivity calculations are evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  13. Performance Comparison of Al-Ti Master Alloys with Different Microstructures in Grain Refinement of Commercial Purity Aluminum.

    PubMed

    Ding, Wanwu; Xia, Tiandong; Zhao, Wenjun

    2014-05-07

Three types of Al-5Ti master alloys were synthesized by thermal explosion reaction in pure molten aluminum. The performance of the Al-5Ti master alloys in grain refinement of commercial purity Al was compared for different additions (0.6%, 1.0%, 1.6%, 2.0%, and 3.0%) and holding times (10, 30, 60 and 120 min). The results show that the Al-5Ti master alloy with blocky TiAl₃ particles clearly has better refining efficiency than the master alloy with mixed TiAl₃ particles and the master alloy with needle-like TiAl₃ particles. The structures of the master alloys, differing in the sizes, morphologies and quantities of the TiAl₃ crystals, were found to affect how the grain refining properties change with holding time. The grain refinement effect decreased markedly for master alloys with needle-like TiAl₃ crystals, and showed a further significant improvement at longer holding times for the master alloy containing both larger needle-like and blocky TiAl₃ particles. For the master alloy with finer blocky particles, the grain refining effect did not obviously decrease over the whole studied range of holding times.

  14. Semi-experimental equilibrium structure of pyrazinamide from gas-phase electron diffraction. How much experimental is it?

    NASA Astrophysics Data System (ADS)

    Tikhonov, Denis S.; Vishnevskiy, Yury V.; Rykov, Anatolii N.; Grikina, Olga E.; Khaikin, Leonid S.

    2017-03-01

A semi-experimental equilibrium structure of free molecules of pyrazinamide has been determined for the first time using the gas electron diffraction method. The refinement was carried out using regularization of the geometry by calculated quantum chemical parameters. The extent to which the final structure is experimental is discussed. A numerical approach for estimating the amount of experimental information in the refined parameters is suggested. The following values of selected internuclear distances were determined (values are in Å with 1σ in parentheses): re(Cpyrazine-Cpyrazine)av = 1.397(2), re(Npyrazine-Cpyrazine)av = 1.332(3), re(Cpyrazine-Camide) = 1.493(1), re(Namide-Camide) = 1.335(2), re(Oamide-Camide) = 1.219(1). The given standard deviations represent purely experimental uncertainties without the influence of regularization.

  15. Editorial: Bayesian benefits for child psychology and psychiatry researchers.

    PubMed

    Oldehinkel, Albertine J

    2016-09-01

For many scientists, performing statistical tests has become an almost automated routine. However, p-values are frequently used and interpreted incorrectly; and even when used appropriately, p-values tend to provide answers that do not match researchers' questions and hypotheses well. Bayesian statistics present an elegant and often more suitable alternative. The Bayesian approach has rarely been applied in child psychology and psychiatry research so far, but the development of user-friendly software packages and tutorials has placed it well within reach now. Because Bayesian analyses require a more refined definition of hypothesized probabilities of possible outcomes than the classical approach, going Bayesian may offer the additional benefit of sparking the development and refinement of theoretical models in our field. © 2016 Association for Child and Adolescent Mental Health.

  16. Differentiation of refined and virgin edible oils by means of the trans- and cis-phytol isomer distribution.

    PubMed

    Vetter, Walter; Schröder, Markus; Lehnert, Katja

    2012-06-20

    The differentiation of nonrefined (e.g., cold-pressed) and refined edible oils is an important task in food control because of the higher commercial value of the former. Here, we explored the suitability of the relative abundance of cis-phytol as a marker for authentication of nonrefined edible oils. Phytol, the tetramethyl-branched, monoenoic alcohol, is found widespread in nature as a part of chlorophyll. In chlorophyll, only trans-phytol is found. In this study, we present a method for the analysis of the phytol isomers, considering that traces of cis-phytol (contributing 0.1% to the phytol content) can be determined next to trans-phytol. For this purpose, phytol was gathered with the unsaponifiable matter from the oil, trimethylsilylated, and analyzed by gas chromatography coupled to mass spectrometry. With this method, 27 samples of edible oils (16 refined and 11 nonrefined edible oils) were analyzed for the abundance of cis-phytol relative to trans-phytol. In the nonrefined oils (e.g., olive oil, rapeseed oil, maize oil, and sunflower oil), cis-phytol contributed 0.1% (n = 3) or less (n = 8) to the phytol content. In contrast, the refined olive oils (n = 4) contained a share of 1.3-3% cis-phytol; the refined rapeseed oil (n = 3) contained a share of 0.7-1.0% cis-phytol; and the refined sunflower oil (n = 4) contained a share of 0.3-0.9% cis-phytol. Only one refined pomegranate kernel did not contain cis-phytol. The phytol concentration was not suited to distinguish nonrefined from refined oils. In contrast, our data suggest that the virtual absence of cis-phytol can be used as a marker for nonrefined (e.g., cold-pressed) edible oils.

  17. Structure Refinement of Protein Low Resolution Models Using the GNEIMO Constrained Dynamics Method

    PubMed Central

Park, In-Hee; Gangupomu, Vamshi; Wagner, Jeffrey; Jain, Abhinandan; Vaidehi, Nagarajan

    2012-01-01

    The challenge in protein structure prediction using homology modeling is the lack of reliable methods to refine the low resolution homology models. Unconstrained all-atom molecular dynamics (MD) does not serve well for structure refinement due to its limited conformational search. We have developed and tested the constrained MD method, based on the Generalized Newton-Euler Inverse Mass Operator (GNEIMO) algorithm for protein structure refinement. In this method, the high-frequency degrees of freedom are replaced with hard holonomic constraints and a protein is modeled as a collection of rigid body clusters connected by flexible torsional hinges. This allows larger integration time steps and enhances the conformational search space. In this work, we have demonstrated the use of a constraint free GNEIMO method for protein structure refinement that starts from low-resolution decoy sets derived from homology methods. In the eight proteins with three decoys for each, we observed an improvement of ~2 Å in the RMSD to the known experimental structures of these proteins. The GNEIMO method also showed enrichment in the population density of native-like conformations. In addition, we demonstrated structural refinement using a “Freeze and Thaw” clustering scheme with the GNEIMO framework as a viable tool for enhancing localized conformational search. We have derived a robust protocol based on the GNEIMO replica exchange method for protein structure refinement that can be readily extended to other proteins and possibly applicable for high throughput protein structure refinement. PMID:22260550

  18. A Community-Centered Astronomy Research Program (Abstract)

    NASA Astrophysics Data System (ADS)

    Boyce, P.; Boyce, G.

    2017-12-01

(Abstract only) The Boyce Research Initiatives and Education Foundation (BRIEF) is providing semester-long, hands-on, astronomy research experiences for students of all ages that result in their publishing peer-reviewed papers. The course in astronomy and double star research has evolved from a face-to-face learning experience with two instructors to an online hybrid course that simultaneously supports classroom instruction at a variety of schools in the San Diego area. Currently, there are over 65 students enrolled in three community colleges, seven high schools, and one university as well as individual adult learners. Instructional experience, courseware, and supporting systems were developed and refined through experience gained in classroom settings from 2014 through 2016. Topics of instruction include Kepler's Laws, basic astrometry, properties of light, CCD imaging, use of filters for varying stellar spectral types, and how to perform research, scientific writing, and proposal preparation. Volunteer instructors were trained by taking the course and producing their own research papers. An expanded program was launched in the fall semester of 2016. Twelve papers from seven schools were produced; eight have been accepted for publication by the Journal of Double Star Observations (JDSO) and the remainder are in peer review. Three additional papers have been accepted by the JDSO and two more are in process. Three college professors and five advanced amateur astronomers are now qualified volunteer instructors. Supporting tools are provided by a BRIEF server and other online services. The server-based tools range from Microsoft Office and planetarium software to top-notch imaging programs and computational software for data reduction for each student team. Observations are performed by robotic telescopes worldwide supported by BRIEF. With this success, student demand has increased significantly. Many of the graduates of the first semester course wanted to expand their astronomy knowledge and experience. To answer this demand, BRIEF is developing additional astronomy research courses with partners in advanced astrometry, photometry, and exoplanets. The program provides a significant opportunity for schools, teachers, and advanced amateur astronomers to introduce high school and college

  19. Catholic/Jesuit Values in an Introductory Religious Studies Course

    ERIC Educational Resources Information Center

    Lynch, Patrick; S. J.; Mizak, Pat

    2012-01-01

    A growing interest in the communication to students of the mission and identity of a higher education institution prompted this study about the presence of Catholic, Jesuit values in the introductory religious studies course at a faith-based university. To conduct this study a survey instrument was developed, piloted, further refined, and then…

  20. MICROREFINING OF WASTE GLYCEROL FOR THE PRODUCTION OF A VALUE-ADDED PRODUCT

    EPA Science Inventory

    As a result of Phase I, a process to refine crude glycerin waste to value-added products was designed. An economic analysis was performed to determine the capital and operating costs for a commercial facility that implements this design. Using the estimated 1,800 gallons of ra...

  1. Engineering Education on the "Fuzzy" Front End: A High-Technology Entrepreneurship Model

    ERIC Educational Resources Information Center

    Crawford, G. P.; Broer, D. J.; Bastiaansen, C. W. M.

    2006-01-01

We have developed a university entrepreneurship program, culminating in a prototype and business plan, with five objectives: (1) value for all academic participants; (2) introduction to engineering issues for students; (3) refinement of engineering students' "soft" skills; (4) training for students in creating value out of embryonic ideas; and (5)…

  2. An effective ostrich oil bleaching technique using peroxide value as an indicator.

    PubMed

    Palanisamy, Uma Devi; Sivanathan, Muniswaran; Radhakrishnan, Ammu Kutty; Haleagrahara, Nagaraja; Subramaniam, Thavamanithevi; Chiew, Gan Seng

    2011-07-05

    Ostrich oil has been used extensively in the cosmetic and pharmaceutical industries. However, rancidity causes undesirable chemical changes in flavour, colour, odour and nutritional value. Bleaching is an important process in refining ostrich oil. Bleaching refers to the removal of certain minor constituents (colour pigments, free fatty acid, peroxides, odour and non-fatty materials) from crude fats and oils to yield purified glycerides. There is a need to optimize the bleaching process of crude ostrich oil prior to its use for therapeutic purposes. The objective of our study was to establish an effective method to bleach ostrich oil using peroxide value as an indicator of refinement. In our study, we showed that natural earth clay was better than bentonite and acid-activated clay to bleach ostrich oil. It was also found that 1 hour incubation at a 150 °C was suitable to lower peroxide value by 90%. In addition, the nitrogen trap technique in the bleaching process was as effective as the continuous nitrogen flow technique and as such would be the recommended technique due to its cost effectiveness.

  3. Adherence to outpatient epilepsy quality indicators at a tertiary epilepsy center

    PubMed Central

    Pourdeyhimi, R.; Wolf, B.J.; Simpson, A.N.; Martz, G.U.

    2014-01-01

    Introduction Quality indicators for the treatment of people with epilepsy were published in 2010. This is the first report of adherence to all measures in routine care of people with epilepsy at a level 4 comprehensive epilepsy center in the US. Methods Two hundred patients with epilepsy were randomly selected from the clinics of our comprehensive epilepsy center, and all visits during 2011 were abstracted for documentation of adherence to the eight quality indicators. Alternative measures were constructed to evaluate failure of adherence. Detailed descriptions of all equations are provided. Results Objective measures (EEG, imaging) showed higher adherence than counseling measures (safety). Initial visits showed higher adherence. Variations in the interpretation of the quality measure result in different adherence values. Advanced practice providers and physicians had different adherence patterns. No patient-specific patterns of adherence were seen. Discussion This is the first report of adherence to all the epilepsy quality indicators for a sample of patients during routine care in a level 4 epilepsy center in the US. Overall adherence was similar to that previously reported on similar measures. Precise definitions of adherence equations are essential for accurate measurement. Complex measures result in lower adherence. Counseling measures showed low adherence, possibly highlighting a difference between practice and documentation. Adherence to the measures as written does not guarantee high quality care. Conclusion The current quality indicators have value in the process of improving quality of care. Future approaches may be refined to eliminate complex measures and incorporate features linked to outcomes. PMID:25171260

  4. APS-5: 5th international symposium on automotive propulsion systems. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-10-01

    Fifty-three papers or panel discussions were presented at the meeting. A separate abstract was prepared for each of 50 papers. Three papers were previously processed for the Energy Data Base. Abstracts for individual papers were not prepared for Energy Abstracts for Policy Analysis (EAPA). (LCL)

  5. Wrestlers' minimal weight: anthropometry, bioimpedance, and hydrostatic weighing compared.

    PubMed

    Oppliger, R A; Nielsen, D H; Vance, C G

    1991-02-01

The need for accurate assessment of minimal wrestling weight among interscholastic wrestlers has been well documented. Previous research has demonstrated the validity of anthropometric methods for this purpose, but little research has examined the validity of bioelectrical impedance (BIA) measurements. Comparisons between BIA systems have received limited attention. With these two objectives, we compared the prediction of minimal weight (MW) among 57 interscholastic wrestlers using three anthropometric methods (skinfolds (SF) and two skeletal dimensions equations) and three BIA systems (Berkeley Medical Research (BMR), RJL, and Valhalla (VAL)). All methods showed high correlations (r values greater than 0.92) with hydrostatic weighing (HW) and between methods (r values greater than 0.90). The standard errors of estimate (SEE) were relatively small for all methods, especially for SF and the three BIA systems (SEE less than 0.70 kg). The total errors of prediction (E) for RJL and VAL (E = 4.4 and 3.9 kg) were significantly larger than the nonsignificant BMR and SF values (E = 2.3 and 1.8 kg, respectively). Significant mean differences were observed between HW, RJL, VAL, and the two skeletal dimensions equations, but nonsignificant differences were observed between HW, BMR, and SF. BMR differed significantly from the RJL and VAL systems. The results suggest that RJL and VAL have potential application for this subpopulation. Prediction equation refinement with the addition of selected anthropometric measurements or moderating variables may enhance their utility. However, within the scope of our study, SF and BMR BIA appear to be the most valid methods for determining MW in interscholastic wrestlers.
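
    To make the accuracy comparison concrete, here is a small sketch, assuming the usual body-composition definitions, of the two error statistics the abstract relies on: the standard error of estimate (SEE) from regressing the criterion on each method's prediction, and the total error of prediction (the RMS of the raw differences, which also captures bias). The paired values below are invented for illustration.

    ```python
    import numpy as np

    def see_and_total_error(criterion, predicted):
        """Standard error of estimate (SEE) and total error of prediction (E).

        SEE is the residual SD about the line regressing the criterion on the
        prediction; E is the RMS of the raw prediction-criterion differences.
        These are the usual body-composition definitions; confirm against the
        paper's methods section before reuse.
        """
        criterion = np.asarray(criterion, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        n = criterion.size
        slope, intercept = np.polyfit(predicted, criterion, 1)
        residuals = criterion - (slope * predicted + intercept)
        see = np.sqrt(np.sum(residuals**2) / (n - 2))
        total_error = np.sqrt(np.mean((predicted - criterion) ** 2))
        return see, total_error

    # Invented paired minimal-weight values (kg): hydrostatic criterion vs one method.
    hw = [62.1, 58.4, 70.3, 65.0, 55.2, 68.7]
    pred = [63.0, 57.9, 71.5, 64.2, 56.4, 70.1]
    see, te = see_and_total_error(hw, pred)
    print(f"SEE = {see:.2f} kg, total error = {te:.2f} kg")
    ```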

  6. HPI markets and strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekolf, W.D.

    1988-03-01

How the HPI and government react to new directions will not only set the course for the future of refining and marketing, it will have profound implications for the entire energy industry. Strategies developed by individual refiners and marketers in response to this changing environment will determine their future in the industry. In developing scenarios for the downstream, Cambridge Energy Research Associates (CERA) has identified three forces that will determine the downstream playing field in the nineties: 1. Imbalances between market demands and refinery capacity will continue to promote intense competition and to depress margins; 2. Product and crude price volatility will be at least as great in the future as it has been in the last three years; and 3. Renewed environmental concerns will add new capital investment burdens to the industry. The implications of these three forces on refiners are clear - being in the downstream business is likely to become increasingly expensive, competitive and risky. The author shares CERA's perspective on why these forces have evolved and, in turn, led to new strategies and developments in the industry. Then he outlines how we think these new themes may affect players in the industry. Finally, he summarizes some key uncertainties the future holds.

  7. Reporting of Numerical and Statistical Differences in Abstracts

    PubMed Central

    Dryver, Eric; Hux, Janet E

    2002-01-01

OBJECTIVE The reporting of relative risk reductions (RRRs) or absolute risk reductions (ARRs) to quantify binary outcomes in trials engenders differing perceptions of therapeutic efficacy, and the merits of P values versus confidence intervals (CIs) are also controversial. We describe the manner in which numerical and statistical difference in treatment outcomes is presented in published abstracts. DESIGN A descriptive study of abstracts published in 1986 and 1996 in 8 general medical and specialty journals. Inclusion criteria: controlled, intervention trials with a binary primary or secondary outcome. Seven items were recorded: raw data (outcomes for each treatment arm), measure of relative difference (e.g., RRR), ARR, number needed to treat, P value, CI, and verbal statement of statistical significance. The prevalence of these items was compared between journals and across time. RESULTS Of 5,293 abstracts, 300 met the inclusion criteria. In 1986, 60% of abstracts did not provide both the raw data and a corresponding P value or CI, while 28% failed to do so in 1996 (P < .001; RRR of 53%; ARR of 32%; CI for ARR 21% to 43%). The variability between journals was highly significant (P < .001). In 1986, 100% of abstracts lacked a measure of absolute difference while 88% of 1996 abstracts did so (P < .001). In 1986, 98% of abstracts lacked a CI while 65% of 1996 abstracts did so (P < .001). CONCLUSIONS The provision of quantitative outcome and statistical quantitative information has significantly increased between 1986 and 1996. However, further progress can be made to make abstracts more informative. PMID:11929506

  8. Reporting of numerical and statistical differences in abstracts: improving but not optimal.

    PubMed

    Dryver, Eric; Hux, Janet E

    2002-03-01

The reporting of relative risk reductions (RRRs) or absolute risk reductions (ARRs) to quantify binary outcomes in trials engenders differing perceptions of therapeutic efficacy, and the merits of P values versus confidence intervals (CIs) are also controversial. We describe the manner in which numerical and statistical difference in treatment outcomes is presented in published abstracts. A descriptive study of abstracts published in 1986 and 1996 in 8 general medical and specialty journals. Inclusion criteria: controlled, intervention trials with a binary primary or secondary outcome. Seven items were recorded: raw data (outcomes for each treatment arm), measure of relative difference (e.g., RRR), ARR, number needed to treat, P value, CI, and verbal statement of statistical significance. The prevalence of these items was compared between journals and across time. Of 5,293 abstracts, 300 met the inclusion criteria. In 1986, 60% of abstracts did not provide both the raw data and a corresponding P value or CI, while 28% failed to do so in 1996 (P < .001; RRR of 53%; ARR of 32%; CI for ARR 21% to 43%). The variability between journals was highly significant (P < .001). In 1986, 100% of abstracts lacked a measure of absolute difference while 88% of 1996 abstracts did so (P < .001). In 1986, 98% of abstracts lacked a CI while 65% of 1996 abstracts did so (P < .001). The provision of quantitative outcome and statistical quantitative information has significantly increased between 1986 and 1996. However, further progress can be made to make abstracts more informative.
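
    Since both records turn on the distinction between relative and absolute effect measures, here is a small sketch, under the usual normal-approximation assumptions, of how RRR, ARR, NNT and a 95% CI for the ARR are computed from two-arm binary outcomes; the event counts are invented for illustration.

    ```python
    import math

    def binary_effect_measures(events_ctrl, n_ctrl, events_trt, n_trt, z=1.96):
        """RRR, ARR, NNT and a normal-approximation 95% CI for the ARR."""
        p_c = events_ctrl / n_ctrl
        p_t = events_trt / n_trt
        arr = p_c - p_t                       # absolute risk reduction
        rrr = arr / p_c                       # relative risk reduction
        nnt = math.inf if arr == 0 else 1.0 / arr
        se = math.sqrt(p_c * (1 - p_c) / n_ctrl + p_t * (1 - p_t) / n_trt)
        ci = (arr - z * se, arr + z * se)
        return rrr, arr, nnt, ci

    # Invented counts: 120/400 events in the control arm, 90/400 in the treatment arm.
    rrr, arr, nnt, ci = binary_effect_measures(120, 400, 90, 400)
    print(f"RRR={rrr:.1%}  ARR={arr:.1%}  NNT={nnt:.1f}  "
          f"95% CI for ARR=({ci[0]:.1%}, {ci[1]:.1%})")
    ```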

  9. Transonic Drag Prediction on a DLR-F6 Transport Configuration Using Unstructured Grid Solvers

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, E. M.; Frink, N. T.; Mavriplis, D. J.; Rausch, R. D.; Milholen, W. E.

    2004-01-01

A second international AIAA Drag Prediction Workshop (DPW-II) was organized and held in Orlando, Florida on June 21-22, 2003. The primary purpose was to investigate the code-to-code uncertainty, address the sensitivity of the drag prediction to grid size, and quantify the uncertainty in predicting nacelle/pylon drag increments at a transonic cruise condition. This paper presents an in-depth analysis of the DPW-II computational results from three state-of-the-art unstructured grid Navier-Stokes flow solvers exercised on similar families of tetrahedral grids. The flow solvers are USM3D, a tetrahedral cell-centered upwind solver; FUN3D, a tetrahedral node-centered upwind solver; and NSU3D, a general-element node-centered central-differenced solver. For the wing/body, the total drag predicted for a constant-lift transonic cruise condition showed a decrease in code-to-code variation with grid refinement, as expected. For the same flight condition, the wing/body/nacelle/pylon total drag and the nacelle/pylon drag increment predicted showed an increase in code-to-code variation with grid refinement. Although the range in total drag for the wing/body fine grids was only 5 counts, a code-to-code comparison of surface pressures and surface restricted streamlines indicated that the three solvers were not all converging to the same flow solutions: different shock locations and separation patterns were evident. Similarly, the wing/body/nacelle/pylon solutions did not appear to be converging to the same flow solutions. Overall, grid refinement did not consistently improve the correlation with experimental data for either the wing/body or the wing/body/nacelle/pylon configuration. Although the absolute values of total drag predicted by two of the solvers for the medium and fine grids did not compare well with the experiment, the incremental drag predictions were within plus or minus 3 counts of the experimental data. The correlation with experimental incremental drag was not significantly changed by specifying transition. Although the sources of code-to-code variation in force and moment predictions for the three unstructured grid codes have not yet been identified, the current study reinforces the necessity of applying multiple codes to the same application to assess uncertainty.

  10. Application of Receiver Operating Characteristic Analysis to Refine the Prediction of Potential Digoxin Drug Interactions

    PubMed Central

    Ellens, Harma; Deng, Shibing; Coleman, JoAnn; Bentz, Joe; Taub, Mitchell E.; Ragueneau-Majlessi, Isabelle; Chung, Sophie P.; Herédi-Szabó, Krisztina; Neuhoff, Sibylle; Palm, Johan; Balimane, Praveen; Zhang, Lei; Jamei, Masoud; Hanna, Imad; O’Connor, Michael; Bednarczyk, Dallas; Forsgard, Malin; Chu, Xiaoyan; Funk, Christoph; Guo, Ailan; Hillgren, Kathleen M.; Li, LiBin; Pak, Anne Y.; Perloff, Elke S.; Rajaraman, Ganesh; Salphati, Laurent; Taur, Jan-Shiang; Weitz, Dietmar; Wortelboer, Heleen M.; Xia, Cindy Q.; Xiao, Guangqing; Yamagata, Tetsuo

    2013-01-01

In the 2012 Food and Drug Administration (FDA) draft guidance on drug-drug interactions (DDIs), a new molecular entity that inhibits P-glycoprotein (P-gp) may need a clinical DDI study with a P-gp substrate such as digoxin when the maximum concentration of inhibitor at steady state divided by IC50 ([I1]/IC50) is ≥0.1 or the concentration of inhibitor based on the highest approved dose dissolved in 250 ml divided by IC50 ([I2]/IC50) is ≥10. In this article, refined criteria are presented, determined by receiver operating characteristic analysis, using IC50 values generated by 23 laboratories. P-gp probe substrates were digoxin for polarized cell-lines and N-methyl quinidine or vinblastine for P-gp overexpressed vesicles. Inhibition of probe substrate transport was evaluated using 15 known P-gp inhibitors. Importantly, the criteria derived in this article take into account variability in IC50 values. Moreover, they are statistically derived based on the highest degree of accuracy in predicting true positive and true negative digoxin DDI results. The refined criteria of [I1]/IC50 ≥ 0.03 and [I2]/IC50 ≥ 45 and FDA criteria were applied to a test set of 101 in vitro-in vivo digoxin DDI pairs collated from the literature. The number of false negatives (none predicted but DDI observed) were similar, 10 and 12%, whereas the number of false positives (DDI predicted but not observed) substantially decreased from 51 to 40%, relative to the FDA criteria. On the basis of estimated overall variability in IC50 values, a theoretical 95% confidence interval calculation was developed for single laboratory IC50 values, translating into a range of [I1]/IC50 and [I2]/IC50 values. The extent by which this range falls above the criteria is a measure of risk associated with the decision, attributable to variability in IC50 values. PMID:23620486
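
    As a hedged illustration of how threshold criteria of this kind are applied, the sketch below computes the [I1]/IC50 and [I2]/IC50 ratios and flags a compound using the refined cut-offs quoted in the abstract (0.03 and 45). The combination rule (whether either or both cut-offs must be exceeded) should be checked against the full paper; here either ratio triggers the flag, and the example concentrations and IC50 are invented.

    ```python
    def flag_pgp_ddi_study(i1_uM, i2_uM, ic50_uM, i1_cutoff=0.03, i2_cutoff=45.0):
        """Apply the refined [I1]/IC50 and [I2]/IC50 cut-offs quoted in the abstract.

        i1_uM: steady-state Cmax of the inhibitor.
        i2_uM: highest approved dose dissolved in 250 ml (gut concentration estimate).
        Returns (flag, ratio1, ratio2); flag is True if either ratio meets its cut-off.
        """
        r1 = i1_uM / ic50_uM
        r2 = i2_uM / ic50_uM
        return (r1 >= i1_cutoff or r2 >= i2_cutoff), r1, r2

    # Invented example: Cmax 0.5 uM, gut concentration 800 uM, IC50 20 uM.
    trigger, r1, r2 = flag_pgp_ddi_study(0.5, 800.0, 20.0)
    print(f"[I1]/IC50={r1:.3f}  [I2]/IC50={r2:.1f}  clinical digoxin DDI study: {trigger}")
    ```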

  11. Machining-induced deformation in stepped specimens of PH 13-8 Mo, 18 nickel maraging steel grade 200T1 and grain-refined HP 9-4-20

    NASA Technical Reports Server (NTRS)

    Wigley, D. A.

    1985-01-01

    The results of a study to evaluate the dimensional changes created during machining and subsequent cycling to cryogenic temperatures for three different metallic alloys are presented. Experimental techniques are described and results presented for 18 Ni Grade 200 maraging steel, PH-13-8 Mo stainless steel, and Grain-refined HP 9-4-20.

  12. Computational investigations and grid refinement study of 3D transient flow in a cylindrical tank using OpenFOAM

    NASA Astrophysics Data System (ADS)

    Mohd Sakri, F.; Mat Ali, M. S.; Sheikh Salim, S. A. Z.

    2016-10-01

    The fluid physics of a liquid draining inside a tank is easily accessible using numerical simulation. However, numerical simulation is expensive when the liquid draining involves a multi-phase problem. Since an accurate numerical simulation can only be obtained if a proper method for error estimation is applied, this paper provides a systematic assessment of the grid convergence error using OpenFOAM. OpenFOAM is an open-source CFD toolbox that is well known among researchers and institutions because it is free and ready to use. In this study, three grid resolutions are used: coarse, medium and fine. The Grid Convergence Index (GCI) is applied to estimate the error due to grid sensitivity. A monotonic convergence condition is obtained, showing that the grid convergence error is progressively reduced. The fine grid has a GCI value below 1%. The extrapolated value from Richardson extrapolation lies within the range of the GCI obtained.
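
    The GCI bookkeeping described in this abstract follows a standard recipe, and a minimal sketch may help readers reproduce it. The Python below is illustrative only (the three solution values and the refinement ratio are hypothetical, not taken from the paper) and computes the observed order of accuracy, the Richardson-extrapolated value, and the fine-grid GCI in Roache's formulation.

      import math

      def gci_study(f_coarse, f_medium, f_fine, r, Fs=1.25):
          """Grid Convergence Index for three solutions on grids refined by a
          constant ratio r (coarse -> medium -> fine), Roache's formulation."""
          # Observed order of accuracy from the three solutions.
          p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
          # Richardson extrapolation toward zero grid spacing.
          f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
          # Relative error and GCI on the fine grid (Fs = safety factor).
          e_fine = abs((f_fine - f_medium) / f_fine)
          gci_fine = Fs * e_fine / (r ** p - 1.0)
          return p, f_exact, 100.0 * gci_fine   # GCI reported in percent

      # Hypothetical drained-volume fractions on coarse/medium/fine grids, ratio r = 2.
      p, f_exact, gci = gci_study(0.9712, 0.9631, 0.9603, r=2.0)
      print(f"order p = {p:.2f}, extrapolated value = {f_exact:.4f}, GCI_fine = {gci:.2f}%")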

  13. Extrapolation of astrophysical S factors for the reaction {sup 14}N((p, {gamma}) {sup 15}O to near-zero energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Artemov, S. V.; Igamov, S. B., E-mail: igamov@inp.uz; Tursunmakhatov, Q. I.

    2012-03-15

    The astrophysical S factors for the radiative-capture reaction {sup 14}N(p, {gamma}){sup 15}O in the region of ultralow energies were calculated on the basis of the R-matrix approach. The values of the radiative and protonic widths were fitted to new experimental data. The contribution of direct radiative capture to bound states of the {sup 15}O nucleus was determined with the aid of asymptotic normalization coefficients, whose values were refined in the present study on the basis of the results obtained from an analysis of the reaction {sup 14}N({sup 3}He, d){sup 15}O at three different energies of incident helium ions. A value of S(0) = 1.79 {+-} 0.31 keV b was obtained for the total astrophysical S factor, and the reaction rate was determined for the process {sup 14}N(p, {gamma}){sup 15}O.

  14. Using Lava Tube Skylights To Derive Lava Eruption Temperatures on Io

    NASA Astrophysics Data System (ADS)

    Davies, Ashley Gerard; Keszthelyi, Laszlo P.; McEwen, Alfred S.

    2015-11-01

    The eruption temperature of Io’s silicate lavas constrains Io’s interior state and composition [1]. We have examined the theoretical thermal emission from lava tube skylights above basaltic and ultramafic lava channels. Assuming that tube-fed lava flows are common on Io, skylights could also be common. Skylights present steady thermal emission on a scale of days to months. We find that the thermal emission from such a target, measured at multiple visible and NIR wavelengths, can provide a highly accurate diagnostic of eruption temperature. However, the small size of skylights means that close flybys of Io are necessary, requiring a dedicated Io mission [2]. Observations would ideally be at night or in eclipse. We have modelled the thermal emission spectrum for different skylight sizes, lava flow stream velocities, end-member lava compositions, and skylight radiation shape factors, determining the resulting flow surface cooling rates. We calculate the resulting thermal emission spectrum as a function of viewing geometry. From the resulting 0.7:0.9 μm ratios, we see a clear distinction between basaltic and ultramafic compositions for skylights smaller than 20 m across, even if sub-pixel. Our analysis will be further refined as accurate high-temperature short-wavelength emissivity values become available [3]. This work was performed at the Jet Propulsion Laboratory-California Institute of Technology, under contract to NASA. We thank the NASA OPR Program for support. References: [1] Keszthelyi et al. (2007) Icarus 192, 491-502 [2] McEwen et al. (2015) The Io Volcano Observer (IVO) LPSC-46 abstract 1627 [3] Ramsey and Harris (2015) IAVCEI-2015, Prague, Cz. Rep., abstract IUGG-3519.

  15. Concept formation: a supportive process for early career nurses.

    PubMed

    Thornley, Tracey; West, Sandra

    2010-09-01

    Individuals come to understand abstract constructs such as that of the 'expert' through the formation of concepts. Time and repeated opportunity for observation to support the generalisation and abstraction of the developing concept are essential if the concept is to form successfully. Development of an effective concept of the 'expert nurse' is critical for early career nurses who are attempting to integrate theory, values and beliefs as they develop their clinical practice. This study explores the use of a concept development framework in a grounded theory study of the 'expert nurse'. Qualitative. Using grounded theory methods for data collection and analysis, semi-structured interviews were conducted with registered nurses. The participants were asked to describe their concept of the 'expert nurse' and to discuss their experience of developing this. Participants reported forming their concept of the 'expert nurse', after multiple opportunities to engage with nurses identified as 'expert'. This identification did not necessarily relate to the designated position of the 'expert nurse' or assigned mentors. When the early career nurse does not successfully form a concept of the 'expert nurse', difficulties in personal and professional development including skill/knowledge development may arise. To underpin development of their clinical practice effectively, early career nurses need to be provided with opportunities that facilitate the purposive formation of their own concept of the 'expert nurse'. Formation of this concept is not well supported by the common practice of assigning mentors. Early career nurses must be provided with the time and the opportunity to individually develop and refine their concept of the 'expert nurse'. To achieve this, strategies including providing opportunities to engage with expert nurses and discussion of the process of concept formation and its place in underpinning personal judgments may be of assistance. © 2010 Blackwell Publishing Ltd.

  16. Value Addition to Cartosat-I Imagery

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2014-11-01

    In the sector of remote sensing applications, the use of stereo data is steadily rising. An attempt is hereby made to develop a software suite specifically for exploitation of Cartosat-I data. A few algorithms to enhance the quality of basic Cartosat-I products will be presented. The algorithms heavily exploit the Rational Function Coefficients (RPCs) that are associated with the image. The algorithms include improving the geometric positioning through Bundle Block Adjustment and producing refined RPCs; generating portable stereo views using raw / refined RPCs autonomously; orthorectification and mosaicing; and registering a monoscopic image rapidly with a single seed point. The outputs of these modules (including the refined RPCs) are in standard formats for further exploitation in third-party software. The design focus has been on minimizing user interaction and on heavy customization to suit the Indian context. The core libraries are in C/C++ and some of the applications come with a user-friendly GUI. Further customization to suit a specific workflow is feasible as the requisite photogrammetric tools are in place and are continuously upgraded. The paper discusses the algorithms and the design considerations of developing the tools. The value-added products so produced using these tools will also be presented.

  17. The effect of high pressure torsion on structural refinement and mechanical properties of an austenitic stainless steel.

    PubMed

    Krawczynska, Agnieszka Teresa; Lewandowska, Malgorzata; Pippan, Reinhard; Kurzydlowski, Krzysztof Jan

    2013-05-01

    In the present study, high pressure torsion (HPT) was used to refine the grain structure of an austenitic stainless steel down to the nanometer scale. HPT subjects the specimen to torsional deformation under simultaneous high pressure, which results in a substantial reduction in grain size. Disks of 316LVM austenitic stainless steel, 10 mm in diameter, were subjected to equivalent strains epsilon of 32 at RT and at 450 degrees C under a pressure of 4 GPa. Furthermore, two-stage HPT processes, i.e., deformation at room temperature followed by deformation at 450 degrees C, were performed. The resulting microstructures were investigated by TEM. The mechanical properties were measured in terms of microhardness and in tensile tests. Two-stage HPT (first at RT, then at 450 degrees C) gives microhardness values similar to those obtained by deforming only at 450 degrees C, but at higher values of the overall equivalent strain epsilon. The effect of high pressure torsion on structural refinement and mechanical properties of an austenitic stainless steel was thus evaluated.

  18. The survey of ecologically acceptable flows in Slovenia

    NASA Astrophysics Data System (ADS)

    Smolar-Žvanut, Nataša; Burja, Darko

    2008-11-01

    Excessive water abstractions from watercourses constitute a negative impact on the structure and functioning of aquatic and riparian ecosystems. In order to preserve and improve the aquatic ecosystems it is therefore necessary to maintain adequate quantity and quality of water in watercourses, which can be ensured by providing ecologically acceptable flow (EAF). In Slovenia, a large diversity of watercourses regarding their hydrologic, morphological and ecological characteristics dictates the determination of EAF separately for individual sections of watercourses. Since 1994, the determination of EAF in Slovenia has been carried out primarily for the existing water abstractions such as hydroelectric power plants, fish farms, and to a lesser extent for the abstractions for drinking water, process water, recreation facilities and at the outflows from reservoirs. The results of EAF value analyses showed that the EAF values for individual water abstractions differed widely both with respect to the values of the mean annual minimum flow and the values of the mean daily flow. The results of analyses support the basis for the determination of EAF used in most EU countries, namely that EAF must be determined through interdisciplinary approach where the hydrologic data represent the benchmark values for the determination of EAF.

  19. AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics

    PubMed Central

    Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza

    2017-01-01

    Abstract AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703

  20. Synthesis, structure refinement and chromate sorption characteristics of an Al-rich bayerite-based layered double hydroxide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Britto, Sylvia, E-mail: sylviabritto11@gmail.com; Kamath, P. Vishnu

    2014-07-01

    “Imbibition” of Zn{sup 2+} ions into the cation vacancies of bayerite–Al(OH){sub 3} and NO{sub 3}{sup −} ions into the interlayer gallery yields an Al-rich layered double hydroxide with Al/Zn ratio ∼3. NO{sub 3}{sup −} ions are intercalated with their molecular planes inclined at an angle to the plane of the metal hydroxide slab and bonded to it by hydrogen bonds. Rietveld refinement of the structure shows that the monoclinic symmetry of the precursor bayerite is preserved in the product, showing that the imbibition is topochemical in nature. The nitrate ion is labile and is quantitatively replaced by CrO{sub 4}{sup 2−} ions from solution. The uptake of CrO{sub 4}{sup 2−} ions follows a Langmuir adsorption isotherm, thus showing that the hydroxide is a candidate material for green chemistry applications for the removal of CrO{sub 4}{sup 2−} ions from waste water. Rietveld refinement of the structure of the hydroxide after CrO{sub 4}{sup 2−} inclusion reveals that the CrO{sub 4}{sup 2−} ion is intercalated with one of its 2-fold axes parallel to the b-crystallographic axis of the crystal, also the principal 2 axis of the monoclinic cell. - Graphical abstract: The structure of the [Zn–Al4-nitrate] LDH viewed along the a-axis. - Highlights: • Synthesis of Al-rich layered double hydroxide with Al/Zn ratio ∼3. • Rietveld refinement indicates that the imbibition of Zn into Al(OH){sub 3} is topochemical in nature. • The uptake of CrO{sub 4}{sup 2−} ions follows a Langmuir adsorption isotherm.

  1. Key-value store with internal key-value storage interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Ting, Dennis P. J.

    A key-value store is provided having one or more key-value storage interfaces. A key-value store on at least one compute node comprises a memory for storing a plurality of key-value pairs; and an abstract storage interface comprising a software interface module that communicates with at least one persistent storage device providing a key-value interface for persistent storage of one or more of the plurality of key-value pairs, wherein the software interface module provides the one or more key-value pairs to the at least one persistent storage device in a key-value format. The abstract storage interface optionally processes one or more batch operations on the plurality of key-value pairs. A distributed embodiment for a partitioned key-value store is also provided.
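
    As a concrete reading of the interface described above, here is a minimal sketch (Python; the class and method names are hypothetical illustrations rather than anything specified in the record) of a compute-node key-value store that keeps pairs in memory and hands them, optionally as a batch, to a pluggable abstract storage interface backed by a persistent device.

      from abc import ABC, abstractmethod

      class AbstractKVStorage(ABC):
          """Software interface to a persistent device exposing a key-value format."""
          @abstractmethod
          def put(self, key, value): ...
          @abstractmethod
          def put_batch(self, pairs): ...   # optional batch operation
          @abstractmethod
          def get(self, key): ...

      class InMemoryBackend(AbstractKVStorage):
          """Stand-in for a real persistent key-value device."""
          def __init__(self):
              self._data = {}
          def put(self, key, value):
              self._data[key] = value
          def put_batch(self, pairs):
              self._data.update(pairs)
          def get(self, key):
              return self._data.get(key)

      class KeyValueStore:
          """Compute-node store: keeps pairs in memory, flushes via the interface."""
          def __init__(self, backend: AbstractKVStorage):
              self._memory = {}
              self._backend = backend
          def put(self, key, value):
              self._memory[key] = value
          def flush(self):
              self._backend.put_batch(dict(self._memory))  # one batched write
              self._memory.clear()
          def get(self, key):
              if key in self._memory:
                  return self._memory[key]
              return self._backend.get(key)

      store = KeyValueStore(InMemoryBackend())
      store.put("mesh/level", 3)
      store.flush()
      print(store.get("mesh/level"))   # 3, served from the backend after the flush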

  2. On the Power of Abstract Interpretation

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.; Kamin, Samuel N.

    1991-01-01

    Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code would be altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Sekar et al.; hence, our proofs are much simpler. They should be useful for future improvements.
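
    As background for readers unfamiliar with the analyses being characterized, the toy sketch below (Python, not the authors' formalism) performs Mycroft-style strictness analysis over the two-point, totally ordered abstract domain {0, 1}, where 0 stands for "definitely undefined" and 1 for "possibly defined"; a function is reported strict in an argument when an abstract 0 in that position forces the abstract result to 0.

      # Two-point abstract domain: 0 = definitely undefined (bottom), 1 = possibly defined.
      BOT, TOP = 0, 1

      def abs_plus(a, b):
          """Addition is strict in both arguments: undefined if either side is."""
          return a & b

      def abs_if(c, t, e):
          """A conditional needs its test, and then one of the two branches."""
          return c & (t | e)

      def strict_in(abstract_f, arity, i):
          """f is strict in argument i if feeding bottom there yields bottom."""
          args = [TOP] * arity
          args[i] = BOT
          return abstract_f(*args) == BOT

      # Abstract version of  f(x, y, z) = if x then y + 1 else y + z  (constants are TOP).
      def f_abs(x, y, z):
          return abs_if(x, abs_plus(y, TOP), abs_plus(y, z))

      print([strict_in(f_abs, 3, i) for i in range(3)])   # [True, True, False]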

  3. Capture and dissociation in the complex-forming CH + H2 → CH2 + H, CH + H2 reactions.

    PubMed

    González, Miguel; Saracibar, Amaia; Garcia, Ernesto

    2011-02-28

    The rate coefficients for the capture process CH + H(2)→ CH(3) and the reactions CH + H(2)→ CH(2) + H (abstraction), CH + H(2) (exchange) have been calculated in the 200-800 K temperature range, using the quasiclassical trajectory (QCT) method and the most recent global potential energy surface. The reactions, which are of interest in combustion and in astrochemistry, proceed via the formation of long-lived CH(3) collision complexes, and the three H atoms become equivalent. QCT rate coefficients for capture are in quite good agreement with experiments. However, an important zero point energy (ZPE) leakage problem occurs in the QCT calculations for the abstraction, exchange and inelastic exit channels. To account for this issue, a pragmatic but accurate approach has been applied, leading to a good agreement with experimental abstraction rate coefficients. Exchange rate coefficients have also been calculated using this approach. Finally, calculations employing QCT capture/phase space theory (PST) models have been carried out, leading to similar values for the abstraction rate coefficients as the QCT and previous quantum mechanical capture/PST methods. This suggests that QCT capture/PST models are a good alternative to the QCT method for this and similar systems.

  4. Detailed chemical analysis of regional-scale air pollution in western Portugal using an adapted version of MCM v3.1.

    PubMed

    Pinho, P G; Lemos, L T; Pio, C A; Evtyugina, M G; Nunes, T V; Jenkin, M E

    2009-03-01

    A version of the Master Chemical Mechanism (MCM) v3.1, refined on the basis of recent chamber evaluations, has been incorporated into a Photochemical Trajectory Model (PTM) and applied to the simulation of boundary layer photochemistry in the Portuguese west coast region. Comparison of modelled concentrations of ozone and a number of other species (NO(x) and selected hydrocarbons and organic oxygenates) was carried out, using data from three connected sites on two case study days when well-defined sea breeze conditions were established. The ozone concentrations obtained through the application of the PTM are a good approximation to the measured values, the average difference being ca. 15%, indicating that the model was acceptable for evaluation of the details of the chemical processing. The detailed chemistry is examined, allowing conclusions to be drawn concerning chemical interferences in the measurements of NO(2), and in relation to the sensitivity of ozone formation to changes in ambient temperature. Three important, and comparable, contributions to the temperature sensitivity are identified and quantified, namely (i) an effect of increasing biogenic emissions with temperature; (ii) an effect of increasing ambient water vapour concentration with temperature, and its influence on radical production; and (iii) an increase in VOC oxidation chain lengths resulting from the temperature-dependence of the kinetic parameters, particularly in relation to the stability of PAN and its higher analogues. The sensitivity of the simulations to the refinements implemented into MCM v3.1 is also presented and discussed.

  5. 75 FR 26716 - Seamless Refined Copper Pipe and Tube from the People's Republic of China: Preliminary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-12

    ...The Department of Commerce (the ``Department'') has preliminarily determined that seamless refined copper pipe and tube (``copper pipe and tube'') from the People's Republic of China (``PRC'') is being, or is likely to be, sold in the United States at less than fair value (``LTFV''), as provided in section 733 of the Tariff Act of 1930, as amended (the ``Act''). The estimated dumping margins are shown in the ``Preliminary Determination'' section of this notice. Interested parties are invited to comment on the preliminary determination.

  6. Effect of synthesis methods on the Ca{sub 3}Co{sub 4}O{sub 9} thermoelectric ceramic performances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotelo, A.; Rasekh, Sh.; Torres, M.A.

    2015-01-15

    Three different synthesis methods producing nanometric grain sizes (coprecipitation with ammonium carbonate, coprecipitation with oxalic acid, and attrition milling) have been studied to produce Ca{sub 3}Co{sub 4}O{sub 9} ceramics and compared with the classical solid state route. These three processes produce highly reactive precursors, and all the organic material and CaCO{sub 3} are decomposed in a single thermal treatment. Coprecipitation leads to pure Ca{sub 3}Co{sub 4}O{sub 9} phase, while attrition milling and the classical solid state route produce small amounts of Ca{sub 3}Co{sub 2}O{sub 6} secondary phase. Power factor values are similar for all three samples, being slightly lower for the ones produced by attrition milling. These values are much higher than those obtained in samples prepared by the classical solid state method, used as reference. The maximum power factor values determined at 800 °C (∼0.43 mW/K{sup 2} m) are slightly higher than the best reported values obtained in textured samples, which also show much higher density values. - Graphical abstract: Impressive rise of PF in Ca{sub 3}Co{sub 4}O{sub 9} thermoelectric materials obtained from nanometric grains. - Highlights: • Ca{sub 3}Co{sub 4}O{sub 9} has been produced by four different methods. • Precursor particle sizes influence the final performance. • Coprecipitation methods produce single-phase Ca{sub 3}Co{sub 4}O{sub 9}. • Power factor reaches values comparable to high-density textured materials.

  7. Usability Testing as a Method to Refine a Health Sciences Library Website.

    PubMed

    Denton, Andrea H; Moody, David A; Bennett, Jason C

    2016-01-01

    User testing, a method of assessing website usability, can be a cost-effective and easily administered process to collect information about a website's effectiveness. A user experience (UX) team at an academic health sciences library has employed user testing for over three years to help refine the library's home page. Test methodology used in-person testers using the "think aloud" method to complete tasks on the home page. Review of test results revealed problem areas of the design and redesign; further testing was effective in refining the page. User testing has proved to be a valuable method to engage users and provide feedback to continually improve the library's home page.

  8. How does tunneling contribute to counterintuitive H-abstraction reactivity of nonheme Fe(IV)O oxidants with alkanes?

    PubMed

    Mandal, Debasish; Ramanan, Rajeev; Usharani, Dandamudi; Janardanan, Deepa; Wang, Binju; Shaik, Sason

    2015-01-21

    This article addresses the intriguing hydrogen-abstraction (H-abstraction) and oxygen-transfer (O-transfer) reactivity of a series of nonheme [Fe(IV)(O)(TMC)(Lax)](z+) complexes, with a tetramethyl cyclam ligand and a variable axial ligand (Lax), toward three substrates: 1,4-cyclohexadiene, 9,10-dihydroanthracene, and triphenyl phosphine. Experimentally, O-transfer-reactivity follows the relative electrophilicity of the complexes, whereas the corresponding H-abstraction-reactivity generally increases as the axial ligand becomes a better electron donor, hence exhibiting an antielectrophilic trend. Our theoretical results show that the antielectrophilic trend in H-abstraction is affected by tunneling contributions. Room-temperature tunneling increases with increase of the electron donation power of the axial-ligand, and this reverses the natural electrophilic trend, as revealed through calculations without tunneling, and leads to the observed antielectrophilic trend. By contrast, O-transfer-reactivity, not being subject to tunneling, retains an electrophilic-dependent reactivity trend, as revealed experimentally and computationally. Tunneling-corrected kinetic-isotope effect (KIE) calculations matched the experimental KIE values only if all of the H-abstraction reactions proceeded on the quintet state (S = 2) surface. As such, the present results corroborate the initially predicted two-state reactivity (TSR) scenario for these reactions. The increase of tunneling with the electron-releasing power of the axial ligand, and the reversal of the "natural" reactivity pattern, support the "tunneling control" hypothesis (Schreiner et al., ref 19). Should these predictions be corroborated, the entire field of C-H bond activation in bioinorganic chemistry would lay open to reinvestigation.

  9. Nursing diagnoses, diagnosis-related group, and hospital outcomes.

    PubMed

    Welton, John M; Halloran, Edward J

    2005-12-01

    There are no nursing-centric data in the hospital discharge abstract. This study investigates whether adding nursing data in the form of nursing diagnoses to medical diagnostic data in the discharge abstract can improve overall explanation of variance in commonly studied hospital outcomes. A retrospective analysis of 123,241 sequential patient admissions to a university hospital in a Midwestern city was performed. Two data sets were combined: (1) a daily collection of patient assessments by nurses using nursing diagnosis terminology (NDX); and (2) the summary discharge information from the hospital discharge abstract including diagnosis-related group (DRG) and all payer refined DRG (APR-DRG). Each of the 61 daily NDX observations was collapsed as a frequency of occurrence for the hospital stay and inserted into the discharge abstract. NDX was then compared to both DRG and APR-DRG across 5 hospital outcome variables using multivariate regression or logistic regression. In all statistical models, DRG, APR-DRG, and NDX were significantly associated with the 5 hospital outcome variables (P <.0001). When NDX was added to models containing either the DRG or the APR-DRG, explanatory power (R2) and model discrimination (c statistic) improved by 30% to 146% across the outcome variables of hospital length of stay, ICU length of stay, total charges, probability of death, and discharge to a nursing home (P <.0001). The findings support the contention that nursing care is an independent predictor of patient hospital outcomes. These nursing data are not redundant with the medical diagnosis, in particular, the DRG. The findings support the argument for including nursing care data in the hospital discharge abstract. Further study is needed to clarify which nursing data are the best fit for the current hospital discharge abstract data collection scheme.

  10. Sugar-based bicyclic monomers for aliphatic polyesters: a comparative appraisal of acetalized alditols and isosorbide

    PubMed Central

    Zakharova, Elena; Martínez de Ilarduya, Antxon; León, Salvador; Muñoz-Guerra, Sebastián

    2017-01-01

    Abstract Three series of polyalkanoates (adipates, suberates and sebacates) were synthesized using as monomers three sugar-based bicyclic diols derived from D-glucose (Glux-diol and isosorbide) and D-mannose (Manx-diol). Polycondensations were conducted in the melt applying similar reaction conditions for all cases. The aim was to compare the three bicyclic diols regarding their suitability to render aliphatic polyesters with enhanced thermal and mechanical properties. The ensuing polyesters had molecular weights (M w) in the 25,000–50,000 g mol−1 range with highest values being attained for Glux-diol. All the polyesters started to decompose above 300 °C and most of them did not display perceivable crystallinity. On the contrary, they had glass transition temperatures much higher than usually found in homologous polyesters made of alkanediols, and showed a stress–strain behavior consistent with their T g values. Glux-diol was particularly effective in increasing the T g and to render therefore polyesters with high elastic modulus and considerable mechanical strength. PMID:29491789

  11. Parallel deterministic neutronics with AMR in 3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C.; Ferguson, J.; Hendrickson, C.

    1997-12-31

    AMTRAN, a three dimensional Sn neutronics code with adaptive mesh refinement (AMR) has been parallelized over spatial domains and energy groups and runs on the Meiko CS-2 with MPI message passing. Block refined AMR is used with linear finite element representations for the fluxes, which allows for a straight forward interpretation of fluxes at block interfaces with zoning differences. The load balancing algorithm assumes 8 spatial domains, which minimizes idle time among processors.

  12. Conformation-dependent backbone geometry restraints set a new standard for protein crystallographic refinement

    DOE PAGES

    Moriarty, Nigel W.; Tronrud, Dale E.; Adams, Paul D.; ...

    2014-06-17

    Ideal values of bond angles and lengths used as external restraints are crucial for the successful refinement of protein crystal structures at all but the highest of resolutions. The restraints in common usage today have been designed based on the assumption that each type of bond or angle has a single ideal value independent of context. However, recent work has shown that the ideal values are, in fact, sensitive to local conformation, and as a first step toward using such information to build more accurate models, ultra-high resolution protein crystal structures have been used to derive a conformation-dependent library (CDL) of restraints for the protein backbone (Berkholz et al., 2009, Structure 17, 1316). Here, we report the introduction of this CDL into the Phenix package and the results of test refinements of thousands of structures across a wide range of resolutions. These tests show that use of the conformation-dependent library yields models that have substantially better agreement with ideal main-chain bond angles and lengths and, on average, a slightly enhanced fit to the X-ray data. No disadvantages of using the backbone CDL are apparent. In Phenix, usage of the CDL can be selected by simply specifying the cdl=True option. This successful implementation paves the way for further aspects of the context-dependence of ideal geometry to be characterized and applied to improve experimental and predictive modelling accuracy.

  13. SOV_refine: A further refined definition of segment overlap score and its significance for protein structure similarity.

    PubMed

    Liu, Tong; Wang, Zheng

    2018-01-01

    The segment overlap score (SOV) has been used to evaluate the predicted protein secondary structures, a sequence composed of helix (H), strand (E), and coil (C), by comparing it with the native or reference secondary structures, another sequence of H, E, and C. SOV's advantage is that it can consider the size of continuous overlapping segments and assign extra allowance to longer continuous overlapping segments, instead of only judging from the percentage of overlapping individual positions as the Q3 score does. However, we have found a drawback in its previous definition: it cannot ensure that the allowance increases when more residues in a segment are predicted accurately. A new way of assigning allowance has been designed, which keeps all the advantages of the previous SOV score definitions and ensures that the amount of allowance assigned is incremental when more elements in a segment are predicted accurately. Furthermore, our improved SOV has achieved a higher correlation with the quality of protein models measured by GDT-TS score and TM-score, indicating its better ability to evaluate tertiary structure quality at the secondary structure level. We analyzed the statistical significance of SOV scores and found the threshold values for distinguishing two protein structures (SOV_refine > 0.19) and for indicating whether two proteins are under the same CATH fold (SOV_refine > 0.94 and > 0.90 for three- and eight-state secondary structures, respectively). We provide two further example applications: using the score as a machine learning feature for protein model quality assessment, and comparing different definitions of topologically associating domains. In both, our newly defined SOV score resulted in better performance. The SOV score can be widely used in bioinformatics research and other fields that need to compare two sequences of letters in which continuous segments have important meanings. We also generalized the previous SOV definitions so that they can work for sequences composed of more than three states (e.g., the eight-state definition of protein secondary structures). A standalone software package has been implemented in Perl with source code released. The software can be downloaded from http://dna.cs.miami.edu/SOV/.
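
    To make the segment-level weighting concrete, the sketch below (Python) implements the classic SOV formulation that SOV_refine builds on, including the original allowance term; it illustrates the general scoring scheme only and does not reproduce the refined allowance definition introduced in the paper.

      def segments(ss):
          """Split a secondary-structure string into (state, start, end) runs."""
          runs, start = [], 0
          for i in range(1, len(ss) + 1):
              if i == len(ss) or ss[i] != ss[start]:
                  runs.append((ss[start], start, i - 1))
                  start = i
          return runs

      def sov99(ref, pred):
          """Classic SOV (Zemla-style allowance); illustrative, not SOV_refine."""
          assert len(ref) == len(pred)
          num, N = 0.0, 0
          for state, r0, r1 in segments(ref):
              overlapping = [(p0, p1) for s, p0, p1 in segments(pred)
                             if s == state and p0 <= r1 and p1 >= r0]
              len1 = r1 - r0 + 1
              if not overlapping:
                  N += len1
                  continue
              for p0, p1 in overlapping:
                  N += len1
                  len2 = p1 - p0 + 1
                  minov = min(r1, p1) - max(r0, p0) + 1   # actual overlap
                  maxov = max(r1, p1) - min(r0, p0) + 1   # total extent of the pair
                  delta = min(maxov - minov, minov, len1 // 2, len2 // 2)
                  num += (minov + delta) / maxov * len1
          return 100.0 * num / N if N else 0.0

      ref  = "CCHHHHHHCCEEEECC"
      pred = "CCHHHHHCCCEEEECC"
      print(round(sov99(ref, pred), 1))   # segment-aware score (0-100) on a toy example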

  14. Separation of CsCl and SrCl2 from a ternary CsCl-SrCl2-LiCl via a zone refining process for waste salt minimization of pyroprocessing

    NASA Astrophysics Data System (ADS)

    Shim, Moonsoo; Choi, Ho Gil; Yi, Kyung Woo; Hwang, Il Soon; Lee, Jong Hyeon

    2016-11-01

    The purification of LiCl salt mixture has traditionally been carried out by a melt crystallization process. To improve the throughput of zone refining, three heaters were installed in the zone refiner. The zone refining method was used to grow pure LiCl salt ingots from LiCl-CsCl-SrCl2 salt mixture. The main investigated parameters were the heater speed and the number of passes. A change in the LiCl crystal grain size was observed according to the horizontal direction. From each zone refined salt ingot, samples were collected horizontally. To analyze the concentrations of Sr and Cs, an inductively coupled plasma optical emission spectrometer and inductively coupled plasma mass spectrometer were used, respectively. The experimental results show that Sr and Cs concentrations at the initial region of the ingot were low and reached their peak at the final freezing region of the salt ingot. Concentration results of zone refined salt were compared with theoretical results yielded by the proposed model to validate its predictions. The keff of Sr and Cs were 0.13 and 0.11, respectively. The decontamination factors of Sr and Cs were 450 and 1650, respectively.
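
    The impurity redistribution behind these keff and decontamination-factor values is commonly described, under the usual idealizations (a well-mixed molten zone, no solid-state diffusion, constant effective distribution coefficient), by Pfann's single-pass zone-refining profile; the expression below is quoted as general background rather than as the specific model proposed in the paper.

      % Single-pass zone refining (Pfann), constant zone length l and effective
      % distribution coefficient k: solute concentration in the refrozen solid
      % at distance x from the start of an ingot of length L.
      \[
        \frac{C_s(x)}{C_0} \;=\; 1 - (1 - k)\, e^{-k x / l},
        \qquad 0 \le x \le L - l ,
      \]
      % For k < 1 the head of the ingot is purified (C_s < C_0), consistent with the
      % low Sr and Cs concentrations measured at the initial region of the LiCl ingot,
      % while solute accumulates in the final freezing region. A decontamination factor
      % is then often taken as the ratio C_0 / C_s at the sampled position.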

  15. Multidataset Refinement Resonant Diffraction, and Magnetic Structures

    PubMed Central

    Attfield, J. Paul

    2004-01-01

    The scope of Rietveld and other powder diffraction refinements continues to expand, driven by improvements in instrumentation, methodology and software. This will be illustrated by examples from our research in recent years. Multidataset refinement is now commonplace; the datasets may be from different detectors, e.g., in a time-of-flight experiment, or from separate experiments, such as at several x-ray energies giving resonant information. The complementary use of x rays and neutrons is exemplified by a recent combined refinement of the monoclinic superstructure of magnetite, Fe3O4, below the 122 K Verwey transition, which reveals evidence for Fe2+/Fe3+ charge ordering. Powder neutron diffraction data continue to be used for the solution and Rietveld refinement of magnetic structures. Time-of-flight instruments on cold neutron sources can produce data that have a high intensity and good resolution at high d-spacings. Such profiles have been used to study incommensurate magnetic structures such as FeAsO4 and β–CrPO4. A multiphase, multidataset refinement of the phase-separated perovskite (Pr0.35Y0.07Th0.04Ca0.04Sr0.5)MnO3 has been used to fit three components with different crystal and magnetic structures at low temperatures. PMID:27366599

  16. 3D surface voxel tracing corrector for accurate bone segmentation.

    PubMed

    Guo, Haoyan; Song, Sicong; Wang, Jinke; Guo, Maozu; Cheng, Yuanzhi; Wang, Yadong; Tamura, Shinichi

    2018-06-18

    For extremely close bones, their boundaries are weak and diffused due to strong interaction between adjacent surfaces. These factors prevent the accurate segmentation of bone structure. To alleviate these difficulties, we propose an automatic method for accurate bone segmentation. The method is based on a consideration of the 3D surface normal direction, which is used to detect the bone boundary in 3D CT images. Our segmentation method is divided into three main stages. Firstly, we consider a surface tracing corrector combined with the Gaussian standard deviation σ to improve the estimation of the normal direction. Secondly, we determine an optimal value of σ for each surface point during this normal direction correction. Thirdly, we construct the 1D signal and refine the rough boundary along the corrected normal direction. The value of σ is used in the first directional derivative of the Gaussian to refine the location of the edge point along the accurate normal direction. Because the normal direction is corrected and the value of σ is optimized, our method is robust to noisy images and the narrow joint spaces caused by joint degeneration. We applied our method to 15 wrists and 50 hip joints for evaluation. In the wrist segmentation, a Dice overlap coefficient (DOC) of [Formula: see text]% was obtained by our method. In the hip segmentation, fivefold cross-validations were performed for two state-of-the-art methods. Forty hip joints were used for training in the two state-of-the-art methods, and 10 hip joints were used for testing and comparison. The DOCs of [Formula: see text], [Formula: see text]%, and [Formula: see text]% were achieved by our method for the pelvis, the left femoral head and the right femoral head, respectively. Our method was shown to improve segmentation accuracy for several specific challenging cases. The results demonstrate that our approach achieved a superior accuracy over the two state-of-the-art methods.
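
    The boundary-refinement step described above, locating the edge along a 1D intensity profile sampled in the corrected normal direction via the first derivative of a Gaussian, can be sketched as follows; the Python/SciPy code and the synthetic profile are a simplified stand-in for the authors' pipeline, with hypothetical intensity values and sigma.

      import numpy as np
      from scipy.ndimage import gaussian_filter1d

      def refine_edge_position(profile, sigma):
          """Locate the boundary along a 1D intensity profile sampled along the
          surface normal: the edge is taken where the derivative-of-Gaussian
          response (Gaussian-smoothed first derivative) has its largest magnitude."""
          response = gaussian_filter1d(profile.astype(float), sigma=sigma, order=1)
          return int(np.argmax(np.abs(response)))

      # Hypothetical noisy step profile: soft tissue (~100 HU) to bone (~700 HU) at index 25.
      rng = np.random.default_rng(0)
      profile = np.concatenate([np.full(25, 100.0), np.full(25, 700.0)])
      profile += rng.normal(0.0, 20.0, size=profile.size)
      print(refine_edge_position(profile, sigma=2.0))   # prints an index near 25, despite the noise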

  17. Comparative Omics-Driven Genome Annotation Refinement: Application across Yersiniae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutledge, Alexandra C.; Jones, Marcus B.; Chauhan, Sadhana

    2012-03-27

    Genome sequencing continues to be a rapidly evolving technology, yet most downstream aspects of genome annotation pipelines remain relatively stable or are even being abandoned. To date, the perceived value of manual curation for genome annotations is not offset by the real cost and time associated with the process. In order to balance the large number of sequences generated, the annotation process is now performed almost exclusively in an automated fashion for most genome sequencing projects. One possible way to reduce errors inherent to automated computational annotations is to apply data from 'omics' measurements (i.e. transcriptional and proteomic) to the un-annotated genome with a proteogenomic-based approach. This approach does require additional experimental and bioinformatics methods to include omics technologies; however, the approach is readily automatable and can benefit from rapid developments occurring in those research domains as well. The annotation process can be improved by experimental validation of transcription and translation and aid in the discovery of annotation errors. Here the concept of annotation refinement has been extended to include a comparative assessment of genomes across closely related species, as is becoming common in sequencing efforts. Transcriptomic and proteomic data derived from three highly similar pathogenic Yersiniae (Y. pestis CO92, Y. pestis pestoides F, and Y. pseudotuberculosis PB1/+) was used to demonstrate a comprehensive comparative omic-based annotation methodology. Peptide and oligo measurements experimentally validated the expression of nearly 40% of each strain's predicted proteome and revealed the identification of 28 novel and 68 previously incorrect protein-coding sequences (e.g., observed frameshifts, extended start sites, and translated pseudogenes) within the three current Yersinia genome annotations. Gene loss is presumed to play a major role in Y. pestis acquiring its niche as a virulent pathogen, thus the discovery of many translated pseudogenes underscores a need for functional analyses to investigate hypotheses related to divergence. Refinements included the discovery of a seemingly essential ribosomal protein, several virulence-associated factors, and a transcriptional regulator, among other proteins, most of which are annotated as hypothetical, that were missed during annotation.

  18. Transition state-finding strategies for use with the growing string method.

    PubMed

    Goodrow, Anthony; Bell, Alexis T; Head-Gordon, Martin

    2009-06-28

    Efficient identification of transition states is important for understanding reaction mechanisms. Most transition state search algorithms require long computational times and a good estimate of the transition state structure in order to converge, particularly for complex reaction systems. The growing string method (GSM) [B. Peters et al., J. Chem. Phys. 120, 7877 (2004)] does not require an initial guess of the transition state; however, the calculation is still computationally intensive due to repeated calls to the quantum mechanics code. Recent modifications to the GSM [A. Goodrow et al., J. Chem. Phys. 129, 174109 (2008)] have reduced the total computational time for converging to a transition state by a factor of 2 to 3. In this work, three transition state-finding strategies have been developed to complement the speedup of the modified-GSM: (1) a hybrid strategy, (2) an energy-weighted strategy, and (3) a substring strategy. The hybrid strategy initiates the string calculation at a low level of theory (HF/STO-3G), which is then refined at a higher level of theory (B3LYP/6-31G(*)). The energy-weighted strategy spaces points along the reaction pathway based on the energy at those points, leading to a higher density of points where the energy is highest and finer resolution of the transition state. The substring strategy is similar to the hybrid strategy, but only a portion of the low-level string is refined using a higher level of theory. These three strategies have been used with the modified-GSM and are compared in three reactions: alanine dipeptide isomerization, H-abstraction in methanol oxidation on VO(x)/SiO(2) catalysts, and C-H bond activation in the oxidative carbonylation of toluene to p-toluic acid on Rh(CO)(2)(TFA)(3) catalysts. In each of these examples, the substring strategy was proved most effective by obtaining a better estimate of the transition state structure and reducing the total computational time by a factor of 2 to 3 compared to the modified-GSM. The applicability of the substring strategy has been extended to three additional examples: cyclopropane rearrangement to propylene, isomerization of methylcyclopropane to four different stereoisomers, and the bimolecular Diels-Alder condensation of 1,3-butadiene and ethylene to cyclohexene. Thus, the substring strategy used in combination with the modified-GSM has been demonstrated to be an efficient transition state-finding strategy for a wide range of types of reactions.

  19. Simulation of an Isolated Tiltrotor in Hover with an Unstructured Overset-Grid RANS Solver

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, Elizabeth M.; Biedron, Robert T.

    2009-01-01

    An unstructured overset-grid Reynolds Averaged Navier-Stokes (RANS) solver, FUN3D, is used to simulate an isolated tiltrotor in hover. An overview of the computational method is presented as well as the details of the overset-grid systems. Steady-state computations within a noninertial reference frame define the performance trends of the rotor across a range of the experimental collective settings. Results are presented to show the effects of off-body grid refinement and blade grid refinement. The computed performance and blade loading trends show good agreement with experimental results and previously published structured overset-grid computations. Off-body flow features indicate a significant improvement in the resolution of the first perpendicular blade vortex interaction with background grid refinement across the collective range. Considering experimental data uncertainty and effects of transition, the prediction of figure of merit on the baseline and refined grid is reasonable at the higher collective range, within 3 percent of the measured values. At the lower collective settings, the computed figure of merit is approximately 6 percent lower than the experimental data. A comparison of steady and unsteady results shows that with temporal refinement, the dynamic results closely match the steady-state noninertial results, which gives confidence in the accuracy of the dynamic overset-grid approach.

  20. Towards solution and refinement of organic crystal structures by fitting to the atomic pair distribution function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prill, Dragica; Juhas, Pavol; Billinge, Simon J. L.

    2016-01-01

    In this study, a method towards the solution and refinement of organic crystal structures by fitting to the atomic pair distribution function (PDF) is developed. Approximate lattice parameters and molecular geometry must be given as input. The molecule is generally treated as a rigid body. The positions and orientations of the molecules inside the unit cell are optimized starting from random values. The PDF is obtained from carefully measured X-ray powder diffraction data. The method resembles 'real-space' methods for structure solution from powder data, but works with PDF data instead of the diffraction pattern itself. As such it may be used in situations where the organic compounds are not long-range-ordered, are poorly crystalline, or nanocrystalline. The procedure was applied to solve and refine the crystal structures of quinacridone (β phase), naphthalene and allopurinol. In the case of allopurinol it was even possible to successfully solve and refine the structure in P1 with four independent molecules. As an example of a flexible molecule, the crystal structure of paracetamol was refined using restraints for bond lengths, bond angles and selected torsion angles. In all cases, the resulting structures are in excellent agreement with structures from single-crystal data.

  1. Multimodality 3D Superposition and Automated Whole Brain Tractography: Comprehensive Printing of the Functional Brain

    PubMed Central

    Brimley, Cameron J; Sublett, Jesna Mathew; Stefanowicz, Edward; Flora, Sarah; Mongelluzzo, Gino; Schirmer, Clemens M

    2017-01-01

    Whole brain tractography using diffusion tensor imaging (DTI) sequences can be used to map cerebral connectivity; however, this can be time-consuming due to the manual component of image manipulation required, calling for the need for a standardized, automated, and accurate fiber tracking protocol with automatic whole brain tractography (AWBT). Interpreting conventional two-dimensional (2D) images, such as computed tomography (CT) and magnetic resonance imaging (MRI), as an intraoperative three-dimensional (3D) environment is a difficult task with recognized inter-operator variability. Three-dimensional printing in neurosurgery has gained significant traction in the past decade, and as software, equipment, and practices become more refined, trainee education, surgical skills, research endeavors, innovation, patient education, and outcomes via valued care is projected to improve. We describe a novel multimodality 3D superposition (MMTS) technique, which fuses multiple imaging sequences alongside cerebral tractography into one patient-specific 3D printed model. Inferences on cost and improved outcomes fueled by encouraging patient engagement are explored. PMID:29201580

  2. Rethinking Physics for Biologists: A design-based research approach

    NASA Astrophysics Data System (ADS)

    Sawtelle, Vashti

    2015-03-01

    Biology majors at the University of Maryland are required to take courses in biology, chemistry, and physics - but they often see these courses as disconnected. Over the past three years the NEXUS/Physics course has been working to develop an interdisciplinary learning environment that bridges the disciplinary domains of biology and physics. Across the three years we have gone from teaching in a small class with one instructor to teaching in a large lecture hall with multiple instructors. We have used a design-based research approach to support critical reflection on the course at multiple time scales. In this presentation I will detail our process of collecting systematic data, listening to and valuing students' reasoning, and bridging diverse perspectives. I will demonstrate how this process led to improved curricular design, refined assessment objectives, and new design heuristics. This work is supported by NSF-TUES DUE 11-22818, the HHMI NEXUS grant, and a NSF Graduate Research Fellowship (DGE 0750616).

  3. Multimodality 3D Superposition and Automated Whole Brain Tractography: Comprehensive Printing of the Functional Brain.

    PubMed

    Konakondla, Sanjay; Brimley, Cameron J; Sublett, Jesna Mathew; Stefanowicz, Edward; Flora, Sarah; Mongelluzzo, Gino; Schirmer, Clemens M

    2017-09-29

    Whole brain tractography using diffusion tensor imaging (DTI) sequences can be used to map cerebral connectivity; however, this can be time-consuming due to the manual component of image manipulation required, calling for the need for a standardized, automated, and accurate fiber tracking protocol with automatic whole brain tractography (AWBT). Interpreting conventional two-dimensional (2D) images, such as computed tomography (CT) and magnetic resonance imaging (MRI), as an intraoperative three-dimensional (3D) environment is a difficult task with recognized inter-operator variability. Three-dimensional printing in neurosurgery has gained significant traction in the past decade, and as software, equipment, and practices become more refined, trainee education, surgical skills, research endeavors, innovation, patient education, and outcomes via valued care is projected to improve. We describe a novel multimodality 3D superposition (MMTS) technique, which fuses multiple imaging sequences alongside cerebral tractography into one patient-specific 3D printed model. Inferences on cost and improved outcomes fueled by encouraging patient engagement are explored.

  4. Implementation of a three-qubit refined Deutsch Jozsa algorithm using SFG quantum logic gates

    NASA Astrophysics Data System (ADS)

    DelDuce, A.; Savory, S.; Bayvel, P.

    2006-05-01

    In this paper we present a quantum logic circuit which can be used for the experimental demonstration of a three-qubit solid state quantum computer based on a recent proposal of optically driven quantum logic gates. In these gates, the entanglement of randomly placed electron spin qubits is manipulated by optical excitation of control electrons. The circuit we describe solves the Deutsch problem with an improved algorithm called the refined Deutsch-Jozsa algorithm. We show that it is possible to select optical pulses that solve the Deutsch problem correctly, and do so without losing quantum information to the control electrons, even though the gate parameters vary substantially from one gate to another.
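
    For context on the algorithm the circuit implements, the sketch below (Python with NumPy) simulates the refined, ancilla-free phase-oracle form of the Deutsch-Jozsa algorithm on a plain state vector; it is a generic textbook simulation and does not model the SFG optically driven gates proposed in the paper.

      import numpy as np

      H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

      def hadamard_n(n):
          """Tensor product of n single-qubit Hadamards."""
          M = np.array([[1.0]])
          for _ in range(n):
              M = np.kron(M, H)
          return M

      def refined_deutsch_jozsa(f, n):
          """Phase-oracle ('refined') Deutsch-Jozsa on n qubits, no ancilla."""
          Hn = hadamard_n(n)
          state = np.zeros(2 ** n); state[0] = 1.0          # |0...0>
          state = Hn @ state                                 # uniform superposition
          phases = np.array([(-1) ** f(x) for x in range(2 ** n)], dtype=float)
          state = phases * state                             # oracle: |x> -> (-1)^f(x)|x>
          state = Hn @ state
          p_zero = abs(state[0]) ** 2                        # probability of measuring |0...0>
          return "constant" if p_zero > 0.5 else "balanced"

      print(refined_deutsch_jozsa(lambda x: 0, 3))                        # constant
      print(refined_deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))    # balanced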

  5. Designs and test results for three new rotational sensors

    USGS Publications Warehouse

    Jedlicka, P.; Kozak, J.T.; Evans, J.R.; Hutt, C.R.

    2012-01-01

    We discuss the designs and testing of three rotational seismometer prototypes developed at the Institute of Geophysics, Academy of Sciences (Prague, Czech Republic). Two of these designs consist of a liquid-filled toroidal tube with the liquid as the proof mass and providing damping; we tested the piezoelectric and pressure transduction versions of this torus. The third design is a wheel-shaped solid metal inertial sensor with capacitive sensing and magnetic damping. Our results from testing in Prague and at the Albuquerque Seismological Laboratory of the US Geological Survey of transfer function and cross-axis sensitivities are good enough to justify the refinement and subsequent testing of advanced prototypes. These refinements and new testing are well along.

  6. Designs and test results for three new rotational sensors

    NASA Astrophysics Data System (ADS)

    Jedlička, P.; Kozák, J. T.; Evans, J. R.; Hutt, C. R.

    2012-10-01

    We discuss the designs and testing of three rotational seismometer prototypes developed at the Institute of Geophysics, Academy of Sciences (Prague, Czech Republic). Two of these designs consist of a liquid-filled toroidal tube with the liquid as the proof mass and providing damping; we tested the piezoelectric and pressure transduction versions of this torus. The third design is a wheel-shaped solid metal inertial sensor with capacitive sensing and magnetic damping. Our results from testing in Prague and at the Albuquerque Seismological Laboratory of the US Geological Survey of transfer function and cross-axis sensitivities are good enough to justify the refinement and subsequent testing of advanced prototypes. These refinements and new testing are well along.

  7. The value of Institute of Human Virology meeting abstracts and beyond

    PubMed Central

    Jeang, Kuan-Teh

    2005-01-01

    This month Retrovirology publishes the meeting abstracts from the 10th annual Institute of Human Virology conference held August 29th to September 2nd, 2005 in Baltimore, Maryland, USA. In this editorial, the rationale for publishing meeting abstracts is discussed.

  8. Refining glass structure in two dimensions

    NASA Astrophysics Data System (ADS)

    Sadjadi, Mahdi; Bhattarai, Bishal; Drabold, D. A.; Thorpe, M. F.; Wilson, Mark

    2017-11-01

    Recently determined atomistic scale structures of near-two-dimensional bilayers of vitreous silica (using scanning probe and electron microscopy) allow us to refine the experimentally determined coordinates to incorporate the known local chemistry more precisely. Further refinement is achieved by using classical potentials of varying complexity: one using harmonic potentials and the second employing an electrostatic description incorporating polarization effects. These are benchmarked against density functional calculations. Our main findings are that (a) there is a symmetry plane between the two disordered layers, a nice example of an emergent phenomenon, (b) the layers are slightly tilted so that the Si-O-Si angle between the two layers is not 180∘ as originally thought but rather 175 ±2∘, and (c) while interior areas that are not completely imaged can be reliably reconstructed, surface areas are more problematic. It is shown that small crystallites that appear are just as expected statistically in a continuous random network. This provides a good example of the value that can be added to disordered structures imaged at the atomic level by implementing computer refinement.

  9. Instrumental resolution as a function of scattering angle and wavelength as exemplified for the POWGEN instrument

    PubMed Central

    Jacobs, Philipp; Houben, Andreas; Schweika, Werner; Tchougréeff, Andrei L.; Dronskowski, Richard

    2017-01-01

    The method of angular- and wavelength-dispersive (e.g. two-dimensional) Rietveld refinement is a new and emerging tool for the analysis of neutron diffraction data measured at time-of-flight instruments with large area detectors. Following the approach for one-dimensional refinements (using either scattering angle or time of flight), the first step at each beam time cycle is the calibration of the instrument including the determination of instrumental contributions to the peak shape variation to be expected for diffraction patterns measured by the users. The aim of this work is to provide the users with calibration files and – for the later Rietveld refinement of the measured data – with an instrumental resolution file (IRF). This article will elaborate on the necessary steps to generate such an IRF for the angular- and wavelength-dispersive case, exemplified for the POWGEN instrument. A dataset measured on a standard diamond sample is used to extract the profile function in the two-dimensional case. It is found that the variation of reflection width with 2θ and λ can be expressed by the standard equation used for evaluating the instrumental resolution, which yields a substantially more fundamental approach to the parameterization of the instrumental contribution to the peak shape. Geometrical considerations of the POWGEN instrument and sample effects lead to values for Δθ, Δt and ΔL that yield a very good match to the extracted FWHM values. In a final step the refinement results are compared with the one-dimensional, i.e. diffraction-focused, case. PMID:28656041
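
    As a concrete illustration of the resolution expression referred to above, the sketch below combines the timing, angular and flight-path uncertainties Δt, Δθ and ΔL in quadrature to give Δd/d for a time-of-flight powder diffractometer. The numerical values are placeholders for illustration only, not POWGEN calibration constants.

```python
import numpy as np

def relative_d_resolution(theta_rad, delta_theta_rad, t, delta_t, L, delta_L):
    """Relative resolution Delta_d/d of a time-of-flight powder diffractometer.

    Combines timing, angular and flight-path uncertainties in quadrature,
    the standard expression referred to in the abstract:
        (Dd/d)^2 = (Dt/t)^2 + (cot(theta) * Dtheta)^2 + (DL/L)^2
    """
    return np.sqrt((delta_t / t) ** 2
                   + (delta_theta_rad / np.tan(theta_rad)) ** 2
                   + (delta_L / L) ** 2)

# Illustrative numbers only, not POWGEN calibration values:
# theta = 45 deg, dtheta = 0.2 deg, t = 10 ms, dt = 5 us, L = 60 m, dL = 2 cm
print(relative_d_resolution(np.radians(45.0), np.radians(0.2),
                            10e-3, 5e-6, 60.0, 0.02))
```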

  10. Reporting funding source or conflict of interest in abstracts of randomized controlled trials, no evidence of a large impact on general practitioners' confidence in conclusions, a three-arm randomized controlled trial.

    PubMed

    Buffel du Vaure, Céline; Boutron, Isabelle; Perrodeau, Elodie; Ravaud, Philippe

    2014-04-28

    Systematic reporting of funding sources is recommended in the CONSORT Statement for abstracts. However, no specific recommendation is related to the reporting of conflicts of interest (CoI). The objective was to compare physicians' confidence in the conclusions of abstracts of randomized controlled trials of pharmaceutical treatment indexed in PubMed. We planned a three-arm parallel-group randomized trial. French general practitioners (GPs) were invited to participate and were blinded to the study's aim. We used a representative sample of 75 abstracts of pharmaceutical industry-funded randomized controlled trials published in 2010 and indexed in PubMed. Each abstract was standardized and reported in three formats: 1) no mention of the funding source or CoI; 2) reporting the funding source only; and 3) reporting the funding source and CoI. GPs were randomized according to a computerized randomization on a secure Internet system at a 1:1:1 ratio to assess one abstract among the three formats. The primary outcome was GPs' confidence in the abstract conclusions (0, not at all, to 10, completely confident). The study was planned to detect a large difference with an effect size of 0.5. Between October 2012 and June 2013, among 605 GPs contacted, 354 were randomized, 118 for each type of abstract. The mean difference (95% confidence interval) in GPs' confidence in abstract findings was 0.2 (-0.6; 1.0) (P = 0.84) for abstracts reporting the funding source only versus no funding source or CoI; -0.4 (-1.3; 0.4) (P = 0.39) for abstracts reporting the funding source and CoI versus no funding source and CoI; and -0.6 (-1.5; 0.2) (P = 0.15) for abstracts reporting the funding source and CoI versus the funding source only. We found no evidence of a large impact of trial report abstracts mentioning funding sources or CoI on GPs' confidence in the conclusions of the abstracts. ClinicalTrials.gov identifier: NCT01679873.
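
    The trial's primary analysis reports differences in mean confidence scores with 95% confidence intervals. The following sketch shows one standard way such an estimate can be computed (a Welch-type interval for the difference of two group means); the group data are simulated placeholders, and this is not the authors' analysis code.

```python
import numpy as np
from scipy import stats

def mean_difference_ci(a, b, alpha=0.05):
    """Difference in means (a - b) with a Welch-type 95% CI, as for a
    primary outcome measured on a 0-10 confidence scale."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    # Welch-Satterthwaite degrees of freedom
    df = se ** 4 / ((a.var(ddof=1) / len(a)) ** 2 / (len(a) - 1)
                    + (b.var(ddof=1) / len(b)) ** 2 / (len(b) - 1))
    tcrit = stats.t.ppf(1 - alpha / 2, df)
    return diff, (diff - tcrit * se, diff + tcrit * se)

# Hypothetical scores for two abstract formats (not the trial data):
rng = np.random.default_rng(0)
funding_only = rng.normal(6.2, 2.0, 118)
no_disclosure = rng.normal(6.0, 2.0, 118)
print(mean_difference_ci(funding_only, no_disclosure))
```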

  11. Weak data do not make a free lunch, only a cheap meal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew, E-mail: dauter@anl.gov

    2014-02-01

    Refinement and analysis of four structures with various data resolution cutoffs suggests that at present there are no reliable criteria for judging the diffraction data resolution limit and the condition I/σ(I) = 2.0 is reasonable. However, extending the limit by about 0.2 Å beyond the resolution defined by this threshold does not deteriorate the quality of refined structures and in some cases may be beneficial. Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as R_merge and I/σ(I), optical resolution and the correlation coefficients CC_1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and R_free as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting a too conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.
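
    A minimal sketch of how an I/σ(I)-based cutoff of the kind discussed above can be applied in practice: per-shell statistics are scanned and the highest-resolution shell whose mean I/σ(I) still reaches the threshold (conventionally about 2.0) is retained. The shell values and the helper function are hypothetical, not taken from the four structures analysed.

```python
def resolution_at_i_over_sigma(shell_resolution, shell_i_over_sigma, threshold=2.0):
    """Pick the highest-resolution shell whose mean I/sigma(I) still meets
    the threshold (the conventional ~2.0 criterion discussed above).

    shell_resolution   : high-resolution limits of the shells in angstroms.
    shell_i_over_sigma : mean I/sigma(I) per shell, in the same order.
    """
    kept = [d for d, snr in zip(shell_resolution, shell_i_over_sigma)
            if snr >= threshold]
    return min(kept) if kept else None  # smallest d-spacing still above threshold

# Hypothetical shell statistics (angstrom, mean I/sigma):
d_limits = [3.0, 2.5, 2.2, 2.0, 1.9, 1.8]
i_sig    = [25.1, 14.3, 7.6, 3.9, 2.1, 1.4]
cutoff = resolution_at_i_over_sigma(d_limits, i_sig)
print(cutoff)                             # 1.9 A with these numbers
print(cutoff - 0.2 if cutoff else None)   # the ~0.2 A extension explored above
```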

  12. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    PubMed Central

    Mathew, Cherian; Obst, Matthias; Vicario, Saverio; Haines, Robert; Williams, Alan R.; de Jong, Yde; Goble, Carole

    2014-01-01

    Abstract The compilation and cleaning of data needed for analyses and prediction of species distributions is a time consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users. PMID:25535486

  13. Black holes as quantum gravity condensates

    NASA Astrophysics Data System (ADS)

    Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo

    2018-03-01

    We model spherically symmetric black holes within the group field theory formalism for quantum gravity via generalized condensate states, involving sums over arbitrarily refined graphs (dual to three-dimensional triangulations). The construction relies heavily on both the combinatorial tools of random tensor models and the quantum geometric data of loop quantum gravity, both part of the group field theory formalism. Armed with the detailed microscopic structure, we compute the entropy associated with the black hole horizon, which turns out to be equivalently the Boltzmann entropy of its microscopic degrees of freedom and the entanglement entropy between the inside and outside regions. We recover the area law under very general conditions, as well as the Bekenstein-Hawking formula. The result is also shown to be generically independent of any specific value of the Immirzi parameter.
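
    For reference, the Bekenstein-Hawking formula recovered in the paper can be evaluated numerically as S = k_B c³ A / (4 G ħ). The short sketch below does this for a Schwarzschild horizon, purely as an order-of-magnitude and units check, not as part of the group-field-theory computation.

```python
import math

# Physical constants (SI)
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m s^-1
hbar = 1.055e-34   # J s
k_B = 1.381e-23    # J K^-1

def bekenstein_hawking_entropy(mass_kg):
    """Bekenstein-Hawking entropy S = k_B c^3 A / (4 G hbar) for a
    Schwarzschild horizon of area A = 16 pi G^2 M^2 / c^4."""
    area = 16.0 * math.pi * (G * mass_kg) ** 2 / c ** 4
    return k_B * c ** 3 * area / (4.0 * G * hbar)

M_sun = 1.989e30  # kg
print(bekenstein_hawking_entropy(M_sun))  # ~1.5e54 J/K for a solar-mass hole
```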

  14. An efficient cloud detection method for high resolution remote sensing panchromatic imagery

    NASA Astrophysics Data System (ADS)

    Li, Chaowei; Lin, Zaiping; Deng, Xinpu

    2018-04-01

    To increase the accuracy of cloud detection in remote sensing satellite imagery, we propose an efficient cloud detection method for panchromatic images. The method consists of three main steps. First, an adaptive intensity threshold combined with a median filter is used to extract coarse cloud regions. Second, a guided filtering step strengthens differences in textural features, after which texture is detected via a gray-level co-occurrence matrix computed on the resulting texture-detail image. Finally, candidate cloud regions are obtained as the intersection of the two coarse cloud regions above, and an adaptive morphological dilation refines them to recover thin clouds at region boundaries. Experimental results demonstrate the effectiveness of the proposed method.
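
    A much-simplified sketch of the thresholding-plus-morphology part of the pipeline described above (the guided-filter and gray-level co-occurrence texture steps are omitted). The threshold rule, filter sizes and the synthetic scene are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def coarse_cloud_mask(panchromatic, k=1.5, median_size=5, dilate_iter=2):
    """Simplified cloud mask: median filter + adaptive intensity threshold
    for coarse cloud regions, then morphological dilation to recover thin
    cloud boundaries."""
    img = ndimage.median_filter(panchromatic.astype(float), size=median_size)
    threshold = img.mean() + k * img.std()   # simple adaptive threshold
    coarse = img > threshold
    return ndimage.binary_dilation(coarse, iterations=dilate_iter)

# Synthetic example: bright "clouds" on a darker background
rng = np.random.default_rng(1)
scene = rng.normal(100, 10, (256, 256))
scene[60:120, 80:160] += 80          # a bright cloudy patch
mask = coarse_cloud_mask(scene)
print(mask.sum(), "pixels flagged as cloud")
```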

  15. Exploring the Potential of PROBA-V for Evapotranspiration Monitoring in Wetlands

    NASA Astrophysics Data System (ADS)

    Barrios, Jose Miguel; Ghilain, Nicolas; Arboleda, Alirio; Gellens-Meulenberghs, Francoise

    2016-08-01

    This study aims at deriving daily evapotranspiration (ET) estimates at a spatial resolution convenient for ecosystem monitoring. The methodological approach is based on computing the energy balance over the study sites. The study explores the potential of integrating remote sensing (RS) products derived from the Meteosat Second Generation (MSG) satellite, valuable for their high temporal resolution, with Proba-V data, which supply moderate spatial resolution. This strategy was tested for the year 2014 on three wetland sites in Europe where eddy covariance measurements were available for validation. The modelled results correlated well with the validation data and showed the added value of combining the strengths of different satellite missions. The results open interesting perspectives for refining this approach with the upcoming Sentinel-3 datasets.
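
    The energy-balance approach mentioned above estimates the latent heat flux as the residual LE = Rn - G - H and converts it to an equivalent depth of evaporated water. The sketch below shows that conversion with placeholder daily fluxes; it is not the MSG/Proba-V processing chain itself.

```python
def daily_evapotranspiration(rn, g, h, latent_heat=2.45e6):
    """Energy-balance residual estimate of ET (all fluxes in W m-2,
    daily averages).

    LE = Rn - G - H ; ET [mm/day] = LE * 86400 / lambda_v
    (1 kg of evaporated water per m^2 corresponds to 1 mm of depth).
    """
    le = rn - g - h                      # latent heat flux, W m-2
    return le * 86400.0 / latent_heat    # mm per day

# Hypothetical daily mean fluxes for a wetland site (not MSG/Proba-V output):
print(daily_evapotranspiration(rn=160.0, g=15.0, h=55.0))  # ~3.2 mm/day
```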

  16. Numerical Simulations of As-Extruded Mg Matrix Composites Interpenetrated by Metal Reinforcement

    NASA Astrophysics Data System (ADS)

    Y Wang, H.; Wang, S. R.; Yang, X. F.; Li, P.

    2017-12-01

    Interpenetrating magnesium matrix composites reinforced with a three-dimensional braided stainless steel wire preform were fabricated and investigated. Extrusion of the composites under different conditions was carried out and simulated by the finite element method using the DEFORM-3D software. The results show that the matrix and reinforcement form a good interfacial bond and that the grains are refined by extrusion and by the influence of the reinforcement, consistent with the enhanced strength and reduced plasticity. Bonding between the matrix and reinforcement is strengthened in the region of the extrusion chamber that experiences large strain and intense stress, and the effective stress in the material increases continuously with increasing extrusion ratio and decreasing extrusion speed until it reaches a stable value.

  17. A technician-delivered 'virtual clinic' for triaging low-risk glaucoma referrals.

    PubMed

    Kotecha, A; Brookes, J; Foster, P J

    2017-06-01

    Purpose: The purpose of this study is to describe the outcomes of a technician-delivered glaucoma referral triaging service with 'virtual review' of resultant data by a consultant ophthalmologist. Patients and methods: The Glaucoma Screening Clinic reviewed new optometrist or GP-initiated glaucoma suspect referrals into a specialist ophthalmic hospital. Patients underwent testing by three ophthalmic technicians in a dedicated clinical facility. Data were reviewed at a different time and date by a consultant glaucoma ophthalmologist. Approximately 10% of discharged patients were reviewed in a face-to-face consultant-led clinic to examine the false-negative rate of the service. Results: Between 1 March 2014 and 31 March 2016, 1380 patients were seen in the clinic. The number of patients discharged following consultant virtual review was 855 (62%). The positive predictive value of onward referrals was 84%. Three of the 82 patients brought back for face-to-face review were deemed to require treatment, equating to a negative predictive value of 96%. Conclusions: Our technician-delivered glaucoma referral triaging clinic incorporates consultant 'virtual review' to provide a service model that significantly reduces the number of onward referrals into the glaucoma outpatient department. This model may be an alternative to departments where there are difficulties in implementing optometrist-led community-based referral refinement schemes.
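
    The positive and negative predictive values quoted above follow directly from the triage outcome counts. A minimal sketch is shown below; the counts are illustrative placeholders chosen only to reproduce the reported 84% and 96% figures, not the clinic's raw data.

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive values from triage outcomes.

    PPV = TP / (TP + FP): proportion of onward referrals with true disease.
    NPV = TN / (TN + FN): proportion of discharges that were truly disease-free.
    """
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Illustrative counts only; with the reported face-to-face audit,
# 79 of 82 reviewed discharges needed no treatment:
print(predictive_values(tp=84, fp=16, tn=79, fn=3))
```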

  18. Application of construal level and value-belief norm theories to undergraduate decision-making on a wildlife socio-scientific issue

    NASA Astrophysics Data System (ADS)

    Sutter, A. McKinzie; Dauer, Jenny M.; Forbes, Cory T.

    2018-06-01

    One aim of science education is to develop scientific literacy for decision-making in daily life. Socio-scientific issues (SSI) and structured decision-making frameworks can help students reach these objectives. This research uses value belief norm (VBN) theory and construal level theory (CLT) to explore students' use of personal values in their decision-making processes and the relationship between abstract and concrete problematization and their decision-making. Using mixed methods, we conclude that the level of abstraction with which students problematise a prairie dog agricultural production and ecosystem preservation issue has a significant relationship to the values students used in the decision-making process. However, neither abstraction of the problem statement nor students' surveyed value orientations were significantly related to students' final decisions. These results may help inform teachers' understanding of students and their use of a structured-decision making tool in a classroom, and aid researchers in understanding if these tools help students remain objective in their analyses of complex SSIs.

  19. Optimization of Bleaching Parameters in Refining Process of Kenaf Seed Oil with a Central Composite Design Model.

    PubMed

    Chew, Sook Chin; Tan, Chin Ping; Nyam, Kar Lin

    2017-07-01

    Kenaf seed oil has been suggested for use as a nutritious edible oil due to its unique fatty acid composition and nutritional value. The objective of this study was to optimize the bleaching parameters of the chemical refining process for kenaf seed oil, namely the concentration of bleaching earth (0.5 to 2.5% w/w), temperature (30 to 110 °C) and time (5 to 65 min), based on the responses of total oxidation value (TOTOX) and color reduction, using response surface methodology. The results indicated that the corresponding response surface models were highly statistically significant (P < 0.0001) and sufficient to describe and predict TOTOX value and color reduction, with R² of 0.9713 and 0.9388, respectively. The optimal parameters in the bleaching stage of kenaf seed oil were: 1.5% w/w concentration of bleaching earth, temperature of 70 °C, and time of 40 min. These optimum parameters produced bleached kenaf seed oil with a TOTOX value of 8.09 and a color reduction of 32.95%. There were no significant differences (P > 0.05) between experimental and predicted values, indicating the adequacy of the fitted models. © 2017 Institute of Food Technologists®.
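
    Response surface methodology with a central composite design typically fits a full second-order polynomial in the three factors. The sketch below fits such a model by least squares; the design points, TOTOX responses and helper names are hypothetical, not the paper's data or software.

```python
import numpy as np

def _design_row(x):
    """Second-order model terms for one run: intercept, linear, square, interaction."""
    a, b, c = x
    return np.array([1.0, a, b, c, a * a, b * b, c * c, a * b, a * c, b * c])

def fit_quadratic_response_surface(X, y):
    """Least-squares fit of the full second-order (RSM) model
        y = b0 + sum(bi xi) + sum(bii xi^2) + sum(bij xi xj)
    for three factors, e.g. bleaching-earth dose, temperature and time."""
    design = np.vstack([_design_row(x) for x in X])
    coef, *_ = np.linalg.lstsq(design, np.asarray(y, float), rcond=None)
    return coef

# Hypothetical runs (dose % w/w, temperature C, time min) -> TOTOX value:
X = [(0.5, 30, 5), (1.5, 70, 40), (2.5, 110, 65), (1.5, 30, 65),
     (0.5, 110, 40), (2.5, 70, 5), (1.5, 70, 40), (2.5, 30, 40),
     (0.5, 70, 65), (1.5, 110, 5), (0.5, 50, 20), (2.0, 90, 50)]
y = [12.1, 8.1, 9.4, 9.0, 10.2, 11.0, 8.2, 9.8, 9.5, 10.9, 11.5, 8.8]
coef = fit_quadratic_response_surface(X, y)
print("Predicted TOTOX at (1.5%, 70 C, 40 min):",
      _design_row((1.5, 70, 40)) @ coef)
```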

  20. Heart rate control in normal and aborted-SIDS infants.

    PubMed

    Pincus, S M; Cummins, T R; Haddad, G G

    1993-03-01

    Approximate entropy (ApEn), a mathematical formula quantifying regularity in data, was applied to heart rate data from normal and aborted-sudden infant death syndrome (SIDS) infants. We distinguished quiet from rapid-eye-movement (REM) sleep via the following three criteria, refining the notion of REM as more "variable": 1) REM sleep has greater overall variability (0.0374 +/- 0.0138 vs. 0.0205 +/- 0.0090 s, P < 0.005); 2) REM sleep is less stationary (StatAv = 0.742 +/- 0.110) than quiet sleep (StatAv = 0.599 +/- 0.159, P < 0.03); 3) after normalization to overall variability, REM sleep is more regular (ApEnsub = 1.224 +/- 0.092) than quiet sleep (ApEnsub = 1.448 +/- 0.071, P < 0.0001). Fifty percent of aborted-SIDS infants showed greater ApEn instability across quiet sleep than any normal infant exhibited, suggesting that autonomic regulation of heart rate occasionally becomes abnormal in a high-risk subject. There was an association between low ApEn values and aborted-SIDS events; 5 of 14 aborted-SIDS infants had at least one quiet sleep epoch with an ApEn value below the minimum of 45 normal-infant ApEn values.
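
    Approximate entropy as used above can be computed directly from its definition: ApEn(m, r) = Φ(m) - Φ(m+1), where Φ(m) averages the log-fraction of length-m templates that match within tolerance r. A minimal sketch follows, using the common convention r = 0.2 × SD, which may differ from the settings used in the study.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series; smaller values
    indicate a more regular signal.  r is taken as r_factor * SD."""
    x = np.asarray(x, float)
    n = len(x)
    r = r_factor * x.std(ddof=0)

    def phi(m):
        # All overlapping templates of length m
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        counts = []
        for t in templates:
            # Chebyshev distance between template t and every template
            d = np.max(np.abs(templates - t), axis=1)
            counts.append(np.mean(d <= r))
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

# A regular signal should score lower than a noisy one:
t = np.arange(1000)
print(approximate_entropy(np.sin(0.2 * t)))                              # low
print(approximate_entropy(np.random.default_rng(0).normal(size=1000)))   # higher
```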

  1. Determination of mercury in SRM crude oils and refined products by isotope dilution cold vapor ICP-MS using closed-system combustion.

    PubMed

    Kelly, W Robert; Long, Stephen E; Mann, Jacqueline L

    2003-07-01

    Mercury was determined by isotope dilution cold-vapor inductively coupled plasma mass spectrometry (ID-CV-ICP-MS) in four different liquid petroleum SRMs. Samples of approximately 0.3 g were spiked with stable ²⁰¹Hg and wet ashed in a closed system (Carius tube) using 6 g of high-purity nitric acid. Three different types of commercial oils were measured: two Texas crude oils, SRM 2721 (41.7 ± 5.7 pg g⁻¹) and SRM 2722 (129 ± 13 pg g⁻¹), a low-sulfur diesel fuel, SRM 2724b (34 ± 26 pg g⁻¹), and a low-sulfur residual fuel oil, SRM 1619b (3.5 ± 0.74 ng g⁻¹) (mean value and 95% CI). The Hg values for the crude oils and the diesel fuel are the lowest values ever reported for these matrices. The method detection limit, which is ultimately limited by method blank uncertainty, is approximately 10 pg g⁻¹ for a 0.3 g sample. Accurate Hg measurements in petroleum products are needed to assess the contribution to the global Hg cycle and may be needed in the near future to comply with reporting regulations for toxic elements.
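
    Isotope dilution quantifies the analyte from a single measured isotope ratio in the spiked sample. A generic sketch of that calculation is given below; the isotope abundances for the ²⁰¹Hg-enriched spike are placeholders and the natural-abundance values are approximate, so this is illustrative rather than the certified-value computation.

```python
def isotope_dilution_amount(n_spike_mol, r_blend, a_nat=0.2986, b_nat=0.1318,
                            a_spike=0.02, b_spike=0.96):
    """Moles of analyte in the sample from a single isotope-dilution blend.

    r_blend is the measured ratio of reference isotope 'a' (e.g. 202Hg) to
    spike isotope 'b' (e.g. 201Hg) in the spiked sample.  a_*/b_* are the
    abundances of those isotopes in the natural sample and in the enriched
    spike; the natural values are approximate, the spike values placeholders.

        n_sample = n_spike * (a_spike - R * b_spike) / (R * b_nat - a_nat)
    """
    return n_spike_mol * (a_spike - r_blend * b_spike) / (r_blend * b_nat - a_nat)

# Hypothetical blend: 1.0e-12 mol of 201Hg-enriched spike in 0.3 g of oil,
# measured 202Hg/201Hg ratio in the blend of 0.30:
n_hg = isotope_dilution_amount(1.0e-12, 0.30)
print(n_hg * 200.59 / 0.3, "g Hg per g of sample")  # mass fraction of Hg
```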

  2. Location of MTBE and toluene in the channel system of the zeolite mordenite: Adsorption and host-guest interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arletti, Rossella, E-mail: rossella.arletti@unito.it; Martucci, Annalisa; Alberti, Alberto

    This paper reports a study of the location of methyl tertiary butyl ether (MTBE) and toluene molecules adsorbed from aqueous solution in the pores of the organophilic zeolite mordenite. The presence of these organic molecules in the zeolite channels was revealed by structure refinement performed by the Rietveld method. About 3 molecules of MTBE and 3.6 molecules of toluene per unit cell were incorporated into the cavities of mordenite, representing 75% and 80% of the total absorption capacity of this zeolite. In both cases a water molecule was localized inside the side pocket of mordenite. The saturation capacity determined from the adsorption isotherms, obtained by batch experiments, and the weight loss given by thermogravimetric (TG) analyses were in very good agreement with these values. The interatomic distances obtained after the structural refinements suggest that MTBE could be connected to the framework through a water molecule, while toluene could be bonded to framework oxygen atoms. The rapid and high adsorption of these hydrocarbons into the organophilic mordenite makes this cheap and environmentally friendly material a suitable candidate for the removal of these pollutants from water. Graphical abstract: location of MTBE (a) and toluene (b) in mordenite channels (projection along the [001] direction). Highlights: we investigated the MTBE and toluene adsorption process into the organophilic zeolite mordenite; the presence of MTBE and toluene in mordenite was determined by X-ray diffraction studies; about 3 molecules of MTBE and 3.6 molecules of toluene per unit cell were incorporated into the zeolite cavities; MTBE is connected to the framework through a water molecule; toluene is directly bonded to framework oxygen atoms.

  3. Dynamic grid refinement for partial differential equations on parallel computers

    NASA Technical Reports Server (NTRS)

    Mccormick, S.; Quinlan, D.

    1989-01-01

    The fast adaptive composite grid method (FAC) is an algorithm that uses various levels of uniform grids to provide adaptive resolution and fast solution of PDEs. An asynchronous version of FAC, called AFAC, that completely eliminates the bottleneck to parallelism is presented. This paper describes the advantage that this algorithm has in adaptive refinement for moving singularities on multiprocessor computers. This work is applicable to the parallel solution of two- and three-dimensional shock tracking problems.

  4. GRACE-Based Analysis of Total Water Storage Trends and Groundwater Fluctuations in the North-Western Sahara Aquifer System (NWSAS) and Tindouf Aquifer in Northwest Africa

    NASA Astrophysics Data System (ADS)

    Lezzaik, K. A.; Milewski, A.

    2013-12-01

    Optimal water management practices and strategies in arid and semi-arid environments are often hindered by a lack of quantitative and qualitative understanding of hydrological processes. Moreover, progressive overexploitation of groundwater resources to meet agricultural, industrial, and domestic requirements is drawing concern over the sustainability of such exhaustive abstraction levels, especially in environments where groundwater is a major source of water. NASA's GRACE (Gravity Recovery and Climate Experiment) mission, since March 2002, has advanced the understanding of hydrological events, especially groundwater depletion, through integrated measurements and modeling of terrestrial water mass. In this study, GLDAS variables (rainfall rate, evapotranspiration rate, average soil moisture) and TRMM 3B42.V7A precipitation satellite data were used in combination with 95 GRACE-generated gravitational anomaly maps to quantify total water storage change (TWSC) and groundwater storage change (GWSC) from January 2003 to December 2010 (excluding June 2003) in the North-Western Sahara Aquifer System (NWSAS) and Tindouf Aquifer System in northwestern Africa. Separately processed GRACE products from JPL (Jet Propulsion Laboratory, NASA), CSR (Center for Space Research, UT Austin), and GFZ (German Research Centre for Geosciences, Potsdam) were used to determine which GRACE dataset(s) best reflect total water storage and groundwater changes in northwest Africa. First-order estimates of annual TWSC for NWSAS (JPL: +5.297 BCM; CSR: -5.33 BCM; GFZ: -9.96 BCM) and Tindouf Aquifer System (JPL: +1.217 BCM; CSR: +0.203 BCM; GFZ: +1.019 BCM) were computed using zonal averaging over a span of eight years. Preliminary findings of annual GWSC for NWSAS (JPL: +2.45 BCM; CSR: -2.278 BCM; GFZ: -6.913 BCM) and Tindouf Aquifer System (JPL: +1.108 BCM; CSR: +0.094 BCM; GFZ: +0.910 BCM) were calculated using a water budget approach, parameterized by GLDAS-derived soil moisture and evapotranspiration values with GRACE-based TWSC. Initial results suggest CSR-processed datasets as being most representative of TWSC/GWSC values in the NWSAS, given groundwater abstraction estimates of 2.5 BCM/year, a conservative estimate considering it does not include unaccounted abstractions or increased consumption in recent years. Conversely, high abstraction rates and negligibly low recharge rates indicate that the positive TWSC/GWSC values generated from JPL-processed datasets are not accurately representative of hydrologic changes in NWSAS. Consistently positive TWSC/GWSC values for the Tindouf Aquifer System from the JPL, CSR, and GFZ datasets are indicative of sustainable groundwater abstraction levels (recharge rate > abstraction rate). GWSC time series, computed for each of the three different processed datasets (JPL, CSR, GFZ), show significant withdrawals of groundwater in both NWSAS (February 2006 and from August 2008 to January 2009) and the Tindouf Aquifer System (November/October 2003, February/March 2006, and September/October 2010).
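
    The water-budget disaggregation used for GWSC amounts to removing modelled soil-moisture (and any surface-water) storage changes from the GRACE-derived TWSC anomaly. A minimal sketch under that assumption follows, with made-up monthly anomalies rather than the JPL/CSR/GFZ products.

```python
import numpy as np

def groundwater_storage_change(twsc_anomaly, soil_moisture_anomaly,
                               surface_water_anomaly=None):
    """Disaggregate GRACE total water storage change into a groundwater
    component by removing the modelled soil-moisture (and, if available,
    surface-water) contribution.  All inputs are anomaly time series in
    equivalent water height or volume with a common baseline."""
    gw = np.asarray(twsc_anomaly, float) - np.asarray(soil_moisture_anomaly, float)
    if surface_water_anomaly is not None:
        gw = gw - np.asarray(surface_water_anomaly, float)
    return gw

# Hypothetical monthly anomalies in BCM (not the CSR/JPL/GFZ products):
twsc = [1.2, 0.8, 0.1, -0.5, -1.1, -0.9]
sm   = [0.9, 0.7, 0.3,  0.0, -0.4, -0.3]
print(groundwater_storage_change(twsc, sm))
```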

  5. A BRDF statistical model applying to space target materials modeling

    NASA Astrophysics Data System (ADS)

    Liu, Chenghao; Li, Zhi; Xu, Can; Tian, Qichen

    2017-10-01

    To address the poor performance of the five-parameter semi-empirical model in fitting densely sampled BRDF measurement data, a refined statistical BRDF model suitable for modeling multiple classes of space-target materials is proposed. The refined model improves on the Torrance-Sparrow model while retaining the modeling advantages of the five-parameter model. Compared with the existing empirical model, it contains six simple parameters, which can approximate the roughness distribution of the material surface, the intensity of the Fresnel reflectance phenomenon, and the attenuation of the reflected brightness as the azimuth angle changes. The model achieves fast parameter inversion with no extra loss of accuracy. A genetic algorithm was used to invert the parameters of 11 different samples of materials commonly used on space targets, and the fitting errors for all materials were below 6%, much lower than those of the five-parameter model. The performance of the refined model is further verified by comparing the fitting results for three samples at different incident zenith angles at 0° azimuth. Finally, three-dimensional visualizations of these samples over the upper hemisphere are given, in which the strength of the optical scattering of different materials can be clearly seen. These results demonstrate the refined model's ability to characterize such materials.

  6. Insights into islet development and biology through characterization of a human iPSC-derived endocrine pancreas model

    PubMed Central

    van de Bunt, Martijn; Lako, Majlinda; Barrett, Amy; Gloyn, Anna L.; Hansson, Mattias; McCarthy, Mark I.; Honoré, Christian

    2016-01-01

    ABSTRACT Directed differentiation of stem cells offers a scalable solution to the need for human cell models recapitulating islet biology and T2D pathogenesis. We profiled mRNA expression at 6 stages of an induced pluripotent stem cell (iPSC) model of endocrine pancreas development from 2 donors, and characterized the distinct transcriptomic profiles associated with each stage. Established regulators of endodermal lineage commitment, such as SOX17 (log2 fold change [FC] compared to iPSCs = 14.2, p-value = 4.9 × 10−5) and the pancreatic agenesis gene GATA6 (log2 FC = 12.1, p-value = 8.6 × 10−5), showed transcriptional variation consistent with their known developmental roles. However, these analyses highlighted many other genes with stage-specific expression patterns, some of which may be novel drivers or markers of islet development. For example, the leptin receptor gene, LEPR, was most highly expressed in published data from in vivo-matured cells compared to our endocrine pancreas-like cells (log2 FC = 5.5, p-value = 2.0 × 10−12), suggesting a role for the leptin pathway in the maturation process. Endocrine pancreas-like cells showed significant stage-selective expression of adult islet genes, including INS, ABCC8, and GLP1R, and enrichment of relevant GO-terms (e.g. “insulin secretion”; odds ratio = 4.2, p-value = 1.9 × 10−3): however, principal component analysis indicated that in vitro-differentiated cells were more immature than adult islets. Integration of the stage-specific expression information with genetic data from T2D genome-wide association studies revealed that 46 of 82 T2D-associated loci harbor genes present in at least one developmental stage, facilitating refinement of potential effector transcripts. Together, these data show that expression profiling in an iPSC islet development model can further understanding of islet biology and T2D pathogenesis. PMID:27246810

  7. Value Added: the Case for Point-of-View Camera use in Orthopedic Surgical Education

    PubMed Central

    Thomas, Geb W.; Taylor, Leah; Liu, Xiaoxing; Anthony, Chris A.; Anderson, Donald D.

    2016-01-01

    Abstract Background Orthopedic surgical education is evolving as educators search for new ways to enhance surgical skills training. Orthopedic educators should seek new methods and technologies to augment and add value to real-time orthopedic surgical experience. This paper describes a protocol whereby we have started to capture and evaluate specific orthopedic milestone procedures with a GoPro® point-of-view video camera and a dedicated video reviewing website as a way of supplementing the current paradigm in surgical skills training. We report our experience regarding the details and feasibility of this protocol. Methods Upon identification of a patient undergoing surgical fixation of a hip or ankle fracture, an orthopedic resident places a GoPro® point-of-view camera on his or her forehead. All fluoroscopic images acquired during the case are saved and later incorporated into a video on the reviewing website. Surgical videos are uploaded to a secure server and are accessible for later review and assessment via a custom-built website. An electronic survey of resident participants was performed utilizing Qualtrics software. Results are reported using descriptive statistics. Results A total of 51 surgical videos involving 23 different residents have been captured to date. This includes 20 intertrochanteric hip fracture cases and 31 ankle fracture cases. The average duration of each surgical video was 1 hour and 16 minutes (range 40 minutes to 2 hours and 19 minutes). Of 24 orthopedic resident surgeons surveyed, 88% thought capturing a video portfolio of orthopedic milestones would benefit their education. Conclusions There is a growing demand in orthopedic surgical education to extract more value from each surgical experience. While further work in development and refinement of such assessments is necessary, we feel that intraoperative video, particularly when captured and presented in a non-threatening, user-friendly manner, can add significant value to the present and future paradigm of orthopedic surgical skill training. PMID:27528828

  8. An organ-based approach to dose calculation in the assessment of dose-dependent biological effects of ionising radiation in Arabidopsis thaliana.

    PubMed

    Biermans, Geert; Horemans, Nele; Vanhoudt, Nathalie; Vandenhove, Hildegarde; Saenen, Eline; Van Hees, May; Wannijn, Jean; Vives i Batlle, Jordi; Cuypers, Ann

    2014-07-01

    There is a need for a better understanding of biological effects of radiation exposure in non-human biota. Correct description of these effects requires a more detailed model of dosimetry than that available in current risk assessment tools, particularly for plants. In this paper, we propose a simple model for dose calculations in roots and shoots of Arabidopsis thaliana seedlings exposed to radionuclides in a hydroponic exposure setup. This model is used to compare absorbed doses for three radionuclides, ²⁴¹Am (α-radiation), ⁹⁰Sr (β-radiation) and ¹³³Ba (γ-radiation). Using established dosimetric calculation methods, dose conversion coefficient values were determined for each organ separately based on uptake data from the different plant organs. These calculations were then compared to the DCC values obtained with the ERICA tool under equivalent geometry assumptions. When comparing with our new method, the ERICA tool appears to overestimate internal doses and underestimate external doses in the roots for all three radionuclides, though each to a different extent. These observations might help to refine dose-response relationships. The DCC values for ⁹⁰Sr in roots are shown to deviate the most. A dose-effect curve for ⁹⁰Sr β-radiation has been established on biomass and photosynthesis endpoints, but no significant dose-dependent effects are observed. This indicates the need for use of endpoints at the molecular and physiological scale. Copyright © 2013 Elsevier Ltd. All rights reserved.
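
    The DCC-based dose calculation reduces to multiplying activity concentrations by internal and external dose conversion coefficients and summing the contributions. A generic sketch (per radionuclide) is shown below; the coefficients and activities are placeholders, not the organ-specific values derived in the paper.

```python
def absorbed_dose_rate(dcc_internal, activity_organ,
                       dcc_external, activity_medium):
    """Organ absorbed dose rate from internal and external exposure,
    following the generic DCC formulation (as used, e.g., by the ERICA
    tool):  D = DCC_int * C_organ + DCC_ext * C_medium.

    DCCs in uGy/h per Bq/kg, activity concentrations in Bq/kg; all
    numbers below are placeholders, not the paper's organ-specific values.
    """
    return dcc_internal * activity_organ + dcc_external * activity_medium

# Hypothetical 90Sr exposure of a root compartment:
print(absorbed_dose_rate(dcc_internal=2.6e-4, activity_organ=5.0e4,
                         dcc_external=1.0e-5, activity_medium=2.0e5), "uGy/h")
```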

  9. Leptomeningeal metastases: a RANO proposal for response criteria

    PubMed Central

    Junck, Larry; Brandsma, Dieta; Soffietti, Riccardo; Rudà, Roberta; Raizer, Jeffrey; Boogerd, Willem; Taillibert, Sophie; Groves, Morris D.; Rhun, Emilie Le; Walker, Julie; van den Bent, Martin; Wen, Patrick Y.; Jaeckle, Kurt A.

    2017-01-01

    Abstract Leptomeningeal metastases (LM) currently lack standardization with respect to response assessment. A Response Assessment in Neuro-Oncology (RANO) working group with expertise in LM developed a consensus proposal for evaluating patients treated for this disease. Three basic elements in assessing response in LM are proposed: a standardized neurological examination, cerebral spinal fluid (CSF) cytology or flow cytometry, and radiographic evaluation. The group recommends that all patients enrolling in clinical trials undergo CSF analysis (cytology in all cancers; flow cytometry in hematologic cancers), complete contrast-enhanced neuraxis MRI, and in instances of planned intra-CSF therapy, radioisotope CSF flow studies. In conjunction with the RANO Neurological Assessment working group, a standardized instrument was created for assessing the neurological exam in patients with LM. Considering that most lesions in LM are nonmeasurable and that assessment of neuroimaging in LM is subjective, neuroimaging is graded as stable, progressive, or improved using a novel radiological LM response scorecard. Radiographic disease progression in isolation (ie, negative CSF cytology/flow cytometry and stable neurological assessment) would be defined as LM disease progression. The RANO LM working group has proposed a method of response evaluation for patients with LM that will require further testing, validation, and likely refinement with use. PMID:28039364

  10. Geohydrologic reconnaissance of a ground-water contamination problem in the Argonne Road area near Spokane, Washington

    USGS Publications Warehouse

    Dion, N.P.

    1987-01-01

    Three domestic wells that withdraw groundwater from an alluvium-filled trough cut into granite were found to be contaminated with the organic solvents tetrachloroethene, trichloroethene, 1,1,1-trichloroethane, and 1,2-trans-dichloroethene. The suspected source of contamination is a nearby septic-tank sludge disposal area. There is concern that the affected aquifer is tributary to the Spokane aquifer, which has been accorded 'sole source' status by the U.S. Environmental Protection Agency. Preliminary estimates suggest that groundwater in the area is moving toward the Spokane aquifer and that the transit time may range from 2.5 to 25 years. Because of longitudinal dispersion, however, the plume of contaminants may move at a faster rate than the ambient groundwater and may arrive at given destinations more quickly than calculated above. A literature search has indicated that the dissolved solute phase of the contaminants will not be significantly affected by sorption, volatilization, chemical activity, or biodegradation. Because of the preliminary nature of the investigation, many questions relating to the extent of contamination remain unanswered. A list of suggested additional studies to answer those questions and to refine and confirm the findings of this investigation is presented.
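
    The transit-time range quoted above is the kind of first-order estimate obtained from Darcy's law and the effective porosity. A generic sketch is shown below, with placeholder hydraulic parameters rather than the report's site-specific values.

```python
def advective_transit_time(distance_m, hydraulic_conductivity_m_per_d,
                           gradient, effective_porosity):
    """First-order transit time of ambient groundwater over a flow path:
        v = K * i / n_e   (average linear velocity)
        t = L / v
    Dispersion, which the report notes can move the plume faster than the
    ambient groundwater, is not represented here."""
    v = hydraulic_conductivity_m_per_d * gradient / effective_porosity  # m/day
    return distance_m / v / 365.25                                      # years

# Hypothetical values: 1.5 km path, K = 30 m/d, gradient 0.004, porosity 0.25
print(advective_transit_time(1500.0, 30.0, 0.004, 0.25), "years")
```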

  11. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 3. Relating Solution-Phase to Gas-Phase Structures.

    PubMed

    Kondalaji, Samaneh Ghassabi; Khakinejad, Mahdiar; Valentine, Stephen J

    2018-06-01

    Molecular dynamics (MD) simulations have been utilized to study peptide ion conformer establishment during the electrospray process. An explicit water model is used for nanodroplets containing a model peptide and hydronium ions. Simulations are conducted at 300 K for two different peptide ion charge configurations and for droplets containing varying numbers of hydronium ions. For all conditions, modeling has been performed until production of the gas-phase ions, and the resultant conformers have been compared to proposed gas-phase structures. The latter species were obtained from previous studies in which in silico candidate structures were filtered according to ion mobility and hydrogen-deuterium exchange (HDX) reactivity matches. The present study yields three key findings: (1) evidence from ion-production modeling supports previous structure-refinement studies based on mobility and HDX reactivity matching; (2) modeling of the electrospray process is significantly improved by using initial droplets below but close to the calculated Rayleigh limit; and (3) peptide ions in the nanodroplets sample significantly different conformers than those in bulk solution, owing to the altered physicochemical properties of the solvent.

  12. Existence and Non-uniqueness of Global Weak Solutions to Inviscid Primitive and Boussinesq Equations

    NASA Astrophysics Data System (ADS)

    Chiodaroli, Elisabetta; Michálek, Martin

    2017-08-01

    We consider the initial value problem for the inviscid Primitive and Boussinesq equations in three spatial dimensions. We recast both systems as an abstract Euler-type system and apply the methods of convex integration of De Lellis and Székelyhidi to show the existence of infinitely many global weak solutions of the studied equations for general initial data. We also introduce an appropriate notion of dissipative solutions and show the existence of suitable initial data which generate infinitely many dissipative solutions.

  13. Exploring the Diversification Discount: A Focus on High-Technology Target Firms

    DTIC Science & Technology

    2003-03-01

    AFIT/GCA/ENV/03-01. Abstract excerpt: When firms choose to acquire... goodness-of-fit test, the Durbin-Watson test, and the Breusch-Pagan test, respectively. A further discussion of the validation of the three regression... In addition, the Breusch-Pagan test was employed to objectively test the assumption (Neter, 1996). The test yielded a p-value of 0.990, again

  14. The left inferior frontal gyrus: A neural crossroads between abstract and concrete knowledge.

    PubMed

    Della Rosa, Pasquale Anthony; Catricalà, Eleonora; Canini, Matteo; Vigliocco, Gabriella; Cappa, Stefano F

    2018-07-15

    Evidence from both neuropsychology and neuroimaging suggests that different types of information are necessary for representing and processing concrete and abstract word meanings. Both abstract and concrete concepts, however, conjointly rely on perceptual, verbal and contextual knowledge, with abstract concepts characterized by low values of imageability (IMG) (low sensory-motor grounding) and low context availability (CA) (more difficult to contextualize). Imaging studies supporting differences between abstract and concrete concepts show a greater recruitment of the left inferior frontal gyrus (LIFG) for abstract concepts, which has been attributed either to the representation of abstract-specific semantic knowledge or to the request for more executive control than in the case of concrete concepts. We conducted an fMRI study on 27 participants, using a lexical decision task involving both abstract and concrete words, whose IMG and CA values were explicitly modelled in separate parametric analyses. The LIFG was significantly more activated for abstract than for concrete words, and a conjunction analysis showed a common activation for words with low IMG or low CA only in the LIFG, in the same area reported for abstract words. A regional template map of brain activations was then traced for words with low IMG or low CA, and BOLD regional time-series were extracted and correlated with the specific LIFG neural activity elicited for abstract words. The regions associated to low IMG, which were functionally correlated with LIFG, were mainly in the left hemisphere, while those associated with low CA were in the right hemisphere. Finally, in order to reveal which LIFG-related network increased its connectivity with decreases of IMG or CA, we conducted generalized psychophysiological interaction analyses. The connectivity strength values extracted from each region connected with the LIFG were correlated with specific LIFG neural activity for abstract words, and a regression analysis was conducted to highlight which areas recruited by low IMG or low CA predicted the greater activation of the IFG for abstract concepts. Only the left middle temporal gyrus/angular gyrus, known to be involved in semantic processing, was a significant predictor of LIFG activity differentiating abstract from concrete words. The results show that the abstract conceptual processing requires the interplay of multiple brain regions, necessary for both the intrinsic and extrinsic properties of abstract knowledge. The LIFG can be thus identified as the neural crossroads between different types of information equally necessary for representing processing and differentiating abstract concepts from concrete ones. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. HiTAD: detecting the structural and functional hierarchies of topologically associating domains from chromatin interactions

    PubMed Central

    Wang, Xiao-Tao; Cui, Wang

    2017-01-01

    Abstract A current question in the high-order organization of chromatin is whether topologically associating domains (TADs) are distinct from other hierarchical chromatin domains. However, due to the unclear TAD definition in tradition, the structural and functional uniqueness of TAD is not well studied. In this work, we refined TAD definition by further constraining TADs to the optimal separation on global intra-chromosomal interactions. Inspired by this constraint, we developed a novel method, called HiTAD, to detect hierarchical TADs from Hi-C chromatin interactions. HiTAD performs well in domain sensitivity, replicate reproducibility and inter cell-type conservation. With a novel domain-based alignment proposed by us, we defined several types of hierarchical TAD changes which were not systematically studied previously, and subsequently used them to reveal that TADs and sub-TADs differed statistically in correlating chromosomal compartment, replication timing and gene transcription. Finally, our work also has the implication that the refinement of TAD definition could be achieved by only utilizing chromatin interactions, at least in part. HiTAD is freely available online. PMID:28977529

  16. Histological Grading of Hepatocellular Carcinomas with Intravoxel Incoherent Motion Diffusion-weighted Imaging: Inconsistent Results Depending on the Fitting Method.

    PubMed

    Ichikawa, Shintaro; Motosugi, Utaroh; Hernando, Diego; Morisaka, Hiroyuki; Enomoto, Nobuyuki; Matsuda, Masanori; Onishi, Hiroshi

    2018-04-10

    To compare the abilities of three intravoxel incoherent motion (IVIM) imaging approximation methods to discriminate the histological grade of hepatocellular carcinomas (HCCs). Fifty-eight patients (60 HCCs) underwent IVIM imaging with 11 b-values (0-1000 s/mm²). Slow (D) and fast diffusion coefficients (D*) and the perfusion fraction (f) were calculated for the HCCs using the mean signal intensities in regions of interest drawn by two radiologists. Three approximation methods were used. First, all three parameters were obtained simultaneously using non-linear fitting (method A). Second, D was obtained using linear fitting (b = 500 and 1000), followed by non-linear fitting for D* and f (method B). Third, D was obtained by linear fitting, f was obtained using the regression line intersection and signals at b = 0, and non-linear fitting was used for D* (method C). A receiver operating characteristic analysis was performed to reveal the abilities of these methods to distinguish poorly-differentiated from well-to-moderately-differentiated HCCs. Inter-reader agreements were assessed using intraclass correlation coefficients (ICCs). The measurements of D, D*, and f in methods B and C (Az-value, 0.658-0.881) had better discrimination abilities than did those in method A (Az-value, 0.527-0.607). The ICCs of D and f were good to excellent (0.639-0.835) with all methods. The ICCs of D* were moderate with methods B (0.580) and C (0.463) and good with method A (0.705). The IVIM parameters may vary depending on the fitting methods, and therefore, further technical refinement may be needed.
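
    The difference between the fitting approaches can be made concrete with a small sketch: a simultaneous non-linear fit of all IVIM parameters (method A-style) versus a segmented fit in which D is first obtained from the high-b data (method B-style). The b-values, starting guesses and synthetic signal below are assumptions for illustration, not the study's acquisition or analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    """Bi-exponential IVIM model S(b) = S0*(f*exp(-b*D*) + (1-f)*exp(-b*D))."""
    return s0 * (f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d))

def fit_ivim_full(b, s):
    """Method A-style fit: all parameters estimated in one non-linear fit."""
    p0 = [s[0], 0.1, 0.02, 0.001]          # S0, f, D*, D (b in s/mm^2)
    popt, _ = curve_fit(ivim, b, s, p0=p0, maxfev=10000)
    return dict(zip(["S0", "f", "Dstar", "D"], popt))

def fit_ivim_segmented(b, s, b_cut=200.0):
    """Method B-style segmented fit: D from a log-linear fit of the
    high-b points, then f and D* from a constrained non-linear fit."""
    hi = b >= b_cut
    slope, _ = np.polyfit(b[hi], np.log(s[hi]), 1)
    d = -slope
    model = lambda bb, s0, f, d_star: ivim(bb, s0, f, d_star, d)
    popt, _ = curve_fit(model, b, s, p0=[s[0], 0.1, 0.02], maxfev=10000)
    return dict(zip(["S0", "f", "Dstar"], popt), D=d)

# Synthetic noiseless signal on a hypothetical set of 11 b-values (0-1000):
b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 700, 850, 1000], float)
s = ivim(b, 1000.0, 0.15, 0.03, 0.0012)
print(fit_ivim_full(b, s))
print(fit_ivim_segmented(b, s))
```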

  17. Impact of the dynamic and static component of the sport practised for electrocardiogram analysis in screening athletes.

    PubMed

    Maillot, N; Guenancia, C; Yameogo, N V; Gudjoncik, A; Garnier, F; Lorgis, L; Chagué, F; Cottin, Y

    2018-02-01

    To interpret the electrocardiogram (ECG) of athletes, the recommendations of the ESC and the Seattle criteria define type 1 peculiarities, those induced by training, and type 2, those not induced by training, to rule out cardiomyopathy. The specificity of the screening was improved by Sheikh, who defined the "Refined Criteria," which include a group of intermediate peculiarities. The aim of our study was to investigate the influence of static and dynamic components on the prevalence of different types of abnormalities. The ECGs of 1030 athletes performed during preparticipation screening were interpreted using these three classifications. Our work revealed 62/16%, 69/13%, and 71/7% of type 1 peculiarities and type 2 abnormalities for the ESC, Seattle, and Refined Criteria algorithms, respectively (P < .001). For type 2 abnormalities, three independent factors were found for the ESC and Seattle criteria: age, Afro-Caribbean origin, and the dynamic component with, for the latter, an OR [95% CI] of 2.35 [1.28-4.33] (P = .006) and 1.90 [1.03-3.51] (P = .041), respectively. In contrast, only the Afro-Caribbean origin was associated with type 2 abnormalities using the Refined Criteria: OR [95% CI] 2.67 [1.60-4.46] (P < .0001). The Refined Criteria classified more athletes in the type 1 category and fewer in the type 2 category compared with the ESC and Seattle algorithms. Contrary to previous studies, a high dynamic component was not associated with type 2 abnormalities when the Refined Criteria were used; only the Afro-Caribbean origin remained associated. Further research is necessary to better understand adaptations with regard to duration and thus improve the modern criteria for ECG screening in athletes. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Survey of Canadian animal-based researchers' views on the Three Rs: replacement, reduction and refinement.

    PubMed

    Fenwick, Nicole; Danielson, Peter; Griffin, Gilly

    2011-01-01

    The 'Three Rs' tenet (replacement, reduction, refinement) is a widely accepted cornerstone of Canadian and international policies on animal-based science. The Canadian Council on Animal Care (CCAC) initiated this web-based survey to obtain greater understanding of 'principal investigators' and 'other researchers' (i.e. graduate students, post-doctoral researchers etc.) views on the Three Rs, and to identify obstacles and opportunities for continued implementation of the Three Rs in Canada. Responses from 414 participants indicate that researchers currently do not view the goal of replacement as achievable. Researchers prefer to use enough animals to ensure quality data is obtained rather than using the minimum and potentially waste those animals if a problem occurs during the study. Many feel that they already reduce animal numbers as much as possible and have concerns that further reduction may compromise research. Most participants were ambivalent about re-use, but expressed concern that the practice could compromise experimental outcomes. In considering refinement, many researchers feel there are situations where animals should not receive pain relieving drugs because it may compromise scientific outcomes, although there was strong support for the Three Rs strategy of conducting animal welfare-related pilot studies, which were viewed as useful for both animal welfare and experimental design. Participants were not opposed to being offered "assistance" to implement the Three Rs, so long as the input is provided in a collegial manner, and from individuals who are perceived as experts. It may be useful for animal use policymakers to consider what steps are needed to make replacement a more feasible goal. In addition, initiatives that offer researchers greater practical and logistical support with Three Rs implementation may be useful. Encouragement and financial support for Three Rs initiatives may result in valuable contributions to Three Rs knowledge and improve welfare for animals used in science.

  19. Survey of Canadian Animal-Based Researchers' Views on the Three Rs: Replacement, Reduction and Refinement

    PubMed Central

    Fenwick, Nicole; Danielson, Peter; Griffin, Gilly

    2011-01-01

    The ‘Three Rs’ tenet (replacement, reduction, refinement) is a widely accepted cornerstone of Canadian and international policies on animal-based science. The Canadian Council on Animal Care (CCAC) initiated this web-based survey to obtain greater understanding of ‘principal investigators’ and ‘other researchers’ (i.e. graduate students, post-doctoral researchers etc.) views on the Three Rs, and to identify obstacles and opportunities for continued implementation of the Three Rs in Canada. Responses from 414 participants indicate that researchers currently do not view the goal of replacement as achievable. Researchers prefer to use enough animals to ensure quality data is obtained rather than using the minimum and potentially waste those animals if a problem occurs during the study. Many feel that they already reduce animal numbers as much as possible and have concerns that further reduction may compromise research. Most participants were ambivalent about re-use, but expressed concern that the practice could compromise experimental outcomes. In considering refinement, many researchers feel there are situations where animals should not receive pain relieving drugs because it may compromise scientific outcomes, although there was strong support for the Three Rs strategy of conducting animal welfare-related pilot studies, which were viewed as useful for both animal welfare and experimental design. Participants were not opposed to being offered “assistance” to implement the Three Rs, so long as the input is provided in a collegial manner, and from individuals who are perceived as experts. It may be useful for animal use policymakers to consider what steps are needed to make replacement a more feasible goal. In addition, initiatives that offer researchers greater practical and logistical support with Three Rs implementation may be useful. Encouragement and financial support for Three Rs initiatives may result in valuable contributions to Three Rs knowledge and improve welfare for animals used in science. PMID:21857928

  20. Applying an Empirical Hydropathic Forcefield in Refinement May Improve Low-Resolution Protein X-Ray Crystal Structures

    PubMed Central

    Koparde, Vishal N.; Scarsdale, J. Neel; Kellogg, Glen E.

    2011-01-01

    Background The quality of X-ray crystallographic models for biomacromolecules refined from data obtained at high-resolution is assured by the data itself. However, at low-resolution, >3.0 Å, additional information is supplied by a forcefield coupled with an associated refinement protocol. These resulting structures are often of lower quality and thus unsuitable for downstream activities like structure-based drug discovery. Methodology An X-ray crystallography refinement protocol that enhances standard methodology by incorporating energy terms from the HINT (Hydropathic INTeractions) empirical forcefield is described. This protocol was tested by refining synthetic low-resolution structural data derived from 25 diverse high-resolution structures, and referencing the resulting models to these structures. The models were also evaluated with global structural quality metrics, e.g., Ramachandran score and MolProbity clashscore. Three additional structures, for which only low-resolution data are available, were also re-refined with this methodology. Results The enhanced refinement protocol is most beneficial for reflection data at resolutions of 3.0 Å or worse. At the low-resolution limit, ≥4.0 Å, the new protocol generated models with Cα positions that have RMSDs that are 0.18 Å more similar to the reference high-resolution structure, Ramachandran scores improved by 13%, and clashscores improved by 51%, all in comparison to models generated with the standard refinement protocol. The hydropathic forcefield terms are at least as effective as Coulombic electrostatic terms in maintaining polar interaction networks, and significantly more effective in maintaining hydrophobic networks, as synthetic resolution is decremented. Even at resolutions ≥4.0 Å, these latter networks are generally native-like, as measured with a hydropathic interactions scoring tool. PMID:21246043
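
    The Cα comparisons reported above reduce to a root-mean-square deviation between matched coordinates of the refined model and the reference structure. A minimal sketch of that metric follows (assuming the models are already superposed; a Kabsch alignment would normally precede it), with toy coordinates rather than any of the 25 test structures.

```python
import numpy as np

def calpha_rmsd(coords_model, coords_reference):
    """RMSD between matched C-alpha coordinates of a refined model and a
    reference structure, both given as (N, 3) arrays in angstroms.
    Assumes the two models have already been superposed."""
    model = np.asarray(coords_model, float)
    ref = np.asarray(coords_reference, float)
    return np.sqrt(np.mean(np.sum((model - ref) ** 2, axis=1)))

# Toy example: a reference trace and a slightly perturbed "refined" model
rng = np.random.default_rng(2)
reference = rng.normal(size=(120, 3)) * 10.0
model = reference + rng.normal(scale=0.3, size=reference.shape)
print(round(calpha_rmsd(model, reference), 2), "angstrom")
```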

  1. Whole grains, refined grains and fortified refined grains: What's the difference?

    PubMed

    Slavin, J L

    2000-09-01

    Dietary guidance universally supports the importance of grains in the diet. The United States Department of Agriculture pyramid suggests that Americans consume from six to 11 servings of grains per day, with three of these servings being whole grain products. Whole grain contains the bran, germ and endosperm, while refined grain includes only endosperm. Both refined and whole grains can be fortified with nutrients to improve the nutrient profile of the product. Most grains consumed in developed countries are subjected to some type of processing to optimize flavor and provide shelf-stable products. Grains provide important sources of dietary fibre, plant protein, phytochemicals and needed vitamins and minerals. Additionally, in the United States grains have been chosen as the best vehicle to fortify our diets with vitamins and minerals that are typically in short supply. These nutrients include iron, thiamin, niacin, riboflavin and, more recently, folic acid and calcium. Grains contain antioxidants, including vitamins, trace minerals and non-nutrients such as phenolic acids, lignans and phytic acid, which are thought to protect against cardiovascular disease and cancer. Additionally, grains are our most dependable source of phytoestrogens, plant compounds known to protect against cancers such as breast and prostate. Grains are rich sources of oligosaccharides and resistant starch, carbohydrates that function like dietary fibre and enhance the intestinal environment and help improve immune function. Epidemiological studies find that whole grains are more protective than refined grains in the prevention of chronic disease, although instruments to define intake of refined, whole and fortified grains are limited. Nutritional guidance should support whole grain products over refined, with fortification of nutrients improving the nutrient profile of both refined and whole grain products.

  2. Comparison of three current sets of electrocardiographic interpretation criteria for use in screening athletes

    PubMed Central

    Riding, Nathan R; Sheikh, Nabeel; Adamuz, Carmen; Watt, Victoria; Farooq, Abdulaziz; Whyte, Gregory P; George, Keith P; Drezner, Jonathan A; Sharma, Sanjay; Wilson, Mathew G

    2015-01-01

    Background An increasing number of sporting bodies report unacceptably high levels of false-positive ECGs when undertaking pre-participation cardiac screening. To address this issue, modified ECG interpretation criteria have become available for use within athletes. Objective This study assessed the accuracy of the new 2014 ‘Refined Criteria’ against the 2013 Seattle Criteria and the 2010 European Society of Cardiology (ESC) recommendations in a cohort of Arabic, black and Caucasian athletes. Methods 2491 male athletes (1367 Arabic, 748 black and 376 Caucasian) undertook pre-participation screening including a 12-lead ECG, with further investigation(s) upon indication. Results Ten athletes (0.4%) were identified with cardiac pathology; seven with hypertrophic cardiomyopathy (HCM; five black and two Arabic) and three Arabs with Wolff–Parkinson–White syndrome (WPW). All three ECG criteria were 100% sensitive identifying all cases of HCM and WPW. The 2014 Refined Criteria reduced (p<0.0001) the prevalence of an abnormal ECG to 5.3% vs 11.6% (Seattle Criteria) and 22.3% (2010 ESC recommendations). The 2014 Refined Criteria significantly (p<0.0001) improved specificity (94.0%) across all ethnicities compared with the Seattle Criteria (87.5%) and ESC recommendations (76.6%). Black athletes continue to present a higher prevalence (p<0.0001) of abnormal ECGs compared with Arabic and Caucasian athletes (10% vs 3.6% and 2.1%). Conclusions The 2014 Refined Criteria for athlete ECG interpretation outperformed both the 2013 Seattle Criteria and the 2010 ESC recommendations by significantly reducing the number of false-positive ECGs in Arabic, black and Caucasian athletes while maintaining 100% sensitivity for serious cardiac pathologies. PMID:25502812

  3. MODFLOW–LGR—Documentation of ghost node local grid refinement (LGR2) for multiple areas and the boundary flow and head (BFH2) package

    USGS Publications Warehouse

    Mehl, Steffen W.; Hill, Mary C.

    2013-01-01

    This report documents the addition of ghost node Local Grid Refinement (LGR2) to MODFLOW-2005, the U.S. Geological Survey modular, transient, three-dimensional, finite-difference groundwater flow model. LGR2 provides the capability to simulate groundwater flow using multiple block-shaped higher-resolution local grids (child models) within a coarser-grid parent model. LGR2 accomplishes this by iteratively coupling separate MODFLOW-2005 models such that heads and fluxes are balanced across the grid-refinement interface boundary. LGR2 can be used in two- and three-dimensional, steady-state and transient simulations and for simulations of confined and unconfined groundwater systems. Traditional one-way coupled telescopic mesh refinement methods can have large, often undetected, inconsistencies in heads and fluxes across the interface between two model grids. The iteratively coupled ghost-node method of LGR2 provides a more rigorous coupling in which the solution accuracy is controlled by convergence criteria defined by the user. In realistic problems, this can result in substantially more accurate solutions, although it requires an increase in computer processing time. The rigorous coupling enables sensitivity analysis, parameter estimation, and uncertainty analysis that reflect conditions in both model grids. This report describes the method used by LGR2, evaluates accuracy and performance for two- and three-dimensional test cases, provides input instructions, and lists selected input and output files for an example problem. It also presents the Boundary Flow and Head (BFH2) Package, which allows the child and parent models to be simulated independently using the boundary conditions obtained through the iterative process of LGR2.
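    A minimal, hypothetical Python sketch of the iterate-until-closure idea described above: a coarse "parent" grid and a refined "child" grid for a 1-D steady-state head problem are solved alternately, the child taking boundary heads from the parent and the parent taking interior heads back from the child, until they agree to a user-defined closure criterion. This is only a toy illustration of iterative grid coupling, not the ghost-node LGR2 scheme itself; the grids, heads and tolerance are invented.

```python
import numpy as np

def solve_laplace_1d(x, h_left, h_right, fixed=None):
    """Solve d2h/dx2 = 0 on grid x with Dirichlet ends; `fixed` pins selected interior nodes."""
    n = len(x)
    A = np.zeros((n, n)); b = np.zeros(n)
    A[0, 0] = 1.0; b[0] = h_left
    A[-1, -1] = 1.0; b[-1] = h_right
    for i in range(1, n - 1):
        if fixed is not None and not np.isnan(fixed[i]):
            A[i, i] = 1.0; b[i] = fixed[i]          # head supplied by the other grid
        else:
            A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
    return np.linalg.solve(A, b)

x_parent = np.linspace(0.0, 100.0, 11)              # coarse parent grid, dx = 10
x_child = np.linspace(40.0, 60.0, 11)               # refined child grid, dx = 2
feedback = np.full_like(x_parent, np.nan)           # child heads fed back to the parent
h_child = None
for it in range(50):
    h_parent = solve_laplace_1d(x_parent, 10.0, 5.0, feedback)
    hl, hr = np.interp([40.0, 60.0], x_parent, h_parent)   # interface heads from parent
    h_new = solve_laplace_1d(x_child, hl, hr)
    if h_child is not None and np.max(np.abs(h_new - h_child)) < 1e-8:
        break                                        # closure criterion met
    h_child = h_new
    inside = (x_parent > 40.0) & (x_parent < 60.0)
    feedback[inside] = np.interp(x_parent[inside], x_child, h_child)

print(f"closed after {it + 1} outer iterations; head at x = 50: {h_child[5]:.3f}")
```

    In LGR2 itself the feedback from child to parent is carried by interface fluxes at ghost nodes rather than by interior heads, but the outer iteration and user-defined closure criteria play the same role as in this sketch.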

  4. Relationship between using conceptual comprehension of academic material and thinking abstractly about global life issues.

    PubMed

    Westman, A S; Kamoo, R L

    1990-04-01

    The study explored whether more frequent use of conceptual comprehension of academic material generalized to greater use of abstract thinking about global life issues, such as death, goal in life, marriage, AIDS, etc. Undergraduate and graduate students (28 men and 61 women) voluntarily completed a questionnaire which assessed their conceptualizations using three indices. These were an intelligence scale and two learning style indices, namely, Deep Processing and Elaborative Processing of R. R. Schmeck. Also assessed were their levels of abstract thinking about Death Issues and about Other Real Life Issues, and their Denial of Death and their Denial of Dying. All three indices of conceptualization correlated with thinking more abstractly about Other Real Life Issues, but only Elaborative Processing correlated with thinking more abstractly about Death Issues. None of the three indices correlated with Denial of Death or Denial of Dying. It appears conceptualization skills were selectively generalized.

  5. Testing MODFLOW-LGR for simulating flow around buried Quaternary valleys - synthetic test cases

    NASA Astrophysics Data System (ADS)

    Vilhelmsen, T. N.; Christensen, S.

    2009-12-01

    In this study the Local Grid Refinement (LGR) method developed for MODFLOW-2005 (Mehl and Hill, 2005) is utilized to describe groundwater flow in areas containing buried Quaternary valley structures. The tests are conducted as a comparative analysis of simulations run with a globally refined model, a locally refined model, and a globally coarse model. The models vary from simple one-layer models to more complex ones with up to 25 model layers. The comparisons of accuracy are conducted within the locally refined area and focus on water budgets, simulated heads, and simulated particle traces. Simulations made with the globally refined model are used as the reference (regarded as “true” values). As expected, for all test cases the application of local grid refinement resulted in more accurate results than when using the globally coarse model. A significant advantage of utilizing MODFLOW-LGR was that it allows increased numbers of model layers to better resolve complex geology within local areas. This resulted in more accurate simulations than when using either a globally coarse model grid or a locally refined model with lower geological resolution. Improved accuracy in the latter case could not be expected beforehand, because the difference in geological resolution between the coarse parent model and the refined child model contradicts the assumptions of the Darcy-weighted interpolation used in MODFLOW-LGR. With respect to model runtimes, it was sometimes found that the runtime for the locally refined model is much longer than for the globally refined model. This was the case even when the closure criteria were relaxed compared to the globally refined model. These results are contradictory to those presented by Mehl and Hill (2005). Furthermore, in the complex cases it took some testing (model runs) to identify the closure criteria and the damping factor that secured convergence, accurate solutions, and reasonable runtimes. For our cases this is judged to be a serious disadvantage of applying MODFLOW-LGR. Another disadvantage in the studied cases was that the MODFLOW-LGR results proved to be somewhat dependent on the correction method used at the parent-child model interface. This indicates that when applying MODFLOW-LGR there is a need for thorough and case-specific considerations regarding the choice of correction method. References: Mehl, S. and M. C. Hill (2005). "MODFLOW-2005, the U.S. Geological Survey modular ground-water model - Documentation of shared node local grid refinement (LGR) and the Boundary Flow and Head (BFH) Package." U.S. Geological Survey Techniques and Methods 6-A12.

  6. Expanding the three Rs to meet new challenges in humane animal experimentation.

    PubMed

    Schuppli, Catherine A; Fraser, David; McDonald, Michael

    2004-11-01

    The Three Rs are the main principles used by Animal Ethics Committees in the governance of animal experimentation, but they appear not to cover some ethical issues that arise today. These include: a) claims that certain species should be exempted on principle from harmful research; b) increased emphasis on enhancing quality of life of research animals; c) research involving genetically modified (GM) animals; and d) animals bred as models of disease. In some cases, the Three Rs can be extended to cover these developments. The burgeoning use of GM animals in science calls for new forms of reduction through improved genetic modification technology, plus continued attention to alternative approaches and cost-benefit analyses that include the large numbers of animals involved indirectly. The adoption of more expanded definitions of refinement that go beyond minimising distress will capture concerns for enhancing the quality of life of animals through improved husbandry and handling. Targeting refinement to the unpredictable effects of gene modification may be difficult; in these cases, careful attention to monitoring and endpoints are the obvious options. Refinement can also include sharing data about the welfare impacts of gene modifications, and modelling earlier stages of disease, in order to reduce the potential suffering caused to disease models. Other issues may require a move beyond the Three Rs. Certain levels of harm, or numbers and use of certain species, may be unacceptable, regardless of potential benefits. This can be addressed by supplementing the utilitarian basis of the Three Rs with principles based on deontological and relational ethics. The Three Rs remain very useful, but they require thoughtful interpretation and expansion in order for Animal Ethics Committees to address the full range of issues in animal-based research.

  7. A tensor Banach algebra approach to abstract kinetic equations

    NASA Astrophysics Data System (ADS)

    Greenberg, W.; van der Mee, C. V. M.

    The study deals with a concrete algebraic construction providing the existence theory for abstract kinetic equation boundary-value problems, where the collision operator A is an accretive finite-rank perturbation of the identity operator in a Hilbert space H. An algebraic generalization of the Bochner-Phillips theorem is utilized to study the solvability of the abstract boundary-value problem without any regularity condition. A Banach algebra in which the convolution kernel acts is obtained explicitly, and this result is used to prove a perturbation theorem for bisemigroups, which then plays a vital role in solving the initial equations.

  8. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting

    PubMed Central

    Cereceda-Sánchez, Francisco José; Molina-Mula, Jesús

    2017-01-01

    ABSTRACT Objective: to evaluate the usefulness of capnography for the detection of metabolic changes in spontaneous breathing patients, in the emergency and intensive care settings. Methods: in-depth and structured bibliographical search in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. Results: 19 studies were found, two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), eight with bicarbonate (HCO3), three with lactate, and four with blood pH. Conclusions: most studies have found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter to detect patients at risk of severe metabolic change, in a fast, economical and accurate way. PMID:28513767

  9. Systems and methods for predicting materials properties

    DOEpatents

    Ceder, Gerbrand; Fischer, Chris; Tibbetts, Kevin; Morgan, Dane; Curtarolo, Stefano

    2007-11-06

    Systems and methods for predicting features of materials of interest. Reference data are analyzed to deduce relationships between the input data sets and output data sets. Reference data includes measured values and/or computed values. The deduced relationships can be specified as equations, correspondences, and/or algorithmic processes that produce appropriate output data when suitable input data is used. In some instances, the output data set is a subset of the input data set, and computational results may be refined by optionally iterating the computational procedure. To deduce features of a new material of interest, a computed or measured input property of the material is provided to an equation, correspondence, or algorithmic procedure previously deduced, and an output is obtained. In some instances, the output is iteratively refined. In some instances, new features deduced for the material of interest are added to a database of input and output data for known materials.
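    As a schematic illustration of the deduce-then-iterate idea described above, the sketch below fits a simple linear relationship to synthetic reference data and then, because the predicted property also feeds back into the inputs for a new material, iterates the prediction to self-consistency. The data, the linear form, and the coupling B = f(C) are all assumptions made up for the example, not anything specified in the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
# reference data: columns = measured property A, computed property B, target property C
A = rng.uniform(1.0, 5.0, 200)
B = rng.uniform(0.1, 2.0, 200)
C = 0.8 * A + 1.5 * B + 0.2 * rng.normal(size=200)      # underlying relationship + noise
X = np.column_stack([A, B, np.ones_like(A)])
coef, *_ = np.linalg.lstsq(X, C, rcond=None)             # deduce the relationship

# new material: A is measured, but B is only available as a crude function of C,
# so the prediction is refined by fixed-point iteration until it stops changing
a_new, c_guess = 3.0, 0.0
for _ in range(20):
    b_est = 0.4 * c_guess                                # assumed coupling B = f(C)
    c_new = coef @ np.array([a_new, b_est, 1.0])
    if abs(c_new - c_guess) < 1e-10:
        break
    c_guess = c_new
print(f"self-consistent prediction for C: {c_guess:.3f}")
```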

  10. The period history of the X-ray pulsar in MSH 15-52

    NASA Technical Reports Server (NTRS)

    Weisskopf, M. C.; Elsner, R. F.; Darbo, W.; Leahy, D.; Naranan, S.; Harnden, F. R.; Seward, F. D.; Sutherland, P. G.; Grindlay, J. E.

    1983-01-01

    New and refined measurements of the pulse period of the X-ray pulsar in the supernova remnant MSH 15-52 are presented. The data were obtained with the Monitor Proportional Counter on board the HEAO 2 observatory. The period measurements were obtained by analyzing pulse arrival times determined by cross-correlating sample pulse profiles with a master template. The period history for the source and a representative 0.15 s X-ray light curve are shown. The X-ray measurements alone lead to a refined value of the period derivative of (1.5382 ± 0.0024) × 10⁻¹² s/s, while including the results of more recent radio observations leads to a value of (1.54029 ± 0.00095) × 10⁻¹² s/s. These results indicate a hard point source surrounded by diffuse nebular emission.
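    The arrival-time technique mentioned above, cross-correlating a sampled pulse profile against a master template, can be sketched in a few lines of Python; the pulse shape, noise level, and 0.15 s period below are illustrative values, not the actual MPC data.

```python
import numpy as np

def phase_shift(profile, template):
    """Return the circular shift (in phase bins) that best aligns profile with template."""
    # circular cross-correlation computed via FFT
    xcorr = np.fft.irfft(np.fft.rfft(profile) * np.conj(np.fft.rfft(template)))
    return int(np.argmax(xcorr))

nbins = 64
phase = np.linspace(0.0, 1.0, nbins, endpoint=False)
template = np.exp(-0.5 * ((phase - 0.5) / 0.05) ** 2)      # idealised master pulse shape
observed = np.roll(template, 7) + 0.05 * np.random.default_rng(0).normal(size=nbins)

shift_bins = phase_shift(observed, template)
period = 0.150                                              # ~0.15 s pulse period
print(f"shift = {shift_bins} bins  ->  {shift_bins / nbins * period * 1e3:.2f} ms delay")
```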

  11. FUN3D Grid Refinement and Adaptation Studies for the Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Vasta, Veer; Carlson, Jan-Renee; Park, Mike; Mineck, Raymond E.

    2010-01-01

    This paper presents grid refinement and adaptation studies performed in conjunction with computational aeroelastic analyses of the Ares crew launch vehicle (CLV). The unstructured grids used in this analysis were created with GridTool and VGRID while the adaptation was performed using the Computational Fluid Dynamic (CFD) code FUN3D with a feature based adaptation software tool. GridTool was developed by ViGYAN, Inc. while the last three software suites were developed by NASA Langley Research Center. The feature based adaptation software used here operates by aligning control volumes with shock and Mach line structures and by refining/de-refining where necessary. It does not redistribute node points on the surface. This paper assesses the sensitivity of the complex flow field about a launch vehicle to grid refinement. It also assesses the potential of feature based grid adaptation to improve the accuracy of CFD analysis for a complex launch vehicle configuration. The feature based adaptation shows the potential to improve the resolution of shocks and shear layers. Further development of the capability to adapt the boundary layer and surface grids of a tetrahedral grid is required for significant improvements in modeling the flow field.

  12. Triclinic lysozyme at 0.65 angstrom resolution.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J.; Dauter, M.; Alkire, R.

    The crystal structure of triclinic hen egg-white lysozyme (HEWL) has been refined against diffraction data extending to 0.65 Å resolution measured at 100 K using synchrotron radiation. Refinement with anisotropic displacement parameters and with the removal of stereochemical restraints for the well ordered parts of the structure converged with a conventional R factor of 8.39% and an Rfree of 9.52%. The use of full-matrix refinement provided an estimate of the variances in the derived parameters. In addition to the 129-residue protein, a total of 170 water molecules, nine nitrate ions, one acetate ion and three ethylene glycol molecules were located in the electron-density map. Eight sections of the main chain and many side chains were modeled with alternate conformations. The occupancies of the water sites were refined and this step is meaningful when assessed by use of the free R factor. A detailed description and comparison of the structure are made with reference to the previously reported triclinic HEWL structures refined at 0.925 Å (at the low temperature of 120 K) and at 0.95 Å resolution (at room temperature).
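    For readers unfamiliar with the statistics quoted above, the conventional crystallographic R factor is sum(|Fobs - Fcalc|) / sum(Fobs) over the measured reflections, and Rfree is the same quantity computed over a test set of reflections excluded from refinement. A minimal sketch with made-up structure-factor amplitudes:

```python
import numpy as np

def r_factor(f_obs, f_calc):
    """Conventional R = sum(|Fobs - Fcalc|) / sum(Fobs)."""
    f_obs, f_calc = np.asarray(f_obs, float), np.asarray(f_calc, float)
    return np.sum(np.abs(f_obs - f_calc)) / np.sum(f_obs)

# hypothetical observed and calculated structure-factor amplitudes
f_obs  = np.array([120.0, 85.0, 60.0, 42.0, 30.0, 18.0])
f_calc = np.array([112.0, 88.0, 55.0, 45.0, 28.0, 19.0])
print(f"R = {r_factor(f_obs, f_calc):.2%}")
```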

  13. A new insight into ductile fracture of ultrafine-grained Al-Mg alloys.

    PubMed

    Yu, Hailiang; Tieu, A Kiet; Lu, Cheng; Liu, Xiong; Liu, Mao; Godbole, Ajit; Kong, Charlie; Qin, Qinghua

    2015-04-08

    It is well known that when coarse-grained metals undergo severe plastic deformation to be transformed into nano-grained metals, their ductility is reduced. However, there are no ductile fracture criteria developed based on grain refinement. In this paper, we propose a new relationship between ductile fracture and grain refinement during deformation, considering factors besides void nucleation and growth. Ultrafine-grained Al-Mg alloy sheets were fabricated using different rolling techniques at room and cryogenic temperatures. It is proposed for the first time that features of the microstructure near the fracture surface can be used to explain the ductile fracture post necking directly. We found that as grains are refined to a nano size which approaches the theoretical minimum achievable value, the material becomes brittle at the shear band zone. This may explain the tendency for ductile fracture in metals under plastic deformation.

  14. A new insight into ductile fracture of ultrafine-grained Al-Mg alloys

    PubMed Central

    Yu, Hailiang; Tieu, A. Kiet; Lu, Cheng; Liu, Xiong; Liu, Mao; Godbole, Ajit; Kong, Charlie; Qin, Qinghua

    2015-01-01

    It is well known that when coarse-grained metals undergo severe plastic deformation to be transformed into nano-grained metals, their ductility is reduced. However, there are no ductile fracture criteria developed based on grain refinement. In this paper, we propose a new relationship between ductile fracture and grain refinement during deformation, considering factors besides void nucleation and growth. Ultrafine-grained Al-Mg alloy sheets were fabricated using different rolling techniques at room and cryogenic temperatures. It is proposed for the first time that features of the microstructure near the fracture surface can be used to explain the ductile fracture post necking directly. We found that as grains are refined to a nano size which approaches the theoretical minimum achievable value, the material becomes brittle at the shear band zone. This may explain the tendency for ductile fracture in metals under plastic deformation. PMID:25851228

  15. Formation and reduction of 3-monochloropropane-1,2-diol esters in peanut oil during physical refining.

    PubMed

    Li, Chang; Li, Linyan; Jia, Hanbing; Wang, Yuting; Shen, Mingyue; Nie, Shaoping; Xie, Mingyong

    2016-05-15

    In the present study, lab-scale physical refining processes were investigated for their effects on the formation of 3-monochloropropane-1,2-diol (3-MCPD) esters. The potential precursors, partial acylglycerols and chlorine compounds, were determined before each refining step. 3-MCPD esters were not detected in degummed and bleached oil when the crude oils were extracted by solvent, whereas low amounts were detected in hot-pressed crude oils. 3-MCPD esters were generated at maximum levels within 1-1.5 h at a given deodorizing temperature (220-260°C). Chlorine appeared to be a more effective precursor than partial acylglycerols. Washing the bleached oil with an ethanol solution before deodorization partially removed the precursors and accordingly decreased the content of 3-MCPD esters to some extent. Diacetin was found to reduce 3-MCPD esters effectively. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A new insight into ductile fracture of ultrafine-grained Al-Mg alloys

    NASA Astrophysics Data System (ADS)

    Yu, Hailiang; Tieu, A. Kiet; Lu, Cheng; Liu, Xiong; Liu, Mao; Godbole, Ajit; Kong, Charlie; Qin, Qinghua

    2015-04-01

    It is well known that when coarse-grained metals undergo severe plastic deformation to be transformed into nano-grained metals, their ductility is reduced. However, there are no ductile fracture criteria developed based on grain refinement. In this paper, we propose a new relationship between ductile fracture and grain refinement during deformation, considering factors besides void nucleation and growth. Ultrafine-grained Al-Mg alloy sheets were fabricated using different rolling techniques at room and cryogenic temperatures. It is proposed for the first time that features of the microstructure near the fracture surface can be used to explain the ductile fracture post necking directly. We found that as grains are refined to a nano size which approaches the theoretical minimum achievable value, the material becomes brittle at the shear band zone. This may explain the tendency for ductile fracture in metals under plastic deformation.

  17. Abstractness and emotionality values for 398 English words.

    PubMed

    Guido, Gianluigi; Provenzano, Maria Rosaria

    2004-06-01

    This study aimed to replicate Vikis-Freibergs' classic study (1976) on the values of vividness for French words. Vividness resulted from the concreteness and the emotionality values of words, defined here, respectively, as referring to something that can be experienced through the senses and as something that can arouse pleasant or unpleasant emotions. 398 English words were rated on two different scales, Abstractness and Emotionality, by a group of English native speakers and also by a group of Italian subjects who used English as a second language. Results show a low correlation between the concreteness and emotionality ratings, in line with Vikis-Freibergs' previous study of French words (1976). A negative correlation between Abstractness and Emotionality was observed for the British data but a slightly positive correlation for the Italian data.

  18. The future of cerebral surgery: a kaleidoscope of opportunities.

    PubMed

    Elder, James B; Hoh, Daniel J; Oh, Bryan C; Heller, A Chris; Liu, Charles Y; Apuzzo, Michael L J

    2008-06-01

    The emerging future of cerebral surgery will witness the refined evolution of current techniques, as well as the introduction of numerous novel concepts. Clinical practice and basic science research will benefit greatly from their application. The sum of these efforts will result in continued minimalism and improved accuracy and efficiency of neurosurgical diagnostic and therapeutic methodologies. Initially, the refinement of current technologies will further enhance various aspects of cerebral surgery. Advances in computing power and information technology will speed data acquisition, storage, and transfer. Miniaturization of current devices will impact diverse areas, such as modulation of endoscopy and endovascular techniques. The increased penetrance of surgical technologies such as stereotactic radiosurgery, neuronavigation, intraoperative imaging, and implantable electrodes for neurodegenerative disorders and epilepsy will enhance the knowledge and experience in these areas and facilitate refinements and advances in these technologies. Further into the future, technologies that are currently relatively remote to surgical events will fundamentally alter the complexity and scale at which a neurological disease may be treated or investigated. Seemingly futuristic concepts will become ubiquitous in the daily experience of the neurosurgeon. These include diverse fields such as nanotechnology, virtual reality, and robotics. Ultimately, combining advances in multiple fields will yield progress in diverse realms such as brain tumor therapy, neuromodulation for psychiatric diseases, and neuroprosthetics. Operating room equipment and design will benefit from each of the aforementioned advances. In this work, we discuss new developments in three parts. In Part I, concepts in minimalism important for future cerebral surgery are discussed. These include concrete and abstract ideas in miniaturization, as well as recent and future work in microelectromechanical systems and nanotechnology. Part II presents advances in computational sciences and technological fields dependent on these developments. Future breakthroughs in the components of the "computer," including data storage, electrical circuitry, and computing hardware and techniques, are discussed. Additionally, important concepts in the refinement of virtual environments and the brain-machine interface are presented, as their incorporation into cerebral surgery is closely linked to advances in computing and electronics. Finally, Part III offers insights into the future evolution of surgical and nonsurgical diagnostic and therapeutic modalities that are important for the future cerebral surgeon. A number of topics relevant to cerebral surgery are discussed, including the operative environment, imaging technologies, endoscopy, robotics, neuromodulation, stem cell therapy, radiosurgery, and technical methods of restoration of neural function. Cerebral surgery in the near and distant future will reflect the application of these emerging technologies. As this article indicates, the key to maximizing the impact of these advancements in the clinical arena is continued collaboration between scientists and neurosurgeons, as well as the emergence of a neurosurgeon whose scientific grounding and technical focus are far removed from those of his predecessors.

  19. Vanishing Point Extraction and Refinement for Robust Camera Calibration

    PubMed Central

    Tsai, Fuan

    2017-01-01

    This paper describes a flexible camera calibration method using refined vanishing points without prior information. Vanishing points are estimated from human-made features like parallel lines and repeated patterns. With the vanishing points extracted from the three mutually orthogonal directions, the interior and exterior orientation parameters can be further calculated using collinearity condition equations. A vanishing point refinement process is proposed to reduce the uncertainty caused by vanishing point localization errors. The fine-tuning algorithm is based on the divergence of grouped feature points projected onto the reference plane, minimizing the standard deviation of each of the grouped collinear points with an O(1) computational complexity. This paper also presents an automated vanishing point estimation approach based on the cascade Hough transform. The experiment results indicate that the vanishing point refinement process can significantly improve camera calibration parameters and the root mean square error (RMSE) of the constructed 3D model can be reduced by about 30%. PMID:29280966
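    As a simplified illustration of the basic building block behind the method above (it is not the authors' cascaded Hough transform or their refinement step), the sketch below estimates a vanishing point as the least-squares intersection of image line segments expressed in homogeneous coordinates; the segments are synthetic and chosen to converge near (400, 300).

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (x, y)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(lines):
    """Least-squares point minimising |l . v| over all lines (smallest singular vector)."""
    L = np.array([l / np.linalg.norm(l[:2]) for l in lines])   # normalise line directions
    _, _, vt = np.linalg.svd(L)
    v = vt[-1]
    return v[:2] / v[2]

# three roughly parallel 3-D edges imaged as segments converging towards (400, 300)
segments = [((0, 100), (200, 200)), ((0, 500), (200, 400)), ((0, 320), (250, 307.5))]
vp = vanishing_point([line_through(p, q) for p, q in segments])
print("estimated vanishing point:", vp)
```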

  20. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

    In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textual histogram features are extracted and G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textual histogram distance using adaptive weight. Third, an expectation-maximization algorithm is applied for determining the change category of each object and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validness and effectiveness in OBCD.
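    One ingredient of the approach above is the G-statistic distance between object histograms from the two dates. The sketch below uses the usual pooled-expectation (likelihood-ratio) formulation of the G statistic on made-up grey-level histograms; the paper's exact definition and its adaptive spectral/textural weighting may differ.

```python
import numpy as np

def g_statistic(h1, h2):
    """G = 2 * sum(observed * ln(observed / expected)) over both histograms."""
    h1 = np.asarray(h1, float); h2 = np.asarray(h2, float)
    total = h1.sum() + h2.sum()
    g = 0.0
    for obs, row_sum in ((h1, h1.sum()), (h2, h2.sum())):
        expected = row_sum * (h1 + h2) / total        # expectation under "no change"
        mask = obs > 0                                # zero counts contribute nothing
        g += 2.0 * np.sum(obs[mask] * np.log(obs[mask] / expected[mask]))
    return g

# hypothetical histograms of one image object at the two dates (8-bin grey-level counts)
before = [120, 80, 40, 10, 5, 3, 2, 0]
after  = [30, 40, 60, 70, 40, 15, 4, 1]
print(f"G-statistic distance: {g_statistic(before, after):.2f}")
```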

  1. MAIN software for density averaging, model building, structure refinement and validation

    PubMed Central

    Turk, Dušan

    2013-01-01

    MAIN is software that has been designed to interactively perform the complex tasks of macromolecular crystal structure determination and validation. Using MAIN, it is possible to perform density modification, manual and semi-automated or automated model building and rebuilding, real- and reciprocal-space structure optimization and refinement, map calculations and various types of molecular structure validation. The prompt availability of various analytical tools and the immediate visualization of molecular and map objects allow a user to efficiently progress towards the completed refined structure. The extraordinary depth perception of molecular objects in three dimensions that is provided by MAIN is achieved by the clarity and contrast of colours and the smooth rotation of the displayed objects. MAIN allows simultaneous work on several molecular models and various crystal forms. The strength of MAIN lies in its manipulation of averaged density maps and molecular models when noncrystallographic symmetry (NCS) is present. Using MAIN, it is possible to optimize NCS parameters and envelopes and to refine the structure in single or multiple crystal forms. PMID:23897458

  2. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    PubMed Central

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.

    2014-01-01

    The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

  3. Lithium conductivity in an Li-bearing double-ring silicate mineral, sogdianite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S.-H.; Hoelzel, M.; Forschungsneutronenquelle Heinz Maier-Leibnitz

    The crystal structure of an Li-bearing double-ring silicate mineral, sogdianite ((Zr1.18Fe³⁺0.55Ti0.24Al0.03)(Y1.64,Na0.36)K0.85[Li3Si12O30], P6/mcc, a ≈ 10.06 Å, c ≈ 14.30 Å, Z = 2), was investigated by neutron powder diffraction from 300 up to 1273 K. Rietveld refinements of displacement parameters revealed highly anisotropic Li motions perpendicular to the crystallographic c-axis, indicating an exchange process between the tetrahedral T2 and octahedral A sites. AC impedance spectra of a sogdianite single crystal (0.04 × 0.09 × 0.25 cm³) show that the material is an ionic conductor with conductivity values of σ = 4.1 × 10⁻⁵ S cm⁻¹ at 923 K and 1.2 × 10⁻³ S cm⁻¹ at 1219 K perpendicular to the c-axis, involving two relaxation processes with activation energies of 1.26(3) and 1.08(3) eV, respectively. Graphical abstract: Structure of sogdianite with atomic displacement parameter (ADP) ellipsoids at 1273 K.
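    As a quick plausibility check on the numbers quoted above (and only that: the paper extracts two separate relaxation processes from impedance spectra, whereas this assumes a single Arrhenius process), the two conductivity values imply an effective activation energy of roughly 1.1 eV:

```python
import math

k_B = 8.617e-5                    # Boltzmann constant, eV/K
T1, sigma1 = 923.0, 4.1e-5        # S/cm
T2, sigma2 = 1219.0, 1.2e-3       # S/cm

# simple Arrhenius form sigma = sigma0 * exp(-Ea / (k_B * T)) => Ea from the two-point slope
Ea = k_B * math.log(sigma2 / sigma1) / (1.0 / T1 - 1.0 / T2)
print(f"effective activation energy ~ {Ea:.2f} eV")   # comparable to the reported 1.08-1.26 eV
```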

  4. Accountability for Community-Based Programs for the Seriously Ill

    PubMed Central

    Montgomery, Russ; Valuck, Tom; Corrigan, Janet; Meier, Diane E.; Kelley, Amy; Curtis, J. Randall; Engelberg, Ruth

    2018-01-01

    Abstract Innovation is needed to improve care of the seriously ill, and there are important opportunities as we transition from a volume- to value-based payment system. Not all seriously ill are dying; some recover, while others are persistently functionally impaired. While we innovate in service delivery and payment models for the seriously ill, it is important that we concurrently develop accountability that ensures a focus on high-quality care rather than narrowly focusing on cost containment. The Gordon and Betty Moore Foundation convened a meeting of 45 experts to arrive at guiding principles for measurement, create a starter measurement set, specify a proposed definition of the denominator and its refinement, and identify research priorities for future implementation of the accountability system. A series of articles written by experts provided the basis for debate and guidance in formulating a path forward to develop an accountability system for community-based programs for the seriously ill, outlined in this article. As we innovate in existing population-based payment programs such as Medicare Advantage and develop new alternative payment models, it is important and urgent that we develop the foundation for accountability along with actionable measures so that the healthcare system ensures high-quality person- and family-centered care for persons who are seriously ill. PMID:29195052

  5. Multi objective multi refinery optimization with environmental and catastrophic failure effects objectives

    NASA Astrophysics Data System (ADS)

    Khogeer, Ahmed Sirag

    2005-11-01

    Petroleum refining is a capital-intensive business. With stringent environmental regulations on the processing industry and declining refining margins, political instability, and increased risk of war and terrorist attacks in which refineries and fuel transportation grids may be targeted, higher pressures are exerted on refiners to optimize performance and find the best combination of feed and processes to produce salable products that meet stricter product specifications, while at the same time meeting refinery supply commitments and, of course, making a profit. This is done through multi objective optimization. For corporate refining companies and at the national level, intra-refinery and inter-refinery optimization is the second step in optimizing the operation of the whole refining chain as a single system. Most refinery-wide optimization methods do not cover multiple objectives such as minimizing environmental impact, avoiding catastrophic failures, or enhancing product spec upgrade effects. This work starts by carrying out a refinery-wide, single objective optimization, and then moves to multi objective-single refinery optimization. The last step is multi objective-multi refinery optimization, the objectives of which are analysis of the effects of economic, environmental, product spec, strategic, and catastrophic failure considerations. Simulation runs were carried out using both MATLAB and ASPEN PIMS, utilizing nonlinear techniques to solve the optimization problem. The results addressed the need to debottleneck some refineries or transportation media in order to meet the demand for essential products under partial or total failure scenarios. They also addressed how importing some high spec products can help recover some of the losses and what is needed in order to accomplish this. In addition, the results showed nonlinear relations among local and global objectives for some refineries. The results demonstrate that refineries can have a local multi objective optimum that does not follow the same trends as either global or local single objective optimums. Catastrophic failure effects on refinery operations and on local objectives are more significant than environmental objective effects, and changes in the capacity or the local objectives follow a discrete behavioral pattern, in contrast to environmental objective cases in which the effects are smoother. (Abstract shortened by UMI.)

  6. Troponin-only Manchester Acute Coronary Syndromes (T-MACS) decision aid: single biomarker re-derivation and external validation in three cohorts

    PubMed Central

    Body, Richard; Sperrin, Matthew; Lewis, Philip S; Burrows, Gillian; Carley, Simon; McDowell, Garry; Buchan, Iain; Greaves, Kim; Mackway-Jones, Kevin

    2017-01-01

    Background The original Manchester Acute Coronary Syndromes model (MACS) ‘rules in’ and ‘rules out’ acute coronary syndromes (ACS) using high sensitivity cardiac troponin T (hs-cTnT) and heart-type fatty acid binding protein (H-FABP) measured at admission. The latter is not always available. We aimed to refine and validate MACS as Troponin-only Manchester Acute Coronary Syndromes (T-MACS), cutting down the biomarkers to just hs-cTnT. Methods We present secondary analyses from four prospective diagnostic cohort studies including patients presenting to the ED with suspected ACS. Data were collected and hs-cTnT measured on arrival. The primary outcome was ACS, defined as prevalent acute myocardial infarction (AMI) or incident death, AMI or coronary revascularisation within 30 days. T-MACS was built in one cohort (derivation set) and validated in three external cohorts (validation set). Results At the ‘rule out’ threshold, in the derivation set (n=703), T-MACS had 99.3% (95% CI 97.3% to 99.9%) negative predictive value (NPV) and 98.7% (95.3%–99.8%) sensitivity for ACS, ‘ruling out’ 37.7% patients (specificity 47.6%, positive predictive value (PPV) 34.0%). In the validation set (n=1459), T-MACS had 99.3% (98.3%–99.8%) NPV and 98.1% (95.2%–99.5%) sensitivity, ‘ruling out’ 40.4% (n=590) patients (specificity 47.0%, PPV 23.9%). T-MACS would ‘rule in’ 10.1% and 4.7% patients in the respective sets, of which 100.0% and 91.3% had ACS. C-statistics for the original and refined rules were similar (T-MACS 0.91 vs MACS 0.90 on validation). Conclusions T-MACS could ‘rule out’ ACS in 40% of patients, while ‘ruling in’ 5% at highest risk using a single hs-cTnT measurement on arrival. As a clinical decision aid, T-MACS could therefore help to conserve healthcare resources. PMID:27565197
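    The performance figures quoted above all follow from a 2x2 table of decision-aid output against 30-day ACS outcome. The sketch below shows the arithmetic on hypothetical counts chosen only to resemble, not reproduce, the validation cohort.

```python
def rule_out_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, NPV and proportion 'ruled out' for a rule-out test."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    ruled_out = (tn + fn) / (tp + fp + fn + tn)     # fraction with a negative (rule-out) result
    return sensitivity, specificity, npv, ruled_out

# hypothetical cohort: 100 ACS cases among 1500 attendances, 2 missed by the rule
sens, spec, npv, frac = rule_out_metrics(tp=98, fp=742, fn=2, tn=658)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, NPV {npv:.1%}, ruled out {frac:.1%}")
```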

  7. The emergence of levothyroxine as a treatment for hypothyroidism.

    PubMed

    Hennessey, James V

    2017-01-01

    To describe the historical refinements, understanding of physiology, and clinical outcomes observed with thyroid hormone replacement strategies. A Medline search was initiated using the search terms levothyroxine, thyroid hormone history, levothyroxine monotherapy, thyroid hormone replacement, combination LT4 therapy, and levothyroxine bioequivalence. Pertinent articles of interest were identified by title and, where available, abstract for further review. Additional references were identified in the course of reviewing the literature identified. Physicians have intervened in cases of thyroid dysfunction for more than two millennia. Ingestion of animal thyroid-derived preparations has long been described but only scientifically documented for the last 130 years. Refinements in hormone preparation, pharmaceutical production and regulation continue to this day. The literature provides documentation of physiologic, pathologic and clinical outcomes which have been reported and continuously updated. Recommendations for the effective and safe use of these hormones for the reversal of pathophysiology associated with hypothyroidism and the relief of its symptoms document a progressive refinement in our understanding of thyroid hormone use. Studies of thyroid hormone metabolism, action and pharmacokinetics have allowed ever more focused recommendations for use in clinical practice. Levothyroxine monotherapy has emerged as the therapy of choice in all recent major guidelines. The evolution of thyroid hormone therapies has been significant over an extended period of time. Thyroid hormone replacement is very useful in the treatment of those with hypothyroidism. All of the most recent guidelines of major endocrine societies recommend levothyroxine monotherapy as first-line therapy for hypothyroidism.

  8. Syntheses and characterization of elpasolite-type ammonium alkali metal hexafluorometallates(III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mi Jinxiao; Key Laboratory of New Processing Technology for Nonferrous Metals and Materials; Luo Shuming

    Crystal structures of three fluorides, (NH4)2NaFeF6 (Fe), (NH4)2NaGaF6 (Ga) and (NH4)2NaCrF6 (Cr), as well as a substituted compound [(NH4)1-xKx]2KAlF6 (x ≈ 0.17) (Al), have been refined using single-crystal and powder X-ray diffraction techniques. All four ammonium hexafluorides have a cubic elpasolite-type structure and crystallize in the space group Fm-3m with lattice constants a = 8.483(3), 8.450(3), 8.4472(2) and 8.724(3) Å for compounds (Fe), (Ga), (Cr) and (Al), respectively. The effective ionic radius of the ammonium ion calculated from these compounds has a mean value of R = 1.729 Å for CN = 12. An ultraviolet-visible absorption spectrum of (NH4)2NaCrF6, measured at room temperature, gives a crystal field parameter Dq = 1575 cm⁻¹ and Racah parameters B = 758 cm⁻¹ and C = 3374 cm⁻¹. Abnormal anisotropic thermal parameters of the fluorine atoms were observed in compound (Al) and interpreted as arising from four strong hydrogen bonds (F...H-N) distributed in a square around each fluorine atom. Graphical abstract: the square arrangement of F...H-N hydrogen bonds around each fluorine atom; the endmembers' phase transitions at low temperature are believed to be caused by these bonds.

  9. Adherence to outpatient epilepsy quality indicators at a tertiary epilepsy center.

    PubMed

    Pourdeyhimi, R; Wolf, B J; Simpson, A N; Martz, G U

    2014-10-01

    Quality indicators for the treatment of people with epilepsy were published in 2010. This is the first report of adherence to all measures in routine care of people with epilepsy at a level 4 comprehensive epilepsy center in the US. Two hundred patients with epilepsy were randomly selected from the clinics of our comprehensive epilepsy center, and all visits during 2011 were abstracted for documentation of adherence to the eight quality indicators. Alternative measures were constructed to evaluate failure of adherence. Detailed descriptions of all equations are provided. Objective measures (EEG, imaging) showed higher adherence than counseling measures (safety). Initial visits showed higher adherence. Variations in the interpretation of the quality measure result in different adherence values. Advanced practice providers and physicians had different adherence patterns. No patient-specific patterns of adherence were seen. This is the first report of adherence to all the epilepsy quality indicators for a sample of patients during routine care in a level 4 epilepsy center in the US. Overall adherence was similar to that previously reported on similar measures. Precise definitions of adherence equations are essential for accurate measurement. Complex measures result in lower adherence. Counseling measures showed low adherence, possibly highlighting a difference between practice and documentation. Adherence to the measures as written does not guarantee high quality care. The current quality indicators have value in the process of improving quality of care. Future approaches may be refined to eliminate complex measures and incorporate features linked to outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Hidden values in bauxite residue (red mud): Recovery of metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yanju; Naidu, Ravi, E-mail: ravi.naidu@unisa.edu.au

    Highlights: • Current iron recovery techniques using red mud are depicted. • Advantages and disadvantages exist in the different recovery processes. • Economical and environmentally friendly integrated usage of red mud is promising. - Abstract: Bauxite residue (red mud) is a hazardous waste generated from alumina refining industries. Unless managed properly, red mud poses significant risks to the local environment due to its extreme alkalinity and its potential impacts on surface and ground water quality. The ever-increasing generation of red mud poses significant challenges to the aluminium industries from a management perspective, given the low proportion that is currently being utilized beneficially. Red mud, in most cases, contains elevated concentrations of iron in addition to aluminium, titanium, sodium and valuable rare earth elements. Given the scarcity of iron supply globally, the iron content of red mud has attracted increasing research interest. This paper presents a critical overview of the current techniques employed for iron recovery from red mud. Information on the recovery of other valuable metals is also reviewed to provide an insight into the full potential usage of red mud as an economic resource rather than a waste. Traditional hydrometallurgy and pyrometallurgy are being investigated continuously. However, in this review several new techniques are introduced that consider the process of iron recovery from red mud. An integrated process which can achieve multiple additional values from red mud is much preferred over single-process methods. The information provided here should help to improve the future management and utilization of red mud.

  11. Three-dimensional structure of Erwinia carotovora L-asparaginase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kislitsyn, Yu. A.; Kravchenko, O. V.; Nikonov, S. V.

    2006-10-15

    The three-dimensional structure of Erwinia carotovora L-asparaginase, which has antitumor activity and is used for the treatment of acute lymphoblastic leukemia, was solved at 3 Å resolution and refined to Rcryst = 20% and Rfree = 28%. Crystals of recombinant Erwinia carotovora L-asparaginase were grown by the hanging-drop vapor-diffusion method from protein solutions in a HEPES buffer (pH 6.5), with PEG MME 5000 solutions in a cacodylate buffer (pH 6.5) as the precipitant. Three-dimensional X-ray diffraction data were collected up to 3 Å resolution from one crystal at room temperature. The structure was solved by the molecular replacement method using the coordinates of Erwinia chrysanthemi L-asparaginase as the starting model. The coordinates, refined with the use of the CNS program package, were deposited in the Protein Data Bank (PDB code 1ZCF).

  12. Effect of whole wheat flour on the quality, texture profile, and oxidation stability of instant fried noodles.

    PubMed

    Cao, Xinlei; Zhou, Sumei; Yi, Cuiping; Wang, Li; Qian, Haifeng; Zhang, Hui; Qi, Xiguang

    2017-12-01

    The effects of whole wheat flour (WWF) on the pasting properties of an instant fried noodle dry mix and on the quality of the final product were investigated in this research. Refined wheat flour in the recipe for instant fried noodles was replaced by WWF at different levels. The peak and final viscosities were significantly and negatively correlated with the WWF substitution level. With increasing WWF level, the hardness, cohesiveness, adhesiveness, and resilience values of the instant fried noodles decreased by 11.63, 16.23, 16.67, and 20.00%, respectively. WWF darkened the noodles' surface color and increased their oil content (26.63%). A porous and less uniform structure of the WWF instant fried noodles was observed by scanning electron microscopy. Moreover, WWF incorporation lowered the peroxide values of the instant fried noodles during storage. In conclusion, even though the oil content increased, WWF helped to inhibit oil oxidation and produced instant fried noodles with a softer texture and a less sticky surface. Refined wheat flour in the recipe for instant fried noodles was replaced by whole wheat flour (WWF), which is rich in dietary fibers, vitamins, and other bioactive compounds. The addition of WWF delayed the retrogradation tendency of the starch in the dry mix. WWF-added instant noodles had a softer texture, a less sticky surface, and a lower peroxide value. Based on the results of this study, the refined wheat flour in the recipe for instant fried noodles could be partially replaced by WWF to make noodles with a better texture profile and higher consumer acceptance. © 2017 Wiley Periodicals, Inc.

  13. Determination of remodeling parameters for a strain-adaptive finite element model of the distal ulna.

    PubMed

    Neuert, Mark A C; Dunning, Cynthia E

    2013-09-01

    Strain energy-based adaptive material models are used to predict bone resorption resulting from stress shielding induced by prosthetic joint implants. Generally, such models are governed by two key parameters: a homeostatic strain-energy state (K) and a threshold deviation from this state required to initiate bone reformation (s). A refinement procedure has been performed to estimate these parameters in the femur and glenoid; this study investigates the specific influences of these parameters on resulting density distributions in the distal ulna. A finite element model of a human ulna was created using micro-computed tomography (µCT) data, initialized to a homogeneous density distribution, and subjected to approximate in vivo loading. Values for K and s were tested, and the resulting steady-state density distribution compared with values derived from µCT images. The sensitivity of these parameters to initial conditions was examined by altering the initial homogeneous density value. The refined model parameters selected were then applied to six additional human ulnae to determine their performance across individuals. Model accuracy using the refined parameters was found to be comparable with that found in previous studies of the glenoid and femur, and gross bone structures, such as the cortical shell and medullary canal, were reproduced. The model was found to be insensitive to initial conditions; however, a fair degree of variation was observed between the six specimens. This work represents an important contribution to the study of changes in load transfer in the distal ulna following the implementation of commercial orthopedic implants.
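    The abstract does not give the update rule itself, but strain-energy adaptive models of this kind are typically driven by the strain-energy density per unit mass relative to the homeostatic set point K, with no remodeling inside a "lazy zone" of half-width s. The sketch below is one such rule with assumed constants (the rate constant B, time step, and density bounds) and a toy constant load; it is not the specific formulation or loading used in the study.

```python
def update_density(rho, U, K, s, B=100.0, dt=0.1, rho_min=0.01, rho_max=1.73):
    """One explicit remodeling step for a single element (assumed constants)."""
    stimulus = U / rho                      # strain-energy density per unit mass
    if stimulus > (1.0 + s) * K:            # overloaded -> deposit bone
        rho += dt * B * (stimulus - (1.0 + s) * K)
    elif stimulus < (1.0 - s) * K:          # underloaded -> resorb bone
        rho += dt * B * (stimulus - (1.0 - s) * K)
    return min(max(rho, rho_min), rho_max)  # keep density within physical bounds

rho = 0.8                                   # initial homogeneous density (g/cm^3)
for _ in range(300):                        # march to steady state under a fixed toy load
    rho = update_density(rho, U=0.02, K=0.02, s=0.1)
print(f"steady-state density: {rho:.3f} g/cm^3")
```

    The steady state settles on the edge of the lazy zone, which is the qualitative behaviour such models use to reproduce structures like the cortical shell and medullary canal.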

  14. Publication rates of the abstracts presented at the annual meeting of International Society for Pediatric Neurosurgery.

    PubMed

    Ekşi, Murat Şakir; Özcan-Ekşi, Emel Ece

    2018-01-19

    Publication of a study is the end point of the process of contributing to the literature and confirms the scientific value of the study. Publication rates of abstracts presented at annual neurosurgery meetings have been studied previously. However, publication rates of abstracts presented at annual pediatric neurosurgery meetings have not yet been reported. We evaluated the abstracts presented at the 38th annual meeting of the International Society for Pediatric Neurosurgery (ISPN) held in South Korea in 2010. We conducted this cross-sectional study by reviewing the abstracts presented at the annual meeting of the ISPN, 2010. Titles and authors of the abstracts were surveyed using Google Scholar and PubMed/MEDLINE. Time to publication, origin of the study, the journal in which the study was accepted and published, and type of study were analyzed for each abstract. The abstract booklet included 235 abstracts, consisting of 128 oral presentations (54%) and 107 electronic posters (46%). Fifty-nine (46%) of the oral presentations were published in a peer-reviewed journal. Laboratory studies were more likely to be published than clinical studies (72 vs. 39%). Thirty-two (30%) of the electronic posters were published in peer-reviewed journals. Most of the published abstracts were from Asia and Europe. Most of the abstracts were published in Child's Nervous System and Journal of Neurosurgery: Pediatrics. Publication rates of the abstracts presented at the annual meeting of the ISPN were comparable to those of other similar congresses. Oral presentations were more likely to be published. The high publication rate of the abstracts presented at the annual meeting of the ISPN suggests that the meeting has a high scientific value.

  15. Effects of Wheat and Oat-Based Whole Grain Foods on Serum Lipoprotein Size and Distribution in Overweight Middle Aged People: A Randomised Controlled Trial

    PubMed Central

    Tighe, Paula; Duthie, Garry; Brittenden, Julie; Vaughan, Nicholas; Mutch, William; Simpson, William G.; Duthie, Susan; Horgan, Graham W.; Thies, Frank

    2013-01-01

    Introduction Epidemiological studies suggest three daily servings of whole-grain foods (WGF) might lower cardiovascular disease risk, at least partly by lowering serum lipid levels. We have assessed the effects of consuming three daily portions of wholegrain food (provided as wheat or a mixture of wheat and oats) on lipoprotein subclass size and concentration in a dietary randomised controlled trial involving middle-aged healthy individuals. Methods After a 4-week run-in period on a refined diet, volunteers were randomly allocated to a control (refined diet), wheat, or wheat + oats group for 12 weeks. Our servings were determined in order to significantly increase the intakes of non-starch polysaccharides to the UK Dietary Reference Value of 18 g per day in the whole grain groups (18.5 g and 16.8 g per day in the wheat and wheat + oats groups respectively, in comparison with 11.3 g per day in the control group). Outcome measures were serum lipoprotein subclasses' size and concentration. Habitual dietary intake was assessed prior to and during the intervention. Of the 233 volunteers recruited, 24 withdrew and 3 were excluded. Results At baseline, significant associations were found between lipoprotein size and subclasses' concentrations and some markers of cardiovascular risk such as insulin resistance, blood pressure and serum intercellular adhesion molecule 1 concentration. Furthermore, alcohol and vitamin C intake were positively associated with an anti-atherogenic lipoprotein profile with regard to lipoprotein size and subclasses' distribution. However, none of the interventions with whole grain affected lipoprotein size and profile. Conclusion Our results indicate that three portions of wholegrain foods, irrespective of the type (wheat or oat-based), do not reduce cardiovascular risk by beneficially altering the size and distribution of lipoprotein subclasses. Trial Registration www.Controlled-Trials.com ISRCTN 27657880.

  16. Craniometric Analysis of the Hindbrain and Craniocervical Junction of Chihuahua, Affenpinscher and Cavalier King Charles Spaniel Dogs With and Without Syringomyelia Secondary to Chiari-Like Malformation

    PubMed Central

    Kiviranta, Anna-Mariam; McFadyen, Angus K.; Jokinen, Tarja S.; La Ragione, Roberto M.; Rusbridge, Clare

    2017-01-01

    Objectives To characterize and compare the phenotypic variables of the hindbrain and craniocervical junction associated with syringomyelia (SM) in the Chihuahua, Affenpinscher and Cavalier King Charles Spaniel (CKCS). Method Analysis of 273 T1-weighted mid-sagittal DICOM sequences of the hindbrain and craniocervical junction from 99 Chihuahuas, 42 Affenpinschers and 132 CKCSs. The study compared 22 morphometric features (11 lines, eight angles and three ratios) of dogs with and without SM using refined techniques based on previous studies of the Griffon Bruxellois (GB) using Discriminant Function Analysis and ANOVA with post-hoc corrections. Results The analysis identified 14/22 significant traits for SM in the three dog breeds, five of which were identical to those reported for the GB and suggest inclusion of a common aetiology. One ratio, caudal fossa height to the length of the skull base extended to an imaginary point of alignment between the atlas and supraoccipital bones, was common to all three breeds (p values 0.029 to <0.001). Associated with SM were a reduced occipital crest and two acute changes in angulation i) ‘sphenoid flexure’ at the spheno-occipital synchondrosis ii) ‘cervical flexure’ at the foramen magnum allied with medulla oblongata elevation. Comparing dogs with and without SM, each breed had a unique trait: Chihuahua had a smaller angle between the dens, atlas and basioccipital bone (p value < 0.001); Affenpinschers had a smaller distance from atlas to dens (p value 0.009); CKCS had a shorter distance between the spheno-occipital synchondrosis and atlas (p value 0.007). Conclusion The selected morphometries successfully characterised conformational changes in the brain and craniocervical junction that might form the basis of a diagnostic tool for all breeds. The severity of SM involved a spectrum of abnormalities, incurred by changes in both angulation and size that could alter neural parenchyma compliance and/or impede cerebrospinal fluid channels. PMID:28121988

  17. Craniometric Analysis of the Hindbrain and Craniocervical Junction of Chihuahua, Affenpinscher and Cavalier King Charles Spaniel Dogs With and Without Syringomyelia Secondary to Chiari-Like Malformation.

    PubMed

    Knowler, Susan P; Kiviranta, Anna-Mariam; McFadyen, Angus K; Jokinen, Tarja S; La Ragione, Roberto M; Rusbridge, Clare

    2017-01-01

    To characterize and compare the phenotypic variables of the hindbrain and craniocervical junction associated with syringomyelia (SM) in the Chihuahua, Affenpinscher and Cavalier King Charles Spaniel (CKCS). Analysis of 273 T1-weighted mid-sagittal DICOM sequences of the hindbrain and craniocervical junction from 99 Chihuahuas, 42 Affenpinschers and 132 CKCSs. The study compared 22 morphometric features (11 lines, eight angles and three ratios) of dogs with and without SM using refined techniques based on previous studies of the Griffon Bruxellois (GB) using Discriminant Function Analysis and ANOVA with post-hoc corrections. The analysis identified 14/22 significant traits for SM in the three dog breeds, five of which were identical to those reported for the GB and suggest inclusion of a common aetiology. One ratio, caudal fossa height to the length of the skull base extended to an imaginary point of alignment between the atlas and supraoccipital bones, was common to all three breeds (p values 0.029 to <0.001). Associated with SM were a reduced occipital crest and two acute changes in angulation i) 'sphenoid flexure' at the spheno-occipital synchondrosis ii) 'cervical flexure' at the foramen magnum allied with medulla oblongata elevation. Comparing dogs with and without SM, each breed had a unique trait: Chihuahua had a smaller angle between the dens, atlas and basioccipital bone (p value < 0.001); Affenpinschers had a smaller distance from atlas to dens (p value 0.009); CKCS had a shorter distance between the spheno-occipital synchondrosis and atlas (p value 0.007). The selected morphometries successfully characterised conformational changes in the brain and craniocervical junction that might form the basis of a diagnostic tool for all breeds. The severity of SM involved a spectrum of abnormalities, incurred by changes in both angulation and size that could alter neural parenchyma compliance and/or impede cerebrospinal fluid channels.

  18. 77 FR 23673 - Notice of Stakeholder Meeting: Industry Roundtable-DON/USDA/DOE/DOT-FAA Advanced Drop-In Biofuels...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-20

    ... participants in the biofuels supply chain. The purpose of the roundtable meeting is for the federal government... Production Value Chain: (feedstock provider, bio-refiner, finished products distributor, integrated effort...

  19. Reexamining organizational configurations: an update, validation, and expansion of the taxonomy of health networks and systems.

    PubMed

    Dubbs, Nicole L; Bazzoli, Gloria J; Shortell, Stephen M; Kralovec, Peter D

    2004-02-01

    To (a) assess how the original cluster categories of hospital-led health networks and systems have changed over time; (b) identify any new patterns of cluster configurations; and (c) demonstrate how additional data can be used to refine and enhance the taxonomy measures. Data sources: 1994 and 1998 American Hospital Association (AHA) Annual Survey of Hospitals. As in the original taxonomy, separate cluster solutions are identified for health networks and health systems by applying three strategic/structural dimensions (differentiation, integration, and centralization) to three components of the health service/product continuum (hospital services, physician arrangements, and provider-based insurance activities). Factor, cluster, and discriminant analyses are used to analyze the 1998 data. Descriptive and comparative methods are used to analyze the updated 1998 taxonomy relative to the original 1994 version. The 1998 cluster categories are similar to those of the original taxonomy; however, they reveal some new organizational configurations. For the health networks, centralization of product/service lines is occurring more selectively than in the past. For the health systems, participation has grown in and dispersed across a more diverse set of decentralized organizational forms. For both networks and systems, the definition of centralization has changed over time. In its updated form, the taxonomy continues to provide policymakers and practitioners with a descriptive and contextual framework against which to assess organizational programs and policies. There is a need to continue to revisit the taxonomy from time to time because of the persistent evolution of the U.S. health care industry and the consequent shifting of organizational configurations in this arena. There is also value in continuing to move the taxonomy in the direction of refinement/expansion as new opportunities become available.

  20. A refined index of model performance: a rejoinder

    USGS Publications Warehouse

    Legates, David R.; McCabe, Gregory J.

    2013-01-01

    Willmott et al. [Willmott CJ, Robeson SM, Matsuura K. 2012. A refined index of model performance. International Journal of Climatology, forthcoming. DOI:10.1002/joc.2419.] recently suggested a refined index of model performance (dr) that they purport to be superior to other methods. Their refined index ranges from −1.0 to 1.0 to resemble a correlation coefficient, but it is merely a linear rescaling of our modified coefficient of efficiency (E1) over the positive portion of the domain of dr. We disagree with Willmott et al. (2012) that dr provides a better interpretation; rather, E1 is more easily interpreted, such that a value of E1 = 1.0 indicates a perfect model (no errors) while E1 = 0.0 indicates a model that is no better than the baseline comparison (usually the observed mean). Negative values of E1 (and, for that matter, dr < 0.5) indicate a substantially flawed model, as they simply describe a ‘level of inefficacy’ for a model that is worse than the comparison baseline. Moreover, while dr is piecewise continuous, it is not continuous through the second and higher derivatives. We explain why the coefficient of efficiency (E or E2) and its modified form (E1) are superior and preferable to many other statistics, including dr, because of intuitive interpretability and because these indices have a fundamental meaning at zero. We also expand on the discussion begun by Garrick et al. [Garrick M, Cunnane C, Nash JE. 1978. A criterion of efficiency for rainfall-runoff models. Journal of Hydrology 36: 375-381.] and continued by Legates and McCabe [Legates DR, McCabe GJ. 1999. Evaluating the use of “goodness-of-fit” measures in hydrologic and hydroclimatic model validation. Water Resources Research 35(1): 233-241.] and Schaefli and Gupta [Schaefli B, Gupta HV. 2007. Do Nash values have value? Hydrological Processes 21: 2075-2080. DOI: 10.1002/hyp.6825.]. This important discussion focuses on the appropriate baseline comparison to use, and why the observed mean often may be an inadequate choice for model evaluation and development.
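
    The indices discussed in this rejoinder can be written down compactly. Below is a minimal sketch (not code from the authors) of the coefficient of efficiency E, its modified form E1 and Willmott's refined index dr, illustrating that dr = (E1 + 1)/2 wherever dr is non-negative; the observed/predicted arrays are made up.

```python
import numpy as np

def nash_sutcliffe(obs, pred):
    """E (E2): 1 minus the sum of squared errors over squared deviations from the observed mean."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

def modified_efficiency(obs, pred):
    """E1: same idea as E but with absolute rather than squared deviations."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 1.0 - np.sum(np.abs(obs - pred)) / np.sum(np.abs(obs - obs.mean()))

def refined_index(obs, pred):
    """Willmott et al. (2012) refined index dr, bounded on [-1, 1]."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    a = np.sum(np.abs(pred - obs))
    b = 2.0 * np.sum(np.abs(obs - obs.mean()))
    return 1.0 - a / b if a <= b else b / a - 1.0

# Hypothetical observations and model predictions.
obs  = np.array([2.1, 3.4, 4.0, 5.2, 6.8, 7.1])
pred = np.array([2.5, 3.1, 4.4, 5.0, 6.1, 7.9])

e1, dr = modified_efficiency(obs, pred), refined_index(obs, pred)
print(f"E  = {nash_sutcliffe(obs, pred):.3f}")
print(f"E1 = {e1:.3f}, dr = {dr:.3f}, (E1 + 1)/2 = {(e1 + 1) / 2:.3f}")  # dr == (E1+1)/2 when dr >= 0
```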

  1. Industrial garnet

    USGS Publications Warehouse

    Olson, D.W.

    2006-01-01

    In 2005, US production of crude garnet concentrate for industrial use was 28.4 kt valued at $3.05 million. Refined garnet material sold or used was 30.4 kt valued at $10 million. For the year, the US was one of the world's leading consumers of industrial garnet. Domestic values for crude concentrates for different applications ranged from about $53 to $120/t. In the short term, excess production capacity, combined with suppliers that vary in quality, grain size and mineral type, will keep prices down.

  2. Residual Distribution Schemes for Conservation Laws Via Adaptive Quadrature

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Abgrall, Remi; Biegel, Bryan (Technical Monitor)

    2000-01-01

    This paper considers a family of nonconservative numerical discretizations for conservation laws which retains the correct weak solution behavior in the limit of mesh refinement whenever sufficient order numerical quadrature is used. Our analysis of 2-D discretizations in nonconservative form follows the 1-D analysis of Hou and Le Floch. For a specific family of nonconservative discretizations, it is shown under mild assumptions that the error arising from non-conservation is strictly smaller than the discretization error in the scheme. In the limit of mesh refinement under the same assumptions, solutions are shown to satisfy an entropy inequality. Using results from this analysis, a variant of the "N" (Narrow) residual distribution scheme of van der Weide and Deconinck is developed for first-order systems of conservation laws. The modified form of the N-scheme supplants the usual exact single-state mean-value linearization of flux divergence, typically used for the Euler equations of gasdynamics, by an equivalent integral form on simplex interiors. This integral form is then numerically approximated using an adaptive quadrature procedure. This renders the scheme nonconservative in the sense described earlier so that correct weak solutions are still obtained in the limit of mesh refinement. Consequently, we then show that the modified form of the N-scheme can be easily applied to general (non-simplicial) element shapes and general systems of first-order conservation laws equipped with an entropy inequality where exact mean-value linearization of the flux divergence is not readily obtained, e.g. magnetohydrodynamics, the Euler equations with certain forms of chemistry, etc. Numerical examples of subsonic, transonic and supersonic flows containing discontinuities together with multi-level mesh refinement are provided to verify the analysis.
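
    As a side illustration of the kind of "adaptive quadrature procedure" mentioned above, here is a generic recursive adaptive Simpson integrator; it is a minimal sketch of adaptive quadrature in general, not the element-interior quadrature actually used in the modified N-scheme, and the tolerance-splitting strategy is a common textbook choice rather than the authors'.

```python
import math

def adaptive_simpson(f, a, b, tol=1e-8):
    """Recursively refine Simpson's rule until the local error estimate meets tol."""
    def simpson(fa, fm, fb, a, b):
        return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

    def recurse(a, b, fa, fm, fb, whole, tol):
        m, lm, rm = 0.5 * (a + b), 0.25 * (3 * a + b), 0.25 * (a + 3 * b)
        flm, frm = f(lm), f(rm)
        left  = simpson(fa, flm, fm, a, m)
        right = simpson(fm, frm, fb, m, b)
        if abs(left + right - whole) <= 15.0 * tol:   # Richardson-style error estimate
            return left + right + (left + right - whole) / 15.0
        return (recurse(a, m, fa, flm, fm, left, tol / 2) +
                recurse(m, b, fm, frm, fb, right, tol / 2))

    fa, fm, fb = f(a), f(0.5 * (a + b)), f(b)
    return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, a, b), tol)

# Example: integrate a smooth integrand on [0, pi]; the exact value is 2.
print(adaptive_simpson(math.sin, 0.0, math.pi))
```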

  3. The effects of cold rolling and the subsequent heat treatments on the shape memory and the superelasticity characteristics of Cu73Al16Mn11 shape memory alloy

    NASA Astrophysics Data System (ADS)

    Babacan, N.; Ma, J.; Turkbas, O. S.; Karaman, I.; Kockar, B.

    2018-01-01

    In the present study, the effect of thermo-mechanical treatments on the shape memory and the superelastic characteristics of Cu73Al16Mn11 (at%) shape memory alloy were investigated. 10%, 50% and 70% cold rolling and subsequent heat treatment processes were conducted to achieve strengthening via grain size refinement. 70% grain size reduction compared to the homogenized condition was obtained using 70% cold rolling and subsequent recrystallization heat treatment technique. Moreover, 10% cold rolling was applied to homogenized specimen to reveal the influence of the low percentage cold rolling reduction with no heat treatment on shape memory properties of Cu73Al16Mn11 (at%) alloy. Stress free transformation temperatures, monotonic tension and superelasticity behaviors of these samples were compared with those of the as-aged sample. Isobaric heating-cooling experiments were also conducted to see the dimensional stability of the samples as a function of applied stress. The 70% grain-refined sample exhibited better dimensional stability showing reduced residual strain levels upon thermal cycling under constant stress compared with the as-aged material. However, no improvement was achieved with grain size reduction in the superelasticity experiments. This distinctive observation was attributed to the difference in the magnitude of the stress levels achieved during two different types of experiments which were the isobaric heating-cooling and superelasticity tests. Intergranular fracture due to the stress concentration overcame the strengthening effect via grain refinement in the superelasticity tests at higher stress values. On the other hand, the strength of the material and resistance of material against plastic deformation upon phase transformation were increased as a result of the grain refinement at lower stress values in the isobaric heating-cooling experiments.

  4. Refined 3d-3d correspondence

    NASA Astrophysics Data System (ADS)

    Alday, Luis F.; Genolini, Pietro Benetti; Bullimore, Mathew; van Loon, Mark

    2017-04-01

    We explore aspects of the correspondence between Seifert 3-manifolds and 3d N = 2 supersymmetric theories with a distinguished abelian flavour symmetry. We give a prescription for computing the squashed three-sphere partition functions of such 3d N = 2 theories constructed from boundary conditions and interfaces in a 4d N = 2∗ theory, mirroring the construction of Seifert manifold invariants via Dehn surgery. This is extended to include links in the Seifert manifold by the insertion of supersymmetric Wilson-'t Hooft loops in the 4d N = 2∗ theory. In the presence of a mass parameter c for the distinguished flavour symmetry, we recover aspects of refined Chern-Simons theory with complex gauge group, and in particular construct an analytic continuation of the S-matrix of refined Chern-Simons theory.

  5. Occupational Therapy Home Safety Intervention via Telehealth

    PubMed Central

    BREEDEN, LORI E.

    2016-01-01

    Photography can be an effective addition for education-based telehealth services delivered by an occupational therapist. In this study, photography was used as antecedent to telehealth sessions delivered by an occupational therapist focused on narrative learning about home safety. After taking photographs of past home safety challenges, six participants experienced three web-based occupational therapy sessions. Sessions were recorded and transcribed. Data were examined using content analysis. The content analysis identified the following themes: the value of photos to support learning; the value of narrative learning related to home safety education; and abstract versus concrete learners. Procedural findings are included to support future endeavors. Findings indicate that within a wellness context, home safety education for older adults can be delivered effectively via telehealth when using photography as a part of an occupational therapy intervention. PMID:27563389

  6. Strategy and the art of reinventing value.

    PubMed

    van der Heijden, K; Maccoby, M; Hama, N; Lundquist, J T; Collis, D J; Zeithaml, C; Martin, J E; Carroll, V P; Lurie, R

    1993-01-01

    In "From Value Chain to Value Constellation: Designing Interactive Strategy" (July-August 1993), Richard Normann and Rafael Ramírez argue that successful companies increasingly do not just add value; they reinvent it. The key strategic task is to reconfigure roles and relationships among a constellation of actors--suppliers, business partners, customers--in order to mobilize the creation of value in new forms and by new players. What is so different about this new logic of value? It breaks down the distinction between products and services and combines them into activity-based "offerings" from which customers can create value for themselves. But as potential offerings become more complex, so do the relationships necessary to create them. As a result, a company's strategic task becomes the reconfiguration and integration of its competencies and customers. Normann and Ramírez provide three illustrations of these new rules of strategy. IKEA has blossomed into the world's largest retailer of home furnishings by redefining the relationships and organizational practices of the furniture business. Danish pharmacies and their national organization have used the opportunity of health care reform to reconfigure their relationships with customers, doctors, hospitals, drug manufacturers, and with Danish and international health organizations. (ABSTRACT TRUNCATED AT 250 WORDS)

  7. Lunar Science. 3: Revised abstracts of papers presented at the Third Lunar Science Conference

    NASA Technical Reports Server (NTRS)

    Watkins, C. (Editor)

    1972-01-01

    Prior to the meeting some 375 preliminary abstracts were printed for distribution to conference participants, with the provision that revised abstracts of up to three typed pages each could be submitted before the end of the conference. These updated expanded abstracts are collected here.

  8. Kinetic Modeling of a Silicon Refining Process in a Moist Hydrogen Atmosphere

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Morita, Kazuki

    2018-03-01

    We developed a kinetic model that considers both silicon loss and boron removal in a metallurgical grade silicon refining process. This model was based on the hypotheses of reversible reactions. The reaction rate coefficient kept the same form but error of terminal boron concentration could be introduced when relating irreversible reactions. Experimental data from published studies were used to develop a model that fit the existing data. At 1500 °C, our kinetic analysis suggested that refining silicon in a moist hydrogen atmosphere generates several primary volatile species, including SiO, SiH, HBO, and HBO2. Using the experimental data and the kinetic analysis of volatile species, we developed a model that predicts a linear relationship between the reaction rate coefficient k and both the quadratic function of p(H2O) and the square root of p(H2). Moreover, the model predicted the partial pressure values for the predominant volatile species and the prediction was confirmed by the thermodynamic calculations, indicating the reliability of the model. We believe this model provides a foundation for designing a silicon refining process with a fast boron removal rate and low silicon loss.

  9. Kinetic Modeling of a Silicon Refining Process in a Moist Hydrogen Atmosphere

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Morita, Kazuki

    2018-06-01

    We developed a kinetic model that considers both silicon loss and boron removal in a metallurgical grade silicon refining process. This model was based on the hypotheses of reversible reactions. The reaction rate coefficient kept the same form but error of terminal boron concentration could be introduced when relating irreversible reactions. Experimental data from published studies were used to develop a model that fit the existing data. At 1500 °C, our kinetic analysis suggested that refining silicon in a moist hydrogen atmosphere generates several primary volatile species, including SiO, SiH, HBO, and HBO2. Using the experimental data and the kinetic analysis of volatile species, we developed a model that predicts a linear relationship between the reaction rate coefficient k and both the quadratic function of p(H2O) and the square root of p(H2). Moreover, the model predicted the partial pressure values for the predominant volatile species and the prediction was confirmed by the thermodynamic calculations, indicating the reliability of the model. We believe this model provides a foundation for designing a silicon refining process with a fast boron removal rate and low silicon loss.
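
    One way to picture the reported dependence of the rate coefficient is an ordinary least-squares fit of k against p(H2O)² and √p(H2). The additive linear form and every number below are placeholders chosen only for illustration; the study's actual data and fitted coefficients are not reproduced here.

```python
import numpy as np

# Hypothetical (p_H2O, p_H2, k) points, in arbitrary but consistent units.
p_h2o = np.array([0.02, 0.04, 0.06, 0.08, 0.10])
p_h2  = np.array([0.50, 0.60, 0.70, 0.80, 0.90])
k     = np.array([1.1e-5, 1.9e-5, 3.0e-5, 4.5e-5, 6.2e-5])

# Design matrix for the assumed linear model k = a*p(H2O)^2 + b*sqrt(p(H2)) + c.
X = np.column_stack([p_h2o ** 2, np.sqrt(p_h2), np.ones_like(k)])
coeffs, *_ = np.linalg.lstsq(X, k, rcond=None)
a, b, c = coeffs
print(f"a = {a:.3e}, b = {b:.3e}, c = {c:.3e}")
```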

  10. Fluid mechanical model of the Helmholtz resonator

    NASA Technical Reports Server (NTRS)

    Hersh, A. S.; Walker, B.

    1977-01-01

    A semi-empirical fluid mechanical model of the acoustic behavior of Helmholtz resonators is presented which predicts impedance as a function of the amplitude and frequency of the incident sound pressure field and resonator geometry. The model assumes that the particle velocity approaches the orifice in a spherical manner. The incident and cavity sound fields are connected by solving the governing oscillating mass and momentum conservation equations. The model is in agreement with the Rayleigh slug-mass model at low values of incident sound pressure level. At high values, resistance is predicted to be independent of frequency, proportional to the square root of the amplitude of the incident sound pressure field, and virtually independent of resonator geometry. Reactance is predicted to depend in a very complicated way upon resonator geometry, incident sound pressure level, and frequency. Nondimensional parameters are defined that divide resonator impedance into three categories corresponding to low, moderately low, and intense incident sound pressure amplitudes. The two-microphone method was used to measure the impedance of a variety of resonators. The data were used to refine and verify the model.

  11. A deep learning approach for the analysis of masses in mammograms with minimal user intervention.

    PubMed

    Dhungel, Neeraj; Carneiro, Gustavo; Bradley, Andrew P

    2017-04-01

    We present an integrated methodology for detecting, segmenting and classifying breast masses from mammograms with minimal user intervention. This is a long standing problem due to low signal-to-noise ratio in the visualisation of breast masses, combined with their large variability in terms of shape, size, appearance and location. We break the problem down into three stages: mass detection, mass segmentation, and mass classification. For the detection, we propose a cascade of deep learning methods to select hypotheses that are refined based on Bayesian optimisation. For the segmentation, we propose the use of deep structured output learning that is subsequently refined by a level set method. Finally, for the classification, we propose the use of a deep learning classifier, which is pre-trained with a regression to hand-crafted feature values and fine-tuned based on the annotations of the breast mass classification dataset. We test our proposed system on the publicly available INbreast dataset and compare the results with the current state-of-the-art methodologies. This evaluation shows that our system detects 90% of masses at 1 false positive per image, has a segmentation accuracy of around 0.85 (Dice index) on the correctly detected masses, and overall classifies masses as malignant or benign with sensitivity (Se) of 0.98 and specificity (Sp) of 0.7. Copyright © 2017 Elsevier B.V. All rights reserved.
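
    The evaluation metrics quoted above are standard; a minimal sketch of how a Dice index and sensitivity/specificity can be computed from binary masks and labels (the toy arrays are hypothetical and unrelated to INbreast):

```python
import numpy as np

def dice(seg, gt):
    """Dice index between a predicted binary mask and a ground-truth mask."""
    seg, gt = np.asarray(seg, bool), np.asarray(gt, bool)
    return 2.0 * np.logical_and(seg, gt).sum() / (seg.sum() + gt.sum())

def sensitivity_specificity(pred_labels, true_labels):
    """Se = TP/(TP+FN), Sp = TN/(TN+FP) for malignant (1) vs benign (0)."""
    pred, true = np.asarray(pred_labels, bool), np.asarray(true_labels, bool)
    tp = np.logical_and(pred, true).sum()
    tn = np.logical_and(~pred, ~true).sum()
    fn = np.logical_and(~pred, true).sum()
    fp = np.logical_and(pred, ~true).sum()
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 4x4 segmentation masks and classification labels.
gt_mask   = np.array([[0,1,1,0],[0,1,1,0],[0,0,1,0],[0,0,0,0]])
pred_mask = np.array([[0,1,1,0],[0,1,0,0],[0,0,1,1],[0,0,0,0]])
print("Dice:", round(dice(pred_mask, gt_mask), 3))

true_labels = [1, 1, 0, 0, 1, 0]
pred_labels = [1, 1, 0, 1, 1, 0]
se, sp = sensitivity_specificity(pred_labels, true_labels)
print("Se:", round(se, 2), "Sp:", round(sp, 2))
```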

  12. Biodiesel production using waste frying oil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charpe, Trupti W.; Rathod, Virendra K., E-mail: vk.rathod@ictmumbai.edu.in

    2011-01-15

    Research highlights: • Waste sunflower frying oil is successfully converted to biodiesel using lipase as a catalyst. • Various process parameters that affect the conversion of the transesterification reaction, such as temperature, enzyme concentration, methanol:oil ratio and solvent, are optimized. • The inhibitory effect of methanol on lipase is reduced by adding methanol in three stages. • Non-polar solvents such as n-hexane and n-heptane increase the conversion of the transesterification reaction. - Abstract: Waste sunflower frying oil is used in biodiesel production by transesterification using an enzyme as a catalyst in a batch reactor. Various microbial lipases have been tested in the transesterification reaction to select an optimum lipase. The effects of various parameters such as temperature, methanol:oil ratio, enzyme concentration and solvent on the conversion to methyl ester have been studied. The Pseudomonas fluorescens enzyme yielded the highest conversion. Using the P. fluorescens enzyme, the optimum conditions included a temperature of 45 °C, an enzyme concentration of 5% and a methanol:oil molar ratio of 3:1. To avoid an inhibitory effect, the addition of methanol was performed in three stages. The conversion obtained after 24 h of reaction increased from 55.8% to 63.84% because of the stage-wise addition of methanol. The addition of a non-polar solvent results in a higher conversion compared to polar solvents. Transesterification of waste sunflower frying oil under the optimum conditions with single-stage methanol addition was also compared to that of refined sunflower oil.

  13. Three-dimensional printing and computer navigation assisted hemipelvectomy for en bloc resection of osteochondroma

    PubMed Central

    Zhang, Yaqing; Wen, Lianjiang; Zhang, Jun; Yan, Guoliang; Zhou, Yue; Huang, Bo

    2017-01-01

    Abstract Rationale: Three-dimensional (3D) printed templates can be designed to match an individual's anatomy, allowing surgeons to refine preoperative planning. In addition, the use of computer navigation (NAV) is gaining popularity to improve surgical accuracy in the resection of pelvic tumors. However, its use in combination with 3D printing to assist complex pelvic tumor resection has not been reported. Patient concerns: A 36-year-old man presented with left-sided pelvic pain and a fast-growing mass. He also complained of a 3-month history of radiating pain and numbness in the lower left extremity. Diagnoses: A biopsy revealed an osteochondroma with malignant potential. This osteochondroma arises from the ilium and involves the sacrum and lower lumbar vertebrae. Interventions: Here, we describe a novel combined application of 3D printing and intraoperative NAV systems to guide hemipelvectomy for en-bloc resection of the osteochondroma. The 3D printed template is analyzed during surgical planning and guides the initial intraoperative bone work to improve surgical accuracy and efficiency, while a computer NAV system provides real-time imaging during the tumor removal to achieve adequate resection margins and minimize the likelihood of injury to adjacent critical structures. Outcomes: The tumor mass and the invaded spinal structures were removed en bloc. Lessons: The combined application of 3D printing and computer NAV may be useful for tumor targeting and safe osteotomies in pelvic tumor surgery. PMID:28328842

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prince, K.R.; Schneider, B.J.

    This study obtained estimates of the hydraulic properties of the upper glacial and Magothy aquifers in the East Meadow area for use in analyzing the movement of reclaimed waste water through the aquifer system. This report presents drawdown and recovery data from the two aquifer tests of 1978 and 1985, describes the six methods of analysis used, and summarizes the results of the analyses in tables and graphs. The drawdown and recovery data were analyzed through three simple analytical equations, two curve-matching techniques, and a finite-element radial-flow model. The resulting estimates of hydraulic conductivity, anisotropy, and storage characteristics were used as initial input values to the finite-element radial-flow model (Reilly, 1984). The flow model was then used to refine the estimates of the aquifer properties by more accurately representing the aquifer geometry and field conditions of the pumping tests.

  15. Electrochemical hydrogenation of thiophene on SPE electrodes

    NASA Astrophysics Data System (ADS)

    Huang, Haiyan; Yuan, Penghui; Yu, Ying; Chung, Keng H.

    2017-01-01

    Electrochemical reduction desulfurization is a promising technology for petroleum refining: it is environmentally friendly, low cost and able to achieve a high degree of automation. Electrochemical hydrogenation of thiophene was performed in a three-electrode system in which an SPE electrode was the working electrode. The electrochemical desulfurization was studied by cyclic voltammetry and bulk electrolysis with coulometry (BEC) techniques. The cyclic voltammetry results showed that the electrochemical hydrogenation reduction reaction occurred at -0.4 V. The BEC results showed that the currents generated from thiophene hydrogenation reactions increased with temperature. The activation energy of thiophene electrolysis was calculated from the Arrhenius equation, and its low value indicated a diffusion-controlled reaction. From the products of the electrolytic reactions, two mechanistic pathways for the electrochemical hydrogenation of thiophene were proposed: ring opening followed by hydrogenation, and hydrogenation followed by ring opening.
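
    The Arrhenius step described above can be sketched as a linear fit of ln(i) against 1/T, with the activation energy taken from the slope; the temperatures and currents below are hypothetical placeholders, not the paper's measurements.

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical limiting currents (A) measured at several temperatures (K).
T = np.array([298.0, 308.0, 318.0, 328.0])
i = np.array([1.2e-4, 1.5e-4, 1.8e-4, 2.2e-4])

# Arrhenius form: i = A*exp(-Ea/(R*T))  =>  ln i = ln A - (Ea/R)*(1/T)
slope, intercept = np.polyfit(1.0 / T, np.log(i), 1)
Ea = -slope * R
print(f"Apparent activation energy: {Ea / 1000:.1f} kJ/mol")
# A low apparent activation energy is commonly read as a diffusion-controlled process.
```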

  16. Comparison of conference abstracts and presentations with full-text articles in the health technology assessments of rapidly evolving technologies.

    PubMed

    Dundar, Y; Dodd, S; Dickson, R; Walley, T; Haycox, A; Williamson, P R

    2006-02-01

    To assess the extent of use of data from conference abstracts and presentations in health technology assessments (HTAs) provided as part of the National Institute for Health and Clinical Excellence (NICE) appraisal process. Also to assess the methodological quality of trials from conference abstracts and presentations, the consistency of reporting major outcomes between these sources and subsequent full-length publications, the effect of inclusion or exclusion of data from these sources on the meta-analysis pooled effect estimates, and the timeliness of availability of data from these sources and full articles in relation to the development of technology assessment reviews (TARs). A survey of seven TAR groups. An audit of published TARs: included all NICE TARs published between January 2000 and October 2004. Case studies of selected TARs. Analyses of the results of the survey and audit were presented as a descriptive summary and in a tabular format. Sensitivity analyses were carried out to compare the effect of inclusion of data from abstracts and presentations on the meta-analysis pooled effect estimates by including data from both abstracts/presentations and full papers, and data from only full publications, included in the original TAR. These analyses were then compared with meta-analysis of data from trials that have subsequently been published in full. All seven TAR groups completed and returned the survey. Five out of seven groups reported a general policy that included searching for and including studies available as conference abstracts/presentations. Five groups responded that if they included data from these sources they would carry out methodological quality assessment of studies from these sources using the same assessment tools as for full publications, and manage the data from these sources in the same way as fully published reports. All groups reported that if relevant outcome data were reported in both an abstract/presentation and a full publication, they would only consider the data in the full publication. Conversely, if data were only available in conference abstract/presentation, all but two groups reported that they would extract and use the data from the abstract/presentation. In total, 63 HTA reports for NICE were identified. In 20 of 63 TARs (32%) explicit statements were made with regards to inclusion and assessment of data from abstracts/presentations. Thirty-eight (60%) identified at least one randomised controlled trial (RCT) available as a conference abstract or presentation. Of these, 26 (68%) included trials available as abstracts/presentations. About 80% (20/26) of the 26 TARs that included RCTs in abstract/presentation form carried out an assessment of the methodological quality of such trials. In 16 TARs full reports of these trials were used for quality assessment where both abstracts/presentations and subsequent full publications were available. Twenty-three of 63 TARs (37%) carried out a quantitative analysis of results. Of these, ten (43%) included trials that were available as abstracts/presentations in the review; however, only 60% (6/10) of these included data from abstracts/presentations in the data analysis of results. Thirteen TARs evaluated rapidly evolving technologies and only three of these identified and included trial data from conference abstracts/presentations and carried out a quantitative analysis where abstract/presentation data were used. These three TARs were used as case studies. 
In all three case studies the overall quality of reporting in abstracts/presentations was generally poor. In all case studies abstracts and presentations failed to describe the method of randomisation or allocation concealment. Overall, there was no mention of blinding in 66% (25/38) of the abstracts and in 26% (7/27) of the presentations included in case studies, and one presentation (4%) explicitly stated use of intention-to-treat analysis. Results from one case study demonstrated discrepancies in data made available in abstracts or online conference presentations. Not only were discrepancies evident between these sources, but also comparison of conference abstracts/presentations with subsequently published full-length articles demonstrates data discrepancies in reporting of results. Sensitivity analyses based on one case study indicated a change in significance of effect in two outcome measures when only full papers published to date were included. There are variations in policy and practice across TAR groups regarding searching for and inclusion of studies available as conference abstracts/presentations. There is also variation in the level of detail reported in TARs regarding the use of abstracts/presentations. Therefore, TAR teams should be encouraged to state explicitly their search strategies for identifying conference abstracts and presentations, their methods for assessing these for inclusion, and where appropriate how the data were used and their effect on the results. Comprehensive searching for trials available as conference abstracts/presentations is time consuming and may be of questionable value. However, there may be a case for searching for and including abstract/presentation data if, for example, other sources of data are limited. If conference abstracts/presentations are to be included, the TAR teams need to allocate additional time for searching and managing data from these sources. Incomplete reporting in conference abstracts and presentations limits the ability of reviewers to assess confidently the methodological quality of trials. Where conference abstracts and presentations are considered for inclusion in the review, the TAR teams should increase their efforts to obtain further study details by contacting trialists. Where abstract/presentation data are included, reviewers should discuss the effect of including data from these sources. Any data discrepancies identified across sources in TARs should be highlighted and their impact discussed in the review. In addition, there is a need to carry out, for example, a sensitivity analysis with and without abstract/presentation data in the analysis. There is a need for research into the development of search strategies specific to identification of studies available as conference abstracts and presentations in TARs. Such strategies may include guidance with regard to identification of relevant electronic databases and appropriate conference sites relevant to certain clinical areas. As there are limited case studies included in this report, analyses should be repeated as more TARs accrue, or include the work of other international HTA groups.
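
    A minimal sketch of the kind of sensitivity analysis recommended above: pool study effects by inverse-variance weighting with and without the trials that are only available as conference abstracts. The fixed-effect model and all numbers are illustrative assumptions, not data from any TAR.

```python
import numpy as np

def pooled_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooled estimate and its standard error."""
    est = np.asarray(estimates, float)
    w = 1.0 / np.asarray(std_errors, float) ** 2
    return float(np.sum(w * est) / np.sum(w)), float(np.sqrt(1.0 / np.sum(w)))

# Hypothetical trial results: log odds ratios with standard errors; the last
# two trials are imagined to be available only as conference abstracts.
log_or        = [-0.40, -0.25, -0.55, -0.10, -0.70]
se            = [ 0.20,  0.25,  0.30,  0.35,  0.40]
abstract_only = [False, False, False, True,  True]

with_all = pooled_fixed_effect(log_or, se)
full_papers_only = pooled_fixed_effect(
    [e for e, a in zip(log_or, abstract_only) if not a],
    [s for s, a in zip(se, abstract_only) if not a],
)
print("pooled log OR, all trials:        %.3f (SE %.3f)" % with_all)
print("pooled log OR, full papers only:  %.3f (SE %.3f)" % full_papers_only)
```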

  17. Analysis of three different equations for predicting quadriceps femoris muscle strength in patients with COPD *

    PubMed Central

    Nellessen, Aline Gonçalves; Donária, Leila; Hernandes, Nidia Aparecida; Pitta, Fabio

    2015-01-01

    Abstract Objective: To compare equations for predicting peak quadriceps femoris (QF) muscle force; to determine the agreement among the equations in identifying QF muscle weakness in COPD patients; and to assess the differences in characteristics among the groups of patients classified as having or not having QF muscle weakness by each equation. Methods: Fifty-six COPD patients underwent assessment of peak QF muscle force by dynamometry (maximal voluntary isometric contraction of knee extension). Predicted values were calculated with three equations: an age-height-weight-gender equation (Eq-AHWG); an age-weight-gender equation (Eq-AWG); and an age-fat-free mass-gender equation (Eq-AFFMG). Results: Comparison of the percentage of predicted values obtained with the three equations showed that the Eq-AHWG gave higher values than did the Eq-AWG and Eq-AFFMG, with no difference between the last two. The Eq-AHWG showed moderate agreement with the Eq-AWG and Eq-AFFMG, whereas the last two also showed moderate, albeit lower, agreement with each other. In the sample as a whole, QF muscle weakness (< 80% of predicted) was identified by the Eq-AHWG, Eq-AWG, and Eq-AFFMG in 59%, 68%, and 70% of the patients, respectively (p > 0.05). Age, fat-free mass, and body mass index are characteristics that differentiate between patients with and without QF muscle weakness. Conclusions: The three equations were statistically equivalent in classifying COPD patients as having or not having QF muscle weakness. However, the Eq-AHWG gave higher peak force values than did the Eq-AWG and the Eq-AFFMG, as well as showing greater agreement with the other equations. PMID:26398750
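
    A minimal sketch of the classification step implied above: express the measured peak force as a percentage of an equation-predicted value and flag weakness below the 80% threshold. The measured and predicted forces are placeholders; none of the three published prediction equations is reproduced here.

```python
def percent_predicted(measured_force, predicted_force):
    """Measured peak QF force as a percentage of the equation-predicted value."""
    return 100.0 * measured_force / predicted_force

def has_qf_weakness(measured_force, predicted_force, threshold=80.0):
    """Weakness is defined here as < 80% of the predicted peak force."""
    return percent_predicted(measured_force, predicted_force) < threshold

# Hypothetical patient: measured 28 kgf against an equation-predicted 40 kgf.
measured, predicted = 28.0, 40.0
print(f"{percent_predicted(measured, predicted):.0f}% of predicted,",
      "weakness" if has_qf_weakness(measured, predicted) else "no weakness")
```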

  18. In the wake of suicide: Developing guidelines for suicide postvention in fire service

    PubMed Central

    Gulliver, Suzy Bird; Pennington, Michelle L.; Leto, Frank; Cammarata, Claire; Ostiguy, William; Zavodny, Cynthia; Flynn, Elisa J.; Kimbrel, Nathan A.

    2016-01-01

    ABSTRACT This project aimed to develop a standard operating procedure (SOP) for suicide postvention in Fire Service. First, an existing SOP was refined through expert review. Next, focus groups were conducted with fire departments lacking a peer suicide postvention SOP; feedback obtained guided revisions. The current article describes the iterative process used to evaluate and revise a Suicide Postvention SOP into a Postvention guideline that is available for implementation and evaluation. Postventions assist survivors in grief and bereavement and attempt to prevent additional negative outcomes. The implementation of suicide postvention guidelines will increase behavioral wellness within Fire Service. PMID:26332212

  19. "Cast Your Net Widely": Three Steps to Expanding and Refining Your Problem before Action Learning Application

    ERIC Educational Resources Information Center

    Reese, Simon R.

    2015-01-01

    This paper reflects upon a three-step process to expand the problem definition in the early stages of an action learning project. The process created a community-powered problem-solving approach within the action learning context. The simple three steps expanded upon in the paper create independence, dependence, and inter-dependence to aid the…

  20. Comprehension of concrete and abstract words in semantic variant primary progressive aphasia and Alzheimer's disease: A behavioral and neuroimaging study.

    PubMed

    Joubert, Sven; Vallet, Guillaume T; Montembeault, Maxime; Boukadi, Mariem; Wilson, Maximiliano A; Laforce, Robert Jr; Rouleau, Isabelle; Brambati, Simona M

    2017-07-01

    The aim of this study was to investigate the comprehension of concrete, abstract and abstract emotional words in semantic variant primary progressive aphasia (svPPA), Alzheimer's disease (AD), and healthy elderly adults (HE). Three groups of participants (9 svPPA, 12 AD, 11 HE) underwent a general neuropsychological assessment, a similarity judgment task, and structural brain MRI. The three types of words were processed similarly in the group of AD participants. In contrast, patients in the svPPA group were significantly more impaired at processing concrete words than abstract words, while comprehension of abstract emotional words was in between. VBM analyses showed that comprehension of concrete words relative to abstract words was significantly correlated with atrophy in the left anterior temporal lobe. These results support the view that concrete words are disproportionately impaired in svPPA, and that concrete and abstract words may rely upon partly dissociable brain regions. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Structural studies of TiC1−xOx solid solution by Rietveld refinement and first-principles calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Bo, E-mail: youqin5912@yahoo.com.cn; Hou, Na; Huang, Shanyan

    2013-08-15

    The lattice parameters, structural stability and electronic structure of the titanium oxycarbide (TiC1−xOx, 0≤x≤1) solid solution were investigated by Rietveld refinement and first-principles calculations. A series of TiC1−xOx compositions was precisely synthesized by a sintering process under vacuum. Rietveld refinement of the XRD patterns shows the behaviour of a continuous solid solution in TiC1−xOx over the whole composition range. The lattice parameter decreases from 0.4324 nm to 0.4194 nm with increasing oxygen concentration. First-principles calculations reveal that the disordered C/O structure is more stable than the ordered C/O structure. Further investigation of vacancies in the Ti1−Va(C1−xOx)1−Va solid solution shows that a structure with vacancies segregated in the TiO part is more stable than the disordered C/O structure, which can be ascribed to the Ti–Ti bond across the O vacancy and the charge redistribution around the Ti vacancy, as seen in the electron density difference plots and PDOS. - Graphical abstract: XRD of a series of titanium oxycarbides (TiC1−xOx, 0≤x≤1) prepared by adjusting the proportion of TiO in the starting material. Highlights: • Titanium oxycarbides were obtained by sintering TiO and TiC under carefully controlled conditions. • Rietveld refinement results show a continuous solid solution with FCC structure in TiC1−xOx. • The disordered C/O structure is more stable than the ordered C/O structure. • Introduction of vacancies segregated in the TiO part is more stable than the disordered C/O structure. • The Ti–Ti bond across the O vacancy and the charge redistribution around the Ti vacancy enhance structural stability.
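
    For orientation only, the reported composition dependence of the lattice parameter (0.4324 nm at x = 0 to 0.4194 nm at x = 1) can be pictured with a Vegard-type linear interpolation; treating the variation as strictly linear is an assumption of this sketch, not a finding of the study.

```python
A_TIC, A_TIO = 0.4324, 0.4194  # lattice parameters (nm) of the end members

def lattice_parameter(x):
    """Vegard-type linear estimate for TiC(1-x)O(x), 0 <= x <= 1 (illustrative assumption)."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("x must lie in [0, 1]")
    return (1.0 - x) * A_TIC + x * A_TIO

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f}: a ~ {lattice_parameter(x):.4f} nm")
```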

  2. Three-dimensional structure of photosystem II from Thermosynechococcus elongatus in complex with terbutryn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabdulkhakov, A. G., E-mail: azat@vega.protes.ru; Dontsova, M. V.; Saenger, W.

    Photosystem II is a key component of the photosynthetic pathway producing oxygen at the thylakoid membrane of cyanobacteria, green algae, and plants. The three-dimensional structure of photosystem II from the cyanobacterium Thermosynechococcus elongatus in a complex with the herbicide terbutryn (a photosynthesis inhibitor) was determined for the first time by X-ray diffraction and refined at 3.2 Å resolution (Rfactor = 26.9%, Rfree = 29.9%, rmsd for bond lengths 0.013 Å, rmsd for bond angles 2.2°). The terbutryn molecule was located in the binding pocket of the mobile plastoquinone. The atomic coordinates of the refined structure of photosystem II in a complex with terbutryn were deposited in the Protein Data Bank.

  3. Variability in δ13C values between individual Daphnia ephippia: Implications for palaeo-studies

    NASA Astrophysics Data System (ADS)

    Schilder, Jos; van Roij, Linda; Reichart, Gert-Jan; Sluijs, Appy; Heiri, Oliver

    2018-06-01

    The stable carbon isotope ratio (δ13C value) of Daphnia spp. resting egg shells (ephippia) provides information on past changes in Daphnia diet. Measurements are typically performed on samples of ≥20 ephippia, which obscures the range of values associated with individual ephippia. Using a recently developed laser ablation-based technique, we perform multiple δ13C analyses on individual ephippia, which show a high degree of reproducibility (standard deviations 0.1-0.5‰). We further measured δ13C values of 13 ephippia from surface sediments of three Swiss lakes. In the well-oxygenated lake with low methane concentrations, δ13C values are close to values typical for algae (-31.4‰) and the range in values is relatively small (5.8‰). This variability is likely driven by seasonal (or inter-annual) variability in algae δ13C values. In two seasonally anoxic lakes with higher methane concentrations, average values were lower (-41.4 and -43.9‰, respectively) and the ranges much larger (10.7 and 20.0‰). We attribute this variability to seasonal variation in incorporation of methane-derived carbon. In one lake we identify two statistically distinct isotopic populations, which may reflect separate production peaks. The potentially large within-sample variability should be considered when interpreting small-amplitude, short-lived isotope excursions based on samples consisting of few ephippia. We show that measurements on single ephippia can be performed using laser ablation, which allows for refined assessments of past Daphnia diet and carbon cycling in lake food webs. Furthermore, our study provides a basis for similar measurements on other chitinous remains (e.g. from chironomids, bryozoans).

  4. Assessment of the USCENTCOM Medical Distribution Structure

    PubMed Central

    Welser, William; Yoho, Keenan D.; Robbins, Marc; Peltz, Eric; Van Roo, Ben D.; Resnick, Adam C.; Harper, Ronald E.

    2012-01-01

    Abstract This study examined whether there might be a medical supply and distribution structure for U.S. Central Command (USCENTCOM) that would maintain or improve performance while reducing costs. The authors evaluated the likely performance and cost implications of the range of possibilities, considering both the medical and nonmedical logistics structures, for providing medical supplies to support medical activities in USCENTCOM. They found that three options would preserve or improve performance while either lowering or not increasing costs. Additionally, they considered how the value of these solutions would likely change with future shifts in USCENTCOM operations. PMID:28083245

  5. Weak data do not make a free lunch, only a cheap meal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew

    2014-01-17

    Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as Rmerge and I/σ(I), optical resolution and the correlation coefficients CC1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and Rfree as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting a too conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.
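
    One of the indicators named above, CC*, is obtained from the half-dataset correlation CC1/2 and estimates how well the merged data correlate with the (unmeasured) true signal; a minimal sketch of that commonly used conversion, with made-up CC1/2 values for a few resolution shells:

```python
import math

def cc_star(cc_half):
    """CC* from the half-dataset correlation CC1/2 (Karplus & Diederichs-style estimate)."""
    return math.sqrt(2.0 * cc_half / (1.0 + cc_half))

# Hypothetical CC1/2 values for successive high-resolution shells.
for cc_half in (0.95, 0.60, 0.30, 0.10):
    print(f"CC1/2 = {cc_half:.2f}  ->  CC* = {cc_star(cc_half):.2f}")
```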

  6. Variationally consistent discretization schemes and numerical algorithms for contact problems

    NASA Astrophysics Data System (ADS)

    Wohlmuth, Barbara

    We consider variationally consistent discretization schemes for mechanical contact problems. Most of the results can also be applied to other variational inequalities, such as those for phase transition problems in porous media, for plasticity or for option pricing applications from finance. The starting point is to weakly incorporate the constraint into the setting and to reformulate the inequality in the displacement in terms of a saddle-point problem. Here, the Lagrange multiplier represents the surface forces, and the constraints are restricted to the boundary of the simulation domain. Having a uniform inf-sup bound, one can then establish optimal low-order a priori convergence rates for the discretization error in the primal and dual variables. In addition to the abstract framework of linear saddle-point theory, complementarity terms have to be taken into account. The resulting inequality system is solved by rewriting it equivalently by means of the non-linear complementarity function as a system of equations. Although it is not differentiable in the classical sense, semi-smooth Newton methods, yielding super-linear convergence rates, can be applied and easily implemented in terms of a primal-dual active set strategy. Quite often the solution of contact problems has a low regularity, and the efficiency of the approach can be improved by using adaptive refinement techniques. Different standard types, such as residual- and equilibrated-based a posteriori error estimators, can be designed based on the interpretation of the dual variable as Neumann boundary condition. For the fully dynamic setting it is of interest to apply energy-preserving time-integration schemes. However, the differential algebraic character of the system can result in high oscillations if standard methods are applied. A possible remedy is to modify the fully discretized system by a local redistribution of the mass. Numerical results in two and three dimensions illustrate the wide range of possible applications and show the performance of the space discretization scheme, non-linear solver, adaptive refinement process and time integration.
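
    As a toy illustration of the primal-dual active set idea described above (one realization of a semi-smooth Newton method), consider a one-dimensional obstacle problem discretized with a finite-difference Laplacian; the mesh, load and obstacle are invented for this sketch and have nothing to do with the contact problems treated in the paper.

```python
import numpy as np

def obstacle_pdas(n=50, c=1.0, max_iter=50):
    """Primal-dual active set iteration for a 1D obstacle problem:
    minimize 0.5*u'Au - f'u subject to u >= psi, with A the FD Laplacian."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)

    # SPD finite-difference Laplacian, a downward load, and a flat obstacle.
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    f = -10.0 * np.ones(n)
    psi = -0.08 * np.ones(n)

    u = np.zeros(n)
    lam = np.zeros(n)                        # multiplier ~ contact force
    active_prev = None
    for _ in range(max_iter):
        active = lam + c * (psi - u) > 0     # predicted contact nodes
        if active_prev is not None and np.array_equal(active, active_prev):
            break                            # active set settled: converged
        # Solve: u = psi on the active set, (A u)_i = f_i on the inactive set.
        K = A.copy()
        rhs = f.copy()
        K[active, :] = 0.0
        K[active, active] = 1.0
        rhs[active] = psi[active]
        u = np.linalg.solve(K, rhs)
        lam = A @ u - f                      # residual plays the role of the contact force
        lam[~active] = 0.0
        active_prev = active
    return x, u, lam

x, u, lam = obstacle_pdas()
print("contact nodes:", int((lam > 1e-10).sum()), "min u:", round(u.min(), 4))
```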

  7. Distance matrix-based approach to protein structure prediction.

    PubMed

    Kloczkowski, Andrzej; Jernigan, Robert L; Wu, Zhijun; Song, Guang; Yang, Lei; Kolinski, Andrzej; Pokarowski, Piotr

    2009-03-01

    Much structural information is encoded in the internal distances; a distance matrix-based approach can be used to predict protein structure and dynamics, and for structural refinement. Our approach is based on the square distance matrix D = [r_ij²] containing all square distances between residues in proteins. This distance matrix contains more information than the contact matrix C, that has elements of either 0 or 1 depending on whether the distance r_ij is greater or less than a cutoff value r_cutoff. We have performed spectral decomposition of the distance matrices, D = Σ_k λ_k v_k v_kᵀ, in terms of eigenvalues λ_k and the corresponding eigenvectors v_k, and found that it contains at most five nonzero terms. A dominant eigenvector is proportional to r², the square distance of points from the center of mass, with the next three being the principal components of the system of points. By predicting r² from the sequence we can approximate a distance matrix of a protein with an expected RMSD value of about 7.3 Å, and by combining it with the prediction of the first principal component we can improve this approximation to 4.0 Å. We can also explain the role of hydrophobic interactions for the protein structure, because r is highly correlated with the hydrophobic profile of the sequence. Moreover, r is highly correlated with several sequence profiles which are useful in protein structure prediction, such as contact number, the residue-wise contact order (RWCO) or mean square fluctuations (i.e. crystallographic temperature factors). We have also shown that the next three components are related to spatial directionality of the secondary structure elements, and they may be also predicted from the sequence, improving overall structure prediction. We have also shown that the large number of available HIV-1 protease structures provides a remarkable sampling of conformations, which can be viewed as direct structural information about the dynamics. After structure matching, we apply principal component analysis (PCA) to obtain the important apparent motions for both bound and unbound structures. There are significant similarities between the first few key motions and the first few low-frequency normal modes calculated from a static representative structure with an elastic network model (ENM) that is based on the contact matrix C (related to D), strongly suggesting that the variations among the observed structures and the corresponding conformational changes are facilitated by the low-frequency, global motions intrinsic to the structure. Similarities are also found when the approach is applied to an NMR ensemble, as well as to atomic molecular dynamics (MD) trajectories. Thus, a sufficiently large number of experimental structures can directly provide important information about protein dynamics, but ENM can also provide a similar sampling of conformations. Finally, we use distance constraints from databases of known protein structures for structure refinement. We use the distributions of distances of various types in known protein structures to obtain the most probable ranges or the mean-force potentials for the distances. We then impose these constraints on structures to be refined or include the mean-force potentials directly in the energy minimization so that more plausible structural models can be built. This approach has been successfully used by us in 2006 in the CASPR structure refinement (http://predictioncenter.org/caspR).
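
    A small numerical sketch of the decomposition described above: build the squared-distance matrix for a toy set of centred 3D points, eigendecompose it, and inspect how many eigenvalues are (numerically) nonzero and how strongly the leading eigenvector correlates with r², the squared distance from the centre of mass. The random coordinates are a stand-in, not protein data.

```python
import numpy as np

rng = np.random.default_rng(0)
coords = rng.normal(size=(30, 3))                 # toy "residue" coordinates
coords -= coords.mean(axis=0)                     # put the centre of mass at the origin

# Squared-distance matrix D[i, j] = |r_i - r_j|^2.
diff = coords[:, None, :] - coords[None, :, :]
D = np.sum(diff ** 2, axis=-1)

# Spectral decomposition D = sum_k lambda_k v_k v_k^T; at most five eigenvalues
# are expected to be nonzero for points embedded in 3D.
eigvals, eigvecs = np.linalg.eigh(D)
order = np.argsort(-np.abs(eigvals))
print("largest |eigenvalues|:", np.round(np.abs(eigvals[order[:6]]), 2))

# The paper reports that a dominant eigenvector is proportional to r^2;
# check the correlation for this toy configuration.
r2 = np.sum(coords ** 2, axis=1)
v1 = eigvecs[:, order[0]]
corr = np.corrcoef(r2, v1)[0, 1]
print("correlation of leading eigenvector with r^2:", round(abs(corr), 3))
```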

  8. 76 FR 68188 - Valero Refining-Texas, L.P. v. Port of Corpus Christi Authority of Nueces County, TX; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-03

    ... related to the value of services rendered to Complainant.'' Further, ``[t]hrough application of such... and desist from engaging in the aforesaid violations of the Shipping Act; putting in force such...

  9. Refining Lung Cancer Screening Criteria in the Era of Value-Based Medicine.

    PubMed

    Shapiro, Steven D

    2017-02-01

    In a Perspective on the Research Article by ten Haaf and colleagues, Steven Shapiro discusses the challenges of identifying screening strategies that maximize the number of cancers diagnosed, while minimizing the harms of overdiagnosis and maintaining cost-effectiveness.

  10. Diagnostic Value of Endorectal Ultrasound in Preoperative Assessment of Lymph Node Involvement in Colorectal Cancer: a Meta-analysis.

    PubMed

    Li, Li; Chen, Shi; Wang, Ke; Huang, Jiao; Liu, Li; Wei, Sheng; Gao, Hong-Yu

    2015-01-01

    Nodal invasion by colorectal cancer is a critical determinant in estimating patient survival and in choosing appropriate preoperative treatment. The present meta-analysis was designed to evaluate the diagnostic value of endorectal ultrasound (EUS) in preoperative assessment of lymph node involvement in colorectal cancer. We systematically searched PubMed, Web of Science, Embase, and China National Knowledge Infrastructure (CNKI) databases for relevant studies published on or before December 10th, 2014. The sensitivity, specificity, likelihood ratios, diagnostic odds ratio (DOR) and area under the summary receiver operating characteristics curve (AUC) were assessed to estimate the diagnostic value of EUS. Subgroup analysis and meta-regression were performed to explore heterogeneity across studies. Thirty-three studies covering 3,016 subjects were included. The pooled sensitivity and specificity were 0.69 (95%CI: 0.63-0.75) and 0.77 (95%CI: 0.73-0.82), respectively. The positive and negative likelihood ratios were 3.09 (95%CI: 2.52-3.78) and 0.39 (95%CI: 0.32-0.48), respectively. The DOR was 7.84 (95%CI: 5.56-11.08), and AUC was 0.80 (95%CI: 0.77-0.84). This meta-analysis indicated that EUS has moderate diagnostic value in preoperative assessment of lymph node involvement in colorectal cancer. Further refinements in technology and diagnostic criteria are necessary to improve the diagnostic accuracy of EUS.
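
    The pooled summary statistics quoted above are linked by simple identities; a minimal sketch using only the pooled point estimates (ignoring the confidence intervals and the bivariate meta-analysis model actually needed to obtain them):

```python
def likelihood_ratios(sens, spec):
    """Positive and negative likelihood ratios from sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)
    lr_neg = (1.0 - sens) / spec
    return lr_pos, lr_neg

def diagnostic_odds_ratio(sens, spec):
    """DOR = LR+ / LR- = (sens/(1-sens)) / ((1-spec)/spec)."""
    lr_pos, lr_neg = likelihood_ratios(sens, spec)
    return lr_pos / lr_neg

# Pooled point estimates reported for EUS nodal staging.
sens, spec = 0.69, 0.77
lr_pos, lr_neg = likelihood_ratios(sens, spec)
print(f"LR+ ~ {lr_pos:.2f}, LR- ~ {lr_neg:.2f}, DOR ~ {diagnostic_odds_ratio(sens, spec):.1f}")
# Close to, but not identical with, the pooled LRs and DOR in the abstract,
# which were estimated directly from the study-level 2x2 tables.
```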

  11. Nutritive value of selected variety breads and pastas.

    PubMed

    Ranhotra, G S; Gelroth, J A; Novak, F A; Bock, M A; Winterringer, G L; Matthews, R H

    1984-03-01

    Nine types of commercially produced variety breads, plain bagels, corn tortillas, and three types of pasta products were obtained from each of four cities, New York, San Francisco, Atlanta, and Kansas City. Proximate components and 12 minerals and vitamins were determined in these and in cooked pasta products. Available carbohydrate and energy values were calculated. On the average, French, Italian, and pita breads were lower in moisture than other breads. Protein in bread products averaged between 7.6% and 10.4% and in cooked pastas and tortillas between 4.4% and 5.3%. Bagels averaged 10.2% protein. Insoluble dietary fiber in whole wheat bread averaged 5.6%; for most products, dietary fiber values were five- to eightfold higher than crude fiber values. Pasta products and tortillas were virtually free of sodium. Sodium in bread products averaged between 379 and 689 mg/100 gm. Although all pasta products and most bread products were enriched, calcium was often not included. Iron averaged from 2.16 to 3.29 mg/100 gm in bread products and 3.10 to 4.24 mg/100 gm in dry pasta products. Products made with unrefined or less-refined flours and/or containing germ and bran tended to be high in phosphorus, magnesium, zinc, and manganese, and, to a lesser extent, in copper. A good portion of potassium, thiamin, riboflavin, and niacin in pasta products was lost during cooking.

  12. Quantitation of tr-cinnamaldehyde, safrole and myristicin in cola-flavoured soft drinks to improve the assessment of their dietary exposure.

    PubMed

    Raffo, Antonio; D'Aloise, Antonio; Magrì, Antonio L; Leclercq, Catherine

    2013-09-01

    Quantitation of tr-cinnamaldehyde, safrole and myristicin was carried out in 70 samples of cola-flavoured soft drinks purchased in eight European countries with the purpose of assessing the variability in the levels of these substances. Results indicated a limited variability in the content of the three substances: the ratio between the 90th and the 10th percentile concentration amounted to 21, 6 and 13 for tr-cinnamaldehyde, safrole and myristicin, respectively. The uncertainty in the assessment of dietary exposure to these substances due to the variability of their level in cola-flavoured drinks was low. Based on these analytical data and on refined food consumption data, estimates of exposure to safrole associated with cola drink consumption, along with Margin of Exposure (MOE) values, were obtained. For high consumers of cola-flavoured soft drinks in certain age groups, within some European countries, MOE values lower than 10,000 resulted; MOE values of 10,000 or higher have been stated by the EFSA as a quantitative criterion to identify low concern from a public health point of view and low priority for risk management actions. The lowest MOE values, from 1900 to 3000, were observed for children and teenagers in the United Kingdom and Ireland. Copyright © 2013 Elsevier Ltd. All rights reserved.
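
    For illustration, a Margin of Exposure is simply the ratio of a toxicological reference point to an estimated dietary exposure; the reference point and exposure below are placeholder values chosen only to fall in the general range discussed, not figures from the study or from EFSA.

```python
def margin_of_exposure(reference_point_mg_per_kg_bw, exposure_mg_per_kg_bw):
    """MOE = toxicological reference point / estimated dietary exposure (same units)."""
    return reference_point_mg_per_kg_bw / exposure_mg_per_kg_bw

# Placeholder numbers: a hypothetical reference point of 2 mg/kg bw per day and a
# hypothetical high-consumer exposure of 0.001 mg/kg bw per day give MOE = 2000,
# i.e. below the 10,000 threshold used to flag low concern.
moe = margin_of_exposure(2.0, 0.001)
print("MOE =", int(moe), "->",
      "below 10,000 (potential concern)" if moe < 10000 else "low concern")
```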

  13. Abstraction in perceptual symbol systems.

    PubMed Central

    Barsalou, Lawrence W

    2003-01-01

    After reviewing six senses of abstraction, this article focuses on abstractions that take the form of summary representations. Three central properties of these abstractions are established: (i) type-token interpretation; (ii) structured representation; and (iii) dynamic realization. Traditional theories of representation handle interpretation and structure well but are not sufficiently dynamical. Conversely, connectionist theories are exquisitely dynamic but have problems with structure. Perceptual symbol systems offer an approach that implements all three properties naturally. Within this framework, a loose collection of property and relation simulators develops to represent abstractions. Type-token interpretation results from binding a property simulator to a region of a perceived or simulated category member. Structured representation results from binding a configuration of property and relation simulators to multiple regions in an integrated manner. Dynamic realization results from applying different subsets of property and relation simulators to category members on different occasions. From this standpoint, there are no permanent or complete abstractions of a category in memory. Instead, abstraction is the skill to construct temporary online interpretations of a category's members. Although an infinite number of abstractions are possible, attractors develop for habitual approaches to interpretation. This approach provides new ways of thinking about abstraction phenomena in categorization, inference, background knowledge and learning. PMID:12903648

  14. Genesis Solar Wind Interstream, Coronal Hole and Coronal Mass Ejection Samples: Update on Availability and Condition

    NASA Technical Reports Server (NTRS)

    Allton, J. H.; Gonzalez, C. P.; Allums, K. K.

    2017-01-01

    Recent refinements in the analysis of ACE/SWICS data (Advanced Composition Explorer/Solar Wind Ion Composition Spectrometer) and of onboard Genesis Discovery Mission data for the three solar wind regimes at the Earth-Sun L1 point make it an appropriate time to update the availability and condition of the Genesis samples specifically collected in these three regimes and currently curated at Johnson Space Center. ACE/SWICS spacecraft data indicate that solar wind flow types emanating from the interstream regions, from coronal holes and from coronal mass ejections are elementally and isotopically fractionated in different ways from the solar photosphere, and that correction of solar wind values to photosphere values is non-trivial. Returned Genesis solar wind samples captured very different kinds of information about these three regimes than spacecraft data did. Samples were collected from 11/30/2001 to 4/1/2004 on the declining phase of solar cycle 23. Meshik et al. provide an example of the attainable precision. Earlier high-precision laboratory analyses of noble gases collected in the interstream, coronal hole and coronal mass ejection regimes speak to the degree of fractionation during solar wind formation and to the models that the laboratory data support. The current availability and condition of samples captured on collector plates during interstream slow solar wind, coronal hole high-speed solar wind and coronal mass ejections are described here for potential users of these samples.

  15. Using more than 801 296 small-molecule crystal structures to aid in protein structure refinement and analysis

    PubMed Central

    Cole, Jason C.

    2017-01-01

    The Cambridge Structural Database (CSD) is the worldwide resource for the dissemination of all published three-dimensional structures of small-molecule organic and metal–organic compounds. This paper briefly describes how this collection of crystal structures can be used en masse in the context of macromolecular crystallography. Examples highlight how the CSD and associated software aid protein–ligand complex validation, and show how the CSD could be further used in the generation of geometrical restraints for protein structure refinement. PMID:28291758

  16. PDB_REDO: automated re-refinement of X-ray structure models in the PDB.

    PubMed

    Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert

    2009-06-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour.

  17. Hydrothermal synthesis, crystal structures and photoluminescence properties of mixed europium-yttrium organic frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Yinfeng (Department of Chemistry and Environmental Science, Taishan University, Taian 271021); Fu, Lianshe

    Three mixed europium-yttrium organic frameworks, Eu2-xYx(Mel)(H2O)6 (Mel = mellitic acid, i.e. benzene-1,2,3,4,5,6-hexacarboxylic acid; x = 0.38 (1), 0.74 (2) and 0.86 (3)), have been synthesized and characterized. All the compounds contain a 3-D net with (4, 8)-flu topology. The study indicates that the photoluminescence properties are strongly affected by the ratio of europium to yttrium ions: the quantum efficiency is higher and the Eu3+ lifetime is longer in these mixed MOFs than in the pure Eu analog. Highlights: (i) three (4, 8)-flu topological mixed Eu and Y MOFs were synthesized under mild conditions; (ii) the metal ratios refined from the single-crystal data are consistent with the EDS analysis; (iii) the mixed Eu and Y MOFs show longer lifetimes and higher quantum efficiencies than the Eu analog; (iv) adding an inert lanthanide into luminescent MOFs enlarges the field of luminescent MOFs.

  18. Distributed metadata servers for cluster file systems using shared low latency persistent key-value metadata store

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Pedone, Jr., James M.

    A cluster file system is provided having a plurality of distributed metadata servers with shared access to one or more shared low latency persistent key-value metadata stores. A metadata server comprises an abstract storage interface comprising a software interface module that communicates with at least one shared persistent key-value metadata store providing a key-value interface for persistent storage of key-value metadata. The software interface module provides the key-value metadata to the at least one shared persistent key-value metadata store in a key-value format. The shared persistent key-value metadata store is accessed by a plurality of metadata servers. A metadata request can be processed by a given metadata server independently of other metadata servers in the cluster file system. A distributed metadata storage environment is also disclosed that comprises a plurality of metadata servers having an abstract storage interface to at least one shared persistent key-value metadata store.
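
    The core idea above is that each metadata server talks to the shared store only through an abstract key-value interface, so different low-latency back ends can be swapped in. The following is a deliberately simplified, hedged sketch of that pattern; the class and method names are hypothetical and are not taken from the disclosed system.

```python
# Simplified illustration (all names hypothetical) of an abstract key-value metadata
# storage interface shared by independent metadata servers.
from abc import ABC, abstractmethod

class KeyValueMetadataStore(ABC):
    """Abstract storage interface: metadata is persisted as key-value pairs."""
    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(KeyValueMetadataStore):
    """Stand-in back end; a real deployment would use a shared low-latency persistent store."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class MetadataServer:
    """Each server handles requests independently against the shared store."""
    def __init__(self, store: KeyValueMetadataStore):
        self.store = store
    def set_attr(self, path: str, attr: str, value: bytes) -> None:
        self.store.put(f"{path}#{attr}", value)
    def get_attr(self, path: str, attr: str) -> bytes:
        return self.store.get(f"{path}#{attr}")

shared = InMemoryStore()
MetadataServer(shared).set_attr("/cluster/file1", "size", b"4096")
print(MetadataServer(shared).get_attr("/cluster/file1", "size"))
```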

  19. Abstracts of SIG Sessions.

    ERIC Educational Resources Information Center

    Proceedings of the ASIS Annual Meeting, 1994

    1994-01-01

    Includes abstracts of 18 special interest group (SIG) sessions. Highlights include natural language processing, information science and terminology science, classification, knowledge-intensive information systems, information value and ownership issues, economics and theories of information science, information retrieval interfaces, fuzzy thinking…

  20. An ab initio/Rice-Ramsperger-Kassel-Marcus study of the hydrogen-abstraction reactions of methyl ethers, H(3)COCH(3-x)(CH(3))(x), x = 0-2, by OH; mechanism and kinetics.

    PubMed

    Zhou, Chong-Wen; Simmie, John M; Curran, Henry J

    2010-07-14

    A theoretical study of the mechanism and kinetics of the H-abstraction reaction from dimethyl (DME), ethylmethyl (EME) and iso-propylmethyl (IPME) ethers by the OH radical has been carried out using the high-level methods CCSD(T)/CBS, G3 and G3MP2BH&H. The computationally less-expensive methods of G3 and G3MP2BH&H yield results for DME within 0.2-0.6 and 0.7-0.9 kcal mol(-1), respectively, of the coupled cluster, CCSD(T), values extrapolated to the basis set limit. So the G3 and G3MP2BH&H methods can be confidently used for the reactions of the higher ethers. A distinction is made between the two different kinds of H-atoms, classified as in/out-of the symmetry plane, and it is found that abstraction from the out-of-plane H-atoms proceeds through a stepwise mechanism involving the formation of a reactant complex in the entrance channel and product complex in the exit channel. The in-plane H-atom abstractions take place through a more direct mechanism and are less competitive. Rate constants of the three reactions have been calculated in the temperature range of 500-3000 K using the Variflex code, based on the weak collision, master equation/microcanonical variational RRKM theory including tunneling corrections. The computed total rate constants (cm(3) mol(-1) s(-1)) have been fitted as follows: k(DME) = 2.74 xT(3.94) exp (1534.2/T), k(EME) = 20.93 xT(3.61) exp (2060.1/T) and k(IPME) = 0.55 xT(3.93) exp (2826.1/T). Expressions of the group rate constants for the three different carbon sites are also provided.
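
    Since the abstract quotes the fitted three-parameter expressions explicitly, a short sketch can evaluate them over the stated temperature range. The code below simply implements the fits exactly as printed (k = A T^n exp(B/T), in cm^3 mol^-1 s^-1 with T in kelvin) and adds nothing beyond them.

```python
# Sketch that evaluates the fitted total rate-constant expressions exactly as quoted
# above; it is an evaluation aid, not part of the original RRKM calculation.
import math

FITS = {  # k(T) = A * T**n * exp(B / T)
    "DME":  (2.74,  3.94, 1534.2),
    "EME":  (20.93, 3.61, 2060.1),
    "IPME": (0.55,  3.93, 2826.1),
}

def k_total(ether: str, T: float) -> float:
    A, n, B = FITS[ether]
    return A * T**n * math.exp(B / T)

for T in (500.0, 1000.0, 2000.0, 3000.0):
    print(T, {name: f"{k_total(name, T):.3e}" for name in FITS})
```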

  1. The Impact of Education on Income Distribution.

    ERIC Educational Resources Information Center

    Tinbergen, Jan

    The author's previously developed theory on income distribution, in which two of the explanatory variables are the average level and the distribution of education, is refined and tested on data selected and processed by the author and data from three studies by Americans. The material consists of data on subdivisions of three countries, the United…

  2. Children's Strategies for Solving Two- and Three-Dimensional Combinatorial Problems.

    ERIC Educational Resources Information Center

    English, Lyn D.

    1993-01-01

    Investigated strategies that 7- to 12-year-old children (n=96) spontaneously applied in solving novel combinatorial problems. With experience in solving two-dimensional problems, children were able to refine their strategies and adapt them to three dimensions. Results on some problems indicated significant effects of age. (Contains 32 references.)…

  3. Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds.

    PubMed

    Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun

    2016-06-17

    Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In preprocessing step, the isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by Height Difference (HD) between trajectory data and road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noises, then road markings are extracted by Edge Detection and Edge Constraint (EDEC) method, and the Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment and dimensionality feature-based refinement. The performance of the proposed method is evaluated by three data samples and the experiment results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road markings extraction method proposed in this paper provides a promising alternative for offline road markings extraction from MLS data.
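
    Two of the steps described above lend themselves to a compact illustration: selecting seed road points on a scan line from the height difference to the trajectory, and smoothing intensities with a sliding median filter before marking extraction. The sketch below is a hedged, generic rendering of those ideas; the thresholds, the assumed sensor mounting height and the data layout are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (parameters and data layout are assumptions) of seed road point
# selection by height difference and median smoothing of scan-line intensities.
import numpy as np

def seed_road_points(scan_z: np.ndarray, trajectory_z: float, sensor_height: float = 2.0,
                     tolerance: float = 0.3) -> np.ndarray:
    """Boolean mask of points whose height difference from the trajectory is close to
    the (assumed) sensor mounting height, i.e. likely road surface points."""
    height_diff = trajectory_z - scan_z
    return np.abs(height_diff - sensor_height) < tolerance

def median_smooth_intensity(intensity: np.ndarray, window: int = 5) -> np.ndarray:
    """Sliding median filter to suppress intensity noise along one scan line."""
    half = window // 2
    padded = np.pad(intensity, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(intensity))])

# Toy scan line: heights in metres, intensities with one impulsive noise spike.
z = np.array([30.10, 30.00, 30.05, 31.50, 30.02])
inten = np.array([12.0, 13.0, 80.0, 12.5, 12.0])
print(seed_road_points(z, trajectory_z=32.0), median_smooth_intensity(inten))
```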

  5. Participant and Public Involvement in Refining a Peer-Volunteering Active Aging Intervention: Project ACE (Active, Connected, Engaged)

    PubMed Central

    Withall, Janet; Thompson, Janice L; Fox, Kenneth R; Davis, Mark; Gray, Selena; de Koning, Jolanthe; Lloyd, Liz; Parkhurst, Graham; Stathi, Afroditi

    2018-01-01

    Background: Evidence for the health benefits of a physically active lifestyle among older adults is strong, yet only a small proportion of older people meet physical activity recommendations. A synthesis of evidence identified “best bet” approaches, and this study sought guidance from end-user representatives and stakeholders to refine one of these, a peer-volunteering active aging intervention. Methods: Focus groups with 28 older adults and four professional volunteer managers were conducted. Semi-structured interviews were conducted with 9 older volunteers. Framework analysis was used to gauge participants’ views on the ACE intervention. Results: Motives for engaging in community groups and activities were almost entirely social. Barriers to participation were lack of someone to attend with, lack of confidence, fear of exclusion or “cliquiness” in established groups, bad weather, transport issues, inaccessibility of activities, ambivalence, and older adults being “set in their ways”. Motives for volunteering included “something to do,” avoiding loneliness, the need to feel needed, enjoyment, and altruism. Challenges included negative events between volunteer and recipient of volunteering support, childcare commitments, and high volunteering workload. Conclusion: Peer-volunteering approaches have great potential for promotion of active aging. The systematic multistakeholder approach adopted in this study led to important refinements of the original ACE intervention. The findings provide guidance for active aging community initiatives highlighting the importance of effective recruitment strategies and of tackling major barriers including lack of motivation, confidence, and readiness to change; transport issues; security concerns and cost; activity availability; and lack of social support. PMID:27927733

  6. Light Curves of Lucy Targets: Leucus and Polymele

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.; Zangari, Amanda M.; Marchi, Simone; Levison, Harold F.; Mottola, Stefano

    2018-06-01

    We present new observations from 2016 of two Jupiter Trojan asteroids that are targets for the Lucy Discovery mission. The extremely long rotation period of (11351) Leucus is confirmed and refined to a secure value of 445.732 ± 0.021 hr with photometric parameters of H_r = 11.046 ± 0.003 and G_r = 0.58 ± 0.02 in the SDSS r′ filter. This leads to a geometric albedo of p_V = 4.7%. The amplitude of the light curve was measured to be 0.61 mag, unchanged from the value of one-fourth of a revolution earlier, suggesting a low obliquity. The first light-curve observations for (15094) Polymele are also presented. This object is revealed to have a much shorter rotation period of 5.8607 ± 0.0005 hr with a very low amplitude of 0.09 mag. Its photometric parameters are H_r = 11.691 ± 0.002 and G_r = 0.22 ± 0.02. These values lead to a refined geometric albedo of p_V = 7.3%. This object is either nearly spherical or was being viewed nearly pole-on in 2016. Further observations are required to fully determine the spin pole orientation and convex-hull shapes.

  7. The Effect of Shadow Area on Sgm Algorithm and Disparity Map Refinement from High Resolution Satellite Stereo Images

    NASA Astrophysics Data System (ADS)

    Tatar, N.; Saadatseresht, M.; Arefi, H.

    2017-09-01

    The Semi-Global Matching (SGM) algorithm is known as a high-performance and reliable stereo matching algorithm in the photogrammetry community. However, there are some challenges in using this algorithm, especially for high-resolution satellite stereo images over urban areas and images with shadow areas. The SGM algorithm computes highly noisy disparity values in shadow areas around tall buildings because of mismatching in these low-entropy regions. In this paper, a new method is developed to refine the disparity map in shadow areas. The method integrates panchromatic and multispectral image data to detect shadow areas at the object level. In addition, RANSAC plane fitting and morphological filtering are employed to refine the disparity map. The results on a GeoEye-1 stereo pair captured over the city of Qom, Iran, show a significant increase in the rate of matched pixels compared with the standard SGM algorithm.
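
    The refinement idea can be illustrated with a small sketch: fit a plane with RANSAC to the reliable disparities outside a detected shadow mask and replace the noisy disparities inside the mask with the plane's prediction. The parameters, data layout and helper names below are assumptions for illustration, not the paper's implementation.

```python
# Hedged sketch of disparity refinement in shadow regions via RANSAC plane fitting.
import numpy as np

def ransac_plane(xyz: np.ndarray, iters: int = 200, thresh: float = 1.0, rng=None):
    """xyz: (N, 3) array of (col, row, disparity). Returns coefficients (a, b, c) of
    the plane d = a*x + b*y + c fitted to the largest consensus set."""
    rng = rng or np.random.default_rng(0)
    best_inliers, best_coeffs = 0, None
    A_full = np.c_[xyz[:, 0], xyz[:, 1], np.ones(len(xyz))]
    for _ in range(iters):
        sample = xyz[rng.choice(len(xyz), 3, replace=False)]
        A = np.c_[sample[:, 0], sample[:, 1], np.ones(3)]
        try:
            coeffs = np.linalg.solve(A, sample[:, 2])
        except np.linalg.LinAlgError:
            continue  # degenerate (collinear) sample, try again
        residuals = np.abs(A_full @ coeffs - xyz[:, 2])
        inliers = int((residuals < thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_coeffs = inliers, coeffs
    return best_coeffs

def refine_disparity(disparity: np.ndarray, shadow_mask: np.ndarray) -> np.ndarray:
    """Replace disparities inside the shadow mask with values from a plane fitted
    to the disparities outside the mask."""
    rows, cols = np.nonzero(~shadow_mask)
    coeffs = ransac_plane(np.c_[cols, rows, disparity[~shadow_mask]])
    out = disparity.copy()
    r, c = np.nonzero(shadow_mask)
    out[shadow_mask] = coeffs[0] * c + coeffs[1] * r + coeffs[2]
    return out
```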

  8. Formation of Polyphenol-Denatured Protein Flocs in Alcohol Beverages Sweetened with Refined Cane Sugars.

    PubMed

    Eggleston, Gillian; Triplett, Alexa

    2017-11-08

    The sporadic appearance of floc from refined, white cane sugars in alcohol beverages remains a technical problem for both beverage manufacturers and sugar refiners. Cane invert sugars mixed with 60% pure alcohol and water increased light scattering by up to ∼1000-fold. Insoluble and soluble starch, fat, inorganic ash, oligosaccharides, Brix, and pH were not involved in the prevailing floc-formation mechanism. Strong polynomial correlations existed between the haze floc and the indicator values (IVs) (color at 420 nm, pH 9.0 / color at pH 4.0, an indirect measure of polyphenolic and flavonoid colorants) (R^2 = 0.815) and the protein content (R^2 = 0.819) of the invert sugars. Ethanol-induced denaturation of the protein exposed hydrophobic polyphenol-binding sites that were further exposed when heated to 80 °C. A tentative mechanism for floc formation was advanced by molecular probing with a haze (floc) active protein and polyphenol as well as polar, nonpolar, and ionic solvents.

  9. A method to estimate statistical errors of properties derived from charge-density modelling

    PubMed Central

    Lecomte, Claude

    2018-01-01

    Estimating uncertainties of property values derived from a charge-density model is not straightforward. A methodology, based on calculation of sample standard deviations (SSD) of properties using randomly deviating charge-density models, is proposed with the MoPro software. The parameter shifts applied in the deviating models are generated in order to respect the variance–covariance matrix issued from the least-squares refinement. This ‘SSD methodology’ procedure can be applied to estimate uncertainties of any property related to a charge-density model obtained by least-squares fitting. This includes topological properties such as critical point coordinates, electron density, Laplacian and ellipticity at critical points and charges integrated over atomic basins. Errors on electrostatic potentials and interaction energies are also available now through this procedure. The method is exemplified with the charge density of compound (E)-5-phenylpent-1-enylboronic acid, refined at 0.45 Å resolution. The procedure is implemented in the freely available MoPro program dedicated to charge-density refinement and modelling. PMID:29724964
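
    The procedure described above can be summarized in a few lines: draw random parameter sets from a multivariate normal distribution defined by the least-squares variance-covariance matrix, evaluate the derived property for each draw, and take the sample standard deviation as its uncertainty. The sketch below is a minimal, generic rendering of that idea and is not the MoPro implementation; the toy property function and numbers are placeholders.

```python
# Minimal sketch of the 'SSD' idea: propagate the refinement variance-covariance
# matrix to a derived property by random sampling.
import numpy as np

def ssd_uncertainty(params, covariance, property_fn, n_samples=1000, seed=0):
    rng = np.random.default_rng(seed)
    draws = rng.multivariate_normal(mean=params, cov=covariance, size=n_samples)
    values = np.array([property_fn(p) for p in draws])
    return values.mean(), values.std(ddof=1)  # sample standard deviation (SSD)

# Toy example: the 'property' is a nonlinear function of two refined parameters.
p = np.array([1.20, 0.35])
cov = np.array([[4e-4, 1e-4],
                [1e-4, 9e-4]])
mean, sigma = ssd_uncertainty(p, cov, lambda q: q[0] * np.exp(-q[1]))
print(f"property = {mean:.4f} +/- {sigma:.4f}")
```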

  10. Topological quantum computation of the Dold-Thom functor

    NASA Astrophysics Data System (ADS)

    Ospina, Juan

    2014-05-01

    A possible topological quantum computation of the Dold-Thom functor is presented. The method used is the following: (a) certain 1+1-dimensional topological quantum field theories valued in symmetric bimonoidal categories are converted into stable homotopical data, using machinery recently introduced by Elmendorf and Mandell; (b) in this framework, two recent, independent results on refinements of Khovanov homology are exploited: a refinement into a module over the connective K-theory spectrum and a stronger result by Lipshitz and Sarkar refining Khovanov homology into a stable homotopy type; (c) starting from the Khovanov homotopy, the Dold-Thom functor is constructed; (d) the full construction is formulated as a topological quantum algorithm. It is conjectured that the Jones polynomial can be described as the analytical index of a certain Dirac operator defined in the context of the Khovanov homotopy using the Dold-Thom functor. As a line for future research, it would be interesting to study the corresponding supersymmetric model, for which the Khovanov-Dirac operator plays the role of a supercharge.

  11. Textbook Forum: Confusion in the Expressions for Transport Coefficients.

    ERIC Educational Resources Information Center

    Earl, Boyd L.

    1989-01-01

    Notes that the rigorous kinetic theory, based on the Boltzmann equation, does not yield exact results although some texts claim this to be so. Stresses that they should be presented as approximations with an indication that refinements in the values are possible. (MVL)

  12. 76 FR 55909 - CITGO Refining and Chemicals Company L.P. v. Port of Corpus Christi Authority of Nueces County...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... calculated as a percentage thereof that are excessive and not reasonably related to the value of services... desist from engaging in the aforesaid violations of the Shipping Act; putting in force such practices as...

  13. A refined gravity model from Lageos /GEM-L2/

    NASA Technical Reports Server (NTRS)

    Lerch, F. J.; Klosko, S. M.; Patel, G. B.

    1982-01-01

    Lageos satellite laser ranging (SLR) data taken over a 2.5 yr period were employed to develop the Goddard Earth Model GEM-L2, a refined gravity field model. Additional data were gathered with 30 other satellites, resulting in spherical harmonics through degree and order 20, based on over 600,000 measurements. The Lageos data were accurate down to 10 cm, after which the GEM 9 data were used to make adjustments past order 7. The resolution of long wavelength activity, through degree and order 4, was made possible by the Lageos data. The GEM-L2 model features a 20 × 20 geopotential, tracking station coordinates (20), 5-day polar motion and A1-UT1 values, and a GM value of 398,600.607 km³/s². The accuracy of station positioning has been raised to within 6 cm total position globally and within 1.8 cm in baselines. It is concluded that SLR is useful for measuring tectonic plate motions and inter-plate deformations.

  14. Evolution of hardness, microstructure, and strain rate sensitivity in a Zn-22% Al eutectoid alloy processed by high-pressure torsion

    NASA Astrophysics Data System (ADS)

    Kawasaki, Megumi; Lee, Han-Joo; Choi, In-Chul; Jang, Jae-il; Ahn, Byungmin; Langdon, Terence G.

    2014-08-01

    Severe plastic deformation (SPD) is an attractive processing method for refining the microstructures of metallic materials to ultrafine grain sizes within the submicrometer or even the nanometer range. Experiments were conducted to examine the evolution of hardness, microstructure and strain rate sensitivity, m, in a Zn-22% Al eutectoid alloy processed by high-pressure torsion (HPT). The data from microhardness and nanoindentation hardness measurements revealed that there is a significant weakening in the Zn-Al alloy during HPT despite extensive grain refinement. Excellent room-temperature (RT) plasticity was observed in the alloy after HPT from nanoindentation creep in terms of an increased value of m. The microstructural changes with increasing numbers of HPT turns show a strong correlation with the change in the m value. Moreover, the excellent RT plasticity in the alloy is discussed in terms of the enhanced level of grain boundary sliding and the evolution of the microstructure.
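
    For context, the strain rate sensitivity m is commonly extracted as the slope of ln(hardness) versus ln(strain rate), with hardness acting as a proxy for flow stress in indentation creep data. The sketch below is a hedged, generic illustration of that fit; the numbers are made up and are not measurements from the paper.

```python
# Hedged sketch (not the authors' analysis code): m as the log-log slope of
# hardness versus indentation strain rate.
import numpy as np

def strain_rate_sensitivity(strain_rates, hardness):
    """Least-squares slope of ln(H) vs ln(strain rate)."""
    m, _intercept = np.polyfit(np.log(strain_rates), np.log(hardness), 1)
    return m

# Illustrative numbers only.
rates = np.array([1e-4, 1e-3, 1e-2, 1e-1])     # 1/s
H = np.array([0.20, 0.40, 0.80, 1.60])         # GPa
print(f"m = {strain_rate_sensitivity(rates, H):.2f}")
```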

  15. Amendment of Articles 8, 9, 10, 21 and 78 of the International Code of Zoological Nomenclature to expand and refine methods of publication

    PubMed Central

    International Commission on Zoological Nomenclature

    2012-01-01

    Abstract The International Commission on Zoological Nomenclature has voted in favour of a revised version of the amendment to the International Code of Zoological Nomenclature that was proposed in 2008. The purpose of the amendment is to expand and refine the methods of publication allowed by the Code, particularly in relation to electronic publication. The amendment establishes an Official Register of Zoological Nomenclature (with ZooBank as its online version), allows electronic publication after 2011 under certain conditions, and disallows publication on optical discs after 2012. The requirements for electronic publications are that the work be registered in ZooBank before it is published, that the work itself state the date of publication and contain evidence that registration has occurred, and that the ZooBank registration state both the name of an electronic archive intended to preserve the work and the ISSN or ISBN associated with the work. Registration of new scientific names and nomenclatural acts is not required. The Commission has confirmed that ZooBank is ready to handle the requirements of the amendment. PMID:22977348

  16. Model-Based Reasoning in Upper-division Lab Courses

    NASA Astrophysics Data System (ADS)

    Lewandowski, Heather

    2015-05-01

    Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, the modeling of "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.

  17. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in the Prototype Verification System (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  18. Study on the solid solution of YMn1-xFexO3: Structural, magnetic and dielectric properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samal, S.L.; Green, W.; Lofland, S.E.

    The solid solution of YMn1-xFexO3 (x = 0.0, 0.1, 0.2, 0.3, 0.5, 1.0) was synthesized via the citrate precursor route. The hexagonal crystal structure related to YMnO3 was stable for x ≤ 0.3. Rietveld refinement was carried out on the x = 0.3 composition, which refined to a major hexagonal phase (~97%) with 3% of an orthorhombic Y(Fe/Mn)O3 phase. The a-axis lattice constant increases and the c-axis lattice constant decreases with x for x ≤ 0.2. The increase in the c-axis lattice constant at x = 0.3 could be due to the doping of a significant amount of the d5 ion (high-spin Fe3+) in a trigonal bipyramidal crystal field. The detailed structural, magnetic and dielectric properties are discussed. Graphical abstract: temperature dependence of ε of YMn1-xFexO3 (0.0 ≤ x ≤ 0.3) at 100 kHz; the inset shows the temperature variation of the inverse magnetic susceptibility.

  19. Analysis of three-dimensional SAR distributions emitted by mobile phones in an epidemiological perspective.

    PubMed

    Deltour, Isabelle; Wiart, Joe; Taki, Masao; Wake, Kanako; Varsier, Nadège; Mann, Simon; Schüz, Joachim; Cardis, Elisabeth

    2011-12-01

    The three-dimensional distribution of the specific absorption rate of energy (SAR) in phantom models was analysed to detect clusters of mobile phones producing similar spatial deposition of energy in the head. The clusters' characteristics were described in terms of the phones' external features, frequency band and communication protocol. Compliance measurements with phones in cheek and tilt positions, and on the left and right side of a physical phantom were used. Phones used the Personal Digital Cellular (PDC), Code Division Multiple Access One (CdmaOne), Global System for Mobile Communications (GSM) and Nordic Mobile Telephony (NMT) communication systems, in the 800, 900, 1500 and 1800 MHz bands. Each phone's measurements were summarised by the half-ellipsoid in which the SAR values were above half the maximum value. Cluster analysis used the Partitioning Around Medoids algorithm. The dissimilarity measure was based on the overlap of the ellipsoids, and the Manhattan distance was used for robustness analysis. Within the 800 MHz frequency band, and in part within the 900 MHz and the 1800 MHz frequency bands, weak clustering was obtained for the handset shape (bar phone, flip with top and flip with central antennas), but only in specific positions (tilt or cheek). On measurements of 120 phones, the three-dimensional distribution of SAR in phantom models did not appear to be related to particular external phone characteristics or measurement characteristics, which could be used for refining the assessment of exposure to radiofrequency energy within the brain in epidemiological studies such as the Interphone. Copyright © 2011 Wiley Periodicals, Inc.
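
    The clustering step described above (Partitioning Around Medoids on a precomputed dissimilarity matrix) can be illustrated with a small, generic k-medoids pass. The sketch below is not the study's code and does not reproduce its ellipsoid-overlap dissimilarity; the toy matrix and parameters are assumptions for illustration.

```python
# Simplified k-medoids (PAM-style) pass over a precomputed dissimilarity matrix,
# such as one derived from the overlap of half-maximum SAR ellipsoids.
import numpy as np

def k_medoids(D: np.ndarray, k: int, max_iter: int = 100, seed: int = 0):
    """D: symmetric (n, n) dissimilarity matrix. Returns (medoid indices, labels)."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(max_iter):
        labels = np.argmin(D[:, medoids], axis=1)          # assign each item to its nearest medoid
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if len(members):                                # medoid = member minimizing total dissimilarity
                new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, np.argmin(D[:, medoids], axis=1)

# Toy 4-phone dissimilarity matrix with two obvious groups.
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.2],
              [0.8, 0.9, 0.2, 0.0]])
print(k_medoids(D, k=2))
```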

  20. Definition and structure of body-relatedness from the perspective of patients with severe somatoform disorder and their therapists.

    PubMed

    Kalisvaart, Hanneke; van Broeckhuysen, Saskia; Bühring, Martina; Kool, Marianne B; van Dulmen, Sandra; Geenen, Rinie

    2012-01-01

    How patients are connected with their body is core to the rehabilitation of somatoform disorder, but a common model to describe body-relatedness is missing. The aim of our study was to investigate the components and hierarchical structure of body-relatedness as perceived by patients with severe somatoform disorder and their therapists. Interviews with patients and therapists yielded statements about components of body-relatedness. Patients and therapists individually sorted these statements according to similarity. Hierarchical cluster analysis was applied to these sortings. Analysis of variance was used to compare the perceived importance of the statements between patients and therapists. The hierarchical structure included 71 characteristics of body-relatedness. It consisted of three levels with eight clusters at the lowest level: 1) understanding, 2) acceptance, 3) adjustment, 4) respect for the body, 5) regulation, 6) confidence, 7) self-esteem, and 8) autonomy. The cluster 'understanding' was considered most important by patients and therapists. Patients valued 'regulating the body' more than therapists. According to patients with somatoform disorders and their therapists, body-relatedness includes awareness of the body and self by understanding, accepting and adjusting to bodily signals, by respecting and regulating the body, by confiding and esteeming oneself and by being autonomous. This definition and structure of body-relatedness may help professionals to improve interdisciplinary communication, assessment, and treatment, and it may help patients to better understand their symptoms and treatment. (German language abstract, Abstract S1; Spanish language abstract, Abstract S2).
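
    The analysis pipeline sketched above (individual sortings of statements followed by hierarchical cluster analysis) can be illustrated generically: build a dissimilarity matrix from how often two statements were sorted into the same pile and cluster it. The code below is a hedged illustration with made-up data and an assumed average-linkage choice; it is not the study's analysis.

```python
# Hedged sketch: hierarchical clustering of statements from co-sorting frequencies.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def cluster_statements(co_sort_counts: np.ndarray, n_sorters: int, n_clusters: int):
    """co_sort_counts[i, j] = number of sorters who placed statements i and j together."""
    dissimilarity = 1.0 - co_sort_counts / n_sorters
    np.fill_diagonal(dissimilarity, 0.0)
    Z = linkage(squareform(dissimilarity, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Toy example: 5 statements sorted by 10 participants into two evident groups.
counts = np.array([[10,  9,  8,  1,  0],
                   [ 9, 10,  9,  0,  1],
                   [ 8,  9, 10,  1,  1],
                   [ 1,  0,  1, 10,  9],
                   [ 0,  1,  1,  9, 10]])
print(cluster_statements(counts, n_sorters=10, n_clusters=2))
```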
