Sample records for incremental integrity checking

  1. Integrated system for automated financial document processing

    NASA Astrophysics Data System (ADS)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
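
    A minimal sketch of the blackboard pattern this abstract describes may help: recognition engines act as knowledge sources that post hypotheses for document fields to a shared blackboard, and a simple controller lets them contribute until every field has a sufficiently confident answer. The class names, the confidence scheme, and the toy engines below are illustrative assumptions, not the system reported in this record.

```python
# Minimal sketch of a blackboard-style recognizer combining several engines,
# in the spirit of the architecture described above. All names and the
# confidence-based early stop are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Blackboard:
    fields: Dict[str, bytes]                      # e.g. {"courtesy_amount": <image>}
    hypotheses: Dict[str, List[Tuple[str, float]]] = field(default_factory=dict)

    def post(self, field_name: str, value: str, confidence: float) -> None:
        self.hypotheses.setdefault(field_name, []).append((value, confidence))

    def best(self, field_name: str) -> Tuple[str, float]:
        return max(self.hypotheses.get(field_name, [("", 0.0)]), key=lambda h: h[1])

def run_blackboard(bb: Blackboard,
                   knowledge_sources: List[Callable[[Blackboard], None]],
                   threshold: float = 0.9) -> Dict[str, str]:
    """Let each knowledge source contribute incrementally until every field
    has a hypothesis above the confidence threshold."""
    for ks in knowledge_sources:
        ks(bb)                                    # each engine posts what it can
        if all(bb.best(f)[1] >= threshold for f in bb.fields):
            break                                 # opportunistic early stop
    return {f: bb.best(f)[0] for f in bb.fields}

# Example: two toy "engines" standing in for real courtesy/legal amount recognizers.
def courtesy_engine(bb: Blackboard) -> None:
    bb.post("courtesy_amount", "125.40", 0.95)

def legal_engine(bb: Blackboard) -> None:
    bb.post("legal_amount", "one hundred twenty-five 40/100", 0.88)

print(run_blackboard(Blackboard({"courtesy_amount": b"", "legal_amount": b""}),
                     [courtesy_engine, legal_engine]))
```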

  2. Property Differencing for Incremental Checking

    NASA Technical Reports Server (NTRS)

    Yang, Guowei; Khurshid, Sarfraz; Person, Suzette; Rungta, Neha

    2014-01-01

    This paper introduces iProperty, a novel approach that facilitates incremental checking of programs based on a property differencing technique. Specifically, iProperty aims to reduce the cost of checking properties as they are initially developed and as they co-evolve with the program. The key novelty of iProperty is to compute the differences between the new and old versions of expected properties to reduce the number and size of the properties that need to be checked during the initial development of the properties. Furthermore, property differencing is used in synergy with program behavior differencing techniques to optimize common regression scenarios, such as detecting regression errors or checking feature additions for conformance to new expected properties. Experimental results in the context of symbolic execution of Java programs annotated with properties written as assertions show the effectiveness of iProperty in utilizing change information to enable more efficient checking.
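
    The core differencing step can be pictured with a small sketch: given the old and new versions of a property set, only properties that were added or whose formulas changed are scheduled for re-checking. Representing properties as id/formula pairs is an assumption made for illustration; iProperty itself operates on program annotations checked by symbolic execution.

```python
# Sketch of property differencing: only assertions that are new or changed
# relative to the previous version are scheduled for (re)checking.
# Representing properties as (id, formula-string) pairs is an assumption
# made for illustration; iProperty works on program annotations.
from typing import Dict, Set

def properties_to_recheck(old: Dict[str, str], new: Dict[str, str]) -> Set[str]:
    changed = {pid for pid, formula in new.items()
               if old.get(pid) != formula}          # added or modified properties
    return changed                                   # unchanged properties are skipped

old_props = {"p1": "x >= 0", "p2": "y < len(a)"}
new_props = {"p1": "x >= 0", "p2": "y <= len(a)", "p3": "sorted(a)"}
print(properties_to_recheck(old_props, new_props))   # {'p2', 'p3'}
```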

  3. Validation of daily increments and a marine-entry check in the otoliths of sockeye salmon Oncorhynchus nerka post-smolts.

    PubMed

    Freshwater, C; Trudel, M; Beacham, T D; Neville, C-E; Tucker, S; Juanes, F

    2015-07-01

    Juvenile sockeye salmon Oncorhynchus nerka that were reared and smolted in laboratory conditions were found to produce otolith daily increments, as well as a consistently visible marine-entry check formed during their transition to salt water. Field-collected O. nerka post-smolts of an equivalent age also displayed visible checks; however, microchemistry estimates of marine-entry date using Sr:Ca ratios differed from visual estimates by c. 9 days suggesting that microstructural and microchemical processes occur on different time scales. © 2015 The Fisheries Society of the British Isles.

  4. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
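
    As a rough illustration of the predicate abstraction step mentioned here, the sketch below collapses concrete program states onto tuples of predicate truth values, so that exploration stops once an abstract state repeats. The predicates and the toy transition relation are invented for illustration and are not taken from the COMFORT framework.

```python
# Minimal sketch of predicate abstraction: concrete states are collapsed onto
# tuples of predicate truth values, giving a finite abstract state space.
# The predicates and the toy transition relation are illustrative assumptions.
from typing import Callable, Dict, List, Tuple

Predicate = Callable[[Dict[str, int]], bool]

def abstract(state: Dict[str, int], preds: List[Predicate]) -> Tuple[bool, ...]:
    return tuple(p(state) for p in preds)

preds: List[Predicate] = [lambda s: s["x"] > 0, lambda s: s["x"] < s["n"]]

def step(state: Dict[str, int]) -> Dict[str, int]:     # toy concrete transition: x += 1
    return {"x": state["x"] + 1, "n": state["n"]}

seen, frontier = set(), [{"x": 0, "n": 3}]
while frontier:                       # explore until no new *abstract* state appears
    s = frontier.pop()
    a = abstract(s, preds)
    if a in seen:
        continue
    seen.add(a)
    frontier.append(step(s))
print(seen)                           # e.g. {(False, True), (True, True)}
```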

  5. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

    The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise the model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify the constraints that the models have to satisfy. An experimental evaluation of the approach is presented through an application developed in the ArgoUML IDE.
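
    A possible reading of the incremental idea is sketched below: each validation context covers a view (a subset of model elements) together with its constraints, and after an edit only the contexts whose view overlaps the changed elements are re-validated. The plain-Python constraint representation is an assumption for illustration; the paper expresses constraints in description logics over XML Schema/UML models.

```python
# Sketch of context-based incremental model checking: only validation
# contexts (views) touched by a change are re-validated. The model and
# constraint representation below is an assumption made for illustration.
from typing import Callable, Dict, List, Set

Model = Dict[str, dict]                     # element id -> element data
Constraint = Callable[[Model], bool]

class ValidationContext:
    def __init__(self, name: str, elements: Set[str], constraints: List[Constraint]):
        self.name, self.elements, self.constraints = name, elements, constraints

    def check(self, model: Model) -> bool:
        return all(c(model) for c in self.constraints)

def incremental_check(model: Model, contexts: List[ValidationContext],
                      changed: Set[str]) -> Dict[str, bool]:
    """Re-validate only the contexts whose view overlaps the changed elements."""
    return {ctx.name: ctx.check(model)
            for ctx in contexts if ctx.elements & changed}

model = {"Customer": {"key": "id"}, "Order": {"key": None}}
contexts = [
    ValidationContext("customer_view", {"Customer"},
                      [lambda m: m["Customer"]["key"] is not None]),
    ValidationContext("order_view", {"Order"},
                      [lambda m: m["Order"]["key"] is not None]),
]
print(incremental_check(model, contexts, changed={"Order"}))  # {'order_view': False}
```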

  6. Suprathreshold contrast summation over area using drifting gratings.

    PubMed

    McDougall, Thomas J; Dickinson, J Edwin; Badcock, David R

    2018-04-01

    This study investigated contrast summation over area for moving targets applied to a fixed-size contrast pedestal-a technique originally developed by Meese and Summers (2007) to demonstrate strong spatial summation of contrast for static patterns at suprathreshold contrast levels. Target contrast increments (drifting gratings) were applied to either the entire 20% contrast pedestal (a full fixed-size drifting grating), or in the configuration of a checkerboard pattern in which the target increment was applied to every alternate check region. These checked stimuli are known as "Battenberg patterns" and the sizes of the checks were varied (within a fixed overall area), across conditions, to measure summation behavior. Results showed that sensitivity to an increment covering the full pedestal was significantly higher than that for the Battenberg patterns (areal summation). Two observers showed strong summation across all check sizes (0.71°-3.33°), and for two other observers the summation ratio dropped to levels consistent with probability summation once check size reached 2.00°. Therefore, areal summation with moving targets does operate at high contrast, and is subserved by relatively large receptive fields covering a square area extending up to at least 3.33° × 3.33° for some observers. Previous studies in which the spatial structure of the pedestal and target covaried were unable to demonstrate spatial summation, potentially due to increasing amounts of suppression from gain-control mechanisms which increases as pedestal size increases. This study shows that when this is controlled, by keeping the pedestal the same across all conditions, extensive summation can be demonstrated.

  7. Plan-Do-Check-Act and the Management of Institutional Research. AIR 1992 Annual Forum Paper.

    ERIC Educational Resources Information Center

    McLaughlin, Gerald W.; Snyder, Julie K.

    This paper describes the application of a Total Quality Management strategy called Plan-Do-Check-Act (PDCA) to the projects and activities of an institutional research office at the Virginia Polytechnic Institute and State University. PDCA is a cycle designed to facilitate incremental continual improvement through change. The specific steps are…

  8. James Webb Space Telescope: Frequently Asked Questions for Scientists and Engineers

    NASA Technical Reports Server (NTRS)

    Gardner, Jonathan P.

    2008-01-01

    JWST will be tested incrementally during its construction, starting with individual mirrors and instruments (including cameras and spectrometers) and building up to the full observatory. JWST's mirrors and the telescope structure are first each tested individually, including optical testing of the mirrors and alignment testing of the structure inside a cold thermal-vacuum chamber. The mirrors are then installed on the telescope structure in a clean room at Goddard Space Flight Center (GSFC). In parallel to the telescope assembly and alignment, the instruments are being built and tested, again first individually, and then as part of an integrated instrument assembly. The integrated instrument assembly will be tested in a thermal-vacuum chamber at GSFC using an optical simulator of the telescope. This testing makes sure the instruments are properly aligned relative to each other and also provides an independent check of the individual tests. After both the telescope and the integrated instrument module are successfully assembled, the integrated instrument module will be installed onto the telescope, and the combined system will be sent to Johnson Space Flight Center (JSC) where it will be optically tested in one of the JSC chambers. The process includes testing the 18 primary mirror segments acting as a single primary mirror, and testing the end-to-end system. The final system test will assure that the combined telescope and instruments are focused and aligned properly, and that the alignment, once in space, will be within the range of the actively controlled optics. In general, the individual optical tests of instruments and mirrors are the most accurate. The final system tests provide a cost-effective check that no major problem has occurred during assembly. In addition, independent optical checks of earlier tests will be made as the full system is assembled, providing confidence that there are no major problems.

  9. Geometrically Nonlinear Finite Element Analysis of a Composite Space Reflector

    NASA Technical Reports Server (NTRS)

    Lee, Kee-Joo; Leet, Sung W.; Clark, Greg; Broduer, Steve (Technical Monitor)

    2001-01-01

    Lightweight aerospace structures, such as low areal density composite space reflectors, are highly flexible and may undergo large deflection under applied loading, especially during the launch phase. Accordingly, geometrically nonlinear analysis that takes into account the effect of finite rotation may be needed to determine the deformed shape for a clearance check and the stress and strain state to ensure structural integrity. In this study, deformation of the space reflector is determined under static conditions using a geometrically nonlinear solid shell finite element model. For the solid shell element formulation, the kinematics of deformation is described by six variables that are purely vector components. Because rotational angles are not used, this approach is free of the limitations of small angle increments. This also allows easy connections between substructures and large load increments with respect to the conventional shell formulation using rotational parameters. Geometrically nonlinear analyses were carried out for three cases of static point loads applied at selected points. A chart shows results for a case when the load is applied at the center point of the reflector dish. The computed results capture the nonlinear behavior of the composite reflector as the applied load increases. Also, they are in good agreement with the data obtained by experiments.
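
    The abstract does not give the element formulation, so the sketch below only shows the generic increment-and-iterate structure such geometrically nonlinear analyses use: the external load is applied in increments and each increment is equilibrated by Newton iterations, here on a scalar stiffening "spring". It is not the solid shell formulation of the paper.

```python
# Generic sketch of incremental load stepping with Newton iterations,
# illustrated on a scalar stiffening "spring" f_int(u) = k*u + c*u**3.
# This is not the paper's solid shell formulation, just the standard
# increment/iterate structure used in geometrically nonlinear analysis.
k, c, f_ext_total = 10.0, 4.0, 100.0

def f_int(u): return k * u + c * u**3
def tangent(u): return k + 3.0 * c * u**2

u = 0.0
n_increments = 5
for inc in range(1, n_increments + 1):
    f_target = f_ext_total * inc / n_increments      # apply load in increments
    for _ in range(20):                              # Newton iterations per increment
        residual = f_int(u) - f_target
        if abs(residual) < 1e-10:
            break
        u -= residual / tangent(u)
    print(f"increment {inc}: load={f_target:6.1f}  displacement={u:.4f}")
```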

  10. Advanced CLIPS capabilities

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is a forward chaining rule based language developed by NASA. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. The current release of CLIPS, version 4.3, is being used by over 2500 users throughout the public and private community. The primary addition to the next release of CLIPS, version 5.0, will be the CLIPS Object Oriented Language (COOL). The major capabilities of COOL are: class definition with multiple inheritance and no restrictions on the number, types, or cardinality of slots; message passing which allows procedural code bundled with an object to be executed; and query functions which allow groups of instances to be examined and manipulated. In addition to COOL, numerous other enhancements were added to CLIPS including: generic functions (which allow different pieces of procedural code to be executed depending upon the types or classes of the arguments); integer and double precision data type support; multiple conflict resolution strategies; global variables; logical dependencies; type checking on facts; full ANSI compiler support; and incremental reset for rules.

  11. Integrating Incremental Learning and Episodic Memory Models of the Hippocampal Region

    ERIC Educational Resources Information Center

    Meeter, M.; Myers, C. E.; Gluck, M. A.

    2005-01-01

    By integrating previous computational models of corticohippocampal function, the authors develop and test a unified theory of the neural substrates of familiarity, recollection, and classical conditioning. This approach integrates models from 2 traditions of hippocampal modeling, those of episodic memory and incremental learning, by drawing on an…

  12. World energy projection system: Model documentation

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas to produce the projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.
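
    The accounting logic summarized here can be illustrated with a toy calculation: projected consumption is GDP times an assumed energy intensity, and an independently projected supply figure is used only as a consistency check, in the spirit of the ICTM and natural-gas supply checks mentioned above. All numbers in the sketch are invented, not EIA data.

```python
# Tiny sketch of the accounting logic described above: consumption is
# projected from GDP and an assumed energy intensity, and an independently
# projected supply figure is used only as a consistency check.
# All figures below are illustrative, not EIA data.
gdp_projection = {2025: 105.0, 2030: 120.0}          # index units
energy_intensity = {2025: 0.92, 2030: 0.85}          # energy per unit GDP (declining)
independent_supply_projection = {2025: 97.0, 2030: 101.0}

for year in sorted(gdp_projection):
    consumption = gdp_projection[year] * energy_intensity[year]
    supply = independent_supply_projection[year]
    flag = "OK" if consumption <= supply else "exceeds supply check"
    print(f"{year}: projected consumption {consumption:6.1f}  ({flag})")
```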

  13. An incremental approach to genetic-algorithms-based classification.

    PubMed

    Guan, Sheng-Uei; Zhu, Fangming

    2005-04-01

    Incremental learning has been widely addressed in the machine learning literature to cope with learning tasks where the learning environment is ever changing or training samples become available over time. However, most research work explores incremental learning with statistical algorithms or neural networks, rather than evolutionary algorithms. The work in this paper employs genetic algorithms (GAs) as basic learning algorithms for incremental learning within one or more classifier agents in a multiagent environment. Four new approaches with different initialization schemes are proposed. They keep the old solutions and use an "integration" operation to integrate them with new elements to accommodate new attributes, while biased mutation and crossover operations are adopted to further evolve a reinforced solution. The simulation results on benchmark classification data sets show that the proposed approaches can deal with the arrival of new input attributes and integrate them with the original input space. It is also shown that the proposed approaches can be successfully used for incremental learning and improve classification rates as compared to the retraining GA. Possible applications for continuous incremental training and feature selection are also discussed.
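
    One way to picture the "integration" operation described here is sketched below: existing interval-based rule chromosomes are kept when a new input attribute arrives, a wide "don't care" gene is appended for it, and mutation is biased toward the new gene so evolution can refine it. The encoding and operators are simplified assumptions, not the paper's exact GA design.

```python
# Sketch of integrating a new input attribute into existing GA chromosomes:
# old interval genes are kept and a wide "don't care" interval is appended
# for the new attribute, after which evolution continues. The encoding and
# the biased mutation below are simplified assumptions for illustration.
import random
random.seed(0)

def random_rule(n_attrs):
    # one (low, high) interval per attribute, values assumed normalised to [0, 1]
    return [tuple(sorted((random.random(), random.random()))) for _ in range(n_attrs)]

def integrate_new_attribute(population):
    # keep old solutions; the new gene starts as an (almost) don't-care interval
    return [rule + [(0.0, 1.0)] for rule in population]

def biased_mutation(rule, new_attr_index, rate_old=0.05, rate_new=0.4):
    # mutate the newly added gene more aggressively than the established ones
    mutated = []
    for i, (lo, hi) in enumerate(rule):
        rate = rate_new if i == new_attr_index else rate_old
        if random.random() < rate:
            lo, hi = sorted((random.random(), random.random()))
        mutated.append((lo, hi))
    return mutated

population = [random_rule(3) for _ in range(4)]       # classifiers over 3 attributes
population = integrate_new_attribute(population)      # a 4th attribute arrives
population = [biased_mutation(r, new_attr_index=3) for r in population]
print(population[0])
```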

  14. Energy Conservation Programs | Climate Neutral Research Campuses | NREL

    Science.gov Websites

    . Recognize accomplishments. One common theme is that successful programs check in often with the target very small scale with one building or one department. The success and savings from that effort can then be used to grow incrementally. Harvard University adopted this approach, where investment in one

  15. Integrated Personnel and Pay System-Army Increment 2 (IPPS-A Inc 2)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report Integrated Personnel and Pay System-Army Increment 2 (IPPS-A Inc 2) Defense Acquisition...703-325-3747 DSN Phone: 865-2915 DSN Fax: 221-3747 Date Assigned: May 2, 2014 Program Information Program Name Integrated Personnel and Pay System...Program Description The Integrated Personnel and Pay System-Army (IPPS-A) will provide the Army with an integrated, multi-Component (Active, National

  16. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
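
    The assume-guarantee rule behind this framework can be illustrated on toy components whose behaviors are finite trace sets, with composition modelled as trace-set intersection: if M1 satisfies P under assumption A and every behavior of M2 is allowed by A, then the composed system satisfies P. The sketch computes the weakest assumption by brute force over a small trace universe; the paper instead learns A incrementally from model-checking counterexamples.

```python
# Toy illustration of the assume-guarantee rule:
#   (1) under assumption A, M1 satisfies P    (2) M2 satisfies A
#   =>  M1 || M2 satisfies P.
# Behaviors are finite trace sets and composition is trace-set intersection
# (an assumption for illustration). The weakest assumption A is computed by
# brute force here; the framework above learns it from counterexamples.
UNIVERSE = {("req", "ack"), ("req", "req", "ack"), ("ack",), ("ack", "req")}
M1 = {("req", "ack"), ("req", "req", "ack"), ("ack",)}     # component behaviors
M2 = {("req", "ack"), ("req", "req", "ack")}               # environment behaviors
P = lambda t: t[0] == "req"        # property: every trace starts with a request

# Weakest assumption: all traces that cannot make M1 violate P when composed.
A = {t for t in UNIVERSE if t not in M1 or P(t)}

premise1 = all(P(t) for t in (M1 & A))     # <A> M1 <P>
premise2 = M2 <= A                         # <true> M2 <A>
print("conclude M1 || M2 satisfies P:", premise1 and premise2)   # True, without
# ever checking the composition M1 & M2 directly.
```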

  17. Addressing System Reconfiguration and Incremental Integration within IMA Systems

    NASA Astrophysics Data System (ADS)

    Ferrero, F.; Rodríques, A. I.

    2009-05-01

    Recently, the space industry has been paying special attention to Integrated Modular Avionics (IMA) systems because of the benefits that modular concepts could bring to the development of space applications, especially in terms of interoperability, flexibility and software reuse. Two important IMA goals to be highlighted are system reconfiguration and incremental integration of new functionalities into a pre-existing system. The purpose of this paper is to show how system reconfiguration is conducted based on Allied Standard Avionics Architecture Council (ASAAC) concepts for IMA systems. In addition, it aims to provide a proposal for addressing the incremental integration concept, supported by our experience gained during the European Technology Acquisition Program (ETAP) TDP1.7 programme. All these topics will be discussed taking into account safety issues and showing the blueprint as an appropriate technique to support these concepts.

  18. A Discussion of Issues in Integrity Constraint Monitoring

    NASA Technical Reports Server (NTRS)

    Fernandez, Francisco G.; Gates, Ann Q.; Cooke, Daniel E.

    1998-01-01

    In the development of large-scale software systems, analysts, designers, and programmers identify properties of data objects in the system. The ability to check those assertions during runtime is desirable as a means of verifying the integrity of the program. Typically, programmers ensure the satisfaction of such properties through the use of some form of manually embedded assertion check. The disadvantage of this approach is that these assertions become entangled within the program code. The goal of the research is to develop an integrity constraint monitoring mechanism whereby a repository of software system properties (called integrity constraints) is automatically inserted into the program by the mechanism to check for incorrect program behaviors. Such a mechanism would overcome many of the deficiencies of manually embedded assertion checks. This paper gives an overview of the preliminary work performed toward this goal. The manual instrumentation of constraint checking on a series of test programs is discussed. This review is then used as the basis for a discussion of issues to be considered in developing an automated integrity constraint monitor.
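
    The contrast between hand-embedded asserts and a repository-driven monitor can be sketched in a few lines: constraints live in a central repository keyed by class and attribute, and a small hook checks them automatically on every update. The repository format and the __setattr__ hook are illustrative only, not the mechanism the authors develop.

```python
# Sketch of checking constraints from a central repository automatically on
# every attribute update, instead of scattering manual assert statements.
# The repository format and hook are illustrative, not the authors' mechanism.
CONSTRAINT_REPOSITORY = {
    ("Account", "balance"): [lambda v: v >= 0],
    ("Account", "owner"):   [lambda v: isinstance(v, str) and v != ""],
}

class Monitored:
    def __setattr__(self, name, value):
        for check in CONSTRAINT_REPOSITORY.get((type(self).__name__, name), []):
            if not check(value):
                raise ValueError(
                    f"integrity constraint violated: {type(self).__name__}.{name}={value!r}")
        super().__setattr__(name, value)

class Account(Monitored):
    def __init__(self, owner, balance):
        self.owner, self.balance = owner, balance

acct = Account("Ada", 100)
acct.balance = 25          # passes all registered constraints
try:
    acct.balance = -5      # caught by the monitor, not by hand-written asserts
except ValueError as e:
    print(e)
```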

  19. Effects of monetary reward and punishment on information checking behaviour: An eye-tracking study.

    PubMed

    Li, Simon Y W; Cox, Anna L; Or, Calvin; Blandford, Ann

    2018-07-01

    The aim of the present study was to investigate the effect of error consequence, as reward or punishment, on individuals' checking behaviour following data entry. This study comprised two eye-tracking experiments that replicate and extend the investigation of Li et al. (2016) into the effect of monetary reward and punishment on data-entry performance. The first experiment adopted the same experimental setup as Li et al. (2016) but additionally used an eye tracker. The experiment validated Li et al.'s (2016) finding that, when compared to no error consequence, both reward and punishment led to improved data-entry performance in terms of reducing errors, and that no performance difference was found between reward and punishment. The second experiment extended the earlier study by associating error consequence with each individual trial by providing immediate performance feedback to participants. It was found that gradual increment (i.e. reward feedback) also led to significantly more accurate performance than no error consequence. It is unclear whether gradual increment is more effective than gradual decrement because of the small sample size tested. However, this study reasserts the effectiveness of reward on data-entry performance. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Upper-Air Quality Control, A Comparison Study

    DTIC Science & Technology

    1993-12-01

    [Garbled abstract excerpt: the recoverable fragment describes a hydrostatic check based on redundancy, in which the reported value at the point in question is compared against a value interpolated from four increments taken in different quadrants; the remainder is residue of a table of regional upper-air report statistics (Europe, former USSR, and other regions).]

  1. Redundancy checking algorithms based on parallel novel extension rule

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai

    2017-05-01

    Redundancy checking (RC) is a key knowledge reduction technology. The extension rule (ER) is a relatively new reasoning method, first presented in 2003 and well received by experts at home and abroad. The novel extension rule (NER) is an improved ER-based reasoning method, presented in 2009. In this paper, we first analyse the characteristics of the extension rule, and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy. Using the aforementioned rule and strategy, we design and implement the RCHER algorithm, which relies on MIMF. Next we design and implement an RCNER (redundancy checking based on NER) algorithm based on NER. Parallel computing greatly accelerates the NER algorithm, which has weak dependence among tasks when executed. Considering this, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking. Furthermore, we design and implement the RCPNER (redundancy checking based on PNER) and NCPPNER (necessary clause partition based on PNER) algorithms as well. The experimental results show that MIMF significantly influences the acceleration of the RCER algorithm on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can approach the number of task decompositions. Comparing NCPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae grows. Finally, we describe the challenges that the extension rule will face and suggest possible solutions.
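
    For orientation, redundancy of a clause c in a CNF formula F means that F \ {c} already entails c, i.e. (F \ {c}) AND (NOT c) is unsatisfiable. The sketch below decides this by brute-force truth-table enumeration rather than by the extension-rule reasoning (ER/NER/PNER) the paper develops, so it only illustrates the definition being checked.

```python
# A clause c is redundant in CNF formula F iff F \ {c} entails c, i.e.
# (F \ {c}) AND (NOT c) is unsatisfiable. This sketch decides that by
# brute-force enumeration; the paper does it with the (parallel) extension
# rule instead, which is the whole point of ER/NER/PNER.
from itertools import product

def satisfies(assignment, clause):            # clause = set of signed ints (DIMACS-style)
    return any((lit > 0) == assignment[abs(lit)] for lit in clause)

def satisfiable(clauses, n_vars):
    return any(all(satisfies(dict(enumerate(bits, 1)), c) for c in clauses)
               for bits in product([False, True], repeat=n_vars))

def is_redundant(formula, clause, n_vars):
    rest = [c for c in formula if c != clause]
    negation = [{-lit} for lit in clause]     # NOT(l1 v l2 ...) = unit clauses -l1, -l2, ...
    return not satisfiable(rest + negation, n_vars)

F = [{1, 2}, {-1, 2}, {2}]                    # over variables x1, x2
print(is_redundant(F, {2}, n_vars=2))         # True: {x1 v x2} and {-x1 v x2} entail {x2}
print(is_redundant(F, {1, 2}, n_vars=2))      # True: {x2} alone entails {x1 v x2}
```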

  2. Comment on id-based remote data integrity checking with data privacy preserving

    NASA Astrophysics Data System (ADS)

    Zhang, Jianhong; Meng, Hongxin

    2017-09-01

    Recently, an ID-based remote data integrity checking protocol with perfect data privacy preserving (IEEE Transactions on Information Forensics and Security, doi: 10.1109/TIFS.2016.2615853) was proposed to achieve data privacy protection and integrity checking. Unfortunately, in this letter, we demonstrate that their protocol is insecure. An active attacker can modify the stored data without being detected by the verifier during auditing. We also show that a malicious cloud server can convince the verifier that the stored data are kept intact even after the outsourced data blocks have been deleted. Finally, the reasons these attacks are possible are given.

  3. SU8 diaphragm micropump with monolithically integrated cantilever check valves.

    PubMed

    Ezkerra, Aitor; Fernández, Luis José; Mayora, Kepa; Ruano-López, Jesús Miguel

    2011-10-07

    This paper presents an SU8 unidirectional diaphragm micropump with embedded out-of-plane cantilever check valves. The device represents a reliable and low-cost solution for integration of microfluidic control in lab-on-a-chip devices. Its planar architecture allows monolithic definition of its components in a single step and potential integration with previously reported PCR, electrophoresis and flow-sensing SU8 microdevices. Pneumatic actuation is applied on a PDMS diaphragm, which is bonded to the SU8 body at wafer level, further enhancing its integration and mass production capabilities. The cantilever check valves move synchronously with the diaphragm, feature fast response (10 ms), low dead volume (86 nl) and a 94% flow blockage up to 300 kPa. The micropump achieves a maximum flow rate of 177 μl min⁻¹ at 6 Hz and 200 kPa with an effective area of 10 mm². The device is reliable, self-priming and tolerant to particles and big bubbles. To the knowledge of the authors, this is the first micropump in SU8 with monolithically integrated cantilever check valves.

  4. Start-Up and Ongoing Practice Expenses of Behavioral Health and Primary Care Integration Interventions in the Advancing Care Together (ACT) Program.

    PubMed

    Wallace, Neal T; Cohen, Deborah J; Gunn, Rose; Beck, Arne; Melek, Steve; Bechtold, Donald; Green, Larry A

    2015-01-01

    Provide credible estimates of the start-up and ongoing effort and incremental practice expenses for the Advancing Care Together (ACT) behavioral health and primary care integration interventions. Expenditure data were collected from 10 practice intervention sites using an instrument with a standardized general format that could accommodate the unique elements of each intervention. Average start-up effort expenses were $44,076 and monthly ongoing effort expenses per patient were $40.39. Incremental expenses averaged $20,788 for start-up and $4.58 per patient for monthly ongoing activities. Variations in expenditures across practices reflect the differences in intervention specifics and organizational settings. Differences in effort to incremental expenditures reflect the extensive use of existing resources in implementing the interventions. ACT program incremental expenses suggest that widespread adoption would likely have a relatively modest effect on overall health systems expenditures. Practice effort expenses are not trivial and may pose barriers to adoption. Payers and purchasers interested in attaining widespread adoption of integrated care must consider external support to practices that accounts for both incremental and effort expense levels. Existing knowledge transfer mechanisms should be employed to minimize developmental start-up expenses and payment reform focused toward value-based, Triple Aim-oriented reimbursement and purchasing mechanisms are likely needed. © Copyright 2015 by the American Board of Family Medicine.

  5. Estimating the variance and integral scale of the transmissivity field using head residual increments

    USGS Publications Warehouse

    Zheng, Li; Silliman, Stephen E.

    2000-01-01

    A modification of previously published solutions regarding the spatial variation of hydraulic heads is discussed whereby the semivariogram of increments of head residuals (termed head residual increments, HRIs) is related to the variance and integral scale of the transmissivity field. A first‐order solution is developed for the case of a transmissivity field which is isotropic and whose second‐order behavior can be characterized by an exponential covariance structure. The estimates of the variance σY² and the integral scale λ of the log transmissivity field are then obtained via fitting a theoretical semivariogram for the HRI to its sample semivariogram. This approach is applied to head data sampled from a series of two‐dimensional, simulated aquifers with isotropic, exponential covariance structures and varying degrees of heterogeneity (σY² = 0.25, 0.5, 1.0, 2.0, and 5.0). The results show that this method provided reliable estimates for both λ and σY² in aquifers with values of σY² up to 2.0, but the errors in those estimates were higher for σY² equal to 5.0. It is also demonstrated through numerical experiments and theoretical arguments that the head residual increments will provide a sample semivariogram with a lower variance than will the use of the head residuals without calculation of increments.
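
    The estimation workflow can be sketched as: form head residual increments along a transect, compute their sample semivariogram, and least-squares fit a model semivariogram to read off a variance (sill) and an integral scale. The paper's first-order theoretical HRI semivariogram is not reproduced here, so a generic exponential model and synthetic data stand in for it.

```python
# Sketch of the estimation workflow: sample semivariogram of head residual
# increments (HRIs) along a transect, then a least-squares fit of a model
# semivariogram to recover a variance (sill) and an integral scale. The
# generic exponential model and the synthetic data are stand-ins; the paper
# derives a specific first-order theoretical HRI semivariogram.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
n = 200
incr = np.zeros(n)
noise = rng.normal(0.0, 0.1, n)
for i in range(1, n):
    incr[i] = 0.8 * incr[i - 1] + noise[i]        # exponentially correlated series
head_residuals = np.cumsum(incr)                  # synthetic residuals on a transect
hri = np.diff(head_residuals)                     # head residual increments (HRIs)

def sample_semivariogram(z, lags):
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

lags = np.arange(1, 20)
gamma_hat = sample_semivariogram(hri, lags)

def exp_model(h, sill, integral_scale):           # generic exponential semivariogram
    return sill * (1.0 - np.exp(-h / integral_scale))

(sill, scale), _ = curve_fit(exp_model, lags, gamma_hat, p0=[gamma_hat[-1], 5.0])
print(f"fitted variance (sill) = {sill:.4f}, integral scale = {scale:.2f} grid spacings")
```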

  6. Integration Process for the Habitat Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott

    2010-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews have allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of presently unanticipated systems. Results of the HDU field tests will influence future designs of habitat systems.

  7. Enhanced invitation methods to increase uptake of NHS health checks: study protocol for a randomized controlled trial.

    PubMed

    Forster, Alice S; Burgess, Caroline; McDermott, Lisa; Wright, Alison J; Dodhia, Hiten; Conner, Mark; Miller, Jane; Rudisill, Caroline; Cornelius, Victoria; Gulliford, Martin C

    2014-08-30

    NHS Health Checks is a new program for primary prevention of heart disease, stroke, diabetes, chronic kidney disease, and vascular dementia in adults aged 40 to 74 years in England. Individuals without existing cardiovascular disease or diabetes are invited for a Health Check every 5 years. Uptake among those invited is lower than anticipated. The project is a three-arm randomized controlled trial to test the hypothesis that enhanced invitation methods, using the Question-Behaviour Effect (QBE), will increase uptake of NHS Health Checks compared with a standard invitation. Participants comprise individuals eligible for an NHS Health Check registered in two London boroughs. Participants are randomized into one of three arms. Group A receives the standard NHS Health Check invitation letter, information sheet, and reminder letter at 12 weeks for nonattenders. Group B receives a QBE questionnaire 1 week before receiving the standard invitation, information sheet, and reminder letter where appropriate. Group C is the same as Group B, but participants are offered a £5 retail voucher if they return the questionnaire. Participants are randomized in equal proportions, stratified by general practice. The primary outcome is uptake of NHS Health Checks 6 months after invitation from electronic health records. We will estimate the incremental health service cost per additional completed Health Check for trial groups B and C versus trial arm A, as well as evaluating the impact of the QBE questionnaire, and questionnaire plus voucher, on the socioeconomic inequality in uptake of Health Checks. The trial includes a nested comparison of two methods for implementing allocation, one implemented manually at general practices and the other implemented automatically through the information systems used to generate invitations for the Health Check. The research will provide evidence on whether asking individuals to complete a preliminary questionnaire, by using the QBE, is effective in increasing uptake of Health Checks and whether an incentive alters questionnaire return rates as well as uptake of Health Checks. The trial interventions can be readily translated into routine service delivery if they are shown to be cost-effective. Current Controlled Trials ISRCTN42856343. Date registered: 21.03.2013.

  8. International Space Station Increment Operations Services

    NASA Astrophysics Data System (ADS)

    Michaelis, Horst; Sielaff, Christian

    2002-01-01

    The Industrial Operator (IO) has defined End-to-End services to perform efficiently all required operations tasks for the Manned Space Program (MSP) as agreed during the Ministerial Council in Edinburgh in November 2001. Those services are the result of a detailed task analysis based on the operations processes as derived from the Space Station Program Implementation Plans (SPIP) and defined in the Operations Processes Documents (OPD). These services are related to ISS Increment Operations and ATV Mission Operations. Each of these End-to-End services is typically characterised by the following properties: it has a clearly defined starting point, where all requirements on the end product are fixed and the associated performance metrics of the customer are well defined; it has a clearly defined ending point, when the product or service is delivered to the customer and accepted by him, according to the performance metrics defined at the start point; and its implementation might be restricted by external boundary conditions and constraints mutually agreed with the customer, but as long as those are respected the IO has free choice of the methods and means of implementation. The ISS Increment Operations Service (IOS) activities required for the MSP Exploitation program cover the complete increment-specific cycle, starting with the support to strategic planning and ending with the post-increment evaluation. These activities are divided into sub-services including the following tasks: ISS Planning Support, covering the support to strategic and tactical planning up to the generation; Development & Payload Integration Support; ISS Increment Preparation; and ISS Increment Execution. These processes are tied together by the Increment Integration Management, which provides the planning and scheduling of all activities as well as the technical management of the overall process. The paper describes the entire End-to-End ISS Increment Operations service and its implementation to support the Columbus Flight 1E related increment and subsequent ISS increments. Special attention is paid to the implications caused by long-term operations on hardware, software and operations personnel.

  9. An Environment for Incremental Development of Distributed Extensible Asynchronous Real-time Systems

    NASA Technical Reports Server (NTRS)

    Ames, Charles K.; Burleigh, Scott; Briggs, Hugh C.; Auernheimer, Brent

    1996-01-01

    Incremental parallel development of distributed real-time systems is difficult. Architectural techniques and software tools developed at the Jet Propulsion Laboratory's (JPL's) Flight System Testbed make feasible the integration of complex systems in various stages of development.

  10. A comparative study of velocity increment generation between the rigid body and flexible models of MMET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ismail, Norilmi Amilia, E-mail: aenorilmi@usm.my

    The motorized momentum exchange tether (MMET) is capable of generating useful velocity increments through spin–orbit coupling. This study presents a comparison of the velocity increments between the rigid body and flexible models of the MMET. The equations of motion of both models in the time domain are transformed into functions of true anomaly. The equations of motion are integrated, and the responses in terms of the velocity increment of the rigid body and flexible models are compared and analysed. Results show that the initial conditions, eccentricity, and flexibility of the tether have significant effects on the velocity increments of the tether.
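
    For scale, the basic spin-orbit arithmetic behind an MMET velocity increment can be written down for an idealized rigid tether: a payload released from the spinning tip leaves with the tip speed added to (or subtracted from) the orbital speed. The figures below are illustrative only; the rigid/flexible comparison in this record requires integrating the full equations of motion in true anomaly.

```python
# Back-of-envelope arithmetic for the velocity increment an idealised rigid
# motorised tether can impart: a payload released from the spinning tip
# leaves with the tip speed added to (or subtracted from) the orbital speed.
# Figures are illustrative only, not results from the study above.
import math

MU_EARTH = 3.986004418e14      # m^3/s^2
r_orbit = 6_778_000.0          # ~400 km circular orbit radius, m
v_orbit = math.sqrt(MU_EARTH / r_orbit)

tether_half_length = 25_000.0  # m, payload sits at the tip
spin_rate = 0.02               # rad/s about the tether centre of mass
v_tip = spin_rate * tether_half_length

print(f"orbital speed      : {v_orbit:8.1f} m/s")
print(f"tip speed          : {v_tip:8.1f} m/s")
print(f"prograde release   : {v_orbit + v_tip:8.1f} m/s  (velocity increment +{v_tip:.0f})")
print(f"retrograde release : {v_orbit - v_tip:8.1f} m/s  (velocity increment -{v_tip:.0f})")
```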

  11. A path to integration in an academic health science center.

    PubMed Central

    Panko, W. B.; Wilson, W.

    1992-01-01

    This article describes a networking and integration strategy in use at the University of Michigan Medical Center. This strategy builds upon the existing technology base and is designed to provide a roadmap that will direct short-term development along a productive, long-term path. It offers a way to permit the short-term development of incremental solutions to current problems while at the same time maximizing the likelihood that these incremental efforts can be recycled into a more comprehensive approach. PMID:1336413

  12. 10 CFR 73.67 - Licensee fixed site and in-transit requirements for the physical protection of special nuclear...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...

  13. 10 CFR 73.67 - Licensee fixed site and in-transit requirements for the physical protection of special nuclear...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...

  14. 10 CFR 73.67 - Licensee fixed site and in-transit requirements for the physical protection of special nuclear...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...

  15. 10 CFR 73.67 - Licensee fixed site and in-transit requirements for the physical protection of special nuclear...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...

  16. 10 CFR 73.67 - Licensee fixed site and in-transit requirements for the physical protection of special nuclear...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... detection and surveillance of unauthorized penetration or activities, (3) Monitor with an intrusion alarm or... acknowledges the specified mode of transport, (iii) Check the integrity of the container and locks or seals... material of moderate strategic significance shall: (i) Check the integrity of the containers and seals upon...

  17. Numerical simulation of pseudoelastic shape memory alloys using the large time increment method

    NASA Astrophysics Data System (ADS)

    Gu, Xiaojun; Zhang, Weihong; Zaki, Wael; Moumni, Ziad

    2017-04-01

    The paper presents a numerical implementation of the large time increment (LATIN) method for the simulation of shape memory alloys (SMAs) in the pseudoelastic range. The method was initially proposed as an alternative to the conventional incremental approach for the integration of nonlinear constitutive models. It is adapted here for the simulation of pseudoelastic SMA behavior using the Zaki-Moumni model and is shown to be especially useful in situations where the phase transformation process presents little or no hardening. In these situations, a slight stress variation in a load increment can result in large variations of strain and local state variables, which may lead to difficulties in numerical convergence. In contrast to the conventional incremental method, the LATIN method solves the global equilibrium and local consistency conditions sequentially for the entire loading path. The achieved solution must satisfy the conditions of static and kinematic admissibility and consistency simultaneously after several iterations. The 3D numerical implementation is accomplished using an implicit algorithm and is then used for finite element simulation using the software Abaqus. Computational tests demonstrate the ability of this approach to simulate SMAs presenting flat phase transformation plateaus and subjected to complex loading cases, such as the quasi-static behavior of a stent structure. Some numerical results are contrasted with those obtained using step-by-step incremental integration.

  18. Integral Analysis of Field Work and Laboratory Electrical Resistivity Imaging for Saline Water Intrusion Prediction in Groundwater

    NASA Astrophysics Data System (ADS)

    Zawawi, M. H.; Zahar, M. F.; Hashim, M. M. M.; Hazreek, Z. A. M.; Zahari, N. M.; Kamaruddin, M. A.

    2018-04-01

    Saline water intrusion is a serious threat to groundwater, as many parts of the world use groundwater as their main source of fresh water supply. Drinking water with a high salinity level can pose a serious health hazard to humans. Saline water intrusion is the induced flow of seawater into freshwater aquifers along the coastal area; it may result from human action and/or natural events, and climate change and sea-level rise may accelerate the process. The conventional method for detecting and monitoring saltwater intrusion into coastal aquifers is to collect and test groundwater from a series of observation wells (boreholes) in order to provide the hydrochemistry data needed to decide whether the water in a well is safe to consume. An integrated approach combining field and laboratory electrical resistivity investigations is proposed for indicating the contact region between saline and fresh groundwater. It was found that the correlations for both soil boxes produced almost identical curvilinear trends for 2% increments of seawater tested using a sand sample. This project contributes towards predicting saline water intrusion into groundwater with a non-destructive test that can replace the conventional method of groundwater monitoring using a series of boreholes in the coastal area.

  19. Incremental Validity and Informant Effect from a Multi-Method Perspective: Assessing Relations between Parental Acceptance and Children’s Behavioral Problems

    PubMed Central

    Izquierdo-Sotorrío, Eva; Holgado-Tello, Francisco P.; Carrasco, Miguel Á.

    2016-01-01

    This study examines the relationships between perceived parental acceptance and children’s behavioral problems (externalizing and internalizing) from a multi-informant perspective. Using mothers, fathers, and children as sources of information, we explore the informant effect and incremental validity. The sample was composed of 681 participants (227 children, 227 fathers, and 227 mothers). Children’s (40% boys) ages ranged from 9 to 17 years (M = 12.52, SD = 1.81). Parents and children completed both the Parental Acceptance Rejection/Control Questionnaire (PARQ/Control) and the check list of the Achenbach System of Empirically Based Assessment (ASEBA). Statistical analyses were based on the correlated uniqueness multitrait-multimethod matrix (model MTMM) by structural equations and different hierarchical regression analyses. Results showed a significant informant effect and a different incremental validity related to which combination of sources was considered. A multi-informant perspective rather than a single one increased the predictive value. Our results suggest that mother–father or child–father combinations seem to be the best way to optimize the multi-informant method in order to predict children’s behavioral problems based on perceived parental acceptance. PMID:27242582

  20. Incremental Validity and Informant Effect from a Multi-Method Perspective: Assessing Relations between Parental Acceptance and Children's Behavioral Problems.

    PubMed

    Izquierdo-Sotorrío, Eva; Holgado-Tello, Francisco P; Carrasco, Miguel Á

    2016-01-01

    This study examines the relationships between perceived parental acceptance and children's behavioral problems (externalizing and internalizing) from a multi-informant perspective. Using mothers, fathers, and children as sources of information, we explore the informant effect and incremental validity. The sample was composed of 681 participants (227 children, 227 fathers, and 227 mothers). Children's (40% boys) ages ranged from 9 to 17 years (M = 12.52, SD = 1.81). Parents and children completed both the Parental Acceptance Rejection/Control Questionnaire (PARQ/Control) and the check list of the Achenbach System of Empirically Based Assessment (ASEBA). Statistical analyses were based on the correlated uniqueness multitrait-multimethod matrix (model MTMM) by structural equations and different hierarchical regression analyses. Results showed a significant informant effect and a different incremental validity related to which combination of sources was considered. A multi-informant perspective rather than a single one increased the predictive value. Our results suggest that mother-father or child-father combinations seem to be the best way to optimize the multi-informant method in order to predict children's behavioral problems based on perceived parental acceptance.
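
    The incremental-validity logic mentioned here (hierarchical regression) can be sketched with synthetic data: fit the outcome on one informant's report, add the second informant, and read off the gain in R². Ordinary least squares on invented data is used below purely for illustration; the study's MTMM structural equation analyses are not reproduced.

```python
# Minimal sketch of incremental validity via hierarchical regression:
# does adding the father's acceptance report improve prediction of child
# behavioral problems beyond the mother's report alone? Synthetic data and
# ordinary least squares only; the study's MTMM/SEM analyses are not shown.
import numpy as np

rng = np.random.default_rng(42)
n = 227
mother = rng.normal(size=n)
father = 0.5 * mother + rng.normal(scale=0.9, size=n)     # informants correlate
problems = -0.4 * mother - 0.3 * father + rng.normal(scale=1.0, size=n)

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y))] + list(X))       # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_step1 = r_squared([mother], problems)
r2_step2 = r_squared([mother, father], problems)
print(f"R2 mother only      : {r2_step1:.3f}")
print(f"R2 mother + father  : {r2_step2:.3f}")
print(f"incremental validity (delta R2): {r2_step2 - r2_step1:.3f}")
```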

  1. Sample Processor for Life on Icy Worlds (SPLIce): Design and Test Results

    NASA Technical Reports Server (NTRS)

    Chinn, Tori N.; Lee, Anthony K.; Boone, Travis D.; Tan, Ming X.; Chin, Matthew M.; McCutcheon, Griffin C.; Horne, Mera F.; Padgen, Michael R.; Blaich, Justin T.; Forgione, Joshua B.; hide

    2017-01-01

    We report the design, development, and testing of the Sample Processor for Life on Icy Worlds (SPLIce) system, a microfluidic sample processor to enable autonomous detection of signatures of life and measurements of habitability parameters in Ocean Worlds. This monolithic fluid processing-and-handling system (Figure 1; mass 0.5 kg) retrieves a 50-µL-volume sample and prepares it to supply a suite of detection instruments, each with unique preparation needs. SPLIce has potential applications in orbiter missions that sample ocean plumes, such as those found at Saturn's icy moon Enceladus, or landed missions on the surface of icy satellites, such as Jupiter's moon Europa. Answering the question "Are we alone in the universe?" is captivating and exceptionally challenging. Even general criteria that define life very broadly include a significant role for water [1,2]. Searches for extinct or extant life therefore prioritize locations of abundant water, whether in ancient (Mars) or present (Europa and Enceladus) times. Only two previous planetary missions had onboard fluid processing: the Viking Biology Experiments [3] and Phoenix's Wet Chemistry Laboratory (WCL) [4]. SPLIce differs crucially from those systems, including in its capability to process and distribute µL-volume samples and its integration of autonomous control of a wide range of fluidic functions, including: 1) retrieval of fluid samples from an evacuated sample chamber; 2) onboard multi-year storage of dehydrated reagents; 3) integrated pressure, pH, and conductivity measurement; 4) filtration and retention of insoluble particles for microscopy; 5) dilution or vacuum-driven concentration of samples to accommodate instrument working ranges; 6) removal of gas bubbles from sample aliquots; 7) unidirectional flow (check valves); 8) active flow-path selection (solenoid-actuated valves); 9) metered pumping in 100 nL volume increments. The SPLIce manifold, made of three thermally fused layers of precision-machined cyclo-olefin polymer, supports all fluidic components (Figure 1) and integrated microchannels (125 x 250 µm). Fluid is pumped by a stepper-motor-driven pump (Lee Co.). The functionality of the integrated MEMS pressure sensor (Honeywell) and passive check valves (Figure 2) was tested in conjunction with our newly designed integral bubble traps (Figure 3) and hydrophobic membrane-based concentrator (Figure 4). The concentrator (initially tested as a standalone component) demonstrated 5-fold vacuum-evaporative concentration. Polyethylene fused bead beds (PEFBBs; 50% porosity) store dry, lyophilized buffers, calibrants, and fluorescent dyes, and also promote mixing of the sample with calibrant, dye, or H2O. Software-controlled automated tests demonstrated successful 1) fluid delivery to each component, 2) valve and pump synchronization, 3) sample aliquot delivery to instrument interface ports, and 4) rehydration of vacuum-dried fluorescent dye. In Figure 5, fluorescein on PEFBBs was rehydrated for 15 min using a pump-delivered water aliquot; it is displaced as H2O enters the bottom of the channel and pushes the dye into a check valve. Ultimately, SPLIce will fluorescently label amino acids in the sample for microchip-based electrophoretic (MCE) chiral separation and detection to seek and quantify key organic bio-signatures [5]; it will also deliver sample to a microfluidic version of WCL (mWCL) to measure soluble ions and redox-active species.
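
    As a purely hypothetical illustration of one detail above, metered delivery in 100 nL increments with active flow-path selection might be sequenced as in the sketch below. The class, the method names, and the numbers are invented for illustration and are not the SPLIce control software.

```python
# Hypothetical sketch of metered delivery: a requested aliquot volume is
# converted into whole 100 nL pump increments after a flow path is selected
# with a solenoid valve. All class/method names and numbers are invented;
# this is not the SPLIce control software.
import math

STEP_VOLUME_NL = 100                      # one stepper-driven pump increment

class ManifoldController:
    def select_path(self, valve_id: int) -> None:
        print(f"open solenoid valve {valve_id}")          # would actuate hardware

    def pump_steps(self, n: int) -> None:
        # would command the stepper-driven pump; here we only report the volume
        print(f"pumped {n} increments ({n * STEP_VOLUME_NL} nL)")

def deliver(controller: ManifoldController, valve_id: int, volume_nl: float) -> None:
    steps = math.ceil(volume_nl / STEP_VOLUME_NL)         # round up to whole steps
    controller.select_path(valve_id)
    controller.pump_steps(steps)

deliver(ManifoldController(), valve_id=3, volume_nl=850)  # -> 9 increments, 900 nL
```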

  2. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  3. Otolith research for Puget Sound

    USGS Publications Warehouse

    Larsen, K.; Reisenbichler, R.

    2007-01-01

    Otoliths are hard structures located in the brain cavity of fish. These structures are formed by a buildup of calcium carbonate within a gelatinous matrix that produces light and dark bands similar to the growth rings in trees. The width of the bands corresponds to environmental factors such as temperature and food availability. As juvenile salmon encounter different environments in their migration to sea, they produce growth increments of varying widths and visible 'checks' corresponding to times of stress or change. The resulting pattern of band variations and check marks leaves a record of fish growth and residence time in each habitat type. This information helps Puget Sound restoration by determining the importance of different habitats for the optimal health and management of different salmon populations. The USGS Western Fisheries Research Center (WFRC) provides otolith research findings directly to resource managers who put this information to work.

  4. CheckMATE 2: From the model to the limit

    NASA Astrophysics Data System (ADS)

    Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten

    2017-12-01

    We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.
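
    A simplified version of the kind of exclusion criterion CheckMATE reports can be sketched as an r-value per signal region, comparing the predicted signal yield with the published 95% CL limit and keeping the most constraining region. The numbers are invented, and the statistical treatment (signal uncertainties, expected-limit-based region selection) is deliberately simplified relative to the CheckMATE documentation.

```python
# Simplified sketch of the kind of exclusion criterion CheckMATE reports:
# for each signal region, compare the predicted signal yield S with the
# 95% CL upper limit S95 and take the most constraining region. The numbers
# and the neglect of signal uncertainties are illustrative simplifications.
signal_regions = {
    # name: (predicted signal events S, observed 95% CL limit S95_obs)
    "SR-2j-1600": (12.4, 8.6),
    "SR-4j-2200": (3.1, 5.9),
    "SR-6j-2600": (0.8, 4.2),
}

r_values = {name: s / s95 for name, (s, s95) in signal_regions.items()}
best_region = max(r_values, key=r_values.get)
print("r-values:", {k: round(v, 2) for k, v in r_values.items()})
print(f"most constraining region: {best_region}")
print("model excluded" if r_values[best_region] >= 1.0 else "model allowed")
```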

  5. Comparison of spot tests with AdultaCheck 6 and Intect 7 urine test strips for detecting the presence of adulterants in urine specimens.

    PubMed

    Dasgupta, Amitava; Chughtai, Omar; Hannah, Christina; Davis, Bonnette; Wells, Alice

    2004-10-01

    Several adulterants are used to mask tests for abused drugs in urine. Adulterants such as "Klear" and "Whizzies" contain potassium nitrite while "Urine Luck" contains pyridinium chlorochromate (PCC). The presence of these adulterants cannot be detected by routine specimen integrity checks (pH, specific gravity, creatinine and temperature). We previously reported the development of rapid spot tests to detect the presence of these adulterants. AdultaCheck 6 and Intect 7 urine test strips are commercially available for detecting the presence of these adulterants along with specific gravity, creatinine and pH in urine. The performance of these two test strips for detecting adulterants was compared with the results obtained by spot tests. Both AdultaCheck 6 and Intect 7 effectively detected the presence of nitrite and pyridinium chlorochromate in urine. Moreover, both test strips successfully detected the presence of glutaraldehyde, for which no spot test is currently available. High amounts of glucose and ascorbic acid did not cause any false-positive results with AdultaCheck 6 or Intect 7. Both AdultaCheck 6 and Intect 7 can be used for checking the integrity of a urine specimen submitted for drugs of abuse testing.

  6. Integration of Computer-Based Virtual Check Ride System--Pre-Trip Inspection in Commercial Driver License Training Program

    ERIC Educational Resources Information Center

    Makwana, Alpesh P.

    2009-01-01

    "Pre-Trip Inspection" of the truck and trailer is one of the components of the current Commercial Driver's License (CDL) test. This part of the CDL test checks the ability of the student to identify the important parts of the commercial vehicle and their potential defects. The "Virtual Check Ride System" (VCRS), a…

  7. Acceleration of incremental-pressure-correction incompressible flow computations using a coarse-grid projection method

    NASA Astrophysics Data System (ADS)

    Kashefi, Ali; Staples, Anne

    2016-11-01

    Coarse grid projection (CGP) methodology is a novel multigrid method for systems involving decoupled nonlinear evolution equations and linear elliptic equations. The nonlinear equations are solved on a fine grid and the linear equations are solved on a corresponding coarsened grid. Mapping functions transfer data between the two grids. Here we propose a version of CGP for incompressible flow computations using incremental pressure correction methods, called IFEi-CGP (implicit-time-integration, finite-element, incremental coarse grid projection). Incremental pressure correction schemes solve Poisson's equation for an intermediate variable and not the pressure itself. This fact contributes to IFEi-CGP's efficiency in two ways. First, IFEi-CGP preserves the velocity field accuracy even for a high level of pressure field grid coarsening and thus significant speedup is achieved. Second, because incremental schemes reduce the errors that arise from boundaries with artificial homogeneous Neumann conditions, CGP generates undamped flows for simulations with velocity Dirichlet boundary conditions. Comparisons of the data accuracy and CPU times for the incremental-CGP versus non-incremental-CGP computations are presented.
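
    The core CGP idea (evolve the velocity on the fine grid, but solve the pressure-increment Poisson problem on a coarser grid) can be illustrated with a short sketch. The factor-2 averaging restriction, piecewise-constant prolongation, Jacobi Poisson solver and crude boundary handling below are illustrative assumptions, not the finite-element IFEi-CGP formulation of the paper.

    ```python
    import numpy as np

    def restrict(f):
        """Average 2x2 fine cells onto one coarse cell (factor-2 coarsening)."""
        return 0.25 * (f[0::2, 0::2] + f[1::2, 0::2] + f[0::2, 1::2] + f[1::2, 1::2])

    def prolong(c):
        """Piecewise-constant injection of coarse values back to the fine grid."""
        return np.kron(c, np.ones((2, 2)))

    def poisson_jacobi(rhs, h, iters=2000):
        """Solve -lap(p) = rhs with zero boundary values by Jacobi iteration (illustrative)."""
        p = np.zeros_like(rhs)
        for _ in range(iters):
            p[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1] +
                                    p[1:-1, 2:] + p[1:-1, :-2] + h * h * rhs[1:-1, 1:-1])
        return p

    def divergence(u, v, h):
        du = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / (2 * h)
        dv = (np.roll(v, -1, axis=1) - np.roll(v, 1, axis=1)) / (2 * h)
        return du + dv

    # One projection step: approximately remove the divergence of a tentative
    # velocity using a pressure increment computed on the coarse grid only.
    n, h, dt = 64, 1.0 / 64, 1e-3
    u = np.random.rand(n, n) * 1e-2   # tentative (non-solenoidal) velocity, x-component
    v = np.random.rand(n, n) * 1e-2   # y-component

    rhs_fine = divergence(u, v, h) / dt
    phi_coarse = poisson_jacobi(restrict(rhs_fine), 2 * h)   # coarse Poisson solve
    phi = prolong(phi_coarse)                                # map the increment back

    u -= dt * (np.roll(phi, -1, axis=0) - np.roll(phi, 1, axis=0)) / (2 * h)
    v -= dt * (np.roll(phi, -1, axis=1) - np.roll(phi, 1, axis=1)) / (2 * h)
    ```

    The point of the sketch is only the division of labour: the expensive elliptic solve operates on one quarter of the unknowns, while the velocity field keeps its fine-grid resolution.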

  8. Integrated Personnel and Pay System-Army Increment 1 (IPPS-A Inc 1)

    DTIC Science & Technology

    2016-03-01

    [The source record reproduces only indexing fragments: an acronym list (Acquisition Executive; DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision; FY - Fiscal Year) and a truncated milestone schedule table giving Increment I FDD and Full Deployment dates between 2013 and 2015. The excerpt breaks off at "Per 10 U.S.C. Chapter 144A, the MS C Decision exceeded the..."]

  9. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off the shelf products to aid in the development of integrated facilities; enhanced automated code generation system slightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project data bases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.

  10. RETRACTED — PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2012-11-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has been shown to be an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. Low-density parity-check (LDPC) codes are newly introduced into the PMD mitigation scheme with D-FPSs in this paper as one of the promising FEC codes to achieve better performance. The scrambling speed of the FPS for the LDPC (2040, 1903) code system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in practical large scale integrated (LSI) circuits, the number of iterations in decoding LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, interleaving the LDPC codes brings an incremental gain in error-correction performance, and the PMD tolerance is 10 ps at OSNR=11.4 dB. The results indicate that LDPC codes are a viable substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.

  11. PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2013-09-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has been shown to be an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. Low-density parity-check (LDPC) codes are newly introduced into the PMD mitigation scheme with D-FPSs in this article as one of the promising FEC codes to achieve better performance. The scrambling speed of the FPS for the LDPC (2040, 1903) code system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in practical large scale integrated (LSI) circuits, the number of iterations in decoding LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, interleaving the LDPC codes brings an incremental gain in error-correction performance, and the PMD tolerance is 10 ps at OSNR=11.4 dB. The results indicate that LDPC codes are a viable substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.
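
    The reason interleaving helps an FEC code against PMD-induced bursts can be shown with a generic block interleaver: codewords are written row-wise and read column-wise, so a burst on the line is spread across many codewords and each decoder sees only a few errors. The sizes below are illustrative; the LDPC(2040, 1903) code and its decoder are not reproduced here.

    ```python
    import numpy as np

    def interleave(bits, depth, width):
        """Write `depth` codewords of `width` bits row-wise, read column-wise."""
        return np.asarray(bits).reshape(depth, width).T.ravel()

    def deinterleave(bits, depth, width):
        """Inverse mapping back to codeword order."""
        return np.asarray(bits).reshape(width, depth).T.ravel()

    depth, width = 8, 16                      # 8 toy codewords of 16 bits each
    tx = np.random.default_rng(1).integers(0, 2, depth * width)
    channel = interleave(tx, depth, width)

    channel[40:48] ^= 1                       # an 8-bit error burst on the line
    rx = deinterleave(channel, depth, width)

    # After deinterleaving, the burst is spread out: each codeword carries at
    # most one flipped bit, which a modest FEC code can correct.
    errors_per_codeword = (rx != tx).reshape(depth, width).sum(axis=1)
    print(errors_per_codeword)
    ```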

  12. Provenance based data integrity checking and verification in cloud environments

    PubMed Central

    Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms a user’s data is moved into remotely located storage, so that users lose control over their data. This unique feature of the Cloud raises many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user’s data stored in the Cloud storage. The data in Clouds is not physically accessible to the users. Therefore, a mechanism is required through which users can check whether the integrity of their valuable data is maintained or compromised. For this purpose, methods such as mirroring, checksumming and the use of third-party auditors, amongst others, have been proposed. However, these methods either use extra storage space by maintaining multiple copies of data or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called “Data Provenance”. Our scheme reduces the need for third-party services, additional hardware support and the replication of data items on the client side for integrity checking. PMID:28545151

  13. Provenance based data integrity checking and verification in cloud environments.

    PubMed

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms a user's data is moved into remotely located storage, so that users lose control over their data. This unique feature of the Cloud raises many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is not physically accessible to the users. Therefore, a mechanism is required through which users can check whether the integrity of their valuable data is maintained or compromised. For this purpose, methods such as mirroring, checksumming and the use of third-party auditors, amongst others, have been proposed. However, these methods either use extra storage space by maintaining multiple copies of data or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme reduces the need for third-party services, additional hardware support and the replication of data items on the client side for integrity checking.
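
    The general idea of provenance-backed integrity checking can be sketched as a hash-chained log of write records that a client can replay: each record stores the digest of the data and of the previous record, so both tampering with the data and tampering with the history are detectable. This is a generic illustration under those assumptions, not the scheme proposed in the paper.

    ```python
    import hashlib, json, time

    def digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    provenance_log = []   # would live with (or be anchored outside) the cloud store

    def record_write(object_name: str, data: bytes, actor: str):
        """Append a provenance record that chains to the previous record."""
        prev = provenance_log[-1]["record_hash"] if provenance_log else "0" * 64
        record = {"object": object_name, "actor": actor, "time": time.time(),
                  "data_hash": digest(data), "prev": prev}
        record["record_hash"] = digest(json.dumps(record, sort_keys=True).encode())
        provenance_log.append(record)

    def verify(object_name: str, data: bytes) -> bool:
        """Check that the chain is intact and the latest recorded digest matches the data."""
        prev, latest = "0" * 64, None
        for rec in provenance_log:
            body = {k: v for k, v in rec.items() if k != "record_hash"}
            if rec["prev"] != prev or digest(json.dumps(body, sort_keys=True).encode()) != rec["record_hash"]:
                return False                  # the provenance chain was tampered with
            prev = rec["record_hash"]
            if rec["object"] == object_name:
                latest = rec
        return latest is not None and latest["data_hash"] == digest(data)

    record_write("report.pdf", b"v1 contents", actor="alice")
    print(verify("report.pdf", b"v1 contents"))   # True
    print(verify("report.pdf", b"tampered"))      # False: integrity violation detected
    ```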

  14. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    ERIC Educational Resources Information Center

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  15. The Single Soldier Quality of Life Initiative: Great Expectations of Privacy

    DTIC Science & Technology

    1995-04-01

    without regard to their marital status and to hold them accountable to established standards. To many "old soldiers," some of the ideas contained...Family Housing Office: assign and terminate quarters, conduct check-in and check-out inspections, maintain accountability of SQ furniture, follow up on...integrity is a second priority." Further hindering unit integrity is that smoking preference of the soldiers must be taken into account when making

  16. Comparative effectiveness and cost-effectiveness of the implantable miniature telescope.

    PubMed

    Brown, Gary C; Brown, Melissa M; Lieske, Heidi B; Lieske, Philip A; Brown, Kathryn S; Lane, Stephen S

    2011-09-01

    To assess the preference-based comparative effectiveness (human value gain) and the cost-utility (cost-effectiveness) of a telescope prosthesis (implantable miniature telescope) for the treatment of end-stage, age-related macular degeneration (AMD). A value-based medicine, second-eye model, cost-utility analysis was performed to quantify the comparative effectiveness and cost-effectiveness of therapy with the telescope prosthesis. Published, evidence-based data from the IMT002 Study Group clinical trial. Ophthalmic utilities were obtained from a validated cohort of >1000 patients with ocular diseases. Comparative effectiveness data were converted from visual acuity to utility (value-based) format. The incremental costs (Medicare) of therapy versus no therapy were integrated with the value gain conferred by the telescope prosthesis to assess its average cost-utility. The incremental value gains and incremental costs of therapy referent to (1) a fellow eye cohort and (2) a fellow eye cohort of those who underwent intra-study cataract surgery were integrated in incremental cost-utility analyses. All value outcomes and costs were discounted at a 3% annual rate, as per the Panel on Cost-Effectiveness in Health and Medicine. Comparative effectiveness was quantified using the (1) quality-adjusted life-year (QALY) gain and (2) percent human value gain (improvement in quality of life). The QALY gain was integrated with incremental costs into the cost-utility ratio ($/QALY, or US dollars expended per QALY gained). The mean, discounted QALY gain associated with use of the telescope prosthesis over 12 years was 0.7577. When the QALY loss of 0.0004 attributable to the adverse events was factored into the model, the final QALY gain was 0.7573. This resulted in a 12.5% quality of life gain for the average patient during the 12 years of the model. The average cost-utility versus no therapy for use of the telescope prosthesis was $14389/QALY. The incremental cost-utility referent to control fellow eyes was $14063/QALY, whereas the incremental cost-utility referent to fellow eyes that underwent intra-study cataract surgery was $11805/QALY. Therapy with the telescope prosthesis considerably improves quality of life and at the same time is cost-effective by conventional standards. Copyright © 2011 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  17. The incremental impact of cardiac MRI on clinical decision-making.

    PubMed

    Rajwani, Adil; Stewart, Michael J; Richardson, James D; Child, Nicholas M; Maredia, Neil

    2016-01-01

    Despite a significant expansion in the use of cardiac MRI (CMR), there is inadequate evaluation of its incremental impact on clinical decision-making over and above other well-established modalities. We sought to determine the incremental utility of CMR in routine practice. 629 consecutive CMR studies referred by 44 clinicians from 9 institutions were evaluated. Pre-defined algorithms were used to determine the incremental influence on diagnostic thinking, influence on clinical management and thus the overall clinical utility. Studies were also subdivided and evaluated according to the indication for CMR. CMR provided incremental information to the clinician in 85% of cases, with incremental influence on diagnostic thinking in 85% of cases and incremental impact on management in 42% of cases. The overall incremental utility of CMR exceeded 90% in 7 out of the 13 indications, whereas in settings such as the evaluation of unexplained ventricular arrhythmia or mild left ventricular systolic dysfunction, this was <50%. CMR was frequently able to inform and influence decision-making in routine clinical practice, even with analyses that accepted only incremental clinical information and excluded a redundant duplication of imaging. Significant variations in yield were noted according to the indication for CMR. These data support a wider integration of CMR services into cardiac imaging departments. These data are the first to objectively evaluate the incremental value of a UK CMR service in clinical decision-making. Such data are essential when seeking justification for a CMR service.

  18. Advanced development of the boundary element method for elastic and inelastic thermal stress analysis. Ph.D. Thesis, 1987 Final Report

    NASA Technical Reports Server (NTRS)

    Henry, Donald P., Jr.

    1991-01-01

    The focus of this dissertation is on advanced development of the boundary element method for elastic and inelastic thermal stress analysis. New formulations for the treatment of body forces and nonlinear effects are derived. These formulations, which are based on particular integral theory, eliminate the need for volume integrals or extra surface integrals to account for these effects. The formulations are presented for axisymmetric, two- and three-dimensional analysis. Also in this dissertation, two-dimensional and axisymmetric formulations for elastic and inelastic, inhomogeneous stress analysis are introduced. The derivations account for inhomogeneities due to spatially dependent material parameters and thermally induced inhomogeneities. The nonlinear formulations of the present work are based on an incremental initial-stress approach. Two inelastic solution algorithms are implemented: an iterative approach and a variable-stiffness approach. The von Mises yield criterion with variable hardening and the associated flow rules are adopted in these algorithms. All formulations are implemented in a general purpose, multi-region computer code with the capability of local definition of boundary conditions. Quadratic, isoparametric shape functions are used to model the geometry and field variables of the boundary (and domain) of the problem. The multi-region implementation permits a body to be modeled in substructured parts, thus dramatically reducing the cost of analysis. Furthermore, it allows a body consisting of regions of different (homogeneous) materials to be studied. To test the program, results obtained for simple test cases are checked against their analytic solutions. Thereafter, a range of problems of practical interest are analyzed. In addition to displacement and traction loads, problems with body forces due to self-weight, centrifugal, and thermal loads are considered.

  19. A design of LED adaptive dimming lighting system based on incremental PID controller

    NASA Astrophysics Data System (ADS)

    He, Xiangyan; Xiao, Zexin; He, Shaojia

    2010-11-01

    As a new-generation energy-saving lighting source, LED is applied widely in various technology and industry fields. The requirements on its adaptive lighting technology are increasingly rigorous, especially in automatic on-line detecting systems. In this paper, a closed-loop feedback LED adaptive dimming lighting system based on an incremental PID controller is designed, which consists of a MEGA16 chip as the Micro-controller Unit (MCU), the ambient light sensor BH1750 chip with an Inter-Integrated Circuit (I2C) interface, and a constant-current driving circuit. A given value of the light intensity required for the on-line detecting environment needs to be saved to a register of the MCU. The optical intensity, detected by the BH1750 chip in real time, is converted to a digital signal by the AD converter of the BH1750 chip and then transmitted to the MEGA16 chip through the I2C serial bus. Since the variation law of light intensity in the on-line detecting environment is usually not easy to establish, an incremental Proportional-Integral-Differential (PID) algorithm is applied in this system. The control variable obtained from the incremental PID determines the duty cycle of the Pulse-Width Modulation (PWM). Consequently, the LED's forward current is adjusted by PWM, and the luminous intensity of the detection environment is stabilized by self-adaptation. The coefficients of the incremental PID are obtained experimentally. Compared with a traditional LED dimming system, it has the advantages of interference rejection, simple construction, fast response, and high stability owing to the use of the incremental PID algorithm and the BH1750 chip with the I2C serial bus. Therefore, it is suitable for adaptive on-line detecting applications.
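
    The incremental (velocity-form) PID loop the abstract describes can be sketched as follows: the controller outputs a change in PWM duty cycle computed from the last three error samples, and the result is clamped. The gains, setpoint, scaling factor and print-based I/O are hypothetical placeholders, not the authors' firmware.

    ```python
    KP, KI, KD = 0.8, 0.2, 0.05   # hypothetical gains; real values come from tuning
    SETPOINT = 500.0              # target illuminance in lux (assumed)
    duty = 0.5                    # current PWM duty cycle in [0, 1]
    e_prev, e_prev2 = 0.0, 0.0    # last two error samples

    def pid_increment(error, e1, e2, kp=KP, ki=KI, kd=KD):
        """delta_u = Kp*(e_k - e_{k-1}) + Ki*e_k + Kd*(e_k - 2*e_{k-1} + e_{k-2})."""
        return kp * (error - e1) + ki * error + kd * (error - 2.0 * e1 + e2)

    def control_step(measured_lux):
        """One sampling period: update the duty cycle by the PID increment."""
        global duty, e_prev, e_prev2
        error = SETPOINT - measured_lux
        duty = min(1.0, max(0.0, duty + 1e-3 * pid_increment(error, e_prev, e_prev2)))
        e_prev2, e_prev = e_prev, error
        return duty

    print(control_step(100.0))   # large positive error: the duty cycle rises
    print(control_step(480.0))   # smaller error: a smaller (here negative) increment
    ```

    Because only the increment is computed, the scheme needs no explicit integrator state and a saturation of the output does not cause integral windup, which is one common reason for choosing the velocity form on small MCUs.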

  20. Identifying professionals' needs in integrating electronic pain monitoring in community palliative care services: An interview study.

    PubMed

    Taylor, Sally; Allsop, Matthew J; Bekker, Hilary L; Bennett, Michael I; Bewick, Bridgette M

    2017-07-01

    Poor pain assessment is a barrier to effective pain control. There is growing interest internationally in the development and implementation of remote monitoring technologies to enhance assessment in cancer and chronic disease contexts. Findings describe the development and testing of pain monitoring systems, but research identifying the needs of health professionals to implement routine monitoring systems within clinical practice is limited. To inform the development and implementation strategy of an electronic pain monitoring system, PainCheck, by understanding palliative care professionals' needs when integrating PainCheck into routine clinical practice. Qualitative study using face-to-face interviews. Data were analysed using framework analysis. Setting/participants: Purposive sample of health professionals managing the palliative care of patients living in the community. Results: A total of 15 interviews with health professionals took place. Three meta-themes emerged from the data: (1) uncertainties about integration of PainCheck and changes to current practice, (2) appraisal of current practice and (3) pain management is everybody's responsibility. Conclusion: Even the most sceptical of health professionals could see the potential benefits of implementing an electronic patient-reported pain monitoring system. Health professionals have reservations about how PainCheck would work in practice. For optimal use, PainCheck needs embedding within existing electronic health records. Electronic pain monitoring systems have the potential to enable professionals to support patients' pain management more effectively but only when barriers to implementation are appropriately identified and addressed.

  1. Recall intervals for oral health in primary care patients.

    PubMed

    Beirne, P; Forgie, A; Clarkson, Je; Worthington, H V

    2005-04-18

    The frequency with which patients should attend for a dental check-up and the potential effects on oral health of altering recall intervals between check-ups have been the subject of ongoing international debate for almost 3 decades. Although recommendations regarding optimal recall intervals vary between countries and dental healthcare systems, 6-monthly dental check-ups have traditionally been advocated by general dental practitioners in many developed countries. To determine the beneficial and harmful effects of different fixed recall intervals (for example 6 months versus 12 months) for the following different types of dental check-up: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus preventive advice plus scale and polish. To determine the relative beneficial and harmful effects between any of these different types of dental check-up at the same fixed recall interval. To compare the beneficial and harmful effects of recall intervals based on clinicians' assessment of patients' disease risk with fixed recall intervals. To compare the beneficial and harmful effects of no recall interval/patient driven attendance (which may be symptomatic) with fixed recall intervals. We searched the Cochrane Oral Health Group Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and EMBASE. Reference lists from relevant articles were scanned and the authors of some papers were contacted to identify further trials and obtain additional information. Date of most recent searches: 9th April 2003. Trials were selected if they met the following criteria: design- random allocation of participants; participants - all children and adults receiving dental check-ups in primary care settings, irrespective of their level of risk for oral disease; interventions -recall intervals for the following different types of dental check-ups: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus scale and polish plus preventive advice; e) no recall interval/patient driven attendance (which may be symptomatic); f) clinician risk-based recall intervals; outcomes - clinical status outcomes for dental caries (including, but not limited to, mean dmft/DMFT, dmfs/DMFS scores, caries increment, filled teeth (including replacement restorations), early carious lesions arrested or reversed); periodontal disease (including, but not limited to, plaque, calculus, gingivitis, periodontitis, change in probing depth, attachment level); oral mucosa (presence or absence of mucosal lesions, potentially malignant lesions, cancerous lesions, size and stage of cancerous lesions at diagnosis). In addition the following outcomes were considered where reported: patient-centred outcomes, economic cost outcomes, other outcomes such as improvements in oral health knowledge and attitudes, harms, changes in dietary habits and any other oral health-related behavioural change. Information regarding methods, participants, interventions, outcome measures and results were independently extracted, in duplicate, by two authors. Authors were contacted, where deemed necessary and where possible, for further details regarding study design and for data clarification. A quality assessment of the included trial was carried out. The Cochrane Oral Health Group's statistical guidelines were followed. 
Only one study (with 188 participants) was included in this review and was assessed as having a high risk of bias. This study provided limited data for dental caries outcomes (dmfs/DMFS increment) and economic cost outcomes (reported time taken to provide examinations and treatment). There is insufficient evidence from randomised controlled trials (RCTs) to draw any conclusions regarding the potential beneficial and harmful effects of altering the recall interval between dental check-ups. There is insufficient evidence to support or refute the practice of encouraging patients to attend for dental check-ups at 6-monthly intervals. It is important that high quality RCTs are conducted for the outcomes listed in this review in order to address the objectives of this review.

  2. 42 CFR 455.19 - Provider's statement on check.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Provider's statement on check. 455.19 Section 455.19 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS PROGRAM INTEGRITY: MEDICAID Medicaid Agency Fraud Detection and...

  3. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  4. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  5. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  6. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  7. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  8. Cones of localized shear strain in incompressible elasticity with prestress: Green's function and integral representations

    PubMed Central

    Argani, L. P.; Bigoni, D.; Capuani, D.; Movchan, N. V.

    2014-01-01

    The infinite-body three-dimensional Green's function set (for incremental displacement and mean stress) is derived for the incremental deformation of a uniformly strained incompressible, nonlinear elastic body. Particular cases of the developed formulation are the Mooney–Rivlin elasticity and the J2-deformation theory of plasticity. These Green's functions are used to develop a boundary integral equation framework, by introducing an ad hoc potential, which paves the way for a boundary element formulation of three-dimensional problems of incremental elasticity. Results are used to investigate the behaviour of a material deformed near the limit of ellipticity and to reveal patterns of shear failure. In fact, within the investigated three-dimensional framework, localized deformations emanating from a perturbation are shown to be organized in conical geometries rather than in planar bands, so that failure is predicted to develop through curved and thin surfaces of intense shearing, as can for instance be observed in the cup–cone rupture of ductile metal bars. PMID:25197258

  9. Progressive simplification and transmission of building polygons based on triangle meshes

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu

    2010-11-01

    Digital earth is a virtual representation of our planet and a data integration platform which aims at harnessing multi-source, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multi-resolution data, from coarse to fine, as key scales plus the increments between them, which is not available in the traditional generalization framework. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons are visualized with more detail.
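
    The coarse-base-plus-increments pattern can be sketched for a single polygon outline: vertices are removed one at a time by a smallest-triangle-area (Visvalingam-style) criterion, each removal is recorded as an increment, and the client replays the records in reverse to refine progressively. This is a generic illustration of progressive simplification and transmission, not the paper's triangle-mesh edge-collapse/vertex-split encoding.

    ```python
    def tri_area(a, b, c):
        return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

    def simplify(polygon, min_vertices=4):
        """Return (coarse_polygon, increments); each increment is (index, vertex)."""
        pts, increments = list(polygon), []
        while len(pts) > min_vertices:
            n = len(pts)
            # Remove the vertex whose deletion distorts the outline least.
            i = min(range(n), key=lambda k: tri_area(pts[k - 1], pts[k], pts[(k + 1) % n]))
            increments.append((i, pts.pop(i)))
        return pts, increments

    def refine(coarse, increments, count=None):
        """Replay the last `count` removals in reverse to restore detail."""
        pts = list(coarse)
        for i, v in reversed(increments if count is None else increments[-count:]):
            pts.insert(i, v)
        return pts

    square_with_notch = [(0, 0), (4, 0), (4, 2), (3, 2), (3, 3), (4, 3), (4, 4), (0, 4)]
    coarse, incs = simplify(square_with_notch)
    print(coarse)                    # coarse base map, transmitted first
    print(refine(coarse, incs))      # fully refined once all increments arrive
    ```

    Because the removals are undone in exact reverse order, partial replay of the most recent increments yields valid intermediate levels of detail, which is what makes the transmission progressive.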

  10. Preventing mental health problems in children: the Families in Mind population-based cluster randomised controlled trial

    PubMed Central

    2012-01-01

    Background Externalising and internalising problems affect one in seven school-aged children and are the single strongest predictor of mental health problems into early adolescence. As the burden of mental health problems persists globally, childhood prevention of mental health problems is paramount. Prevention can be offered to all children (universal) or to children at risk of developing mental health problems (targeted). The relative effectiveness and costs of a targeted only versus combined universal and targeted approach are unknown. This study aims to determine the effectiveness, costs and uptake of two approaches to early childhood prevention of mental health problems, i.e. a Combined universal-targeted approach versus a Targeted-only approach, in comparison to current primary care services (Usual care). Methods/design Three-armed, population-level cluster randomised trial (2010–2014) within the universal, well child Maternal Child Health system, attended by more than 80% of families in Victoria, Australia at infant age eight months. Participants were families of eight-month-old children from nine participating local government areas. Randomised to one of three groups: Combined, Targeted or Usual care. The interventions comprise (a) the Combined universal and targeted program where all families are offered the universal Toddlers Without Tears group parenting program followed by the targeted Family Check-Up one-on-one program or (b) the Targeted Family Check-Up program. The Family Check-Up program is only offered to children at risk of behavioural problems. Participants will be analysed according to the trial arm to which they were randomised, using logistic and linear regression models to compare primary and secondary outcomes. An economic evaluation (cost consequences analysis) will compare incremental costs to all incremental outcomes from a societal perspective. Discussion This trial will inform public health policy by making recommendations about the effectiveness and cost-effectiveness of these early prevention programs. If effective prevention programs can be implemented at the population level, the growing burden of mental health problems could be curbed. Trial registration ISRCTN61137690 PMID:22682229

  11. A study of the use of abstract types for the representation of engineering units in integration and test applications

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    Physical quantities using various units of measurement can be well represented in Ada by the use of abstract types. Computation involving these quantities (electric potential, mass, volume) can also automatically invoke the computation and checking of some of the implicitly associable attributes of measurements. Quantities can be held internally in SI units, transparently to the user, with automatic conversion. Through dimensional analysis, the type of the derived quantity resulting from a computation is known, thereby allowing dynamic checks of the equations used. The impact of the possible implementation of these techniques in integration and test applications is discussed. The overhead of computing and transporting measurement attributes is weighed against the advantages gained by their use. The construction of a run time interpreter using physical quantities in equations can be aided by the dynamic equation checks provided by dimensional analysis. The effects of high levels of abstraction on the generation and maintenance of software used in integration and test applications are also discussed.
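
    A compact way to see the dimensional-analysis idea (sketched here in Python rather than the Ada abstract types the abstract discusses) is a quantity type that carries SI base-unit exponents: multiplication and division combine the exponents, while addition is rejected at run time unless the dimensions match. The Quantity class and the four-exponent tuple below are illustrative choices, not the paper's Ada packages.

    ```python
    class Quantity:
        """A value tagged with SI base-dimension exponents (m, kg, s, A)."""

        def __init__(self, value, dims):
            self.value, self.dims = value, dims

        def __mul__(self, other):
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

        def __truediv__(self, other):
            return Quantity(self.value / other.value,
                            tuple(a - b for a, b in zip(self.dims, other.dims)))

        def __add__(self, other):
            if self.dims != other.dims:
                raise TypeError("dimension mismatch: %s vs %s" % (self.dims, other.dims))
            return Quantity(self.value + other.value, self.dims)

        def __repr__(self):
            return "Quantity(%g, dims=%s)" % (self.value, self.dims)

    metre = Quantity(1.0, (1, 0, 0, 0))
    second = Quantity(1.0, (0, 0, 1, 0))

    speed = Quantity(10.0, (1, 0, 0, 0)) / second   # 10 m over 1 s, dims (1, 0, -1, 0)
    print(speed)
    try:
        speed + metre                               # dimensionally inconsistent equation
    except TypeError as err:
        print("rejected:", err)
    ```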

  12. Summary of the Science Performed Onboard the International Space Station within the United States Orbital Segment during Increments 16 and 17

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Istasse, Eric; Stenuit, Hilde; Murakami, Jeiji; Yoshizaki, Izumi; Johnson-Green, Perry

    2008-01-01

    With the launch of STS-122 on February 7, 2008, which delivered the European Columbus science module, and the upcoming STS-124 flight, which will deliver the Japanese Kibo science module in May 2008, the International Space Station will become truly international, with Europe and Japan joining the United States of America and Russia to perform science on a continuous basis in a wide spectrum of science disciplines. The last science module, Kibo, of the United States Orbital Segment (USOS) will be mated to the station in time to celebrate its first decade in low Earth orbit in October 2008 (end of Increment 17), thus ushering in the second decade of the station with all the USOS science modules mated and performing science. The arrival of the Kibo science module will also mark continuous human presence on the station for eighty-eight (88) months, and, with the addition of the ESA science module during the STS-122 flight, the USOS will be made up of four space agencies: CSA, ESA, JAXA and NASA, spanning three continents. With the additional partners coming onboard with different research needs, every effort is being made to coordinate science across the USOS segment in an integrated manner for the benefit of all parties. One of the objectives of this paper is to discuss the integrated manner in which science planning/replanning and prioritization during the execution phase of an increment is being done. The main focus, though, of this paper is to summarize and to discuss the science performed during Increments 16 and 17 (October 2007 to October 2008). The discussion will focus mainly on the primary objectives of each investigation and their associated hypotheses that were investigated during these two Increments. Also, preliminary science results will be discussed for each of the investigations as the availability of science results permits. Additionally, the paper will briefly touch on what the science complement for these two increments was and what was actually accomplished due to real time science implementation and constraints. Finally, the paper will briefly discuss the science research complements for the next three Increments, Increments 18 to 20, in order to preview how much science might be accomplished during these three upcoming Increments in the station's next decade.

  13. MultiLIS: A Description of the System Design and Operational Features.

    ERIC Educational Resources Information Center

    Kelly, Glen J.; And Others

    1988-01-01

    Describes development, hardware requirements, and features of the MultiLIS integrated library software package. A system profile provides pricing information, operational characteristics, and technical specifications. Sidebars discuss MultiLIS integration structure, incremental architecture, and NCR Tower Computers. (4 references) (MES)

  14. Effects of simulated microgravity on otoliths growth and microstructure of Larval Zebrafish, Danio rerio

    NASA Astrophysics Data System (ADS)

    Li, Xiaoyan; Wang, Gaohong; Liu, Yongding

    2012-07-01

    The otolith is a vestibular end organ that takes part in gravitational signal initiation. Environmental change can leave a mark on otolith microstructure. In this study, we used zebrafish from the embryo stage of 10 hpf to the middle larval stage of 12 dpf to investigate the effect of microgravity on otolith development. It was found that otolith size in the microgravity group was larger than in the control before 6 dpf, but after that both groups kept nearly the same size. Surface scanning of otolith morphology with SEM showed that otoliths of the microgravity group were much smoother than those of the control. After etching with HCl, we found that both groups formed daily increments, but the microgravity group lacked clear check marks at some developmental stages. Increment widths were wider, and granule shape was much sharper, in the microgravity group. Analysis of crystal orientation disclosed that the increments of the microgravity group formed irregularly. The surface etched with PKb also exhibited different granule size and orientation: the granules in the control had nearly the same size and direction, while the particles in the microgravity group were smaller and oriented differently along the translucent ring. Organic remnants were also found between layers in the microgravity group. These results suggest that microgravity can affect otolith development; the composition and structural mode of the inorganic and organic parts change with the gravitational environment, which may be involved in orientation adjustment and in Space Movement Sickness (SMS).

  15. 42 CFR 424.518 - Screening levels for Medicare providers and suppliers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...

  16. 42 CFR 424.518 - Screening levels for Medicare providers and suppliers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...

  17. 42 CFR 424.518 - Screening levels for Medicare providers and suppliers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Requires the submission of a set of fingerprints for a national background check from all individuals who...) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who maintain a 5 percent or greater...

  18. 42 CFR 424.518 - Screening levels for Medicare providers and suppliers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... this section. (ii)(A) Requires the submission of a set of fingerprints for a national background check... provider or supplier; and (B) Conducts a fingerprint-based criminal history record check of the Federal Bureau of Investigation's Integrated Automated Fingerprint Identification System on all individuals who...

  19. An Integrated Approach to Functional Engineering: An Engineering Database for Harness, Avionics and Software

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni

    2012-08-01

    In the design and development phase of a new program, one of the critical aspects is the integration of all the functional requirements of the system and the control of the overall consistency between the identified needs on one side and the available resources on the other side, especially when both the required needs and the available resources are not yet consolidated but are evolving as the program maturity increases. The Integrated Engineering Harness Avionics and Software database (IDEHAS) is a tool that has been developed to support this process in the frame of the Avionics and Software disciplines through the different phases of the program. The tool is in fact designed to allow an incremental build-up of the avionics and software systems, from the description of the high-level architectural data (available in the early stages of the program) to the definition of the pin-to-pin connectivity information (typically consolidated in the design finalization stages) and finally to the construction and validation of the detailed telemetry parameters and commands to be used in the test phases and in the Mission Control Centre. The key feature of this approach and of the associated tool is that it allows the definition and the maintenance/update of all these data in a single, consistent environment. On one side, a system-level and concurrent approach requires the ability to easily integrate and update the best data available from the early stages of a program, in order to improve confidence in the consistency and to control the design information. On the other side, the amount of information of different typologies and the cross-relationships among the data imply highly consolidated structures requiring many checks to guarantee the data content consistency, with negative effects on simplicity and flexibility, often limiting the attention to special needs and to the interfaces with other disciplines.

  20. Testing Instrument for Flight-Simulator Displays

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1987-01-01

    Displays for flight-training simulators rapidly aligned with aid of integrated optical instrument. Calibrations and tests such as aligning boresight of display with respect to user's eyes, checking and adjusting display horizon, checking image sharpness, measuring illuminance of displayed scenes, and measuring distance of optical focus of scene performed with single unit. New instrument combines all measurement devices in single, compact, integrated unit. Requires just one initial setup. Employs laser and produces narrow, collimated beam for greater measurement accuracy. Uses only one moving part, double right prism, to position laser beam.

  1. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  2. FERC examines transmission access pricing policies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burkhart, L.A.

    1993-02-15

    The Federal Energy Regulatory Commission (FERC) has approved two orders dealing with transmission pricing issues that evolved from its March 1992 Pennsylvania Electric Co. (Penelec) decision dealing with cost recovery of transmission plant expansion. Commissioner Betsy Moler said the cases represented the first time the FERC has used incremental rates for wheeling transactions. In the first new order, Public Service Electric & Gas Co. proposed charging for third-party transmission services to recover the sum of its standard embedded cost rate and the net incremental cost of making upgrades to its integrated transmission system. Public Service sought to provide service to EEA I Limited, which proposed building and operating a cogeneration facility at a United States Gypsum Co. plant in New Jersey. Consolidated Edison Company of New York Inc. had agreed to purchase the project's output. In rejecting Public Service's proposal, the FERC ruled the utility must charge a transmission rate that is the higher of either its embedded cost rate or its incremental cost rate of accelerated expansion. (In this instance, the incremental rate was higher.) In the second new order, Public Service Co. of Colorado (PSCC) filed a proposed open access transmission service tariff related to its acquisition of facilities from Colorado-Ute Electric Association Inc. The FERC rejected PSCC's tariff request that would have required a new transmission customer (in the event PSCC modified its integrated transmission grid) to pay the sum of PSCC's standard transmission rate (reflecting the average cost of the grid facilities) plus the expansion cost (reflecting the incremental facility cost).

  3. Autonomic Recovery: HyperCheck: A Hardware-Assisted Integrity Monitor

    DTIC Science & Technology

    2013-08-01

    system (OS). HyperCheck leverages the CPU System Management Mode (SMM), present in x86 systems, to securely generate and transmit the full state of the...HyperCheck harnesses the CPU System Management Mode (SMM), which is present in all x86 commodity systems, to create a snapshot view of the current state of the...protect the software above it. Our assumptions are that the attacker does not have physical access to the machine and that the SMM BIOS is locked and

  4. Building a vision for the future: strategic planning in a shared governance nursing organization.

    PubMed

    Baker, C; Beglinger, J E; Bowles, K; Brandt, C; Brennan, K M; Engelbaugh, S; Hallock, T; LaHam, M

    2000-06-01

    Today's health care delivery environment is marked by extreme turbulence and ever-increasing complexity. Now, more than ever, an organization's strategic plan must do more than outline a business plan. Rather, the strategic plan is a fundamental tool for building and sustaining an organizational vision for the future. The strong, dynamic strategic plan (1) represents a long-range vision for improving organizational performance, (2) provides a model for planning and implementing structures and processes for the management of outcomes, (3) reflects and shapes the organizational culture and customer focus, (4) provides decision support for difficult operational choices made day to day, and (5) integrates and aligns the work of the organization. This article describes the development, implementation, and evaluation of a methodology for strategic planning within a shared governance nursing organization. Built upon the strategic plan of the hospital, the process undertaken by the nursing organization reflects the following commitments: (1) to develop a strategic plan that is meaningful and part of daily work life at all levels of the nursing organization, (2) to make the plan practical and realistic through incremental building, (3) to locate and articulate accountability for each step, and (4) to build in a process for checking progress toward goal achievement and readjusting the plan as necessary.

  5. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse-kinematics-based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics and the robotic manipulator is moved incrementally from its current configuration, subject to the joint speed limits. This approach effectively eliminates the multiple solutions in the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
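
    The incremental inverse-kinematics idea can be sketched for a planar two-link arm: at each control tick the joints are stepped from their current configuration toward the latest estimated target position, with the step clamped to the joint speed limits. The link lengths, gains, limits and the fixed target point are illustrative placeholders, not the paper's manipulator model or vision pipeline.

    ```python
    import numpy as np

    L1, L2 = 1.0, 0.8      # link lengths [m] (assumed)
    DT = 0.02              # control period [s]
    QDOT_MAX = 0.5         # joint speed limit [rad/s]

    def forward(q):
        """End-effector position of a planar two-link arm."""
        x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
        y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
        return np.array([x, y])

    def jacobian(q):
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    def ik_step(q, target_xy):
        """One incremental IK update: pseudo-inverse step, then speed clamping."""
        err = target_xy - forward(q)
        dq = np.linalg.pinv(jacobian(q)) @ err            # desired joint step
        dq = np.clip(dq, -QDOT_MAX * DT, QDOT_MAX * DT)   # respect joint speed limits
        return q + dq

    q = np.array([0.3, 0.6])
    target = np.array([1.2, 0.9])        # e.g. a predicted grasp point on the target
    for _ in range(400):                 # in practice, the target is re-estimated each tick
        q = ik_step(q, target)
    print(forward(q), "->", target)      # the end-effector converges toward the target
    ```

    Stepping from the current configuration, rather than solving the full inverse kinematics from scratch, is what avoids jumping between multiple IK solution branches.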

  6. Astronaut Joseph Tanner checks gloves during launch/entry training

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Astronaut Joseph R. Tanner, mission specialist, checks his gloves during a rehearsal for the launch and entry phases of the scheduled November 1994 flight of STS-66. This rehearsal, held in the crew compartment trainer (CCT) of JSC's Shuttle mockup and integration laboratory, was followed by a training session on emergency egress procedures.

  7. Ice hockey shoulder pad design and the effect on head response during shoulder-to-head impacts.

    PubMed

    Richards, Darrin; Ivarsson, B Johan; Scher, Irving; Hoover, Ryan; Rodowicz, Kathleen; Cripton, Peter

    2016-11-01

    Ice hockey body checks involving direct shoulder-to-head contact frequently result in head injury. In the current study, we examined the effect of shoulder pad style on the likelihood of head injury from a shoulder-to-head check. Shoulder-to-head body checks were simulated by swinging a modified Hybrid-III anthropomorphic test device (ATD) with and without shoulder pads into a stationary Hybrid-III ATD at 21 km/h. Tests were conducted with three different styles of shoulder pads (traditional, integrated and tethered) and without shoulder pads for the purpose of control. Head response kinematics for the stationary ATD were measured. Compared to the case of no shoulder pads, the three different pad styles significantly (p < 0.05) reduced peak resultant linear head accelerations of the stationary ATD by 35-56%. The integrated shoulder pads reduced linear head accelerations by an additional 18-21% beyond the other two styles of shoulder pads. The data presented here suggest that shoulder pads can be designed to help protect the head of the struck player in a shoulder-to-head check.

  8. Continuous optical monitoring of a near-shore sea-water column

    NASA Astrophysics Data System (ADS)

    Bensky, T. J.; Neff, B.

    2006-12-01

    Cal Poly San Luis Obispo runs the Central Coast Marine Sciences Center, a south-facing, 1-km-long pier in San Luis Bay, on the west coast of California, midway between Los Angeles and San Francisco. The facility is secure and dedicated to marine science research. We have constructed an automated optical profiling system that collects sunlight samples, in half-foot increments, from a 30-foot vertical column of sea-water below the pier. Our implementation lowers a high quality, optically pure fiber cable into the water at 30-minute intervals. Light collected by the submersed fiber aperture is routed to the pier surface where it is spectrally analyzed using an Ocean Optics HR2000 spectrometer. The spectrometer instantly yields the spectrum of the light collected at a given depth. The "spectrum" here is light intensity as a function of wavelength between 200 and 1100 nm in increments of 0.1 nm. Each dive of the instrument takes approximately 80 seconds, lowers the fiber from the surface to a depth of 30 feet, and yields approximately 60 spectra, each one taken at a successively greater depth. A computer logs each spectrum as a function of depth. From such data, we are able to extract total downward photon flux, quantify ocean color, and compute attenuation coefficients. The system is entirely autonomous, includes an integrated data-browser, and can be checked on, or even controlled, over the Internet using a web browser. Linux runs the computer, data is logged directly to a mySQL database for easy extraction, and a PHP script ties the system together. Current work involves studying light-energy deposition trends and effects of surface action on downward photon flux. This work has been funded by the Office of Naval Research (ONR) and the California Central Coast Research Park Initiative (C3RP).
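
    One common way to extract an attenuation coefficient from such depth-resolved profiles is a straight-line fit under a Beer-Lambert assumption: at a fixed wavelength, irradiance decays as I(z) = I0 * exp(-Kd * z), so the slope of ln(I) versus depth gives -Kd. The synthetic profile and noise level below are illustrative, not measurements from the pier system.

    ```python
    import numpy as np

    depths_m = np.arange(0.0, 9.0, 0.15)              # roughly half-foot sampling steps
    true_kd = 0.25                                    # 1/m, synthetic "truth"
    irradiance = 1.8 * np.exp(-true_kd * depths_m)    # synthetic single-wavelength profile
    irradiance *= 1.0 + 0.02 * np.random.default_rng(0).standard_normal(depths_m.size)

    slope, intercept = np.polyfit(depths_m, np.log(irradiance), 1)
    kd_estimate = -slope                              # diffuse attenuation coefficient
    print(f"estimated Kd = {kd_estimate:.3f} 1/m (true {true_kd})")
    ```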

  9. An efficient incremental learning mechanism for tracking concept drift in spam filtering

    PubMed Central

    Sheu, Jyh-Jian; Chu, Ko-Tsung; Li, Nien-Feng; Lee, Cheng-Chi

    2017-01-01

    This research conducts an in-depth analysis of the characteristics of spam and proposes an efficient spam filtering method able to adapt to a dynamic environment. We focus on the analysis of the email's header and apply a decision tree data mining technique to look for association rules about spam. We then propose an efficient, systematic filtering method based on these association rules. Our method has the following major advantages: (1) It checks only the header sections of emails, unlike current spam filtering methods that must fully analyze the email's content, while filtering accuracy is expected to be enhanced. (2) To deal with concept drift, we propose a window-based technique that estimates the degree of concept drift for each unknown email, which helps our filtering method recognize the occurrence of spam. (3) We propose an incremental learning mechanism for our filtering method to strengthen its ability to adapt to a dynamic environment. PMID:28182691
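
    The window-based concept-drift check and incremental learning step are described only at a high level in the abstract. The following is a minimal, hedged sketch of one way such a check could work, not the authors' algorithm: a sliding window tracks the filter's recent error rate on labelled feedback, and a sustained rise relative to a reference rate is treated as possible drift, triggering an incremental update of the association rules. All names and thresholds are illustrative.

      from collections import deque

      class DriftMonitor:
          def __init__(self, window_size=200, drift_factor=1.5):
              self.window = deque(maxlen=window_size)  # 1 = misclassified, 0 = correct
              self.reference_error = None
              self.drift_factor = drift_factor

          def update(self, was_error):
              """Record one labelled outcome; return True if drift is suspected."""
              self.window.append(1 if was_error else 0)
              if len(self.window) < self.window.maxlen:
                  return False  # not enough evidence yet
              current_error = sum(self.window) / len(self.window)
              if self.reference_error is None:
                  self.reference_error = current_error
                  return False
              if current_error > self.drift_factor * max(self.reference_error, 1e-6):
                  self.reference_error = current_error  # adopt the new regime
                  return True  # signal: incrementally retrain / update association rules
              return False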

  10. Variational formulation for dissipative continua and an incremental J-integral

    NASA Astrophysics Data System (ADS)

    Rahaman, Md. Masiur; Dhas, Bensingh; Roy, D.; Reddy, J. N.

    2018-01-01

    Our aim is to rationally formulate a proper variational principle for dissipative (viscoplastic) solids in the presence of inertia forces. As a first step, a consistent linearization of the governing nonlinear partial differential equations (PDEs) is carried out. An additional set of complementary (adjoint) equations is then formed to recover an underlying variational structure for the augmented system of linearized balance laws. This makes it possible to introduce an incremental Lagrangian such that the linearized PDEs, including the complementary equations, become the Euler-Lagrange equations. Continuous groups of symmetries of the linearized PDEs are computed and an analysis is undertaken to identify the variational groups of symmetries of the linearized dissipative system. Application of Noether's theorem leads to the conservation laws (conserved currents) of motion corresponding to the variational symmetries. As a specific outcome, we exploit translational symmetries of the functional in the material space and recover, via Noether's theorem, an incremental J-integral for viscoplastic solids in the presence of inertia forces. Numerical demonstrations are provided through a two-dimensional plane strain simulation of a compact tension specimen of annealed mild steel under dynamic loading.
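
    For orientation, the classical quasi-static, rate-independent J-integral is reproduced below; the paper's contribution is an incremental generalization of this quantity to viscoplastic solids with inertia, whose exact form is not given in the abstract.

      % Classical path-independent J-integral over a contour \Gamma around the crack tip:
      % W is the strain energy density, t_i = \sigma_{ij} n_j the traction on \Gamma,
      % u_i the displacement, and x the direction of crack advance.
      J = \int_{\Gamma} \left( W \, \mathrm{d}y - t_i \, \frac{\partial u_i}{\partial x} \, \mathrm{d}s \right)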

  11. Deficits in Thematic Integration Processes in Broca's and Wernicke's Aphasia

    ERIC Educational Resources Information Center

    Nakano, Hiroko; Blumstein, Sheila E.

    2004-01-01

    This study investigated how normal subjects and Broca's and Wernicke's aphasics integrate thematic information incrementally using syntax, lexical-semantics, and pragmatics in a simple active declarative sentence. Three priming experiments were conducted using an auditory lexical decision task in which subjects made a lexical decision on a…

  12. The dynamics of a shear band

    NASA Astrophysics Data System (ADS)

    Giarola, Diana; Capuani, Domenico; Bigoni, Davide

    2018-03-01

    A shear band of finite length, formed inside a ductile material at a certain stage of a continued homogeneous strain, provides a dynamic perturbation to an incident wave field, which strongly influences the dynamics of the material and affects its path to failure. The investigation of this perturbation is presented for a ductile metal, with reference to the incremental mechanics of a material obeying the J2-deformation theory of plasticity (a special form of prestressed, elastic, anisotropic, and incompressible solid). The treatment originates from the derivation of integral representations relating the incremental mechanical fields at every point of the medium to the incremental displacement jump across the shear band faces, generated by an impinging wave. The boundary integral equations (under the plane strain assumption) are numerically approached through a collocation technique, which takes into account the singularity at the shear band tips and permits the analysis of an incident wave impinging on a shear band. It is shown that the presence of the shear band induces a resonance, visible in the incremental displacement field and in the stress intensity factor at the shear band tips, which promotes shear band growth. Moreover, the waves scattered by the shear band are shown to generate a fine texture of vibrations, parallel to the shear band line and propagating at a long distance from it, but leaving a sort of conical shadow zone, which emanates from the tips of the shear band.

  13. Body Covering and Body Image: A Comparison of Veiled and Unveiled Muslim Women, Christian Women, and Atheist Women Regarding Body Checking, Body Dissatisfaction, and Eating Disorder Symptoms.

    PubMed

    Wilhelm, Leonie; Hartmann, Andrea S; Becker, Julia C; Kişi, Melahat; Waldorf, Manuel; Vocks, Silja

    2018-02-21

    Although Islam is the fastest growing religion worldwide, only few studies have investigated body image in Muslim women, and no study has investigated body checking. Therefore, the present study examined whether body image, body checking, and disordered eating differ between veiled and unveiled Muslim women, Christian women, and atheist women. While the groups did not differ regarding body dissatisfaction, unveiled Muslim women reported more body checking than veiled Muslim and Christian women, and higher bulimia scores than Christian women. Thus, eating disorder prevention should include all women, irrespective of religious affiliation or veiling, with a particular focus on unveiled Muslim women.

  14. Poster — Thur Eve — 17: In-phantom and Fluence-based Measurements for Quality Assurance of Volumetric-driven Adaptation of Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaly, B; Hoover, D; Mitchell, S

    2014-08-15

    During volumetric modulated arc therapy (VMAT) of head and neck cancer, some patients lose weight which may result in anatomical deviations from the initial plan. If these deviations are substantial a new treatment plan can be designed for the remainder of treatment (i.e., adaptive planning). Since the adaptive treatment process is resource intensive, one possible approach to streamlining the quality assurance (QA) process is to use the electronic portal imaging device (EPID) to measure the integrated fluence for the adapted plans instead of the currently-used ArcCHECK device (Sun Nuclear). Although ArcCHECK is recognized as the clinical standard for patient-specific VMAT plan QA, it has limited length (20 cm) for most head and neck field apertures and has coarser detector spacing than the EPID (10 mm vs. 0.39 mm). In this work we compared measurement of the integrated fluence using the EPID with corresponding measurements from the ArcCHECK device. In the past year nine patients required an adapted plan. Each of the plans (the original and adapted) is composed of two arcs. Routine clinical QA was performed using the ArcCHECK device, and the same plans were delivered to the EPID (individual arcs) in integrated mode. The dose difference between the initial plan and adapted plan was compared for ArcCHECK and EPID. In most cases, it was found that the EPID is more sensitive in detecting plan differences. Therefore, we conclude that EPID provides a viable alternative for QA of the adapted head and neck plans and should be further explored.

  15. STS-97 Mission Specialist Tanner during pre-pack and fit check

    NASA Technical Reports Server (NTRS)

    2000-01-01

    STS-97 Mission Specialist Joseph Tanner gets help with his boots from suit technician Erin Canlon during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.

  16. ExoMars Raman Laser Spectrometer scientific required performances check with a Breadboard

    NASA Astrophysics Data System (ADS)

    Moral, A.; Díaz, E.; Ramos, G.; Rodríguez Prieto, J. A.; Pérez Canora, C.; Díaz, C.; Canchal, R.; Gallego, P.; Santamaría, P.; Colombo, M.

    2013-09-01

    The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments on the ExoMars mission, within ESA's Aurora Exploration Program. To verify that the instrument can achieve its scientific objectives, a breadboard campaign was carried out to bring the instrument to TRL5. Within the instrument TRL5 plan, each unit was required to develop its own unit breadboard, verify its own TRL5, and then deliver it to the system team to be integrated and tested, finally checking the instrument performances.

  17. Single-pass incremental force updates for adaptively restrained molecular dynamics.

    PubMed

    Singh, Krishna Kant; Redon, Stephane

    2018-03-30

    Adaptively restrained molecular dynamics (ARMD) allows users to perform more integration steps in wall-clock time by switching positional degrees of freedom on and off. This article presents new single-pass incremental force update algorithms to efficiently simulate a system using ARMD. We assessed different algorithms with speedup measurements and implemented them in the LAMMPS MD package. We validated the single-pass incremental force update algorithm on four different benchmarks using diverse pair potentials. The proposed algorithm allows us to simulate a system faster than traditional MD in both NVE and NVT ensembles. Moreover, ARMD using the new single-pass algorithm speeds up the convergence of observables in wall-clock time. © 2017 Wiley Periodicals, Inc.
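
    The abstract does not spell out the single-pass update, but the underlying idea of adaptively restrained dynamics suggests the following hedged sketch (not the LAMMPS implementation): force contributions of pairs whose particles are both restrained are reused from a cache, and only pairs touching at least one active particle are recomputed, in one pass over the neighbor list. All names are illustrative.

      import numpy as np

      def incremental_forces(pos, pairs, active, cached_pair_force, pair_force_fn, forces):
          """pos: (N,3) positions; pairs: iterable of (i, j); active: boolean mask;
          cached_pair_force: dict {(i, j): np.array(3)} from the previous step;
          pair_force_fn(ri, rj) -> force on i due to j; forces: (N,3) running totals."""
          for (i, j) in pairs:
              if not (active[i] or active[j]):
                  continue  # both particles restrained: cached contribution is still valid
              old = cached_pair_force.get((i, j), np.zeros(3))
              new = pair_force_fn(pos[i], pos[j])
              delta = new - old
              forces[i] += delta
              forces[j] -= delta          # Newton's third law
              cached_pair_force[(i, j)] = new
          return forces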

  18. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

    This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for Civil Aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardous misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
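
    A hedged sketch of the basic consistency test described above (not the flight code): the terrain profile synthesized from DGPS altitude minus the radar-altimeter height is differenced against the DEM profile along the flight path, and a simple disparity statistic raises an integrity alert when it exceeds a threshold. The statistic and threshold here are illustrative.

      import numpy as np

      def dem_integrity_check(dgps_alt_m, radar_agl_m, dem_elev_m, alert_threshold_m=50.0):
          """All inputs are 1-D arrays sampled along the flight path (same length).
          Returns (disparities, alert_flag)."""
          synthesized_terrain = np.asarray(dgps_alt_m) - np.asarray(radar_agl_m)
          disparities = synthesized_terrain - np.asarray(dem_elev_m)
          # Mean absolute disparity as a crude test statistic; the real monitor would
          # use statistically derived thresholds and time-to-alert requirements.
          alert = np.mean(np.abs(disparities)) > alert_threshold_m
          return disparities, alert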

  19. integrated Electronic Health Record Increment 1 (iEHR Inc 1)

    DTIC Science & Technology

    2016-03-01

    Acronyms and abbreviations (extract): DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision; FY - Fiscal Year; ADM - Acquisition Decision Memorandum. The iEHR Increment 1 APB for FDD achieved in November 2014 was signed by the MDA on March 2, 2015. The current estimate is consistent with the FDD APB. (iEHR Inc 1, 2016 MAR, UNCLASSIFIED)

  20. Stokes parameters modulator for birefringent filters

    NASA Technical Reports Server (NTRS)

    Dollfus, A.

    1985-01-01

    The Solar Birefringent Filter (Filter Polarisiant Solaire Selectif FPSS) of Meudon Observatory is presently located at the focus of a solar refractor with a 28 cm lens directly pointed at the Sun. It produces a diffraction limited image without instrumental polarization and with a spectral resolution of 46,000 in a field of 6 arc min. diameter. The instrument is calibrated for absolute Doppler velocity measurements and is presently used for quantitative imagery of the radial velocity motions in the photosphere. The short period oscillations are recorded. Work of adapting the instrument for the imagery of the solar surface in the Stokes parameters is discussed. The first polarizer of the birefringent filter, with a reference position angle 0 deg, is associated with a fixed quarter wave plate at +45 deg. A rotating quarter wave plate is set at 0 deg and can be turned by incremented steps of exactly +45 deg. Another quarter wave plate also initially set at 0 deg is simultaneously incremented by -45 deg but only on each even step of the first plate. A complete cycle of increments produces images for each of the 6 parameters I ± Q, I ± U and I ± V. These images are then subtracted by pairs to produce a full image in the three Stokes parameters Q, U and V. With proper retardation tolerance and positioning accuracy of the quarter wave plates, the cross talk between the Stokes parameters was calculated and checked to be minimal.
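
    The final image arithmetic implied by the abstract is simple pairwise differencing of the six modulated images; the sketch below shows it explicitly (not the FPSS software; the 1/2 normalization is a convention).

      import numpy as np

      def stokes_from_modulation(i_plus_q, i_minus_q, i_plus_u, i_minus_u, i_plus_v, i_minus_v):
          """Recover Stokes images from the six modulated intensity images."""
          Q = 0.5 * (np.asarray(i_plus_q) - np.asarray(i_minus_q))
          U = 0.5 * (np.asarray(i_plus_u) - np.asarray(i_minus_u))
          V = 0.5 * (np.asarray(i_plus_v) - np.asarray(i_minus_v))
          # Total intensity can be estimated from any complementary pair, e.g.:
          I = 0.5 * (np.asarray(i_plus_q) + np.asarray(i_minus_q))
          return I, Q, U, V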

  1. Electron-proton spectrometer design summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The electron-proton spectrometer (EPS) will be placed aboard the Skylab in order to provide data from which electron and proton radiation dose can be determined. The EPS has five sensors, each consisting of a shielded silicon detector. These provide four integral electron channels and five integral proton channels, from which four differential proton increments can be deduced.

  2. Microspheres as resistive elements in a check valve for low pressure and low flow rate conditions.

    PubMed

    Ou, Kevin; Jackson, John; Burt, Helen; Chiao, Mu

    2012-11-07

    In this paper we describe a microsphere-based check valve integrated with a micropump. The check valve uses Ø20 μm polystyrene microspheres to rectify flow in low pressure and low flow rate applications (Re < 1). The microspheres form a porous medium in the check valve increasing fluidic resistance based on the direction of flow. Three check valve designs were fabricated and characterized to study the microspheres' effectiveness as resistive elements. A maximum diodicity (ratio of flow in the forward and reverse direction) of 18 was achieved. The pumping system can deliver a minimum flow volume of 0.25 μL and a maximum flow volume of 1.26 μL under an applied pressure of 0.2 kPa and 1 kPa, respectively. A proof-of-concept study was conducted using a pharmaceutical agent, docetaxel (DTX), as a sample drug showing the microsphere check valve's ability to limit diffusion from the micropump. The proposed check valve and pumping concept shows strong potential for implantable drug delivery applications with low flow rate requirements.

  3. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
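
    As a rough illustration of the pass/fail logic described above (not the original program), the sketch below fits a straight line to roughly 30 seconds of pressure samples for one orifice and uses the fitted slope as a leak indicator; the threshold value is purely illustrative.

      import numpy as np

      def leak_check(times_s, pressures_psi, max_decay_psi_per_s=0.01):
          """Fit pressure vs. time and flag the orifice if the decay rate is too large."""
          slope, intercept = np.polyfit(times_s, pressures_psi, 1)
          leaking = abs(slope) > max_decay_psi_per_s
          return {"slope_psi_per_s": slope, "intercept_psi": intercept, "leaking": leaking}

      # Example with synthetic data: a decay of 0.02 psi/s would be flagged as a leak.
      t = np.linspace(0.0, 30.0, 300)
      p = 14.7 - 0.02 * t
      print(leak_check(t, p))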

  4. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  5. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Validation and data integrity of records... verify that the information provided to the NICS Index remains valid and correct. (b) Each data source...

  6. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...

  7. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...

  8. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...

  9. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...

  10. How to Improve Adolescent Stress Responses: Insights From Integrating Implicit Theories of Personality and Biopsychosocial Models.

    PubMed

    Yeager, David S; Lee, Hae Yeon; Jamieson, Jeremy P

    2016-08-01

    This research integrated implicit theories of personality and the biopsychosocial model of challenge and threat, hypothesizing that adolescents would be more likely to conclude that they can meet the demands of an evaluative social situation when they were taught that people have the potential to change their socially relevant traits. In Study 1 (N = 60), high school students were assigned to an incremental-theory-of-personality or a control condition and then given a social-stress task. Relative to control participants, incremental-theory participants exhibited improved stress appraisals, more adaptive neuroendocrine and cardiovascular responses, and better performance outcomes. In Study 2 (N = 205), we used a daily-diary intervention to test high school students' stress reactivity outside the laboratory. Threat appraisals (Days 5-9 after intervention) and neuroendocrine responses (Days 8 and 9 after intervention only) were unrelated to the intensity of daily stressors when adolescents received the incremental-theory intervention. Students who received the intervention also had better grades over freshman year than those who did not. These findings offer new avenues for improving theories of adolescent stress and coping. © The Author(s) 2016.

  11. HOLEGAGE 1.0 - Strain-Gauge Drilling Analysis Program

    NASA Technical Reports Server (NTRS)

    Hampton, Roy V.

    1992-01-01

    Interior stresses inferred from changes in surface strains as hole is drilled. Computes stresses using strain data from each drilled-hole depth layer. Planar stresses computed in three ways: least-squares fit for linear variation with depth, integral method to give incremental stress data for each layer, and/or linear fit to integral data. Written in FORTRAN 77.

  12. Analysis of DISMS (Defense Integrated Subsistence Management System) Increment 4

    DTIC Science & Technology

    1988-12-01

    response data entry; and rationale supporting an on-line system based on real time management information needs. Keywords: Automated systems; Subsistence; Workload capacity; Bid response; Contract administration; Computer systems.

  13. Optimal Estimation of Clock Values and Trends from Finite Data

    NASA Technical Reports Server (NTRS)

    Greenhall, Charles

    2005-01-01

    We show how to solve two problems of optimal linear estimation from a finite set of phase data. Clock noise is modeled as a stochastic process with stationary dth increments. The covariance properties of such a process are contained in the generalized autocovariance function (GACV). We set up two principles for optimal estimation: with the help of the GACV, these principles lead to a set of linear equations for the regression coefficients and some auxiliary parameters. The mean square errors of the estimators are easily calculated. The method can be used to check the results of other methods and to find good suboptimal estimators based on a small subset of the available data.
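
    The abstract reduces the estimation problem to a set of linear equations built with the help of the GACV. As generic background only, and not the authors' derivation (which also handles auxiliary parameters for processes with stationary dth increments), the sketch below shows the familiar generalized least-squares step that results once a design matrix X and a noise covariance matrix C are in hand; all names are illustrative.

      import numpy as np

      def gls_estimate(X, y, C):
          """Return (beta_hat, cov_beta_hat) for y = X beta + noise with Cov(noise) = C."""
          Ci = np.linalg.inv(C)
          cov_beta = np.linalg.inv(X.T @ Ci @ X)     # mean-square error matrix of the estimate
          beta_hat = cov_beta @ (X.T @ Ci @ y)       # minimum-variance unbiased linear estimate
          return beta_hat, cov_beta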

  14. Effect of electron irradiation in vacuum on FEP-A silicon solar cell covers

    NASA Technical Reports Server (NTRS)

    Marsik, S. J.; Broder, J. D.

    1975-01-01

    Fluorinated ethylene-propylene-A (FEP-A) covers on silicon solar cells were irradiated with 1-MeV electrons, in vacuum, to an accumulated fluence equivalent to approximately 28 years in synchronous orbit. The effect of irradiation on the light transmittance of FEP-A was checked by measuring the short-circuit current of the cells after each dose increment. The results indicate no apparent overall loss in transmission due to irradiation of FEP-A. Filter wheel measurements revealed some darkening of the FEP-A at the blue end of the spectrum. Although no delamination from the cell surface was observed while in vacuum, embrittlement of FEP-A occurred at the accumulated dose.

  15. Irradiation and measurements of fluorinated ethylene-propylene-A on silicon solar cells in vacuum

    NASA Technical Reports Server (NTRS)

    Marsik, S. J.; Broder, J. D.

    1975-01-01

    Silicon monoxide (SiO) coated silicon solar cells covered with fluorinated ethylene-propylene-A (FEP-A) were irradiated by 1-MeV electrons in vacuum. The effect of irradiation on the light transmittance of FEP-A was checked by measuring the short-circuit current of the cells while in vacuum after each dose increment, immediately after the irradiation, and again after a minimum elapsed time of 16 hr. The results indicated no apparent loss in transmission due to irradiation of FEP-A and no delamination from the SiO surface while the cells were in vacuum, but embrittlement of FEP-A occurred at the accumulated dose.

  16. Different geomagnetic indices as an indicator for geo-effective solar storms and human physiological state

    NASA Astrophysics Data System (ADS)

    Dimitrova, Svetla

    2008-02-01

    A group of 86 healthy volunteers were examined on each working day during periods of high solar activity. Data about systolic and diastolic blood pressure, pulse pressure, heart rate and subjective psycho-physiological complaints were gathered. MANOVA was employed to check the significance of the influence of three factors on the physiological parameters. The factors were as follows: (1) geomagnetic activity estimated by the daily amplitude of the H-component of the local geomagnetic field, Ap- and Dst-index; (2) gender; and (3) the presence of medication. Average values of systolic and diastolic blood pressure, pulse pressure and subjective complaints of the group were found to increase significantly with increasing geomagnetic activity.

  17. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  18. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  19. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  20. Capturing spiral radial growth of conifers using the superellipse to model tree-ring geometric shape

    PubMed Central

    Shi, Pei-Jian; Huang, Jian-Guo; Hui, Cang; Grissino-Mayer, Henri D.; Tardif, Jacques C.; Zhai, Li-Hong; Wang, Fu-Sheng; Li, Bai-Lian

    2015-01-01

    Tree-rings are often assumed to approximate a circular shape when estimating forest productivity and carbon dynamics. However, tree rings are rarely, if ever, circular, thereby possibly resulting in under- or over-estimation in forest productivity and carbon sequestration. Given the crucial role played by tree ring data in assessing forest productivity and carbon storage within a context of global change, it is particularly important that mathematical models adequately render cross-sectional area increment derived from tree rings. We modeled the geometric shape of tree rings using the superellipse equation and checked its validity through theoretical simulation and six actual cross sections collected from three conifers. We found that the superellipse better describes the geometric shape of tree rings than the circle commonly used. We showed that a spiral growth trend exists on the radial section over time, which might be closely related to spiral grain along the longitudinal axis. The superellipse generally had higher accuracy than the circle in predicting the basal area increment, resulting in an improved estimate for the basal area. The superellipse may allow better assessment of forest productivity and carbon storage in terrestrial forest ecosystems. PMID:26528316
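
    For reference, the general superellipse (Lamé curve) and its enclosed area are given below; the abstract does not state the exact parameterization fitted in the study, so this is background rather than the authors' model.

      % Superellipse with semi-axes a, b and exponent n > 0 (n = 2 recovers the ellipse),
      % and the enclosed area in terms of the Gamma function.
      \left|\frac{x}{a}\right|^{n} + \left|\frac{y}{b}\right|^{n} = 1,
      \qquad
      A = 4ab\,\frac{\Gamma\!\left(1+\tfrac{1}{n}\right)^{2}}{\Gamma\!\left(1+\tfrac{2}{n}\right)}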

  1. Capturing spiral radial growth of conifers using the superellipse to model tree-ring geometric shape.

    PubMed

    Shi, Pei-Jian; Huang, Jian-Guo; Hui, Cang; Grissino-Mayer, Henri D; Tardif, Jacques C; Zhai, Li-Hong; Wang, Fu-Sheng; Li, Bai-Lian

    2015-01-01

    Tree-rings are often assumed to approximate a circular shape when estimating forest productivity and carbon dynamics. However, tree rings are rarely, if ever, circular, thereby possibly resulting in under- or over-estimation in forest productivity and carbon sequestration. Given the crucial role played by tree ring data in assessing forest productivity and carbon storage within a context of global change, it is particularly important that mathematical models adequately render cross-sectional area increment derived from tree rings. We modeled the geometric shape of tree rings using the superellipse equation and checked its validity through theoretical simulation and six actual cross sections collected from three conifers. We found that the superellipse better describes the geometric shape of tree rings than the circle commonly used. We showed that a spiral growth trend exists on the radial section over time, which might be closely related to spiral grain along the longitudinal axis. The superellipse generally had higher accuracy than the circle in predicting the basal area increment, resulting in an improved estimate for the basal area. The superellipse may allow better assessment of forest productivity and carbon storage in terrestrial forest ecosystems.

  2. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media

    PubMed Central

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-01-01

    Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users’ spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records bring challenges to achieve such a task. Different from the previous work on social behavior analysis, in this paper, we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at last. PMID:27999398

  3. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media.

    PubMed

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-12-20

    Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users' spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records bring challenges to achieve such a task. Different from the previous work on social behavior analysis, in this paper, we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at last.

  4. Performance of two different digital evaluation systems used for assessing pre-clinical dental students' prosthodontic technical skills.

    PubMed

    Gratton, D G; Kwon, S R; Blanchette, D R; Aquilino, S A

    2017-11-01

    Proper integration of newly emerging digital assessment tools is a central issue in dental education in an effort to provide more accurate and objective feedback to students. The study examined how the outcomes of students' tooth preparation were correlated when evaluated using traditional faculty assessment and two types of digital assessment approaches. Specifically, incorporation of the Romexis Compare 2.0 (Compare) and Sirona prepCheck 1.1 (prepCheck) systems was evaluated. Additionally, satisfaction of students based on the type of software was evaluated through a survey. Students in a second-year pre-clinical prosthodontics course were allocated to either Compare (n = 42) or prepCheck (n = 37) systems. All students received conventional instruction and used their assigned digital system as an additional evaluation tool to aid in assessing their work. Examinations assessed crown preparations of the maxillary right central incisor (#8) and the mandibular left first molar (#19). All submissions were graded by faculty, Compare and prepCheck. Technical scores did not differ between student groups for any of the assessment approaches. Compare and prepCheck had modest, statistically significant correlations with faculty scores with a minimum correlation of 0.3944 (P = 0.0011) and strong, statistically significant correlations with each other with a minimum correlation of 0.8203 (P < 0.0001). A post-course student survey found that 55.26% of the students felt unfavourably about learning the digital evaluation protocols. A total of 62.31% felt favourably about the integration of these digital tools into the curriculum. Comparison of Compare and prepCheck showed no evidence of significant difference in students' prosthodontics technical performance and perception. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Comparison of different incremental analysis update schemes in a realistic assimilation system with Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Yan, Y.; Barth, A.; Beckers, J. M.; Brankart, J. M.; Brasseur, P.; Candille, G.

    2017-07-01

    In this paper, three incremental analysis update schemes (IAU 0, IAU 50 and IAU 100) are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The difference between the three IAU schemes lies in the position of the increment update window. The relevance of each IAU scheme is evaluated through analyses of both thermohaline and dynamical variables. The validation of the assimilation results is performed according to both deterministic and probabilistic metrics against different sources of observations. For deterministic validation, the ensemble mean and the ensemble spread are compared to the observations. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score. The results show that (1) the IAU 50 scheme performs the same as the IAU 100 scheme; (2) the IAU 50/100 schemes outperform the IAU 0 scheme in error covariance propagation for thermohaline variables in relatively stable regions, while the IAU 0 scheme outperforms the IAU 50/100 schemes in estimating dynamical variables in dynamically active regions; and (3) with a sufficient number of observations and good error specification, the impact of the IAU scheme is negligible. The differences between the IAU 0 scheme and the IAU 50/100 schemes are mainly due to different model integration times and the different instabilities (density inversion, large vertical velocity, etc.) induced by the increment update. The longer model integration time with the IAU 50/100 schemes, especially the free model integration, on the one hand allows for better re-establishment of the equilibrium model state and, on the other hand, smooths the strong gradients in dynamically active regions.
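
    The essential difference between applying an analysis increment at once and applying it incrementally can be shown in a few lines. The sketch below is a hedged illustration, not the authors' system: the increment is added in equal slices over an update window during the model integration, and shifting the window start is what distinguishes schemes such as IAU 0, IAU 50 and IAU 100. Function and parameter names are assumptions.

      import numpy as np

      def integrate_with_iau(state, increment, model_step, n_window_steps, n_total_steps,
                             window_start_step=0):
          """model_step(state) -> state advanced by one step; the analysis increment is
          applied in n_window_steps equal slices starting at window_start_step."""
          slice_ = increment / n_window_steps
          for k in range(n_total_steps):
              if window_start_step <= k < window_start_step + n_window_steps:
                  state = state + slice_      # gradual insertion of the analysis increment
              state = model_step(state)
          return state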

  6. Cycle time reduction by Html report in mask checking flow

    NASA Astrophysics Data System (ADS)

    Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon

    2017-07-01

    The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer DRC-like check evolved from mask rule check (MRC). The MDCC uses an extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an html-format report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file into a result database (RDB) file using standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screen shots of these check results. All the processes are triggered automatically as soon as the MDCC process finishes. Users simply open the html report to get the information they need: for example, the check summary, captured images of results, and their coordinates.

  7. Adenovirus-delivered GFP-HO-1CΔ23 attenuates blood-spinal cord barrier permeability after rat spinal cord contusion.

    PubMed

    Chang, Sheng; Bi, Yunlong; Meng, Xiangwei; Qu, Lin; Cao, Yang

    2018-03-21

    The blood-spinal cord barrier (BSCB) plays a key role in maintaining the microenvironment and is primarily composed of tight junction proteins and nonfenestrated capillary endothelial cells. After injury, BSCB damage results in increasing capillary permeability and release of inflammatory factors. Recent studies have reported that haem oxygenase-1 (HO-1) fragments lacking 23 amino acids at the C-terminus (HO-1CΔ23) exert novel anti-inflammatory and antioxidative effects in vitro. However, no study has identified the role of HO-1CΔ23 in vivo. We aimed to investigate the protective effects of HO-1CΔ23 on the BSCB after spinal cord injury (SCI) in a rat model. Here, adenoviral HO-1CΔ23 (Ad-GFP-HO-1CΔ23) was intrathecally injected into the 10th thoracic spinal cord segment (T10) 7 days before SCI. In addition, nuclear and cytoplasmic extraction and immunofluorescence staining of HO-1 were used to examine the effect of Ad-GFP-HO-1CΔ23 on HO-1 nuclear translocation. Evans blue staining served as an index of capillary permeability and was detected by fluorescence microscopy at 633 nm. Western blotting was also performed to detect tight junction protein expression. The Basso, Beattie and Bresnahan score was used to evaluate kinematic functional recovery through the 28th day after SCI. In this study, the Ad-GFP-HO-1CΔ23 group showed better kinematic functional recovery after SCI than the Ad-GFP and Vehicle groups, as well as smaller reductions in tight junction proteins and capillary permeability compared with those in the Ad-GFP and Vehicle groups. These findings indicated that Ad-GFP-HO-1CΔ23 might have a potential therapeutic effect that is mediated by its protection of BSCB integrity.

  8. The Regular Education Initiative in the U.S.: What Is Its Relevance to the Integration Movement in Australia?

    ERIC Educational Resources Information Center

    O'Shea, Lawrence J.; And Others

    1989-01-01

    A rationale for the regular education initiative (REI) is presented, focusing on criticisms of current service delivery such as disjointed incrementalism, excessive proceduralism, and inefficacy of pull-out programs. Criticisms of REI are addressed, and the issues are analyzed from the context of the integration movement in Australian schools.…

  9. Online Academic Integrity

    ERIC Educational Resources Information Center

    Mastin, David F.; Peszka, Jennifer; Lilly, Deborah R.

    2009-01-01

    Psychology students completed a task with reinforcement for successful performance. We tested academic integrity under randomly assigned conditions of check mark acknowledgment of an honor pledge, typed honor pledge, or no pledge. Across all conditions, 14.1% of students inflated their self-reported performance (i.e., cheated). We found no…

  10. Algorithms For Integrating Nonlinear Differential Equations

    NASA Technical Reports Server (NTRS)

    Freed, A. D.; Walker, K. P.

    1994-01-01

    Improved algorithms developed for use in numerical integration of systems of nonhomogeneous, nonlinear, first-order, ordinary differential equations. In comparison with earlier integration algorithms, these algorithms offer greater stability and accuracy. Several are asymptotically correct, thereby enabling retention of stability and accuracy when large increments of the independent variable are used. Attainable accuracies demonstrated by applying them to systems of nonlinear, first-order differential equations that arise in the study of viscoplastic behavior, spread of acquired immune-deficiency syndrome (AIDS) virus and predator/prey populations.
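
    As an illustration of the kind of asymptotically correct update the brief describes (not the routines of the brief itself), the sketch below integrates the stiff linear test problem y' = -lam (y - g(t)). Holding g constant over a step gives an update that remains stable and tends to the correct limit for arbitrarily large steps, whereas explicit Euler diverges once lam*h exceeds 2. All values are illustrative.

      import math

      def exponential_step(y, t, h, lam, g):
          """Exact update for y' = -lam*(y - g) with g frozen over the step."""
          gt = g(t)
          return gt + (y - gt) * math.exp(-lam * h)

      def euler_step(y, t, h, lam, g):
          return y + h * (-lam * (y - g(t)))

      lam, h, g = 1.0e3, 0.1, (lambda t: 1.0)     # lam*h = 100: a very stiff step
      y_exp, y_eul = 0.0, 0.0
      for n in range(5):
          y_exp = exponential_step(y_exp, n * h, h, lam, g)
          y_eul = euler_step(y_eul, n * h, h, lam, g)
      print(y_exp, y_eul)   # exponential update ~1.0; Euler diverges with oscillating sign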

  11. Block-Level Added Redundancy Explicit Authentication for Parallelized Encryption and Integrity Checking of Processor-Memory Transactions

    NASA Astrophysics Data System (ADS)

    Elbaz, Reouven; Torres, Lionel; Sassatelli, Gilles; Guillemin, Pierre; Bardouillet, Michel; Martinez, Albert

    The bus between the System on Chip (SoC) and the external memory is one of the weakest points of computer systems: an adversary can easily probe this bus in order to read private data (data confidentiality concern) or to inject data (data integrity concern). The conventional way to protect data against such attacks and to ensure data confidentiality and integrity is to implement two dedicated engines: one performing data encryption and another data authentication. This approach, while secure, prevents parallelizability of the underlying computations. In this paper, we introduce the concept of Block-Level Added Redundancy Explicit Authentication (BL-AREA) and we describe a Parallelized Encryption and Integrity Checking Engine (PE-ICE) based on this concept. BL-AREA and PE-ICE have been designed to provide an effective solution to ensure both security services while allowing for full parallelization on processor read and write operations and optimizing the hardware resources. Compared to standard encryption, which ensures only confidentiality, we show that PE-ICE additionally guarantees code and data integrity with less than 4% run-time performance overhead.
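
    A conceptual sketch of block-level added redundancy, not the PE-ICE hardware: a small per-block nonce is appended to each payload chunk before block encryption and re-checked after decryption; because a block cipher diffuses any ciphertext change across the whole plaintext block, a corrupted nonce reveals tampering, and blocks are independent, so encryption and checking parallelize. The third-party `cryptography` package is used here only as a convenient AES implementation; the block layout and nonce length are illustrative.

      import secrets
      from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

      BLOCK, NONCE_LEN = 16, 4                      # 12-byte payload + 4-byte nonce per AES block

      def seal_block(key: bytes, payload12: bytes):
          assert len(payload12) == BLOCK - NONCE_LEN
          nonce = secrets.token_bytes(NONCE_LEN)    # reference kept in trusted (on-chip) storage
          enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
          return enc.update(payload12 + nonce) + enc.finalize(), nonce

      def open_block(key: bytes, ciphertext: bytes, expected_nonce: bytes):
          dec = Cipher(algorithms.AES(key), modes.ECB()).decryptor()
          plain = dec.update(ciphertext) + dec.finalize()
          if plain[-NONCE_LEN:] != expected_nonce:
              raise ValueError("integrity check failed: external memory block was modified")
          return plain[:-NONCE_LEN]

      key = secrets.token_bytes(16)
      ct, n = seal_block(key, b"hello world!")      # exactly 12 bytes of payload
      assert open_block(key, ct, n) == b"hello world!"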

  12. Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)

    NASA Technical Reports Server (NTRS)

    Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen

    2016-01-01

    A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 Calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST). This check calibration was required to verify the tunnel flow quality was unchanged by the control systems upgrade prior to the next test customer beginning their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included the validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of series of repeated test points at certain predetermined tunnel parameters. The SPC charts secondary objective was not completed due to schedule constraints. It is hoped that this effort will be readdressed and completed in the near future.

  13. Biochemia Medica has started using the CrossCheck plagiarism detection software powered by iThenticate

    PubMed Central

    Šupak-Smolčić, Vesna; Šimundić, Ana-Maria

    2013-01-01

    In February 2013, Biochemia Medica has joined CrossRef, which enabled us to implement CrossCheck plagiarism detection service. Therefore, all manuscript submitted to Biochemia Medica are now first assigned to Research integrity editor (RIE), before sending the manuscript for peer-review. RIE submits the text to CrossCheck analysis and is responsible for reviewing the results of the text similarity analysis. Based on the CrossCheck analysis results, RIE subsequently provides a recommendation to the Editor-in-chief (EIC) on whether the manuscript should be forwarded to peer-review, corrected for suspected parts prior to peer-review or immediately rejected. Final decision on the manuscript is, however, with the EIC. We hope that our new policy and manuscript processing algorithm will help us to further increase the overall quality of our Journal. PMID:23894858

  14. The Cost-Effectiveness of Integrating HIV Counseling and Testing into Primary Health Care in the Ukraine.

    PubMed

    Johns, Benjamin; Doroshenko, Olena; Tarantino, Lisa; Cowley, Peter

    2017-03-01

    We estimate the number of HIV cases diagnosed, costs, and cost per HIV case detected associated with integrating HIV counseling and testing (HCT) into primary health care facilities in Ukraine. The study uses a difference-in-difference design with four districts implementing the intervention compared to 20 districts where HCT were offered only at specialized HIV clinics. There was a 2.01 (95 % CI: 1.12-3.61) times increase in the number of HIV cases detected per capita in intervention districts compared to other districts. The incremental cost of the intervention was $21,017 and the incremental cost per HIV case detected was $369. The average cost per HIV case detected before the intervention was $558. Engaging primary health care facilities to provide HCT is likely desirable from an efficiency point-of-view. However, the affordability of the intervention needs to be assessed because expansion will require additional investment.

  15. Impulse processing: A dynamical systems model of incremental eye movements in the visual world paradigm

    PubMed Central

    Kukona, Anuenue; Tabor, Whitney

    2011-01-01

    The visual world paradigm presents listeners with a challenging problem: they must integrate two disparate signals, the spoken language and the visual context, in support of action (e.g., complex movements of the eyes across a scene). We present Impulse Processing, a dynamical systems approach to incremental eye movements in the visual world that suggests a framework for integrating language, vision, and action generally. Our approach assumes that impulses driven by the language and the visual context impinge minutely on a dynamical landscape of attractors corresponding to the potential eye-movement behaviors of the system. We test three unique predictions of our approach in an empirical study in the visual world paradigm, and describe an implementation in an artificial neural network. We discuss the Impulse Processing framework in relation to other models of the visual world paradigm. PMID:21609355

  16. SU-G-201-17: Verification of Dose Distributions From High-Dose-Rate Brachytherapy Ir-192 Source Using a Multiple-Array-Diode-Detector (MapCheck2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harpool, K; De La Fuente Herman, T; Ahmad, S

    Purpose: To investigate quantitatively the accuracy of dose distributions for the Ir-192 high-dose-rate (HDR) brachytherapy source calculated by the Brachytherapy-Planning system (BPS) and measured using a multiple-array-diode-detector in a heterogeneous medium. Methods: A two-dimensional diode-array-detector system (MapCheck2) was scanned with a catheter and the CT-images were loaded into the Varian-Brachytherapy-Planning which uses TG-43-formalism for dose calculation. Treatment plans were calculated for different combinations of one dwell-position and varying irradiation times and different-dwell positions and fixed irradiation time with the source placed 12mm from the diode-array plane. The calculated dose distributions were compared to the measured doses with MapCheck2 delivered by an Ir-192-source from a Nucletron-Microselectron-V2-remote-after-loader. The linearity of MapCheck2 was tested for a range of dwell-times (2–600 seconds). The angular effect was tested with 30 seconds irradiation delivered to the central-diode and then moving the source away in increments of 10mm. Results: Large differences were found between calculated and measured dose distributions. These differences are mainly due to absence of heterogeneity in the dose calculation and diode-artifacts in the measurements. The dose differences between measured and calculated due to heterogeneity ranged from 5%–12% depending on the position of the source relative to the diodes in MapCheck2 and different heterogeneities in the beam path. The linearity test of the diode-detector showed 3.98%, 2.61%, and 2.27% over-response at short irradiation times of 2, 5, and 10 seconds, respectively, and within 2% for 20 to 600 seconds (p-value=0.05) which depends strongly on MapCheck2 noise. The angular dependency was more pronounced at acute angles, ranging up to 34% at 5.7 degrees. Conclusion: Large deviations between measured and calculated dose distributions for HDR-brachytherapy with Ir-192 may be improved when considering medium heterogeneity and dose-artifacts of the diodes. This study demonstrates that multiple-array-diode-detectors provide a practical and accurate dosimeter to verify doses delivered from the brachytherapy Ir-192-source.

  17. How to Improve Adolescent Stress Responses: Insights from an Integration of Implicit Theories and Biopsychosocial Models

    PubMed Central

    Yeager, David S.; Lee, Hae Yeon; Jamieson, Jeremy

    2016-01-01

    This research integrated implicit theories and the biopsychosocial (BPS) model of challenge and threat, hypothesizing that adolescents would be more likely to conclude that they have the resources to meet the demands of an evaluative social situation when they were taught a belief that people have the potential to change their socially-relevant traits. Study 1 (N=60) randomly assigned high school adolescents to an incremental theory of personality or control condition, and then administered a standardized social stress task. Relative to controls, incremental theory participants exhibited improved stress appraisals, more adaptive neuroendocrine and cardiovascular responses (lower salivary cortisol, reduced vascular resistance, higher stroke volume, and more rapid return to homeostasis after stress offset), and better performance outcomes. Study 2 (N=205) used a daily diary intervention study to test high school adolescents’ stress reactivity outside the laboratory. Threat appraisals (days 5–9 post-intervention) and neuroendocrine responses (cortisol and DHEA-S; days 8–9 post-intervention only) were untethered from the intensity of daily stressors when adolescents received the incremental theory of personality intervention. The intervention also improved grades over freshman year. These findings offer new avenues for improving theories of adolescent stress and coping. PMID:27324267

  18. SU-E-P-20: Personnel Lead Apparel Integrity Inspection: Where We Are and What We Need?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, S; Zhang, J; Anaskevich, L

    Purpose: In recent years, tremendous efforts have been devoted to radiation dose reduction, especially for patients who are directly exposed to primary radiation or receive radiopharmaceuticals. Limited efforts have been focused on those personnel who are exposed to secondary radiation while fulfilling their work responsibilities associated with diagnostic imaging and image-guided interventions. Occupational exposure is compounded in daily practice and can lead to a significant radiation dose over time. Personnel lead apparel is a well-accepted engineering control to protect healthcare workers when radiation is inevitable. The question is, do we have a nationally established program to protect personnel? This study investigates lead apparel inspection programs across the USA. Methods: A series of surveys of state regulations, the University Health System Consortium, and federal regulations and regulations determined by accrediting bodies were conducted. The surveys were used to determine the current status of lead apparel programs regarding integrity inspections. Based on the survey results, a thorough program was proposed accordingly. Results: Of 50 states, seventeen states and Washington D.C. require lead apparel integrity inspections within their state regulations. Eleven of these states specify that the inspection is required on an annual basis. Two of these states require lead apron integrity checks to be performed semi-annually. Eleven out of the two hundred academic medical centers surveyed responded. The results show that the method (visually vs. fluoroscopy) used to conduct lead apparel integrity checks differs greatly amongst healthcare organizations. The FDA, EPA, CRCPD and NCRP require lead apparel integrity checks. However, the level of policies is different. A standard program is not well established and clearly there is a lack of standardization. Conclusion: A program led by legislation (state or federal) with specific frequency, methods, tracking and criteria is needed to ensure the integrity of personnel lead apparel.

  19. Automation of Flight Software Regression Testing

    NASA Technical Reports Server (NTRS)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy lift launch vehicle supporting human and scientific exploration beyond earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, testing the functionality of the code needs to be performed continually to ensure that the integrity of the software is maintained. Manually testing the functionality on an ever-growing set of requirements and features is not an efficient solution and therefore needs to be done automatically to ensure testing is comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This provides the SLS FSW team a time saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested, and that desired functionality is maintained, as changes occur. It also provides a mechanism for developers to check functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add automation, and the ability of the harness and cases to be executed continually. This test concept is an approach that can be adapted to support other projects.
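
    As a hedged sketch of the modular-harness idea described above (not the SLS FSW harness), the code below discovers self-describing test-case modules from a directory, runs each independently so adding or removing one does not disturb the others, and reports a summary; a scheduler or post-commit hook can simply invoke run_all(). The directory name and the run() convention are assumptions.

      import importlib.util, pathlib, traceback

      def discover_cases(case_dir="regression_cases"):
          """Each *.py file in case_dir is expected to define run() -> bool."""
          for path in sorted(pathlib.Path(case_dir).glob("*.py")):
              spec = importlib.util.spec_from_file_location(path.stem, path)
              module = importlib.util.module_from_spec(spec)
              spec.loader.exec_module(module)
              yield path.stem, module

      def run_all(case_dir="regression_cases"):
          results = {}
          for name, module in discover_cases(case_dir):
              try:
                  results[name] = bool(module.run())
              except Exception:
                  traceback.print_exc()     # a crashing case counts as a failure
                  results[name] = False
          failed = [n for n, ok in results.items() if not ok]
          print(f"{len(results) - len(failed)}/{len(results)} regression cases passed")
          return not failed

      if __name__ == "__main__":
          raise SystemExit(0 if run_all() else 1)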

  20. Net primary productivity, allocation pattern and carbon use efficiency in an apple orchard assessed by integrating eddy-covariance, biometric and continuous soil chamber measurements

    NASA Astrophysics Data System (ADS)

    Zanotelli, D.; Montagnani, L.; Manca, G.; Tagliavini, M.

    2012-10-01

    Carbon use efficiency (CUE) is a functional parameter that could possibly link the current increasingly accurate global estimates of gross primary production with those of net ecosystem exchange, for which global predictors are still unavailable. Nevertheless, CUE estimates are actually available for only a few ecosystem types, while information regarding agro-ecosystems is scarce, in spite of the simplified spatial structure of these ecosystems that facilitates studies on allocation patterns and temporal growth dynamics. We combined three widely deployed methods, eddy covariance, soil respiration and biometric measurements, to assess monthly values of CUE, net primary production (NPP) and allocation patterns in different plant organs in an apple orchard during a complete year (2010). We applied a measurement protocol optimized for quantifying monthly values of carbon fluxes in this ecosystem type, which allows for a cross-check between estimates obtained from different methods. We also attributed NPP components to standing biomass increments, detritus cycle feeding and lateral exports. We found that in the apple orchard both net ecosystem production and gross primary production on a yearly basis, 380 ± 30 g C m-2 and 1263 ± 189 g C m-2 respectively, were of a magnitude comparable to those of natural forests growing in similar climate conditions. The largest differences with respect to forests are in the allocation pattern and in the fate of produced biomass. The carbon sequestered from the atmosphere was largely allocated to production of fruits: 49% of annual NPP was taken away from the ecosystem through apple production. Organic material (leaves, fine root litter, pruned wood and early fruit falls) contributing to the detritus cycle was 46% of the NPP. Only 5% was attributable to standing biomass increment, while this NPP component is generally the largest in forests. The CUE, with an annual average of 0.71 ± 0.09, was higher than the previously suggested constant values of 0.47-0.50. Low nitrogen investment in fruits, the limited root apparatus, and the optimal growth temperature and nutritional conditions observed at the site are suggested as explanatory variables for the high CUE observed.
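
    As a minimal worked example of the carbon bookkeeping behind these figures (using only the values quoted above; this is not the authors' processing code), CUE = NPP / GPP, so the reported CUE of 0.71 and GPP of 1263 g C m-2 imply an NPP of roughly 900 g C m-2, which the study partitions into fruit export, detritus and standing biomass increment:

    ```python
    # Worked arithmetic only; values are taken from the abstract above.
    GPP = 1263.0          # gross primary production, g C m^-2 yr^-1
    CUE = 0.71            # carbon use efficiency = NPP / GPP
    NPP = CUE * GPP       # ~897 g C m^-2 yr^-1
    Ra = GPP - NPP        # autotrophic respiration implied by the CUE value

    # Reported allocation fractions of NPP for the orchard
    allocation = {"fruit_export": 0.49, "detritus": 0.46, "standing_biomass": 0.05}
    components = {name: frac * NPP for name, frac in allocation.items()}

    print(f"NPP ~ {NPP:.0f} g C m^-2, Ra ~ {Ra:.0f} g C m^-2")
    for name, flux in components.items():
        print(f"  {name}: {flux:.0f} g C m^-2 yr^-1")
    ```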

  1. Development of Advanced Czochralski Growth Process to produce low cost 150 KG silicon ingots from a single crucible for technology readiness

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The modified CG2000 crystal grower construction, installation, and machine check-out was completed. The process development check-out proceeded with several dry runs and one growth run. Several machine calibrations and functional problems were discovered and corrected. Several exhaust gas analysis system alternatives were evaluated and an integrated system approved and ordered. A contract presentation was made at the Project Integration Meeting at JPL, including cost-projections using contract projected throughput and machine parameters. Several growth runs on a development CG200 RC grower show that complete neck, crown, and body automated growth can be achieved with only one operator input. Work continued for melt level, melt temperature, and diameter sensor development.

  2. On nonstationarity and antipersistency in global temperature series

    NASA Astrophysics Data System (ADS)

    KäRner, O.

    2002-10-01

    Statistical analysis is carried out for satellite-based global daily tropospheric and stratospheric temperature anomaly and solar irradiance data sets. Behavior of the series appears to be nonstationary with stationary daily increments. Estimating long-range dependence between the increments reveals a remarkable difference between the two temperature series. Global average tropospheric temperature anomaly behaves similarly to the solar irradiance anomaly. Their daily increments show antipersistency for scales longer than 2 months. The property points at a cumulative negative feedback in the Earth climate system governing the tropospheric variability during the last 22 years. The result emphasizes a dominating role of the solar irradiance variability in variations of the tropospheric temperature and gives no support to the theory of anthropogenic climate change. The global average stratospheric temperature anomaly proceeds like a 1-dim random walk at least up to 11 years, allowing good presentation by means of the autoregressive integrated moving average (ARIMA) models for monthly series.
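
    A minimal sketch of how antipersistency in increments can be diagnosed, assuming nothing about the author's actual procedure: for a self-affine process the variance of increments aggregated over a scale s grows as s^(2H), so a scaling exponent H below 0.5 indicates antipersistent increments, while H = 0.5 corresponds to a random walk.

    ```python
    # Illustrative sketch (not the author's code): estimate a Hurst-like exponent
    # from how the variance of aggregated increments grows with the aggregation scale.
    import numpy as np

    def increment_scaling_exponent(series, scales=(1, 2, 4, 8, 16, 32, 64)):
        x = np.asarray(series, dtype=float)
        inc = np.diff(x)                                 # daily increments
        variances = []
        for s in scales:
            n = (len(inc) // s) * s
            agg = inc[:n].reshape(-1, s).sum(axis=1)     # increments over s days
            variances.append(agg.var())
        # slope of log(var) vs log(scale); slope = 2H for self-affine increments
        slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
        return slope / 2.0

    # Example: a pure random walk gives H ~ 0.5; H < 0.5 would suggest antipersistency.
    rng = np.random.default_rng(0)
    print(increment_scaling_exponent(np.cumsum(rng.normal(size=5000))))
    ```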

  3. Low-cost and high-speed optical mark reader based on an intelligent line camera

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin

    2003-08-01

    Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, and that data integrity is checked and the data validated before processing. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
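
    The core of the mark-recognition step can be illustrated with a short sketch (hypothetical zone coordinates and threshold; not the FPGA/micro-controller implementation described above): each answer bubble is reduced to the fraction of dark pixels in its zone and compared against a fill threshold.

    ```python
    # Illustrative sketch only: fill-ratio thresholding over predefined answer zones.
    import numpy as np

    def read_marks(binary_image, zones, fill_threshold=0.4):
        """binary_image: 2-D array with 1 = dark pixel; zones: {label: (r0, r1, c0, c1)}."""
        marked = {}
        for label, (r0, r1, c0, c1) in zones.items():
            region = binary_image[r0:r1, c0:c1]
            marked[label] = region.mean() >= fill_threshold   # fraction of dark pixels
        return marked

    # Tiny synthetic example: one filled and one empty bubble.
    img = np.zeros((20, 40), dtype=int)
    img[2:18, 2:18] = 1                                        # filled bubble "A"
    print(read_marks(img, {"A": (0, 20, 0, 20), "B": (0, 20, 20, 40)}))
    ```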

  4. Improving the Simplified Acquisition of Base Engineering Requirements (SABER) Delivery Order Award Process: Results of a Process Improvement Plan

    DTIC Science & Technology

    1991-09-01

    putting all tasks directed towards achieving an outcome in sequence. The tasks can be viewed as steps in the process (39:2.3). Using this...improvement opportunity is investigated. A plan is developed, root causes are identified, and solutions are tested and implemented. The process is... solutions, check for actual improvement, and integrate the successful improvements into the process. 7. Check Improvement Performance. Finally, the

  5. Multi-Reanalysis Comparison of Variability in Analysis Increment of Column-Integrated Water Vapor Associated with Madden-Julian Oscillation

    NASA Astrophysics Data System (ADS)

    Yokoi, S.

    2014-12-01

    This study conducts a comparison of three reanalysis products (JRA-55, JRA-25, and ERA-Interim) in representation of Madden-Julian Oscillation (MJO), focusing on column-integrated water vapor (CWV) that is considered as an essential variable for discussing MJO dynamics. Besides the analysis fields of CWV, which exhibit spatio-temporal distributions that are quite similar to satellite observations, CWV tendency simulated by forecast models and analysis increment calculated by data assimilation are examined. For JRA-55, it is revealed that, while its forecast model is able to simulate eastward propagation of the CWV anomaly, it tends to weaken the amplitude, and data assimilation process sustains the amplitude. The multi-reanalysis comparison of the analysis increment further reveals that this weakening bias is probably caused by excessively weak cloud-radiative feedback represented by the model. This bias in the feedback strength makes anomalous moisture supply by the vertical advection term in the CWV budget equation too insensitive to precipitation anomaly, resulting in reduction of the amplitude of CWV anomaly. ERA-Interim has a nearly opposite feature; the forecast model represents excessively strong feedback and unrealistically strengthens the amplitude, while the data assimilation weakens it. These results imply the necessity of accurate representation of the cloud-radiative feedback strength for a short-term MJO forecast, and may be evidence to support the argument that this feedback is essential for the existence of MJO. Furthermore, this study demonstrates that the multi-reanalysis comparison of the analysis increment will provide useful information for identifying model biases and, potentially, for estimating parameters that are difficult to estimate solely from observation data, such as gross moist stability.
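
    For readers unfamiliar with the diagnostic, the analysis increment is simply the analysis minus the background forecast; a minimal sketch under that standard definition (array names are hypothetical placeholders, not the reanalysis products' variable names) is:

    ```python
    # Illustrative sketch: analysis increment = analysis - background forecast,
    # optionally composited by MJO phase for the kind of comparison described above.
    import numpy as np

    def analysis_increment(cwv_analysis, cwv_forecast):
        """Both inputs: arrays of shape (time, lat, lon); returns the increment field."""
        return np.asarray(cwv_analysis) - np.asarray(cwv_forecast)

    def composite_by_phase(increment, mjo_phase, n_phases=8):
        """Average the increment over all times falling in each MJO phase (1..n_phases)."""
        mjo_phase = np.asarray(mjo_phase)
        return {p: increment[mjo_phase == p].mean(axis=0) for p in range(1, n_phases + 1)}
    ```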

  6. Effects of frequency and duration on psychometric functions for detection of increments and decrements in sinusoids in noise.

    PubMed

    Moore, B C; Peters, R W; Glasberg, B R

    1999-12-01

    Psychometric functions for detecting increments or decrements in level of sinusoidal pedestals were measured for increment and decrement durations of 5, 10, 20, 50, 100, and 200 ms and for frequencies of 250, 1000, and 4000 Hz. The sinusoids were presented in background noise intended to mask spectral splatter. A three-interval, three-alternative procedure was used. The results indicated that, for increments, the detectability index d' was approximately proportional to delta I/I. For decrements, d' was approximately proportional to delta L. The slopes of the psychometric functions increased (indicating better performance) with increasing frequency for both increments and decrements. For increments, the slopes increased with increasing increment duration up to 200 ms at 250 and 1000 Hz, but at 4000 Hz they increased only up to 50 ms. For decrements, the slopes increased for durations up to 50 ms, and then remained roughly constant, for all frequencies. For a center frequency of 250 Hz, the slopes of the psychometric functions for increment detection increased with duration more rapidly than predicted by a "multiple-looks" hypothesis, i.e., more rapidly than the square root of duration, for durations up to 50 ms. For center frequencies of 1000 and 4000 Hz, the slopes increased less rapidly than predicted by a multiple-looks hypothesis, for durations greater than about 20 ms. The slopes of the psychometric functions for decrement detection increased with decrement duration at a rate slightly greater than the square root of duration, for durations up to 50 ms, at all three frequencies. For greater durations, the increase in slope was less than proportional to the square root of duration. The results were analyzed using a model incorporating a simulated auditory filter, a compressive nonlinearity, a sliding temporal integrator, and a decision device based on a template mechanism. The model took into account the effects of both the external noise and an assumed internal noise. The model was able to account for the major features of the data for both increment and decrement detection.
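
    The proportionality between d' and delta I/I can be turned into a predicted psychometric function for the three-alternative task via the standard m-AFC relation P(correct) = E[Phi(x + d')^(m-1)] with x ~ N(0, 1); the sketch below is illustrative only, with the slope constant treated as a free parameter rather than a value fitted in the study.

    ```python
    # Illustrative sketch (not the authors' model): predict 3-AFC percent correct
    # from an assumed linear law d' = k * (delta I / I).
    import numpy as np
    from scipy.stats import norm

    def pc_mafc(d_prime, m=3, n=20001):
        """Percent correct in an m-alternative forced-choice task for a given d'."""
        x = np.linspace(-8.0, 8.0, n)
        integrand = norm.pdf(x) * norm.cdf(x + d_prime) ** (m - 1)
        return float(np.sum(integrand) * (x[1] - x[0]))

    def psychometric(delta_I_over_I, slope_k=2.0):
        """slope_k is a hypothetical free parameter, not a fitted value."""
        return [round(pc_mafc(slope_k * w), 3) for w in delta_I_over_I]

    print(psychometric([0.1, 0.2, 0.4, 0.8]))   # chance level for d' = 0 would be 1/3
    ```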

  7. Local short-term variability in solar irradiance

    NASA Astrophysics Data System (ADS)

    Lohmann, Gerald M.; Monahan, Adam H.; Heinemann, Detlev

    2016-05-01

    Characterizing spatiotemporal irradiance variability is important for the successful grid integration of increasing numbers of photovoltaic (PV) power systems. Using 1 Hz data recorded by as many as 99 pyranometers during the HD(CP)2 Observational Prototype Experiment (HOPE), we analyze field variability of clear-sky index k* (i.e., irradiance normalized to clear-sky conditions) and sub-minute k* increments (i.e., changes over specified intervals of time) for distances between tens of meters and about 10 km. By means of a simple classification scheme based on k* statistics, we identify overcast, clear, and mixed sky conditions, and demonstrate that the last of these is the most potentially problematic in terms of short-term PV power fluctuations. Under mixed conditions, the probability of relatively strong k* increments of ±0.5 is approximately twice as high compared to increment statistics computed without conditioning by sky type. Additionally, spatial autocorrelation structures of k* increment fields differ considerably between sky types. While the profiles for overcast and clear skies mostly resemble the predictions of a simple model published by , this is not the case for mixed conditions. As a proxy for the smoothing effects of distributed PV, we finally show that spatial averaging mitigates variability in k* less effectively than variability in k* increments, for a spatial sensor density of 2 km-2.
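
    A minimal sketch of the two basic quantities analyzed above, the clear-sky index k* and its increments over a chosen lag, assuming 1 Hz irradiance samples and clear-sky model values supplied from elsewhere (the synthetic data below are only for demonstration):

    ```python
    # Illustrative sketch: clear-sky index and its increments over a chosen time lag.
    import numpy as np

    def clear_sky_index(ghi, ghi_clear):
        ghi = np.asarray(ghi, dtype=float)
        ghi_clear = np.asarray(ghi_clear, dtype=float)
        return np.divide(ghi, ghi_clear, out=np.zeros_like(ghi), where=ghi_clear > 0)

    def kstar_increments(kstar, lag_seconds, dt_seconds=1):
        """Changes of k* over the given lag, for 1 Hz data by default."""
        step = int(lag_seconds / dt_seconds)
        return kstar[step:] - kstar[:-step]

    # Example: probability of strong fluctuations |delta k*| >= 0.5 (synthetic data).
    rng = np.random.default_rng(1)
    k = clear_sky_index(rng.uniform(0, 1000, 3600), np.full(3600, 800.0))
    dk = kstar_increments(k, lag_seconds=60)
    print(np.mean(np.abs(dk) >= 0.5))
    ```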

  8. Checking the Grammar Checker: Integrating Grammar Instruction with Writing.

    ERIC Educational Resources Information Center

    McAlexander, Patricia J.

    2000-01-01

    Notes Rei Noguchi's recommendation of integrating grammar instruction with writing instruction and teaching only the most vital terms and the most frequently made errors. Presents a project that provides a review of the grammar lessons, applies many grammar rules specifically to the students' writing, and teaches students the effective use of the…

  9. Conceptual Integration of Chemical Equilibrium by Prospective Physical Sciences Teachers

    ERIC Educational Resources Information Center

    Ganaras, Kostas; Dumon, Alain; Larcher, Claudine

    2008-01-01

    This article describes an empirical study concerning the mastering of the chemical equilibrium concept by prospective physical sciences teachers. The main objective was to check whether the concept of chemical equilibrium had become an integrating and unifying concept for them, that is to say an operational and functional knowledge to explain and…

  10. How do initial signals of quality influence the diffusion of new medical products? The case of new cancer drug treatments.

    PubMed

    Conti, Rena M; Bernstein, Arielle; Meltzer, David O

    2012-01-01

    Objective measures of a new treatment's expected ability to improve patients' health are presumed to be significant factors influencing physicians' treatment decisions. Physicians' behavior may also be influenced by their patients' disease severity and insurance reimbursement policies, firm promotional activities and public media reports. This chapter examines how objective evidence of the incremental effectiveness of novel drugs to treat cancer ("chemotherapies") impacts the rate at which physicians' adopt these treatments into practice, holding constant other factors. The novelty of the analysis resides in the dataset and estimation strategy employed. Data is derived from a United States population-based chemotherapy order entry system, IntrinsiQ Intellidose. Quality/price endogeneity is overcome by employing sample selection methods and an estimation strategy that exploits quality variation at the molecule-indication level. Pooled diffusion rates across molecule-indication pairs are estimated using nonparametric hazard models. Results suggest incremental effectiveness is negatively and nonsignificantly associated with the diffusion of new chemotherapies; faster rates of diffusion are positively and significantly related to low five-year survival probabilities and measures of perceived clinical significance. Results are robust to numerous specification checks, including a measure of alternative therapeutic availability. We discuss the magnitude and potential direction of bias introduced by several threats to internal validity. Evidence of incremental effectiveness does not appear to motivate the rate of specialty physician diffusion of new medical treatment; in all models high risk of disease mortality and perceptions of therapeutic quality are significant drivers of physician use of novel chemotherapies. Understanding the rate of technological advance across different clinical settings, as well as the product-, provider-, and patient-level determinants of this rate, is an important subject for future research.

  11. Density variations and their influence on carbon stocks: case-study on two Biosphere Reserves in the Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    De Ridder, Maaike; De Haulleville, Thalès; Kearsley, Elizabeth; Van den Bulcke, Jan; Van Acker, Joris; Beeckman, Hans

    2014-05-01

    It is commonly acknowledged that allometric equations for aboveground biomass and carbon stock estimates are improved significantly if density is included as a variable. However, not much attention is given to this variable in terms of exact, measured values and density profiles from pith to bark. Most published case studies obtain density values from literature sources or databases, thereby using large ranges of density values and possibly causing significant errors in carbon stock estimates. The use of one single fixed value for density is also not recommended if carbon stock increments are estimated. Therefore, our objective is to measure and analyze a large number of tree species occurring in two Biosphere Reserves (Luki and Yangambi). Nevertheless, the diversity of tree species in these tropical forests is too high to perform this kind of detailed analysis on all tree species (> 200/ha). Therefore, we focus on the most frequently encountered tree species with high abundance (trees/ha) and dominance (basal area/ha) for this study. Increment cores were scanned with a helical X-ray protocol to obtain density profiles from pith to bark. This way, we aim at dividing the tree species with a distinct type of density profile into separate groups. If, e.g., slopes in density values from pith to bark remain stable over larger samples of one tree species, this slope could also be used to correct for errors in carbon (increment) estimates caused by density values from simplified density measurements or from the literature. In summary, this is most likely the first study in the Congo Basin that focuses on density patterns in order to check their influence on carbon stocks and on differences in carbon stocking based on species composition (density profiles ~ temperament of tree species).

  12. Multi-agent planning and scheduling, execution monitoring and incremental rescheduling: Application to motorway traffic

    NASA Technical Reports Server (NTRS)

    Mourou, Pascal; Fade, Bernard

    1992-01-01

    This article describes a planning method applicable to agents with great perception and decision-making capabilities and the ability to communicate with other agents. Each agent has a task to fulfill, allowing for the actions of other agents in its vicinity. Certain simultaneous actions may cause conflicts because they require the same resource. The agent plans each of its actions and simultaneously transmits these to its neighbors. In a similar way, it receives plans from the other agents and must take account of these plans. The planning method allows us to build a distributed scheduling system. Here, the agents are robot vehicles on a highway communicating by radio. In this environment, conflicts between agents concern the allocation of space in time and are connected with the inertia of the vehicles. Each vehicle performs temporal, spatial, and situated reasoning in order to drive without collision. The flexibility and reactivity of the method presented here allow the agent to generate its plan based on assumptions concerning the other agents and then check these assumptions progressively as plans are received from the other agents. Multi-agent execution monitoring of these plans can be done using data generated during planning and the multi-agent decision-making algorithm described here. A selective backtrack allows us to perform incremental rescheduling.

  13. Age estimation from dental cementum incremental lines and periodontal disease.

    PubMed

    Dias, P E M; Beaini, T L; Melani, R F H

    2010-12-01

    Age estimation by counting incremental lines in cementum, added to the average age of tooth eruption, is considered an accurate method by some authors, while others reject it, stating a weak correlation between estimated and actual age. The aim of this study was to evaluate this technique and check the influence of periodontal disease on age estimates by analyzing both the number of cementum lines and the correlation between cementum thickness and actual age on freshly extracted teeth. Thirty-one undecalcified ground cross-sections of approximately 30 µm, from 25 teeth, were prepared, observed, photographed and measured. Images were enhanced by software and counts were made by one observer, and the results compared with those of two control observers. There was moderate correlation (r = 0.58) for the entire sample, with a mean error of 9.7 years. For teeth with periodontal pathologies, the correlation was 0.03 with a mean error of 22.6 years. For teeth without periodontal pathologies, the correlation was 0.74 with a mean error of 1.6 years. There was a correlation of 0.69 between cementum thickness and known age for the entire sample, 0.25 for teeth with periodontal problems and 0.75 for teeth without periodontal pathologies. The technique was reliable for periodontally sound teeth, but not for periodontally diseased teeth.

  14. Metrology Laboratory | Energy Systems Integration Facility | NREL

    Science.gov Websites

    Capabilities include measurement of spectral reflectance and transmission of materials (functional check only) and calibration of pyrheliometers, pyranometers, and pyrgeometers. The Metrology Laboratory provides National Institute of Standards and Technology (NIST)-traceable calibrations.

  15. [Design and implementation of data checking system for Chinese materia medica resources survey].

    PubMed

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To support data rechecking, reduce the internal workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica has designed a data checking system for the Chinese materia medica resources survey based on J2EE technology, the Java language, and an Oracle database, in accordance with the SOA framework. It includes single-record checking, check scoring, and content management, and checks the survey data with both manual and automatic checks covering nine aspects (census implementation plans, key research information, general survey information, cultivation of medicinal materials, germplasm resources, medicinal material information, market research information, traditional knowledge, and specimen information), organized into 20 classes and 175 indicators assessed for both quantity and quality. The established system helps ensure data consistency and accuracy and prompts county survey teams to complete data entry and organization in a timely manner, thereby improving the integrity, consistency and accuracy of the survey data and ensuring the data are valid and available. This lays a foundation for accurate data support of the national CMMR survey results summary, results display and sharing. Copyright© by the Chinese Pharmaceutical Association.
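
    A minimal sketch, not the actual J2EE system, of the kind of record-level completeness and consistency rules such a checking system applies; all field names and thresholds below are hypothetical:

    ```python
    # Illustrative sketch: simple completeness/consistency checks over one survey record.
    def check_record(record):
        findings = []
        # completeness: required indicators must be present and non-empty
        for field in ("county_code", "species_name", "survey_date", "specimen_id"):
            if not record.get(field):
                findings.append(("error", f"missing required field: {field}"))
        # consistency: cultivated area cannot exceed the surveyed plot area
        if record.get("cultivated_area_mu", 0) > record.get("plot_area_mu", float("inf")):
            findings.append(("error", "cultivated area exceeds plot area"))
        # plausibility: coordinates should fall inside a rough bounding box for China
        lon, lat = record.get("longitude"), record.get("latitude")
        if lon is not None and lat is not None and not (73 <= lon <= 135 and 18 <= lat <= 54):
            findings.append(("warning", "coordinates outside expected range"))
        return findings

    print(check_record({"county_code": "110101", "species_name": "Glycyrrhiza uralensis",
                        "survey_date": "2017-06-01", "specimen_id": "",
                        "longitude": 150.0, "latitude": 40.0}))
    ```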

  16. [Advanced information technologies for financial services industry]. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The project scope is to develop an advanced user interface utilizing speech and/or handwriting recognition technology that will improve the accuracy and speed of recording transactions in the dynamic environment of a foreign exchange (FX) trading floor. The project's desired result is to improve the base technology for traders' workstations on FX trading floors. Improved workstation effectiveness will allow vast amounts of complex information and events to be presented and analyzed, thus increasing the volume of money and other assets to be exchanged at an accelerated rate. The project scope is to develop and demonstrate technologies that advance interbank check imaging and paper check truncation. The following describes the tasks to be completed: (1) Identify the economic value case, the legal and regulatory issues, the business practices that are affected, and the effects upon settlement. (2) Familiarization with existing imaging technology. Develop requirements for image quality, security, and interoperability. Adapt existing technologies to meet requirements. (3) Define requirements for the imaging laboratory and design its architecture. Integrate and test technology from task 2 with equipment in the laboratory. (4) Develop and/or integrate and test remaining components; includes security, storage, and communications. (5) Build a prototype system and test in a laboratory. Install and run in two or more banks. Develop documentation. Conduct training. The project's desired result is to enable a proof-of-concept trial in which multiple banks will exchange check images, exhibiting operating conditions which a check experiences as it travels through the payments/clearing system. The trial should demonstrate the adequacy of digital check images instead of paper checks.

  17. Path-integral method for the source apportionment of photochemical pollutants

    NASA Astrophysics Data System (ADS)

    Dunker, A. M.

    2015-06-01

    A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions (CAMx) is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using three or four points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.
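
    The numerical core of the method can be sketched compactly (a toy stand-in for the photochemical model and its DDM sensitivities, not the CAMx implementation): the concentration difference between two emission scenarios is recovered as a Gauss-Legendre quadrature of the first-order sensitivities along a path in emissions space, and the per-source terms of that quadrature are the source contributions.

    ```python
    # Illustrative numerical sketch of the path-integral apportionment idea.
    import numpy as np

    def toy_concentration(E):                    # placeholder nonlinear model C(E)
        return E[0] ** 2 + 2.0 * E[0] * E[1] + 0.5 * E[1]

    def toy_sensitivities(E):                    # dC/dE_j (would come from DDM in practice)
        return np.array([2.0 * E[0] + 2.0 * E[1], 2.0 * E[0] + 0.5])

    def path_apportionment(E_background, E_full, npoints=3):
        nodes, weights = np.polynomial.legendre.leggauss(npoints)   # nodes on [-1, 1]
        t = 0.5 * (nodes + 1.0)                                      # map to [0, 1]
        w = 0.5 * weights
        dE = E_full - E_background                                   # proportional path
        contributions = np.zeros_like(dE)
        for ti, wi in zip(t, w):
            E = E_background + ti * dE
            contributions += wi * toy_sensitivities(E) * dE          # per-source share
        return contributions

    Eb, Ef = np.array([1.0, 2.0]), np.array([3.0, 5.0])
    contrib = path_apportionment(Eb, Ef)
    # The summed contributions reproduce the concentration difference between scenarios.
    print(contrib, contrib.sum(), toy_concentration(Ef) - toy_concentration(Eb))
    ```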

  18. Path-integral method for the source apportionment of photochemical pollutants

    NASA Astrophysics Data System (ADS)

    Dunker, A. M.

    2014-12-01

    A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using 3 or 4 points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.

  19. An efficient numerical algorithm for transverse impact problems

    NASA Technical Reports Server (NTRS)

    Sankar, B. V.; Sun, C. T.

    1985-01-01

    Transverse impact problems in which the elastic and plastic indentation effects are considered, involve a nonlinear integral equation for the contact force, which, in practice, is usually solved by an iterative scheme with small increments in time. In this paper, a numerical method is proposed wherein the iterations of the nonlinear problem are separated from the structural response computations. This makes the numerical procedures much simpler and also efficient. The proposed method is applied to some impact problems for which solutions are available, and they are found to be in good agreement. The effect of the magnitude of time increment on the results is also discussed.
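
    For contrast with the proposed algorithm, the conventional scheme it improves on can be sketched as follows (a single-degree-of-freedom stand-in for the structure, with an assumed Hertzian contact law F = k_c * alpha^(3/2); all parameter values are arbitrary): at every small time increment the nonlinear contact force is found by fixed-point iteration coupled to the structural update.

    ```python
    # Schematic sketch only (not the authors' algorithm): iterative contact force at
    # each small time increment, with the target reduced to a spring-mass oscillator.
    m_i, m_t, k_t, k_c = 0.1, 1.0, 1.0e4, 1.0e6     # impactor/target mass, stiffnesses
    dt, nsteps = 1.0e-5, 2000
    xi, vi = 0.0, 1.0                               # impactor position, velocity
    xt, vt = 0.0, 0.0                               # target position, velocity
    F, history = 0.0, []
    for _ in range(nsteps):
        # iterate until the contact force is consistent with the predicted indentation
        for _ in range(50):
            ai = -F / m_i                           # impactor decelerated by contact
            at = (F - k_t * xt) / m_t               # target driven by contact and spring
            alpha = max((xi + vi * dt + 0.5 * ai * dt**2) -
                        (xt + vt * dt + 0.5 * at * dt**2), 0.0)
            F_new = k_c * alpha ** 1.5              # Hertzian contact law
            if abs(F_new - F) < 1e-9 * max(F_new, 1.0):
                F = F_new
                break
            F = 0.5 * (F + F_new)                   # damped update for stability
        xi, vi = xi + vi * dt + 0.5 * ai * dt**2, vi + ai * dt
        xt, vt = xt + vt * dt + 0.5 * at * dt**2, vt + at * dt
        history.append(F)
    print(max(history))                              # peak contact force
    ```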

  20. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  1. CCM-C,Collins checks the middeck experiment

    NASA Image and Video Library

    1999-07-24

    S93-E-5016 (23 July 1999) --- Astronaut Eileen M. Collins, mission commander, checks on an experiment on Columbia's middeck during Flight Day 1 activity. The experiment is called the Cell Culture Model, Configuration C. Objectives of it are to validate cell culture models for muscle, bone and endothelial cell biochemical and functional loss induced by microgravity stress; to evaluate cytoskeleton, metabolism, membrane integrity and protease activity in target cells; and to test tissue loss pharmaceuticals for efficacy. The photo was recorded with an electronic still camera (ESC).

  2. KSC-00padig103

    NASA Image and Video Library

    2000-11-28

    STS-97 Mission Specialist Joseph Tanner gets help with his boots from suit technician Erin Canlon during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.

  3. NASA Dryden technicians work on a fit-check mockup in preparation for systems installation work on an Orion boilerplate crew capsule for launch abort testing.

    NASA Image and Video Library

    2008-01-24

    NASA Dryden technicians work on a fit-check mockup in preparation for systems installation work on an Orion boilerplate crew capsule for launch abort testing. A mockup Orion crew module has been constructed by NASA Dryden Flight Research Center's Fabrication Branch. The mockup is being used to develop integration procedures for avionics and instrumentation in advance of the arrival of the first abort flight test article.

  4. NASA Dryden technicians take measurements inside a fit-check mockup prior to systems installation on a boilerplate Orion launch abort test crew capsule.

    NASA Image and Video Library

    2008-01-24

    NASA Dryden technicians take measurements inside a fit-check mockup prior to systems installation on a boilerplate Orion launch abort test crew capsule. A mockup Orion crew module has been constructed by NASA Dryden Flight Research Center's Fabrication Branch. The mockup is being used to develop integration procedures for avionics and instrumentation in advance of the arrival of the first abort flight test article.

  5. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
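
    The consistency-check idea can be sketched in a few lines (illustrative only; the actual monitor's test statistic and threshold derivation are more involved): elevations synthesized from LiDAR/IMU/GPS along the flight path are differenced against the stored database, and a disparity statistic is compared with a threshold chosen for an acceptable false-alarm rate.

    ```python
    # Illustrative sketch: flag the terrain database when the disparity between
    # sensor-synthesized and stored elevations exceeds a chosen threshold.
    import numpy as np

    def integrity_test(synthesized_profile, database_profile, threshold_m=30.0):
        """Both inputs: terrain elevations (m) sampled at the same along-track points."""
        disparity = np.asarray(synthesized_profile) - np.asarray(database_profile)
        test_statistic = np.abs(disparity.mean())      # mean disparity as the statistic
        return test_statistic, test_statistic <= threshold_m

    rng = np.random.default_rng(2)
    truth = 1000.0 + 50.0 * np.sin(np.linspace(0, 3, 200))
    stat, ok = integrity_test(truth + rng.normal(0, 5, 200), truth)   # consistent case
    print(round(stat, 2), ok)
    ```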

  6. Communication Satellite Payload Special Check out Equipment (SCOE) for Satellite Testing

    NASA Astrophysics Data System (ADS)

    Subhani, Noman

    2016-07-01

    This paper presents Payload Special Check out Equipment (SCOE) for the test and measurement of a communication satellite payload at subsystem and system level. The main emphasis of this paper is to demonstrate the principal test equipment, instruments and the payload test matrix for automatic test control. Electrical Ground Support Equipment (EGSE)/Special Check out Equipment (SCOE) requirements, functions and architecture for C-band and Ku-band payloads are presented in detail, along with their interface with the satellite during different phases of satellite testing. It provides a test setup, in a single rack cabinet, that can easily be moved from the payload assembly and integration environment to the thermal vacuum chamber all the way to the launch site (for pre-launch test and verification).

  7. Manipulation of Microenvironment with a Built-in Electrochemical Actuator in Proximity of a Dissolved Oxygen Microsensor

    NASA Technical Reports Server (NTRS)

    Kim, Chang-Soo; Lee, Cae-Hyang; Fiering, Jason O.; Ufer, Stefan; Scarantino, Charles W.; Nagle, H. Troy

    2004-01-01

    Biochemical sensors for continuous monitoring require dependable periodic self-diagnosis with acceptable simplicity to check their functionality during operation. An in situ self-diagnostic technique for a dissolved oxygen microsensor is proposed in an effort to devise an intelligent microsensor system with an integrated electrochemical actuation electrode. With a built-in platinum microelectrode that surrounds the microsensor, two kinds of microenvironments, called the oxygen-saturated or oxygen-depleted phases, can be created by water electrolysis depending on the polarity. The functionality of the microsensor can be checked during these microenvironment phases. The polarographic oxygen microsensor is fabricated on a flexible polyimide substrate (Kapton) and the feasibility of the proposed concept is demonstrated in a physiological solution. The sensor responds properly during the oxygen-generating and oxygen-depleting phases. The use of these microenvironments for in situ self-calibration is discussed to achieve functional integration as well as structural integration of the microsensor system.

  8. Software Productivity of Field Experiments Using the Mobile Agents Open Architecture with Workflow Interoperability

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Lowry, Michael R.; Nado, Robert Allen; Sierhuis, Maarten

    2011-01-01

    We analyzed a series of ten systematically developed surface exploration systems that integrated a variety of hardware and software components. Design, development, and testing data suggest that incremental buildup of an exploration system for long-duration capabilities is facilitated by an open architecture with appropriate-level APIs, specifically designed to facilitate integration of new components. This improves software productivity by reducing changes required for reconfiguring an existing system.

  9. Processing structure in language and music: a case for shared reliance on cognitive control.

    PubMed

    Slevc, L Robert; Okada, Brooke M

    2015-06-01

    The relationship between structural processing in music and language has received increasing interest in the past several years, spurred by the influential Shared Syntactic Integration Resource Hypothesis (SSIRH; Patel, Nature Neuroscience, 6, 674-681, 2003). According to this resource-sharing framework, music and language rely on separable syntactic representations but recruit shared cognitive resources to integrate these representations into evolving structures. The SSIRH is supported by findings of interactions between structural manipulations in music and language. However, other recent evidence suggests that such interactions also can arise with nonstructural manipulations, and some recent neuroimaging studies report largely nonoverlapping neural regions involved in processing musical and linguistic structure. These conflicting results raise the question of exactly what shared (and distinct) resources underlie musical and linguistic structural processing. This paper suggests that one shared resource is prefrontal cortical mechanisms of cognitive control, which are recruited to detect and resolve conflict that occurs when expectations are violated and interpretations must be revised. By this account, musical processing involves not just the incremental processing and integration of musical elements as they occur, but also the incremental generation of musical predictions and expectations, which must sometimes be overridden and revised in light of evolving musical input.

  10. The cost effectiveness of integrated care for people living with HIV including antiretroviral treatment in a primary health care centre in Bujumbura, Burundi.

    PubMed

    Renaud, A; Basenya, O; de Borman, N; Greindl, I; Meyer-Rath, G

    2009-11-01

    The incremental cost effectiveness of an integrated care package (i.e., medical care including antiretroviral therapy (ART) and other services such as psychological and social support) for people living with HIV/AIDS was calculated in a not-for-profit primary health care centre in Bujumbura run by the Society of Women against AIDS-Burundi (SWAA-Burundi), an African non-governmental organisation (NGO). Results are expressed as a cost-effectiveness ratio in constant 2007 US$ per disability-adjusted life year (DALY) averted. Unit costs are estimated from the NGO's accounting data and activity reports, and healthcare utilisation is estimated from the medical records of a cohort of 149 patients. Effectiveness is modelled on the survival of this cohort, using standard calculation methods. The incremental cost of integrated care for people living with HIV/AIDS in the Bujumbura health centre of SWAA-Burundi is 258 USD per DALY averted. The package of care provided by SWAA-Burundi is therefore a very cost-effective intervention in comparison with other interventions against HIV/AIDS that include ART. It is, however, less cost-effective than other types of interventions against HIV/AIDS, such as preventive activities.
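
    The reported metric is an incremental cost-effectiveness ratio, i.e. the incremental cost divided by the DALYs averted; a short worked example (all numbers other than the 258 US$/DALY result are hypothetical) is:

    ```python
    # Worked example of the metric only; the inputs below are hypothetical.
    incremental_cost_usd = 258_000.0   # hypothetical programme cost above the comparator
    dalys_averted = 1_000.0            # hypothetical effectiveness from the survival model
    icer = incremental_cost_usd / dalys_averted
    print(f"ICER = {icer:.0f} US$ per DALY averted")   # matches the reported 258 US$/DALY
    ```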

  11. Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis

    NASA Technical Reports Server (NTRS)

    Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.

    2012-01-01

    Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.

  12. Effects of surround articulation on lightness depend on the spatial arrangement of the articulated region

    NASA Astrophysics Data System (ADS)

    Zemach, Iris K.; Rudd, Michael E.

    2007-07-01

    We investigated the effect of surround articulation on the perceived lightness of a target disk. Surround articulation was manipulated by varying either the number of wedges in a surround consisting of wedges of alternating luminance or the number of checks in a surround consisting of a radial checkerboard pattern. In most conditions, increased articulation caused incremental targets to appear lighter and decremental targets to appear darker. But increasing the surround articulation in a way that did not increase the number of target-coaligned edges in the display did not affect the target lightness. We propose that the effects of surround articulation depend on the relationship between the orientations and contrast polarities of the target edges and those of edges present within the surround.

  13. Cuba: Multidimensional numerical integration library

    NASA Astrophysics Data System (ADS)

    Hahn, Thomas

    2016-08-01

    The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods; all can integrate vector integrands and have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is very similar, making it easy to cross-check results by substituting one method for another. For further safeguarding, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
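
    The cross-checking idea, one method substituted for another under a common calling pattern, can be illustrated with plain NumPy (this is deliberately not the Cuba API; it only mirrors the practice of comparing two independent estimates of the same integral):

    ```python
    # Illustrative sketch: two independent estimates of the same unit-cube integral.
    import numpy as np

    def integrand(x):                      # separable test integrand, exact value = 1
        return np.prod(2.0 * x, axis=-1)

    def monte_carlo(f, ndim, n=200_000, seed=0):
        x = np.random.default_rng(seed).random((n, ndim))
        vals = f(x)
        return vals.mean(), vals.std(ddof=1) / np.sqrt(n)

    def midpoint_grid(f, ndim, points_per_dim=40):
        g = (np.arange(points_per_dim) + 0.5) / points_per_dim
        mesh = np.stack(np.meshgrid(*([g] * ndim), indexing="ij"), axis=-1)
        return f(mesh).mean(), None

    for name, (val, err) in {"Monte Carlo": monte_carlo(integrand, 3),
                             "midpoint grid": midpoint_grid(integrand, 3)}.items():
        print(name, round(val, 4), err)    # agreement within error is the cross-check
    ```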

  14. Ada and the rapid development lifecycle

    NASA Technical Reports Server (NTRS)

    Deforrest, Lloyd; Gref, Lynn

    1991-01-01

    JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System utilizing an incremental delivery methodology called the Rapid Development Methodology with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Approach, JPL was able to deliver capability to the military user incrementally, with comparable quality and improved economies of projects developed under more traditional software intensive system implementation methodologies.

  15. Ensuring the Quality of Data Packages in the LTER Network Provenance Aware Synthesis Tracking Architecture Data Management System and Archive

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; O'Brien, M.; Costa, D.

    2013-12-01

    Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data-type errors occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by its 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deployment into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
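
    A minimal sketch of the conditional-check pattern described above (not the Data Manager Library itself): each check returns 'valid', 'warning' or 'error', a configuration can switch checks on or off and set severities, and any 'error' blocks the upload. All names and fields below are hypothetical.

    ```python
    # Illustrative sketch: configurable conditional quality checks on a data package.
    CONFIG = {
        "title_present":    {"on": True, "severity": "error"},
        "abstract_present": {"on": True, "severity": "warning"},
        "row_count_match":  {"on": True, "severity": "error"},
    }

    def run_checks(metadata, data_rows):
        checks = {
            "title_present":    bool(metadata.get("title")),
            "abstract_present": bool(metadata.get("abstract")),
            # metadata-data congruence: declared record count matches the data entity
            "row_count_match":  metadata.get("numberOfRecords") == len(data_rows),
        }
        report, blocked = [], False
        for name, passed in checks.items():
            conf = CONFIG[name]
            if not conf["on"]:
                continue
            status = "valid" if passed else conf["severity"]
            blocked = blocked or status == "error"
            report.append((name, status))
        return report, blocked      # an 'error' blocks upload into the archive

    print(run_checks({"title": "Stream chemistry", "numberOfRecords": 3}, [1, 2, 3]))
    ```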

  16. A large scale software system for simulation and design optimization of mechanical systems

    NASA Technical Reports Server (NTRS)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  17. Department of Defense Travel Reengineering Pilot Report to Congress

    DTIC Science & Technology

    1997-06-01

    Electronic Commerce/Electronic Data Interchange (EC/EDI) capabilities to integrate functions, automate edit checks for internal controls, and create user-friendly management tools at all levels of the process.

  18. Summary of Recent Research Accomplishment Onboard the International Space Station—Within the United States Orbital Segment

    NASA Astrophysics Data System (ADS)

    Jules, Kenol; Istasse, Eric; Stenuit, Hilde; Murakami, Keiji; Yoshizaki, Izumi; Johnson-Green, Perry

    2011-06-01

    November 20, 2010, marked a significant milestone in the annals of human endeavors in space, since it was the twelfth anniversary of one of the most challenging and complex construction projects ever attempted by humans away from our planet: the construction of the International Space Station. On November 20, 1998, the Zarya Control Module was launched. With this simple launch, almost unnoticed in the science community, the construction of a continuously staffed research platform in Low Earth Orbit was underway. This paper discusses the research that was performed by many occupants of this research platform during the year celebrating its twelfth anniversary. The main objectives of this paper are fourfold: (1) to discuss the integrated manner in which science planning/replanning and prioritization during the execution phase of an increment is carried out across the United States Orbital Segment, since that segment is made up of four independent space agencies; (2) to discuss and summarize the research that was performed during increments 16 and 17 (October 2007 to October 2008). The discussion for these two increments is primarily focused on the main objectives of each investigation and its associated hypotheses. Whenever available and approved, preliminary research results are also discussed for each of the investigations performed during these two increments; (3) to compare the planned research portfolio for these two increments with what was actually accomplished during the execution phase, in order to discuss the challenges associated with planning and performing research in a space laboratory located over 240 miles up in space, away from the ground support team; (4) to briefly touch on the research portfolio of increments 18 and 19/20 as the International Space Station begins its next decade in Low Earth Orbit.

  19. Endogenous-cue prospective memory involving incremental updating of working memory: an fMRI study.

    PubMed

    Halahalli, Harsha N; John, John P; Lukose, Ammu; Jain, Sanjeev; Kutty, Bindu M

    2015-11-01

    Prospective memory paradigms are conventionally classified on the basis of event-, time-, or activity-based intention retrieval. In the vast majority of such paradigms, intention retrieval is provoked by some kind of external event. However, prospective memory retrieval cues that prompt intention retrieval in everyday life are commonly endogenous, i.e., linked to a specific imagined retrieval context. We describe herein a novel prospective memory paradigm wherein the endogenous cue is generated by incremental updating of working memory, and investigated the hemodynamic correlates of this task. Eighteen healthy adult volunteers underwent functional magnetic resonance imaging while they performed a prospective memory task where the delayed intention was triggered by an endogenous cue generated by incremental updating of working memory. Working memory and ongoing task control conditions were also administered. The 'endogenous-cue prospective memory condition' with incremental working memory updating was associated with maximum activations in the right rostral prefrontal cortex, and additional activations in the brain regions that constitute the bilateral fronto-parietal network, central and dorsal salience networks as well as cerebellum. In the working memory control condition, maximal activations were noted in the left dorsal anterior insula. Activation of the bilateral dorsal anterior insula, a component of the central salience network, was found to be unique to this 'endogenous-cue prospective memory task' in comparison to previously reported exogenous- and endogenous-cue prospective memory tasks without incremental working memory updating. Thus, the findings of the present study highlight the important role played by the dorsal anterior insula in incremental working memory updating that is integral to our endogenous-cue prospective memory task.

  20. Patterns of oxygen consumption during simultaneously occurring elevated metabolic states in the viviparous snake Thamnophis marcianus.

    PubMed

    Jackson, Alexander G S; Leu, Szu-Yun; Ford, Neil B; Hicks, James W

    2015-11-01

    Snakes exhibit large factorial increments in oxygen consumption during digestion and physical activity, and long-lasting sub-maximal increments during reproduction. Under natural conditions, all three physiological states may occur simultaneously, but the integrated response is not well understood. Adult male and female checkered gartersnakes (Thamnophis marcianus) were used to examine increments in oxygen consumption (i.e. V̇(O2)) and carbon dioxide production (i.e. V̇(CO2)) associated with activity (Act), digestion (Dig) and post-prandial activity (Act+Dig). For females, we carried out these trials in the non-reproductive state, and also during the vitellogenic (V) and embryogenic (E) phases of a reproductive cycle. Endurance time (i.e. time to exhaustion, TTE) was recorded for all groups during Act and Act+Dig trials. Our results indicate that male and non-reproductive female T. marcianus exhibit significant increments in V̇(O2) during digestion (∼5-fold) and activity (∼9-fold), and that Act+Dig results in a similar increment in V̇(O2) (∼9- to 10-fold). During reproduction, resting V̇(O2) increased by 1.6- to 1.7-fold, and peak increments during digestion were elevated by 30-50% above non-reproductive values, but values associated with Act and Act+Dig were not significantly different from non-reproductive values. During Act+Dig, endurance time remained similar for all of the groups in the present study. Overall, our results indicate that prioritization is the primary pattern of interaction in oxygen delivery exhibited by this species. We propose that the metabolic processes associated with digestion, and perhaps reproduction, are temporarily compromised during activity. © 2015. Published by The Company of Biologists Ltd.

  1. Dynamic modeling and simulation of an integral bipropellant propulsion double-valve combined test system

    NASA Astrophysics Data System (ADS)

    Chen, Yang; Wang, Huasheng; Xia, Jixia; Cai, Guobiao; Zhang, Zhenpeng

    2017-04-01

    For the pressure-reducing-regulator and check-valve double-valve combined test system in an integral bipropellant propulsion system, a system model is established from modular models of the typical components. The simulation covers the whole working process of a 9 MPa test, from startup to the rated working condition and finally to shutdown. Comparison of simulation results with test data shows that five working conditions (standby, startup, rated pressurization, shutdown and halt) and nine stages of the combined test system are comprehensively resolved, and that the valve-spool opening and closing details of the regulator and the two check valves are accurately reproduced. The simulation also clarifies two phenomena that the test data alone cannot resolve: the critical opening state, in which the check-valve spools alternately open slightly and re-close near their fully closed positions, and the pronounced flow-field temperature drop and rise in the pipeline network as helium flows through it. Moreover, simulation results that account for component wall heat transfer are closer to the test data than those obtained under an adiabatic-wall assumption and better reveal the dynamic characteristics of the system in the various working stages.

  2. Planning for Space Station Freedom laboratory payload integration

    NASA Technical Reports Server (NTRS)

    Willenberg, Harvey J.; Torre, Larry P.

    1989-01-01

    Space Station Freedom is being developed to support extensive missions involving microgravity research and applications. Requirements for on-orbit payload integration and the simultaneous payload integration of multiple mission increments will provide the stimulus to develop new streamlined integration procedures in order to take advantage of the increased capabilities offered by Freedom. The United States Laboratory and its user accommodations are described. The process of integrating users' experiments and equipment into the United States Laboratory and the Pressurized Logistics Modules is described. This process includes the strategic and tactical phases of Space Station utilization planning. The support that the Work Package 01 Utilization office will provide to the users and hardware developers, in the form of Experiment Integration Engineers, early accommodation assessments, and physical integration of experiment equipment, is described. Plans for integrated payload analytical integration are also described.

  3. Playing chess unconsciously.

    PubMed

    Kiesel, Andrea; Kunde, Wilfried; Pohl, Carsten; Berner, Michael P; Hoffmann, Joachim

    2009-01-01

    Expertise in a certain stimulus domain enhances perceptual capabilities. In the present article, the authors investigate whether expertise improves perceptual processing to an extent that allows complex visual stimuli to bias behavior unconsciously. Expert chess players judged whether a target chess configuration entailed a checking configuration. These displays were preceded by masked prime configurations that either represented a checking or a nonchecking configuration. Chess experts, but not novice chess players, revealed a subliminal response priming effect, that is, faster responding when prime and target displays were congruent (both checking or both nonchecking) rather than incongruent. Priming generalized to displays that were not used as targets, ruling out simple repetition priming effects. Thus, chess experts were able to judge unconsciously presented chess configurations as checking or nonchecking. A 2nd experiment demonstrated that experts' priming does not occur for simpler but uncommon chess configurations. The authors conclude that long-term practice prompts the acquisition of visual memories of chess configurations with integrated form-location conjunctions. These perceptual chunks enable complex visual processing outside of conscious awareness.

  4. Attitude Determination Algorithm based on Relative Quaternion Geometry of Velocity Incremental Vectors for Cost Efficient AHRS Design

    NASA Astrophysics Data System (ADS)

    Lee, Byungjin; Lee, Young Jae; Sung, Sangkyung

    2018-05-01

    A novel attitude determination method is investigated that is computationally efficient and implementable on low-cost sensors and embedded platforms. A recent result on attitude reference system design is adapted to further develop a three-dimensional attitude determination algorithm based on relative velocity incremental measurements. For this, velocity incremental vectors, computed respectively from the INS and GPS at different update rates, are compared to generate the filter measurement for attitude estimation. In the quaternion-based Kalman filter configuration, an Euler-like attitude perturbation angle is uniquely introduced to reduce the filter states and simplify the propagation processes. Furthermore, under a small-angle approximation between attitude update periods, it is shown that the reduced-order filter greatly simplifies the propagation processes. For performance verification, both simulation and experimental studies are completed. A low-cost MEMS IMU and a GPS receiver are employed for system integration, and comparison with the true trajectory or a high-grade navigation system demonstrates the performance of the proposed algorithm.
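
    To make the core idea concrete, the sketch below recovers a rotation from paired body-frame (INS) and navigation-frame (GPS) velocity-increment vectors by solving Wahba's problem with an SVD. This is an illustrative batch stand-in, not the paper's quaternion-based Kalman filter; the function name and the noise level are assumptions.

```python
# Illustrative sketch (not the paper's filter): recover a rotation matrix that
# maps body-frame velocity increments (from INS accelerometers) onto
# navigation-frame velocity increments (from GPS) via Wahba's problem and SVD.
import numpy as np

def attitude_from_velocity_increments(dv_body, dv_nav):
    """Estimate R (nav <- body) from paired velocity-increment vectors.

    dv_body, dv_nav: arrays of shape (N, 3); rows are matched increments.
    """
    B = dv_nav.T @ dv_body                      # attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    # Enforce a proper rotation (det = +1), per the standard SVD solution.
    D = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ D @ Vt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = np.radians(30.0)                        # true attitude: 30 deg about z
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    dv_body = rng.normal(size=(20, 3))
    dv_nav = dv_body @ R_true.T + 1e-3 * rng.normal(size=(20, 3))  # noisy "GPS"
    R_est = attitude_from_velocity_increments(dv_body, dv_nav)
    print(np.round(R_est - R_true, 3))          # residual is near zero
```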

  5. Comparative analysis of age dynamics of average values of body dimensions in children from birth to 7 years.

    PubMed

    Deryabin, Vasily E; Krans, Valentina M; Fedotova, Tatiana K

    2005-07-01

    Mean values of different body dimensions in different age cohorts of children make it possible to learn a lot about their dynamic changes. Their comparative analysis, as is usually practiced, in fact leads to a simple description of changes in measurement units (mm or cm) at the average level of some body dimension during a shorter or longer period of time. To estimate comparative intensity of the growth process of different body dimensions, the authors use the analogue of Mahalanobis distance, the so-called Kullback divergence (1967), which does not demand stability of dispersion or correlation coefficients of dimensions in compared cohorts of children. Most of the dimensions, excluding skinfolds, demonstrate growth dynamics with gradually reducing increments from birth to 7 years. Body length has the highest integrative increment, leg length about 94% of body length, body mass 77%, and trunk and extremities circumferences 56%. Skinfolds have a non-monotonic pattern of accumulated standardized increments with some increase until 1-2 years of age.

  6. The costs and cost-effectiveness of an integrated sepsis treatment protocol.

    PubMed

    Talmor, Daniel; Greenberg, Dan; Howell, Michael D; Lisbon, Alan; Novack, Victor; Shapiro, Nathan

    2008-04-01

    Sepsis is associated with high mortality and treatment costs. International guidelines recommend the implementation of integrated sepsis protocols; however, the true cost and cost-effectiveness of these are unknown. To assess the cost-effectiveness of an integrated sepsis protocol, as compared with conventional care. Prospective cohort study of consecutive patients presenting with septic shock and enrolled in the institution's integrated sepsis protocol. Clinical and economic outcomes were compared with a historical control cohort. Beth Israel Deaconess Medical Center. Overall, 79 patients presenting to the emergency department with septic shock in the treatment cohort and 51 patients in the control group. An integrated sepsis treatment protocol incorporating empirical antibiotics, early goal-directed therapy, intensive insulin therapy, lung-protective ventilation, and consideration for drotrecogin alfa and steroid therapy. In-hospital treatment costs were collected using the hospital's detailed accounting system. The cost-effectiveness analysis was performed from the perspective of the healthcare system using a lifetime horizon. The primary end point for the cost-effectiveness analysis was the incremental cost per quality-adjusted life year gained. Mortality in the treatment group was 20.3% vs. 29.4% in the control group (p = .23). Implementing an integrated sepsis protocol resulted in a mean increase in cost of approximately $8,800 per patient, largely driven by increased intensive care unit length of stay. Life expectancy and quality-adjusted life years were higher in the treatment group, by 0.78 years and 0.54 QALYs, respectively. The protocol was associated with an incremental cost of $11,274 per life-year saved and a cost of $16,309 per quality-adjusted life year gained. In patients with septic shock, an integrated sepsis protocol, although not cost-saving, appears to be cost-effective and compares very favorably to other commonly delivered acute care interventions.
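
    The reported ratios follow directly from the incremental cost and the incremental effectiveness; a rough check using the rounded figures quoted above (small discrepancies are due to rounding):

```latex
\[
\mathrm{ICER} = \frac{\Delta C}{\Delta E}
\quad\Rightarrow\quad
\frac{\$8{,}800}{0.78\ \text{life-years}} \approx \$11{,}300\ \text{per life-year},
\qquad
\frac{\$8{,}800}{0.54\ \text{QALYs}} \approx \$16{,}300\ \text{per QALY},
\]
```

    in line with the reported $11,274 per life-year and $16,309 per QALY.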

  7. Privacy-preserving public auditing for data integrity in cloud

    NASA Astrophysics Data System (ADS)

    Shaik Saleem, M.; Murali, M.

    2018-04-01

    Cloud computing, which has attracted extensive attention from the research community and from industry, provides a large pool of computing resources through virtualized sharing, including storage, processing power, applications and services. Cloud users are provided with these resources on demand. Because an outsourced file is stored in the databases of a third-party service provider, it can easily be tampered with, and the user retains no control over the data; there is therefore no inherent guarantee of the integrity of users' data, and providing such security assurance has become one of the primary concerns of cloud service providers. Cloud servers themselves do not accept responsibility for data loss and provide no security assurance to the cloud user. Remote data integrity checking (RDIC) allows a data owner to verify that a storage server is really storing the owner's data truthfully. The scheme considered here comprises a formal security model and ID-based RDIC, which addresses the security of each server and ensures the privacy of the cloud user's data against the third-party verifier. In general, clients could check the integrity of their cloud data themselves by running a two-party RDIC protocol, but in the two-party scenario a verification result coming from either the data owner or the cloud server may be considered biased. The public verifiability feature of RDIC instead gives every user the right to verify whether the original data have been modified. To ensure the transparency of publicly verifiable RDIC protocols, a third-party auditor (TPA) with the knowledge and capability to carry out this verification is assumed.
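
    The shape of such a challenge-response audit can be sketched as follows. Production RDIC and public-auditing schemes rely on homomorphic authenticators so the verifier never has to retrieve the data blocks; the toy sketch below instead returns the challenged blocks and per-block HMAC tags directly, purely to illustrate the owner-challenges / server-responds / verifier-checks flow. All names and the block size are illustrative.

```python
# Minimal challenge-response sketch of remote integrity spot-checking.
# Real RDIC / public-auditing schemes use homomorphic authenticators so that
# the verifier never downloads the data; here block+tag pairs are returned
# only to keep the example short.  All names are illustrative.
import hmac, hashlib, os, random

BLOCK = 4096

def tag(key, index, block):
    return hmac.new(key, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

def outsource(key, data):
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return blocks, [tag(key, i, b) for i, b in enumerate(blocks)]

def audit(key, blocks, tags, challenge):
    """Challenge a random subset of block indices; recompute and compare tags."""
    for i in challenge:
        if not hmac.compare_digest(tags[i], tag(key, i, blocks[i])):
            return False
    return True

if __name__ == "__main__":
    key = os.urandom(32)
    data = os.urandom(10 * BLOCK)
    blocks, tags = outsource(key, data)          # stored at the cloud server
    challenge = random.sample(range(len(blocks)), 3)
    print(audit(key, blocks, tags, challenge))   # True
    blocks[challenge[0]] = os.urandom(BLOCK)     # simulate tampering
    print(audit(key, blocks, tags, challenge))   # False
```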

  8. Integration, acceptance testing, and clinical operation of the Medical Information, Communication and Archive System, phase II.

    PubMed

    Smith, E M; Wandtke, J; Robinson, A

    1999-05-01

    The Medical Information, Communication and Archive System (MICAS) is a multivendor incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor open-system incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  9. Sediment and Particulate Organic Carbon (POC) flux changes over the past decades in the Yellow River system

    NASA Astrophysics Data System (ADS)

    Lu, Xixi; Ran, Lishan

    2015-04-01

    The Yellow River system used to deliver a very large sediment load to the ocean (around 1.5 Gt/yr in the 1950s) because of severe soil erosion on the Loess Plateau. However, its sediment export has declined to <0.25 Gt/yr in recent years (the 2000s), mainly due to human activities such as the construction of reservoirs and check dams and other soil and water conservation measures such as terracing and vegetation restoration. Such a drastic reduction in soil erosion and sediment flux, and consequently in the associated particulate organic carbon (POC) transport, can potentially play a significant role in carbon cycling. Through the sediment flux budget we examined the POC budget and carbon sequestration through vegetation restoration and the various soil and water conservation measures, including reservoir construction, over the past decades in the Yellow River system. Landsat imagery was used to delineate the reservoirs and check dams in order to estimate the sediment trapped. The reservoirs and check dams trapped a total of 0.94 Gt/yr of sediment, equivalent to 6.5 Mt C. Soil erosion control through vegetation restoration and terrace construction reduced soil erosion by 1.82 Gt/yr, equivalent to 12 Mt C. The annual NPP increased from 0.150 Gt C in 2000 to 0.1889 Gt C in 2010, an average increment of 3.4 Mt C per year over the recent decade (2000 to 2010), through vegetation restoration. The total carbon stabilized on slope systems through soil erosion control (12 Mt C per year) was much higher than the direct carbon sequestration via vegetation restoration (3.4 Mt C per year), indicating the importance of horizontal carbon mobilization in carbon cycling, albeit with a high estimation uncertainty.

  10. Cost-effectiveness of nivolumab for recurrent or metastatic head and neck cancer☆.

    PubMed

    Ward, Matthew C; Shah, Chirag; Adelstein, David J; Geiger, Jessica L; Miller, Jacob A; Koyfman, Shlomo A; Singer, Mendel E

    2017-11-01

    Nivolumab is the first drug to demonstrate a survival benefit for platinum-refractory recurrent or metastatic head and neck cancer. We performed a cost-utility analysis to assess the economic value of nivolumab as compared to alternative standard agents in this context. Using data from the CheckMate 141 trial, we constructed a Markov simulation model from the US payer's perspective to evaluate the cost-effectiveness of nivolumab compared to physician choice of either cetuximab, methotrexate or docetaxel. Alternative strategies considered included: single-agent cetuximab, methotrexate or docetaxel, or first testing for PD-L1 to select for nivolumab. Costs were extracted from Medicare and utilities from the literature and CheckMate. Probabilistic sensitivity analysis (PSA) was used to evaluate parameter uncertainty. $100,000/QALY was the primary threshold for cost-effectiveness. When comparing nivolumab to the standard arm of CheckMate, nivolumab demonstrated an incremental cost-effectiveness ratio (ICER) of $140,672/QALY. When comparing standard therapies, methotrexate was the most cost-effective with similar results for docetaxel. Nivolumab was cost-effective compared to single-agent cetuximab (ICER $89,786/QALY). Treatment selection by PD-L1 immunohistochemistry did not markedly improve the cost-effectiveness of nivolumab. Factors likely to positively impact the cost-effectiveness of nivolumab include better baseline quality-of-life, poor tolerability of standard treatments and/or a lower cost of nivolumab. Nivolumab is preferred to single-agent cetuximab but requires a willingness-to-pay of at least $150,000/QALY to be considered cost-effective when compared to docetaxel or methotrexate. Selection by PD-L1 does not markedly improve the cost-effectiveness of nivolumab. This informs patient selection and clinical care-path development. Copyright © 2017 Elsevier Ltd. All rights reserved.
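
    For readers unfamiliar with the method, the sketch below shows the mechanics of a cohort-level Markov model of the kind used in such analyses: a cohort is pushed through health states cycle by cycle, discounted costs and QALYs are accumulated for each strategy, and the ICER is the ratio of the differences. All transition probabilities, costs and utilities in the sketch are invented; they are not the CheckMate 141 inputs or the authors' model.

```python
# Toy three-state Markov cohort model (stable -> progressed -> dead) of the
# kind used in cost-utility analyses.  All probabilities, costs and utilities
# are invented for illustration; they are NOT the CheckMate 141 inputs.
import numpy as np

def run_cohort(trans, cycle_cost, cycle_utility, n_cycles, discount=0.03):
    state = np.array([1.0, 0.0, 0.0])           # whole cohort starts "stable"
    cost = qaly = 0.0
    for t in range(n_cycles):
        d = 1.0 / (1.0 + discount) ** t          # per-cycle discounting
        cost += d * float(state @ cycle_cost)
        qaly += d * float(state @ cycle_utility)
        state = state @ trans                    # advance one cycle
    return cost, qaly

trans_new = np.array([[0.80, 0.15, 0.05],        # hypothetical new therapy
                      [0.00, 0.75, 0.25],
                      [0.00, 0.00, 1.00]])
trans_std = np.array([[0.70, 0.20, 0.10],        # hypothetical comparator
                      [0.00, 0.70, 0.30],
                      [0.00, 0.00, 1.00]])
cost_new = np.array([60000.0, 20000.0, 0.0])     # per-cycle costs by state
cost_std = np.array([15000.0, 20000.0, 0.0])
utility = np.array([0.75, 0.50, 0.0])            # per-cycle QALY weights

c1, q1 = run_cohort(trans_new, cost_new, utility, n_cycles=20)
c0, q0 = run_cohort(trans_std, cost_std, utility, n_cycles=20)
print(f"ICER ~ ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```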

  11. Simple Check Valves for Microfluidic Devices

    NASA Technical Reports Server (NTRS)

    Willis, Peter A.; Greer, Harold F.; Smith, J. Anthony

    2010-01-01

    A simple design concept for check valves has been adopted for microfluidic devices that consist mostly of (1) deformable fluorocarbon polymer membranes sandwiched between (2) borosilicate float glass wafers into which channels, valve seats, and holes have been etched. The first microfluidic devices in which these check valves are intended to be used are micro-capillary electrophoresis (microCE) devices undergoing development for use on Mars in detecting compounds indicative of life. In this application, it will be necessary to store some liquid samples in reservoirs in the devices for subsequent laboratory analysis, and check valves are needed to prevent cross-contamination of the samples. The simple check-valve design concept is also applicable to other microfluidic devices and to fluidic devices in general. These check valves are simplified microscopic versions of conventional rubber-flap check valves that are parts of numerous industrial and consumer products. These check valves are fabricated, not as separate components, but as integral parts of microfluidic devices. A check valve according to this concept consists of suitably shaped portions of a deformable membrane and the two glass wafers between which the membrane is sandwiched (see figure). The valve flap is formed by making an approximately semicircular cut in the membrane. The flap is centered over a hole in the lower glass wafer, through which hole the liquid in question is intended to flow upward into a wider hole, channel, or reservoir in the upper glass wafer. The radius of the cut exceeds the radius of the hole by an amount large enough to prevent settling of the flap into the hole. As in a conventional rubber-flap check valve, back pressure in the liquid pushes the flap against the valve seat (in this case, the valve seat is the adjacent surface of the lower glass wafer), thereby forming a seal that prevents backflow.

  12. Incremental Adaptive Fuzzy Control for Sensorless Stroke Control of A Halbach-type Linear Oscillatory Motor

    NASA Astrophysics Data System (ADS)

    Lei, Meizhen; Wang, Liqiang

    2018-01-01

    The Halbach-type linear oscillatory motor (HT-LOM) is multi-variable, highly coupled, nonlinear and uncertain, and a satisfactory result is difficult to obtain with conventional PID control. An incremental adaptive fuzzy controller (IAFC) for stroke tracking was presented, which combines the merits of PID control, the fuzzy inference mechanism and the adaptive algorithm. An integral operation is added to the conventional fuzzy control algorithm. The fuzzy scale factor can be tuned online according to the load force and the stroke command. The simulation results indicate that the proposed control scheme can achieve satisfactory stroke tracking performance and is robust with respect to parameter variations and external disturbance.
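
    The incremental (velocity-form) update at the heart of such a controller can be sketched as follows. The fuzzy rule base of the IAFC is omitted here and replaced by a crude error-driven scale-factor adaptation, so this is only an illustration of the incremental-plus-adaptive structure; all gains, the adaptation law and the toy plant are assumptions.

```python
# Sketch of an incremental (velocity-form) PID update with a crude online
# output scale-factor adaptation.  The fuzzy rule base of the IAFC is omitted;
# the adaptation law, gains and toy plant below are all invented.
class IncrementalAdaptivePID:
    def __init__(self, kp, ki, kd, scale=1.0, adapt_rate=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.scale = scale              # online-tuned output scale factor
        self.adapt_rate = adapt_rate
        self.e1 = 0.0                   # error at step k-1
        self.e2 = 0.0                   # error at step k-2
        self.u = 0.0                    # last control output

    def step(self, setpoint, measurement):
        e = setpoint - measurement
        # Incremental PID: only the control *increment* is computed each step.
        du = (self.kp * (e - self.e1)
              + self.ki * e
              + self.kd * (e - 2.0 * self.e1 + self.e2))
        # Crude adaptation: raise the scale factor while the error is large,
        # let it shrink as the error dies out (kept positive by the floor).
        self.scale = max(0.1, self.scale + self.adapt_rate * (abs(e) - 1.0))
        self.u += self.scale * du
        self.e2, self.e1 = self.e1, e
        return self.u

# Track a constant stroke command on a toy first-order plant.
ctrl = IncrementalAdaptivePID(kp=0.8, ki=0.3, kd=0.05)
y = 0.0
for _ in range(300):
    u = ctrl.step(setpoint=10.0, measurement=y)
    y += 0.1 * (u - y)                  # toy plant dynamics
print(round(y, 2))                      # converges toward the 10.0 setpoint
```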

  13. STS-97 Mission Specialist Noriega during pre-pack and fit check

    NASA Technical Reports Server (NTRS)

    2000-01-01

    STS-97 Mission Specialist Carlos Noriega gets help with his boots from suit technician Shelly Grick-Agrella during pre-pack and fit check. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.

  14. KSC01pp0129

    NASA Image and Video Library

    2001-01-17

    Workers in the Payload Changeout Room check the U.S. Lab Destiny as it moves from Atlantis’ payload bay into the PCR. Destiny will remain in the PCR while Atlantis rolls back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster’s system tunnel. An extensive evaluation of NASA’s SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.

  15. KSC01pp0130

    NASA Image and Video Library

    2001-01-17

    Workers in the Payload Changeout Room check the U.S. Lab Destiny as it moves from Atlantis’ payload bay into the PCR. Destiny will remain in the PCR while Atlantis rolls back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster’s system tunnel. An extensive evaluation of NASA’s SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.

  16. Development of advanced Czochralski growth process to produce low-cost 150 kG silicon ingots from a single crucible for technology readiness

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The modified CG2000 crystal grower construction, installation, and machine checkout were completed. Process development checkout proceeded with several dry runs and one growth run. Several machine calibration and functional problems were discovered and corrected. Exhaust gas analysis system alternatives were evaluated, and an integrated system was approved and ordered. Several growth runs on a development CG2000 RC grower show that fully automated neck, crown, and body growth can be achieved with only one operator input.

  17. Reducing software security risk through an integrated approach

    NASA Technical Reports Server (NTRS)

    Gilliam, D.; Powell, J.; Kelly, J.; Bishop, M.

    2001-01-01

    The fourth-quarter FY'01 delivery for this RTOP is a Property-Based Testing (PBT) tool, the 'Tester's Assistant' (TA). The TA tool is to be used to check compiled and pre-compiled code for potential security weaknesses that could be exploited by hackers. The TA Instrumenter, implemented mostly in C++ (with a small part in Java), parses two types of files: Java and TASPEC. Security properties to be checked are written in TASPEC. The Instrumenter is used in conjunction with the Tester's Assistant Specification (TASPEC) execution monitor to verify the security properties of a given program.

  18. Flexible Processing and the Design of Grammar

    ERIC Educational Resources Information Center

    Sag, Ivan A.; Wasow, Thomas

    2015-01-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This…

  19. Integrated Strategic Planning and Analysis Network Increment 4 (ISPAN Inc 4)

    DTIC Science & Technology

    2016-03-01

    [Report excerpt] Acronyms (partial list): Defense Acquisition Executive; DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision; FY... ISPAN Inc 4 will achieve FDD completion criteria when: 1) the system meets all the KPP thresholds as verified through an Initial Operational Test and

  20. Integrated Strategic Planning and Analysis Network Increment 5 (ISPAN Inc 5)

    DTIC Science & Technology

    2016-03-01

    [Report excerpt] Acronyms (partial list): Defense Acquisition Executive; DoD - Department of Defense; DoDAF - DoD Architecture Framework; FD - Full Deployment; FDD - Full Deployment Decision ... achieve FDD in August 2018. ISPAN Inc 5 is envisioned as a follow-on to ISPAN Inc 4 in order to respond to USSTRATCOM requirements for improved

  1. Controlling Bed Bugs Using Integrated Pest Management (IPM)

    EPA Pesticide Factsheets

    Several non-chemical methods can help control an infestation, such as heat treatment or freezing, or mattress and box spring encasements. When using a pesticide, follow label directions carefully and check for EPA registration.

  2. Pellet bed reactor for nuclear propelled vehicles: Part 2: Missions and vehicle integration trades

    NASA Technical Reports Server (NTRS)

    Haloulakos, V. E.

    1991-01-01

    Mission and vehicle integration tradeoffs involving the use of the pellet bed reactor (PBR) for nuclear powered vehicles is discussed, with much of the information being given in viewgraph form. Information is given on propellant tank geometries, shield weight requirements for conventional tank configurations, effective specific impulse, radiation mapping, radiation dose rate after shutdown, space transfer vehicle design data, a Mars mission summary, sample pellet bed nuclear orbit transfer vehicle mass breakdown, and payload fraction vs. velocity increment.

  3. On the Integration of Medium Wave Infrared Cameras for Vision-Based Navigation

    DTIC Science & Technology

    2015-03-01

    [Report excerpt] Acronyms (partial list): SWIR - Short Wave Infrared; VisualSFM - Visual Structure from Motion; WPAFB - Wright Patterson Air Force Base. Visual Structure from Motion (VisualSFM) is an application that performs incremental SfM using images of a scene fed into it [20] ... too drastically in between frames. When this happens, VisualSFM will begin creating a new model with images that do not fit the old one. These new

  4. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    NASA Astrophysics Data System (ADS)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. The model is based on the continuous improvement Plan-Do-Check-Act cycle, and it intends to integrate the environmental, labor risk prevention and ethical aspects, as well as the management of research, development and innovation (R&D&I) projects, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21 and 166002.

  5. Methods used to calculate doses resulting from inhalation of Capstone depleted uranium aerosols.

    PubMed

    Miller, Guthrie; Cheng, Yung Sung; Traub, Richard J; Little, Tom T; Guilmette, Raymond A

    2009-03-01

    The methods used to calculate radiological and toxicological doses to hypothetical persons inside either a U.S. Army Abrams tank or Bradley Fighting Vehicle that has been perforated by depleted uranium munitions are described. Data from time- and particle-size-resolved measurements of depleted uranium aerosol as well as particle-size-resolved measurements of aerosol solubility in lung fluids for aerosol produced in the breathing zones of the hypothetical occupants were used. The aerosol was approximated as a mixture of nine monodisperse (single particle size) components corresponding to particle size increments measured by the eight stages plus the backup filter of the cascade impactors used. A Markov Chain Monte Carlo Bayesian analysis technique was employed, which straightforwardly calculates the uncertainties in doses. Extensive quality control checking of the various computer codes used is described.

  6. Experiences in integrating auto-translated state-chart designs for model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Benowitz, E. G.

    2003-01-01

    In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.

  7. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.

  8. AutoLock: a semiautomated system for radiotherapy treatment plan quality control.

    PubMed

    Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G

    2015-05-08

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.
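
    The gating logic such a system implements can be sketched in a few lines: a plan is finalized only when every automated check passes and every checklist item has been acknowledged. The checks, field names and tolerances below are illustrative, not the clinical rules used by AutoLock.

```python
# Minimal sketch of the AutoLock idea: a plan is finalized only if every
# automated check passes AND every manual checklist item is acknowledged.
# The checks and field names below are illustrative, not the clinical rules.
from dataclasses import dataclass, field

@dataclass
class Plan:
    prescription_dose_gy: float
    fractions: int
    max_mu_per_beam: float
    checklist_acknowledged: dict = field(default_factory=dict)

AUTOMATED_CHECKS = [
    ("dose per fraction <= 20 Gy",
     lambda p: p.prescription_dose_gy / p.fractions <= 20.0),
    ("MU per beam within limit",
     lambda p: p.max_mu_per_beam <= 999.0),
]

CHECKLIST = ["isocentre reviewed", "structures named per protocol"]

def finalize(plan):
    failures = [name for name, check in AUTOMATED_CHECKS if not check(plan)]
    missing = [item for item in CHECKLIST
               if not plan.checklist_acknowledged.get(item, False)]
    if failures or missing:
        return False, failures + [f"unacknowledged: {m}" for m in missing]
    return True, []

plan = Plan(prescription_dose_gy=60.0, fractions=30, max_mu_per_beam=450.0,
            checklist_acknowledged={"isocentre reviewed": True})
print(finalize(plan))   # (False, ['unacknowledged: structures named per protocol'])
```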

  9. Bayesian tomography and integrated data analysis in fusion diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei

    2016-11-15

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The application of this method to a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  10. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    PubMed

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without a presence of a magnetic field from MR-Linac, and validating the delivery record consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in their clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.

  11. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without a presence of a magnetic field from MR-Linac, and validating the delivery record consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in their clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
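
    A secondary MU check of the conventional point-dose form can be sketched as below. This is not the modified Clarkson integration used for the MR-Linac case, and all factor values are placeholders; it only illustrates how an independent recomputation is compared against the planned MU within a tolerance.

```python
# Simplified secondary MU check of the usual point-dose form
#   MU = D / (reference output * Sc * Sp * TMR * wedge * inverse-square)
# This is NOT the modified Clarkson algorithm for the MR-Linac case; the
# factor values below are placeholders for illustration only.
def secondary_mu(dose_cgy, output_cgy_per_mu, sc, sp, tmr, wedge=1.0,
                 ssd_setup=100.0, ssd_calib=100.0):
    inv_sq = (ssd_calib / ssd_setup) ** 2
    return dose_cgy / (output_cgy_per_mu * sc * sp * tmr * wedge * inv_sq)

def within_tolerance(mu_plan, mu_check, tol=0.05):
    return abs(mu_plan - mu_check) / mu_plan <= tol

mu_check = secondary_mu(dose_cgy=200.0, output_cgy_per_mu=1.0,
                        sc=0.98, sp=0.99, tmr=0.85, wedge=1.0)
print(round(mu_check, 1), within_tolerance(mu_plan=247.0, mu_check=mu_check))
```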

  12. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility has been developed at NASA Lewis to allow integrated propulsion-control and flight-control algorithm development and evaluation in real time. As a preliminary check of the simulator facility and the correct integration of its components, the control design and physics models for an STOVL fighter aircraft model have been demonstrated, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The results show that this fixed-based flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated systems and testing of control design methodologies and cockpit mechanizations.

  13. Development and pilot testing of HIV screening program integration within public/primary health centers providing antenatal care services in Maharashtra, India.

    PubMed

    Bindoria, Suchitra V; Devkar, Ramesh; Gupta, Indrani; Ranebennur, Virupax; Saggurti, Niranjan; Ramesh, Sowmya; Deshmukh, Dilip; Gaikwad, Sanjeevsingh

    2014-03-26

    The objectives of this paper are: (1) to study the feasibility and relative benefits of integrating the prevention of parent-to-child transmission (PPTCT) component of the National AIDS Control Program with the maternal and child health component of the National Rural Health Mission (NRHM) by offering HIV screening at the primary healthcare level; and (2) to estimate the incremental cost-effectiveness ratio to understand whether the costs are commensurate with the benefits. The intervention included advocacy with political, administrative/health heads, and capacity building of health staff in Satara district, Maharashtra, India. The intervention also conducted biannual outreach activities at primary health centers (PHCs)/sub-centers (SCs); initiated facility-based integrated counseling and testing centers (FICTCs) at all round-the-clock PHCs; made the existing FICTCs functional and trained PHC nurses in HIV screening. All "functional" FICTCs were equipped to screen for HIV and trained staff provided counseling and conducted HIV testing as per the national protocol. Data were collected pre- and post- integration on the number of pregnant women screened for HIV, the number of functional FICTCs and intervention costs. Trend analyses on various outcome measures were conducted. Further, the incremental cost-effectiveness ratio per pregnant woman screened was calculated. An additional 27% of HIV-infected women were detected during the intervention period as the annual HIV screening increased from pre- to post-intervention (55% to 79%, p < 0.001) among antenatal care (ANC) attendees under the NRHM. A greater increase in HIV screening was observed in PHCs/SCs. The proportions of functional FICTCs increased from 47% to 97% (p < 0.001). Additionally, 93% of HIV-infected pregnant women were linked to anti-retroviral therapy centers; 92% of mother-baby pairs received Nevirapine; and 89% of exposed babies were enrolled for early infant diagnosis. The incremental cost-effectiveness ratio was estimated at INR 44 (less than 1 US$) per pregnant woman tested. Integrating HIV screening with the broader Rural Health Mission is a promising opportunity to scale up the PPTCT program. However, advocacy, sensitization, capacity building and the judicious utilization of available resources are key to widening the reach of the PPTCT program in India and elsewhere.

  14. A Framework for Flipped Learning

    ERIC Educational Resources Information Center

    Eppard, Jenny; Rochdi, Aicha

    2017-01-01

    Over the last few decades, with the rapid developments of mobile technology, the advent of Web 2.0 sites, and the expansion of social media, there has been an incremental use of technology in the classroom. One of the approaches for technological integration into the classroom is via flipped learning. This pedagogical method has become…

  15. Saxon Math. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2016

    2016-01-01

    "Saxon Math" is a core curriculum for students in grades K-12 that uses an incremental approach to instruction and assessment. This approach limits the amount of new math content delivered to students each day and allows time for daily practice. New concepts are introduced gradually and integrated with previously introduced content so…

  16. Space Station Freedom operations planning

    NASA Technical Reports Server (NTRS)

    Accola, Anne L.; Keith, Bryant

    1989-01-01

    The Space Station Freedom program is developing an operations planning structure which assigns responsibility for planning activities to three tiers of management. The strategic level develops the policy, goals and requirements for the program over a five-year horizon. Planning at the tactical level emphasizes program integration and planning for a two-year horizon. The tactical planning process, architecture, and products have been documented and discussed with the international partners. Tactical planning includes the assignment of user and system hardware as well as significant operational events to a time increment (the period of time from the arrival of one Shuttle to the manned base to the arrival of the next). Execution-level planning emphasizes implementation, and each organization produces detailed plans, by increment, that are specific to its function.

  17. A computational procedure for multibody systems including flexible beam dynamics

    NASA Technical Reports Server (NTRS)

    Downer, J. D.; Park, K. C.; Chiou, J. C.

    1990-01-01

    A computational procedure suitable for the solution of equations of motions for flexible multibody systems has been developed. The flexible beams are modeled using a fully nonlinear theory which accounts for both finite rotations and large deformations. The present formulation incorporates physical measures of conjugate Cauchy stress and covariant strain increments. As a consequence, the beam model can easily be interfaced with real-time strain measurements and feedback control systems. A distinct feature of the present work is the computational preservation of total energy for undamped systems; this is obtained via an objective strain increment/stress update procedure combined with an energy-conserving time integration algorithm which contains an accurate update of angular orientations. The procedure is demonstrated via several example problems.
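
    The energy-conserving time-integration ingredient can be illustrated on a much simpler system than the nonlinear beam: for a linear oscillator, the implicit midpoint rule preserves the total energy to round-off, whereas explicit Euler lets it grow without bound. The sketch below is only a toy illustration of that property, not the paper's beam formulation.

```python
# Illustration of the energy-conservation property of the implicit midpoint
# rule on a linear oscillator (m*x'' + k*x = 0), contrasted with explicit
# Euler.  A toy stand-in for the beam formulation discussed in the paper.
import numpy as np

m, k, dt, steps = 1.0, 4.0, 0.05, 2000

def energy(x, v):
    return 0.5 * m * v**2 + 0.5 * k * x**2

# Implicit midpoint: forces evaluated at the interval midpoint; for a linear
# system the update can be solved in closed form as a constant matrix.
x, v = 1.0, 0.0
A = np.array([[1.0, -0.5 * dt], [0.5 * dt * k / m, 1.0]])
B = np.array([[1.0,  0.5 * dt], [-0.5 * dt * k / m, 1.0]])
M = np.linalg.solve(A, B)
for _ in range(steps):
    x, v = M @ np.array([x, v])
drift_midpoint = abs(energy(x, v) - energy(1.0, 0.0))

# Explicit Euler for comparison: energy grows without bound.
x, v = 1.0, 0.0
for _ in range(steps):
    x, v = x + dt * v, v - dt * k / m * x
drift_euler = abs(energy(x, v) - energy(1.0, 0.0))

print(f"midpoint energy drift: {drift_midpoint:.2e}")   # round-off only
print(f"explicit Euler drift:  {drift_euler:.2e}")      # large
```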

  18. Extending XNAT Platform with an Incremental Semantic Framework

    PubMed Central

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities, however, distributed data integration is still difficult because of the need of explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but its application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases.

  19. Extending XNAT Platform with an Incremental Semantic Framework.

    PubMed

    Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael

    2017-01-01

    Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities, however, distributed data integration is still difficult because of the need of explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but its application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases.

  20. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow; and one in the context of on-the-fly gesture recognition.
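
    The idea of recognizing a complex event pattern over a stream of model change events can be illustrated with a toy matcher that fires a reactive rule when an element is created and then deleted within a short time window. This is only a sketch of the concept; it is not the VIATRA framework or its pattern language, and all names are made up.

```python
# Toy complex-event-processing sketch: watch a stream of model change events
# and fire a reactive rule when an element is created and then deleted within
# a short window.  Illustrative only; not the VIATRA DSL or engine.
from collections import namedtuple

Event = namedtuple("Event", "time kind element")   # kind: 'create' | 'delete' | ...

def detect_create_then_delete(stream, window=5.0):
    pending = {}                                    # element -> creation time
    for ev in stream:
        if ev.kind == "create":
            pending[ev.element] = ev.time
        elif ev.kind == "delete" and ev.element in pending:
            if ev.time - pending.pop(ev.element) <= window:
                yield ("short-lived-element", ev.element, ev.time)

stream = [
    Event(0.0, "create", "Task1"),
    Event(1.0, "create", "Task2"),
    Event(3.0, "delete", "Task1"),     # within the window -> complex event
    Event(9.0, "delete", "Task2"),     # outside the window -> ignored
]
for match in detect_create_then_delete(stream):
    print("reactive rule fired for", match)
```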

  1. Net primary productivity, allocation pattern and carbon use efficiency in an apple orchard assessed by integrating eddy covariance, biometric and continuous soil chamber measurements

    NASA Astrophysics Data System (ADS)

    Zanotelli, D.; Montagnani, L.; Manca, G.; Tagliavini, M.

    2013-05-01

    Carbon use efficiency (CUE), the ratio of net primary production (NPP) over gross primary production (GPP), is a functional parameter that could possibly link the current increasingly accurate global GPP estimates with those of net ecosystem exchange, for which global predictors are still unavailable. Nevertheless, CUE estimates are actually available for only a few ecosystem types, while information regarding agro-ecosystems is scarce, in spite of the simplified spatial structure of these ecosystems that facilitates studies on allocation patterns and temporal growth dynamics. We combined three largely deployed methods, eddy covariance, soil respiration and biometric measurements, to assess monthly values of CUE, NPP and allocation patterns in different plant organs in an apple orchard during a complete year (2010). We applied a measurement protocol optimized for quantifying monthly values of carbon fluxes in this ecosystem type, which allows for a cross check between estimates obtained from different methods. We also attributed NPP components to standing biomass increments, detritus cycle feeding and lateral exports. We found that in the apple orchard, both net ecosystem production and gross primary production on a yearly basis, 380 ± 30 g C m-2 and 1263 ± 189 g C m-2 respectively, were of a magnitude comparable to those of natural forests growing in similar climate conditions. The largest differences with respect to forests are in the allocation pattern and in the fate of produced biomass. The carbon sequestered from the atmosphere was largely allocated to production of fruit: 49% of annual NPP was taken away from the ecosystem through apple production. Organic material (leaves, fine root litter, pruned wood and early fruit falls) contributing to the detritus cycle was 46% of the NPP. Only 5% was attributable to standing biomass increment, while this NPP component is generally the largest in forests. The CUE, with an annual average of 0.71 ± 0.12, was higher than the previously suggested constant values of 0.47-0.50. Low nitrogen investment in fruit, the limited root apparatus, and the optimal growth temperature and nutritional condition observed at the site are suggested to be explanatory variables for the high CUE observed.
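
    The reported efficiency is simply the ratio of the two production terms, so the annual figures quoted above imply (rounded):

```latex
\[
\mathrm{CUE} = \frac{\mathrm{NPP}}{\mathrm{GPP}}
\quad\Rightarrow\quad
\mathrm{NPP} \approx 0.71 \times 1263\ \mathrm{g\,C\,m^{-2}\,yr^{-1}} \approx 9.0 \times 10^{2}\ \mathrm{g\,C\,m^{-2}\,yr^{-1}},
\]
```

    which the harvest, detritus and standing-biomass fractions reported above then partition.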

  2. Hiring a Pest Management Professional for Bed Bugs

    EPA Pesticide Factsheets

    If you hire someone to treat your bed bug infestation, make sure they use Integrated Pest Management (IPM) techniques, check credentials, and know they may need multiple visits, to take apart furniture, and to use vacuums, heat, and pesticides.

  3. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
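
    The observer idea can be illustrated with a hand-written finite-trace monitor for the single response property G(p -> F q), i.e., every p must be followed by a q before the trace ends. The JPaX translation generates such monitors automatically from arbitrary LTL formulae; the sketch below is only a manual illustration of what a generated observer does.

```python
# Hand-written finite-trace observer for the LTL property G(p -> F q):
# "every p-event must eventually be followed by a q-event before the trace
# ends".  Illustrative only; JPaX derives such monitors automatically.
def monitor_response(trace, p, q):
    waiting = False                    # observer state: obligation pending?
    for event in trace:
        if event == q:
            waiting = False            # pending obligation discharged
        elif event == p:
            waiting = True             # new obligation created
    return not waiting                 # accept iff no obligation remains open

ok_trace  = ["start", "p", "x", "q", "p", "q", "end"]
bad_trace = ["start", "p", "x", "q", "p", "end"]
print(monitor_response(ok_trace, "p", "q"))    # True
print(monitor_response(bad_trace, "p", "q"))   # False
```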

  4. Nonschematic drawing recognition: a new approach based on attributed graph grammar with flexible embedding

    NASA Astrophysics Data System (ADS)

    Lee, Kyu J.; Kunii, T. L.; Noma, T.

    1993-01-01

    In this paper, we propose a syntactic pattern recognition method for non-schematic drawings, based on a new attributed graph grammar with flexible embedding. In our graph grammar, the embedding rule permits the nodes of a guest graph to be arbitrarily connected with the nodes of a host graph. The ambiguity caused by this flexible embedding is controlled with the evaluation of synthesized attributes and the check of context sensitivity. To integrate parsing with the synthesized attribute evaluation and the context sensitivity check, we also develop a bottom up parsing algorithm.

  5. STS-97 Mission Specialist Garneau with full launch and entry suit during pre-pack and fit check

    NASA Technical Reports Server (NTRS)

    2000-01-01

    During pre-pack and fit check in the Operations and Checkout Building, STS-97 Commander Brent Jett gets help with his gloves from suit technician Bill Todd. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.

  6. STS-97 Mission Specialist Garneau during pre-pack and fit check

    NASA Technical Reports Server (NTRS)

    2000-01-01

    STS-97 Mission Specialist Marc Garneau gets help with his boots from suit technician Tommy McDonald during pre-pack and fit check. Garneau is with the Canadian Space Agency. Mission STS-97 is the sixth construction flight to the International Space Station. Its payload includes the P6 Integrated Truss Structure and a photovoltaic (PV) module, with giant solar arrays that will provide power to the Station. The mission includes two spacewalks to complete the solar array connections. STS-97 is scheduled to launch Nov. 30 at about 10:06 p.m. EST.

  7. Automata-Based Verification of Temporal Properties on Running Programs

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Buchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.

  8. KSC01pp0126

    NASA Image and Video Library

    2001-01-17

    Workers in the Payload Changeout Room check the Payload Ground Handling Mechanism that will move the U.S. Lab Destiny out of Atlantis’ payload bay and into the PCR. After the move, Atlantis will roll back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster’s system tunnel. An extensive evaluation of NASA’s SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis

  9. The Z1 truss is placed in stand to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, the Integrated Truss Structure Z1 rests in the workstand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the International Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  10. The Z1 truss is lowered to stand to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, an overhead crane lowers the Integrated Truss Structure Z1 onto a workstand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the International Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  11. The Z1 truss is moved to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, the Integrated Truss Structure Z1, an element of the International Space Station, is moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  12. The Z1 truss is moved to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, the Integrated Truss Structure Z1, an element of the International Space Station, is lifted for moving to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  13. Astronaut Joseph Tanner checks gloves during launch/entry training

    NASA Image and Video Library

    1994-06-23

    S94-40082 (23 June 1994) --- Astronaut Joseph R. Tanner, mission specialist, checks his glove during a rehearsal for launch and entry phases of the scheduled November flight of STS-66. This rehearsal, held in the Crew Compartment Trainer (CCT) of the Johnson Space Center's (JSC) Shuttle Mockup and Integration Laboratory, was followed by a training session on emergency egress procedures. In November, Tanner will join four other NASA astronauts and a European mission specialist for a week and a half aboard the Space Shuttle Atlantis in Earth-orbit in support of the Atmospheric Laboratory for Applications and Science (ATLAS-3).

  14. Incremental Ontology-Based Extraction and Alignment in Semi-structured Documents

    NASA Astrophysics Data System (ADS)

    Thiam, Mouhamadou; Bennacer, Nacéra; Pernelle, Nathalie; Lô, Moussa

    SHIRI is an ontology-based system for the integration of semi-structured documents related to a specific domain. The system's purpose is to allow users to access relevant parts of documents as answers to their queries. SHIRI uses RDF/OWL for the representation of resources and SPARQL for querying them. It relies on an automatic, unsupervised and ontology-driven approach for the extraction, alignment and semantic annotation of tagged elements of documents. In this paper, we focus on the Extract-Align algorithm, which exploits a set of named-entity and term patterns to extract term candidates to be aligned with the ontology. It proceeds in an incremental manner in order to populate the ontology with terms describing instances of the domain and to reduce access to external resources such as the Web. We evaluate it on an HTML corpus related to calls for papers in computer science, and the results we obtain are very promising. They show how the incremental behaviour of the Extract-Align algorithm enriches the ontology and increases the number of terms (or named entities) aligned directly with the ontology.

  15. Incremental Bayesian Category Learning From Natural Language.

    PubMed

    Frermann, Lea; Lapata, Mirella

    2016-08-01

    Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words (e.g., chair is a member of the furniture category). We present a Bayesian model that, unlike previous work, learns both categories and their features in a single process. We model category induction as two interrelated subproblems: (a) the acquisition of features that discriminate among categories, and (b) the grouping of concepts into categories based on those features. Our model learns categories incrementally using particle filters, a sequential Monte Carlo method commonly used for approximate probabilistic inference that sequentially integrates newly observed data and can be viewed as a plausible mechanism for human learning. Experimental results show that our incremental learner obtains meaningful categories which yield a closer fit to behavioral data compared to related models while at the same time acquiring features which characterize the learned categories. (An earlier version of this work was published in Frermann and Lapata.) Copyright © 2015 Cognitive Science Society, Inc.
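
    As a rough sketch of the incremental particle-filter idea described above (the actual model of Frermann and Lapata is not reproduced; the CRP-style prior, the one-dimensional Gaussian "feature", and all parameter values are assumptions for illustration), each particle carries one partition of the items seen so far, new items are assigned to categories by sampling, and particles are resampled by their incremental weights.

    ```python
    # Illustrative incremental category induction with a particle filter.
    # Assumptions: CRP-style prior, toy 1-D Gaussian likelihood, invented data.
    import math
    import random

    ALPHA = 1.0    # willingness to open a new category (CRP-style concentration)

    def likelihood(x, members, spread=1.0):
        """Gaussian likelihood of x under a category summarised by its mean."""
        mu = sum(members) / len(members)
        return math.exp(-0.5 * ((x - mu) / spread) ** 2)

    def extend(particle, x):
        """Sample a category for x within one particle; return the incremental weight."""
        weights = [len(c) * likelihood(x, c) for c in particle]
        weights.append(ALPHA * likelihood(x, [0.0], spread=10.0))  # broad base measure
        total = sum(weights)
        r, acc = random.uniform(0, total), 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if i == len(particle):
            particle.append([])            # open a new category
        particle[i].append(x)
        return total

    def particle_filter(stream, n_particles=100):
        particles = [[] for _ in range(n_particles)]
        for x in stream:                   # items arrive one at a time (incremental)
            weights = [extend(p, x) for p in particles]
            # Multinomial resampling keeps the filter focused on plausible partitions.
            particles = [[list(c) for c in random.choices(particles, weights)[0]]
                         for _ in range(n_particles)]
        return particles

    if __name__ == "__main__":
        data = [0.1, 0.2, 5.1, 4.9, 0.05, 5.3]   # items from two well-separated groups
        print(particle_filter(data)[0])          # one sampled partition of the items
    ```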

  16. Open Source ERP Applications: A Reality Check for Their Possible Adoption and Use in Teaching Business Process Integration

    ERIC Educational Resources Information Center

    Huynh, Minh; Pinto, Ivan

    2010-01-01

    For years, there has been a need for teaching students about business process integration. The use of ERP systems has been proposed as a mechanism to meet this need. Yet, in the midst of a recent economic crisis, it is difficult to find funding for the acquisition and implementation of an ERP system for teaching purposes. While it is recognized…

  17. Revised Framingham Stroke Risk Score, Nontraditional Risk Markers, and Incident Stroke in a Multiethnic Cohort.

    PubMed

    Flueckiger, Peter; Longstreth, Will; Herrington, David; Yeboah, Joseph

    2018-02-01

    Limited data exist on the performance of the revised Framingham Stroke Risk Score (R-FSRS) and the R-FSRS in conjunction with nontraditional risk markers. We compared the R-FSRS, original FSRS, and the Pooled Cohort Equation for stroke prediction and assessed the improvement in discrimination by nontraditional risk markers. Six thousand seven hundred twelve of 6814 participants of the MESA (Multi-Ethnic Study of Atherosclerosis) were included. Cox proportional hazard, area under the curve, net reclassification improvement, and integrated discrimination increment analysis were used to assess and compare each stroke prediction risk score. Stroke was defined as fatal/nonfatal strokes (hemorrhagic or ischemic). After a mean follow-up of 10.7 years, 231 of 6712 participants (3.4%) had adjudicated strokes (2.7% ischemic strokes). Mean stroke risks using the R-FSRS, original FSRS, and Pooled Cohort Equation were 4.7%, 5.9%, and 13.5%. The R-FSRS had the best calibration (Hosmer-Lemeshow goodness-of-fit, χ2 = 6.55; P = 0.59). All risk scores were predictive of incident stroke. The C statistic of the R-FSRS (0.716) was similar to that of the Pooled Cohort Equation (0.716), but significantly higher than that of the original FSRS (0.653; P = 0.01 for comparison with the R-FSRS). Adding nontraditional risk markers individually to the R-FSRS did not improve discrimination of the R-FSRS in the area under the curve analysis, but did improve category-less net reclassification improvement and integrated discrimination increment for incident stroke. The addition of coronary artery calcium to the R-FSRS produced the highest category-less net reclassification improvement (0.36) and integrated discrimination increment (0.0027). Similar results were obtained when ischemic strokes were used as the outcome. The R-FSRS downgraded stroke risk but had better calibration and discriminative ability for incident stroke compared with the original FSRS. Nontraditional risk markers modestly improved the discriminative ability of the R-FSRS, with coronary artery calcium performing the best. © 2018 American Heart Association, Inc.
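
    For readers unfamiliar with the two reclassification metrics cited above, the sketch below computes the standard category-less (continuous) NRI and the IDI from the predicted risks of a base and an extended model. It is illustrative only, is not the MESA analysis code, and the example risks are invented.

    ```python
    # Category-less NRI and IDI from paired predicted risks (illustrative only).

    def mean(xs):
        return sum(xs) / len(xs)

    def idi(p_old, p_new, events):
        """Difference in discrimination slopes between the new and old model."""
        ev_old  = [p for p, e in zip(p_old, events) if e]
        ev_new  = [p for p, e in zip(p_new, events) if e]
        non_old = [p for p, e in zip(p_old, events) if not e]
        non_new = [p for p, e in zip(p_new, events) if not e]
        return (mean(ev_new) - mean(ev_old)) - (mean(non_new) - mean(non_old))

    def nri_categoryless(p_old, p_new, events):
        """Continuous NRI: net upward movement in events plus net downward in non-events."""
        up_ev   = mean([1.0 if n > o else 0.0 for o, n, e in zip(p_old, p_new, events) if e])
        down_ev = mean([1.0 if n < o else 0.0 for o, n, e in zip(p_old, p_new, events) if e])
        up_ne   = mean([1.0 if n > o else 0.0 for o, n, e in zip(p_old, p_new, events) if not e])
        down_ne = mean([1.0 if n < o else 0.0 for o, n, e in zip(p_old, p_new, events) if not e])
        return (up_ev - down_ev) + (down_ne - up_ne)

    if __name__ == "__main__":
        p_old  = [0.02, 0.05, 0.10, 0.04, 0.30]
        p_new  = [0.03, 0.04, 0.15, 0.03, 0.45]   # e.g. after adding a new risk marker
        events = [False, False, True, False, True]
        print(round(idi(p_old, p_new, events), 4),
              round(nri_categoryless(p_old, p_new, events), 4))
    ```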

  18. Adjustments in channel morphology due to land-use changes and check dam installation in mountain torrents of Calabria (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Fortugno, Diego; Zema, Demetrio Antonio; Bombino, Giuseppe; Tamburino, Vincenzo; Quinonero Rubio, Juan Manuel; Boix-Fayos, Carolina

    2016-04-01

    In Mediterranean semi-arid conditions the geomorphic effects of land-use changes and check dam installation on active channel headwater morphology are not completely understood. In such environments, the availability of specific studies, which monitor channel adjustments as a response to reforestation and check dams over representative observation periods, could help develop new management strategies and erosion control measures. This investigation is an integrated approach assessing the adjustments of channel morphology in a typical torrent (Sant'Agata, Calabria, Southern Italy) after land-use changes (e.g. fire, reforestation, land abandonment) and check dam construction across a period of about 60 years (1955-2012). A statistical analysis of historical rainfall records, an analysis of land-use change in the catchment area and a geomorphological mapping of channel adjustments were carried out and combined with field surveys of bed surface grain-size over a 5-km reach including 14 check dams. The analysis of the historical rainfall records showed a slight decrease in the amount and erosivity of precipitation. Mapping of land-use changes highlighted a general increase in vegetation cover on the slopes adjacent to the monitored reaches. Together with the check dam network installation, this increase could have induced a reduction in water and sediment supply. The different erosional and depositional forms and adjustments showed a general narrowing between consecutive check dams together with local modifications detected upstream (bed aggradation and cross section expansion together with low-flow realignments) and downstream (local incision) of the installed check dams. Changes in the torrent bends were also detected as a response to erosional and depositional processes with different intensities. The study highlighted: (i) the efficiency of check dams against the disrupting power of the most intense floods by stabilising the active channel; and (ii) the influence of reforestation in increasing hillslope protection from erosion and disconnectivity of water and sediment flows towards the active channel. The residual sediment deficit circulating in the watershed suggests the need for limited management interventions, such as the conversion of the existing check dams into open structures, to allow long-term channel and coast stability.

  19. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol

    PubMed Central

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-01-01

    Introduction There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Methods and analysis Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Discussion Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and practice of evidence-based medicine in oncology. PMID:27591026
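
    Since the protocol's central quantity is the mean incremental area under the HRQoL curve obtained by numerical integration, a minimal sketch of that calculation is given below (trapezoid rule over hypothetical mean scores; the error propagation and weighted regression steps of the protocol are not reproduced, and all values are invented).

    ```python
    # Incremental area under the HRQoL curve between two arms (illustrative only).

    def auc_trapezoid(times, scores):
        """Trapezoid-rule area under a HRQoL-versus-time curve."""
        area = 0.0
        for (t0, s0), (t1, s1) in zip(zip(times, scores), zip(times[1:], scores[1:])):
            area += (t1 - t0) * (s0 + s1) / 2.0
        return area

    if __name__ == "__main__":
        months = [0, 3, 6, 9, 12]
        hrqol_treatment = [70, 72, 75, 74, 73]    # hypothetical mean scores per arm
        hrqol_control   = [70, 69, 68, 66, 65]
        incremental_auc = (auc_trapezoid(months, hrqol_treatment)
                           - auc_trapezoid(months, hrqol_control))
        print(incremental_auc)    # score-months gained by the treatment arm
    ```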

  20. Increment of serum bilirubin as an independent marker predicting new-onset type 2 diabetes mellitus in a Korean population.

    PubMed

    Lee, S-E; Lee, Y-B; Jun, J E; Jin, S-M; Jee, J H; Bae, J C; Kim, J H

    2017-03-01

    Several cross-sectional studies reported that serum bilirubin concentrations had an inverse association with type 2 diabetes mellitus (T2DM) prevalence. The aim of the current study was to investigate the relationship between percentage change in bilirubin levels (PCB) and incident risk of T2DM using a longitudinal model. 22,084 participants who received regular health check-ups between 2006 and 2012 were enrolled. Multivariable-adjusted Cox regression models were used to determine the hazard ratio (HR) of incident T2DM based on PCB. PCB was determined by subtracting baseline serum bilirubin level (BB) from the bilirubin level at the end of follow-up or a year before the last date of diagnosis, dividing by BB and multiplying by 100. Compared to non-diabetics, BB was lower in the diabetic group at the initial visit. There were 20,098 participants without T2DM at the initial visit; 1253 new cases occurred during follow-up. As PCB increased, T2DM incidence also increased (P < 0.001). After adjusting for confounders, the HR of incident T2DM in the highest PCB quartile was 2.08 (95% confidence interval [CI] 1.76-2.46). This trend remained significant when PCB was analyzed as a continuous variable (HR for 1-SD increment, 1.25; 95% CI 1.19-1.31). Additional analysis comparing the rate of PCB during the follow-up period revealed that the serum bilirubin level of the Incident T2DM group increased before T2DM development and decreased rapidly thereafter compared to others (P < 0.001). Bilirubin level increment over time is associated with T2DM development. Copyright © 2016 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human Nutrition, and the Department of Clinical Medicine and Surgery, Federico II University. Published by Elsevier B.V. All rights reserved.
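
    The PCB definition given in the abstract is a simple ratio; a minimal sketch of that arithmetic (with invented values) is shown below.

    ```python
    # Percentage change in bilirubin (PCB) as defined in the abstract:
    # (follow-up bilirubin - baseline bilirubin) / baseline bilirubin x 100.

    def percentage_change_in_bilirubin(baseline_bb, followup_bb):
        return (followup_bb - baseline_bb) / baseline_bb * 100.0

    if __name__ == "__main__":
        print(percentage_change_in_bilirubin(0.8, 1.0))   # +25.0 (percent)
    ```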

  1. Report: Plans to Migrate Data to the New EPA Acquisition System Need Improvement

    EPA Pesticide Factsheets

    Report #10-P-0071, February 24, 2010. EPA’s plans for migrating data from ICMS to EAS lack sufficient incorporation of data integrity and quality checks to ensure the complete and accurate transfer of procurement data.

  2. Cortical bone fracture analysis using XFEM - case study.

    PubMed

    Idkaidek, Ashraf; Jasiuk, Iwona

    2017-04-01

    We aim to achieve an accurate simulation of human cortical bone fracture using the extended finite element method within the commercial finite element software Abaqus. A two-dimensional unit cell model of cortical bone is built based on a microscopy image of the mid-diaphysis of the tibia of a 70-year-old human male donor. Each phase of this model (interstitial bone, cement line, and osteon) is considered linear elastic and isotropic, with material properties obtained by nanoindentation taken from the literature. The effects of the fracture analysis method (cohesive segment approach versus linear elastic fracture mechanics approach), finite element type, and boundary conditions (traction, displacement, and mixed) on cortical bone crack initiation and propagation are studied. In this study, cohesive segment damage evolution with a traction-separation law based on energy and displacement is used. In addition, the effects of increment size and mesh density on the analysis results are investigated. We find that both the cohesive segment and linear elastic fracture mechanics approaches within the extended finite element method can effectively simulate cortical bone fracture. Mesh density and simulation increment size can influence analysis results when employing either approach, and using a finer mesh and/or a smaller increment size does not always provide more accurate results. Both approaches provide close but not identical results, and crack propagation speed is found to be slower when using the cohesive segment approach. Also, using reduced integration elements along with the cohesive segment approach decreases crack propagation speed compared with using full integration elements. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Real Time Maintenance Approval and Required IMMT Coordination

    NASA Technical Reports Server (NTRS)

    Burchell, S.

    2016-01-01

    Payloads are assessed for nominal operations. Payload Developers have the option of performing a maintenance hazard assessment (MHA) for potential maintenance activities. When POIC (Payload Operations and Integration Center) Safety reviews an OCR calling for a maintenance procedure, we cannot approve it without a MHA. If no MHA exists, we contact MER (Mission Evaluation Room) Safety. Depending on the nature of the problem, MER Safety has the option to: Analyze and grant approval themselves; Direct the payload back to the ISRP (Integrated Safety Review Panel); Direct the payload to the IMMT (Increment Mission Management Team).

  4. Incremental value of copeptin to highly sensitive cardiac Troponin I for rapid rule-out of myocardial infarction.

    PubMed

    Wildi, Karin; Zellweger, Christa; Twerenbold, Raphael; Jaeger, Cedric; Reichlin, Tobias; Haaf, Philip; Faoro, Jonathan; Giménez, Maria Rubini; Fischer, Andreas; Nelles, Berit; Druey, Sophie; Krivoshei, Lian; Hillinger, Petra; Puelacher, Christian; Herrmann, Thomas; Campodarve, Isabel; Rentsch, Katharina; Steuer, Stephan; Osswald, Stefan; Mueller, Christian

    2015-01-01

    The incremental value of copeptin, a novel marker of endogenous stress, for rapid rule-out of non-ST-elevation myocardial infarction (NSTEMI) is unclear when sensitive or even high-sensitivity cardiac troponin (cTn/hs-cTn) assays are used. In an international multicenter study we evaluated 1929 consecutive patients with symptoms suggestive of acute myocardial infarction (AMI). Measurements of copeptin and of three sensitive and three hs-cTn assays were performed at presentation in a blinded fashion. The final diagnosis was adjudicated by two independent cardiologists using all clinical information including coronary angiography and levels of hs-cTnT. The incremental value in the diagnosis of NSTEMI was quantified using four outcome measures: area under the receiver-operating characteristic curve (AUC), integrated discrimination improvement (IDI), sensitivity and negative predictive value (NPV). Early presenters (< 4 h since chest pain onset) were a pre-defined subgroup. NSTEMI was the adjudicated final diagnosis in 358 (18.6%) patients. As compared to the use of cTn alone, copeptin significantly increased the AUC for two of the six assays (33%) and the IDI (between 0.010 and 0.041; all p < 0.01), as well as sensitivity and NPV for all six cTn assays (100%); NPV rose to 96-99% when the 99th percentile of the respective cTnI assay was combined with a copeptin level of 9 pmol/L (all p < 0.01). The incremental value in early presenters was similar to that of the overall cohort. When used for rapid rule-out of NSTEMI in combination with sensitive or hs-cTnI assays, copeptin provides a numerically small, but statistically and likely also clinically significant incremental value. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
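
    To make the dual-marker rule-out strategy concrete, the sketch below encodes the decision rule implied above: NSTEMI is considered ruled out at presentation only if cardiac troponin is below the assay's 99th percentile and copeptin is below 9 pmol/L. This is illustrative only, not the study's adjudication algorithm, and the troponin values in the example are placeholders (the 99th-percentile cut-off is assay-specific).

    ```python
    # Illustrative dual-marker rule-out check (not clinical decision software).

    def rule_out_nstemi(ctn_value, ctn_99th_percentile, copeptin_pmol_l,
                        copeptin_cutoff=9.0):
        """Rule out only if both markers are below their respective cut-offs."""
        return ctn_value < ctn_99th_percentile and copeptin_pmol_l < copeptin_cutoff

    if __name__ == "__main__":
        # hypothetical patient: troponin below the 99th percentile, copeptin 4 pmol/L
        print(rule_out_nstemi(ctn_value=0.02, ctn_99th_percentile=0.04,
                              copeptin_pmol_l=4.0))   # True -> candidate for rule-out
    ```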

  5. The IMS Software Integration Platform

    DTIC Science & Technology

    1993-04-12

    products to incorporate all data shared by the IMS applications. Some entities (time-series, images, algorithm-specific parameters) must be managed... dbwhoami, dbcancel; Transaction Management: dbcommit, dbrollback; Key Counter Assignment: dbgetcounter; String Handling: cstr_to_pad, pad_to_cstr; Error... increment *value; String Manipulation: int cstr_to_pad(array, string, array_length) char *array, *string; int array_length; int pad_to_cstr(string

  6. Unpacking the Revised Bloom's Taxonomy: Developing Case-Based Learning Activities

    ERIC Educational Resources Information Center

    Nkhoma, Mathews Zanda; Lam, Tri Khai; Sriratanaviriyakul, Narumon; Richardson, Joan; Kam, Booi; Lau, Kwok Hung

    2017-01-01

    Purpose: The purpose of this paper is to propose the use of case studies in teaching an undergraduate course on Internet for Business in class, based on the revised Bloom's taxonomy. The study provides empirical evidence about the effect of a case-based teaching method integrated with the revised Bloom's taxonomy on students' incremental learning,…

  7. The influences of harmony motives and implicit beliefs on conflict styles of the collectivist.

    PubMed

    Lim, Lena L

    2009-12-01

    The collectivist preference for nonconfrontational conflict styles is usually attributed to the influences of the Confucian value of harmony, which promotes tolerance of interpersonal transgression. Harmony has two distinct motives in collectivistic Asian societies (Leung, 1997): harmony enhancement is affective in nature and represents a genuine concern for relationship harmony, while disintegration avoidance is instrumental in nature and sees harmony maintenance as a means to other ends. Hence, as predicted, harmony enhancement is positively related to the use of integrating and compromising, while disintegration avoidance is positively related to the use of avoiding and obliging and is negatively related to the use of integrating during a conflict with a peer in a collectivistic society, Singapore. Besides examining this from a motivational perspective, the study also examines the role of implicit beliefs of personality on one's choice of conflict styles. The two implicit theories of personality refer to the two different assumptions the lay person has about the mutability of personal attributes; an entity theorist believes that personal attributes are fixed and nonmalleable qualities, while an incremental theorist sees personal attributes as qualities that can be developed and changed. Results reveal that incrementalist implicit beliefs also significantly predicted the use of integrating. Harmony enhancement represents a genuine concern for relationship harmony and involves feelings of closeness, unity, and trust. The finding that integrating is predicted by a harmony enhancement motivation suggests the importance of fostering collectivist values of interdependence and feelings of unity and trust so as to encourage the use of integrating to discuss the opposing views openly and constructively. The present study also underscores the benefit of learning an incremental theory to be open to the positive changes in others and work toward improving the relationship and the situation during a conflict.

  8. Integration Process for the Habitat Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tn, Terry; Toups, Larry; Howe, A. Scott; Smitherman, David

    2011-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU previously served as a test bed for testing technologies and sub-systems in a terrestrial surface environment in 2010, in the Pressurized Excursion Module (PEM) configuration. Due to the amount of work involved in making the HDU project successful, the project has required a team to integrate a variety of contributions from NASA centers and outside collaborators. The size of the team and the number of systems involved with the HDU make integration a complicated process. However, because the HDU shell manufacturing is complete, the team has a head start on FY-11 integration activities and can focus on integrating upgrades to existing systems as well as integrating new additions. To complete the development of the FY-11 HDU from conception to rollout for operations in July 2011, a cohesive integration strategy has been developed to integrate the various systems of the HDU and the payloads. The highlighted HDU work for FY-11 will focus on performing upgrades to the PEM configuration, adding the X-Hab as a second level, adding a new porch providing the astronauts a larger work area outside the HDU for EVA preparations, and adding a Hygiene module. Together these upgrades result in a prototype configuration of the Deep Space Habitat (DSH), an element under evaluation by NASA's Human Exploration Framework Team (HEFT). Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to installation into the HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing, sub-system interface length estimation and human factors analysis. Decision processes on the integration and use of all new subsystems will be defined early in the project to maximize the efficiency of both integration and field operations. In addition, a series of tailored design reviews are utilized to quickly define the systems and their integration into the DSH configuration. These processes are necessary to ensure that activities, such as partially reversing integration of the X-Hab second story of the HDU and deploying and stowing the new work porch for transportation to the JSC Rock Yard and to the Arizona Black Point Lava Flow Site, are performed with minimal or no complications. In addition, incremental test operations leading up to an integrated systems test allow for an orderly systems test program. For FY-11 activities, the HDU DSH will act as a laboratory utilizing a new X-Hab inflatable second floor with crew habitation features. In addition to the day-to-day operations involving maintenance of the HDU and exploring the surrounding terrain, testing and optimizing the use of the new X-Hab, work porch, Hygiene Module, and other sub-system enhancements will be the focus of the FY-11 test objectives. The HDU team requires a successful integration strategy using a variety of tools and approaches to prepare the DSH for these test objectives. In a challenging environment where the prototyping influences the system design, and vice versa, the results of the HDU DSH field tests will influence future designs of habitat systems.

  9. Programmable stream prefetch with resource optimization

    DOEpatents

    Boyle, Peter; Christ, Norman; Gara, Alan; Mawhinney, Robert; Ohmacht, Martin; Sugavanam, Krishnan

    2013-01-08

    A stream prefetch engine performs data retrieval in a parallel computing system. The engine receives a load request from at least one processor. The engine evaluates whether a first memory address requested in the load request is present and valid in a table. The engine checks whether there exists valid data corresponding to the first memory address in an array if the first memory address is present and valid in the table. The engine increments a prefetching depth of a first stream that the first memory address belongs to and fetches a cache line associated with the first memory address from the at least one cache memory device if there is not yet valid data corresponding to the first memory address in the array. The engine determines whether prefetching of additional data is needed for the first stream within its prefetching depth. The engine prefetches the additional data if the prefetching is needed.
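
    The claim walks through a concrete decision flow: look up the requested address, check whether valid prefetched data already exists, deepen the stream on a miss, then prefetch further lines within the stream's depth. A software analogue of that flow, not the patented hardware design, and with invented parameters such as the line size and the address-to-stream mapping, might look like this:

    ```python
    # Toy software analogue of an adaptive stream prefetcher (illustrative only).

    class StreamPrefetcher:
        def __init__(self, line_size=64, max_depth=8):
            self.line_size = line_size
            self.max_depth = max_depth
            self.streams = {}          # stream id -> current prefetch depth
            self.prefetched = set()    # addresses of cache lines already fetched

        def _stream_id(self, address):
            # Toy mapping of an address to a stream (e.g. by 4 KiB memory region).
            return address >> 12

        def load(self, address):
            line = address - (address % self.line_size)
            sid = self._stream_id(address)
            depth = self.streams.setdefault(sid, 1)

            if line not in self.prefetched:
                # No valid data yet: deepen the stream and fetch the missing line.
                self.streams[sid] = min(depth + 1, self.max_depth)
                self.prefetched.add(line)

            # Prefetch additional lines within the stream's current depth.
            for i in range(1, self.streams[sid] + 1):
                self.prefetched.add(line + i * self.line_size)

    if __name__ == "__main__":
        p = StreamPrefetcher()
        for addr in (0x1000, 0x1040, 0x1080):
            p.load(addr)
        print([hex(a) for a in sorted(p.prefetched)])
    ```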

  10. Comparison of urinary excretion of radon from the human body before and after radon bath therapy.

    PubMed

    Kávási, Norbert; Kovács, Tibor; Somlai, János; Jobbágy, Viktor; Nagy, Katalin; Deák, Eszter; Berhés, István; Bender, Tamás; Ishikawa, Tetsuo; Tokonami, Shinji

    2011-07-01

    Theoretically, the human body absorbs radon through the lungs and the skin and excretes it through the lungs and the excretory organs during radon bath therapy. To check this theory, the radon concentrations in urine samples were compared before and after radon bath therapy. During the therapy, the geometric mean (GM) and the geometric standard deviation of the radon concentration in air and in the bath water were 979 Bq m(-3), 1.58 and 73.6 Bq dm(-3), 1.1, respectively. Since radon was detected in each urine sample (GM around 3.0 Bq dm(-3)), urinary excretion of radon was confirmed. The results of this study can neither reject nor confirm the hypothesis of radon absorption through the skin. A 15-fold higher increment in the inhaled radon level did not cause significant changes in the radon content of the urine samples.

  11. A Modified Mean Gray Wolf Optimization Approach for Benchmark and Biomedical Problems.

    PubMed

    Singh, Narinder; Singh, S B

    2017-01-01

    A modified variant of the gray wolf optimization algorithm, namely the mean gray wolf optimization algorithm, has been developed by modifying the position update (encircling behavior) equations of the gray wolf optimization algorithm. The proposed variant has been tested on 23 well-known standard benchmark test functions (unimodal, multimodal, and fixed-dimension multimodal), and its performance has been compared with particle swarm optimization and gray wolf optimization. The proposed algorithm has also been applied to the classification of 5 data sets to check the feasibility of the modified variant. The results obtained are compared with many other meta-heuristic approaches, i.e., gray wolf optimization, particle swarm optimization, population-based incremental learning, ant colony optimization, etc. The results show that the modified variant is able to find the best solutions in terms of a high level of classification accuracy and improved local optima avoidance.
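
    For context, the position update that the "mean" variant modifies is the standard grey wolf encircling and averaging step. The sketch below shows that standard update on the sphere benchmark; the paper's specific modified equations are not reproduced here, and all parameters (population size, iteration count, bounds) are illustrative choices.

    ```python
    # Standard grey wolf optimizer position update on the sphere benchmark
    # (illustrative baseline; not the paper's "mean GWO" variant).
    import random

    def sphere(x):
        return sum(v * v for v in x)

    def gwo(dim=5, n_wolves=20, iters=200, lb=-10.0, ub=10.0):
        wolves = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n_wolves)]
        for t in range(iters):
            wolves.sort(key=sphere)
            # Freeze the three leaders (alpha, beta, delta) for this iteration.
            alpha, beta, delta = list(wolves[0]), list(wolves[1]), list(wolves[2])
            a = 2.0 - 2.0 * t / iters                    # linearly decreasing coefficient
            for w in wolves:
                for d in range(dim):
                    estimate = 0.0
                    for leader in (alpha, beta, delta):
                        r1, r2 = random.random(), random.random()
                        A, C = 2 * a * r1 - a, 2 * r2
                        estimate += leader[d] - A * abs(C * leader[d] - w[d])
                    w[d] = min(max(estimate / 3.0, lb), ub)   # mean of the three estimates
        return min(wolves, key=sphere)

    if __name__ == "__main__":
        print(round(sphere(gwo()), 6))    # objective value of the best wolf found
    ```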

  12. Self-adaptive predictor-corrector algorithm for static nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Padovan, J.

    1981-01-01

    A multiphase self-adaptive predictor-corrector type algorithm was developed. This algorithm enables the solution of highly nonlinear structural responses including kinematic, kinetic and material effects as well as pre/post-buckling behavior. The strategy involves three main phases: (1) the use of a warpable hyperelliptic constraint surface which serves to upper-bound dependent iterate excursions during successive incremental Newton-Raphson (INR) type iterations; (2) the use of an energy constraint to scale the generation of successive iterates so as to maintain the appropriate form of local convergence behavior; (3) the use of quality-of-convergence checks which enable various self-adaptive modifications of the algorithmic structure when necessary. The restructuring is achieved by tightening various conditioning parameters as well as by switching to different algorithmic levels to improve the convergence process. The capabilities of the procedure to handle various types of static nonlinear structural behavior are illustrated.
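
    To make the idea of incremental Newton-Raphson with a quality-of-convergence check concrete, here is a minimal illustrative sketch, not Padovan's algorithm: a single-DOF softening spring (all constants invented) is loaded in increments, and the load increment is halved whenever an increment fails to converge.

    ```python
    # Incremental Newton-Raphson with adaptive load-step reduction (illustrative only).

    def residual(u, lam, P=1.0, k=10.0, c=1.0):
        """Out-of-balance force for a softening spring r(u) = k*u - c*u**3 under load lam*P."""
        return k * u - c * u ** 3 - lam * P

    def tangent(u, k=10.0, c=1.0):
        return k - 3.0 * c * u ** 2

    def incremental_newton(lam_target=1.0, d_lam=0.25, tol=1e-10, max_iter=20):
        u, lam = 0.0, 0.0
        while lam < lam_target - 1e-12:
            step = min(d_lam, lam_target - lam)
            trial_u, converged = u, False
            for _ in range(max_iter):
                r = residual(trial_u, lam + step)
                if abs(r) < tol:
                    converged = True
                    break
                trial_u -= r / tangent(trial_u)
            if converged:
                u, lam = trial_u, lam + step
            else:
                d_lam *= 0.5        # quality-of-convergence check failed: tighten the increment
        return u

    if __name__ == "__main__":
        print(round(incremental_newton(), 6))   # displacement at the target load level
    ```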

  13. Aspects concerning verification methods and rigidity increment of complex technological systems

    NASA Astrophysics Data System (ADS)

    Casian, M.

    2016-11-01

    Any technological process aims at a high-quality and precise product, something almost impossible without high-rigidity machine tools, equipment and components. Therefore, from the design phase, it is very important to create structures and machines with high stiffness characteristics. At the same time, increasing the stiffness should not raise the material costs. Searching for this balance between high rigidity and minimum expense leads to investigations and checks of structural components through various, and sometimes quite advanced, methods and techniques. In order to highlight some aspects concerning the significance of mechanical equipment rigidity, the finite element method and an analytical method based on the use of Mathcad software were applied to a subassembly of a grinding machine. Graphical representations were elaborated, offering a more complete picture of the stresses and deformations that can affect the considered mechanical subassembly.

  14. Integrated Heat Switch/Oxide Sorption Compressor

    NASA Technical Reports Server (NTRS)

    Bard, Steven

    1989-01-01

    This thermally driven, nonmechanical compressor uses a container filled with compressed praseodymium cerium oxide powder (PrCeOx) to provide a high-pressure flow of oxygen gas for driving a closed-cycle Joule-Thomson-expansion refrigeration unit. The integrated heat switch/oxide sorption compressor has no moving parts except check valves, which control the flow of oxygen gas between the compressor and the closed-cycle Joule-Thomson refrigeration system. Oxygen is expelled from the sorbent at high pressure by evacuating the heat-switch gap and turning on the heater.

  15. Rational first integrals of geodesic equations and generalised hidden symmetries

    NASA Astrophysics Data System (ADS)

    Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro

    2016-10-01

    We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson-O’Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski-Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing-Yano tensors.

  16. Emotional Intelligence in Secondary Education Students in Multicultural Contexts

    ERIC Educational Resources Information Center

    Pegalajar-Palomino, Ma. del Carmen; Colmenero-Ruiz, Ma. Jesus

    2014-01-01

    Introduction: The study analyzes the level of development in emotional intelligence of Secondary Education students. It also checks for statistically significant differences in educational level between Spanish and immigrant students, under the integration program "Intercultural Open Classrooms". Method: 94 students of Secondary…

  17. Thinking inside the (lock)box: using banking technology to improve the revenue cycle.

    PubMed

    D'Eramo, Michael; Umbreit, Lynda

    2005-08-01

    An integrated, image-based lockbox solution has allowed Columbus, Ohio-based MaternOhio to automate payment posting, reconcilement, and billing; store check and remittance images electronically; and automatically update its in-house patient accounting system and medical records.

  18. When mere exposure leads to less liking: the incremental threat effect in intergroup contexts.

    PubMed

    Crisp, Richard J; Hutter, Russell R C; Young, Bryony

    2009-02-01

    In two experiments we tested the hypothesis that repeated exposure to out-group-relevant attitude objects would lead to less liking following a threat to identity. In Experiment 1 exposure to abstract artwork ostensibly created by a member of an out-group university led to more liking under baseline conditions, but not following a manipulation of threat. In Experiment 2 we observed a negative relationship between exposure and liking following threat: liking reversed the typical mere exposure effect. Reported emotion mediated the interactive effect of threat and exposure on liking. These findings illustrate how social identity threat can be experienced incrementally as a function of exposure. We discuss the findings in the context of an integration of research on exposure, identity, attitudes, and contact.

  19. International Space Station Payload Operations Integration

    NASA Technical Reports Server (NTRS)

    Fanske, Elizabeth Anne

    2011-01-01

    The Payload Operations Integrator (POINT) plays an integral part in the Certification of Flight Readiness process for the Mission Operations Laboratory and the Payload Operations Integration Function that supports International Space Station Payload operations. The POINTs operate in support of the POIF Payload Operations Manager to bring together and integrate the Certification of Flight Readiness inputs from various MOL teams through maintaining an open work tracking log. The POINTs create monthly metrics for current and future payloads that the Payload Operations Integration Function supports. With these tools, the POINTs assemble the Certification of Flight Readiness package before a given flight, stating that the Mission Operations Laboratory is prepared to support it. I have prepared metrics for Increment 29/30, maintained the Open Work Tracking Logs for Flights ULF6 (STS-134) and ULF7 (STS-135), and submitted the Mission Operations Laboratory Certification of Flight Readiness package for Flight 44P to the Mission Operations Directorate (MOD/OZ).

  20. Cost-effective monolithic and hybrid integration for metro and long-haul applications

    NASA Astrophysics Data System (ADS)

    Clayton, Rick; Carter, Andy; Betty, Ian; Simmons, Timothy

    2003-12-01

    Today's telecommunication market is characterized by conservative business practices: tight management of costs, low risk investing and incremental upgrades, rather than the more freewheeling approach taken a few years ago. Optimizing optical components for the current and near term market involves substantial integration, but within particular bounds. The emphasis on evolution, in particular, has led to increased standardization of functions and so created extensive opportunities for integrated product offerings. The same standardization that enables commercially successful integrated functions also changes the competitive environment, and changes the emphasis for component development; shifting the innovation priority from raw performance to delivering the most effective integrated products. This paper will discuss, with specific examples from our transmitter, receiver and passives product families, our understanding of the issues based on extensive experience in delivering high end integrated products to the market, and the direction it drives optical components.

  1. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  2. Peat soils stabilization using Effective Microorganisms (EM)

    NASA Astrophysics Data System (ADS)

    Yusof, N. Z.; Samsuddin, N. S.; Hanif, M. F.; Syed Osman, S. B.

    2018-04-01

    Peat soil is known as a geotechnically problematic soil since it is among the softest soils, with high organic and moisture content, which leads to high compressibility, low shear strength and long-term settlement. The aim of this study was to stabilize peat soils using Effective Microorganisms (EM). The volume of EM added and mixed with the peat soils was varied at 2%, 4%, 6%, 8% and 10%, and the mixtures were then cured for 7, 14 and 21 days. The experiment was carried out for both uncontrolled and controlled moisture content. Prior to conducting the main experiments, physical properties such as moisture content, liquid limit, specific gravity and plastic limit were measured for the raw peat samples. The Unconfined Compressive Strength (UCS) test was performed, followed by regression analysis to check the effect of EM on the soil strength. The results showed that the mix designs with controlled moisture content gave promising improvements in compressive strength. The peat soil samples with 10% EM showed the highest increase in UCS value, with increments in the range of 44% to 65% after curing for 21 days. The regression analysis of EM against soil compressive strength showed that, under controlled moisture conditions, EM significantly improved soil stability, with R2 values ranging between 0.78 and 0.97. The results indicate that the addition of EM to peat soils provides a significant improvement in soil strength as well as in other engineering properties.

  3. STS-98 U.S. Lab Destiny is moved out of Atlantis' payload bay

    NASA Technical Reports Server (NTRS)

    2001-01-01

    KENNEDY SPACE CENTER, Fla. -- Workers in the Payload Changeout Room check the U.S. Lab Destiny as it moves from Atlantis' payload bay into the PCR. Destiny will remain in the PCR while Atlantis rolls back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster's system tunnel. An extensive evaluation of NASA's SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.

  4. The Z1 truss is moved to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, workers watch as the Integrated Truss Structure Z1, an element of the International Space Station, is moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  5. Early Design Choices: Capture, Model, Integrate, Analyze, Simulate

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2004-01-01

    I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirement types: objectives, scenarios, constraints, ilities, etc.; b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems: a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.

  6. Age-forming aluminum panels

    NASA Technical Reports Server (NTRS)

    Baxter, G. I.

    1976-01-01

    Contoured-stiffened 63 by 337 inch 2124 aluminum alloy panels are machined in-the-flat to make integral, tapered T-capped stringers parallel to the longitudinal centerline. The aging fixture, which includes net contour formers made from lofted contour templates, has an eggcrate-like structure for use in forming and checking the panels.

  7. Fiber-optic system for checking the acoustical parameters of gas-turbine engine flow-through passages

    NASA Astrophysics Data System (ADS)

    Vinogradov, Vasiliy Y.; Morozov, Oleg G.; Nureev, Ilnur I.; Kuznetzov, Artem A.

    2015-03-01

    In this paper we consider an integrated approach to the development of aero-acoustical methods for the diagnostics of aircraft gas-turbine engine flow-through passages, based on passive fiber-optic and location technologies.

  8. Confused about Fusion? Weed Your Science Collection with a Pro.

    ERIC Educational Resources Information Center

    O'Dell, Charli

    1998-01-01

    Provides guidelines on weeding science collections in junior high/high school libraries. Highlights include checking copyright dates, online sources, 13 science subject areas that deserve special consideration (plate tectonics, fission, fusion, radioactive dating, weather/climate, astronomy/space science, elements, integrated science,…

  9. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend in enabling the control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
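
    As a very rough illustration of coupling a rule check to a predictive-control step (not the paper's semantic-web or Description Logic machinery; the one-zone thermal model, the comfort rule, and the exhaustive search over a small control set are all assumptions), candidate heating schedules that violate the rule can be rejected before the cost comparison:

    ```python
    # Toy rule-constrained predictive control step (illustrative only).
    import itertools

    def simulate(temp, heat_schedule, outdoor=5.0, loss=0.1, gain=2.0):
        """Roll a simple one-zone thermal model forward; return the temperature trajectory."""
        traj = []
        for u in heat_schedule:
            temp = temp + gain * u - loss * (temp - outdoor)
            traj.append(temp)
        return traj

    def comfort_rule(traj, occupied, low=20.0, high=24.0):
        """Rule: whenever the zone is occupied, temperature must stay within [low, high]."""
        return all((not occ) or (low <= t <= high) for t, occ in zip(traj, occupied))

    def mpc_step(temp, occupied):
        horizon = len(occupied)
        candidates = itertools.product((0, 1), repeat=horizon)   # on/off heating schedules
        feasible = [list(u) for u in candidates
                    if comfort_rule(simulate(temp, list(u)), occupied)]
        # Energy cost = number of heating steps; pick the cheapest rule-compliant schedule.
        return min(feasible, key=sum, default=None)

    if __name__ == "__main__":
        print(mpc_step(temp=21.0, occupied=[True, True, True, True]))
    ```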

  10. Membrane oxygenator heat exchanger failure detected by unique blood gas findings.

    PubMed

    Hawkins, Justin L

    2014-03-01

    Failure of components integrated into the cardiopulmonary bypass circuit, although rare, can bring about catastrophic results. One of these components is the heat exchanger of the membrane oxygenator. In this compartment, unsterile water from the heater cooler device is separated from the sterile blood by stainless steel, aluminum, or polyurethane. These areas are glued or welded to keep the two compartments separate, maintaining sterility of the blood. Although quality control testing is performed by the manufacturer at the factory level, transport presents the real possibility for damage. Because of this, each manufacturer has included in the instructions for use a procedure for testing the integrity of the heat exchanger component. Water is circulated through the heat exchanger before priming and a visual check of the oxygenator bundle is made for leaks. If none are apparent, then priming of the oxygenator is performed. In this particular case, this procedure was not useful in detecting communication between the water and blood chambers of the oxygenator.

  11. Experimental and modal verification of an integral equation solution for a thin-walled dichroic plate with cross-shaped holes

    NASA Technical Reports Server (NTRS)

    Epp, L. W.; Stanton, P. H.

    1993-01-01

    In order to add the capability of an X-band uplink onto the 70-m antenna, a new dichroic plate is needed to replace the Pyle-guide-shaped dichroic plate currently in use. The replacement dichroic plate must exhibit an additional passband at the new uplink frequency of 7.165 GHz, while still maintaining a passband at the existing downlink frequency of 8.425 GHz. Because of the wide frequency separation of these two passbands, conventional methods of designing air-filled dichroic plates exhibit grating lobe problems. A new method of solving this problem by using a dichroic plate with cross-shaped holes is presented and verified experimentally. Two checks of the integral equation solution are described. One is the comparison to a modal analysis for the limiting cross shape of a square hole. As a final check, a prototype dichroic plate with cross-shaped holes was built and measured.

  12. Modelling NO2 concentrations at the street level in the GAINS integrated assessment model: projections under current legislation

    NASA Astrophysics Data System (ADS)

    Kiesewetter, G.; Borken-Kleefeld, J.; Schöpp, W.; Heyes, C.; Thunis, P.; Bessagnet, B.; Gsella, A.; Amann, M.

    2013-08-01

    NO2 concentrations at the street level are a major concern for urban air quality in Europe and have been regulated under the EU Thematic Strategy on Air Pollution. Despite the legal requirements, limit values are exceeded at many monitoring stations with little or no improvement during recent years. In order to assess the effects of future emission control regulations on roadside NO2 concentrations, a downscaling module has been implemented in the GAINS integrated assessment model. The module follows a hybrid approach based on atmospheric dispersion calculations and observations from the AirBase European air quality data base that are used to estimate site-specific parameters. Pollutant concentrations at every monitoring site with sufficient data coverage are disaggregated into contributions from regional background, urban increment, and local roadside increment. The future evolution of each contribution is assessed with a model of the appropriate scale - 28 × 28 km grid based on the EMEP Model for the regional background, 7 × 7 km urban increment based on the CHIMERE Chemistry Transport Model, and a chemical box model for the roadside increment. Thus, different emission scenarios and control options for long-range transport, regional and local emissions can be analysed. Observed concentrations and historical trends are well captured, in particular the differing NO2 and total NOx = NO + NO2 trends. Altogether, more than 1950 air quality monitoring stations in the EU are covered by the model, including more than 400 traffic stations and 70% of the critical stations. Together with its well-established bottom-up emission and dispersion calculation scheme, GAINS is thus able to bridge the scales from European-wide policies to impacts in street canyons. As an application of the model, we assess the evolution of attainment of NO2 limit values under current legislation until 2030. Strong improvements are expected with the introduction of the Euro 6 emission standard for light duty vehicles; however, for some major European cities, further measures may be required, in particular if aiming to achieve compliance at an earlier time.
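
    The decomposition described above (roadside concentration as the sum of regional background, urban increment, and local roadside increment, each projected with a model of the appropriate scale) can be summarised in a few lines. The sketch below is illustrative only; the concentrations and scaling factors are invented and are not GAINS or AirBase values.

    ```python
    # Recombining the three NO2 contributions under a scenario (illustrative only).

    def project_roadside_no2(regional, urban_increment, roadside_increment,
                             f_regional, f_urban, f_local):
        """Apply a scenario scaling factor to each contribution and sum them."""
        return (regional * f_regional
                + urban_increment * f_urban
                + roadside_increment * f_local)

    if __name__ == "__main__":
        # hypothetical base-year decomposition of a 58 ug/m3 roadside annual mean
        regional, urban, local = 12.0, 18.0, 28.0
        # hypothetical future scaling of each contribution under an emission scenario
        print(project_roadside_no2(regional, urban, local, 0.8, 0.7, 0.5))  # 36.2 ug/m3
    ```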

  13. Modelling NO2 concentrations at the street level in the GAINS integrated assessment model: projections under current legislation

    NASA Astrophysics Data System (ADS)

    Kiesewetter, G.; Borken-Kleefeld, J.; Schöpp, W.; Heyes, C.; Thunis, P.; Bessagnet, B.; Terrenoire, E.; Gsella, A.; Amann, M.

    2014-01-01

    NO2 concentrations at the street level are a major concern for urban air quality in Europe and have been regulated under the EU Thematic Strategy on Air Pollution. Despite the legal requirements, limit values are exceeded at many monitoring stations with little or no improvement in recent years. In order to assess the effects of future emission control regulations on roadside NO2 concentrations, a downscaling module has been implemented in the GAINS integrated assessment model. The module follows a hybrid approach based on atmospheric dispersion calculations and observations from the AirBase European air quality database that are used to estimate site-specific parameters. Pollutant concentrations at every monitoring site with sufficient data coverage are disaggregated into contributions from regional background, urban increment, and local roadside increment. The future evolution of each contribution is assessed with a model of the appropriate scale: 28 × 28 km grid based on the EMEP Model for the regional background, 7 × 7 km urban increment based on the CHIMERE Chemistry Transport Model, and a chemical box model for the roadside increment. Thus, different emission scenarios and control options for long-range transport as well as regional and local emissions can be analysed. Observed concentrations and historical trends are well captured, in particular the differing NO2 and total NOx = NO + NO2 trends. Altogether, more than 1950 air quality monitoring stations in the EU are covered by the model, including more than 400 traffic stations and 70% of the critical stations. Together with its well-established bottom-up emission and dispersion calculation scheme, GAINS is thus able to bridge the scales from European-wide policies to impacts in street canyons. As an application of the model, we assess the evolution of attainment of NO2 limit values under current legislation until 2030. Strong improvements are expected with the introduction of the Euro 6 emission standard for light duty vehicles; however, for some major European cities, further measures may be required, in particular if aiming to achieve compliance at an earlier time.

  14. Essays in financial economics and econometrics

    NASA Astrophysics Data System (ADS)

    La Spada, Gabriele

    Chapter 1 (my job market paper) asks the following question: Do asset managers reach for yield because of competitive pressures in a low rate environment? I propose a tournament model of money market funds (MMFs) to study this issue. I show that funds with different costs of default respond differently to changes in interest rates, and that it is important to distinguish the role of risk-free rates from that of risk premia. An increase in the risk premium leads funds with lower default costs to increase risk-taking, while funds with higher default costs reduce risk-taking. Without changes in the premium, low risk-free rates reduce risk-taking. My empirical analysis shows that these predictions are consistent with the risk-taking of MMFs during the 2006--2008 period. Chapter 2, co-authored with Fabrizio Lillo and published in Studies in Nonlinear Dynamics and Econometrics (2014), studies the effect of round-off error (or discretization) on stationary Gaussian long-memory process. For large lags, the autocovariance is rescaled by a factor smaller than one, and we compute this factor exactly. Hence, the discretized process has the same Hurst exponent as the underlying one. We show that in presence of round-off error, two common estimators of the Hurst exponent, the local Whittle (LW) estimator and the detrended fluctuation analysis (DFA), are severely negatively biased in finite samples. We derive conditions for consistency and asymptotic normality of the LW estimator applied to discretized processes and compute the asymptotic properties of the DFA for generic long-memory processes that encompass discretized processes. Chapter 3, co-authored with Fabrizio Lillo, studies the effect of round-off error on integrated Gaussian processes with possibly correlated increments. We derive the variance and kurtosis of the realized increment process in the limit of both "small" and "large" round-off errors, and its autocovariance for large lags. We propose novel estimators for the variance and lag-one autocorrelation of the underlying, unobserved increment process. We also show that for fractionally integrated processes, the realized increments have the same Hurst exponent as the underlying ones, but the LW estimator applied to the realized series is severely negatively biased in medium-sized samples.

  15. Towards a Decision Support System for Space Flight Operations

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Hogle, Charles; Ruszkowski, James

    2013-01-01

    The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework which is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of a spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. Any discrepancy between the anticipated system behavior and the observed behavior may be due to the processing of tasks internally, or due to external factors such as changes in program requirements or conditions associated with other organizations that are outside of MOD. The paper provides a roadmap for the three increments of this vision. These increments include (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management and (3) re-planning and automated execution. Each of these increments provides value independently, but some may also enable the building of a subsequent increment.

  16. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since its successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning, but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The subject of the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are likewise subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that is rigid enough to provide reliable verification of on-board safety, yet flexible enough to accommodate manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learnt have already been implemented, especially in the IEHA, which improve the flexibility of on-board operations without degrading safety.

  17. A Microcomputer-Based Program for Printing Check Plots of Integrated Circuits Specified in Caltech Intermediate Form.

    DTIC Science & Technology

    1984-12-01

    only four transistors[5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques...rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the...community, it is not uncommon for industry to borrow ideas and even particular programs from these university designed tools. The Very Large Scale Integration

  18. Managing and delivering of 3D geo data across institutions has a web based solution - intermediate results of the project GeoMol.

    NASA Astrophysics Data System (ADS)

    Gietzel, Jan; Schaeben, Helmut; Gabriel, Paul

    2014-05-01

    The increasing relevance of geological information for policy and economy at transnational level has recently been recognized by the European Commission, which has called for harmonized information related to reserves and resources in the EU Member States. GeoMol's transnational approach responds to that, providing consistent and seamless 3D geological information of the Alpine Foreland Basins based on harmonized data and agreed methodologies. However, until recently no adequate tool existed to ensure full interoperability among the involved GSOs and to distribute the multi-dimensional information of a transnational project facing diverse data policies, database systems and software solutions. In recent years, (open) standards describing 2D spatial data have been developed and implemented in different software systems, including production environments for 2D spatial data (like regular 2D GI systems). Easy yet secured access to the data is of utmost importance and thus a priority for any spatial data infrastructure. To overcome limitations imposed by highly sophisticated and platform-dependent geo-modelling software packages, the functionalities of a web portal can be utilized. Combining a web portal with a "check-in/check-out" system allows distributed, organized editing of data and models, but requires standards for the exchange of 3D geological information to ensure interoperability. Another major concern is the management of large models and the ability of 3D tiling into spatially restricted models with refined resolution, especially when creating countrywide models. Using GST ("Geosciences in Space and Time"), developed initially at TU Bergakademie Freiberg and continuously extended by the company GiGa infosystems, which incorporates these key issues and is based on an object-relational data model, it is possible to check out parts of models or whole models for edits and check them in again after modification. GST is the core of GeoMol's web-based collaborative environment designed to serve the GSOs concerned and the scientific community. Recently, common user spaces have been installed, providing a central access point to manage locally stored data at each of the project partners' IT sites. This distributed, organized system keeps the data of the live system locally and shares only cleared portions of the data, thus adhering to national regulations on geo data access. GST also allows for the dynamic generation of virtual drilling profiles and cross sections of the stored models. As this makes it possible to deduce classified borehole data, a role-based login gives full access to the live system only to legally mandated or licensed bodies. The beta version of GeoMol's GST-based geo data infrastructure and dissemination tool for multi-dimensional information, implemented incrementally, will be installed on GeoMol's website (http://geomol.eu) by the end of February. It will be available for testing to further improve the performance and applicability of GeoMol's 3D-Explorer for instant web-based access to GeoMol's future outputs. The project GeoMol is co-funded by the Alpine Space Program as part of the European Territorial Cooperation 2007-2013. The project integrates partners from Austria, France, Germany, Italy, Slovenia and Switzerland and runs from September 2012 to June 2015. Further information on http://geomol.eu.

  19. Energy Based Multiscale Modeling with Non-Periodic Boundary Conditions

    DTIC Science & Technology

    2013-05-13

    below in Figure 8. At each incremental step in the analysis, the user material defined subroutine (UMAT) was utilized to perform the communication...initiation and modeling using XFEM. Appropriate localization schemes will be developed to allow for deformations conducive for crack opening...REFERENCES 1. Talreja R., 2006, "Damage analysis for structural integrity and durability of composite materials," Fatigue & Fracture of

  20. Developing the Systems Engineering Experience Accelerator (SEEA) Prototype and Roadmap

    DTIC Science & Technology

    2012-10-24

    system attributes. These metrics track non-requirements performance, typically relate to production cost per unit, maintenance costs, training costs...immediately implement lessons learned from the training experience to the job, assuming the culture allows this. 1.3 MANAGEMENT PLAN/TECHNICAL OVERVIEW...resolving potential conflicts as they arise. Incrementally implement and continuously integrate capability in priority order, to ensure that final system

  1. Thermoviscoplastic analysis of fibrous periodic composites using triangular subvolumes

    NASA Technical Reports Server (NTRS)

    Walker, Kevin P.; Freed, Alan D.; Jordan, Eric H.

    1993-01-01

    The nonlinear viscoplastic behavior of fibrous periodic composites is analyzed by discretizing the unit cell into triangular subvolumes. A set of these subvolumes can be configured by the analyst to construct a representation for the unit cell of a periodic composite. In each step of the loading history, the total strain increment at any point is governed by an integral equation which applies to the entire composite. A Fourier series approximation allows the incremental stresses and strains to be determined within a unit cell of the periodic lattice. The nonlinearity arising from the viscoplastic behavior of the constituent materials comprising the composite is treated as a fictitious body force in the governing integral equation. Specific numerical examples showing the stress distributions in the unit cell of a fibrous tungsten/copper metal matrix composite under viscoplastic loading conditions are given. The stress distribution resulting in the unit cell when the composite material is subjected to an overall transverse stress loading history perpendicular to the fibers is found to be highly heterogeneous, and typical homogenization techniques based on treating the stress and strain distributions within the constituent phases as homogeneous result in large errors under inelastic loading conditions.
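
    The governing integral equation is not reproduced in the abstract; schematically, Fourier-based analyses of periodic composites of this kind are built around a Lippmann-Schwinger-type relation in which the inelastic (here viscoplastic) contribution enters through a polarization or fictitious body-force term. The form below is a sketch under that assumption, with notation of our own rather than the paper's.

    ```latex
    % Schematic Lippmann-Schwinger form for the strain increment in a periodic
    % unit cell: Gamma^0 is the periodic Green operator of a homogeneous
    % reference medium, Delta eps^0 the applied (average) strain increment, and
    % tau a polarization collecting the heterogeneity and viscoplastic terms.
    \[
      \Delta\varepsilon(\mathbf{x}) \;=\; \Delta\varepsilon^{0}
        \;+\; \int_{V} \Gamma^{0}(\mathbf{x}-\mathbf{x}') : \tau(\mathbf{x}')\,\mathrm{d}V'
    \]
    ```

    The convolution with the periodic Green operator is what the abstract's Fourier series approximation evaluates term by term over the unit cell.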

  2. Thermoviscoplastic analysis of fibrous periodic composites by the use of triangular subvolumes

    NASA Technical Reports Server (NTRS)

    Walker, Kevin P.; Freed, Alan D.; Jordan, Eric H.

    1994-01-01

    The non-linear viscoplastic behavior of fibrous periodic composites is analyzed by discretizing the unit cell into triangular subvolumes. A set of these subvolumes can be configured by the analyst to construct a representation for the unit cell of a periodic composite. In each step of the loading history the total strain increment at any point is governed by an integral equation which applies to the entire composite. A Fourier series approximation allows the incremental stresses and strains to be determined within a unit cell of the periodic lattice. The non-linearity arising from the viscoplastic behavior of the constituent materials comprising the composite is treated as a fictitious body force in the governing integral equation. Specific numerical examples showing the stress distributions in the unit cell of a fibrous tungsten/copper metal-matrix composite under viscoplastic loading conditions are given. The stress distribution resulting in the unit cell when the composite material is subjected to an overall transverse stress loading history perpendicular to the fibers is found to be highly heterogeneous, and typical homogenization techniques based on treating the stress and strain distributions within the constituent phases as homogeneous result in large errors under inelastic loading conditions.

  3. Implicit negotiation beliefs and performance: experimental and longitudinal evidence.

    PubMed

    Kray, Laura J; Haselhuhn, Michael P

    2007-07-01

    The authors argue that implicit negotiation beliefs, which speak to the expected malleability of negotiating ability, affect performance in dyadic negotiations. They expected negotiators who believe negotiating attributes are malleable (incremental theorists) to outperform negotiators who believe negotiating attributes are fixed (entity theorists). In Study 1, they gathered evidence of convergent and discriminant validity for the implicit negotiation belief construct. In Study 2, they examined the impact of implicit beliefs on the achievement goals that negotiators pursue. In Study 3, they explored the causal role of implicit beliefs on negotiation performance by manipulating negotiators' implicit beliefs within dyads. They also identified perceived ability as a moderator of the link between implicit negotiation beliefs and performance. In Study 4, they measured negotiators' beliefs in a classroom setting and examined how these beliefs affected negotiation performance and overall performance in the course 15 weeks later. Across all performance measures, incremental theorists outperformed entity theorists. Consistent with the authors' hypotheses, incremental theorists captured more of the bargaining surplus and were more integrative than their entity theorist counterparts, suggesting implicit theories are important determinants of how negotiators perform. Implications and future directions are discussed. Copyright 2007 APA, all rights reserved.

  4. AWARE@HOME: PROFITABLY INTEGRATING CONSERVATION INTO THE AMERICAN HOME

    EPA Science Inventory

    While American households are the most resource consuming in the world, they are unlikely to become more efficient users of public utilities because of: 1) large time delays between utility use and the receipt of utility bills; 2) the inconvenience of personally checking and ...

  5. Integrative Lifecourse and Genetic Analysis of Military Working Dogs

    DTIC Science & Technology

    2012-10-01

    Recognition), ICR (Intelligent Character Recognition) and HWR (Handwriting Recognition). A number of various software packages were evaluated and we have...the third-party software is able to recognize check-boxes and columns and do a reasonable job with handwriting – which it does. This workflow will

  6. Integrative Lifecourse and Genetic Analysis of Military Working Dogs

    DTIC Science & Technology

    2012-10-01

    Intelligent Character Recognition) and HWR (Handwriting Recognition). A number of various software packages were evaluated and we have settled on a...third-party software is able to recognize check-boxes and columns and do a reasonable job with handwriting – which it does. This workflow will

  7. Managing Mission-Critical Infrastructure

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2012-01-01

    Libraries depend on sophisticated business applications specifically designed to support their work. This infrastructure consists of such components as integrated library systems, their associated online catalogs or discovery services, and self-check equipment, as well as a Web site and the various online tools and services…

  8. Quantification of Interactions between Dynamic Cellular Network Functionalities by Cascaded Layering

    PubMed Central

    Prescott, Thomas P.; Lang, Moritz; Papachristodoulou, Antonis

    2015-01-01

    Large, naturally evolved biomolecular networks typically fulfil multiple functions. When modelling or redesigning such systems, functional subsystems are often analysed independently first, before subsequent integration into larger-scale computational models. In the design and analysis process, it is therefore important to quantitatively analyse and predict the dynamics of the interactions between integrated subsystems; in particular, how the incremental effect of integrating a subsystem into a network depends on the existing dynamics of that network. In this paper we present a framework for simulating the contribution of any given functional subsystem when integrated together with one or more other subsystems. This is achieved through a cascaded layering of a network into functional subsystems, where each layer is defined by an appropriate subset of the reactions. We exploit symmetries in our formulation to exhaustively quantify each subsystem’s incremental effects with minimal computational effort. When combining subsystems, their isolated behaviour may be amplified, attenuated, or be subject to more complicated effects. We propose the concept of mutual dynamics to quantify such nonlinear phenomena, thereby defining the incompatibility and cooperativity between all pairs of subsystems when integrated into any larger network. We exemplify our theoretical framework by analysing diverse behaviours in three dynamic models of signalling and metabolic pathways: the effect of crosstalk mechanisms on the dynamics of parallel signal transduction pathways; reciprocal side-effects between several integral feedback mechanisms and the subsystems they stabilise; and consequences of nonlinear interactions between elementary flux modes in glycolysis for metabolic engineering strategies. Our analysis shows that it is not sufficient to just specify subsystems and analyse their pairwise interactions; the environment in which the interaction takes place must also be explicitly defined. Our framework provides a natural representation of nonlinear interaction phenomena, and will therefore be an important tool for modelling large-scale evolved or synthetic biomolecular networks. PMID:25933116

  9. Planarization of metal films for multilevel interconnects

    DOEpatents

    Tuckerman, D.B.

    1985-06-24

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration.

  10. Planarization of metal films for multilevel interconnects

    DOEpatents

    Tuckerman, David B.

    1987-01-01

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration.

  11. Planarization of metal films for multilevel interconnects

    DOEpatents

    Tuckerman, David B.

    1989-01-01

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration.

  12. Planarization of metal films for multilevel interconnects

    DOEpatents

    Tuckerman, D.B.

    1985-08-23

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration.

  13. Planarization of metal films for multilevel interconnects

    DOEpatents

    Tuckerman, D.B.

    1989-03-21

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration. 6 figs.

  14. Vortex tubes in turbulence velocity fields at Reynolds numbers Re lambda approximately equal to 300-1300.

    PubMed

    Mouri, Hideaki; Hori, Akihiro; Kawashima, Yoshihide

    2004-12-01

    The most elementary structures of turbulence, i.e., vortex tubes, are studied using velocity data obtained in a laboratory experiment for boundary layers with Reynolds numbers Re(lambda) = 295-1258. We conduct conditional averaging for enhancements of a small-scale velocity increment and obtain the typical velocity profile for vortex tubes. Their radii are of the order of the Kolmogorov length. Their circulation velocities are of the order of the root-mean-square velocity fluctuation. We also obtain the distribution of the interval between successive enhancements of the velocity increment as the measure of the spatial distribution of vortex tubes. They tend to cluster together below about the integral length and more significantly below about the Taylor microscale. These properties are independent of the Reynolds number and are hence expected to be universal.

  15. Case study: Lockheed-Georgia Company integrated design process

    NASA Technical Reports Server (NTRS)

    Waldrop, C. T.

    1980-01-01

    A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.

  16. Simulation of load traffic and steeped speed control of conveyor

    NASA Astrophysics Data System (ADS)

    Reutov, A. A.

    2017-10-01

    The article examines the possibilities of simulating stepped control of conveyor speed in Mathcad, Simulink and Stateflow software. To check the efficiency of the control algorithms and to determine the characteristics of the control system more accurately, it is necessary to simulate the speed-control process with real traffic values for a work shift or for a day. To evaluate the belt workload and the absence of spillage, empirical values of load flow over a shorter period of time must be used. Analytical formulas for the optimal speed step values were derived using empirical load values. The simulation checks the acceptability of an algorithm and determines the optimal regulation parameters corresponding to the load flow characteristics. The average speed and the number of speed switchings during the simulation are adopted as criteria of regulation efficiency. A simulation example is implemented in Mathcad. Two-step and three-step control reduce the average conveyor speed substantially; a further increase in the number of regulation steps decreases the average speed only slightly but considerably increases the frequency of speed switching. The incremental speed-regulation algorithm uses a different number of stages for increasing and decreasing load flow, which allows smooth control of conveyor speed changes under monotonic variation of the load flow; load flow oscillation, however, leads to unjustified increases or decreases of speed. The results can be applied in the design of belt conveyors with adjustable drives.
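
    A minimal sketch of the stepped speed-control idea discussed above (hypothetical step levels, capacities and thresholds, not the authors' Mathcad model): the controller picks the lowest admissible speed step for the current load flow, with a hysteresis margin to limit switching, and the run is scored by the two criteria named in the abstract, average speed and number of switchings.

    ```python
    import numpy as np

    # Hypothetical discrete belt speeds (m/s) and the maximum load flow (kg/s)
    # each speed can carry without spillage.
    SPEED_STEPS = np.array([1.0, 2.0, 3.0, 4.0])
    MAX_FLOW    = np.array([80.0, 160.0, 240.0, 320.0])
    HYSTERESIS  = 0.9   # step down only when flow falls below 90% of the lower step's limit

    def run_step_control(flow):
        """Simulate stepped speed control over a load-flow time series."""
        speeds, level, switches = [], len(SPEED_STEPS) - 1, 0
        for q in flow:
            # step up as needed to carry the current flow
            while level < len(SPEED_STEPS) - 1 and q > MAX_FLOW[level]:
                level += 1
                switches += 1
            # step down only with a hysteresis margin, to avoid chattering
            while level > 0 and q < HYSTERESIS * MAX_FLOW[level - 1]:
                level -= 1
                switches += 1
            speeds.append(SPEED_STEPS[level])
        return np.mean(speeds), switches

    # Synthetic shift of load flow: slow trend plus random fluctuations.
    rng = np.random.default_rng(1)
    flow = np.clip(150 + 60 * np.sin(np.linspace(0, 6, 500))
                   + 20 * rng.standard_normal(500), 0, None)

    avg_speed, n_switch = run_step_control(flow)
    print(f"average speed {avg_speed:.2f} m/s, {n_switch} speed switchings")
    ```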

  17. Model selection and assessment for multi­-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  18. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol.

    PubMed

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-09-02

    There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and practice of evidence-based medicine in oncology. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
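
    As a sketch of the planned numerical-integration step (hypothetical data; the protocol's exact error propagation is not reproduced here), the incremental area under the health-related quality of life curve between two arms can be computed with the trapezoidal rule:

    ```python
    import numpy as np

    # Hypothetical mean HRQoL scores sampled at follow-up visits (months).
    t = np.array([0.0, 3.0, 6.0, 9.0, 12.0])
    hrqol_experimental = np.array([62.0, 66.0, 68.0, 67.0, 65.0])
    hrqol_control      = np.array([62.0, 63.0, 62.0, 60.0, 58.0])

    # Area under each curve by the trapezoidal rule (score x months),
    # then the incremental AUC between treatment groups.
    auc_exp = np.trapz(hrqol_experimental, t)
    auc_ctl = np.trapz(hrqol_control, t)
    incremental_auc = auc_exp - auc_ctl

    print(f"AUC experimental: {auc_exp:.1f}, AUC control: {auc_ctl:.1f}")
    print(f"Incremental HRQoL AUC: {incremental_auc:.1f} score-months")
    ```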

  19. System Identification for Integrated Aircraft Development and Flight Testing (l’Identification des systemes pour le developpement integre des aeronefs et les essais en vol)

    DTIC Science & Technology

    1999-03-01

    aerodynamics to affect load motions. The effects include a load trail angle in proportion to the drag specific force, and modification of the load pendulum...equations algorithm for flight data filtering and data consistency checking; and SCIDNT, an output error identification architecture...accelerations at the seven sensor locations. When system identification is performed, the identified system is proportional to the number of flexible modes, as

  20. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  1. 40 CFR 63.7740 - What are my monitoring requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... a bag leak detection system according to the requirements in § 63.7741(b). (c) For each baghouse... the proper functioning of removal mechanisms. (3) Check the compressed air supply for pulse-jet... integrity of the baghouse through quarterly visual inspections of the baghouse interior for air leaks. (8...

  2. Towards Quantifying Programmable Logic Controller Resilience Against Intentional Exploits

    DTIC Science & Technology

    2012-03-22

    may improve the SCADA system’s resilience against DoS and man-in-the-middle (MITM) attacks. DoS attacks may be mitigated by using the redundant...paths available on the network links. MITM attacks may be mitigated by the data integrity checks associated with the middleware. Figure 4 illustrates

  3. Does It Work? 555-Timer Checker Leaves No Doubt

    ERIC Educational Resources Information Center

    Harman, Charles

    2009-01-01

    This article details the construction and use of the 555-timer checker. The 555-timer checker allows the user to dynamically check a 555-timer, an integrated circuit device. Of its many applications, it provides timing applications that unijunction transistors once performed. (Contains 4 figures and 3 photos.)

  4. The Z1 truss is moved to check weight and balance

    NASA Technical Reports Server (NTRS)

    2000-01-01

    In the Space Station Processing Facility, photographers focus on the Integrated Truss Structure Z1, an element of the International Space Station, suspended by a crane overhead. The truss is being moved to another stand to check its weight and balance. The Z1 truss is the first of 10 trusses that will become the backbone of the Space Station, eventually stretching the length of a football field. Along with its companion payload, the third Pressurized Mating Adapter, the Z1 is scheduled to be launched aboard Space Shuttle Discovery Oct. 5 at 9:38 p.m. EDT. The launch will be the 100th in the Shuttle program.

  5. STS-98 U.S. Lab Destiny is moved out of Atlantis' payload bay

    NASA Technical Reports Server (NTRS)

    2001-01-01

    KENNEDY SPACE CENTER, Fla. -- Workers in the Payload Changeout Room check the Payload Ground Handling Mechanism that will move the U.S. Lab Destiny out of Atlantis' payload bay and into the PCR. After the move, Atlantis will roll back to the Vehicle Assembly Building to allow workers to conduct inspections, continuity checks and X-ray analysis on the 36 solid rocket booster cables located inside each booster's system tunnel. An extensive evaluation of NASA's SRB cable inventory revealed conductor damage in four (of about 200) cables on the shelf. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis.

  6. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    STS-92 crew members discuss results of a Leak Seal Kit Fit Check on the Pressurized Mating Adapter -3, part of their mission payload, with JSC and Boeing representatives. From left are Mission Specialists Michael E. Lopez-Alegria; Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA); (standing) Peter J.K. 'Jeff' Wisoff (Ph.D.) and William Surles 'Bill' McArthur Jr.; (seated) Pilot Pamela A. Melroy; Dave Moore (behind Melroy), with Boeing; Mission Specialist Leroy Chiao (Ph.D.); Brian Warkentine, with JSC; and Commander Brian Duffy. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  7. Comparing the Correlation Length of Grain Markets in China and France

    NASA Astrophysics Data System (ADS)

    Roehner, Bertrand M.; Shiue, Carol H.

    In economics, comparative analysis plays the same role as experimental research in physics. In this paper, we closely examine several methodological problems related to comparative analysis by investigating the specific example of grain markets in China and France respectively. This enables us to answer a question in economic history which has so far remained pending, namely whether or not market integration progressed in the 18th century. In economics as in physics, before any new result is accepted, it has to be checked and re-checked by different researchers. This is what we call the replication and comparison procedures. We show how these procedures should (and can) be implemented.

  8. Integrated management of little spruce sawfly (Pristiphora abietina): design pattern

    Treesearch

    J. Holusa; K. Drapela

    2003-01-01

    This paper presents the design of a regulation model of the little spruce sawfly that is based on monitoring of defoliation. This method is based on the assessment of the percentage of defoliation of each whorl from ca. 100+ trees, beginning from the top and using four classes of defoliation. The critical value influencing the height of current increment of trees is...

  9. What Does Leaders' Character Add to Transformational Leadership?

    PubMed

    Liborius, Patrick

    2017-04-03

    The influence of leaders' character (e.g., integrity, humility/forgiveness) has rarely been examined in leadership research. The current investigation focused on the impact of integrity and humility/forgiveness on both followers' perceptions of leaders' worthiness of being followed (WBF) and stress. Results from a scenario experiment (n = 347) and a field study (n = 110) indicated that these aspects incrementally predict WBF above and beyond the impact of transformational leadership. Similar results were found concerning followers' stress with the exception of leader integrity in the field study. According to relative importance analyses, integrity and transformational leadership predict WBF equally well. The results have conceivable implications for human resources (personnel selection and development). Future research should examine additional outcome variables that are affected by certain leader characteristics as well as potential negative effects of the examined character aspects.

  10. Using surface integrals for checking Archimedes' law of buoyancy

    NASA Astrophysics Data System (ADS)

    Lima, F. M. S.

    2012-01-01

    A mathematical derivation of the force exerted by an inhomogeneous (i.e. compressible) fluid on the surface of an arbitrarily shaped body immersed in it is not found in the literature, which may be attributed to our trust in Archimedes' law of buoyancy. However, this law, also known as Archimedes' principle (AP), does not yield the force observed when the body is in contact with the container walls, as is more evident in the case of a block immersed in a liquid and in contact with the bottom, in which a downward force that increases with depth is observed. In this work, by taking into account the surface integral of the pressure force exerted by a fluid over the surface of a body, the general validity of AP is checked. For a body fully surrounded by a fluid, homogeneous or not, a gradient version of the divergence theorem applies, yielding a volume integral that simplifies to an upward force which agrees with the force predicted by AP, as long as the fluid density is a continuous function of depth. For the bottom case, this approach yields a downward force that increases with depth, which contrasts to AP but is in agreement with experiments. It also yields a formula for this force which shows that it increases with the area of contact.
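
    For reference, the surface-integral argument summarized above can be written compactly (a sketch of the standard derivation for a body fully surrounded by the fluid; the notation is ours, not the paper's):

    ```latex
    % Pressure force on the closed surface S bounding a body of volume V, via
    % the gradient version of the divergence theorem; z is measured upward and
    % the fluid is in hydrostatic equilibrium, dp/dz = -rho(z) g.
    \[
      \mathbf{F} \;=\; -\oint_{S} p\,\hat{\mathbf{n}}\,\mathrm{d}A
                 \;=\; -\int_{V} \nabla p \,\mathrm{d}V
                 \;=\; g\,\hat{\mathbf{z}} \int_{V} \rho(z)\,\mathrm{d}V
    \]
    ```

    That is an upward force equal to the weight of the displaced fluid, valid only while the body is fully surrounded; when the body rests on the bottom, part of the surface is not wetted, and the remaining surface integral yields the depth-increasing downward force discussed in the abstract.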

  11. OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4.

    PubMed

    Schober, Daniel; Tudose, Ilinca; Svatek, Vojtech; Boeker, Martin

    2012-09-21

    Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. We provide a plugin for the Protégé Ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation of any violations found. In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the needed capabilities for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, in part through the integration of new functionalities. The new Protégé plugin, OntoCheck, allows for ontology tests to be carried out on OWL ontologies. In particular, the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers.
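
    To illustrate the kind of naming-convention and metadata-completeness test such a plugin automates (a standalone sketch, not OntoCheck's actual implementation or the Protégé/OWL API), the snippet below checks class labels against a simple lowercase-with-spaces pattern and flags missing definitions; the class identifiers and metadata are invented.

    ```python
    import re

    # Hypothetical class metadata: label and (possibly missing) textual definition.
    classes = {
        "EX:0000001": {"label": "cell membrane", "definition": "The lipid bilayer ..."},
        "EX:0000002": {"label": "Golgi_Apparatus", "definition": None},
        "EX:0000003": {"label": "mitochondrion", "definition": ""},
    }

    # Naming convention: lowercase words separated by single spaces.
    LABEL_PATTERN = re.compile(r"^[a-z][a-z0-9]*( [a-z0-9]+)*$")

    def check_ontology(classes):
        """Return (class id, problem) pairs for convention violations."""
        problems = []
        for cid, meta in classes.items():
            if not LABEL_PATTERN.match(meta["label"]):
                problems.append((cid, f"label violates naming convention: {meta['label']!r}"))
            if not meta.get("definition"):
                problems.append((cid, "missing textual definition"))
        return problems

    for cid, problem in check_ontology(classes):
        print(cid, "-", problem)
    ```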

  12. Students take the lead for learning in practice: A process for building self-efficacy into undergraduate nursing education.

    PubMed

    Henderson, Amanda; Harrison, Penny; Rowe, Jennifer; Edwards, Sam; Barnes, Margaret; Henderson, Simon; Henderson, Amanda

    2018-04-10

    To prepare graduate nurses for practice, the curriculum and pedagogy need to facilitate student engagement, active learning and the development of self-efficacy. This pilot project describes and explores an initiative, the Check-in and Check-out process, that aims to engage students as active partners in their learning and teaching in their clinical preparation for practice. Three interdependent elements make up the process: a check-in (briefing) part; a clinical practice part, which supports students as they engage in their learning and practise clinical skills; and a check-out (debriefing) part. A student evaluation of this initiative confirmed the value of the process, which has subsequently been embedded in the preparation for practice and work-integrated learning courses in the undergraduate nursing programs at the participating university. The introduction of a singular learning process provides consistency in the learning approach used across clinical learning spaces, irrespective of their location or focus. A consistent learning process-including a common language that easily transfers across all clinical courses and clinical settings-arguably enhances the students' learning experience, helps them to actively manage their preparation for clinical practice and to develop self-efficacy. Copyright © 2018. Published by Elsevier Ltd.

  13. Stateless and stateful implementations of faithful execution

    DOEpatents

    Pierson, Lyndon G; Witzke, Edward L; Tarman, Thomas D; Robertson, Perry J; Eldridge, John M; Campbell, Philip L

    2014-12-16

    A faithful execution system includes system memory, a target processor, and protection engine. The system memory stores a ciphertext including value fields and integrity fields. The value fields each include an encrypted executable instruction and the integrity fields each include an encrypted integrity value for determining whether a corresponding one of the value fields has been modified. The target processor executes plaintext instructions decoded from the ciphertext while the protection engine is coupled between the system memory and the target processor. The protection engine includes logic to retrieve the ciphertext from the system memory, decrypt the value fields into the plaintext instructions, perform an integrity check based on the integrity fields to determine whether any of the corresponding value fields have been modified, and provide the plaintext instructions to the target processor for execution.
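
    The patent abstract above describes verifying an integrity field before executing the decrypted instruction. The toy sketch below mirrors that check-then-execute flow in software; the XOR "cipher", HMAC construction, keys and instruction format are illustrative stand-ins, not the patented design.

    ```python
    import hmac, hashlib

    KEY_ENC = b"toy-encryption-key"   # stand-in for the real cipher key
    KEY_MAC = b"toy-integrity-key"    # key for computing the integrity (MAC) fields

    def xor_stream(data: bytes, key: bytes) -> bytes:
        """Toy XOR 'cipher' standing in for a real block cipher."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def protect(addr: int, instruction: bytes):
        """Produce a (value field, integrity field) pair for one instruction."""
        value = xor_stream(instruction, KEY_ENC)
        tag = hmac.new(KEY_MAC, addr.to_bytes(8, "big") + value, hashlib.sha256).digest()
        return value, tag

    def fetch_and_execute(addr: int, value: bytes, tag: bytes):
        """Protection-engine style fetch: verify the integrity field, then decrypt."""
        expected = hmac.new(KEY_MAC, addr.to_bytes(8, "big") + value, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise RuntimeError(f"integrity check failed at address {addr:#x}")
        plaintext = xor_stream(value, KEY_ENC)
        print(f"executing {plaintext!r} from {addr:#x}")  # hand off to the 'target processor'

    value, tag = protect(0x1000, b"ADD R1, R2")
    fetch_and_execute(0x1000, value, tag)                 # passes the integrity check
    try:
        fetch_and_execute(0x1000, value[:-1] + b"\x00", tag)  # tampered value field
    except RuntimeError as err:
        print("blocked:", err)
    ```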

  14. Using MathCad to Evaluate Exact Integral Formulations of Spacecraft Orbital Heats for Primitive Surfaces at Any Orientation

    NASA Technical Reports Server (NTRS)

    Pinckney, John

    2010-01-01

    With the advent of high speed computing, Monte Carlo ray tracing techniques have become the preferred method for evaluating spacecraft orbital heats. Monte Carlo has its greatest advantage where there are many interacting surfaces. However, Monte Carlo programs are specialized programs that suffer from some inaccuracy, long calculation times and high purchase cost. A general orbital heating integral is presented here that is accurate, fast and runs on MathCad, a generally available engineering mathematics program. The integral is easy to read, understand and alter. The integral can be applied to unshaded primitive surfaces at any orientation. The method is limited to direct heating calculations. This integral formulation can be used for quick orbit evaluations and spot checking Monte Carlo results.

  15. Finite element implementation of state variable-based viscoplasticity models

    NASA Technical Reports Server (NTRS)

    Iskovitz, I.; Chang, T. Y. P.; Saleeb, A. F.

    1991-01-01

    State variable-based viscoplasticity models are implemented in a general-purpose finite element code for structural applications of metals deformed at elevated temperatures. Two constitutive models, Walker's and Robinson's, are studied in conjunction with two implicit integration methods: the trapezoidal rule with Newton-Raphson iterations and an asymptotic integration algorithm. A comparison is made between the two integration methods, and the latter appears to be computationally more appealing in terms of numerical accuracy and CPU time. However, in order to make the asymptotic algorithm robust, it is necessary to include a self-adaptive scheme with subincremental step control and error checking of the Jacobian matrix at the integration points. Three examples are given to illustrate the numerical aspects of the integration methods tested.
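
    As a minimal illustration of the first integration method mentioned above, the trapezoidal rule with Newton-Raphson iterations, the sketch below integrates a generic stiff state-variable equation; the rate function is invented for the example and is not the Walker or Robinson model.

    ```python
    import numpy as np

    def f(y):
        """Hypothetical stiff state-variable rate equation dy/dt = f(y)."""
        return -50.0 * (y - np.cos(y))

    def dfdy(y):
        """Analytical derivative of f, used in the Newton iteration."""
        return -50.0 * (1.0 + np.sin(y))

    def trapezoidal_newton(y0, h, n_steps, tol=1e-12, max_iter=20):
        """Implicit trapezoidal rule; each step solves
        g(y1) = y1 - y0 - h/2*(f(y0) + f(y1)) = 0 by Newton-Raphson."""
        y = y0
        history = [y]
        for _ in range(n_steps):
            y_new = y                      # initial guess: previous value
            for _ in range(max_iter):
                g = y_new - y - 0.5 * h * (f(y) + f(y_new))
                dg = 1.0 - 0.5 * h * dfdy(y_new)
                delta = g / dg
                y_new -= delta
                if abs(delta) < tol:
                    break
            y = y_new
            history.append(y)
        return np.array(history)

    traj = trapezoidal_newton(y0=2.0, h=0.05, n_steps=40)
    print("steady-state estimate:", round(traj[-1], 6))  # approaches the root of y = cos(y)
    ```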

  16. Planar isotropy of passive scalar turbulent mixing with a mean perpendicular gradient.

    PubMed

    Danaila, L; Dusek, J; Le Gal, P; Anselmet, F; Brun, C; Pumir, A

    1999-08-01

    A recently proposed evolution equation [Vaienti et al., Physica D 85, 405 (1994)] for the probability density functions (PDF's) of turbulent passive scalar increments obtained under the assumptions of fully three-dimensional homogeneity and isotropy is submitted to validation using direct numerical simulation (DNS) results of the mixing of a passive scalar with a nonzero mean gradient by a homogeneous and isotropic turbulent velocity field. It is shown that this approach leads to a quantitatively correct balance between the different terms of the equation, in a plane perpendicular to the mean gradient, at small scales and at large Péclet number. A weaker assumption of homogeneity and isotropy restricted to the plane normal to the mean gradient is then considered to derive an equation describing the evolution of the PDF's as a function of the spatial scale and the scalar increments. A very good agreement between the theory and the DNS data is obtained at all scales. As a particular case of the theory, we derive a generalized form for the well-known Yaglom equation (the isotropic relation between the second-order moments for temperature increments and the third-order velocity-temperature mixed moments). This approach allows us to determine quantitatively how the integral scale properties influence the properties of mixing throughout the whole range of scales. In the simple configuration considered here, the PDF's of the scalar increments perpendicular to the mean gradient can be theoretically described once the sources of inhomogeneity and anisotropy at large scales are correctly taken into account.
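
    The "well-known Yaglom equation" invoked above is, in a commonly quoted isotropic form (our notation; the paper derives a generalized version adapted to the mean-gradient configuration):

    ```latex
    % Yaglom's equation relating the second-order scalar increment moment and
    % the mixed third-order velocity-scalar moment; r is the separation,
    % kappa the scalar diffusivity and \bar{\epsilon}_\theta the mean scalar
    % dissipation rate.
    \[
      -\left\langle \Delta u_{\parallel}\,(\Delta\theta)^{2} \right\rangle
       \;+\; 2\kappa\,\frac{\mathrm{d}}{\mathrm{d}r}
            \left\langle (\Delta\theta)^{2} \right\rangle
       \;=\; \frac{4}{3}\,\bar{\epsilon}_{\theta}\, r
    \]
    ```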

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basavatia, A; Kalnicki, S; Garg, M

    Purpose: To implement a clinically useful palm vein pattern recognition biometric system to ensure that the correct treatment plan is delivered to the correct patient each and every time, and to check the patient in to the department so that the correct medical record is accessed. Methods: A commercially available hand vein scanning system was paired with Aria and utilized an ADT interface from the hospital electronic health system. Integration took place at two points in Aria, version 11 MR2: first at the appointment tracker screen, for front desk medical record access, and second at the queue screen on the 4D treatment console, for the patient's daily time-out. A test patient was utilized to check the accuracy of identification as well as to check that no unintended interactions take place between the 4D treatment console and the hand vein scanning system. This system has been in clinical use since December 2013. Results: Since implementation, 445 patients have been enrolled into our biometric system. 95% of patients learn the correct methodology of hand placement on the scanner on the first try. We have had two instances of a patient not being found because of a bad initial scan. We simply erased the scanned metric and the patient enrolled again in those cases. The accuracy of the match is 100% for each patient; we have not had one patient misidentified. We can state this because we still use the patient photo and date of birth as identifiers. A QA test patient is run monthly to check the integrity of the system. Conclusion: By utilizing palm vein scans along with the date of birth and patient photo, another means of patient identification now exists. This work indicates the successful implementation of technology in the area of patient safety by closing the gap of delivering the wrong plan to a patient in radiation oncology. FOJP Service Corporation covered some of the costs of the hardware and software of the palm vein pattern recognition biometric system.

  18. 4d N = 1 quiver gauge theories and the An Bailey lemma

    NASA Astrophysics Data System (ADS)

    Brünner, Frederic; Spiridonov, Vyacheslav P.

    2018-03-01

    We study the integral Bailey lemma associated with the An-root system and identities for elliptic hypergeometric integrals generated thereby. Interpreting integrals as superconformal indices of four-dimensional N = 1 quiver gauge theories with the gauge groups being products of SU(n + 1), we provide evidence for various new dualities. Further confirmation is achieved by explicitly checking that the 't Hooft anomaly matching conditions hold. We discuss a flavour symmetry breaking phenomenon for supersymmetric quantum chromodynamics (SQCD), and by making use of the Bailey lemma we indicate its manifestation in a web of linear quivers dual to SQCD that exhibits full s-confinement.

  19. Yangian Symmetry and Integrability of Planar N=4 Supersymmetric Yang-Mills Theory.

    PubMed

    Beisert, Niklas; Garus, Aleksander; Rosso, Matteo

    2017-04-07

    In this Letter, we establish Yangian symmetry of planar N=4 supersymmetric Yang-Mills theory. We prove that the classical equations of motion of the model close onto themselves under the action of Yangian generators. Moreover, we propose an off-shell extension of our statement, which is equivalent to the invariance of the action and prove that it is exactly satisfied. We assert that our relationship serves as a criterion for integrability in planar gauge theories by explicitly checking that it applies to the integrable Aharony-Bergman-Jafferis-Maldacena theory but not to the nonintegrable N=1 supersymmetric Yang-Mills theory.

  20. Verifying Data Integrity of Electronically Scanned Pressure Systems at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Panek, Joseph W.

    2001-01-01

    The proper operation of the Electronically Scanned Pressure (ESP) System is critical to accomplishing the following goals: acquisition of highly accurate pressure data for the development of aerospace and commercial aviation systems, and continuous confirmation of data quality to avoid costly, unplanned, repeat wind tunnel or turbine testing. Standard automated setup and checkout routines are necessary to accomplish these goals. Data verification and integrity checks occur at three distinct stages: pretest pressure tubing and system checkouts, daily system validation, and in-test confirmation of critical system parameters. This paper will give an overview of the existing hardware, software and methods used to validate data integrity.

  1. [Filariasis control: entry point for other helminthiasis control programs?].

    PubMed

    Boussinesq, M

    2006-08-01

    Filariasis control programs are based on a decentralized drug distribution strategy known as "community-directed". This strategy could also be applied to the control of schistosomiasis and intestinal nematode infections. Integration of these control programs could be highly cost-effective. However, as a prerequisite for integration, it would be necessary to identify zones where these helminthic infections co-exist, specify the population categories that should receive each medication (ivermectin, albendazole, mebendazole, and praziquantel), check that combined administration of these drugs is safe and ensure that an integrated program would have no detrimental effect on the health care system and on the efficacy of ongoing programs.

  2. Evaluation of radiation loading on finite cylindrical shells using the fast Fourier transform: A comparison with direct numerical integration.

    PubMed

    Liu, S X; Zou, M S

    2018-03-01

    The radiation loading on a vibratory finite cylindrical shell is conventionally evaluated through the direct numerical integration (DNI) method. An alternative strategy via the fast Fourier transform algorithm is put forward in this work based on the general expression of radiation impedance. To check the feasibility and efficiency of the proposed method, a comparison with DNI is presented through numerical cases. The results obtained using the present method agree well with those calculated by DNI. More importantly, the proposed calculating strategy can significantly save the time cost compared with the conventional approach of straightforward numerical integration.
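
    As a generic illustration of the strategy (not the paper's radiation-impedance expressions), the snippet below evaluates a convolution-type integral both by direct numerical summation and via the FFT, showing the kind of agreement between the two routes that the abstract reports; the kernel and source functions are invented.

    ```python
    import numpy as np

    # Discretized kernel and source distribution (hypothetical, smooth functions).
    n  = 2048
    x  = np.linspace(0.0, 10.0, n)
    dx = x[1] - x[0]
    kernel = np.exp(-x)                               # decaying kernel g(x)
    source = np.sin(2.0 * np.pi * x / 10.0) ** 2      # source distribution f(x)

    # Direct numerical integration: c(x_i) = sum_{j<=i} g(x_i - x_j) f(x_j) dx.
    direct = np.array([np.sum(kernel[:i + 1][::-1] * source[:i + 1]) * dx
                       for i in range(n)])

    # Same quantity via FFT, zero-padded to avoid circular wrap-around.
    m = 2 * n
    fft_conv = np.fft.irfft(np.fft.rfft(kernel, m) * np.fft.rfft(source, m), m)[:n] * dx

    print("max abs difference:", np.max(np.abs(direct - fft_conv)))  # machine precision
    ```

    The direct sum costs O(n^2) operations while the FFT route costs O(n log n), which is the source of the time savings reported above.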

  3. Mixed-initiative control of intelligent systems

    NASA Technical Reports Server (NTRS)

    Borchardt, G. C.

    1987-01-01

    Mixed-initiative user interfaces provide a means by which a human operator and an intelligent system may collectively share the task of deciding what to do next. Such interfaces are important to the effective utilization of real-time expert systems as assistants in the execution of critical tasks. Presented here is the Incremental Inference algorithm, a symbolic reasoning mechanism based on propositional logic and suited to the construction of mixed-initiative interfaces. The algorithm is similar in some respects to the Truth Maintenance System, but replaces the notion of 'justifications' with a notion of recency, allowing newer values to override older values yet permitting various interested parties to refresh these values as they become older and thus more vulnerable to change. A simple example is given of the use of the Incremental Inference algorithm plus an overview of the integration of this mechanism within the SPECTRUM expert system for geological interpretation of imaging spectrometer data.

  4. Perception Evolution Network Based on Cognition Deepening Model--Adapting to the Emergence of New Sensory Receptor.

    PubMed

    Xing, Youlu; Shen, Furao; Zhao, Jinxi

    2016-03-01

    The proposed perception evolution network (PEN) is a biologically inspired neural network model for unsupervised learning and online incremental learning. It is able to automatically learn suitable prototypes from learning data in an incremental way, and it does not require the predefined prototype number or the predefined similarity threshold. Meanwhile, being more advanced than the existing unsupervised neural network model, PEN permits the emergence of a new dimension of perception in the perception field of the network. When a new dimension of perception is introduced, PEN is able to integrate the new dimensional sensory inputs with the learned prototypes, i.e., the prototypes are mapped to a high-dimensional space, which consists of both the original dimension and the new dimension of the sensory inputs. In the experiment, artificial data and real-world data are used to test the proposed PEN, and the results show that PEN can work effectively.

  5. Recent advances in the modelling of crack growth under fatigue loading conditions

    NASA Technical Reports Server (NTRS)

    Dekoning, A. U.; Tenhoeve, H. J.; Henriksen, T. K.

    1994-01-01

    Fatigue crack growth associated with cyclic (secondary) plastic flow near a crack front is modelled using an incremental formulation. A new description of threshold behaviour under small load cycles is included. Quasi-static crack extension under high load excursions is described using an incremental formulation of the R-(crack growth resistance)- curve concept. The integration of the equations is discussed. For constant amplitude load cycles the results will be compared with existing crack growth laws. It will be shown that the model also properly describes interaction effects of fatigue crack growth and quasi-static crack extension. To evaluate the more general applicability the model is included in the NASGRO computer code for damage tolerance analysis. For this purpose the NASGRO program was provided with the CORPUS and the STRIP-YIELD models for computation of the crack opening load levels. The implementation is discussed and recent results of the verification are presented.

  6. Linear scaling computation of the Fock matrix. II. Rigorous bounds on exchange integrals and incremental Fock build

    NASA Astrophysics Data System (ADS)

    Schwegler, Eric; Challacombe, Matt; Head-Gordon, Martin

    1997-06-01

    A new linear scaling method for computation of the Cartesian Gaussian-based Hartree-Fock exchange matrix is described, which employs a method numerically equivalent to standard direct SCF, and which does not enforce locality of the density matrix. With a previously described method for computing the Coulomb matrix [J. Chem. Phys. 106, 5526 (1997)], linear scaling incremental Fock builds are demonstrated for the first time. Microhartree accuracy and linear scaling are achieved for restricted Hartree-Fock calculations on sequences of water clusters and polyglycine α-helices with the 3-21G and 6-31G basis sets. Eightfold speedups are found relative to our previous method. For systems with a small ionization potential, such as graphitic sheets, the method naturally reverts to the expected quadratic behavior. Also, benchmark 3-21G calculations attaining microhartree accuracy are reported for the P53 tetramerization monomer involving 698 atoms and 3836 basis functions.
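
    A conceptual sketch of the incremental build idea follows; it is a toy dense-matrix illustration, not a linear-scaling code. Because the two-electron contribution G(D) is linear in the density matrix, the new Fock matrix can be obtained by adding G(D_new - D_old) to the previous one, and near SCF convergence the density difference is small and sparse, which is where the real savings come from.

```python
import numpy as np

def build_two_electron_part(eri, density):
    # G(D)_{mn} = sum_{ls} D_{ls} [ (mn|ls) - 0.5 (ml|ns) ]  (restricted HF form)
    J = np.einsum("mnls,ls->mn", eri, density)
    K = np.einsum("mlns,ls->mn", eri, density)
    return J - 0.5 * K

def full_fock(h_core, eri, density):
    return h_core + build_two_electron_part(eri, density)

def incremental_fock(eri, D_new, D_old, F_old):
    # Only the (ideally sparse) density difference enters the new build.
    return F_old + build_two_electron_part(eri, D_new - D_old)

# Consistency check on small random data: the increment reproduces the full build.
n = 6
rng = np.random.default_rng(0)
eri = rng.random((n, n, n, n))
h = rng.random((n, n)); h = 0.5 * (h + h.T)
D_old = rng.random((n, n)); D_old = 0.5 * (D_old + D_old.T)
D_new = D_old + 0.01 * np.eye(n)
F_old = full_fock(h, eri, D_old)
assert np.allclose(full_fock(h, eri, D_new), incremental_fock(eri, D_new, D_old, F_old))
```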

  7. 40 CFR 75.59 - Certification, quality assurance, and quality control record provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and the run average); (B) The raw data and results for all required pre-test, post-test, pre-run and...-day calibration error tests, all daily system integrity checks (Hg monitors, only), and all off-line calibration demonstrations, including any follow-up tests after corrective action: (i) Component-system...

  8. Improving NAVFAC's total quality management of construction drawings with CLIPS

    NASA Technical Reports Server (NTRS)

    Antelman, Albert

    1991-01-01

    A diagnostic expert system to improve the quality of Naval Facilities Engineering Command (NAVFAC) construction drawings and specification is described. C Language Integrated Production System (CLIPS) and computer aided design layering standards are used in an expert system to check and coordinate construction drawings and specifications to eliminate errors and omissions.

  9. Foursquare: A Health Education Specialist Checks-In--A Commentary

    ERIC Educational Resources Information Center

    Haithcox-Dennis, Melissa

    2011-01-01

    More and more, health education specialists are integrating technology into their work. Whereas most are familiar with social media sites like Facebook, Twitter and LinkedIn, one relatively new form of social media, location based services (LBS), may be less familiar. Developed in 2000, LBS are software applications that are accessible from a…

  10. HPC Software Stack Testing Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garvey, Cormac

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).

  11. Think Inside the Box. Integrating Math in Your Classroom

    ERIC Educational Resources Information Center

    Naylor, Michael

    2005-01-01

    This brief article describes a few entertaining math "puzzles" that are easy to use with students at any grade level and with any operation. Not only do these puzzles help provide practice with facts and operations, they are also self-checking and may lead to some interesting big ideas in algebra.

  12. Efficiency in the Community College Sector: Stochastic Frontier Analysis

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Belfield, Clive

    2017-01-01

    This paper estimates technical efficiency scores across the community college sector in the United States. Using stochastic frontier analysis and data from the Integrated Postsecondary Education Data System for 2003-2010, we estimate efficiency scores for 950 community colleges and perform a series of sensitivity tests to check for robustness. We…

  13. Diagrammar in classical scalar field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste

    2011-09-15

    In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancelation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. - Highlights: > We provide the Feynman diagrams of perturbation theory for a classical field theory. > We give a super-formalism which links the quantum diagrams to the classical ones. > We check perturbatively the fluctuation-dissipation theorem.

  14. Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design

    NASA Technical Reports Server (NTRS)

    Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.

    1991-01-01

    Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six degrees of freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Macintosh's HyperCard to serve as a knowledge capture and data storage tool.

  15. Galileo: The Added Value for Integrity in Harsh Environments.

    PubMed

    Borio, Daniele; Gioia, Ciro

    2016-01-16

    Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability.
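
    The core residual-based RAIM test can be sketched as below; this is a simplified illustration only and omits the paper's geometry and separability checks. Given a linearized geometry matrix and redundant pseudorange residuals, a chi-square consistency test either accepts the fix or nominates a measurement for exclusion.

```python
import numpy as np
from scipy.stats import chi2

def raim_check(H, y, sigma=1.0, p_false_alarm=1e-3):
    """Simplified residual-based RAIM on a linearized model y ~ H x + noise."""
    m, n = H.shape
    x, *_ = np.linalg.lstsq(H, y, rcond=None)        # least-squares solution
    r = y - H @ x                                    # post-fit residuals
    test_stat = float(r @ r) / sigma**2              # sum of squared residuals
    threshold = chi2.ppf(1.0 - p_false_alarm, df=m - n)
    if test_stat <= threshold:
        return x, None                               # measurements consistent: accept fix
    worst = int(np.argmax(np.abs(r)))                # crude exclusion candidate
    return None, worst                               # flag measurement 'worst' for exclusion

# Example with a random 8x4 geometry and one biased pseudorange:
rng = np.random.default_rng(3)
H = rng.standard_normal((8, 4))
y = H @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.standard_normal(8)
y[5] += 25.0                         # inject a fault; the chi-square test rejects the set
print(raim_check(H, y, sigma=0.1))   # typically nominates the faulty measurement
```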

  16. Galileo: The Added Value for Integrity in Harsh Environments

    PubMed Central

    Borio, Daniele; Gioia, Ciro

    2016-01-01

    Global navigation satellite system (GNSS)-based navigation is a challenging task in signal-degraded environments, where GNSS signals are distorted by multipath and attenuated by fading effects: the navigation solution may be inaccurate or unavailable. A possible approach to improve accuracy and availability is the joint use of measurements from different GNSSs and quality check algorithms; this approach is investigated here using live GPS and Galileo signals. A modified receiver autonomous integrity monitoring (RAIM) algorithm, including geometry and separability checks, is proposed to detect and exclude erroneous measurements: the multi-constellation approach provides redundant measurements, and RAIM exploits them to exclude distorted observations. The synergy between combined GPS/Galileo navigation and RAIM is analyzed using live data; the performance is compared to the accuracy and availability of a GPS-only solution. The tests performed demonstrate that the methods developed are effective techniques for GNSS-based navigation in signal-degraded environments. The joint use of the multi-constellation approach and of modified RAIM algorithms improves the performance of the navigation system in terms of both accuracy and availability. PMID:26784205

  17. A new uniformly valid asymptotic integration algorithm for elasto-plastic creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, Abhisak; Walker, Kevin P.

    1991-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.
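
    The scheme itself is not reproduced in the record, but the implicit, iterative character it describes can be illustrated with a generic backward Euler integrator that solves a Newton iteration at each (possibly large) time increment. The sketch below is such a stand-in on a simple stiff test equation; it is not the uniformly valid asymptotic algorithm of the paper.

```python
import numpy as np

def backward_euler(f, jac, y0, t0, t_end, dt, tol=1e-10, max_iter=25):
    """Implicit, iterative integration of y' = f(t, y) with large time increments."""
    n_steps = int(round((t_end - t0) / dt))
    t, y = t0, np.asarray(y0, dtype=float)
    history = [(t, y.copy())]
    for _ in range(n_steps):
        y_new = y.copy()                              # initial guess: previous value
        for _ in range(max_iter):                     # Newton iterations
            g = y_new - y - dt * f(t + dt, y_new)
            J = np.eye(len(y)) - dt * jac(t + dt, y_new)
            delta = np.linalg.solve(J, -g)
            y_new = y_new + delta
            if np.linalg.norm(delta) < tol:
                break
        t, y = t + dt, y_new
        history.append((t, y.copy()))
    return history

# Stiff test problem y' = -1000*(y - cos(t)): stable even with the large step dt = 0.1.
f = lambda t, y: np.array([-1000.0 * (y[0] - np.cos(t))])
jac = lambda t, y: np.array([[-1000.0]])
print(backward_euler(f, jac, [0.0], 0.0, 1.0, 0.1)[-1])
```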

  18. Toward a patient-centric medical information model: issues and challenges for US adoption.

    PubMed

    Lorence, Daniel; Monatesti, Sabatini; Margenthaler, Robert; Hoadley, Ellen

    2005-01-01

    As the USA moves, incrementally, toward evidence-based medicine, there is growing awareness of the importance of innovation in information management. Mandates for change include improved use of resources, accelerated diffusion of knowledge and an advanced consumer role. Key among these requirements is the need for a fundamentally different patient information recording system. Within the challenges identified in the most recent national health information technology initiative, we propose a model for an electronic, patient-centric medical information infrastructure, highlighting a transportable, scalable and integrated resource. We identify resources available for technology transfer, promoting consumers as integral parts of the collaborative medical decision-making process.

  19. Planarization of metal films for multilevel interconnects by pulsed laser heating

    DOEpatents

    Tuckerman, David B.

    1987-01-01

    In the fabrication of multilevel integrated circuits, each metal layer is planarized by heating to momentarily melt the layer. The layer is melted by sweeping laser pulses of suitable width, typically about 1 microsecond duration, over the layer in small increments. The planarization of each metal layer eliminates irregular and discontinuous conditions between successive layers. The planarization method is particularly applicable to circuits having ground or power planes and allows for multilevel interconnects. Dielectric layers can also be planarized to produce a fully planar multilevel interconnect structure. The method is useful for the fabrication of VLSI circuits, particularly for wafer-scale integration.

  20. Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays.

    PubMed

    Choi, Heejin; Min, Sung-Wook; Jung, Sungyong; Park, Jae-Hyeung; Lee, Byoungho

    2003-04-21

    In spite of many advantages of integral imaging, the viewing zone in which an observer can see three-dimensional images is limited within a narrow range. Here, we propose a novel method to increase the number of viewing zones by using a dynamic barrier array. We prove our idea by fabricating and locating the dynamic barrier array between a lens array and a display panel. By tilting the barrier array, it is possible to distribute images for each viewing zone. Thus, the number of viewing zones can be increased with an increment of the states of the barrier array tilt.

  1. A new uniformly valid asymptotic integration algorithm for elasto-plastic-creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, A.; Walker, K. P.

    1989-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  2. FIBER AND INTEGRATED OPTICS: Detection of the optical anisotropy in KTP:Rb waveguides

    NASA Astrophysics Data System (ADS)

    Buritskiĭ, K. S.; Dianov, Evgenii M.; Maslov, Vladislav A.; Chernykh, V. A.; Shcherbakov, E. A.

    1990-10-01

    The optical characteristics of channel waveguides made of rubidium-activated potassium titanyl phosphate (KTP:Rb) were determined. The refractive index increment of such waveguides was found to exhibit a considerable anisotropy: Δn_x/Δn_z ≈ 2. A deviation of the distribution of the refractive index in a channel waveguide from the model distribution was observed for ion-exchange times in excess of 1 h.

  3. A Research Agenda for Service-Oriented Architecture (SOA): Maintenance and Evolution of Service-Oriented Systems

    DTIC Science & Technology

    2010-03-01

    service consumers, and infrastructure. Techniques from any iterative and incremental software development methodology followed by the organization... Service-Oriented Architecture Environment (CMU/SEI-2008-TN-008). Software Engineering Institute, Carnegie Mellon University, 2008. http://www.sei.cmu.edu... “Integrating Legacy Software into a Service Oriented Architecture.” Proceedings of the 10th European Conference on Software Maintenance (CSMR 2006). Bari

  4. Dynamic analysis of nonlinear rotor-housing systems

    NASA Technical Reports Server (NTRS)

    Noah, Sherif T.

    1988-01-01

    Nonlinear analysis methods are developed which will enable the reliable prediction of the dynamic behavior of the space shuttle main engine (SSME) turbopumps in the presence of bearing clearances and other local nonlinearities. A computationally efficient convolution method, based on discretized Duhamel and transition matrix integral formulations, is developed for the transient analysis. In the formulation, the coupling forces due to the nonlinearities are treated as external forces acting on the coupled subsystems. Iteration is utilized to determine their magnitudes at each time increment. The method is applied to a nonlinear generic model of the high pressure oxygen turbopump (HPOTP). As compared to the fourth order Runge-Kutta numerical integration methods, the convolution approach proved to be more accurate and more highly efficient. For determining the nonlinear, steady-state periodic responses, an incremental harmonic balance method was also developed. The method was successfully used to determine dominantly harmonic and subharmonic responses of the HPOTP generic model with bearing clearances. A reduction method similar to the impedance formulation utilized with linear systems is used to reduce the housing-rotor models to their coordinates at the bearing clearances. Recommendations are included for further development of the method, for extending the analysis to aperiodic and chaotic regimes and for conducting critical parametric studies of the nonlinear response of the current SSME turbopumps.

  5. Incremental Aerodynamic Coefficient Database for the USA2

    NASA Technical Reports Server (NTRS)

    Richardson, Annie Catherine

    2016-01-01

    In March through May of 2016, a wind tunnel test was conducted by the Aerosciences Branch (EV33) to visually study the unsteady aerodynamic behavior over multiple transition geometries for the Universal Stage Adapter 2 (USA2) in the MSFC Aerodynamic Research Facility's Trisonic Wind Tunnel (TWT). The purpose of the test was to make a qualitative comparison of the transonic flow field in order to provide a recommended minimum transition radius for manufacturing. Additionally, 6 Degree of Freedom force and moment data for each configuration tested was acquired in order to determine the geometric effects on the longitudinal aerodynamic coefficients (Normal Force, Axial Force, and Pitching Moment). In order to make a quantitative comparison of the aerodynamic effects of the USA2 transition geometry, the aerodynamic coefficient data collected during the test was parsed and incorporated into a database for each USA2 configuration tested. An incremental aerodynamic coefficient database was then developed using the generated databases for each USA2 geometry as a function of Mach number and angle of attack. The final USA2 coefficient increments will be applied to the aerodynamic coefficients of the baseline geometry to adjust the Space Launch System (SLS) integrated launch vehicle force and moment database based on the transition geometry of the USA2.
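
    The arithmetic behind such an incremental database can be sketched simply: subtract the baseline configuration's coefficients from each alternate configuration's coefficients on a common grid of Mach number and angle of attack, then add the stored increments back onto the baseline when adjusting the integrated vehicle database. The numbers in the sketch below are placeholders, not USA2 test data.

```python
import numpy as np

mach = np.array([0.8, 0.9, 1.1, 1.5])
alpha = np.array([-4.0, 0.0, 4.0])               # degrees

# coeffs[config][coef] is a (len(mach), len(alpha)) array, e.g. from a wind tunnel test.
rng = np.random.default_rng(0)
coeffs = {
    "baseline": {"CN": rng.normal(size=(4, 3)), "CA": rng.normal(size=(4, 3))},
    "radius_small": {"CN": rng.normal(size=(4, 3)), "CA": rng.normal(size=(4, 3))},
}

def increments(coeffs, config, baseline="baseline"):
    """Coefficient increments dC(Mach, alpha) = C_config - C_baseline."""
    return {name: coeffs[config][name] - coeffs[baseline][name]
            for name in coeffs[baseline]}

dC = increments(coeffs, "radius_small")
# Apply the increment to a vehicle database evaluated at matching Mach/alpha points:
CN_adjusted = coeffs["baseline"]["CN"] + dC["CN"]
```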

  6. KSC-2013-3092

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068344 - NASA astronaut Randy Bresnik gets into position in The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Bresnik's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  7. KSC-2013-3088

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068317 - NASA astronaut Serena Aunon exits The Boeing Company's CST-100 spacecraft following a fit check evaluation at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  8. KSC-2013-3080

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068269 - NASA astronaut Serena Aunon prepares to enter The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  9. KSC-2013-3091

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068333 - NASA astronaut Randy Bresnik prepares to enter The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Bresnik's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  10. KSC-2013-3078

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068260 - NASA astronaut Serena Aunon suits up for a fit check evaluation of The Boeing Company's CST-100 spacecraft at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  11. Prototype data terminal: Multiplexer/demultiplexer

    NASA Technical Reports Server (NTRS)

    Leck, D. E.; Goodwin, J. E.

    1972-01-01

    The design and operation of a quad redundant data terminal and a multiplexer/demultiplexer (MDU) design are described. The most unique feature is the design of the quad redundant data terminal. This is one of the few designs where the unit is fail/op, fail/op, fail/safe. Laboratory tests confirm that the unit will operate satisfactorily with the failure of three out of four channels. Although the design utilizes state-of-the-art technology, the waveform error checks, the voting techniques, and the parity bit checks are believed to be used in unique configurations. Correct word selection routines are also novel, if not unique. The MDU design, while not redundant, utilizes the latest state-of-the-art advantages of light couplers and integrated circuit amplifiers.

  12. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    STS-92 Mission Specialist Koichi Wakata, with the National Space Development Agency of Japan (NASDA), and Pilot Pamela A. Melroy take a break during a Leak Seal Kit Fit Check of the Pressurized Mating Adapter -3 in the Space Station Processing Facility. Also participating are the other crew members Commander Brian Duffy and Mission Specialists Leroy Chiao (Ph.D.), Peter J.K. 'Jeff' Wisoff (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. STS-92 is the fourth U.S. flight for construction of the International Space Station. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  13. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the Space Station Processing Facility, STS-92 crew members take part in a Leak Seal Kit Fit Check in connection with the Pressurized Mating Adapter -3 in the background. From left are Mission Specialist Peter J.K. 'Jeff' Wisoff (Ph.D.), Pilot Pamela A. Melroy, Commander Brian Duffy, Mission Specialist Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA), Brian Warkentine, with JSC, and a Boeing worker at right. Also participating are other crew members Mission Specialists Leroy Chiao (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  14. Concreteness and relational effects on recall of adjective-noun pairs.

    PubMed

    Paivio, A; Khan, M; Begg, I

    2000-09-01

    Extending previous research on the problem, we studied the effects of concreteness and relatedness of adjective-noun pairs on free recall, cued recall, and memory integration. Two experiments varied the attributes in paired associates lists or sentences. Consistent with predictions from dual coding theory and prior results with noun-noun pairs, both experiments showed that the effects of concreteness were strong and independent of relatedness in free recall and cued recall. The generally positive effects of relatedness were absent in the case of free recall of sentences. The two attributes also had independent (additive) effects on integrative memory as measured by conditionalized free recall of pairs. Integration as measured by the increment from free to cued recall occurred consistently only when pairs were high in both concreteness and relatedness. Explanations focused on dual coding and relational-distinctiveness processing theories as well as task variables that affect integration measures.

  15. Steel bridge in interaction with modern slab track fastening systems under various vertical load levels

    NASA Astrophysics Data System (ADS)

    Stančík, Vojtěch; Ryjáček, Pavel; Vokáč, Miroslav

    2017-09-01

    In modern slab tracks the continuously welded rail (CWR) is coupled through the fastening system with the substructure. The resulting restriction of expansion movement causes significant rail stress increments, which in the case of extreme loading may cause rail failures. These interaction effects are naturally higher on a bridge due to the different deformation capabilities of the bridge and the CWR. The presented contribution investigates a state-of-the-art European direct fastening system that is suitable for application on steel bridges. The analysis involves experimental determination of its nonlinear longitudinal interaction parameters under various vertical loads and numerical validation. During the experimental procedure, a two-and-a-half-meter-long laboratory sample equipped with four nodes of the Vossloh DFF 300 was tested. Both DFF 300 modifications were checked: one using the skl 15 tension clamps and one using the low-resistance skl B15 tension clamps. The effects of lowering the clamping force on the interaction parameters were also investigated. The results are discussed in the paper.

  16. Computation in Classical Mechanics with Easy Java Simulations (EJS)

    NASA Astrophysics Data System (ADS)

    Cox, Anne J.

    2006-12-01

    Let your students enjoy creating animations and incorporating some computational physics into your Classical Mechanics course. This talk will demonstrate the use of an Open Source Physics package, Easy Java Simulations (EJS), in an already existing sophomore/junior level Classical Mechanics course. EJS allows for incremental introduction of computational physics into existing courses because it is easy to use (for instructors and students alike) and it is open source. Students can use this tool for numerical solutions to problems (as they can with commercial systems: Mathcad and Mathematica), but they can also generate their own animations. For example, students in Classical Mechanics use Lagrangian mechanics to solve a problem, and then use EJS not only to numerically solve the differential equations, but to show the associated motion (and check their answers). EJS, developed by Francisco Esquembre (http://fem.um.es/Ejs/), is built on the OpenSource Physics framework (http://www.opensourcephysics.org/) supported through NSF DUE0442581.

  17. Precise and Efficient Static Array Bound Checking for Large Embedded C Programs

    NASA Technical Reports Server (NTRS)

    Venet, Arnaud

    2004-01-01

    In this paper we describe the design and implementation of a static array-bound checker for a family of embedded programs: the flight control software of recent Mars missions. These codes are large (up to 250 KLOC), pointer intensive, heavily multithreaded and written in an object-oriented style, which makes their analysis very challenging. We designed a tool called C Global Surveyor (CGS) that can analyze the largest code in a couple of hours with a precision of 80%. The scalability and precision of the analyzer are achieved by using an incremental framework in which a pointer analysis and a numerical analysis of array indices mutually refine each other. CGS has been designed so that it can distribute the analysis over several processors in a cluster of machines. To the best of our knowledge this is the first distributed implementation of static analysis algorithms. Throughout the paper we will discuss the scalability setbacks that we encountered during the construction of the tool and their impact on the initial design decisions.
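
    The mutual refinement of pointer and numerical analyses in CGS is beyond a short example, but the interval reasoning at the heart of array-bound checking can be sketched as follows. The fragment is a toy illustration, not CGS: it tracks a conservative [lo, hi] range for an index and compares it with the array bounds, reporting safe, definitely erroneous, or possibly out-of-bounds accesses.

```python
class Interval:
    """A conservative [lo, hi] range for an integer index variable."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def add(self, c):
        return Interval(self.lo + c, self.hi + c)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def check_access(index: Interval, array_length: int) -> str:
    if index.lo >= 0 and index.hi < array_length:
        return "safe"
    if index.hi < 0 or index.lo >= array_length:
        return "definite error"
    return "possible out-of-bounds"      # warn: the range overlaps the array bounds

# for (i = 0; i <= n; i++) a[i] = 0;  with an array of length n:
n = 10
i = Interval(0, n)                # loop-bound analysis gives i in [0, n]
print(check_access(i, n))         # "possible out-of-bounds": the off-by-one is caught
```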

  18. Rotational-path decomposition based recursive planning for spacecraft attitude reorientation

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Wang, Hui; Xu, Wenming; Cui, Pingyuan; Zhu, Shengying

    2018-02-01

    Spacecraft reorientation is a common task in many space missions. With multiple pointing constraints, the constrained spacecraft reorientation planning problem is very difficult to solve. To deal with this problem, an efficient rotational-path decomposition based recursive planning (RDRP) method is proposed in this paper. A uniform, pointing-constraint-ignored attitude rotation planning process is designed to solve all rotations without considering pointing constraints. Then the whole path is checked node by node. If any pointing constraint is violated, the nearest critical increment approach is used to generate feasible alternative nodes in the process of rotational-path decomposition. As the planned path of each subdivision may still violate pointing constraints, multiple decompositions may be needed and the reorientation planning is designed in a recursive manner. Simulation results demonstrate the effectiveness of the proposed method. The proposed method has been successfully applied in two SPARK microsatellites, developed by the Shanghai Engineering Center for Microsatellites and launched on 22 December 2016, to solve the onboard constrained attitude reorientation planning problem.

  19. Spatial distribution of angular momentum inside the nucleon

    NASA Astrophysics Data System (ADS)

    Lorcé, Cédric; Mantovani, Luca; Pasquini, Barbara

    2018-01-01

    We discuss in detail the spatial distribution of angular momentum inside the nucleon. We show that the discrepancies between different definitions originate from terms that integrate to zero. Even though these terms can safely be dropped at the integrated level, they have to be taken into account when discussing distributions. Using the scalar diquark model, we illustrate our results and, for the first time, check explicitly that the equivalence between kinetic and canonical orbital angular momentum persists at the level of distributions, as expected in a system without gauge degrees of freedom.

  20. Path integral pricing of Wasabi option in the Black-Scholes model

    NASA Astrophysics Data System (ADS)

    Cassagnes, Aurelien; Chen, Yu; Ohashi, Hirotada

    2014-11-01

    In this paper, using path integral techniques, we derive a formula for a propagator arising in the study of occupation time derivatives. Using this result we derive a fair price for the case of the cumulative Parisian option. After confirming the validity of the derived result using Monte Carlo simulation, a new type of heavily path dependent derivative product is investigated. We derive an approximation for our so-called Wasabi option fair price and check the accuracy of our result with a Monte Carlo simulation.
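
    The closed-form propagator derived in the paper is not reproduced here, but the kind of Monte Carlo check it mentions can be sketched for one common form of cumulative Parisian call: the vanilla payoff is paid only if the path's total occupation time above a barrier exceeds a given window. All parameter values below are arbitrary examples.

```python
import numpy as np

S0, K, L, D = 100.0, 100.0, 105.0, 0.25      # spot, strike, barrier, time window (years)
r, sigma, T = 0.03, 0.2, 1.0                 # Black-Scholes rate, volatility, maturity
n_steps, n_paths = 252, 20_000
dt = T / n_steps

rng = np.random.default_rng(1)
Z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1)
S = S0 * np.exp(log_paths)                   # simulated geometric Brownian motion paths

occupation = (S > L).sum(axis=1) * dt        # time each path spends above the barrier
payoff = np.where(occupation > D, np.maximum(S[:, -1] - K, 0.0), 0.0)
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"MC price ≈ {price:.3f} ± {stderr:.3f}")
```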

  1. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility was developed at NASA-Lewis. The purpose of this flight simulator is to allow integrated propulsion control and flight control algorithm development and evaluation in real time. As a preliminary check of the simulator facility capabilities and correct integration of its components, the control design and physics models for a short take-off and vertical landing fighter aircraft model were shown, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The initial testing and evaluation results show that this fixed based flight simulator can provide real time feedback and display of both airframe and propulsion variables for validation of integrated flight and propulsion control systems. Additionally, through the use of this flight simulator, various control design methodologies and cockpit mechanizations can be tested and evaluated in a real time environment.

  2. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  3. A reproductive health survey of rural women in Hebei.

    PubMed

    Wang, J

    1998-12-01

    This article presents the findings of a 1995 family planning survey conducted among 657 women aged 18-49 years in rural areas of Tangshan City, Zhoushou City, and Xingtai City in Hebei province, Northern China. 620 were married, 37 were single, and 6 were widowed. 85.8% of married rural women used a contraceptive method (female sterilization or IUD). There were 1219 pregnancies, 230 abortions, 31 miscarriages, and 3 stillbirths. 68.1% received prenatal check-ups at hospitals and health centers. 47.4% received prenatal care during the first trimester of pregnancy. 76.1% received check-ups at township health centers. Women were aware of the need for sound personal hygiene, sanitary napkins, and avoidance of heavy manual work during menstruation. 45.1% had less than 5 years of education; 51.8% had 6-10 years of education; and 3.1% had over 10 years of education. About 54% delivered at home. Home deliveries were due to lack of transportation, high expenses, and other reasons. Deliveries were attended by a doctor or midwife. Postpartum home visits were not assured. 32.4% had routine gynecological check-ups. 48.1% had never received gynecological services. 51.6% of married women had 2 children; 16.9% had more. The author recommended improved socioeconomic and cultural conditions, a women-centered reproductive health security system integrated with education, and legislative change. Reproductive health education should be integrated into family planning programs and include health awareness and more education. Men should participate in programs and share more responsibility for reproduction. Services should improve in quality.

  4. Thermal Model Development for Ares I-X

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; DelCorso, Joe

    2008-01-01

    Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.

  5. Online friends, offline loved ones, and full-time media: young adult "mass personal" use of communication resources for informational and emotional support.

    PubMed

    Love, Brad; Donovan, Erin E

    2014-06-01

    As Web 2.0 technologies proliferate, patient education is changing dramatically. Information about prevention and survivorship arrives from a mix of sources. The present manuscript describes a study to shed light on how young adults (YAs) affected by cancer manage the digital world. Our investigation was guided by a research question asking how young adults affected by cancer engage in communication work in an environment of mass personal communication. The sample for this research consisted of 500 posts comprising 50 complete threads from an online support community for young adults affected by cancer. Threads were purposively sampled in a multi-stage process. Researchers used constant comparison to define themes, examining text in increments. Individuals harnessed assets of various communication tools for the purposes of message preparation and credibility checking. YAs demonstrated the multi-channel way they move between channels for different purposes, driven by preparation for future interactions. The result is a process that allows co-creation of knowledge in a trusted community. Findings indicate that completing communication work through multiple channels in a deliberate and savvy way is normal for YAs, particularly for message preparation and credibility checking. The multidirectional nature of digital tools plays an important role for YAs, as interactive resources appear to be the first or second stop for information after key events in the cancer trajectory. Results from this study are important as guidance to help manage the volume and depth of information common to the cancer experience in the Web 2.0 world.

  6. Getting Ready for Inspection of Investigational Site at Short Notice

    PubMed Central

    Talele, Rajendra

    2010-01-01

    India is becoming an attractive destination for drug development and clinical research. This is evidenced by the threefold increase in clinical trial applications to the office of the Drugs Controller General of India (DCGI) in the last four years. This upward trend reflects the collaborative efforts of all stakeholders and the quality of Indian data. To sustain this trend, it is therefore important that stakeholders such as regulators, sponsors, CROs, monitors, investigators and trial subjects maintain high standards in the data and in the conduct of clinical trials. Indian regulations and the role of the DCGI in quality checks for Indian clinical trials are always a topic of discussion in various forums. A recent move by the DCGI to conduct random inspections of investigational sites and companies at short notice, check their compliance with the guidelines, and take action against non-compliers is welcome. This will certainly increase the overall quality of clinical trials. The quality of clinical trial conduct is measured by the appropriateness and correctness of the essential documents. It is observed that stakeholders engaged in multitasking often overlook the requirements or appropriateness of a document because of their focused approach to whichever specific activity is the current priority. This can lead to serious quality problems and issues. Understanding the process and the documents reviewed by the auditor is important to maintain such high quality. Proper planning and time management when working on essential documents can minimize quality issues, so that a site is always ready for any type of inspection, whether announced, unannounced, or at “short notice”. PMID:21829785

  7. Towards a Developmentally Integrative Model of Personality Change: A Focus on Three Potential Mechanisms

    PubMed Central

    Riley, Elizabeth N.; Peterson, Sarah J.; Smith, Gregory T.

    2017-01-01

    While the overall stability of personality across the lifespan has been well-documented, one does see incremental changes in a number of personality traits, changes that may impact overall life trajectories in both positive and negative ways. In this chapter, we present a new, developmentally-oriented and integrative model of the factors that might lead to personality change, drawing from the theoretical and empirical work of prior models (e.g. Caspi & Roberts, 2001; Roberts et al., 2005) as well as from our own longitudinal studies of personality change and risky behavior engagement in children, adolescents, and young adults (Boyle et al., 2016; Riley & Smith, 2016; Riley et al., 2016). We focus on change in the trait of urgency, which is a high-risk personality trait that represents the tendency to act rashly when highly emotional. We explore processes of both biologically-based personality change in adolescence, integrating neurocognitive and puberty-based models, as well as behavior-based personality change, in which behaviors and the personality traits underlying those behaviors are incrementally reinforced and shaped over time. One implication of our model for clinical psychology is the apparent presence of a positive feedback loop of risk, in which maladaptive behaviors increase high-risk personality traits, which in turn further increase the likelihood of maladaptive behaviors, a process that continues far beyond the initial experiences of maladaptive behavior engagement. Finally, we examine important future directions for continuing work on personality change, including trauma-based personality change and more directive (e.g., therapeutic) approaches aimed at shaping personality. PMID:29109672

  8. 40 CFR 63.9804 - What are my monitoring system installation, operation, and maintenance requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... your CPMS for leakage. (7) If your CPMS is not equipped with a redundant pressure sensor, perform at... sensor. (6) At least monthly, check all mechanical connections on your CPMS for leakage. (7) If your CPMS... integrity, oxidation, and galvanic corrosion. (f) For each bag leak detection system, you must meet the...

  9. Using Surface Integrals for Checking Archimedes' Law of Buoyancy

    ERIC Educational Resources Information Center

    Lima, F. M. S.

    2012-01-01

    A mathematical derivation of the force exerted by an "inhomogeneous" (i.e. compressible) fluid on the surface of an "arbitrarily shaped" body immersed in it is not found in the literature, which may be attributed to our trust in Archimedes' law of buoyancy. However, this law, also known as Archimedes' principle (AP), does not yield the force…

  10. Assessment of 3D Models Used in Contours Studies

    ERIC Educational Resources Information Center

    Alvarez, F. J. Ayala; Parra, E. B. Blazquez; Tubio, F. Montes

    2015-01-01

    This paper presents an experimental research focusing on the view of first year students. The aim is to check the quality of implementing 3D models integrated in the curriculum. We search to determine students' preference between the various means facilitated in order to understand the given subject. Students have been respondents to prove the…

  11. Pilot Test of IPEDS Finance Survey. Working Paper Series.

    ERIC Educational Resources Information Center

    Hunt, Ray C., Jr.; Budak, Susan E.

    The Private Institution Pretest Finance Survey (Pretest Survey) of the Integrated Postsecondary Education Data System (IPEDS) was sent to a sample of 1,000 private colleges and universities for the fiscal year 1996. As of May 1997, the Bureau of the Census, administering the survey, had received and completed edit checks for about 50% of the…

  12. Crewmember working on the spacelab Zeolite Crystal Growth experiment.

    NASA Technical Reports Server (NTRS)

    1992-01-01

    View showing Payload Specialists Bonnie Dunbar and Larry DeLucas in the aft section of the U. S. Microgravity Laboratory-1. Dunbar is preparing to load a sample in the Crystal Growth Furnace (CGF) Integrated Furnace Experiment Assembly (IFEA) in rack 9 of the Microgravity Laboratory. DeLucas is checking out the multi-purpose Glovebox Facility.

  13. A Multiple Sensor Machine Vision System for Automatic Hardwood Feature Detection

    Treesearch

    D. Earl Kline; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman; Robert L. Brisbin

    1993-01-01

    A multiple sensor machine vision prototype is being developed to scan full size hardwood lumber at industrial speeds for automatically detecting features such as knots, holes, wane, stain, splits, checks, and color. The prototype integrates a multiple sensor imaging system, a materials handling system, a computer system, and application software. The prototype provides...

  14. Systems Check: Community Colleges Turn to Facilities Assessments to Plan Capital Projects and Avoid Expensive Emergency Repairs

    ERIC Educational Resources Information Center

    Joch, Alan

    2014-01-01

    With an emphasis on planning and cutting costs to make better use of resources, facilities managers at community colleges across the nation have undertaken facilities audits usually with the help of outside engineers. Such assessments analyze the history and structural integrity of buildings and core components on campus, including heating…

  15. The Lorax Readers' Theater: Introducing Sustainability with an Integrated Science and Literacy Activity

    ERIC Educational Resources Information Center

    Plankis, Brian; Ramsey, John; Ociepka, Anne; Martin, Pamela

    2016-01-01

    In practice, sustainable development is the use of natural resources in a manner that allows ecosystems to continue to function as natural ecosystems and allows biotic and abiotic interactions to maintain homeostatic checks and balances. Historically, human activity has led to modification of nature that leads to (1) economic development, (2) biotic…

  16. Integrated Formal Analysis of Timed-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Nstarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.

  17. Quality Management Tools: Facilitating Clinical Research Data Integrity by Utilizing Specialized Reports with Electronic Case Report Forms

    PubMed Central

    Trocky, NM; Fontinha, M

    2005-01-01

    Data collected throughout the course of a clinical research trial must be reviewed for accuracy and completeness continually. The Oracle Clinical® (OC) data management application utilized to capture clinical data facilitates data integrity through pre-programmed validations, edit and range checks, and discrepancy management modules. These functions were not enough. Coupled with the use of specially created reports in Oracle Discoverer® and Integrated Review TM, both ad-hoc query and reporting tools, research staff have enhanced their ability to clean, analyze and report more accurate data captured within and among Case Report Forms (eCRFs) by individual study or across multiple studies. PMID:16779428

  18. Notes on integral identities for 3d supersymmetric dualities

    NASA Astrophysics Data System (ADS)

    Aghaei, Nezhla; Amariti, Antonio; Sekiguchi, Yuta

    2018-04-01

    Four dimensional N=2 Argyres-Douglas theories have been recently conjectured to be described by N=1 Lagrangian theories. Such models, once reduced to 3d, should be mirror dual to Lagrangian N=4 theories. This has been numerically checked through the matching of the partition functions on the three sphere. In this article, we provide an analytic derivation for this result in the A_{2n-1} case via hyperbolic hypergeometric integrals. We study the D_4 case as well, commenting on some open questions and possible resolutions. In the second part of the paper we discuss other integral identities leading to the matching of the partition functions in 3d dual pairs involving higher monopole superpotentials.

  19. Experiment for Integrating Dutch 3d Spatial Planning and Bim for Checking Building Permits

    NASA Astrophysics Data System (ADS)

    van Berlo, L.; Dijkmans, T.; Stoter, J.

    2013-09-01

    This paper presents a research project in The Netherlands in which several SMEs collaborated to create a 3D model of the National spatial planning information. This 2D information system described in the IMRO data standard holds implicit 3D information that can be used to generate an explicit 3D model. The project realized a proof of concept to generate a 3D spatial planning model. The team used the model to integrate it with several 3D Building Information Models (BIMs) described in the open data standard Industry Foundation Classes (IFC). Goal of the project was (1) to generate a 3D BIM model from spatial planning information to be used by the architect during the early design phase, and (2) allow 3D checking of building permits. The team used several technologies like CityGML, BIM clash detection and GeoBIM to explore the potential of this innovation. Within the project a showcase was created with a part of the spatial plan from the city of The Hague. Several BIM models were integrated in the 3D spatial plan of this area. A workflow has been described that demonstrates the benefits of collaboration between the spatial domain and the AEC industry in 3D. The research results in a showcase with conclusions and considerations for both national and international practice.

  20. Incremental Predictive Value of Serum AST-to-ALT Ratio for Incident Metabolic Syndrome: The ARIRANG Study

    PubMed Central

    Ahn, Song Vogue; Baik, Soon Koo; Cho, Youn zoo; Koh, Sang Baek; Huh, Ji Hye; Chang, Yoosoo; Sung, Ki-Chul; Kim, Jang Young

    2016-01-01

    Aims The ratio of aspartate aminotransferase (AST) to alanine aminotransferase (ALT) is of great interest as a possible novel marker of metabolic syndrome. However, longitudinal studies emphasizing the incremental predictive value of the AST-to-ALT ratio in diagnosing individuals at higher risk of developing metabolic syndrome are very scarce. Therefore, our study aimed to evaluate the AST-to-ALT ratio as an incremental predictor of new onset metabolic syndrome in a population-based cohort study. Material and Methods The population-based cohort study included 2276 adults (903 men and 1373 women) aged 40–70 years, who participated from 2005–2008 (baseline) without metabolic syndrome and were followed up from 2008–2011. Metabolic syndrome was defined according to the harmonized definition of metabolic syndrome. Serum concentrations of AST and ALT were determined by enzymatic methods. Results During an average follow-up period of 2.6-years, 395 individuals (17.4%) developed metabolic syndrome. In a multivariable adjusted model, the odds ratio (95% confidence interval) for new onset of metabolic syndrome, comparing the fourth quartile to the first quartile of the AST-to-ALT ratio, was 0.598 (0.422–0.853). The AST-to-ALT ratio also improved the area under the receiver operating characteristic curve (AUC) for predicting new cases of metabolic syndrome (0.715 vs. 0.732, P = 0.004). The net reclassification improvement of prediction models including the AST-to-ALT ratio was 0.23 (95% CI: 0.124–0.337, P<0.001), and the integrated discrimination improvement was 0.0094 (95% CI: 0.0046–0.0143, P<0.001). Conclusions The AST-to-ALT ratio independently predicted the future development of metabolic syndrome and had incremental predictive value for incident metabolic syndrome. PMID:27560931
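
    The kind of incremental predictive value reported above (an AUC improvement when the ratio is added to a base model) can be sketched on synthetic data; the snippet below is illustrative only and has no connection to the ARIRANG cohort or its effect sizes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(55, 8, n)
bmi = rng.normal(25, 3, n)
ast_alt = rng.lognormal(0.0, 0.3, n)                     # synthetic AST/ALT ratio
logit = -8 + 0.05 * age + 0.15 * bmi - 1.0 * (ast_alt - 1.0)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))        # synthetic incident outcome

X_base = np.column_stack([age, bmi])
X_full = np.column_stack([age, bmi, ast_alt])

base_model = LogisticRegression(max_iter=1000).fit(X_base, y)
full_model = LogisticRegression(max_iter=1000).fit(X_full, y)
auc_base = roc_auc_score(y, base_model.predict_proba(X_base)[:, 1])
auc_full = roc_auc_score(y, full_model.predict_proba(X_full)[:, 1])
print(f"AUC without ratio: {auc_base:.3f}, with ratio: {auc_full:.3f}")
```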

  1. Cost-Effectiveness of Screening for Primary Aldosteronism and Subtype Diagnosis in the Resistant Hypertensive Patients.

    PubMed

    Lubitz, Carrie C; Economopoulos, Konstantinos P; Sy, Stephen; Johanson, Colden; Kunzel, Heike E; Reincke, Martin; Gazelle, G Scott; Weinstein, Milton C; Gaziano, Thomas A

    2015-11-01

    Primary aldosteronism (PA) is a common and underdiagnosed disease with significant morbidity potentially cured by surgery. We aim to assess whether the long-term cardiovascular benefits of identifying and treating surgically correctable PA outweigh the upfront increased costs at the time patients are diagnosed with resistant hypertension (RH). A decision-analytic model compares aggregate costs and systolic blood pressure changes of 6 recommended or implemented diagnostic strategies for PA in a simulated population of at-risk RH patients. We also evaluate a 7th "treat all" strategy wherein all patients with RH are treated with a mineralocorticoid-receptor antagonist without further testing at RH diagnosis. Changes in systolic blood pressure are subsequently converted into gains in quality-adjusted life years (QALYs) by applying National Health and Nutrition Examination Survey data on concomitant risk factors to an existing cardiovascular disease simulation model. QALYs and lifetime costs were then used to calculate incremental cost-effectiveness ratios for the competing strategies. The incremental cost-effectiveness ratio for the strategy of computerized tomography (CT) followed by adrenal venous sampling (AVS) was $82,000/QALY compared with treat all. Incremental cost-effectiveness ratios for CT alone and AVS alone were $200,000/QALY and $492,000/QALY; the other strategies were more costly and less effective. When differential patient-reported health-related quality-of-life adjustments for patients with PA were integrated, the incremental cost-effectiveness ratios for screening patients with CT followed by AVS, CT alone, and AVS alone were $52,000/QALY, $114,000/QALY, and $269,000/QALY gained. CT scanning followed by AVS was a cost-effective strategy to screen for PA among patients with RH. © 2015 American Heart Association, Inc.
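
    The comparison across strategies rests on incremental cost-effectiveness ratios, i.e. the extra cost divided by the extra QALYs relative to a reference strategy. A minimal sketch of that calculation; the per-patient costs and QALYs below are hypothetical placeholders chosen only so the resulting ratio matches the reported $82,000/QALY, not figures from the study:

    ```python
    def icer(cost_new, qaly_new, cost_ref, qaly_ref):
        """Incremental cost-effectiveness ratio of a strategy vs. a reference."""
        return (cost_new - cost_ref) / (qaly_new - qaly_ref)

    # Hypothetical lifetime cost (USD) and QALYs per patient for two strategies.
    treat_all = {"cost": 20_000, "qaly": 10.00}
    ct_then_avs = {"cost": 24_100, "qaly": 10.05}

    ratio = icer(ct_then_avs["cost"], ct_then_avs["qaly"],
                 treat_all["cost"], treat_all["qaly"])
    print(f"ICER of CT + AVS vs. treat-all: ${ratio:,.0f}/QALY")  # -> $82,000/QALY
    ```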

  2. Research on high-speed railway's vibration analysis checking based on intelligent mobile terminal

    NASA Astrophysics Data System (ADS)

    Li, Peigang; Xie, Shulin; Zhao, Xuefeng

    2017-04-01

    High-speed rail has developed rapidly to meet growing societal demand and has gradually become the first choice for long-distance travel. Because safety and stable operation are of great importance to high-speed trains, vibration analysis checking is one of the main means adopted to ensure them. Taking advantage of the widespread adoption of smartphones, this research proposes a novel, public-participation method for vibration analysis checking of high-speed railways based on the smartphone, together with a railway-line inspection application built into the intelligent mobile terminal. Using the accelerometer, gyroscope, GPS and other high-performance sensors integrated in the smartphone, the application can acquire multiple parameters such as acceleration and angle and pinpoint the measurement location. By analyzing the acceleration data in the time domain and in the frequency domain using the fast Fourier transform, the study compared data from monitoring tests under different measurement conditions and at different measuring points. Furthermore, the paper outlines the idea of establishing a complete analysis-checking system. Tests on operating high-speed railway lines validated that the smartphone-based line inspection system is reliable and feasible, and that it offers additional advantages such as convenience, low cost and wide applicability. The research therefore has practical significance and broad application prospects.
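
    The signal-processing step described above, sampling acceleration with the phone and examining it in the time and frequency domains via the fast Fourier transform, can be sketched as follows. The sampling rate and the synthetic signal are illustrative assumptions, not the paper's application code:

    ```python
    import numpy as np

    fs = 100.0                              # assumed accelerometer sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)            # 10 s record
    # Synthetic vertical acceleration: a 2 Hz carbody oscillation plus sensor noise.
    accel = 0.3 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)

    # Time-domain indicators.
    rms = np.sqrt(np.mean(accel ** 2))
    peak = np.max(np.abs(accel))

    # Frequency domain: single-sided amplitude spectrum of the windowed signal.
    spectrum = np.fft.rfft(accel * np.hanning(accel.size))
    freqs = np.fft.rfftfreq(accel.size, d=1 / fs)
    amplitude = 2 * np.abs(spectrum) / accel.size

    dominant = freqs[np.argmax(amplitude[1:]) + 1]   # skip the DC bin
    print(f"RMS={rms:.3f}, peak={peak:.3f}, dominant frequency={dominant:.1f} Hz")
    ```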

  3. The detection and prevention of unintentional consumption of DOx and 25x-NBOMe at Portugal's Boom Festival.

    PubMed

    Martins, Daniel; Barratt, Monica J; Pires, Cristiana Vale; Carvalho, Helena; Vilamala, Mireia Ventura; Espinosa, Iván Fornís; Valente, Helena

    2017-05-01

    This paper describes the misrepresentation of LSD at Portugal's Boom Festival 2014 and the prevention of unintentional consumption of DOx and 25x-NBOMe among LSD consumers attending a drug-checking service. Two hundred forty-five drug samples expected to contain LSD were submitted to the drug-checking service for chemical analysis. One hundred ten post-test questionnaires were successfully matched with test results. About 67.3% of the alleged LSD samples tested contained only LSD; 0.8% contained LSD combined with adulterants; 24.1% did not contain LSD but did contain another psychoactive substance, including 11.4% that were 2,5-dimethoxyamphetamine derivatives and 9.8% that were N-benzyl-2,5-dimethoxyphenethylamine derivatives; and no psychoactive substance was detected in 7.8%. The majority of service users who received unexpected test results regarding their alleged LSD (74.2%) reported that they did not intend to consume the drug. Following dissemination of alerts on day 2, a larger than expected proportion of all tests conducted were for LSD, when comparing the 2014 festival to 2012, where no such alert was disseminated. Although these results support the provision of integrated drug-checking services in party settings, evidence of their utility and effectiveness would be improved through future research incorporating more robust measures of outcomes following provision of drug-checking results. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Geomorphological and ecological effects of check dams in mountain torrents of Southern Italy

    NASA Astrophysics Data System (ADS)

    Zema, Demetrio Antonio; Bombino, Giuseppe; Denisi, Pietro; Tamburino, Vincenzo; Marcello Zimbone, Santo

    2017-04-01

    It is known that the installation of check dams noticeably influences torrent morphology and ecology. However, the effects of check dams on the channel section and riparian vegetation of torrents are not yet completely understood. This paper provides a further contribution to a better comprehension of the effects of check dams on hydrological and geomorphological processes in headwaters and on the riparian ecosystem. Field surveys on channel morphology, bed material and riparian vegetation were carried out close to five check dams in each of four mountain reaches of Calabria (Southern Italy). For each check dam three transects (one upstream, one downstream and one far from the check dam, located in the undisturbed zone and adopted as control) were identified; at each transect, a set of geomorphological and ecological indicators was surveyed as follows. Channel section morphology was assessed by the width/depth ratio (w/d); the median particle size (D50) and the finer sediment fraction (%fines) were chosen to characterize channel bed material; the specific discharge (q, the discharge per unit channel width) was taken as a measure of the flow regime. Vegetation cover and structure were evaluated by Global Canopy Cover (GCC) and Weighted Canopy Height (WCH), respectively (Bombino et al., 2008); the index of alpha-diversity (H-alpha, Hill, 1973) and the ratio between the number of alien species and the number of native species (NSA/NSN) were chosen as indicators of species richness/abundance and of the degree of vegetation integrity, respectively. Compared to the control transects, the values of w/d were higher upstream of check dams and lower downstream; conversely, q was lower upstream and higher in downstream sites. Upstream of the check dams the D50 of bed material was lower and %fines was higher compared to the control transects; vice versa, the downstream transects showed higher D50 and lower %fines. The differences in riparian vegetation among transects reflect the torrent's ecological response to the strong contrasts surveyed in hydrological (q) and geomorphological (w/d, D50 and %fines) characteristics. Compared to control transects, vegetation was more extensive (higher GCC) and more developed (higher WCH) in the upstream zones; the reverse pattern was noticed in the downstream transects (lower GCC and WCH). The indexes H-alpha and NSA/NSN were higher upstream of check dams: the presence of the check dams induced higher species richness and evenness, with alien species prevailing over native ones in the sedimentation wedge. Conversely, downstream of check dams H-alpha and NSA/NSN were lower: here, riparian vegetation lost some herbaceous species and assumed a terrestrial character. Overall, this study quantitatively confirms that check dams have far-reaching effects on the geomorphology and ecology of mountain torrent channels; as a consequence, important and complex changes occur not only in the extent and development of riparian vegetation, but also in species diversity and distribution. REFERENCES - Bombino G., Gurnell A.M., Tamburino V., Zema D.A., Zimbone S.M. 2008. Sediment size variation in torrents with check-dams: effects on riparian vegetation. Ecological Engineering 32(2), 166-177. - Hill M.O. 1973. Diversity and evenness: a unifying notation and its consequences. Ecology 54: 427-431.
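
    Several of the indicators listed above are simple ratios or standard diversity formulas. A short sketch of how they might be computed from a transect survey, assuming the alpha-diversity index is read as the Shannon-based Hill number N1 = exp(H'); the channel dimensions and species abundances are invented for illustration:

    ```python
    import numpy as np

    def width_depth_ratio(width_m, depth_m):
        """Channel section shape indicator w/d."""
        return width_m / depth_m

    def hill_shannon_diversity(abundances):
        """Hill number N1 = exp(Shannon entropy), one common reading of alpha-diversity."""
        p = np.asarray(abundances, dtype=float)
        p = p[p > 0] / p.sum()
        return float(np.exp(-(p * np.log(p)).sum()))

    def alien_native_ratio(n_alien_species, n_native_species):
        """NSA/NSN, a rough indicator of vegetation integrity."""
        return n_alien_species / n_native_species

    # Hypothetical transect upstream of a check dam.
    print(width_depth_ratio(14.0, 0.6))               # w/d
    print(hill_shannon_diversity([30, 12, 8, 5, 2]))  # species abundances per plot
    print(alien_native_ratio(3, 9))                   # NSA/NSN
    ```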

  5. Crosstalk: The Journal of Defense Software Engineering. Volume 22, Number 2, February 2009

    DTIC Science & Technology

    2009-02-01

    IT Investment With Service-Oriented Architecture (SOA), Geoffrey Raines examines how an SOA offers federal senior leadership teams an incremental and...values, and is used by 30 million people. [1] Given budget constraints, an incremental approach seems to be required. A Path Forward SOA, as implemented...point of view, SOA offers several positive benefits. Language Neutral Integration Web-enabling applications with a common browser interface became a

  6. Integrated mission management operations

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Operations required to launch a modular space station and to provide sustaining ground operations for support of that orbiting station throughout its 10-year mission are studied. A baseline, incrementally manned program and attendant experiment program options are derived. In addition, features of the program that significantly affect initial development and early operating costs are identified, and their impact on the program is assessed. A preliminary design of the approved modular space station configuration is formulated.

  7. Human-In-The-Loop Experimental Research for Detect and Avoid

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria; Munoz, Cesar; Hagen, George; Narkawicz, Anthony; Upchurch, Jason; Comstock, James; Ghatas, Rania; Vincent, Michael; Chamberlain, James

    2015-01-01

    This paper describes a Detect and Avoid (DAA) concept for integration of UAS into the NAS developed by the National Aeronautics and Space Administration (NASA) and provides results from recent human-in-the-loop experiments performed to investigate interoperability and acceptability issues associated with these vehicles and operations. The series of experiments was designed to incrementally assess critical elements of the new concept and the enabling technologies that will be required.

  8. Systems Engineering and Integration for Advanced Life Support System and HST

    NASA Technical Reports Server (NTRS)

    Kamarani, Ali K.

    2005-01-01

    The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With continued development of state-of-the-art technologies, systems are becoming more complex and, therefore, a systematic approach is essential to control and manage their integrated design and development. This complexity is driven by integration issues. In this case, subsystems must interact with one another in order to achieve integration objectives and also achieve the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides integrated test facilities and personnel for performance trade studies, analyses, integrated models, test results, and validated requirements of the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.

  9. OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4

    PubMed Central

    2012-01-01

    Background Although policy providers have outlined minimal metadata guidelines and naming conventions, today's ontologies still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. Objective We provide a plugin for the Protégé Ontology editor to allow for easy checks of compliance with ontology naming conventions and metadata completeness, as well as curation of any violations found. Implementation In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigated the capabilities needed for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, including the integration of new functionalities. Results The new Protégé plugin, OntoCheck, allows ontology tests to be carried out on OWL ontologies. In particular, the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Detected violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints such as name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts, and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. Conclusions The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers. PMID:23046606

  10. Investigation of 1:1,000 Scale Map Generation by Stereo Plotting Using UAV Images

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2017-08-01

    Large-scale maps and image mosaics are representative geospatial data that can be extracted from UAV images. Map drawing using UAV images can be performed either by creating orthoimages and digitizing them, or by stereo plotting. While maps generated by digitization may serve the need for geospatial data, many institutions and organizations require map drawing using stereoscopic vision on stereo plotting systems. However, there are several aspects to be checked before UAV images can be utilized for stereo plotting. The first aspect is the accuracy of exterior orientation parameters (EOPs) generated through automated bundle adjustment processes. It is well known that the GPS and IMU sensors mounted on a UAV are not very accurate, so it is necessary to adjust the initial EOPs accurately using tie points. For this purpose, we have developed a photogrammetric incremental bundle adjustment procedure. The second aspect is unstable shooting conditions compared to conventional aerial photography. Unstable image acquisition may produce uneven stereo coverage, which will eventually result in accuracy loss, and oblique stereo pairs will create eye fatigue. The third aspect is the small coverage of UAV images. This raises efficiency issues for stereo plotting of UAV images and, more importantly, makes contour generation from UAV images very difficult. This paper discusses effects related to these three aspects. In this study, we tried to generate a 1:1,000 scale map from the dataset using EOPs generated with software developed in-house. We evaluated the Y-disparity of the tie points extracted automatically through the photogrammetric incremental bundle adjustment process and could confirm that stereoscopic viewing is possible. Stereoscopic plotting work was carried out by a professional photogrammetrist. In order to analyse the accuracy of map drawing using stereoscopic vision, we compared the horizontal and vertical position differences between adjacent models after drawing a specific model. The results of the analysis showed that the errors were within the specification for a 1:1,000 map. Although the Y-parallax can be eliminated, it is still necessary to improve the absolute ground position accuracy in order to apply this technique in actual work; there are a few models in which the difference in height between adjacent models is about 40 cm. We analysed the stability of UAV images by checking angle differences between adjacent images, and we analysed the average area covered by one stereo model and discussed the possible difficulty associated with this narrow coverage. In future work, we will consider how to reduce position errors and improve map drawing performance from UAVs.
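
    One of the checks mentioned above, evaluating the Y-disparity (Y-parallax) of automatically extracted tie points after bundle adjustment, reduces to a simple statistic over matched image coordinates in epipolar-resampled (stereo-normalized) images. The arrays below are hypothetical, and this is not the in-house software referred to in the paper:

    ```python
    import numpy as np

    def y_parallax_stats(pts_left, pts_right):
        """Residual Y-disparity of tie points in an epipolar-resampled stereo pair.

        pts_left, pts_right: (N, 2) arrays of matched (x, y) image coordinates in
        pixels; after a good relative orientation the y coordinates should agree.
        """
        dy = np.asarray(pts_left, float)[:, 1] - np.asarray(pts_right, float)[:, 1]
        return {"mean": float(dy.mean()),
                "rmse": float(np.sqrt((dy ** 2).mean())),
                "max_abs": float(np.abs(dy).max())}

    # Hypothetical tie points (pixels); an RMSE well below ~1 px suggests that
    # comfortable stereoscopic viewing is possible.
    left = np.array([[512.3, 401.2], [1030.8, 220.5], [1899.1, 978.4]])
    right = np.array([[310.9, 401.5], [829.7, 220.1], [1698.0, 978.9]])
    print(y_parallax_stats(left, right))
    ```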

  11. Evaluation of the implementation of an integrated program for musculoskeletal system care.

    PubMed

    Larrañaga, Igor; Soto-Gordoa, Myriam; Arrospide, Arantzazu; Jauregi, María Luz; Millas, Jesús; San Vicente, Ricardo; Aguirrebeña, Jabier; Mar, Javier

    The chronic nature of musculoskeletal diseases requires integrated care involving Primary Care and the specialities of Rheumatology, Traumatology and Rehabilitation. The aim of this study was to assess the implementation of an integrated organizational model for osteoporosis, low back pain, shoulder disease and knee disease using Deming's continuous improvement process, considering referrals and resource consumption. In the planning stage, a simulation model was used to predict the evolution of musculoskeletal disease resource consumption and to carry out a Budget Impact Analysis from 2012 to 2020 in the Goierri-Alto Urola region. In the checking stage, the status of the process in 2014 was evaluated using statistical analysis to check the degree of achievement of the objectives for each speciality. The simulation models showed that the population with musculoskeletal disease in Goierri-Alto Urola will increase by 4.4% by 2020. As a consequence, the expenditure of a conventional healthcare system will have increased by 5.9%. However, if the intervention reaches its objectives, the budget would decrease by 8.5%. The statistical analysis evidenced a decline in referrals to the Traumatology service and a reduction in successive consultations in all specialities. The implementation of the integrated organizational model for osteoporosis, low back pain, shoulder disease and knee disease is still at an early stage. However, the empowerment of Primary Care has improved patient referrals and reduced costs. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  12. Value of Excess Pressure Integral for Predicting 15-Year All-Cause and Cardiovascular Mortalities in End-Stage Renal Disease Patients.

    PubMed

    Huang, Jui-Tzu; Cheng, Hao-Min; Yu, Wen-Chung; Lin, Yao-Ping; Sung, Shih-Hsien; Wang, Jiun-Jr; Wu, Chung-Li; Chen, Chen-Huan

    2017-11-29

    The excess pressure integral (XSPI), derived from analysis of the arterial pressure curve, may be a significant predictor of cardiovascular events in high-risk patients. We comprehensively investigated the prognostic value of XSPI for predicting long-term mortality in end-stage renal disease patients undergoing regular hemodialysis. A total of 267 uremic patients (50.2% female; mean age 54.2±14.9 years) receiving regular hemodialysis for more than 6 months were enrolled. Cardiovascular parameters were obtained by echocardiography and applanation tonometry. Calibrated carotid arterial pressure waveforms were analyzed according to the wave-transmission and reservoir-wave theories. Multivariable Cox proportional hazard models were constructed to account for age, sex, diabetes mellitus, albumin, body mass index, and hemodialysis treatment adequacy. The incremental utility of the parameters for risk stratification was assessed by net reclassification improvement. During a median follow-up of 15.3 years, 124 deaths (46.4%) occurred. Baseline XSPI was significantly predictive of all-cause (hazard ratio per 1 SD 1.4, 95% confidence interval 1.15-1.70, P =0.0006) and cardiovascular mortalities (1.47, 1.18-1.84, P =0.0006) after accounting for the covariates. The addition of XSPI to the base prognostic model significantly improved prediction of both all-cause mortality (net reclassification improvement=0.1549, P =0.0012) and cardiovascular mortality (net reclassification improvement=0.1535, P =0.0033). XSPI was superior to carotid-pulse wave velocity, forward and backward wave amplitudes, and left ventricular ejection fraction in consideration of overall independent and incremental prognostic value. In end-stage renal disease patients undergoing regular hemodialysis, XSPI was significantly predictive of long-term mortality and demonstrated an incremental value over conventional prognostic factors. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
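
    The prognostic claims above rest on a multivariable Cox proportional hazards model with XSPI entered per 1 SD alongside the listed covariates. A minimal sketch of such a fit using the lifelines package; the data frame and its column names are assumptions, not the study data:

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    def fit_xspi_model(df: pd.DataFrame) -> CoxPHFitter:
        """Cox model for all-cause mortality with XSPI standardized to 1 SD.

        df is assumed to hold one row per patient: follow-up time in years ('time'),
        the event indicator ('death'), 'xspi_sd' (XSPI divided by its SD), and the
        adjustment covariates named in the abstract (illustrative column names).
        """
        covariates = ["xspi_sd", "age", "sex", "diabetes", "albumin", "bmi", "ktv"]
        cph = CoxPHFitter()
        cph.fit(df[covariates + ["time", "death"]],
                duration_col="time", event_col="death")
        # cph.hazard_ratios_["xspi_sd"] gives the hazard ratio per 1 SD of XSPI.
        return cph
    ```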

  13. The impact of speciated VOCs on regional ozone increment derived from measurements at the UK EMEP supersites between 1999 and 2012

    NASA Astrophysics Data System (ADS)

    Malley, C. S.; Braban, C. F.; Dumitrean, P.; Cape, J. N.; Heal, M. R.

    2015-03-01

    The impact of 27 volatile organic compounds (VOC) on the regional O3 increment was investigated using measurements made at the UK EMEP supersites Harwell (1999-2001 and 2010-2012) and Auchencorth (2012). Ozone at these sites is representative of rural O3 in south-east England and northern UK, respectively. Monthly-diurnal regional O3 increment was defined as the difference between the regional and hemispheric background O3 concentrations, respectively derived from oxidant vs. NOx correlation plots, and cluster analysis of back trajectories arriving at Mace Head, Ireland. At Harwell, which had substantially greater regional ozone increments than at Auchencorth, variation in the regional O3 increment mirrored afternoon depletion of VOCs due to photochemistry (after accounting for diurnal changes in boundary layer mixing depth, and weighting VOC concentrations according to their photochemical ozone creation potential). A positive regional O3 increment occurred consistently during the summer, during which time afternoon photochemical depletion was calculated for the majority of measured VOCs, and to the greatest extent for ethene and m + p-xylene. This indicates that, of the measured VOCs, ethene and m + p-xylene emissions reduction would be most effective in reducing the regional O3 increment, but that reductions in a larger number of VOCs would be required for further improvement. The VOC diurnal photochemical depletion was linked to the sources of the VOC emissions through the integration of gridded VOC emissions estimates over 96 h air-mass back trajectories. This demonstrated that the effectiveness of VOC gridded emissions for use in measurement and modelling studies is limited by the highly aggregated nature of the 11 SNAP source sectors in which they are reported, as monthly variation in speciated VOC trajectory emissions did not reflect monthly changes in individual VOC diurnal photochemical depletion. Additionally, the major VOC emission source sectors during elevated regional O3 increment at Harwell were more narrowly defined through disaggregation of the SNAP emissions to 91 NFR codes (i.e. sectors 3D2 (domestic solvent use), 3D3 (other product use) and 2D2 (food and drink)). However, spatial variation in the contribution of NFR sectors to parent SNAP emissions could only be accounted for at the country level. Hence, the future reporting of gridded VOC emissions in source sectors more highly disaggregated than currently (e.g. to NFR codes) would facilitate a more precise identification of those VOC sources most important for mitigation of the impact of VOCs on O3 formation. In summary, this work presents a clear methodology for achieving a coherent VOC regional-O3-impact chemical climate using measurement data and explores the effect of limited emission and measurement species on the understanding of the regional VOC contribution to O3 concentrations.

  14. The impact of speciated VOCs on regional ozone increment derived from measurements at the UK EMEP supersites between 1999 and 2012

    NASA Astrophysics Data System (ADS)

    Malley, C. S.; Braban, C. F.; Dumitrean, P.; Cape, J. N.; Heal, M. R.

    2015-07-01

    The impact of 27 volatile organic compounds (VOCs) on the regional O3 increment was investigated using measurements made at the UK EMEP supersites Harwell (1999-2001 and 2010-2012) and Auchencorth (2012). Ozone at these sites is representative of rural O3 in south-east England and northern UK, respectively. The monthly-diurnal regional O3 increment was defined as the difference between the regional and hemispheric background O3 concentrations, respectively, derived from oxidant vs. NOx correlation plots, and cluster analysis of back trajectories arriving at Mace Head, Ireland. At Harwell, which had substantially greater regional O3 increments than Auchencorth, variation in the regional O3 increment mirrored afternoon depletion of anthropogenic VOCs due to photochemistry (after accounting for diurnal changes in boundary layer mixing depth, and weighting VOC concentrations according to their photochemical ozone creation potential). A positive regional O3 increment occurred consistently during the summer, during which time afternoon photochemical depletion was calculated for the majority of measured VOCs, and to the greatest extent for ethene and m+p-xylene. This indicates that, of the measured VOCs, ethene and m+p-xylene emissions reduction would be most effective in reducing the regional O3 increment but that reductions in a larger number of VOCs would be required for further improvement. The VOC diurnal photochemical depletion was linked to anthropogenic sources of the VOC emissions through the integration of gridded anthropogenic VOC emission estimates over 96 h air-mass back trajectories. This demonstrated that one factor limiting the effectiveness of VOC gridded emissions for use in measurement and modelling studies is the highly aggregated nature of the 11 SNAP (Selected Nomenclature for Air Pollution) source sectors in which they are reported, as monthly variation in speciated VOC trajectory emissions did not reflect monthly changes in individual VOC diurnal photochemical depletion. Additionally, the major VOC emission source sectors during elevated regional O3 increment at Harwell were more narrowly defined through disaggregation of the SNAP emissions to 91 NFR (Nomenclature for Reporting) codes (i.e. sectors 3D2 (domestic solvent use), 3D3 (other product use) and 2D2 (food and drink)). However, spatial variation in the contribution of NFR sectors to parent SNAP emissions could only be accounted for at the country level. Hence, the future reporting of gridded VOC emissions in source sectors more highly disaggregated than currently (e.g. to NFR codes) would facilitate a more precise identification of those VOC sources most important for mitigation of the impact of VOCs on O3 formation. In summary, this work presents a clear methodology for achieving a coherent VOC, regional-O3-impact chemical climate using measurement data and explores the effect of limited emission and measurement species on the understanding of the regional VOC contribution to O3 concentrations.
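
    The central quantity here, the monthly-diurnal regional O3 increment, is simply the regional background minus the hemispheric background, with the VOC side of the analysis weighted by photochemical ozone creation potential (POCP). A small sketch of both calculations; the concentrations and POCP values are placeholders, not the study's inputs:

    ```python
    import numpy as np

    def regional_o3_increment(regional_background, hemispheric_background):
        """Regional O3 increment (ppb): regional minus hemispheric background."""
        return np.asarray(regional_background, float) - np.asarray(hemispheric_background, float)

    def pocp_weighted_voc(concentrations_ppb, pocp):
        """Sum of VOC concentrations weighted by photochemical ozone creation potential."""
        return sum(concentrations_ppb[v] * pocp[v] for v in concentrations_ppb)

    # Hypothetical values for one afternoon hour in a summer month.
    print(regional_o3_increment(46.0, 38.0))                   # ppb
    print(pocp_weighted_voc({"ethene": 0.8, "m+p-xylene": 0.3},
                            {"ethene": 100.0, "m+p-xylene": 106.0}))
    ```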

  15. KSC-2013-3076

    NASA Image and Video Library

    2013-07-22

    HOUSTON - NASA astronaut Serena Aunon puts on her orange launch-and-entry suit for a fit check evaluation of The Boeing Company's CST-100 spacecraft at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  16. KSC-2013-3090

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068329 - NASA astronaut Randy Bresnik is interviewed by the media before he enters The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Bresnik's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  17. KSC-2013-3079

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068264 - NASA astronaut Serena Aunon's boots are covered before she enters The Boeing Company's CST-100 spacecraft for a fit check evaluation at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  18. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  19. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the Space Station Processing Facility, STS-92 crew members discuss the Pressurized Mating Adapter -3 (PMA-3), in the background, with Boeing workers. From left are Pilot Pamela A. Melroy and Mission Specialists Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA), and Peter J.K. 'Jeff' Wisoff (Ph.D.). The STS-92 crew are taking part in a Leak Seal Kit Fit Check in connection with the PMA-3. Other crew members participating are Commander Brian Duffy and Mission Specialists Leroy Chiao (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  20. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the Space Station Processing Facility, STS-92 crew members discuss the Pressurized Mating Adapter-3, in the background, with workers from Boeing. At the far left is Mission Specialist William Surles 'Bill' McArthur Jr.; facing the camera are Pilot Pamela A. Melroy and Mission Specialist Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA). Also participating are the other crew members, Commander Brian Duffy and Mission Specialists Leroy Chiao (Ph.D.), Peter J.K. 'Jeff' Wisoff (Ph.D.) and Michael E. Lopez-Alegria. The crew are taking part in a Leak Seal Kit Fit Check. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  1. STS-92 crew takes part in a Leak Seal Kit Fit Check in the SSPF

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the Space Station Processing Facility, STS-92 crew members discuss the Pressurized Mating Adapter -3 (PMA-3) in the background with Boeing workers. From left are Pilot Pamela A. Melroy and Mission Specialists Koichi Wakata, who represents the National Space Development Agency of Japan (NASDA), and Peter J.K. 'Jeff' Wisoff (Ph.D.). The STS-92 crew are taking part in a Leak Seal Kit Fit Check in connection with the PMA-3. Other crew members participating are Commander Brian Duffy and Mission Specialists Leroy Chiao (Ph.D.), Michael E. Lopez-Alegria and William Surles 'Bill' McArthur Jr. The mission payload also includes an integrated truss structure (Z-1 truss). Launch of STS-92 is scheduled for Feb. 24, 2000.

  2. [Social participation and vocational integration as an objective of child and adolescent psychiatric rehabilitation].

    PubMed

    Voll, Renate

    2009-09-01

    In order to avoid threatening social disintegration, it is important to diagnose disturbances of social participation at an early stage in children and adolescents with chronic mental disorders, and also in physically disabled children, and to commence rehabilitation measures. The need for rehabilitation, the ability to rehabilitate and the rehabilitation prognosis are important for identifying the individual rehabilitation goals. A multi-axial diagnosis according to the ICF, with a determination of adaptability, a behavioural analysis, skills, activity and participation, is required. For disabled children, there are only a few ICF checklists for diagnosing social participation. For this reason, the ICF checklist CASP (Child & Adolescent Scale of Participation) for measuring social participation according to Bedell was translated; it is shown in the appendix.

  3. An Overview of Integration and Test of the James Webb Space Telescope Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond

    2007-01-01

    The James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approx.40K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI) including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1x2.2x1.9m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.

  4. The Nursing Diagnosis Disturbed Thought Processes: An Integrative Review.

    PubMed

    Escalada-Hermández, Paula; Marín-Fernández, Blanca

    2017-09-08

    To analyze and synthesize the existing scientific literature in relation to the nursing diagnosis disturbed thought processes (DTPs) (00130). An integrative review was developed, identifying relevant papers through a search of international and Spanish databases and the examination of key manuals. Theoretical papers propose modifications to the nursing diagnosis DTPs. Most of the research papers offer data about its frequency in different clinical settings. There is interest in the nursing diagnosis DTPs. However, the available evidence is not very extensive and further work is necessary in order to refine this nursing diagnosis. The re-inclusion of DTPs in the NANDA-I classification will especially contribute to increasing its utility in mental healthcare. © 2017 NANDA International, Inc.

  5. Nonlinear analysis of a closed-loop tractor-semitrailer vehicle system with time delay

    NASA Astrophysics Data System (ADS)

    Liu, Zhaoheng; Hu, Kun; Chung, Kwok-wai

    2016-08-01

    In this paper, a nonlinear analysis is performed on a closed-loop system of articulated heavy vehicles with driver steering control. The nonlinearity arises from the nonlinear cubic tire force model. An integration method is employed to derive an analytical periodic solution of the system in the neighbourhood of the critical speed. The results show that excellent accuracy can be achieved for the calculation of periodic solutions arising from Hopf bifurcation of the vehicle motion. A criterion is obtained for detecting the Bautin bifurcation which separates branches of supercritical and subcritical Hopf bifurcations. The integration method is compared to the incremental harmonic balance method in both supercritical and subcritical scenarios.

  6. Design Of Combined Stochastic Feedforward/Feedback Control

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1989-01-01

    The methodology accommodates a variety of control structures and design techniques. In the methodology for combined stochastic feedforward/feedback control, the main objectives of the feedforward and feedback control laws are seen clearly. The inclusion of error-integral feedback, dynamic compensation, a rate-command control structure, and the like is an integral element of the methodology. Another advantage of the methodology is the flexibility to develop a variety of techniques for the design of feedback control with arbitrary structures to obtain the feedback controller: these include stochastic output feedback, multiconfiguration control, decentralized control, or frequency-domain and classical control methods. Control modes of the system include capture and tracking of the localizer and glideslope, crab, decrab, and flare. Using the recommended incremental implementation, the control laws were simulated on a digital computer and connected with a nonlinear digital simulation of the aircraft and its systems.

  7. Improving Control System Cyber-State Awareness using Known Secure Sensor Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Milos Manic; Miles McQueen

    This paper presents the design and simulation of a low-cost, low-false-alarm-rate method for improved cyber-state awareness of critical control systems: the Known Secure Sensor Measurements (KSSM) method. The KSSM concept relies on physical measurements to detect malicious falsification of the control system's state. The KSSM method can be incrementally integrated with already installed control systems for enhanced resilience. This paper reviews the previously developed theoretical KSSM concept and then describes a simulation of the KSSM system. A simulated control system network is integrated with the KSSM components. The effectiveness of detection of various intrusion scenarios is demonstrated on several control system network topologies.

  8. Policy coherence, integration, and proportionality in tobacco control: Should tobacco sales be limited to government outlets?

    PubMed

    Smith, Elizabeth A; McDaniel, Patricia A; Hiilamo, Heikki; Malone, Ruth E

    2017-08-01

    Multiple factors, including marijuana decriminalization/legalization, tobacco endgame discourse, and alcohol industry pressures, suggest that the retail regulatory environment for psychoactive or addictive substances is a dynamic one in which new options may be considered. In most countries, the regulation of tobacco, marijuana, and alcohol is neither coherent, nor integrated, nor proportional to the potential harms caused by these substances. We review the possible consequences of restricting tobacco sales to outlets run by government-operated alcohol retail monopolies, as well as the likely obstacles to such a policy. Such a move would allow governments more options for regulating tobacco sales, and increase coherence, integration, and proportionality of substance regulation. It might also serve as an incremental step toward an endgame goal of eliminating sales of commercial combustible tobacco.

  9. The integrable quantum group invariant A2n-1(2) and Dn+1(2) open spin chains

    NASA Astrophysics Data System (ADS)

    Nepomechie, Rafael I.; Pimenta, Rodrigo A.; Retore, Ana L.

    2017-11-01

    A family of A2n(2) integrable open spin chains with Uq(Cn) symmetry was recently identified in arXiv:1702.01482. We identify here in a similar way a family of A2n-1(2) integrable open spin chains with Uq(Dn) symmetry, and two families of Dn+1(2) integrable open spin chains with Uq(Bn) symmetry. We discuss the consequences of these symmetries for the degeneracies and multiplicities of the spectrum. We propose Bethe ansatz solutions for two of these models, whose completeness we check numerically for small values of n and chain length N. We find formulas for the Dynkin labels in terms of the numbers of Bethe roots of each type, which are useful for determining the corresponding degeneracies. In an appendix, we briefly consider Dn+1(2) chains with other integrable boundary conditions, which do not have quantum group symmetry.

  10. Comprehensive Anti-error Study on Power Grid Dispatching Based on Regional Regulation and Integration

    NASA Astrophysics Data System (ADS)

    Zhang, Yunju; Chen, Zhongyi; Guo, Ming; Lin, Shunsheng; Yan, Yinyang

    2018-01-01

    As power systems grow in capacity and trend toward larger generating units and higher voltages, dispatching operations are becoming more frequent and complicated, and the probability of operating errors increases. To address the lack of anti-error functions, the limited scheduling functionality and the low working efficiency of the technical support system under regional regulation and integration, this paper proposes an integrated, cloud-computing-based architecture for anti-error checking of power grid dispatching. An integrated anti-error system spanning the Energy Management System (EMS) and the Operation Management System (OMS) is also constructed. The proposed architecture has good scalability and adaptability; it can improve computational efficiency, reduce the cost of system operation and maintenance, and enhance the capability of regional regulation and anti-error checking, giving it broad development prospects.

  11. Assessment of the patient, health system, and population effects of Xpert MTB/RIF and alternative diagnostics for tuberculosis in Tanzania: an integrated modelling approach.

    PubMed

    Langley, Ivor; Lin, Hsien-Ho; Egwaga, Saidi; Doulla, Basra; Ku, Chu-Chang; Murray, Megan; Cohen, Ted; Squire, S Bertel

    2014-10-01

    Several promising new diagnostic methods and algorithms for tuberculosis have been endorsed by WHO. National tuberculosis programmes now face the decision on which methods to implement and where to place them in the diagnostic algorithm. We used an integrated model to assess the effects of different algorithms of Xpert MTB/RIF and light-emitting diode (LED) fluorescence microscopy in Tanzania. To understand the effects of new diagnostics from the patient, health system, and population perspective, the model incorporated and linked a detailed operational component and a transmission component. The model was designed to represent the operational and epidemiological context of Tanzania and was used to compare the effects and cost-effectiveness of different diagnostic options. Among the diagnostic options considered, we identified three strategies as cost effective in Tanzania. Full scale-up of Xpert would have the greatest population-level effect with the highest incremental cost: 346 000 disability-adjusted life-years (DALYs) averted with an additional cost of US$36·9 million over 10 years. The incremental cost-effectiveness ratio (ICER) of Xpert scale-up ($169 per DALY averted, 95% credible interval [CrI] 104-265) is below the willingness-to-pay threshold ($599) for Tanzania. Same-day LED fluorescence microscopy is the next most effective strategy with an ICER of $45 (95% CrI 25-74), followed by LED fluorescence microscopy with an ICER of $29 (6-59). Compared with same-day LED fluorescence microscopy and Xpert full rollout, targeted use of Xpert in presumptive tuberculosis cases with HIV infection, either as an initial diagnostic test or as a follow-on test to microscopy, would produce DALY gains at a higher incremental cost and therefore is dominated in the context of Tanzania. For Tanzania, this integrated modelling approach predicts that full rollout of Xpert is a cost-effective option for tuberculosis diagnosis and has the potential to substantially reduce the national tuberculosis burden. It also estimates the substantial level of funding that will need to be mobilised to translate this into clinical practice. This approach could be adapted and replicated in other developing countries to inform rational health policy formulation. Copyright © 2014 Langley et al. Open Access article distributed under the terms of CC BY-NC-SA.

  12. TSH increment and the risk of incident type 2 diabetes mellitus in euthyroid subjects.

    PubMed

    Jun, Ji Eun; Jin, Sang-Man; Jee, Jae Hwan; Bae, Ji Cheol; Hur, Kyu Yeon; Lee, Moon-Kyu; Kim, Sun Wook; Kim, Jae Hyeon

    2017-03-01

    Thyroid function is known to influence glucose metabolism, and thyroid-stimulating hormone is the most useful parameter in screening for thyroid dysfunction. Therefore, the aim of this study was to investigate the incidence of type 2 diabetes according to baseline thyroid-stimulating hormone level and thyroid-stimulating hormone change in euthyroid subjects. We identified and enrolled 17,061 euthyroid subjects without diabetes among participants who had undergone consecutive thyroid function tests between 2006 and 2012 as part of a yearly health check-up program. Thyroid-stimulating hormone changes were determined by subtracting the baseline thyroid-stimulating hormone level from the thyroid-stimulating hormone level at 1 year before the diagnosis of diabetes or at the end of follow-up in subjects who did not develop diabetes. During 84,595 person-years of follow-up, there were 956 new cases of type 2 diabetes. Cox proportional hazards models showed that the risk of incident type 2 diabetes was significantly increased with each 1 μIU/mL increment in TSH after adjustment for multiple confounding factors (hazard ratio = 1.13, 95% confidence interval: 1.07-1.20, P < 0.001). Compared with individuals in the lowest tertile (-4.08 to 0.34 μIU/mL), those in the highest thyroid-stimulating hormone change tertile (0.41-10.84 μIU/mL) were at greater risk for incident type 2 diabetes (hazard ratio = 1.25, 95% confidence interval: 1.05-1.48, P for trend = 0.011). However, baseline thyroid-stimulating hormone level and tertile were not associated with the risk for diabetes. A prominent increase in thyroid-stimulating hormone concentration can be an additional risk factor for the development of type 2 diabetes in euthyroid subjects.

  13. The effectiveness and cost-effectiveness of intraoperative imaging in high-grade glioma resection; a comparative review of intraoperative ALA, fluorescein, ultrasound and MRI.

    PubMed

    Eljamel, M Sam; Mahboob, Syed Osama

    2016-12-01

    Surgical resection of high-grade gliomas (HGG) is standard therapy because it imparts significant progression-free survival (PFS) and overall survival (OS) benefits. However, HGG tumor margins are indistinguishable from normal brain during surgery. Hence intraoperative technology such as fluorescence (ALA, fluorescein) and intraoperative ultrasound (IoUS) and MRI (IoMRI) has been deployed. This study compares the effectiveness and cost-effectiveness of these technologies. Critical literature review and meta-analyses, using the MEDLINE/PubMed service. The list of references in each article was double-checked for any missing references. We included all studies that reported the use of ALA, fluorescein (FLCN), IoUS or IoMRI to guide HGG surgery. The meta-analyses were conducted according to statistical heterogeneity between studies. If there was no heterogeneity, a fixed-effects model was used; otherwise, a random-effects model was used. Statistical heterogeneity was explored by the χ² and inconsistency (I²) statistics. To assess cost-effectiveness, we calculated the incremental cost per quality-adjusted life-year (QALY). Gross total resection (GTR) after ALA, FLCN, IoUS and IoMRI was 69.1%, 84.4%, 73.4% and 70% respectively. The differences were not statistically significant. All four techniques led to significant prolongation of PFS and tended to prolong OS. However, none of these technologies led to significant prolongation of OS compared to controls. The cost/QALY was $16,218, $3181, $6049 and $32,954 for ALA, FLCN, IoUS and IoMRI respectively. ALA, FLCN, IoUS and IoMRI significantly improve GTR and PFS of HGG. Their incremental cost was below the threshold for cost-effectiveness of HGG therapy, denoting that each intraoperative technology was cost-effective on its own. Copyright © 2016 Elsevier B.V. All rights reserved.
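
    The pooling approach described above, a fixed-effects model in the absence of between-study heterogeneity and a random-effects model otherwise, with heterogeneity judged by the χ² (Cochran's Q) and I² statistics, can be sketched with an inverse-variance fixed-effect pool. The effect sizes and standard errors below are invented for illustration:

    ```python
    import numpy as np

    def fixed_effect_pool(effects, std_errs):
        """Inverse-variance fixed-effect pooled estimate plus Q and I^2 heterogeneity."""
        y = np.asarray(effects, dtype=float)
        w = 1.0 / np.asarray(std_errs, dtype=float) ** 2
        pooled = np.sum(w * y) / np.sum(w)
        se_pooled = np.sqrt(1.0 / np.sum(w))
        q = np.sum(w * (y - pooled) ** 2)            # Cochran's Q (chi-square, k-1 df)
        df = len(y) - 1
        i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
        return pooled, se_pooled, q, i2

    # Hypothetical log odds ratios for gross total resection and their standard errors.
    print(fixed_effect_pool([0.80, 0.50, 0.65], [0.25, 0.30, 0.20]))
    ```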

  14. "Missing perikymata"--fact or fiction? A study on chimpanzee (Pan troglodytes verus) canines.

    PubMed

    Kierdorf, Horst; Witzel, Carsten; Kierdorf, Uwe; Skinner, Matthew M; Skinner, Mark F

    2015-06-01

    Recently, a lower than expected number of perikymata between repetitive furrow-type hypoplastic defects has been reported in chimpanzee canines from the Fongoli site, Senegal (Skinner and Pruetz: Am J Phys Anthropol 149 (2012) 468-482). Based on an observation in a localized enamel fracture surface of a canine of a chimpanzee from the Taï Forest (Ivory Coast), these authors inferred that a nonemergence of striae of Retzius could be the cause for the "missing perikymata" phenomenon in the Fongoli chimpanzees. To check this inference, we analyzed the structure of outer enamel in three chimpanzee canines. The teeth were studied using light-microscopic and scanning-electron microscopic techniques. Our analysis of the specimen upon which Skinner and Pruetz (Am J Phys Anthropol 149 (2012) 468-482) had made their original observation does not support their hypothesis. We demonstrate that the enamel morphology described by them is not caused by a nonemergence of striae of Retzius but can be attributed to structural variations in outer enamel that result in a differential fracture behavior. Although rejecting the presumed existence of nonemergent striae of Retzius, our study provided evidence that, in furrow-type hypoplastic defects, a pronounced tapering of Retzius increments can occur, with the striae of Retzius forming acute angles with the outer enamel surface. We suggest that in such cases the outcrop of some striae of Retzius is essentially unobservable at the enamel surface, causing too low perikymata counts. The pronounced tapering of Retzius increments in outer enamel presumably reflects a mild to moderate disturbance of the function of late secretory ameloblasts. © 2015 Wiley Periodicals, Inc.

  15. Integrating Farm Production and Natural Resource Management in Tasmania, Australia

    ERIC Educational Resources Information Center

    Cotching, W. E.; Sherriff, L.; Kilpatrick, S.

    2009-01-01

    This paper reports on the social learning from a project aimed to increase the knowledge and capacity of a group of farmers in Tasmania, Australia, to reduce the impacts of intensive agriculture on soil health and waterways, and to optimise the efficient use of on-farm inputs. The plan-do-check-review cycle adopted in this project required the…

  16. Crewmember working on the spacelab Zeolite Crystal Growth experiment.

    NASA Image and Video Library

    1992-07-09

    STS050-02-001 (9 July 1992) --- View showing Payload Specialists Bonnie Dunbar and Larry DeLucas in the aft section of the U. S. Microgravity Laboratory-1. Dunbar is preparing to load a sample in the Crystal Growth Furnace (CGF) Integrated Furnace Experiment Assembly (IFEA) in rack 9 of the Microgravity Laboratory. DeLucas is checking out the multipurpose Glovebox Facility.

  17. NEXT Single String Integration Test Results

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Patterson, Michael J.; Pinero, Luis; Herman, Daniel A.; Snyder, Steven John

    2010-01-01

    As a critical part of NASA's Evolutionary Xenon Thruster (NEXT) test validation process, a single string integration test was performed on the NEXT ion propulsion system. The objectives of this test were to verify that an integrated system of major NEXT ion propulsion system elements meets project requirements, to demonstrate that the integrated system is functional across the entire power processor and xenon propellant management system input ranges, and to demonstrate to potential users that the NEXT propulsion system is ready for transition to flight. Propulsion system elements included in this system integration test were an engineering model ion thruster, an engineering model propellant management system, an engineering model power processor unit, and a digital control interface unit simulator that acted as a test console. Project requirements that were verified during this system integration test included individual element requirements, integrated system requirements, and fault handling. This paper will present the results of these tests, which include: integrated ion propulsion system demonstrations of performance, functionality and fault handling; a thruster re-performance acceptance test to establish baseline performance; a risk-reduction PMS-thruster integration test; and propellant management system calibration checks.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
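
    As a generic illustration of the integrity-checking idea described above (a minimal sketch in Python; it does not reproduce the actual CVMFS catalog or Frontier response formats, and the function and variable names are ours), a client can recompute a secure hash of a received payload and compare it with the digest published by the trusted source:

      import hashlib
      import hmac

      def verify_integrity(payload: bytes, expected_sha256_hex: str) -> bool:
          """Recompute the SHA-256 digest of the downloaded payload and compare it,
          in constant time, with the digest published by the trusted source."""
          digest = hashlib.sha256(payload).hexdigest()
          return hmac.compare_digest(digest, expected_sha256_hex)

      # Hypothetical example: in practice the expected digest would come from a
      # digitally signed catalog or a signed response header.
      data = b"example file content"
      expected = hashlib.sha256(data).hexdigest()
      print(verify_integrity(data, expected))              # True
      print(verify_integrity(b"tampered data", expected))  # False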

  19. Integrated controls and health monitoring for chemical transfer propulsion

    NASA Technical Reports Server (NTRS)

    Millis, Marc G.; Binder, Michael P.

    1990-01-01

    NASA is reviewing various propulsion technologies for exploring space. The requirements are examined for one enabling propulsion technology: Integrated Controls and Health Monitoring (ICHM) for Chemical Transfer Propulsion (CTP). Functional requirements for a CTP-ICHM system are proposed from tentative mission scenarios, vehicle configurations, CTP specifications, and technical feasibility. These CTP-ICHM requirements go beyond traditional reliable operation and emergency shutoff control to include: (1) enhanced mission flexibility; (2) continuously variable throttling; (3) tank-head start control; (4) automated prestart and post-shutoff engine check; (5) monitoring of space exposure degradation; and (6) product evolution flexibility. Technology development plans are also discussed.

  20. Addressable test matrix for measuring analog transfer characteristics of test elements used for integrated process control and device evaluation

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor)

    1988-01-01

    A set of addressable test structures, each of which uses addressing schemes to access individual elements of the structure in a matrix, is used to test the quality of a wafer before integrated circuits produced thereon are diced, packaged and subjected to final testing. The electrical characteristic of each element is checked and compared to the electrical characteristic of all other like elements in the matrix. The effectiveness of the addressable test matrix is in readily analyzing the electrical characteristics of the test elements and in providing diagnostic information.

  1. Long period perturbations of earth satellite orbits. [Von Zeipel method and zonal harmonics

    NASA Technical Reports Server (NTRS)

    Wang, K. C.

    1979-01-01

    All the equations involved in extending the PS phi solution to include the long periodic and second order secular effects of the zonal harmonics are presented. Topics covered include DSphi elements and relations for their canonical transformation into the PS phi elements; the solution algorithm based on the Von Zeipel method; and the elimination of long periodic terms and analytical integration of primed variables. The equations were entered into the ASOP program, checked out, and verified. Comparisons with numerical integrations show the long period theory to be accurate within several meters after 800 revolutions.

  2. Parallel Implementation of Numerical Solution of Few-Body Problem Using Feynman's Continual Integrals

    NASA Astrophysics Data System (ADS)

    Naumenko, Mikhail; Samarin, Viacheslav

    2018-02-01

    A modern parallel computing algorithm has been applied to the solution of the few-body problem. The approach is based on Feynman's continual integrals method, implemented in the C++ programming language using NVIDIA CUDA technology. A wide range of 3-body and 4-body bound systems has been considered, including nuclei described as consisting of protons and neutrons (e.g., 3,4He) and nuclei described as consisting of clusters and nucleons (e.g., 6He). The correctness of the results was checked by comparison with the exactly solvable 4-body oscillatory system and with experimental data.

  3. Unmanned Systems Acquisition and Technology Development: Is a More Integrated Approach Required

    DTIC Science & Technology

    2010-11-01

    The initial niche market of college students grew and in a few years, the music industry was forced to completely change its business models as ... accepted and initially more effective sustaining technologies. An example of a sustaining technology would be the compact disk within the music ... industry. Like the advancement from the vinyl record to the cassette tape, the incremental advance to the compact disk was a vast improvement over the

  4. DoD Lead System Integrator (LSI) Transformation - Creating a Model Based Acquisition Framework (MBAF)

    DTIC Science & Technology

    2014-04-30

    cost to acquire systems as design maturity could be verified incrementally as the system was developed vice waiting for specific large "big bang" ... Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities: A review of the roles and ... the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to

  5. Incorporating social network effects into cost-effectiveness analysis: a methodological contribution with application to obesity prevention

    PubMed Central

    Konchak, Chad; Prasad, Kislaya

    2012-01-01

    Objectives: To develop a methodology for integrating social networks into traditional cost-effectiveness analysis (CEA) studies. This will facilitate the economic evaluation of treatment policies in settings where health outcomes are subject to social influence. Design: This is a simulation study based on a Markov model. The lifetime health histories of a cohort are simulated, and health outcomes compared, under alternative treatment policies. Transition probabilities depend on the health of others with whom there are shared social ties. Setting: The methodology developed is shown to be applicable in any healthcare setting where social ties affect health outcomes. The example of obesity prevention is used for illustration under the assumption that weight changes are subject to social influence. Main outcome measures: Incremental cost-effectiveness ratio (ICER). Results: When social influence increases, treatment policies become more cost effective (have lower ICERs). The policy of only treating individuals who span multiple networks can be more cost effective than the policy of treating everyone. This occurs when the network is more fragmented. Conclusions: (1) When network effects are accounted for, they result in very different values of incremental cost-effectiveness ratios (ICERs). (2) Treatment policies can be devised to take network structure into account. The integration makes it feasible to conduct a cost-benefit evaluation of such policies. PMID:23117559
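
    For reference, the incremental cost-effectiveness ratio used as the main outcome measure compares two policies through the ratio of their cost and effect differences; in a standard formulation (notation ours, not taken from the paper):

      \[
      \mathrm{ICER} = \frac{C_{\text{new}} - C_{\text{comparator}}}{E_{\text{new}} - E_{\text{comparator}}}
      \]

    where \(C\) is the expected lifetime cost and \(E\) the expected effectiveness (for example, quality-adjusted life years) of each policy, both obtained here from the Markov cohort simulation.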

  6. A table of intensity increments.

    DOT National Transportation Integrated Search

    1966-01-01

    Small intensity increments can be produced by adding larger intensity increments. A table is presented covering the range of small intensity increments from 0.008682 through 6.020 dB in 60 large intensity increments of 1 dB.
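
    The arithmetic behind such a table can be sketched as follows: when a second signal whose level is d dB below a reference is added to it, the total level rises by 10*log10(1 + 10^(-d/10)) dB. The short Python sketch below illustrates this relationship; whether it matches the exact construction of the 1966 table is an assumption.

      import math

      def level_increment_db(d):
          """Increase in total level (dB) when a second signal d dB below the
          reference is added to it (incoherent power addition)."""
          return 10.0 * math.log10(1.0 + 10.0 ** (-d / 10.0))

      # Adding an equal-level signal raises the level by about 3.01 dB,
      # while a signal 30 dB down adds only about 0.004 dB.
      for d in (0, 1, 10, 20, 30):
          print(f"{d:2d} dB below reference -> increment {level_increment_db(d):.6f} dB")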

  7. Assimilation of gridded terrestrial water storage observations from GRACE into a land surface model

    NASA Astrophysics Data System (ADS)

    Girotto, Manuela; De Lannoy, Gabriëlle J. M.; Reichle, Rolf H.; Rodell, Matthew

    2016-05-01

    Observations of terrestrial water storage (TWS) from the Gravity Recovery and Climate Experiment (GRACE) satellite mission have a coarse resolution in time (monthly) and space (roughly 150,000 km2 at midlatitudes) and vertically integrate all water storage components over land, including soil moisture and groundwater. Data assimilation can be used to horizontally downscale and vertically partition GRACE-TWS observations. This work proposes a variant of existing ensemble-based GRACE-TWS data assimilation schemes. The new algorithm differs in how the analysis increments are computed and applied. Existing schemes correlate the uncertainty in the modeled monthly TWS estimates with errors in the soil moisture profile state variables at a single instant in the month and then apply the increment either at the end of the month or gradually throughout the month. The proposed new scheme first computes increments for each day of the month and then applies the average of those increments at the beginning of the month. The new scheme therefore better reflects submonthly variations in TWS errors. The new and existing schemes are investigated here using gridded GRACE-TWS observations. The assimilation results are validated at the monthly time scale, using in situ measurements of groundwater depth and soil moisture across the U.S. The new assimilation scheme yields improved (although not in a statistically significant sense) skill metrics for groundwater compared to the open-loop (no assimilation) simulations and compared to the existing assimilation schemes. A smaller impact is seen for surface and root-zone soil moisture, which have a shorter memory and receive smaller increments from TWS assimilation than groundwater. These results motivate future efforts to combine GRACE-TWS observations with observations that are more sensitive to surface soil moisture, such as L-band brightness temperature observations from Soil Moisture Ocean Salinity (SMOS) or Soil Moisture Active Passive (SMAP). Finally, we demonstrate that the scaling parameters that are applied to the GRACE observations prior to assimilation should be consistent with the land surface model that is used within the assimilation system.
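
    As a rough illustration of the scheme described above (a minimal sketch; the array names and numbers are made up and this is not the authors' code), the new approach amounts to averaging the daily analysis increments and applying that mean once at the start of the month:

      import numpy as np

      def monthly_increment_new_scheme(daily_increments):
          """Average the daily analysis increments (shape: days x state_dim) and
          apply the mean once at the beginning of the month."""
          return np.mean(daily_increments, axis=0)

      # Hypothetical example: 30 daily increments for a 3-element storage state
      # (surface soil moisture, root-zone soil moisture, groundwater), in mm.
      rng = np.random.default_rng(0)
      daily = rng.normal(0.0, 1.0, size=(30, 3))
      state_at_month_start = np.zeros(3)
      state_at_month_start += monthly_increment_new_scheme(daily)
      print(state_at_month_start)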

  8. Assimilation of Gridded Terrestrial Water Storage Observations from GRACE into a Land Surface Model

    NASA Technical Reports Server (NTRS)

    Girotto, Manuela; De Lannoy, Gabrielle J. M.; Reichle, Rolf H.; Rodell, Matthew

    2016-01-01

    Observations of terrestrial water storage (TWS) from the Gravity Recovery and Climate Experiment (GRACE) satellite mission have a coarse resolution in time (monthly) and space (roughly 150,000 km(sup 2) at midlatitudes) and vertically integrate all water storage components over land, including soil moisture and groundwater. Data assimilation can be used to horizontally downscale and vertically partition GRACE-TWS observations. This work proposes a variant of existing ensemble-based GRACE-TWS data assimilation schemes. The new algorithm differs in how the analysis increments are computed and applied. Existing schemes correlate the uncertainty in the modeled monthly TWS estimates with errors in the soil moisture profile state variables at a single instant in the month and then apply the increment either at the end of the month or gradually throughout the month. The proposed new scheme first computes increments for each day of the month and then applies the average of those increments at the beginning of the month. The new scheme therefore better reflects submonthly variations in TWS errors. The new and existing schemes are investigated here using gridded GRACE-TWS observations. The assimilation results are validated at the monthly time scale, using in situ measurements of groundwater depth and soil moisture across the U.S. The new assimilation scheme yields improved (although not in a statistically significant sense) skill metrics for groundwater compared to the open-loop (no assimilation) simulations and compared to the existing assimilation schemes. A smaller impact is seen for surface and root-zone soil moisture, which have a shorter memory and receive smaller increments from TWS assimilation than groundwater. These results motivate future efforts to combine GRACE-TWS observations with observations that are more sensitive to surface soil moisture, such as L-band brightness temperature observations from Soil Moisture Ocean Salinity (SMOS) or Soil Moisture Active Passive (SMAP). Finally, we demonstrate that the scaling parameters that are applied to the GRACE observations prior to assimilation should be consistent with the land surface model that is used within the assimilation system.

  9. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Ahunbay, E; Li, X

    2015-06-15

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary MU calculation considering the effect of magnetic field from MR-Linac, and verifying the delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to TPS (e.g., Monaco, Elekta), R&V system (Mosaiq, Elekta), and secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in different systems, ArtQA detects and outputs discrepancies between TPS, R&V system and secondary MU calculation system, and delivery. To consider the effect of 1.5T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly-used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA check, eliminating human error associated with visual inspection. While this tool is developed for online replanning to be used on MR-Linac, where the QA needs to be performed rapidly as the patient is lying on the table waiting for the treatment, ArtQA can be used as a general QA tool in radiation oncology practice. This work is partially supported by Elekta Inc.

  10. Modelling spoilage of fresh turbot and evaluation of a time-temperature integrator (TTI) label under fluctuating temperature.

    PubMed

    Nuin, Maider; Alfaro, Begoña; Cruz, Ziortza; Argarate, Nerea; George, Susie; Le Marc, Yvan; Olley, June; Pin, Carmen

    2008-10-31

    Kinetic models were developed to predict the microbial spoilage and the sensory quality of fresh fish and to evaluate the efficiency of a commercial time-temperature integrator (TTI) label, Fresh Check(R), to monitor shelf life. Farmed turbot (Psetta maxima) samples were packaged in PVC film and stored at 0, 5, 10 and 15 degrees C. Microbial growth and sensory attributes were monitored at regular time intervals. The response of the Fresh Check device was measured at the same temperatures during the storage period. The sensory perception was quantified according to a global sensory indicator obtained by principal component analysis as well as to the Quality Index Method, QIM, as described by Rahman and Olley [Rahman, H.A., Olley, J., 1984. Assessment of sensory techniques for quality assessment of Australian fish. CSIRO Tasmanian Regional Laboratory. Occasional paper n. 8. Available from the Australian Maritime College library. Newnham. Tasmania]. Both methods were found equally valid to monitor the loss of sensory quality. The maximum specific growth rate of spoilage bacteria, the rate of change of the sensory indicators and the rate of change of the colour measurements of the TTI label were modelled as a function of temperature. The temperature had a similar effect on the bacteria, sensory and Fresh Check kinetics. At the time of sensory rejection, the bacterial load was ca. 10(5)-10(6) cfu/g. The end of shelf life indicated by the Fresh Check label was close to the sensory rejection time. The performance of the models was validated under fluctuating temperature conditions by comparing the predicted and measured values for all microbial, sensory and TTI responses. The models have been implemented in a Visual Basic add-in for Excel called "Fish Shelf Life Prediction (FSLP)". This program predicts sensory acceptability and growth of spoilage bacteria in fish and the response of the TTI at constant and fluctuating temperature conditions. The program is freely available at http://www.azti.es/muestracontenido.asp?idcontenido=980&content=15&nodo1=30&nodo2=0.
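
    The temperature dependence of the maximum specific growth rate is usually captured with a secondary model; one common choice in predictive microbiology is the Ratkowsky square-root model, sketched below with purely illustrative parameter values (the abstract does not state which secondary model or parameter values were actually fitted):

      def mu_max(temp_c, b=0.03, t_min_c=-8.0):
          """Ratkowsky square-root secondary model: sqrt(mu_max) = b * (T - Tmin)
          for T > Tmin. The values of b and Tmin here are illustrative only."""
          if temp_c <= t_min_c:
              return 0.0
          return (b * (temp_c - t_min_c)) ** 2

      for temp in (0, 5, 10, 15):
          print(f"{temp:2d} degC -> mu_max ~ {mu_max(temp):.4f} 1/h")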

  11. Why and how Mastering an Incremental and Iterative Software Development Process

    NASA Astrophysics Data System (ADS)

    Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe

    2004-06-01

    One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages:
    - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed first, so that the architecture concept is validated very early without the details.
    - A software prototype is available very quickly. It improves the communication between system and software teams, as it makes it possible to check very early and efficiently the common understanding of the system requirements.
    - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. In any case, it improves the learning curve of the software team considerably.
    These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and raises a number of difficulties, such as:
    - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable?
    - How to distinguish stable/unstable and dimensioning/standard requirements?
    - How to plan the development of each increment?
    - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc.
    Several solutions envisaged or already deployed by EADS SPACE Transportation are presented, both from a methodological and a technological point of view:
    - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way.
    - How the CMM approach can help by better formalizing Requirements Management and Planning processes.
    - How Automatic Code Generation with "certified" tools (SCADE) can still dramatically shorten the development cycle.
    The presentation then concludes by showing an evaluation of the cost and schedule reduction based on a pilot application, comparing figures from two similar projects: one with the classical waterfall process, the other with an iterative and incremental approach.

  12. Scalable Integrated Multi-Mission Support System Simulator Release 3.0

    NASA Technical Reports Server (NTRS)

    Kim, John; Velamuri, Sarma; Casey, Taylor; Bemann, Travis

    2012-01-01

    The Scalable Integrated Multi-mission Support System (SIMSS) is a tool that performs a variety of test activities related to spacecraft simulations and ground segment checks. SIMSS is a distributed, component-based, plug-and-play client-server system useful for performing real-time monitoring and communications testing. SIMSS runs on one or more workstations and is designed to be user-configurable or to use predefined configurations for routine operations. SIMSS consists of more than 100 modules that can be configured to create, receive, process, and/or transmit data. The SIMSS/GMSEC innovation is intended to provide missions with a low-cost solution for implementing their ground systems, as well as significantly reducing a mission's integration time and risk.

  13. NCBI BLAST+ integrated into Galaxy.

    PubMed

    Cock, Peter J A; Chilton, John M; Grüning, Björn; Johnson, James E; Soranzo, Nicola

    2015-01-01

    The NCBI BLAST suite has become ubiquitous in modern molecular biology and is used for small tasks such as checking capillary sequencing results of single PCR products, genome annotation or even larger scale pan-genome analyses. For early adopters of the Galaxy web-based biomedical data analysis platform, integrating BLAST into Galaxy was a natural step for sequence comparison workflows. The command line NCBI BLAST+ tool suite was wrapped for use within Galaxy. Appropriate datatypes were defined as needed. The integration of the BLAST+ tool suite into Galaxy has the goal of making common BLAST tasks easy and advanced tasks possible. This project is an informal international collaborative effort, and is deployed and used on Galaxy servers worldwide. Several examples of applications are described here.

  14. Language-guided visual processing affects reasoning: the role of referential and spatial anchoring.

    PubMed

    Dumitru, Magda L; Joergensen, Gitte H; Cruickshank, Alice G; Altmann, Gerry T M

    2013-06-01

    Language is more than a source of information for accessing higher-order conceptual knowledge. Indeed, language may determine how people perceive and interpret visual stimuli. Visual processing in linguistic contexts, for instance, mirrors language processing and happens incrementally, rather than through variously-oriented fixations over a particular scene. The consequences of this atypical visual processing are yet to be determined. Here, we investigated the integration of visual and linguistic input during a reasoning task. Participants listened to sentences containing conjunctions or disjunctions (Nancy examined an ant and/or a cloud) and looked at visual scenes containing two pictures that either matched or mismatched the nouns. Degree of match between nouns and pictures (referential anchoring) and between their expected and actual spatial positions (spatial anchoring) affected fixations as well as judgments. We conclude that language induces incremental processing of visual scenes, which in turn becomes susceptible to reasoning errors during the language-meaning verification process. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Percolator: Scalable Pattern Discovery in Dynamic Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhury, Sutanay; Purohit, Sumit; Lin, Peng

    We demonstrate Percolator, a distributed system for graph pattern discovery in dynamic graphs. In contrast to conventional mining systems, Percolator advocates efficient pattern mining schemes that (1) support pattern detection with keywords; (2) integrate incremental and parallel pattern mining; and (3) support analytical queries such as trend analysis. The core idea of Percolator is to dynamically decide and verify a small fraction of patterns and their instances that must be inspected in response to buffered updates in dynamic graphs, with a total mining cost independent of graph size. We demonstrate a) the feasibility of incremental pattern mining by walking through each component of Percolator, b) the efficiency and scalability of Percolator over the sheer size of real-world dynamic graphs, and c) how the user-friendly GUI of Percolator interacts with users to support keyword-based queries that detect, browse and inspect trending patterns. We also demonstrate two user cases of Percolator, in social media trend analysis and academic collaboration analysis, respectively.
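
    As a toy illustration of incremental pattern maintenance in a dynamic graph (this is not Percolator's algorithm, only a sketch of the general idea), the count of one fixed pattern, the triangle, can be updated per edge insertion by inspecting only the neighbourhoods of the new edge's endpoints rather than re-mining the whole graph:

      from collections import defaultdict

      class IncrementalTriangleCount:
          """Maintain a running triangle count while edges are inserted."""

          def __init__(self):
              self.adj = defaultdict(set)
              self.triangles = 0

          def add_edge(self, u, v):
              if u == v or v in self.adj[u]:
                  return self.triangles  # ignore self-loops and duplicate edges
              # Every common neighbour of u and v closes exactly one new triangle.
              self.triangles += len(self.adj[u] & self.adj[v])
              self.adj[u].add(v)
              self.adj[v].add(u)
              return self.triangles

      g = IncrementalTriangleCount()
      for edge in [(1, 2), (2, 3), (1, 3), (3, 4), (2, 4)]:
          print(edge, "->", g.add_edge(*edge))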

  16. Implementation of gamma-ray spectrometry in two real-time water monitors using NaI(Tl) scintillation detectors.

    PubMed

    Casanovas, R; Morant, J J; Salvadó, M

    2013-10-01

    In this study, the implementation of gamma-ray spectrometry in two real-time water monitors using 2 in. × 2 in. NaI(Tl) scintillation detectors is described. These monitors collect water from the river through a pump, and the water is analyzed in a Pb-shielded vessel. The full calibration of the monitors was performed experimentally, except for the efficiency curve, which was set using validated Monte Carlo simulations with the EGS5 code system. After the calibration, the monitors permitted the identification and quantification of the isotopes involved in a possible radioactive increment and made it possible to rule out possible leaks in the nuclear plants. As an example, a radiological increment during rain is used to show the advantages of gamma-ray spectrometry. To study the capabilities of the monitor, the minimum detectable activity concentrations for (131)I, (137)Cs and (40)K are presented for different integration times. Copyright © 2013 Elsevier Ltd. All rights reserved.
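
    Minimum detectable activity concentrations of this kind are commonly derived from the Currie detection limit; one widely used form (the abstract does not spell out the exact expression used) is

      \[
      \mathrm{MDA} = \frac{2.71 + 4.65\sqrt{B}}{\varepsilon\, p_{\gamma}\, t\, V},
      \]

    where \(B\) is the background count in the peak region accumulated over the integration time \(t\), \(\varepsilon\) the full-energy peak efficiency, \(p_{\gamma}\) the gamma emission probability and \(V\) the analyzed water volume; for a roughly constant background rate the MDA therefore falls approximately as \(1/\sqrt{t}\) with longer integration times.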

  17. Early Shear Failure Prediction in Incremental Sheet Forming Process Using FEM and ANN

    NASA Astrophysics Data System (ADS)

    Moayedfar, Majid; Hanaei, Hengameh; Majdi Rani, Ahmad; Musa, Mohd Azam Bin; Sadegh Momeni, Mohammad

    2018-03-01

    The application of the incremental sheet forming (ISF) process as a rapid forming technique is rising in a variety of industries, including aerospace, automotive and biomechanical applications. However, sheet failure is a major challenge in this process and wastes a large amount of material. Hence, this study proposes a method to predict early sheet failure in ISF using a mathematical approach. Design of experiments with the response surface method was employed to generate a set of experimental data for the simulation. The significant forming parameters were identified, and their combination was used for the prediction system. The results were then fed to an artificial neural network as input parameters to predict a wide range of applicable parameters that avoid sheet failure in ISF. An accuracy of R2 ~0.93 was obtained, and the maximum sheet stretch was recorded at a depth of 25 mm. Figures generated from the interaction trends between the effective parameters are provided for future studies.
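
    A minimal sketch of the ANN step (the parameter names and data are hypothetical, and the use of scikit-learn is an assumption for illustration; the paper does not name its toolkit):

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Hypothetical designed-experiment data: tool diameter (mm), step-down (mm)
      # and feed rate (mm/min); the target is the sheet failure depth (mm).
      X = np.array([[10, 0.2, 500], [10, 0.5, 1000], [12, 0.2, 1000],
                    [12, 0.5, 500], [8, 0.3, 750], [8, 0.5, 1250]], dtype=float)
      y = np.array([28.0, 22.5, 26.0, 24.0, 27.0, 21.0])

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X, y)
      print(model.predict([[11.0, 0.4, 900.0]]))  # predicted failure depth (mm)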

  18. 48 CFR 3452.232-71 - Incremental funding.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Incremental funding. 3452....232-71 Incremental funding. As prescribed in 3432.705-2, insert the following provision in solicitations if a cost-reimbursement contract using incremental funding is contemplated: Incremental Funding...

  19. 48 CFR 3452.232-71 - Incremental funding.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Incremental funding. 3452....232-71 Incremental funding. As prescribed in 3432.705-2, insert the following provision in solicitations if a cost-reimbursement contract using incremental funding is contemplated: Incremental Funding...

  20. 48 CFR 3452.232-71 - Incremental funding.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Incremental funding. 3452....232-71 Incremental funding. As prescribed in 3432.705-2, insert the following provision in solicitations if a cost-reimbursement contract using incremental funding is contemplated: Incremental Funding...

  1. 48 CFR 3452.232-71 - Incremental funding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Incremental funding. 3452....232-71 Incremental funding. As prescribed in 3432.705-2, insert the following provision in solicitations if a cost-reimbursement contract using incremental funding is contemplated: Incremental Funding...

  2. Health care resource use, health care expenditures and absenteeism costs associated with osteoarthritis in US healthcare system.

    PubMed

    Menon, J; Mishra, P

    2018-04-01

    We determined incremental health care resource utilization, incremental health care expenditures, incremental absenteeism, and incremental absenteeism costs associated with osteoarthritis. The Medical Expenditure Panel Survey (MEPS) for 2011 was used as the data source. Individuals 18 years or older and employed during 2011 were eligible for inclusion in the sample for analyses. Individuals with osteoarthritis were identified based on ICD-9-CM codes. Incremental health care resource utilization included annual hospitalization, hospital days, emergency room visits and outpatient visits. Incremental health expenditures included annual inpatient, outpatient, emergency room, medications, miscellaneous and annual total expenditures. Of the total sample, 1354 individuals were diagnosed with osteoarthritis and were compared with non-osteoarthritis individuals. Incremental resource utilization, expenditures, absenteeism and absenteeism costs were estimated using regression models, adjusting for age, gender, sex, region, marital status, insurance coverage, comorbidities, anxiety, asthma, hypertension and hyperlipidemia. Regression models revealed incremental mean annual resource use associated with osteoarthritis of 0.07 hospitalizations, equal to 7 additional hospitalizations per 100 osteoarthritic patients annually, and 3.63 outpatient visits, equal to 363 additional visits per 100 osteoarthritic patients annually. Mean annual incremental total expenditures associated with osteoarthritis were $2046. Annually, mean incremental expenditures were largest for inpatient expenditures at $826, followed by mean incremental outpatient expenditures of $659, and mean incremental medication expenditures of $325. Mean annual incremental absenteeism was 2.2 days and mean annual incremental absenteeism costs were $715.74. Total direct expenditures were estimated at $41.7 billion. Osteoarthritis was associated with significant incremental health care resource utilization, expenditures, absenteeism and absenteeism costs. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
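
    A schematic of the regression-adjustment step (illustrative only; the variable names are hypothetical and the paper's exact specification, for example the two-part or generalized linear models typically used with MEPS expenditure data, is not reproduced here):

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical person-level extract in the spirit of MEPS 2011.
      df = pd.DataFrame({
          "total_exp":    [1200, 5400, 800, 9800, 300, 7600],  # annual $ expenditures
          "oa":           [0, 1, 0, 1, 0, 1],                  # osteoarthritis indicator
          "age":          [45, 62, 39, 70, 29, 58],
          "female":       [1, 0, 1, 1, 0, 1],
          "hypertension": [0, 1, 0, 1, 0, 1],
      })

      model = smf.ols("total_exp ~ oa + age + female + hypertension", data=df).fit()
      print(model.params["oa"])  # adjusted incremental annual expenditure for OA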

  3. Integrated Aero-Propulsion CFD Methodology for the Hyper-X Flight Experiment

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.; Engelund, Walter C.; Bittner, Robert D.; Dilley, Arthur D.; Jentink, Tom N.; Frendi, Abdelkader

    2000-01-01

    Computational fluid dynamics (CFD) tools have been used extensively in the analysis and development of the X-43A Hyper-X Research Vehicle (HXRV). A significant element of this analysis is the prediction of integrated vehicle aero-propulsive performance, which includes an integration of aerodynamic and propulsion flow fields. This paper describes analysis tools used and the methodology for obtaining pre-flight predictions of longitudinal performance increments. The use of higher-fidelity methods to examine flow-field characteristics and scramjet flowpath component performance is also discussed. Limited comparisons with available ground test data are shown to illustrate the approach used to calibrate methods and assess solution accuracy. Inviscid calculations to evaluate lateral-directional stability characteristics are discussed. The methodology behind 3D tip-to-tail calculations is described and the impact of 3D exhaust plume expansion in the afterbody region is illustrated. Finally, future technology development needs in the area of hypersonic propulsion-airframe integration analysis are discussed.

  4. Filament wound data base development, revision 1, appendix A

    NASA Technical Reports Server (NTRS)

    Sharp, R. Scott; Braddock, William F.

    1985-01-01

    Data are presented in tabular form for the High Performance Nozzle Increments, Filament Wound Case (FWC) Systems Tunnel Increments, Steel Case Systems Tunnel Increments, FWC Stiffener Rings Increments, Steel Case Stiffener Rings Increments, FWC External Tank (ET) Attach Ring Increments, Steel Case ET Attach Ring Increments, and Data Tape 8. The High Performance Nozzle are also presented in graphical form. The tabular data consist of six-component force and moment coefficients as they vary with angle of attack at a specific Mach number and roll angle. The six coefficients are normal force, pitching moment, side force, yawing moment, axial force, and rolling moment. The graphical data for the High Performance Nozzle Increments consist of a plot of a coefficient increment as a function of angle of attack at a specific Mach number and at a roll angle of 0 deg.

  5. Building Single-Cell Models of Planktonic Metabolism Using PSAMM

    NASA Astrophysics Data System (ADS)

    Dufault-Thompson, K.; Zhang, Y.; Steffensen, J. L.

    2016-02-01

    Genome-scale models (GEMs) of metabolic networks simulate the metabolic activities of individual cells by integrating omics data with biochemical and physiological measurements. GEMs have been applied in the simulation of various photo-, chemo-, and heterotrophic organisms and provide significant insights into the function and evolution of planktonic cells. Despite the rapid accumulation of GEMs, challenges remain in assembling individual cell-based models into community-level models. Among various problems, the lack of consistency in model representation and model quality checking has hindered the integration of individual GEMs and can lead to erroneous conclusions in the development of new modeling algorithms. Here, we present the Portable System for the Analysis of Metabolic Models (PSAMM). Along with the software, a novel model representation format was developed to enhance the readability of model files and permit the inclusion of heterogeneous, model-specific annotation information. A number of quality checking procedures were also implemented in PSAMM to ensure stoichiometric balance and to identify unused reactions. Using a case study of Shewanella piezotolerans WP3, we demonstrated the application of PSAMM in simulating the coupling of carbon utilization and energy production pathways under low-temperature and high-pressure stress. Applying PSAMM, we have also analyzed over 50 GEMs in the current literature and released an updated collection of the models with corrections of a number of common inconsistencies. Overall, PSAMM opens up new opportunities for integrating individual GEMs for the construction and mathematical simulation of community-level models in the scope of entire ecosystems.
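
    As a toy illustration of one such consistency check (this is not PSAMM's implementation; the metabolite identifiers and formulas are hypothetical), elemental balance of a reaction can be verified by summing atom counts weighted by the stoichiometric coefficients:

      from collections import Counter

      # Elemental composition of each (hypothetical) metabolite.
      FORMULAS = {
          "glc": {"C": 6, "H": 12, "O": 6},
          "pyr": {"C": 3, "H": 4, "O": 3},
          "h2o": {"H": 2, "O": 1},
      }

      def is_element_balanced(stoichiometry):
          """Return True if, for every element, atoms consumed equal atoms produced.
          `stoichiometry` maps metabolite id -> coefficient (negative = consumed)."""
          balance = Counter()
          for met, coeff in stoichiometry.items():
              for element, count in FORMULAS[met].items():
                  balance[element] += coeff * count
          return all(total == 0 for total in balance.values())

      # "glc -> 2 pyr + 2 h2o" leaves the oxygen balance open, so this prints
      # False and the reaction would be flagged for review.
      print(is_element_balanced({"glc": -1, "pyr": 2, "h2o": 2}))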

  6. Adaptive nitrogen and integrated weed management in conservation agriculture: impacts on agronomic productivity, greenhouse gas emissions, and herbicide residues.

    PubMed

    Oyeogbe, Anthony Imoudu; Das, T K; Bhatia, Arti; Singh, Shashi Bala

    2017-04-01

    Increasing nitrogen (N) immobilization and weed interference in the early phase of implementation of conservation agriculture (CA) affect crop yields. Yet, higher fertilizer and herbicide use to improve productivity influences greenhouse gas emissions and herbicide residues. These tradeoffs precipitated a need for adaptive N and integrated weed management in a CA-based maize (Zea mays L.)-wheat [Triticum aestivum (L.) emend Fiori & Paol] cropping system in the Indo-Gangetic Plains (IGP) to optimize N availability and reduce weed proliferation. Adaptive N fertilization was based on soil test value and normalized difference vegetation index measurement (NDVM) by GreenSeeker™ technology, while integrated weed management included brown manuring (Sesbania aculeata L. co-culture, killed at 25 days after sowing), a herbicide mixture, and a weedy check (control, i.e., without weed management). Results indicated that the 'best-adaptive N rate' (i.e., 50% basal + 25% broadcast at 25 days after sowing + supplementary N guided by NDVM) increased maize and wheat grain yields by 20 and 14% (averaged over 2 years), respectively, compared with the whole recommended N applied at sowing. Weed management by brown manuring (during maize) and herbicide mixture (during wheat) resulted in 10 and 21% higher grain yields (averaged over 2 years), respectively, over the weedy check. The NDVM-guided in-season N fertilization and brown manuring affected N2O and CO2 emissions, but resulted in improved carbon storage efficiency, while herbicide residues in soil were significantly lower in the maize season than in wheat cropping. This study concludes that adaptive N and integrated weed management enhance the synergy between agronomic productivity, fertilizer and herbicide efficiency, and greenhouse gas mitigation.

  7. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates

    NASA Astrophysics Data System (ADS)

    Pelowski, Matthew; Markey, Patrick S.; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics including provocative reactions (chills, awe, thrills, sublime) and difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research.

  8. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates.

    PubMed

    Pelowski, Matthew; Markey, Patrick S; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics including provocative reactions-chills, awe, thrills, sublime-and difference between "aesthetic" and "everyday" emotional response. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. KSC-2013-3077

    NASA Image and Video Library

    2013-07-22

    HOUSTON - JSC2013e068259 - NASA astronaut Serena Aunon prepares for a fit check evaluation of The Boeing Company's CST-100 spacecraft at the company's Houston Product Support Center. Assisting her is Andrea Gilkey, a human factors engineer with The Boeing Company. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  10. Large deflection elastic-plastic dynamic response of stiffened shells of revolution

    NASA Technical Reports Server (NTRS)

    Stricklin, J. A.; Haisler, W. E.; Vonriesemann, W. A.; Leick, R. D.; Hunsaker, B.; Saczalski, K. J.

    1972-01-01

    The formulation and check-out problems for the computer code DYNAPLAS, which analyzes the large-deflection elastic-plastic dynamic response of stiffened shells of revolution, are presented. Spatial discretization is by the finite element method, with finite differences used for the evaluation of the pseudo forces due to material and geometric nonlinearities. Time integration is by the Houbolt method. The stiffeners may be concentrated or distributed eccentric rings and spring supports at arbitrary angles around the circumference of the elements. Check-out problems include the comparison of solutions from DYNAPLAS with experimental and other computer solutions for rings, conical and cylindrical shells, and a curved panel. A hypothetical submarine including stiffeners and a missile tube is studied under a combination of hydrostatic and dynamically applied asymmetrical pressure loadings.
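
    For reference, the Houbolt scheme advances the semi-discrete equations of motion using third-order backward differences; in its standard form (notation ours, not the report's):

      \[
      \ddot{u}_{t+\Delta t} = \frac{1}{\Delta t^{2}}\left(2u_{t+\Delta t} - 5u_{t} + 4u_{t-\Delta t} - u_{t-2\Delta t}\right),
      \qquad
      \dot{u}_{t+\Delta t} = \frac{1}{6\,\Delta t}\left(11u_{t+\Delta t} - 18u_{t} + 9u_{t-\Delta t} - 2u_{t-2\Delta t}\right),
      \]

    which, substituted into \(M\ddot{u} + C\dot{u} + Ku = F\) with the pseudo forces carried on the right-hand side, gives an implicit, unconditionally stable step for \(u_{t+\Delta t}\).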

  11. Evaluation of supercapacitors for space applications under thermal vacuum conditions

    NASA Astrophysics Data System (ADS)

    Chin, Keith C.; Green, Nelson W.; Brandon, Erik J.

    2018-03-01

    Commercially available supercapacitor cells from three separate vendors were evaluated for use in a space environment using thermal vacuum (Tvac) testing. Standard commercial cells are not hermetically sealed, but feature crimp or double seam seals between the header and the can, which may not maintain an adequate seal under vacuum. Cells were placed in a small vacuum chamber, and cycled between three separate temperature set points. Charging and discharging of cells was executed following each temperature soak, to confirm there was no significant impact on performance. A final electrical performance check, visual inspection and mass check following testing were also performed, to confirm the integrity of the cells had not been compromised during exposure to thermal cycling under vacuum. All cells tested were found to survive this testing protocol and exhibited no significant impact on electrical performance.

  12. Fast Integer Ambiguity Resolution for GPS Attitude Determination

    NASA Technical Reports Server (NTRS)

    Lightsey, E. Glenn; Crassidis, John L.; Markley, F. Landis

    1999-01-01

    In this paper, a new algorithm for GPS (Global Positioning System) integer ambiguity resolution is shown. The algorithm first incorporates an instantaneous (static) integer search to significantly reduce the search space using a geometric inequality. Then a batch-type loss function is used to check the remaining integers in order to determine the optimal integer. This batch function represents the GPS sightline vectors in the body frame as the sum of two vectors, one depending on the phase measurements and the other on the unknown integers. The new algorithm has several advantages: it does not require an a-priori estimate of the vehicle's attitude; it provides an inherent integrity check using a covariance-type expression; and it can resolve the integers even when coplanar baselines exist. The performance of the new algorithm is tested on a dynamic hardware simulator.
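
    The underlying measurement model can be summarized as follows (a standard form for GPS interferometric attitude determination; the notation is ours, not taken from the paper):

      \[
      \lambda\left(\Delta\phi_{ij} + n_{ij}\right) = \mathbf{b}_i^{\mathsf T} A\,\mathbf{s}_j + v_{ij},
      \]

    where \(\Delta\phi_{ij}\) is the measured fractional phase difference across baseline \(\mathbf{b}_i\) (body frame) for sightline \(\mathbf{s}_j\) (reference frame), \(\lambda\) is the carrier wavelength, \(A\) the attitude matrix, \(n_{ij}\) the unknown integer ambiguity and \(v_{ij}\) measurement noise; once the integers \(n_{ij}\) are resolved, the attitude can be estimated directly from the phase data.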

  13. Integration of moral values during L2 sentence processing.

    PubMed

    Foucart, Alice; Moreno, Eva; Martin, Clara D; Costa, Albert

    2015-11-01

    This study reports an event-related potential (ERP) experiment examining whether valuation (i.e., one's own values) is integrated incrementally and whether it affects L2 speakers' online interpretation of the sentence. We presented Spanish native speakers and French-Spanish mid-proficiency late L2 speakers with visual sentences containing value-consistent and value-inconsistent statements (e.g., 'Nowadays, paedophilia should be prohibited/tolerated across the world.'). Participants' brain activity was recorded as they were reading the sentences and indicating whether they agreed with the statements or not. Behaviourally, the two groups revealed identical valuation. The ERP analyses showed both a semantic (N400) and an affect-related response (LPP) to value-inconsistent statements in the native group, but only an LPP in the non-native group. These results suggest that valuation is integrated online (presence of LPP) during L2 sentence comprehension but that it does not interfere with semantic processing (absence of N400).

  14. Tempered fractional calculus

    NASA Astrophysics Data System (ADS)

    Sabzikar, Farzad; Meerschaert, Mark M.; Chen, Jinghua

    2015-07-01

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered fractional difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.
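
    In one common convention, the tempering described here amounts to replacing the power-law kernel of the fractional integral by an exponentially damped power law (our notation):

      \[
      \mathbb{I}^{\alpha,\lambda} f(t) = \frac{1}{\Gamma(\alpha)}\int_{-\infty}^{t}\left(t-s\right)^{\alpha-1} e^{-\lambda\left(t-s\right)} f(s)\, ds, \qquad \alpha > 0,\ \lambda > 0,
      \]

    which reduces to the ordinary fractional integral as \(\lambda \to 0\); the corresponding tempered fractional derivatives are defined so as to invert this operator.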

  15. TEMPERED FRACTIONAL CALCULUS.

    PubMed

    Meerschaert, Mark M; Sabzikar, Farzad; Chen, Jinghua

    2015-07-15

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.

  16. TEMPERED FRACTIONAL CALCULUS

    PubMed Central

    MEERSCHAERT, MARK M.; SABZIKAR, FARZAD; CHEN, JINGHUA

    2014-01-01

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series. PMID:26085690

  17. Tempered fractional calculus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabzikar, Farzad, E-mail: sabzika2@stt.msu.edu; Meerschaert, Mark M., E-mail: mcubed@stt.msu.edu; Chen, Jinghua, E-mail: cjhdzdz@163.com

    2015-07-15

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered fractional difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.

  18. Theoretical modeling and experimental analysis of solar still integrated with evacuated tubes

    NASA Astrophysics Data System (ADS)

    Panchal, Hitesh; Awasthi, Anuradha

    2017-06-01

    In the present research work, theoretical modeling of a single-slope, single-basin solar still integrated with evacuated tubes has been performed based on energy balance equations. Major variables such as water temperature, inner glass cover temperature and distillate output have been computed from the theoretical model. The experimental setup was built from locally available materials and installed at Gujarat Power Engineering and Research Institute, Mehsana, Gujarat, India (23.5880°N, 72.3693°E) with a 0.04 m water depth over a 6-month period. The series of experiments showed a considerable increase in the average distillate output of the solar still when integrated with evacuated tubes, not only during the daytime but also at night. In all experimental cases, the coefficient of correlation (r) and the root mean square percentage deviation (e) between the theoretical model and the experimental results showed good agreement, with 0.97 < r < 0.98 and 10.22% < e < 38.4%, respectively.
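
    The agreement statistics quoted here are conventionally computed as follows (a standard formulation in the solar-still literature; the paper's own notation may differ):

      \[
      r = \frac{N\sum X_i Y_i - \sum X_i \sum Y_i}{\sqrt{N\sum X_i^{2} - \left(\sum X_i\right)^{2}}\,\sqrt{N\sum Y_i^{2} - \left(\sum Y_i\right)^{2}}},
      \qquad
      e = \sqrt{\frac{1}{N}\sum_{i=1}^{N} e_i^{2}}, \quad e_i = \frac{X_i - Y_i}{X_i}\times 100,
      \]

    where \(X_i\) are the theoretical and \(Y_i\) the experimental values of a given variable over the \(N\) observations.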

  19. Plane elasto-plastic analysis of v-notched plate under bending by boundary integral equation method. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Rzasnicki, W.

    1973-01-01

    A method of solution is presented, which, when applied to the elasto-plastic analysis of plates having a v-notch on one edge and subjected to pure bending, will produce stress and strain fields in much greater detail than presently available. Application of the boundary integral equation method results in two coupled Fredholm-type integral equations, subject to prescribed boundary conditions. These equations are replaced by a system of simultaneous algebraic equations and solved by a successive approximation method employing Prandtl-Reuss incremental plasticity relations. The method is first applied to a number of elasto-static problems and the results are compared with available solutions. Good agreement is obtained in all cases. The elasto-plastic analysis provides detailed stress and strain distributions for several cases of plates with various notch angles and notch depths. A strain hardening material is assumed and both plane strain and plane stress conditions are considered.
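
    The incremental plasticity relations referred to are, in their usual form (notation ours):

      \[
      d\varepsilon_{ij} = d\varepsilon_{ij}^{e} + d\varepsilon_{ij}^{p},
      \qquad
      d\varepsilon_{ij}^{p} = s_{ij}\, d\lambda,
      \qquad
      s_{ij} = \sigma_{ij} - \tfrac{1}{3}\sigma_{kk}\,\delta_{ij},
      \]

    so that each plastic strain increment is proportional to the current deviatoric stress, with the scalar \(d\lambda \ge 0\) determined from the yield condition and hardening rule at every load increment of the successive approximation.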

  20. Integration of textile fabric and coconut shell in particleboard

    NASA Astrophysics Data System (ADS)

    Misnon, M. I.; Bahari, S. A.; Islam, M. M.; Epaarachchi, J. A.

    2013-08-01

    In this study, cotton fabric and coconut shell were integrated into particleboard to reduce the use of wood. Particleboards containing mixed rubberwood and coconut shell in an equal weight ratio were combined with various layers of cotton fabric. These materials were bonded with urea formaldehyde at a content level of 12% by weight. Flexural and water absorption tests were conducted to analyze the mechanical properties and dimensional stability. The flexural test results showed at least a doubling of strength values in the fabricated materials compared with the control sample. The presence of fabric in the particleboard system also improved the dimensional stability of the produced material; an improvement of at least 39% in water absorption contributed to this dimensional stability. Overall, these new particleboards showed better results with the incorporation of cotton fabric layers, and this study provides a better understanding of the mechanical and physical properties of the fabricated particleboard.

  1. [Comprehensive quality management in hospitals--experience and recommendations].

    PubMed

    Schubert, H J

    1999-03-01

    Total quality management concepts, increasingly being introduced into hospitals, offer opportunities for integrative leadership concepts because of their multidimensional character, viewed both from the aspect of results and from the standpoint of organisational design. Tailored to the leadership and organisation of hospitals in Germany, questions of introduction strategies as well as recommendations for integrating a total quality understanding into the daily practice of management and employees are discussed. The active support of top and middle management, and a combination of radical change in selected problem areas with continual incremental improvements on a broad base, have proven to be significant success factors in the introductory phase. For a lasting integration of the principles of a comprehensive quality management concept in a hospital, it will be necessary to carry out relevant measurements of success on a regular basis. The results become an important part of agreements with management.

  2. Automatic system for ionization chamber current measurements.

    PubMed

    Brancaccio, Franco; Dias, Mauro S; Koskinas, Marina F

    2004-12-01

    The present work describes an automatic system developed for current-integration measurements at the Laboratório de Metrologia Nuclear of the Instituto de Pesquisas Energéticas e Nucleares. This system includes software (graphical user interface and control) and a module connected to a microcomputer by means of a commercial data acquisition card. Measurements were performed in order to check the performance and to validate the proposed design.
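
    At its core, a current-integration measurement accumulates charge from sampled ionization-chamber current. The sketch below illustrates the idea with the trapezoidal rule; the sampling interval and data values are invented for illustration and are not those of the described system.

    ```python
    import numpy as np

    def integrate_current(current_a, dt_s):
        """Accumulate charge (coulombs) from equally spaced current samples
        with the trapezoidal rule, as a current-integration measurement does."""
        current_a = np.asarray(current_a, dtype=float)
        return dt_s * 0.5 * np.sum(current_a[:-1] + current_a[1:])

    # Hypothetical samples: a constant 1 nA measured every 0.5 s for 10 s.
    samples = np.full(21, 1e-9)
    print(f"collected charge: {integrate_current(samples, 0.5):.3e} C")  # ~1.0e-08 C
    ```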

  3. Experience with Adaptive Security Policies.

    DTIC Science & Technology

    1998-03-01

    (Contents excerpt: 3.2 Logical Groupings of audited permission checks; 3.3 Auditing of system servers via microkernel snooping.) … performed by servers other than the microkernel. Since altering each server to audit events would complicate the integration of new servers, a … modification to the microkernel was implemented to allow the microkernel to audit the requests made of other servers. Both methods for enhancing audit …

  4. ISS Expedition 42 / 43 Soyuz Spacecraft and Crew Preparations for Launch

    NASA Image and Video Library

    2014-11-26

    NASA TV (NTV) video file of crewmembers Terry Virts, Anton Shkaplerov (Roskosmos) and Samantha Cristoforetti (ESA) during the final fit check of the Soyuz TMA-15M spacecraft at the Integration Facility, Baikonur, Kazakhstan. Includes footage of the crew climbing into the Soyuz spacecraft, interviews, a visit to a museum where the crew sign posters and a flag, a flag-raising ceremony, and a visit to the mating facility.

  5. jsc2016e109765

    NASA Image and Video Library

    2016-09-09

    At the Integration Facility at the Baikonur Cosmodrome in Kazakhstan, Expedition 49 crewmember Shane Kimbrough of NASA undergoes a pressure test on his Sokol launch and entry suit Sept. 9 during a pre-launch training fit check. Kimbrough and Sergey Ryzhikov and Andrey Borisenko of Roscosmos will launch Sept. 24, Kazakh time on the Soyuz MS-02 vehicle for a five-month mission on the International Space Station. NASA/Victor Zelentsov

  6. International Workshop on Discrete Time Domain Modelling of Electromagnetic Fields and Networks (2nd) Held in Berlin, Germany on October 28-29, 1993

    DTIC Science & Technology

    1993-10-29

    … natural logarithm of the ratio of two maxima a period apart. Both methods are based on the results from the numerical integration. The details of this … check and okay member functions are for software handshaking between the client and server process. Finally, the Forward function is used to initiate a …

  7. An Empirical Methodology for Engineering Human Systems Integration

    DTIC Science & Technology

    2009-12-01

    … scanning critical information and selectively skipping what is not important for the immediate task. This learned skill is called a proper "cross check" … ability to recover from errors. Efficiency: the level of productivity that can be achieved once learning has occurred. Learnability: the … use, to legacy systems. Knowing what design issues have plagued past operators can guide the generation of requirements to address the identified …

  8. Report of the key-comparison of spectral diffuse reflectance (EURAMET.PR-K5) (Ref. 619)

    NASA Astrophysics Data System (ADS)

    Andor, György; Gál, Péter

    2018-01-01

    This report details the final results of the EURAMET comparison of spectral diffuse reflectance carried out between 2006 and 2016. The aim of this comparison was to check the agreement of measurements of spectral diffuse reflectance among participants, using the measurement geometry of d/0 or 0/d in the wavelength range of 360 nm to 780 nm in 20 nm increments. A star-type comparison was used: first the participants sent their samples to the pilot, then the pilot measured all the samples of the participants and sent them back. The participants measured the samples and sent them to the pilot for control measurement. Six standards were used as reference standards in order to maintain the scale during the comparison. These were three samples of BCR-406 opal glasses (BCR 30506; BCR 30303; BCR 30704), an MC20 Russian opal glass (MC 4777) and two samples made of pressed halon (polytetrafluoroethylene) powder (halon 2007A; halon 2007C). These six samples were designated as the comparison reference standards. The diffuse reflectance was initially measured on the OMH (BFKH) absolute reflectometer. BFKH provided the link to the CCPR-K5 results, and the check on BFKH was provided by the results of PTB, which also participated in the CCPR-K5 comparison. The participants were GUM, INM, LNE, METAS, BFKH, PTB and SP. The main text of this report appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCPR, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  9. Exploring correlates of diabetes-related stress among adults with Type 1 diabetes in the T1D exchange clinic registry.

    PubMed

    Boden, Matthew Tyler; Gala, Sasha

    2018-04-01

    To explore relations between diabetes-related stress and multiple sociodemographic, diabetes health, other health, and treatment-related variables among a large sample of adults with Type 1 Diabetes (T1D). The sample consisted of 10,821 adults (over 18 years old) enrolled in the T1D Exchange Clinic Registry. The T1D Exchange clinic network consists of 67 diabetes clinical centers throughout the United States selected to broadly represent pediatric and adult patients with T1D. Variables were assessed through participant self-report and extraction of clinic chart data. Univariate and multiple linear regression (with simultaneous entry of all predictors) analyses were conducted. Robustly associated with increased diabetes-related stress across analyses were multiple sociodemographic (female [vs. male], native Hawaiian/other Pacific islander [vs. white/Caucasian], decreased age and diabetes duration), diabetes health (higher HbA1c), other health (lower general health, presence of major life stress and depression, less physical activity), and treatment-related variables (use of injections/pen or combination injection/pen/pump [vs. pump], use of CGM, increased frequency of missing insulin doses and BG checking, decreased frequency of BG checking prior to bolus, receipt of mental health treatment). We replicated and extended research demonstrating that diabetes-related stress among people with T1D occurs at higher levels among those with particular sociodemographic characteristics and is associated with a range of poorer diabetes health and other health variables, and multiple treatment-related variables. The strong incremental prediction of diabetes-related stress by multiple variables in our study suggests that a multi-variable, personalized approach may increase the effectiveness of treatments for diabetes-related stress. Published by Elsevier B.V.

  10. Appraising the value of independent EIA follow-up verifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za; Retief, Francois, E-mail: francois.retief@nwu.ac.za; Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities arising from EIA follow-up through the appointment of independent verifiers. - Highlights: • A framework for appraising the role of independent verifiers is established. • The value added to EIA follow-up by independent verifiers in South Africa is documented. • Verifiers add most value when involved with screening, checking compliance, influencing decisions and community engagement. • Verifiers could be more creatively utilized in pre-construction preparation, giving feedback, and performance evaluation.

  11. Grand challenges in mass storage: A system integrator's perspective

    NASA Technical Reports Server (NTRS)

    Mintz, Dan; Lee, Richard

    1993-01-01

    The grand challenges are the following: to develop more innovation in approach; to expand the I/O barrier; to achieve increased volumetric efficiency and incremental cost improvements; to reinforce the 'weakest link' software; to implement improved architectures; and to minimize the impact of self-destructing technologies. Mass storage is defined as any type of storage system exceeding 100 GBytes in total size, under the control of a centralized file management scheme. The topics covered are presented in viewgraph form.

  12. The Implementation of Payload Safety in an Operational Environment

    NASA Technical Reports Server (NTRS)

    Cissom, R. D.; Horvath, Tim J.; Watson, Kristi S.; Rogers, Mark N. (Technical Monitor); Vanhooser, T. (Technical Monitor)

    2002-01-01

    The objective of this paper is to define the safety life-cycle process for a payload beginning with the output of the Payload Safety Review Panel and continuing through the life of the payload on-orbit. It focuses on the processes and products of the operations safety implementation through the increment preparations and real-time operations processes. In addition, the paper addresses the role of the Payload Operations and Integration Center and the interfaces to the International Partner Payload Control Centers.

  13. A catalogue of normalized intensity functions and polarization from a cloud of particles with a size distribution of alpha to the minus 4th power

    NASA Technical Reports Server (NTRS)

    Craven, P. D.; Gary, G. A.

    1972-01-01

    The Mie theory of light scattering by spheres was used to calculate the scattered intensity functions resulting from single scattering in a polydispersed collection of spheres. The distribution used behaves according to the inverse fourth power law; graphs and tables for the angular dependence of the intensity and polarization for this law are given. The effects of the particle size range and the integration increment are investigated.
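
    The sensitivity to the integration increment can be illustrated generically: evaluate the same angular integral with successively finer steps and watch the result converge. The integrand below is a hypothetical placeholder, not the Mie intensity functions tabulated in the report.

    ```python
    import numpy as np

    def angular_integral(intensity, n_steps):
        """Integrate intensity(theta) * sin(theta) over [0, pi] with the
        trapezoidal rule using n_steps increments."""
        theta = np.linspace(0.0, np.pi, n_steps + 1)
        values = intensity(theta) * np.sin(theta)
        h = np.pi / n_steps
        return h * (0.5 * values[0] + values[1:-1].sum() + 0.5 * values[-1])

    # Hypothetical, strongly forward-peaked "phase function" as a stand-in.
    intensity = lambda t: np.exp(-(t / 0.2) ** 2)

    for n in (10, 50, 250, 1250):  # refining the integration increment
        print(n, angular_integral(intensity, n))
    ```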

  14. Grid Integration of Single Stage Solar PV System using Three-level Voltage Source Converter

    NASA Astrophysics Data System (ADS)

    Hussain, Ikhlaq; Kandpal, Maulik; Singh, Bhim

    2016-08-01

    This paper presents a single-stage solar PV (photovoltaic) grid-integrated power generating system using a three-level voltage source converter (VSC) operating at a low switching frequency of 900 Hz with a robust synchronizing phase-locked loop (RS-PLL) based control algorithm. To track the maximum power from the solar PV array, an incremental conductance algorithm is used, and this maximum power is fed to the grid via the three-level VSC. The use of a single-stage system with a three-level VSC offers the advantage of low switching losses and operation at high voltages and high power, which results in enhanced power quality in the proposed system. Simulated results validate the design and control algorithm under steady-state and dynamic conditions.
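
    The incremental conductance algorithm mentioned here compares the array's incremental conductance dI/dV with its instantaneous conductance -I/V to decide which way to move the operating voltage. A minimal sketch of that decision rule follows; the step size and the way measurements are obtained are assumptions, not the paper's implementation.

    ```python
    def incremental_conductance_step(v, i, v_prev, i_prev, v_ref, step=1.0):
        """Return an updated voltage reference for MPPT.

        At the maximum power point dP/dV = 0, i.e. dI/dV = -I/V.
        Left of the MPP, dI/dV > -I/V (increase V); right of it, decrease V.
        """
        dv, di = v - v_prev, i - i_prev
        if dv == 0:
            if di > 0:
                v_ref += step
            elif di < 0:
                v_ref -= step
        else:
            inc_cond = di / dv
            if inc_cond > -i / v:
                v_ref += step
            elif inc_cond < -i / v:
                v_ref -= step
        return v_ref  # unchanged when the MPP condition is already met
    ```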

  15. Performance of an optical encoder based on a nondiffractive beam implemented with a specific photodetection integrated circuit and a diffractive optical element.

    PubMed

    Quintián, Fernando Perez; Calarco, Nicolás; Lutenberg, Ariel; Lipovetzky, José

    2015-09-01

    In this paper, we study the incremental signal produced by an optical encoder based on a nondiffractive beam (NDB). The NDB is generated by means of a diffractive optical element (DOE). The detection system is composed of an application-specific integrated circuit (ASIC) sensor. The sensor consists of an array of eight concentric annular photodiodes, each one provided with a programmable gain amplifier. In this way, the system is able to synthesize a nonuniform detectivity. The contrast, amplitude, and harmonic content of the sinusoidal output signal are analyzed. The influence of crosstalk among the annular photodiodes is demonstrated through the dependence of the signal contrast on the wavelength.

  16. Integrating technology to improve medication administration.

    PubMed

    Prusch, Amanda E; Suess, Tina M; Paoletti, Richard D; Olin, Stephen T; Watts, Starann D

    2011-05-01

    The development, implementation, and evaluation of an i.v. interoperability program to advance medication safety at the bedside are described. I.V. interoperability integrates intelligent infusion devices (IIDs), the bar-code-assisted medication administration system, and the electronic medication administration record system into a bar-code-driven workflow that populates provider-ordered, pharmacist-validated infusion parameters on IIDs. The purpose of this project was to improve medication safety through the integration of these technologies and decrease the potential for error during i.v. medication administration. Four key phases were essential to developing and implementing i.v. interoperability: (a) preparation, (b) i.v. interoperability pilot, (c) preliminary validation, and (d) expansion. The establishment of pharmacy involvement in i.v. interoperability resulted in two additional safety checks: pharmacist infusion rate oversight and nurse independent validation of the autoprogrammed rate. After instituting i.v. interoperability, monthly compliance to the telemetry drug library increased to a mean ± S.D. of 72.1% ± 2.1% from 56.5% ± 1.5%, and the medical-surgical nursing unit's drug library monthly compliance rate increased to 58.6% ± 2.9% from 34.1% ± 2.6% (p < 0.001 for both comparisons). The number of manual pump edits decreased with both telemetry and medical-surgical drug libraries, demonstrating a reduction from 56.9 ± 12.8 to 14.2 ± 3.9 and from 61.2 ± 15.4 to 14.7 ± 3.8, respectively (p < 0.001 for both comparisons). Through the integration and incorporation of pharmacist oversight for rate changes, the telemetry and medical-surgical patient care areas demonstrated a 32% reduction in reported monthly errors involving i.v. administration of heparin. By integrating two stand-alone technologies, i.v. interoperability was implemented to improve medication administration. Medication errors were reduced, nursing workflow was simplified, and pharmacists became involved in checking infusion rates of i.v. medications.

  17. SU-E-T-110: An Investigation On Monitor Unit Threshold and Effects On IMPT Delivery in Proton Pencil Beam Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syh, J; Ding, X; Syh, J

    2015-06-15

    Purpose: An approved proton pencil beam scanning (PBS) treatment plan might not be deliverable because of extremely low monitor units on some beam spots. A hybrid plan balancing the efficiency of higher spot monitor units against the efficacy of fewer energy layers was searched for and optimized. The range of monitor unit threshold settings was investigated, and plan quality was evaluated by target dose conformity. Methods: Certain limitations and requirements need to be checked and tested before a nominal proton PBS treatment plan can be delivered. The plan needs to meet the machine characterization and the specification in the record-and-verify system in order to deliver the beams. A minimal threshold of monitor units per spot, e.g. 0.02, was set to filter out the low counts, and the plan was re-computed. Further MU threshold increments were tested in sequence without sacrificing plan quality. The number of energy layers was also altered owing to the elimination of low-count layer(s). Results: The minimal MU/spot threshold, the spot spacing in each energy layer, the total number of energy layers and the MU weighting of the beam spots of each beam were evaluated. The plan was optimized by trading off an increase of spot MU (efficiency) against fewer energy layers of delivery (efficacy). A 5% weighting limit of total monitor units per beam was feasible. Sparse spreading of beam spots was acceptable as long as target dose conformity remained within the 3% criterion. Conclusion: Each spot size is equivalent to the relative dose in the beam delivery system. The energy layer is associated with the depth of the targeted tumor. Our work is crucial to maintaining the best possible quality plan. Keeping the integrity of all intrinsic elements, such as spot size, spot number, layer number and the weighting carried by the spots in each layer, is important in this study.
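
    The preprocessing step at the heart of this work, i.e. dropping beam spots whose monitor units fall below a deliverable threshold and rescaling the survivors, can be sketched as follows. The data layout, threshold value and renormalization rule are illustrative assumptions, not the planning system's actual interface.

    ```python
    def filter_low_mu_spots(spots, mu_threshold=0.02):
        """Remove spots below the MU threshold and rescale the survivors so
        the total MU of the field is preserved.

        `spots` is a list of (energy_layer, mu) pairs; energy layers emptied
        by the filter simply disappear, reducing the number of layers to deliver.
        """
        total_mu = sum(mu for _, mu in spots)
        kept = [(layer, mu) for layer, mu in spots if mu >= mu_threshold]
        kept_mu = sum(mu for _, mu in kept)
        if kept_mu == 0:
            raise ValueError("all spots fell below the MU threshold")
        scale = total_mu / kept_mu
        return [(layer, mu * scale) for layer, mu in kept]

    # Hypothetical field: three energy layers, one spot below the threshold.
    field = [(1, 0.010), (1, 0.350), (2, 0.200), (3, 0.045)]
    print(filter_low_mu_spots(field))
    ```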

  18. Differential Galois theory and non-integrability of planar polynomial vector fields

    NASA Astrophysics Data System (ADS)

    Acosta-Humánez, Primitivo B.; Lázaro, J. Tomás; Morales-Ruiz, Juan J.; Pantazi, Chara

    2018-06-01

    We study a necessary condition for the integrability of polynomial vector fields in the plane by means of differential Galois theory. More concretely, by means of the variational equations around a particular solution, a necessary condition for the existence of a rational first integral is obtained. The method is systematic, starting with the first-order variational equation. We illustrate this result with several families of examples. A key point is to check whether a suitable primitive is elementary or not. Using a theorem by Liouville, the problem is equivalent to the existence of a rational solution of a certain first-order linear equation, the Risch equation. This is a classical problem studied by Risch in 1969, and the solution is given by the "Risch algorithm". In this way we point out the connection of non-integrability with some higher transcendental functions, like the error function.
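
    The connection with the error function can be made concrete: asking a computer algebra system for a primitive of exp(-x**2) returns an expression in erf, i.e. the primitive is not elementary. The snippet below (using SymPy) only illustrates the kind of elementarity check involved; it is not the paper's Risch-equation computation.

    ```python
    import sympy as sp

    x = sp.symbols('x')
    primitive = sp.integrate(sp.exp(-x**2), x)
    print(primitive)              # sqrt(pi)*erf(x)/2
    print(primitive.has(sp.erf))  # True: the primitive is not elementary
    ```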

  19. China’s new-age small farms and their vertical integration: agribusiness or co-ops?

    PubMed

    Huang, Philip C C

    2011-01-01

    The future of Chinese agriculture lies not with large mechanized farms but with small capital-labor dual intensifying family farms for livestock-poultry-fish raising and vegetable-fruit cultivation. Chinese food consumption patterns have been changing from the old 8:1:1 pattern of 8 parts grain, 1 part meat, and 1 part vegetables to a 4:3:3 pattern, with a corresponding transformation in agricultural structure. Small family-farming is better suited for the new-age agriculture, including organic farming, than large-scale mechanized farming, because of the intensive, incremental, and variegated hand labor involved, not readily open to economies of scale, though compatible with economies of scope. It is also better suited to the realities of severe population pressure on land. But it requires vertical integration from cultivation to processing to marketing, albeit without horizontal integration for farming. It is against such a background that co-ops have arisen spontaneously for integrating small farms with processing and marketing. The Chinese government, however, has been supporting aggressively capitalistic agribusinesses as the preferred mode of vertical integration. At present, Chinese agriculture is poised at a crossroads, with the future organizational mode for vertical integration as yet uncertain.

  20. Security in the CernVM File System and the Frontier Distributed Database Caching System

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
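
    Both systems rely on the same basic pattern: content is distributed together with a secure hash and a digital signature over that hash, and the client recomputes and verifies both before trusting the data. The sketch below shows that generic pattern only; the use of the Python `cryptography` package, RSA with PKCS#1 v1.5 padding, and the function names are assumptions for illustration, not the CVMFS or Frontier implementations.

    ```python
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding

    def verify_content(content: bytes, expected_sha256: str,
                       signature: bytes, public_key) -> bool:
        """Check integrity (hash of the content) and authenticity
        (publisher's signature over that hash)."""
        digest = hashlib.sha256(content).hexdigest()
        if digest != expected_sha256:
            return False                  # content corrupted or altered
        try:
            public_key.verify(signature,
                              expected_sha256.encode(),
                              padding.PKCS1v15(),
                              hashes.SHA256())
            return True
        except InvalidSignature:
            return False                  # hash not signed by the publisher
    ```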

  1. Assembly and method for testing the integrity of stuffing tubes

    DOEpatents

    Morrison, Edward Francis

    1997-01-01

    A stuffing tube integrity checking assembly includes first and second annular seals, with each seal adapted to be positioned about a stuffing tube penetration component. An annular inflation bladder is provided, the bladder having a slot extending longitudinally therealong and including a separator for sealing the slot. A first valve is in fluid communication with the bladder for introducing pressurized fluid to the space defined by the bladder when mounted about the tube. First and second releasable clamps are provided. Each clamp assembly is positioned about the bladder for securing the bladder to one of the seals, thereby establishing a fluid-tight chamber about the tube.

  2. STS-2 - SOFTWARE INTEGRATION TESTS (SIT) - KSC

    NASA Image and Video Library

    1981-09-01

    S81-36331 (24 Aug. 1981) --- Astronauts Joe H. Engle, left, and Richard H. Truly pause before participating in the integrated test of the assembled space shuttle components scheduled for launch no earlier than Sept. 30, 1981. Moments later, Engle, STS-2 crew commander, and Truly, pilot, entered the cabin of the orbiter Columbia for a mission simulation. The shuttle integrated tests (SIT) are designed to check out every connection and signal path in the STS-2 vehicle composed of the orbiter, two solid rocket boosters (SRB) and an external fuel tank (ET) for Columbia's main engines. Completion of the tests will clear the way for preparations for rollout to Pad A at Launch Complex 39, scheduled for the latter part of August or early September. Photo credit: NASA

  3. Numerical modeling of axi-symmetrical cold forging process by ``Pseudo Inverse Approach''

    NASA Astrophysics Data System (ADS)

    Halouani, A.; Li, Y. M.; Abbes, B.; Guo, Y. Q.

    2011-05-01

    The incremental approach is widely used for forging process modeling; it gives good strain and stress estimation, but it is time consuming. A fast Inverse Approach (IA) has been developed for axi-symmetric cold forging modeling [1-2]. This approach makes maximum use of the knowledge of the final part's shape, and the assumptions of proportional loading and simplified tool actions make the IA simulation very fast. The IA has proved very useful for tool design and optimization because of its rapidity and good strain estimation. However, the assumptions mentioned above cannot provide good stress estimation because the loading history is neglected. A new approach called the "Pseudo Inverse Approach" (PIA) was proposed by Batoz, Guo et al. [3] for sheet forming modeling, which keeps the IA's advantages but gives good stress estimation by taking the loading history into consideration. Our aim in this paper is to adapt the PIA to cold forging modeling. The main developments in the PIA are summarized as follows: a few intermediate configurations are generated for the given tool positions to account for the deformation history; the strain increment is calculated by the inverse method between the previous and current configurations; and an incremental algorithm for the plastic integration is used in the PIA instead of the total constitutive law used in the IA. An example is used to show the effectiveness and limitations of the PIA for cold forging process modeling.
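
    The distinction the abstract draws, i.e. an incremental plastic-integration algorithm applied between successive configurations rather than a total constitutive law, can be illustrated with a minimal one-dimensional return-mapping step. The material constants and hardening rule below are placeholders, not those used in the PIA.

    ```python
    def incremental_plastic_step(strain_inc, stress, eps_p,
                                 E=200e3, H=1e3, sigma_y=250.0):
        """One increment of 1D elastoplasticity with linear isotropic hardening.

        An elastic trial stress is computed from the strain increment; if it
        exceeds the current yield stress, the plastic strain is updated by
        radial return. Returns the new (stress, plastic strain).
        """
        trial = stress + E * strain_inc
        yield_stress = sigma_y + H * eps_p
        f = abs(trial) - yield_stress
        if f <= 0.0:                      # purely elastic step
            return trial, eps_p
        dgamma = f / (E + H)              # plastic multiplier
        sign = 1.0 if trial > 0 else -1.0
        return trial - E * dgamma * sign, eps_p + dgamma

    # Walk through a few intermediate configurations (strain increments).
    stress, eps_p = 0.0, 0.0
    for d_eps in [0.001, 0.001, 0.001]:
        stress, eps_p = incremental_plastic_step(d_eps, stress, eps_p)
        print(round(stress, 2), round(eps_p, 6))
    ```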

  4. Manifesting enhanced cancellations in supergravity: integrands versus integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bern, Zvi; Enciso, Michael; Parra-Martinez, Julio

    2017-05-25

    We have found examples of `enhanced ultraviolet cancellations' with no known standard-symmetry explanation in a variety of supergravity theories. Furthermore, by examining one- and two-loop examples in four- and five-dimensional half-maximal supergravity, we argue that enhanced cancellations in general cannot be exhibited prior to integration. In light of this, we explore reorganizations of integrands into parts that are manifestly finite and parts that have poor power counting but integrate to zero due to integral identities. At two loops we find that in the large loop-momentum limit the required integral identities follow from Lorentz and SL(2) relabeling symmetry. We carry out a nontrivial check at four loops showing that the identities generated in this way are a complete set. We propose that at L loops the combination of Lorentz and SL(L) symmetry is sufficient for displaying enhanced cancellations when they happen, whenever the theory is known to be ultraviolet finite up to (L - 1) loops.

  5. Symbolic Heuristic Search for Factored Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.

    2003-01-01

    We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.
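
    A minimal sketch of the forward-search ingredient, in the style of trial-based real-time dynamic programming: values are initialized lazily from an admissible heuristic, so only states reachable from the start state are ever evaluated. The MDP interface (`actions`, `transitions`, `cost`) is an assumption, and the paper's algorithm additionally operates on symbolic (decision-diagram) state abstractions rather than explicit states.

    ```python
    import random

    def rtdp(start, goal, actions, transitions, cost, heuristic, trials=1000):
        """Trial-based forward search for a stochastic shortest-path MDP.

        `transitions(s, a)` returns a list of (next_state, probability) pairs.
        States never reached from `start` are never assigned a value.
        """
        V = {}
        value = lambda s: V.get(s, heuristic(s))  # lazy heuristic initialization

        def q(s, a):
            return cost(s, a) + sum(p * value(s2) for s2, p in transitions(s, a))

        for _ in range(trials):
            s = start
            while s != goal:
                best = min(actions(s), key=lambda a: q(s, a))
                V[s] = q(s, best)                     # Bellman backup
                succ, probs = zip(*transitions(s, best))
                s = random.choices(succ, probs)[0]    # sample the next state
        return V
    ```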

  6. Design of wideband solar ultraviolet radiation intensity monitoring and control system

    NASA Astrophysics Data System (ADS)

    Ye, Linmao; Wu, Zhigang; Li, Yusheng; Yu, Guohe; Jin, Qi

    2009-08-01

    Based on the principles of SCM (Single Chip Microcomputer) and computer communication techniques, the system is composed of chips such as the ATML89C51 and ADL0809, integrated circuits, and sensors for UV radiation, and is designed for monitoring and controlling the UV index. This system can automatically collect UV index data, analyze and check the history database, and study the pattern of UV radiation in the region.

  7. Security Controls in the Stockpoint Logistics Integrated Communications Environment (SPLICE).

    DTIC Science & Technology

    1985-03-01

    … call programs as authorized after checks by the Terminal Management Subsystem on SAS databases. SAS overlays the TANDEM GUARDIAN operating system to … Security Access Profile database (SAP) and a query capability generating various security reports. SAS operates with the System Monitor (SMON) subsystem … system to DDN and other components. The first SAS component to be reviewed is the SAP database. SAP is organized into two types of files. Relational …

  8. Piezoelectric Pulsed Microjets

    DTIC Science & Technology

    2011-04-29

    … microjets presents new design capabilities [9, 18, 19]. An actuator is developed and tested here that integrates these two subsystems together to produce … actuator during testing. A digital pressure gauge was placed in-line after the accumulator to monitor bias pressure during testing. A check valve is used … bled off from the hydraulic actuator without affecting the pressure maintained in the accumulator. Air is bled from the system via a bleed valve within …

  9. Personal, Family and School Influences on Secondary Pupils' Feelings of Safety at School, in the School Surroundings and at Home

    ERIC Educational Resources Information Center

    Mooij, Ton

    2012-01-01

    Different types of variables seem to influence school safety and a pupil's feelings of safety at school. The research question asks which risk and promotive variables should be integrated in a theoretical model to predict a pupil's feelings of safety at school, in the school surroundings and at home; what the outcomes are of an empirical check of…

  10. A cache-aided multiprocessor rollback recovery scheme

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent

    1989-01-01

    This paper demonstrates how previous uniprocessor cache-aided recovery schemes can be applied to multiprocessor architectures, for recovering from transient processor failures, utilizing private caches and a global shared memory. As with cache-aided uniprocessor recovery, the multiprocessor cache-aided recovery scheme of this paper can be easily integrated into standard bus-based snoopy cache coherence protocols. A consistent shared memory state is maintained without the necessity of global check-pointing.
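
    The essence of a cache-aided scheme is that dirty cache lines double as tentative state: committing a checkpoint flushes them to shared memory, and recovering from a transient failure simply discards them. The toy class below illustrates only that idea; it does not model the paper's integration with bus-based snoopy coherence protocols.

    ```python
    class CacheAidedMemory:
        """Shared memory always holds the last checkpoint; the private cache
        holds uncommitted writes that can be discarded on failure."""

        def __init__(self):
            self.memory = {}   # consistent, checkpointed state
            self.cache = {}    # dirty lines written since the last checkpoint

        def write(self, addr, value):
            self.cache[addr] = value

        def read(self, addr):
            return self.cache.get(addr, self.memory.get(addr))

        def checkpoint(self):
            self.memory.update(self.cache)   # flush dirty lines to memory
            self.cache.clear()

        def rollback(self):
            self.cache.clear()               # drop work since last checkpoint

    m = CacheAidedMemory()
    m.write("x", 1); m.checkpoint()
    m.write("x", 2); m.rollback()
    print(m.read("x"))   # 1 -- state restored to the last checkpoint
    ```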

  11. Pharmacy benefit forecast for a new interferon Beta-1a for the treatment of multiple sclerosis: development of a first-line decision tool for pharmacy-budget planning using administrative claims data.

    PubMed

    Meyer, Christina M; Phipps, Robert; Cooper, Deborah; Wright, Alan

    2003-01-01

    To estimate the incremental change in pharmacy per-member-per-month (PMPM) costs, according to various formulary designs, for a new interferon beta-1a product (IB1a2) using administrative claims data. Cross-sectional sex- and age-specific disease prevalence and treatment rates for relapsing-remitting multiple sclerosis (RRMS) patients were measured using integrated medical and pharmacy claims data from a 500,000-member employer group in the southern United States. Migration to IB1a2 from other drugs in the class was based on market-share data for new and existing RRMS patients. Duration of therapy was estimated by analyzing claims for current RRMS therapies. Daily therapy cost was provided by the manufacturer of IB1a2, adjusted for migration from other therapies, and multiplied by estimated volume to predict the incremental and total PMPM cost impact. Market-share estimates were used to develop a PMPM cost forecast for the next 2 years. PMPM cost estimates were calculated for preferred (copayment tier 2) and nonpreferred (copayment tier 3) formulary designs with and without prior authorization (PA). One-way sensitivity analysis was performed to assess the influence of product pricing, duration of therapy, and other market factors. The annual incremental PMPM change was $0.047 for the scenario of a third copayment tier with PA. The incremental change was greatest for those aged 55 to 65 years ($0.056 PMPM) and did not vary greatly by benefit design. Duration of therapy had the greatest impact on the PMPM estimate across benefit designs. IB1a2 will not cause a significant change in managed care pharmacy budgets under a variety of formulary conditions, according to this cross-sectional analysis of current care-seeking behavior by RRMS patients. Economic impact may differ if IB1a2 expands RRMS patients' treatment-seeking behavior.
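
    The core arithmetic of such a forecast scales per-patient therapy cost by the expected number of treated, migrating patients and divides by member-months. A hypothetical sketch follows; none of the numbers are the study's inputs.

    ```python
    def incremental_pmpm(members, treated_patients, daily_cost,
                         days_on_therapy_per_year, migration_share):
        """Incremental per-member-per-month cost of adding a new drug.

        Only the share of patients migrating to (or starting on) the new agent
        contributes; existing spend on other agents is assumed unchanged.
        """
        annual_drug_cost = (treated_patients * migration_share
                            * daily_cost * days_on_therapy_per_year)
        return annual_drug_cost / (members * 12)

    # Hypothetical plan: 500,000 members, 600 treated RRMS patients,
    # $40/day therapy, 300 days of therapy per year, 25% choosing the new drug.
    print(round(incremental_pmpm(500_000, 600, 40.0, 300, 0.25), 3))  # 0.3
    ```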

  12. A Conceptual Framework for Assessment of the Benefits of a Global Earth Observation System of Systems

    NASA Astrophysics Data System (ADS)

    Fritz, S.; Scholes, R. J.; Obersteiner, M.; Bouma, J.

    2007-12-01

    The aim of the Global Earth Observation System of Systems (GEOSS) is to contribute to human wellbeing through improving the information available to decision-makers at all levels relating to human health and safety, protection of the global environment, the reduction of losses from natural disasters, and achieving sustainable development. Specifically, GEOSS proposes that better international co-operation in the collection, interpretation and sharing of Earth Observation information is an important and cost-effective mechanism for achieving this aim. While there is a widespread intuition that this proposition is correct, at some point the following question needs to be answered: how much additional investment in Earth Observation (and specifically, in its international integration) is enough? This leads directly to some challenging subsidiary questions, such as how can the benefits of Earth Observation be assessed? What are the incremental costs of GEOSS? Are there societal benefit areas where the return on investment is higher than in others? The Geo-Bene project has developed a `benefit chain' concept as a framework for addressing these questions. The basic idea is that an incremental improvement in the observing system (including its data collection, interpretation and information-sharing aspects) will result in an improvement in the quality of decisions based on that information. This will in turn lead to better societal outcomes, which have a value. This incremental value must be judged against the incremental cost of the improved observation system. Since in many cases there will be large uncertainties in the estimation of both the costs and the benefits, and it may not be possible to express one or both of them in monetary terms, we show how order-of-magnitude approaches and a qualitative understanding of the shape of the cost-benefit curves can help guide rational investment decisions in Earth Observation systems.

  13. Incremental Value (Bayesian Framework) of Thoracic Ultrasonography over Thoracic Auscultation for Diagnosis of Bronchopneumonia in Preweaned Dairy Calves.

    PubMed

    Buczinski, S; Ménard, J; Timsit, E

    2016-07-01

    Thoracic ultrasonography (TUS) is a specific and relatively sensitive method to diagnose bronchopneumonia (BP) in dairy calves. Unfortunately, as it requires specific training and equipment, veterinarians typically base their diagnosis on thoracic auscultation (AUSC), which is rapid and easy to perform. We hypothesized that the use of TUS, in addition to AUSC, can significantly increase the accuracy of BP diagnosis. Therefore, the objectives were to (i) determine the incremental value of TUS over AUSC for diagnosis of BP in preweaned dairy calves and (ii) assess the diagnostic accuracy of AUSC. Two hundred and nine dairy calves (<1 month of age) were enrolled in this prospective cross-sectional study. All calves from a veal calf unit were examined (by independent operators) using the Wisconsin Calf Respiratory Scoring Criteria (CRSC), AUSC, and TUS. A Bayesian latent class approach was used to estimate the incremental value of TUS over AUSC (integrated discrimination improvement [IDI]) and the diagnostic accuracy of AUSC. Abnormal CRSC, AUSC, and TUS were recorded in 3.3, 53.1, and 23.9% of calves, respectively. AUSC was sensitive (72.9%; 95% Bayesian credible interval [BCI]: 50.1-96.4%), but not specific (53.3%; 95% BCI: 43.3-64.0%) for diagnosing BP. Compared to AUSC, TUS was more specific (92.9%; 95% BCI: 86.5-97.1%), but had similar sensitivity (76.5%; 95% BCI: 60.2-88.8%). The incremental value of TUS over AUSC was high (IDI = 43.7%; 95% BCI: 22.0-63.0%), significantly improving the proportions of sick and healthy calves appropriately classified. The use of TUS over AUSC significantly improved the accuracy of BP diagnosis in dairy calves. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
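
    Integrated discrimination improvement measures how much the added test shifts predicted probabilities in the right direction: upward for truly diseased animals and downward for healthy ones. The sketch below shows that calculation on hypothetical probabilities; the study itself estimated IDI within a Bayesian latent-class model rather than against an observed gold standard.

    ```python
    def integrated_discrimination_improvement(p_old, p_new, diseased):
        """IDI = (mean new - mean old probability among diseased)
               - (mean new - mean old probability among non-diseased)."""
        mean = lambda xs: sum(xs) / len(xs)
        d_old = [p for p, y in zip(p_old, diseased) if y]
        d_new = [p for p, y in zip(p_new, diseased) if y]
        h_old = [p for p, y in zip(p_old, diseased) if not y]
        h_new = [p for p, y in zip(p_new, diseased) if not y]
        return (mean(d_new) - mean(d_old)) - (mean(h_new) - mean(h_old))

    # Hypothetical predicted probabilities for four calves (last two healthy).
    p_auscultation = [0.6, 0.5, 0.4, 0.5]
    p_plus_ultrasound = [0.9, 0.8, 0.2, 0.1]
    print(integrated_discrimination_improvement(
        p_auscultation, p_plus_ultrasound, [True, True, False, False]))  # 0.6
    ```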

  14. The Technique of Changing the Drive Method of Micro Step Drive and Sensorless Drive for Hybrid Stepping Motor

    NASA Astrophysics Data System (ADS)

    Yoneda, Makoto; Dohmeki, Hideo

    The constant-current micro-step drive applied to a hybrid stepping motor yields a position control system with the advantages of large torque, low vibration, and high resolution. However, losses are large because the current is controlled uniformly regardless of the load torque. Sensorless control, as used for permanent magnet motors, is an effective technique for realizing a high-efficiency position control system, but the control methods proposed so far are aimed at speed control. This paper therefore proposes switching the drive method between micro-step drive and sensorless drive. The change of drive method was verified by simulation and experiment. At no load, it was confirmed that no large speed change occurs at the moment of switching when the electrical angle is set and the integrator is reset to zero. Under load, a large speed change was observed. The proposed system can change the drive method without producing a speed change by initializing the integrator with the estimated value. With this technique, a low-loss position control system that exploits the advantages of the hybrid stepping motor has been built.

  15. Fiber optics in composite materials: materials with nerves of glass

    NASA Astrophysics Data System (ADS)

    Measures, Raymond M.

    1990-08-01

    A Fiber Optic Based Smart Structure will possess a structurally integrated optical microsensor system for determining its state. This built-in sensor system should, in real time, be able to: evaluate the strain or deformation of a structure, monitor whether it is vibrating or subject to excessive loads, check its temperature and warn of the appearance of any hot spots. In addition, a Smart Structure should maintain vigilant surveillance over its structural integrity. The successful development of Smart Structure Technology could lead to: aircraft that are safer, lighter, more efficient, and easier to maintain and to service; pipelines, pressure vessels and storage tanks that constantly monitor their structural integrity and immediately issue an alert if any problem is detected; and space platforms that check for pressure leaks, unwanted vibration, excess thermal buildup, and deviation from some preassigned shape. This technology is particularly appropriate for composite materials, where internal damage generated by impacts, manufacturing flaws, excessive loading or fatigue could be detected and assessed. In-service monitoring of structural loads, especially in regions like the wing roots of aircraft, could be of considerable benefit in helping to avoid structural overdesign and reduce weight. Structurally embedded optical fiber sensors might also serve to monitor the cure state of composite thermosets during their fabrication and thereby contribute to improved quality control of these products.

  16. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in perl, c-shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.
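
    The workflow described, i.e. start the processor, abort if any constraint check fails, otherwise translate the validated sequence into its binary form for radiation, maps onto a small driver routine. The function names, the constraint check and the "binary" encoding below are invented for illustration and are not the ASP's actual interface.

    ```python
    def run_sequence_pipeline(sequence_lines, constraint_checks):
        """Validate a command sequence and, if clean, 'convert' it to binary.

        Mirrors the check-then-convert flow: any violated constraint aborts
        the run before conversion, just as the ASP aborts on errors.
        """
        errors = [msg for check in constraint_checks
                  for msg in check(sequence_lines)]
        if errors:
            raise RuntimeError("sequence rejected:\n" + "\n".join(errors))
        return [line.encode("ascii") for line in sequence_lines]  # stand-in binary

    # Hypothetical constraint: no command may be radiated twice in a row.
    def no_repeats(lines):
        return [f"duplicate command at line {i}"
                for i, (a, b) in enumerate(zip(lines, lines[1:]), start=2)
                if a == b]

    print(run_sequence_pipeline(["PWR_ON", "CAM_SNAP", "PWR_OFF"], [no_repeats]))
    ```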

  17. Integrating Climate Change Resilience Features into the Incremental Refinement of an Existing Marine Park

    PubMed Central

    Beckley, Lynnath E.; Kobryn, Halina T.; Lombard, Amanda T.; Radford, Ben; Heyward, Andrew

    2016-01-01

    Marine protected area (MPA) designs are likely to require iterative refinement as new knowledge is gained. In particular, there is an increasing need to consider the effects of climate change, especially the ability of ecosystems to resist and/or recover from climate-related disturbances, within the MPA planning process. However, there has been limited research addressing the incorporation of climate change resilience into MPA design. This study used Marxan conservation planning software with fine-scale shallow water (<20 m) bathymetry and habitat maps, models of major benthic communities for deeper water, and comprehensive human use information from Ningaloo Marine Park in Western Australia to identify climate change resilience features to integrate into the incremental refinement of the marine park. The study assessed the representation of benthic habitats within the current marine park zones, identified priority areas of high resilience for inclusion within no-take zones and examined if any iterative refinements to the current no-take zones are necessary. Of the 65 habitat classes, 16 did not meet representation targets within the current no-take zones, most of which were in deeper offshore waters. These deeper areas also demonstrated the highest resilience values and, as such, Marxan outputs suggested minor increases to the current no-take zones in the deeper offshore areas. This work demonstrates that inclusion of fine-scale climate change resilience features within the design process for MPAs is feasible, and can be applied to future marine spatial planning practices globally. PMID:27529820

  18. KSC-2013-3074

    NASA Image and Video Library

    2013-07-22

    HOUSTON - NASA astronaut Serena Aunon and Andrea Gilkey, a human factors engineer with The Boeing Company, tag up before Aunon puts on her orange launch-and-entry suit for a fit check evaluation of the CST-100 spacecraft at the company's Houston Product Support Center. Aunon's fit check will help evaluate a crew's maneuverability in the spacecraft and test communications. Boeing's CST-100 is being designed to transport crew members or a mix of crew and cargo to low-Earth-orbit destinations, including the International Space Station. The evaluation is part of the ongoing work supporting Boeing's funded Space Act Agreement with NASA's Commercial Crew Program, or CCP, during the agency's Commercial Crew Integrated Capability, or CCiCap, initiative. CCiCap is intended to make commercial human spaceflight services available for government and commercial customers. To learn more about CCP, visit http://www.nasa.gov/commercialcrew. Photo credit: NASA/Robert Markowitz

  19. The Classroom Check-up: A Classwide Teacher Consultation Model for Increasing Praise and Decreasing Disruptive Behavior

    PubMed Central

    Reinke, Wendy M.; Lewis-Palmer, Teri; Merrell, Kenneth

    2008-01-01

    School-based consultation typically focuses on individual student problems and on a small number of students rather than on changing the classroom system. The Classroom Check-up (CCU) was developed as a classwide consultation model to address the need for classroom level support while minimizing treatment integrity problems common to school-based consultation. The purpose of the study was to evaluate the effects of the CCU and Visual Performance Feedback on teacher and student behavior. Results indicated that implementation of the CCU plus Visual Performance Feedback increased teacher implementation of classroom management strategies, including increased use of praise, use of behavior specific praise, and decreased use of reprimands. Further, these changes in teacher behavior contributed to decreases in classroom disruptive behavior. The results are encouraging because they suggest that consultation at the classroom level can create meaningful teacher and student behavior change. PMID:19122805

  20. Workers in the VAB test SRB cables on STS-98 solid rocket boosters

    NASA Technical Reports Server (NTRS)

    2001-01-01

    KENNEDY SPACE CENTER, Fla. -- Working near the top of a solid rocket booster, NASA and United Space Alliance SRB technicians hook up SRB cables to a CIRRUS computer for testing. From left are Jim Glass, with USA, performing a Flex test on the cable; Steve Swichkow, with NASA, and Jim Silviano, with USA, check the results on a computer. The SRB is part of Space Shuttle Atlantis, rolled back from Launch Pad 39A in order to conduct tests on the cables. A prior extensive evaluation of NASA's SRB cable inventory on the shelf revealed conductor damage in four (of about 200) cables. Shuttle managers decided to prove the integrity of the system tunnel cables already on Atlantis before launching. Workers are conducting inspections, making continuity checks and conducting X-ray analysis on the cables. The launch has been rescheduled no earlier than Feb. 6.
