Sample records for simplification-driven automated partial

  1. Business not as usual

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Typical design simplification ideas that reduce costs are outlined: combustion chamber design simplification; castings versus machined and welded forgings; automated inspection; and life cycle costs. The material is presented as viewgraphs.

  2. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.
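
    A minimal sketch of the rewriting-based simplification idea, assuming a toy tuple encoding of propositional obligations; the rule set and encoding are illustrative and are not taken from the NASA certification system:

      # Miniature rewriting-based simplifier in the spirit of the ATP
      # preprocessing described above. Terms are tuples like ("and", "x", True);
      # the encoding and the rules are hypothetical.
      from typing import Tuple, Union

      Term = Union[str, bool, Tuple]

      def simplify(t: Term) -> Term:
          """Apply Boolean rewrite rules bottom-up."""
          if not isinstance(t, tuple):
              return t
          op, *args = t
          args = [simplify(a) for a in args]
          if op == "and":
              if any(a is False for a in args):          # x and False -> False
                  return False
              args = [a for a in args if a is not True]  # x and True -> x
              if not args:
                  return True
              if len(args) == 1:
                  return args[0]
          elif op == "or":
              if any(a is True for a in args):           # x or True -> True
                  return True
              args = [a for a in args if a is not False] # x or False -> x
              if not args:
                  return False
              if len(args) == 1:
                  return args[0]
          elif op == "not":
              (x,) = args
              if isinstance(x, bool):
                  return not x
              if isinstance(x, tuple) and x[0] == "not":
                  return x[1]                            # not(not(x)) -> x
          return (op, *args)

      print(simplify(("and", ("not", ("not", "p")), ("or", "q", True))))  # -> p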

  3. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  4. An Empirical Evaluation of Automated Theorem Provers in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). We discuss the unique requirements this application places on the ATPs, focusing on automation, proof checking, and usability. For full automation, however, the obligations must be aggressively preprocessed and simplified, and we demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATPs to solve the proof tasks. Our results are based on 13 certification experiments that lead to more than 25,000 proof tasks which have each been attempted by Vampire, Spass, e-setheo, and Otter. The proofs found by Otter have been proof-checked by IVY.

  5. Simplification of a light-based model for estimating final internode length in greenhouse cucumber canopies.

    PubMed

    Kahlen, Katrin; Stützel, Hartmut

    2011-10-01

    Light quantity and quality affect internode lengths in cucumber (Cucumis sativus), whereby leaf area and the optical properties of the leaves mainly control light quality within a cucumber plant community. This modelling study aimed at providing a simple, non-destructive method to predict final internode lengths (FILs) using light quantity and leaf area data. Several simplifications of a light quantity and quality sensitive model for estimating FILs in cucumber were tested. The direct simplifications substitute the term for the red : far-red (R : FR) ratios by a term for (a) the leaf area index (LAI, m² m⁻²) or (b) partial LAI, the cumulative leaf area per m² ground, where leaf area per m² ground is accumulated from the top of each plant until a number, n, of leaves per plant is reached. The indirect simplifications estimate the input R : FR ratio based on partial leaf area and plant density. In all models, simulated FILs were in line with the measured FILs over various canopy architectures and light conditions, but the prediction quality varied. The indirect simplification based on the leaf area of ten leaves showed the best fit with measured data; its prediction quality was even higher than that of the original model. This study showed that for vertically trained cucumber plants, leaf area data can substitute for local light quality data in estimating FILs. In unstressed canopies, leaf area over the upper ten ranks seems to represent the feedback of the growing architecture on internode elongation with respect to light quality. This highlights the role of this domain of leaves as the primary source of the specific R : FR signal controlling the final length of an internode and could therefore guide future research on up-scaling local processes to the crop level.
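
    To make the "indirect simplification" concrete, here is a hypothetical sketch in Python; the linear model form and every coefficient are invented for illustration, and the published model and its parameters differ:

      def partial_lai(leaf_areas_m2, plant_density, n=10):
          """Cumulative leaf area (m^2) per m^2 ground over the top n leaves."""
          return sum(leaf_areas_m2[:n]) * plant_density

      def estimate_fil(p_lai, a=0.12, b=0.015):
          """Final internode length (m) as a decreasing linear function of
          partial LAI; a and b are placeholder coefficients."""
          return a - b * p_lai

      leaf_areas = [0.03] * 20                  # m^2 per leaf, top of plant down
      print(estimate_fil(partial_lai(leaf_areas, plant_density=1.5)))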

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasari, Venkateswara Rao

    The need for sustainable and secure nuclear energy is summarized. Driven by economics and public-private partnerships, the technology is evolving. Cost control and regulatory simplification are needed for a nuclear renaissance. Small modular reactors--simple, scalable, and inherently safe--may be the future.

  7. Hybrid stochastic simplifications for multiscale gene networks.

    PubMed

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-09-07

    Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene network dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3], which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach.
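
    As a concrete illustration of the hybrid idea (not the paper's algorithm), the sketch below keeps gene on/off switching as a discrete jump process while approximating protein production and degradation by a chemical Langevin equation, i.e. the central-limit/Kramers-Moyal step the abstract mentions; all rate constants are invented:

      import numpy as np

      rng = np.random.default_rng(0)
      k_on, k_off = 0.05, 0.1        # gene switching rates (kept discrete)
      k_prod, k_deg = 10.0, 0.1      # protein kinetics (made continuous)
      dt, T = 0.01, 200.0

      gene, protein = 0, 0.0
      for _ in range(int(T / dt)):
          # discrete component: gene state evolves by jumps
          rate = k_on if gene == 0 else k_off
          if rng.random() < rate * dt:
              gene = 1 - gene
          # continuous component: chemical Langevin approximation of
          # production and degradation
          drift = k_prod * gene - k_deg * protein
          diff = np.sqrt(max((k_prod * gene + k_deg * protein) * dt, 0.0))
          protein = max(protein + drift * dt + diff * rng.normal(), 0.0)

      print(f"final gene state {gene}, protein level {protein:.1f}")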

  8. Sequence-of-events-driven automation of the deep space network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  9. Combining Automated Theorem Provers with Symbolic Algebraic Systems: Position Paper

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Koga, Dennis (Technical Monitor)

    1999-01-01

    In contrast to pure mathematical applications, where automated theorem provers (ATPs) are quite capable, proof tasks arising from real-world applications in the area of software engineering show quite different characteristics: they usually not only contain a substantial amount of arithmetic (albeit often quite simple), but they also often require reasoning about specific structures (e.g., graphs, sets). Thus, an ATP must be capable of performing reasoning together with a fair amount of simplification, calculation and solving. Therefore, powerful simplifiers and other (symbolic and semi-symbolic) algorithms seem ideally suited to augment ATPs. In the following we briefly describe two major points of interest in combining SASs (symbolic algebraic systems) with top-down automated theorem provers (here: SETHEO [Let92, GLMS94]).

  10. Hybrid stochastic simplifications for multiscale gene networks

    PubMed Central

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-01-01

    Background Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. Results We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene network dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3], which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Conclusion Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach. PMID:19735554

  11. Reduced Attention Allocation during Short Periods of Partially Automated Driving: An Event-Related Potentials Study

    PubMed Central

    Solís-Marcos, Ignacio; Galvao-Carmona, Alejandro; Kircher, Katja

    2017-01-01

    Research on partially automated driving has revealed relevant problems with driving performance, particularly when drivers’ intervention is required (e.g., take-over when automation fails). Mental fatigue has commonly been proposed to explain these effects after prolonged automated drives. However, performance problems have also been reported after just a few minutes of automated driving, indicating that other factors may also be involved. We hypothesize that, besides mental fatigue, an underload effect of partial automation may also affect driver attention. In this study, this potential effect was investigated during short periods of partially automated and manual driving and at different speeds. Subjective measures of mental demand and vigilance and performance on a secondary task (an auditory oddball task) were used to assess driver attention. Additionally, modulations of specific attention-related event-related potentials (ERPs, the N1 and P3 components) were investigated. The mental fatigue effects associated with time on task were also evaluated using the same measurements. Twenty participants drove in a fixed-base simulator while performing an auditory oddball task that elicited the ERPs. Six conditions were presented (5–6 min each), combining three speed levels (low, comfortable and high) and two automation levels (manual and partially automated). The results showed that, when driving partially automated, scores in subjective mental demand and P3 amplitudes were lower than in the manual conditions. Similarly, P3 amplitude and self-reported vigilance levels decreased with time on task. Based on previous studies, these findings might reflect a reduction in drivers’ attention resource allocation, presumably due to the underload effects of partial automation and to the mental fatigue associated with time on task. In particular, such underload effects on attention could explain the performance decrements after short periods of automated driving reported in other studies. However, further studies are needed to investigate this relationship in partial automation and in other automation levels. PMID:29163112

  12. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and have proved to be a critical part of the automated design system.
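
    A schematic sketch of the surrogate-screening loop such a system might use, with a toy flowtime score standing in for the expensive discrete-event simulation; all problem details below are hypothetical placeholders, not the paper's setup:

      import random

      def make_jobs(n, seed):
          """Random (processing_time, index) pairs standing in for an instance."""
          rng = random.Random(seed)
          return [(rng.random(), i) for i in range(n)]

      SMALL, LARGE = make_jobs(10, 42), make_jobs(10_000, 7)

      def flowtime(rule, jobs):
          """Sequence jobs by the rule's priority; sum of completion times."""
          order = sorted(jobs, key=lambda j: rule(*j))
          t, total = 0.0, 0.0
          for p, _ in order:
              t += p
              total += t            # lower is better
          return total

      def cheap_eval(rule):         # surrogate: tiny instance, fast to score
          return flowtime(rule, SMALL)

      def full_eval(rule):          # "expensive" evaluation: large instance
          return flowtime(rule, LARGE)

      population = [
          lambda p, i: p,           # shortest processing time first
          lambda p, i: -p,          # longest processing time first
          lambda p, i: p + 1e-4 * i,
          lambda p, i: random.Random(i).random(),
      ]

      # Surrogate screening: fully evaluate only the most promising half.
      screened = sorted(population, key=cheap_eval)[: len(population) // 2]
      best = min(screened, key=full_eval)
      print("best screened rule's full-evaluation score:", full_eval(best))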

  13. A finite difference scheme for the equilibrium equations of elastic bodies

    NASA Technical Reports Server (NTRS)

    Phillips, T. N.; Rose, M. E.

    1984-01-01

    A compact difference scheme is described for treating the first-order system of partial differential equations which describe the equilibrium equations of an elastic body. An algebraic simplification enables the solution to be obtained by standard direct or iterative techniques.

  14. Designing a place for automation.

    PubMed

    Bazzoli, F

    1995-05-01

    Re-engineering is a hot topic in health care as market forces increase pressure to cut costs. Providers and payers that are redesigning their business processes are counting on information systems to help achieve simplification and make large gains in efficiency. But these same organizations say they're reluctant to make large upfront investments in information systems until they know exactly what role technology will play in the re-engineered entity.

  15. Automated Simplification of Full Chemical Mechanisms

    NASA Technical Reports Server (NTRS)

    Norris, A. T.

    1997-01-01

    A code has been developed to automatically simplify full chemical mechanisms. The method employed is based on the Intrinsic Low Dimensional Manifold (ILDM) method of Maas and Pope. The ILDM method is a dynamical-systems approach to the simplification of large chemical kinetic mechanisms. By identifying low-dimensional attracting manifolds, the method allows complex full mechanisms to be parameterized by just a few variables, in effect generating reduced chemical mechanisms by an automatic procedure. These resulting mechanisms, however, still retain all the species used in the full mechanism. Full and skeletal mechanisms for various fuels are simplified to a two-dimensional manifold, and the resulting mechanisms are found to compare well with the full mechanisms and to show significant improvement over global one-step mechanisms, such as those of Westbrook and Dryer. In addition, by using an ILDM reaction mechanism in a CFD code, a considerable improvement in turn-around time can be achieved.
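
    The core ILDM computation can be illustrated on a linear toy system; this is a generic eigenvalue-based sketch, not the NASA code. The Jacobian's spectrum is split into fast and slow parts, and the slow eigenspace approximates the attracting manifold:

      import numpy as np

      A = np.array([[-100.0,   1.0],     # Jacobian of a toy linear "mechanism"
                    [   1.0,  -1.0]])    # eigenvalues near -100 (fast), -1 (slow)

      eigvals, right = np.linalg.eig(A)
      order = np.argsort(eigvals)        # most negative eigenvalue first
      fast, slow = order[0], order[1]
      print("time scales:", -1.0 / eigvals[[fast, slow]])

      left = np.linalg.inv(right)        # rows are left eigenvectors
      manifold_dir = right[:, slow]      # slow subspace spans the 1-D ILDM
      # ILDM condition: no motion in the fast characteristic direction,
      # w_fast . (A x) = 0; any point on the slow eigenvector satisfies it.
      print("fast-mode residual on the manifold:",
            left[fast] @ (A @ manifold_dir))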

  16. Basic results on the equations of magnetohydrodynamics of partially ionized inviscid plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunez, Manuel

    2009-10-15

    The equations of evolution of partially ionized plasmas have been far more studied in one of their many simplifications than in their original form. They present a relation between the velocity of each species, plus the magnetic and electric fields, which yields, as an analog of Ohm's law, a certain elliptic equation. Therefore, the equations represent a functional evolution system, not a classical one. Nonetheless, a priori estimates and theorems of existence may be obtained in appropriate Sobolev spaces.

  17. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)

    2001-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., with the given interface the user must be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  18. On Abstractions and Simplifications in the Design of Human-Automation Interfaces

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.

  19. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated with the presented BPMS-LES approach.

  1. An X-Y Platform for Randsight-Type Instruments.

    ERIC Educational Resources Information Center

    Clewett, R. W.; And Others

    A hand-operated mechanical device, called an X-Y Platform because of its ability to undergo orthogonal planar motions reminiscent of X and Y coordinates, permits design simplifications and lower costs when used with RANDSIGHT-type closed circuit television to enable the partially sighted to read and write. This paper presents a detailed discussion…

  2. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  3. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922

  4. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization.
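
    A hedged sketch of the table-driven "executable requirements" idea using plain Python asserts instead of FitNesse; the advisory rule and all table rows are invented for illustration, not the study's actual configuration:

      def should_fire_swallow_advisory(department, route, stroke_suspected, screened):
          """Toy CDS rule: advise a swallowing assessment before oral meds."""
          return (department == "ED" and route == "oral"
                  and stroke_suspected and not screened)

      # Each row specifies one aspect of expected behavior, mirroring the
      # clinical-review spreadsheets described in the abstract.
      ACCEPTANCE_TABLE = [
          # department, route,  stroke?, screened?, expected
          ("ED",        "oral", True,    False,     True),
          ("ED",        "oral", True,    True,      False),  # already screened
          ("ED",        "IV",   True,    False,     False),  # not an oral med
          ("ICU",       "oral", True,    False,     False),  # wrong setting
      ]

      for dept, route, stroke, screened, expected in ACCEPTANCE_TABLE:
          got = should_fire_swallow_advisory(dept, route, stroke, screened)
          assert got == expected, (dept, route, stroke, screened)
      print("all acceptance rows pass")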

  5. Adapting the Euler-Lagrange equation to study one-dimensional motions under the action of a constant force

    NASA Astrophysics Data System (ADS)

    Dias, Clenilda F.; Araújo, Maria A. S.; Carvalho-Santos, Vagson L.

    2018-01-01

    The Euler-Lagrange equations (ELE) are very important in the theoretical description of several physical systems. In this work we have used a simplified form of the ELE to study one-dimensional motions under the action of a constant force. From the definition of the partial derivative, we propose two operators, here called mean delta operators, which may be used to solve the ELE in the simplest way. We have applied this simplification to solve three simple mechanical problems in which the particle is under the action of the gravitational field: a body in free fall, the Atwood machine and the inclined plane. The proposed simplification can be used to introduce the Lagrangian formalism when teaching classical mechanics in introductory physics courses.
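
    For the free-fall case the standard Lagrangian calculation proceeds as follows (shown for concreteness; the paper's discrete "mean delta operators" are not reproduced here):

      L = \tfrac{1}{2} m \dot{x}^{2} - m g x, \qquad
      \frac{d}{dt}\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x}
        = m\ddot{x} + m g = 0 \;\Rightarrow\; \ddot{x} = -g.

    The same steps give a = g sin(theta) on a frictionless inclined plane and a = (m1 - m2)g/(m1 + m2) for the Atwood machine.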

  6. A toy terrestrial carbon flow model

    NASA Technical Reports Server (NTRS)

    Parton, William J.; Running, Steven W.; Walker, Brian

    1992-01-01

    A generalized carbon flow model for the major terrestrial ecosystems of the world is reported. The model is a simplification of the Century model and the Forest-Biogeochemical model. Topics covered include plant production, decomposition and nutrient cycling, biomes, the utility of the carbon flow model for predicting carbon dynamics under global change, and possible applications to state-and-transition models and environmentally driven global vegetation models.

  7. Partial Automated Alignment and Integration System

    NASA Technical Reports Server (NTRS)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  8. Automated structure determination of proteins with the SAIL-FLYA NMR method.

    PubMed

    Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune

    2007-01-01

    The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to rapidly collect and fully automatically evaluate the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract, and their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.

  9. On the relation between friction losses and pressure pulsations caused by Rotor Stator interaction on the Francis-99 turbine

    NASA Astrophysics Data System (ADS)

    Østby, Petter T. K.; Tore Billdal, Jan; Haugen, Bjørn; Dahlhaug, Ole Gunnar

    2017-01-01

    High head Francis runners are subject to pressure pulsations caused by rotor-stator interaction. To ensure safe operation of such turbines, it is important to be able to predict these pulsations. For turbine manufacturers it is often a dilemma whether to perform very advanced and time-consuming CFD calculations or to rely on simpler calculations to save development time. This paper evaluates simplifications of the CFD model that still capture the RSI phenomena while ensuring that the calculation does not underpredict the pressure amplitudes. The effects of turbulence modeling, wall friction, viscosity and mesh on the pressure amplitudes are investigated, along with the time savings from each simplification. The hypothesis is that rotor-stator interaction is mainly driven by inviscid flow and can therefore be modeled by the Euler equations.

  10. A framework for the automated data-driven constitutive characterization of composites

    Treesearch

    J.G. Michopoulos; John Hermanson; T. Furukawa; A. Iliopoulos

    2010-01-01

    We present advances on the development of a mechatronically and algorithmically automated framework for the data-driven identification of constitutive material models based on energy density considerations. These models can capture both the linear and nonlinear constitutive response of multiaxially loaded composite materials in a manner that accounts for progressive...

  11. Welding technology transfer task/laser based weld joint tracking system for compressor girth welds

    NASA Technical Reports Server (NTRS)

    Looney, Alan

    1991-01-01

    Sensors to control and monitor welding operations are currently being developed at Marshall Space Flight Center. The laser-based weld bead profiler/torch rotation sensor was modified to provide a weld joint tracking system for compressor girth welds. The tracking system features a precision laser-based vision sensor, automated two-axis machine motion, and an industrial PC controller. The system benefits are the elimination of weld repairs caused by joint tracking errors, which reduces manufacturing costs and increases production output; the simplification of tooling; and the freeing of costly manufacturing floor space.

  12. Planning actions in robot automated operations

    NASA Technical Reports Server (NTRS)

    Das, A.

    1988-01-01

    Action planning in robot automated operations requires intelligent task-level programming. Invoking intelligence necessitates a typical blackboard-based architecture, where a plan is a vector between the start frame and the goal frame. This vector is composed of partially ordered bases. A partial ordering of bases presents both advantages and drawbacks in action planning. Partial ordering demands the use of a temporal database management system.
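
    A minimal sketch of linearizing such a partially ordered plan with a topological sort; the step names and ordering constraints are invented, and the blackboard and temporal-database machinery are omitted:

      from graphlib import TopologicalSorter  # Python 3.9+

      # step -> set of steps that must come before it (the partial order)
      constraints = {
          "locate":  {"start"},
          "grasp":   {"locate"},
          "move":    {"grasp"},
          "release": {"move"},
          "goal":    {"release"},
      }

      plan = list(TopologicalSorter(constraints).static_order())
      print(plan)  # one valid total order from start frame to goal frame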

  13. Advances in Optical Fiber-Based Faraday Rotation Diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, A D; McHale, G B; Goerz, D A

    2009-07-27

    In the past two years, we have used optical fiber-based Faraday Rotation Diagnostics (FRDs) to measure pulsed currents on several dozen capacitively driven and explosively driven pulsed power experiments. We have made simplifications to the necessary hardware for quadrature-encoded polarization analysis, including development of an all-fiber analysis scheme. We have developed a numerical model that is useful for predicting and quantifying deviations from the ideal diagnostic response. We have developed a method of analyzing quadrature-encoded FRD data that is simple to perform and offers numerous advantages over several existing methods. When comparison has been possible, we have seen good agreement with our FRDs and other current sensors.

  14. Underlying Information Technology Tailored Quantum Error Correction

    DTIC Science & Technology

    2006-07-28

    • Typically constructed by using an optical beam splitter. • We used a decoherence-free-subspace encoding to reduce the sensitivity of an optical Deutsch... • Simplification of design constraints in solid-state QC (incl. quantum dots and superconducting qubits), hybrid quantum error correction and prevention methods... • Process tomography on one- and two-photon polarisation states, from full and partial data. • Accomplished complete two-photon QPT. • Discovered surprising...

  15. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    NASA Technical Reports Server (NTRS)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
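
    The data-driven sequencing concept can be sketched as a table-driven state machine in which the mission event plan is data and the engine is generic; all mode and event names below are placeholders, not Orion's actual GN&C modes:

      SEQUENCE_TABLE = [
          # (current_mode,  event,            next_mode)
          ("PRELAUNCH",     "liftoff",        "ASCENT"),
          ("ASCENT",        "meco",           "ORBIT_INSERT"),
          ("ORBIT_INSERT",  "insertion_done", "ORBIT_OPS"),
          ("ORBIT_OPS",     "deorbit_burn",   "ENTRY"),
      ]
      TRANSITIONS = {(m, e): n for m, e, n in SEQUENCE_TABLE}

      def step(mode, event, manual_override=None):
          """Advance the mode; a crew/ground override always wins."""
          if manual_override is not None:
              return manual_override
          return TRANSITIONS.get((mode, event), mode)  # unknown event: hold

      mode = "PRELAUNCH"
      for ev in ("liftoff", "meco", "insertion_done"):
          mode = step(mode, ev)
      print(mode)  # ORBIT_OPS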

  16. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.

  17. Fast and accurate metrology of multi-layered ceramic materials by an automated boundary detection algorithm developed for optical coherence tomography data

    PubMed Central

    Ekberg, Peter; Su, Rong; Chang, Ernest W.; Yun, Seok Hyun; Mattsson, Lars

    2014-01-01

    Optical coherence tomography (OCT) is useful for materials defect analysis and inspection, with the additional possibility of quantitative dimensional metrology. Here, we present an automated image-processing algorithm for OCT analysis of roll-to-roll multilayers in 3D manufacturing of advanced ceramics. It has the advantage of avoiding filtering and preset modeling, and will thus introduce a simplification. The algorithm is validated for its capability of measuring the thickness of ceramic layers, extracting the boundaries of embedded features with irregular shapes, and detecting geometric deformations. The accuracy of the algorithm is very high, and the reliability is better than 1 µm when evaluated against OCT images using the same gauge-block step-height reference. The method may be suitable for industrial applications requiring rapid inspection of manufactured samples with high accuracy and robustness. PMID:24562018

  18. Evaluation of engineering foods for closed Ecological Life Support System (CELSS)

    NASA Technical Reports Server (NTRS)

    Karel, M.

    1982-01-01

    A nutritionally adequate and acceptable diet was evaluated and developed. A design for a multipurpose food plant is discussed. The types and amounts of foods needed to be regenerated in a partially closed ecological life support system (PCELSS) were proposed. All steps of food processes to be utilized in the multipurpose food plant of PCELSS were also considered. Equipment specifications, simplification of the proposed processes, and food waste treatment were analyzed.

  19. Multi-Scale Fusion of Information for Uncertainty Quantification and Management in Large-Scale Simulations

    DTIC Science & Technology

    2015-12-02

    ...simplification of the equations, but at the expense of introducing modeling errors. We have shown that the Wick solutions have accuracy comparable to... the system of equations for the coefficients of formal power series solutions. Moreover, the structure of this propagator is seemingly universal, i.e. ...the problem of computing the numerical solution to kinetic partial differential equations involving many phase variables. These types of equations...

  20. Automation of a suturing device for minimally invasive surgery.

    PubMed

    Göpel, Tobias; Härtl, Felix; Schneider, Armin; Buss, Martin; Feussner, Hubertus

    2011-07-01

    In minimally invasive surgery, hand suturing is categorized as a challenge in technique as well as in its duration. This calls for an easily manageable tool, permitting an all-purpose, cost-efficient, and secure viscerosynthesis. Such a tool for this field already exists: the Autosuture EndoStitch(®). In a series of studies the potential for the EndoStitch to accelerate suturing has been proven. However, its ergonomics still limits its applicability. The goal of this study was twofold: to propose an optimized and partially automated EndoStitch, and to compare the conventional EndoStitch to the optimized and partially automated EndoStitch with respect to the speed and precision of suturing. Based on the EndoStitch, a partially automated suturing tool was developed. With the aid of a DC motor, triggered by a button, one can suture with one-fingered handling. Twenty surgeons with different levels of laparoscopic experience successfully completed a continuous suture with 10 stitches using both the conventional and the partially automated suture manipulator. Before that, each participant was given 1 min of instruction and 1 min for training. Absolute suturing time and stitch accuracy were measured. The quality of the automated EndoStitch with respect to manipulation was assessed with the aid of a standardized questionnaire. To compare the two instruments, t tests were used for suturing accuracy and time. Among the 20 surgeons (fewer than 5 laparoscopic interventions, n=9; fewer than 20 laparoscopic interventions, n=7; more than 20 laparoscopic interventions, n=4), there was no significant difference between the two tested systems with respect to stitching accuracy. However, the suturing time was significantly shorter with the automated instrument (P=0.01). The difference in accuracy and speed was not statistically significant considering the laparoscopic experience of the surgeons. The weight and size of the automated instrument were criticized, as well as its cable; however, the comfortable handhold, automatic needle change, and ergonomic manipulation were rated positively. Partially automated suturing in minimally invasive surgery offers advantages with respect to the speed of operation and ergonomics. Ongoing work in this field has to concentrate on miniaturization, implementation in robotic systems, and development of new operation methods (NOTES).

  1. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    PubMed

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 min to 1 min. The heating system consists of two parallel SMD thick film resistors and an NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
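
    A sketch of the control idea only: read the NTC through a resistor divider and hold the LAMP reaction near its working temperature with a simple bang-bang loop. The component values, beta coefficient, and setpoint are typical placeholders, not the paper's design:

      import math

      BETA, R0, T0 = 3950.0, 10_000.0, 298.15   # NTC: 10 kOhm at 25 C (assumed)
      R_FIXED, VCC = 10_000.0, 3.0              # divider resistor, battery volts

      def ntc_temp_c(v_adc):
          """Convert divider voltage across the NTC to temperature (beta model)."""
          r_ntc = R_FIXED * v_adc / (VCC - v_adc)
          t_k = 1.0 / (1.0 / T0 + math.log(r_ntc / R0) / BETA)
          return t_k - 273.15

      def heater_command(temp_c, setpoint=65.0, hysteresis=0.5):
          """Bang-bang control: on below the band, off above it."""
          if temp_c < setpoint - hysteresis:
              return True                 # switch the resistors on
          if temp_c > setpoint + hysteresis:
              return False                # switch them off
          return None                     # inside the band: hold current state

      print(ntc_temp_c(1.5))              # ~25 C at the divider midpoint
      print(heater_command(60.0))         # True -> heat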

  2. Apparatus for the concurrent ultrasonic inspection of partially completed welds

    DOEpatents

    Johnson, John A.

    2000-01-01

    An apparatus for the concurrent nondestructive evaluation of partially completed welds is described which is used in combination with an automated welder and which includes: an ultrasonic signal generator mounted on the welder, which generates an ultrasonic signal directed toward one side of the partially completed weld; an ultrasonic signal receiver mounted on the automated welder for detecting ultrasonic signals transmitted by the generator and reflected or diffracted from one side of the partially completed weld, or passing through a given region of it; and an analysis assembly coupled with the ultrasonic signal receiver, which processes the received signals to identify welding flaws in the partially completed weld.

  3. Nebraska data collection.

    DOT National Transportation Integrated Search

    2015-12-01

    Automated pavement performance data collection is a method that uses advanced technology to collect detailed road surface distress information at traffic speed. Agencies are driven to use automated survey techniques to enhance or replace their cu...

  4. Improving Information Exchange and Coordination amongst Homeland Security Organizations (Briefing Charts)

    DTIC Science & Technology

    2005-06-01

    • need for user-defined dashboard • automated monitoring of web data sources • task-driven data aggregation and display. Working toward automated processing of task, resource, and intelligence updates.

  5. Synoptic evaluation of scale-dependent metrics for hydrographic line feature geometry

    USGS Publications Warehouse

    Stanislawski, Larry V.; Buttenfield, Barbara P.; Raposo, Paulo; Cameron, Madeline; Falgout, Jeff T.

    2015-01-01

    Methods of acquisition and feature simplification for vector feature data impact cartographic representations and scientific investigations of these data, and are therefore important considerations for geographic information science (Haunert and Sester 2008). After initial collection, linear features may be simplified to reduce excessive detail or to furnish a reduced-scale version of the features through cartographic generalization (Regnauld and McMaster 2008, Stanislawski et al. 2014). A variety of algorithms exist to simplify linear cartographic features, and all of the methods affect the positional accuracy of the features (Shahriari and Tao 2002, Regnauld and McMaster 2008, Stanislawski et al. 2012). In general, simplification operations are controlled by one or more tolerance parameters that limit the amount of positional change the operation can make to features. Using a single tolerance value can have varying levels of positional change on features, depending on local shape, texture, or geometric characteristics of the original features (McMaster and Shea 1992, Shahriari and Tao 2002, Buttenfield et al. 2010). Consequently, numerous researchers have advocated calibration of simplification parameters to control quantifiable properties of resulting changes to the features (Li and Openshaw 1990, Raposo 2013, Tobler 1988, Veregin 2000, Buttenfield 1986, 1989). This research identifies relations between local topographic conditions and geometric characteristics of linear features that are available in the National Hydrography Dataset (NHD). The NHD is a comprehensive vector dataset of surface water features within the United States that is maintained by the U.S. Geological Survey (USGS). In this paper, geometric characteristics of cartographic representations for natural stream and river features are summarized for subbasin watersheds within entire regions of the conterminous United States and compared to topographic metrics. A concurrent processing workflow is implemented using a Linux high-performance computing cluster to simultaneously process multiple subbasins, and thereby complete the work in a fraction of the time required for a single-process environment. In addition, similar metrics are generated for several levels of simplification of the hydrographic features to quantify the effects of simplification over the various landscape conditions. Objectives of this exploratory investigation are to quantify geometric characteristics of linear hydrographic features over the various terrain conditions within the conterminous United States and thereby illuminate relations between stream geomorphological conditions and cartographic representation. The synoptic view of these characteristics over regional watersheds that is afforded through concurrent processing, in conjunction with terrain conditions, may reveal patterns for classifying cartographic stream features into stream geomorphological classes. Furthermore, the synoptic measurement of the amount of change in geometric characteristics caused by the several levels of simplification can enable estimation of tolerance values that appropriately control simplification-induced geometric change of the cartographic features within the various geomorphological classes in the country. Hence, these empirically derived rules or relations could help generate multiscale representations of features through automated generalization that adequately maintain surface drainage variations and patterns reflective of the natural stream geomorphological conditions across the country.
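
    The calibration idea (tying a simplification tolerance to the positional change it induces) can be sketched with shapely's Douglas-Peucker implementation on a synthetic stream line; the feature and tolerances below are illustrative, not NHD data:

      import math
      from shapely.geometry import LineString

      # Synthetic sinuous "stream" standing in for an NHD flowline.
      stream = LineString([(x * 10.0, 50.0 * math.sin(x / 3.0))
                           for x in range(200)])

      for tol in (1.0, 5.0, 25.0):
          simp = stream.simplify(tol, preserve_topology=False)  # Douglas-Peucker
          print(f"tol={tol:5.1f}  vertices {len(stream.coords)} -> "
                f"{len(simp.coords)}  Hausdorff displacement "
                f"{stream.hausdorff_distance(simp):.2f}")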

  6. Understanding and avoiding potential problems in implementing automation

    NASA Astrophysics Data System (ADS)

    Rouse, W. B.; Morris, N. M.

    1985-11-01

    Technology-driven efforts to implement automation often encounter problems due to lack of acceptance or begrudging acceptance by the personnel involved. It is argued in this paper that the level of automation perceived by an individual heavily influences whether or not the automation is accepted by that individual. The factors that appear to affect perceived level of automation are discussed. Issues considered include the impact of automation on the system and the individual, correlates of acceptance, problems and risks of automation, and factors influencing alienation. Based on an understanding of these issues, a set of eight guidelines is proposed as a possible means of avoiding problems in implementing automation.

  7. Understanding and avoiding potential problems in implementing automation

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Morris, N. M.

    1985-01-01

    Technology-driven efforts to implement automation often encounter problems due to lack of acceptance or begrudging acceptance by the personnel involved. It is argued in this paper that the level of automation perceived by an individual heavily influences whether or not the automation is accepted by that individual. The factors that appear to affect perceived level of automation are discussed. Issues considered include the impact of automation on the system and the individual, correlates of acceptance, problems and risks of automation, and factors influencing alienation. Based on an understanding of these issues, a set of eight guidelines is proposed as a possible means of avoiding problems in implementing automation.

  8. Computational efficiency for the surface renewal method

    NASA Astrophysics Data System (ADS)

    Kelley, Jason; Higgins, Chad

    2018-04-01

    Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly; these were tested for sensitivity to the length of the flux-averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. The algorithms utilize signal-processing techniques and algebraic simplifications, demonstrating that simple modifications can dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. Increased computation speed grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
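
    The abstract does not list the algorithms themselves, so here is a minimal sketch of the kind of vectorized computation involved: moments of lagged differences (structure functions, the inputs to Van Atta-style SR analysis) evaluated over many lags without per-sample loops. The series, lags, and orders are illustrative assumptions.

```python
import numpy as np

def structure_functions(a, lags, orders=(2, 3, 5)):
    """Moments of lagged differences of a high-frequency scalar series.
    a: 1-D array (e.g., 10 Hz air temperature); lags: sample lags r.
    Returns a dict mapping (order, lag) -> mean((a[t] - a[t - r]) ** order).
    """
    a = np.asarray(a, dtype=float)
    out = {}
    for r in lags:
        d = a[r:] - a[:-r]            # every lagged difference in one operation
        for n in orders:
            out[(n, r)] = np.mean(d ** n)
    return out

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=36000))   # one hour of synthetic 10 Hz data
print(structure_functions(series, lags=[1, 5, 10])[(2, 5)])
```

    Computing each lag's differences as a single array operation, rather than looping over samples, is the sort of simple modification the authors describe as dramatically improving efficiency.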

  9. Gravity-driven pH adjustment for site-specific protein pKa measurement by solution-state NMR

    NASA Astrophysics Data System (ADS)

    Li, Wei

    2017-12-01

    To automate pH adjustment in site-specific protein pKa measurement by solution-state NMR, I present a funnel with two caps for the standard 5 mm NMR tube. The novelty of this simple-to-build and inexpensive apparatus is that it allows automatic gravity-driven pH adjustment within the magnet, and consequently results in a fully automated NMR-monitored pH titration without any hardware modification on the NMR spectrometer.

  10. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  11. Evaluating the effects of trophic complexity on a keystone predator by disassembling a partial intraguild predation food web.

    PubMed

    Davenport, Jon M; Chalcraft, David R

    2012-01-01

    1. Many taxa can be found in food webs that differ in trophic complexity, but it is unclear how trophic complexity affects the performance of particular taxa. In pond food webs, larvae of the salamander Ambystoma opacum occupy the intermediate predator trophic position in a partial intraguild predation (IGP) food web and can function as keystone predators. Larval A. opacum are also found in simpler food webs lacking either top predators or shared prey. 2. We conducted an experiment where a partial IGP food web was simplified, and we measured the growth and survival of larval A. opacum in each set of food webs. Partial IGP food webs that had either a low abundance or high abundance of total prey were also simplified by independently removing top predators and/or shared prey. 3. Removing top predators always increased A. opacum survival, but removal of shared prey had no effect on A. opacum survival, regardless of total prey abundance. 4. Surprisingly, food web simplification had no effect on the growth of A. opacum when present in food webs with a low abundance of prey but had important effects on A. opacum growth in food webs with a high abundance of prey. Simplifying a partial IGP food web with a high abundance of prey reduced A. opacum growth when either top predators or shared prey were removed from the food web and the loss of top predators and shared prey influenced A. opacum growth in a non-additive fashion. 5. The non-additive response in A. opacum growth appears to be the result of supplemental prey availability augmenting the beneficial effects of top predators. Top predators had a beneficial effect on A. opacum populations by reducing the abundance of A. opacum present and thereby reducing the intensity of intraspecific competition. 6. Our study indicates that the effects of food web simplification on the performance of A. opacum are complex and depend on both how a partial IGP food web is simplified and how abundant prey are in the food web. These findings are important because they demonstrate how trophic complexity can create variation in the performance of intermediate predators that play important roles in temporary pond food webs. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.

  12. AN AUTOMATED MONITORING SYSTEM FOR FISH PHYSIOLOGY AND TOXICOLOGY

    EPA Science Inventory

    This report describes a data acquisition and control (DAC) system that was constructed to manage selected physiological measurements and sample control for aquatic physiology and toxicology. Automated DAC was accomplished with a microcomputer running menu-driven software develope...

  13. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. The Generic Mapping Tools 6: Classic versus Modern Mode

    NASA Astrophysics Data System (ADS)

    Wessel, P.; Uieda, L.; Luis, J. M. F.; Scharroo, R.; Smith, W. H. F.; Wobbe, F.

    2017-12-01

    The Generic Mapping Tools (GMT; gmt.soest.hawaii.edu) is a 25-year-old, mature open-source software package for the analysis and display of geoscience data (e.g., interpolate, filter, manipulate, project and plot temporal and spatial data). The GMT "toolbox" includes about 80 core and 40 supplemental modules sharing a common set of command options, file structures, and documentation. GMT5, when released in 2013, introduced an application programming interface (API) to allow programmatic access to GMT from other computing environments. Since then, we have released a GMT/MATLAB toolbox, an experimental GMT/Julia package, and will soon introduce a GMT/Python module. In developing these extensions, we wanted to simplify the GMT learning curve but quickly realized the main stumbling blocks to GMT command-line mastery would be ported to the external environments unless we introduced major changes. With thousands of GMT scripts already in use by scientists around the world, we were acutely aware of the need for backwards compatibility. Our solution, to be released as GMT 6, was to add a modern run mode that complements the classic mode offered so far. Modern mode completely eliminates the top three obstacles for new (and not so new) GMT users: (1) the responsibility to properly stack PostScript layers manually (i.e., the -O -K dance), (2) the responsibility of handling output redirection of PostScript (create versus append), and (3) the need to provide commands with repeated information about regions (-R) and projections (-J). Thus, modern mode results in shorter, simpler scripts with fewer pitfalls, without interfering with classic scripts. Our implementation adds five new commands that begin and end a modern session, simplify figure management, automate the conversion of PostScript to more suitable formats, automate region detection, and offer a new automated subplot environment for multi-panel illustrations. Here, we highlight the GMT modern mode and the simplifications it offers, both for command-line use and in external environments. GMT 6 is in beta mode but accessible from our repository. Numerous improvements have been added in addition to modern mode; we expect a formal release in early 2018. Publication partially supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz.
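
    As an illustration of what modern-mode sessions remove, here is a minimal sketch using PyGMT, the GMT/Python module previewed in the abstract (since released); the method names assume a recent PyGMT version, and the map content is arbitrary.

```python
import pygmt  # the GMT/Python interface, built on modern-mode sessions

# One session per figure: no -O/-K layer stacking, no manual PostScript
# redirection, and the region/projection are stated once and remembered.
fig = pygmt.Figure()
fig.coast(region="d", projection="W15c", land="tan", water="lightblue",
          frame=True)
fig.plot(x=[-155.0], y=[19.5], style="c0.3c", fill="red")  # overlay layer
fig.savefig("map.png")  # format conversion is handled automatically
```

    A classic-mode script would interleave -R/-J/-O/-K flags and PostScript redirection across every call; here the session object carries that state instead.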

  15. Driving performance at lateral system limits during partially automated driving.

    PubMed

    Naujoks, Frederik; Purucker, Christian; Wiedemann, Katharina; Neukum, Alexandra; Wolter, Stefan; Steiger, Reid

    2017-11-01

    This study investigated driver performance at system limits of partially automated driving. Using a motion-based driving simulator, drivers encountered different situations in which a partially automated vehicle could no longer safely keep the lateral guidance. Drivers were either distracted by a non-driving-related task on a touch display or drove without an additional secondary task. While driving in partially automated mode, drivers could take their hands off the steering wheel either for only a short period of time (10 s, the 'Hands-on' variant) or for an extended period of time (120 s, the 'Hands-off' variant). When the system limit was reached (e.g., when entering a work zone with temporary lines), the lateral vehicle control by the automation was suddenly discontinued and a take-over request was issued to the drivers. Regardless of the hands-off interval and the availability of a secondary task, all drivers managed the transition to manual driving safely. No lane exceedances were observed and the situations were rated as 'harmless' by the drivers. The lack of difference between the hands-off intervals can be partly attributed to the fact that most of the drivers kept contact with the steering wheel, even in the hands-off condition. Although all drivers were able to control the system limits, most of them could not explain why exactly the take-over request was issued. The average helpfulness of the take-over request was rated at an intermediate level. Consequently, providing drivers with information about the reason for a system limit can be recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Automated MAD and MIR structure solution

    PubMed Central

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic-scale structure determinations. PMID:10089316

  17. Management tools for the 21st century environmental office: The role of office automation and information technology. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fittipaldi, J.J.; Sliwinski, B.J.

    1991-06-01

    Army environmental planning and compliance activities continue to grow in magnitude and complexity, straining the resources of installation environmental offices. New efficiencies must be found to meet the increasing demands of planning and compliance imperatives. This study examined how office automation/information technology (OA/IT) may boost productivity in U.S. Army Training and Doctrine Command (TRADOC) installation environmental offices between now and the year 2000. A survey of four TRADOC installation environmental offices revealed that the workload often exceeds the capacity of staff. Computer literacy among personnel varies widely, limiting the benefits available from OA/IT now in use. Since environmental personnel are primarily gatherers and processors of information, better implementation of OA/IT could substantially improve work quality and productivity. Advanced technologies expected to reach the consumer market during the 1990s will dramatically increase the potential productivity of environmental office personnel. Multitasking operating environments will allow simultaneous automation of communications, document processing, and engineering software. Increased processor power and parallel processing techniques will spur simplification of the user interface and greater software capabilities in general. The authors conclude that full implementation of this report's OA/IT recommendations could double TRADOC environmental office productivity by the year 2000.

  18. Analyzing Living Surveys: Visualization Beyond the Data Release

    NASA Astrophysics Data System (ADS)

    Buddelmeijer, H.; Noorishad, P.; Williams, D.; Ivanova, M.; Roerdink, J. B. T. M.; Valentijn, E. A.

    2015-09-01

    Surveys need to provide more than periodic data releases. Science often requires data that is not captured in such releases. This mismatch between the constraints set by a fixed data release and the needs of the scientists is solved in the Astro-WISE information system by extending its request-driven data handling into the analysis domain. This leads to Query-Driven Visualization, where all data handling is automated and scalable by exploiting the strengths of data pulling. Astro-WISE is data-centric: new data creates itself automatically, if no suitable existing data can be found to fulfill a request. This approach allows scientists to visualize exactly the data they need, without any manual data management, freeing their time for research. The benefits of query-driven visualization are highlighted by searching for distant quasars in KiDS, a 1500 square degree optical survey. KiDS needs to be treated as a living survey to minimize the time between observation and (spectral) followup. The first window of opportunity would be missed if it were necessary to wait for data releases. The results from the default processing pipelines are used for a quick and broad selection of quasar candidates. More precise measurements of source properties can subsequently be requested to downsize the candidate set, requiring partial reprocessing of the images. Finally, the raw and reduced pixels themselves are inspected by eye to rank the final candidate list. The quality of the resulting candidate list and the speed of its creation were only achievable due to query-driven visualization of the living archive.

  19. TFTR diagnostic control and data acquisition system

    NASA Astrophysics Data System (ADS)

    Sauthoff, N. R.; Daniels, R. E.

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  20. TFTR diagnostic control and data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauthoff, N.R.; Daniels, R.E.; PPL Computer Division

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  1. Performance of an Additional Task During Level 2 Automated Driving: An On-Road Study Comparing Drivers With and Without Experience With Partial Automation.

    PubMed

    Solís-Marcos, Ignacio; Ahlström, Christer; Kircher, Katja

    2018-05-01

    To investigate the influence of prior experience with Level 2 automation on additional task performance during manual and Level 2 partially automated driving. Level 2 automation is now on the market, but its effects on driver behavior remain unclear. Based on previous studies, we could expect an increase in drivers' engagement in secondary tasks during Level 2 automated driving, but it is yet unknown how drivers will integrate all the ongoing demands in such situations. Twenty-one drivers (12 without, 9 with Level 2 automation experience) drove on a highway manually and with Level 2 automation (exemplified by Volvo Pilot Assist generation 2; PA2) while performing an additional task. In half of the conditions, the task could be interrupted (self-paced), and in the other half, it could not (system-paced). Drivers' visual attention, additional task performance, and other compensatory strategies were analyzed. Driving with PA2 led to decreased scores in the additional task and more visual attention to the dashboard. In the self-paced condition, all drivers looked more to the task and perceived a lower mental demand. The drivers experienced with PA2 used the system and the task more than the novice group and performed more overtakings. The additional task interfered more with Level 2 automation than with manual driving. The drivers, particularly the automation novice drivers, used some compensatory strategies. Automation designers need to consider these potential effects in the development of future automated systems.

  2. Science Goal Monitor: Science Goal Driven Automation for NASA Missions

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Pell, Melissa; Matusow, David; Bailyn, Charles

    2004-01-01

    Infusion of automation technologies into NASA's future missions will be essential because of the need to: (1) effectively handle an exponentially increasing volume of scientific data, (2) successfully meet dynamic, opportunistic scientific goals and objectives, and (3) substantially reduce mission operations staff and costs. While much effort has gone into automating routine spacecraft operations to reduce human workload and hence costs, applying intelligent automation to the science side, i.e., science data acquisition, data analysis, and reactions to that data analysis in a timely and still scientifically valid manner, has been relatively under-emphasized. In order to introduce science-driven automation in missions, we must be able to capture and interpret the science goals of observing programs, represent those goals in machine-interpretable language, and allow spacecraft onboard systems to autonomously react to the scientists' goals. In short, we must teach our platforms to dynamically understand, recognize, and react to the scientists' goals. The Science Goal Monitor (SGM) project at NASA Goddard Space Flight Center is a prototype software tool being developed to determine the best strategies for implementing science goal driven automation in missions. The tools being developed in SGM improve the ability to monitor and react to the changing status of scientific events. The SGM system enables scientists to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of science data to identify occurrences of key events previously specified by the scientist. When an event occurs, the system autonomously coordinates the execution of the scientist's desired reactions. Through SGM, we will improve our understanding of the capabilities needed onboard for success, develop metrics to understand the potential increase in science returns, and develop an operational prototype so that the perceived risks associated with increased use of automation can be reduced.
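
    SGM's internal goal representation is not published in this abstract, so the following is only a generic sketch of the monitor-and-react pattern it describes: a goal pairs a machine-interpretable trigger with an autonomous reaction, and a monitor applies the goals to a data stream. All names and the threshold rule are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Iterable, List

@dataclass
class ScienceGoal:
    name: str
    trigger: Callable[[Dict], bool]    # machine-interpretable event predicate
    reaction: Callable[[Dict], None]   # autonomous response to the event

def monitor(stream: Iterable[Dict], goals: List[ScienceGoal]) -> None:
    """Scan a science data stream and fire the reaction of each matching goal."""
    for record in stream:
        for goal in goals:
            if goal.trigger(record):
                goal.reaction(record)

# e.g., re-observe any source that brightens to three times its baseline flux
brightening = ScienceGoal(
    name="transient brightening",
    trigger=lambda r: r["flux"] > 3.0 * r["baseline_flux"],
    reaction=lambda r: print(f"schedule follow-up of {r['source_id']}"),
)
monitor([{"source_id": "X1", "flux": 7.0, "baseline_flux": 2.0}], [brightening])
```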

  3. Text Simplification and Comprehensible Input: A Case for an Intuitive Approach

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, David; McNamara, Danielle S.

    2012-01-01

    Texts are routinely simplified to make them more comprehensible for second language learners. However, the effects of simplification upon the linguistic features of texts remain largely unexplored. Here we examine the effects of one type of text simplification: intuitive text simplification. We use the computational tool, Coh-Metrix, to examine…

  4. Automation for deep space vehicle monitoring

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.

    1991-01-01

    Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.

  5. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    Powell, Walter A. [STUDENT], GMU. Primary Topic: Track 5 – Experimentation and Analysis. Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet

  6. Human factors phase III : effects of train control technology on operator performance

    DOT National Transportation Integrated Search

    2005-01-01

    This report describes a study evaluating the effects of train control technology on locomotive engineer performance. Several types : of train control systems were evaluated: partial automation (cruise control and programmed stop) and full automation ...

  7. Human factors phase III : effects of train control technology on operator performance.

    DOT National Transportation Integrated Search

    2005-01-31

    This report describes a study evaluating the effects of train control technology on locomotive engineer performance. Several types of train control systems were evaluated: partial automation (cruise control and programmed stop) and full automation we...

  8. 77 FR 67816 - Proposed Agency Information Collection Activities; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ... information collection on respondents, including through the use of automated collection techniques or other... trades driven by retail investors versus those that are wholesale driven. This would yield information...

  9. Automated Indexing of the Hazardous Substances Data Bank (HSDB)

    PubMed Central

    Nuss, Carlo; Chang, Hua Florence; Moore, Dorothy; Fonger, George C.

    2003-01-01

    The Hazardous Substances Data Bank (HSDB), produced and maintained by the National Library of Medicine (NLM), contains over 4600 records on potentially hazardous chemicals. To enhance information retrieval from HSDB, NLM has undertaken the development of an automated HSDB indexing protocol as part of its Indexing Initiative. The NLM Indexing Initiative investigates methods whereby automated indexing may partially or completely substitute for human indexing. The poster’s purpose is to describe the HSDB Automated Indexing Project. PMID:14728459

  10. 76 FR 14906 - Laminated Woven Sacks From the People's Republic of China: Final Results of First Antidumping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... has added several case numbers to the Case Reference file within the Automated Commercial Environment... Vietnam: Preliminary Results, Preliminary Partial Rescission and Final Partial Rescission of the Second... the Socialist Republic of Vietnam: Final Results and Final Partial Rescission of Antidumping Duty...

  11. Simplified subsurface modelling: data assimilation and violated model assumptions

    NASA Astrophysics Data System (ADS)

    Erdal, Daniel; Lange, Natascha; Neuweiler, Insa

    2017-04-01

    Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come with larger numbers of unknowns and greater demands on computational resources compared to stand-alone models. If large model domains are to be represented, e.g. on the catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches reduce the ability to reproduce the processes that are present. This lack of model accuracy may be compensated for by using data assimilation methods. In these methods, observations are used to update the model states, and optionally the model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models, or whether they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D models solving the Richards equation. For simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D model and the unsaturated zone as a few sparse 1D columns, can be used within an ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedback between the two model compartments is large (e.g. a shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hillslope or pumping wells, creating lateral fluxes in the unsaturated zone, or strongly heterogeneous structures creating unaccounted flows in both the saturated and unsaturated compartments. Under such circumstances, direct modelling using a simplified model will not provide good results. However, a more data-driven (e.g. grey-box) approach, driven by the filter, may still provide an improved understanding of the system. Comparisons between full 3D simulations and simplified filter-driven models will be shown and the resulting benefits and drawbacks will be discussed.
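
    For readers unfamiliar with the update step referred to above, here is a minimal stochastic ensemble Kalman filter analysis in NumPy; the state layout and observation operator are illustrative assumptions, not the authors' coupled model.

```python
import numpy as np

def enkf_analysis(X, y, H, r_std, rng):
    """One stochastic ensemble Kalman filter update.
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; r_std: obs error std. dev.
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)         # predicted-obs anomalies
    Pxy = A @ HA.T / (n_ens - 1)                     # state-obs covariance
    Pyy = HA @ HA.T / (n_ens - 1) + r_std**2 * np.eye(len(y))
    K = Pxy @ np.linalg.inv(Pyy)                     # Kalman gain
    Y = y[:, None] + rng.normal(0.0, r_std, (len(y), n_ens))  # perturbed obs
    return X + K @ (Y - HX)

rng = np.random.default_rng(1)
X = rng.normal(5.0, 1.0, size=(3, 50))   # e.g., groundwater heads at 3 nodes
H = np.array([[1.0, 0.0, 0.0]])          # observe the first node only
print(enkf_analysis(X, np.array([4.2]), H, 0.1, rng).mean(axis=1))
```

    Even when the forward model is strongly simplified, this update pulls the ensemble toward the observations at each assimilation time, which is what makes the grey-box behaviour described above possible.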

  12. Building simplification algorithms based on user cognition in mobile environment

    NASA Astrophysics Data System (ADS)

    Shen, Jie; Shi, Junfei; Wang, Meizhen; Wu, Chenyan

    2008-10-01

    With the development of LBS, mobile maps should adaptively satisfy the cognitive requirements of the user. User cognition in a mobile environment is much more objective-oriented and imposes a heavier burden than in a static environment, and the holistic ideas and methods of map generalization are not fully suitable for mobile maps. This paper takes building simplification in habitation generalization as an example, analyzes the characteristics of user cognition in a mobile environment and the basic rules of building simplification, surveys the state of the art in building simplification algorithms for static and mobile environments, and puts forward the idea of hierarchical building simplification based on user cognition. The Hunan Road business district of Nanjing served as the test area, building data in ESRI shapefile format served as the test data, and the simplification algorithm was implemented. The method takes the user as the center, calculates the distance between the user and the building to be simplified, and uses that distance as the basis for choosing different simplification algorithms for different spaces. This contribution aims to present buildings hierarchically at different levels of detail through real-time simplification.
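
    The paper's specific algorithms are not given here, so the following is only a toy sketch of the user-centered selection idea: pick a simplification level from the user-to-building distance. The tier names and distance thresholds are hypothetical.

```python
import math

def choose_simplifier(user_pos, building_centroid, thresholds=(200.0, 800.0)):
    """Pick a simplification level from the user-to-building distance (m).
    A hypothetical three-tier scheme: nearby buildings keep detail, distant
    ones collapse progressively; the threshold values are illustrative only.
    """
    d = math.dist(user_pos, building_centroid)
    if d < thresholds[0]:
        return "edge-preserving simplification"   # keep wall structure
    if d < thresholds[1]:
        return "rectangle fit"                    # coarse footprint
    return "collapse to point symbol"

print(choose_simplifier((0.0, 0.0), (350.0, 120.0)))
```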

  13. Arctic sea ice albedo - A comparison of two satellite-derived data sets

    NASA Technical Reports Server (NTRS)

    Schweiger, Axel J.; Serreze, Mark C.; Key, Jeffrey R.

    1993-01-01

    Spatial patterns of mean monthly surface albedo for May, June, and July, derived from DMSP Operational Line Scan (OLS) satellite imagery are compared with surface albedos derived from the International Satellite Cloud Climatology Program (ISCCP) monthly data set. Spatial patterns obtained by the two techniques are in general agreement, especially for June and July. Nevertheless, systematic differences in albedo of 0.05 - 0.10 are noted which are most likely related to uncertainties in the simple parameterizations used in the DMSP analyses, problems in the ISCCP cloud-clearing algorithm and other modeling simplifications. However, with respect to the eventual goal of developing a reliable automated retrieval algorithm for compiling a long-term albedo data base, these initial comparisons are very encouraging.

  14. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  15. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials.

    PubMed

    Probst, Yasmine; Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-07-28

    Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although it was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, were being addressed. Automated modeling tools can streamline the modeling process for dietary intervention trials ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used.
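
    As a rough sketch of the constraint-optimization idea (not the DMT's actual formulation), the snippet below uses SciPy to find food-group servings that stay close to a conventional pattern while meeting macronutrient constraints; the composition matrix, targets, and bounds are illustrative placeholders.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative composition matrix: grams of (protein, fat, carbohydrate) per
# serving of (grains, vegetables, fruit, dairy, lean meat); the values are
# rough placeholders, not a published food composition database.
comp = np.array([[3.0, 1.0, 15.0],
                 [2.0, 0.2, 5.0],
                 [0.5, 0.3, 12.0],
                 [8.0, 4.0, 12.0],
                 [22.0, 5.0, 0.0]])
start = np.array([5.0, 5.0, 2.0, 2.5, 1.5])   # a conventional serving pattern

macros = lambda s: comp.T @ s                  # (protein, fat, carb) in grams
constraints = [
    {"type": "ineq", "fun": lambda s: macros(s)[0] - 80.0},   # protein >= 80 g
    {"type": "ineq", "fun": lambda s: 70.0 - macros(s)[1]},   # fat <= 70 g
    {"type": "ineq", "fun": lambda s: macros(s)[2] - 200.0},  # carb >= 200 g
    {"type": "ineq", "fun": lambda s: 300.0 - macros(s)[2]},  # carb <= 300 g
]

# Deviate as little as possible from the conventional pattern while meeting
# the nutrient constraints -- the essence of a modeled dietary prescription.
res = minimize(lambda s: float(np.sum((s - start) ** 2)), start,
               bounds=[(0.0, 12.0)] * 5, constraints=constraints)
print(np.round(res.x, 2), res.success)
```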

  16. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials

    PubMed Central

    Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-01-01

    Background Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. Objective This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Methods Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. Results The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although it was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, were being addressed. Conclusions Automated modeling tools can streamline the modeling process for dietary intervention trials ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used. PMID:27471104

  17. Complexity measures to track the evolution of a SNOMED hierarchy.

    PubMed

    Wei, Duo; Wang, Yue; Perl, Yehoshua; Xu, Junchuan; Halper, Michael; Spackman, Kent A

    2008-11-06

    SNOMED CT is an extensive terminology with an attendant amount of complexity. Two measures are proposed for quantifying that complexity. Both are based on abstraction networks, called the area taxonomy and the partial-area taxonomy, that provide, for example, distributions of the relationships within a SNOMED hierarchy. The complexity measures are employed specifically to track the complexity of versions of the Specimen hierarchy of SNOMED before and after it is put through an auditing process. The pre-audit and post-audit versions are compared. The results show that the auditing process indeed leads to a simplification of the terminology's structure.

  18. Recursive Definitions of Partial Functions and Their Computations

    DTIC Science & Technology

    1972-03-01


  19. Lower extremity EMG-driven modeling of walking with automated adjustment of musculoskeletal geometry

    PubMed Central

    Meyer, Andrew J.; Patten, Carolynn

    2017-01-01

    Neuromusculoskeletal disorders affecting walking ability are often difficult to manage, in part due to limited understanding of how a patient’s lower extremity muscle excitations contribute to the patient’s lower extremity joint moments. To assist in the study of these disorders, researchers have developed electromyography (EMG) driven neuromusculoskeletal models utilizing scaled generic musculoskeletal geometry. While these models can predict individual muscle contributions to lower extremity joint moments during walking, the accuracy of the predictions can be hindered by errors in the scaled geometry. This study presents a novel EMG-driven modeling method that automatically adjusts surrogate representations of the patient’s musculoskeletal geometry to improve prediction of lower extremity joint moments during walking. In addition to commonly adjusted neuromusculoskeletal model parameters, the proposed method adjusts model parameters defining muscle-tendon lengths, velocities, and moment arms. We evaluated our EMG-driven modeling method using data collected from a high-functioning hemiparetic subject walking on an instrumented treadmill at speeds ranging from 0.4 to 0.8 m/s. EMG-driven model parameter values were calibrated to match inverse dynamic moments for five degrees of freedom in each leg while keeping musculoskeletal geometry close to that of an initial scaled musculoskeletal model. We found that our EMG-driven modeling method incorporating automated adjustment of musculoskeletal geometry predicted net joint moments during walking more accurately than did the same method without geometric adjustments. Geometric adjustments improved moment prediction errors by 25% on average and up to 52%, with the largest improvements occurring at the hip. Predicted adjustments to musculoskeletal geometry were comparable to errors reported in the literature between scaled generic geometric models and measurements made from imaging data. Our results demonstrate that with appropriate experimental data, joint moment predictions for walking generated by an EMG-driven model can be improved significantly when automated adjustment of musculoskeletal geometry is included in the model calibration process. PMID:28700708

  20. Partially Decentralized Control Architectures for Satellite Formations

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Bauer, Frank H.

    2002-01-01

    In a partially decentralized control architecture, more than one but less than all nodes have supervisory capability. This paper describes an approach to choosing the number of supervisors in such an architecture, based on a reliability vs. cost trade. It also considers the implications of these results for the design of navigation systems for satellite formations that could be controlled with a partially decentralized architecture. Using an assumed cost model, analytic and simulation-based results indicate that it may be cheaper to achieve a given overall system reliability with a partially decentralized architecture containing only a few supervisors than with either fully decentralized or purely centralized architectures. Nominally, the subset of supervisors may act as centralized estimation and control nodes for corresponding subsets of the remaining subordinate nodes, and act as decentralized estimation and control peers with respect to each other. However, in the context of partially decentralized satellite formation control, the absolute positions and velocities of each spacecraft are unique, so that correlations which make estimates using only local information suboptimal occur only through common biases and process noise. Covariance and Monte Carlo analysis of a simplified system show that this lack of correlation may allow simplification of the local estimators while preserving the global optimality of the maneuvers commanded by the supervisors.

  1. Evaluation of Automated and Semi-Automated Scoring of Polysomnographic Recordings from a Clinical Trial Using Zolpidem in the Treatment of Insomnia

    PubMed Central

    Svetnik, Vladimir; Ma, Junshui; Soper, Keith A.; Doran, Scott; Renger, John J.; Deacon, Steve; Koblan, Ken S.

    2007-01-01

    Objective: To evaluate the performance of 2 automated systems, Morpheus and Somnolyzer24X7, with various levels of human review/editing, in scoring polysomnographic (PSG) recordings from a clinical trial using zolpidem in a model of transient insomnia. Methods: 164 all-night PSG recordings from 82 subjects, collected during 2 nights of sleep, one under placebo and one under zolpidem (10 mg) treatment, were used. For each recording, 6 different methods were used to provide sleep stage scores based on Rechtschaffen & Kales criteria: 1) full manual scoring, 2) automated scoring by Morpheus, 3) automated scoring by Somnolyzer24X7, 4) automated scoring by Morpheus with full manual review, 5) automated scoring by Morpheus with partial manual review, 6) automated scoring by Somnolyzer24X7 with partial manual review. Ten traditional clinical efficacy measures of sleep initiation, maintenance, and architecture were calculated. Results: Pair-wise epoch-by-epoch agreements between fully automated and manual scores were in the range of intersite manual scoring agreements reported in the literature (70%-72%). Pair-wise epoch-by-epoch agreements between automated scores manually reviewed were higher (73%-76%). The direction and statistical significance of treatment effect sizes using traditional efficacy endpoints were essentially the same whichever method was used. As the degree of manual review increased, the magnitude of the effect size approached those estimated with fully manual scoring. Conclusion: Automated or semi-automated sleep PSG scoring offers valuable alternatives to costly, time-consuming manual scoring, which varies within and between sites, especially in large multicenter clinical trials. Reduction in scoring variability may also reduce the sample size of a clinical trial. Citation: Svetnik V; Ma J; Soper KA; Doran S; Renger JJ; Deacon S; Koblan KS. Evaluation of automated and semi-automated scoring of polysomnographic recordings from a clinical trial using zolpidem in the treatment of insomnia. SLEEP 2007;30(11):1562-1574. PMID:18041489

  2. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
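
    The flight software's actual tables are not published here, so the following is only a toy illustration of the data-driven principle the abstract describes: the sequence behavior lives in a data table, and reconfiguring the automation means editing data rather than recompiling code. The mode and event names are invented, not Orion's.

```python
# A toy table-driven sequencer. The TRANSITIONS data, not the code, defines
# the automated sequence, so changing behavior only requires editing data.
TRANSITIONS = {
    ("coast", "burn_command"): "powered_flight",
    ("powered_flight", "engine_cutoff"): "coast",
    ("powered_flight", "abort"): "safe_hold",
}

def run_sequence(start_mode, events):
    """Walk the transition table, ignoring events with no defined transition."""
    mode = start_mode
    for event in events:
        nxt = TRANSITIONS.get((mode, event))
        if nxt is not None:
            print(f"{mode} --{event}--> {nxt}")
            mode = nxt
    return mode

print(run_sequence("coast", ["burn_command", "engine_cutoff",
                             "burn_command", "abort"]))
```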

  3. Is partially automated driving a bad idea? Observations from an on-road study.

    PubMed

    Banks, Victoria A; Eriksson, Alexander; O'Donoghue, Jim; Stanton, Neville A

    2018-04-01

    The automation of longitudinal and lateral control has enabled drivers to become "hands and feet free" but they are required to remain in an active monitoring state with a requirement to resume manual control if required. This represents the single largest system function allocation problem in vehicle automation, as the literature suggests that humans are notoriously inefficient at completing prolonged monitoring tasks. To further explore whether partially automated driving solutions can appropriately support the driver in completing their new monitoring role, video observations were collected as part of an on-road study using a Tesla Model S being operated in Autopilot mode. A thematic analysis of video data suggests that drivers are not being properly supported in adhering to their new monitoring responsibilities and instead demonstrate behaviour indicative of complacency and over-trust. These attributes may encourage drivers to take more risks whilst out on the road. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Automation Hooks Architecture for Flexible Test Orchestration - Concept Development and Validation

    NASA Technical Reports Server (NTRS)

    Lansdowne, C. A.; Maclean, John R.; Winton, Chris; McCartney, Pat

    2011-01-01

    The Automation Hooks Architecture Trade Study for Flexible Test Orchestration sought a standardized data-driven alternative to conventional automated test programming interfaces. The study recommended composing the interface using multicast DNS (mDNS/SD) service discovery, Representational State Transfer (RESTful) web services, and Automatic Test Markup Language (ATML). We describe additional efforts to rapidly mature the Automation Hooks Architecture candidate interface definition by validating it in a broad spectrum of applications. These activities have allowed us to further refine our concepts and provide observations directed toward objectives of economy, scalability, versatility, performance, severability, maintainability, scriptability and others.
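
    As a hedged sketch of the recommended interface style (not the study's actual schema), the snippet below shows how a test station's RESTful resource might be exercised once discovered; the host name, paths, and payload fields are all hypothetical.

```python
import requests

# Hypothetical instrument endpoint, written as if it had been discovered via
# mDNS/DNS-SD; the base URL and JSON fields are illustrative assumptions.
BASE = "http://test-stand.local:8080"

def start_measurement(channel: int, samples: int = 100) -> dict:
    """POST a measurement request and return the service's JSON reply."""
    resp = requests.post(f"{BASE}/measurements",
                         json={"channel": channel, "samples": samples},
                         timeout=5.0)
    resp.raise_for_status()
    return resp.json()          # e.g., {"id": "...", "status": "running"}

def get_status(measurement_id: str) -> dict:
    """Poll a running measurement by its resource URL."""
    resp = requests.get(f"{BASE}/measurements/{measurement_id}", timeout=5.0)
    resp.raise_for_status()
    return resp.json()
```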

  5. Development of an automated scanning monochromator for sensitivity calibration of the MUSTANG instrument

    NASA Astrophysics Data System (ADS)

    Rivers, Thane D.

    1992-06-01

    An automated scanning monochromator was developed using an Acton Research Corporation (ARC) monochromator, an Ealing photomultiplier tube, and a Macintosh PC in conjunction with LabVIEW software. The LabVIEW Virtual Instrument written to operate the ARC monochromator is a mouse-driven, user-friendly program developed for automated spectral data measurements. The resolution and sensitivity of the automated scanning monochromator system were determined experimentally. The automated monochromator was then used for spectral measurements of a platinum lamp. Additionally, the reflectivity curve for a BaSO4-coated screen has been measured. Reflectivity measurements indicate a large discrepancy with expected results. Further analysis of the reflectivity experiment is required for conclusive results.

  6. Practical simplifications for radioimmunotherapy dosimetric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, S.; DeNardo, G.L.; O'Donnell, R.T.

    1999-01-01

    Radiation dosimetry is potentially useful for assessment and prediction of efficacy and toxicity for radionuclide therapy. The usefulness of these dose estimates relies on the establishment of a dose-response model using accurate pharmacokinetic data and a radiation dosimetric model. Due to the complexity of radiation dose estimation, many practical simplifications have been introduced in the dosimetric modeling for clinical trials of radioimmunotherapy. Although research efforts are generally needed to improve the simplifications used at each stage of model development, practical simplifications are often possible for specific applications without significant consequences to the dose-response model. In the development of dosimetric methods for radioimmunotherapy, practical simplifications in the dosimetric models were introduced. This study evaluated the magnitude of uncertainty associated with practical simplifications for: (1) organ mass of the MIRD phantom; (2) radiation contribution from target alone; (3) interpolation of S value; (4) macroscopic tumor uniformity; and (5) fit of tumor pharmacokinetic data.

  7. Eye blink detection for different driver states in conditionally automated driving and manual driving using EOG and a driver camera.

    PubMed

    Schmidt, Jürgen; Laarousi, Rihab; Stolzmann, Wolfgang; Karrer-Gauß, Katja

    2018-06-01

    In this article, we examine the performance of different eye blink detection algorithms under various constraints. The goal of the present study was to evaluate the performance of an electrooculogram- and camera-based blink detection process in both manually and conditionally automated driving phases. A further comparison between alert and drowsy drivers was performed in order to evaluate the impact of drowsiness on the performance of blink detection algorithms in both driving modes. Data snippets from 14 monotonous manually driven sessions (mean 2 h 46 min) and 16 monotonous conditionally automated driven sessions (mean 2 h 45 min) were used. In addition to comparing two data-sampling frequencies for the electrooculogram measures (50 vs. 25 Hz) and four different signal-processing algorithms for the camera videos, we compared the blink detection performance of 24 reference groups. The analysis of the videos was based on very detailed definitions of eyelid closure events. The correct detection rates for the alert and manual driving phases (maximum 94%) decreased significantly in the drowsy (minus 2% or more) and conditionally automated (minus 9% or more) phases. Blinking behavior is therefore significantly impacted by drowsiness as well as by automated driving, resulting in less accurate blink detection.

  8. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0, as an automation language to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPM standard, a communication method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  9. Humans: still vital after all these years of automation.

    PubMed

    Parasuraman, Raja; Wickens, Christopher D

    2008-06-01

    The authors discuss empirical studies of human-automation interaction and their implications for automation design. Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years. Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation. Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user-controlled or system-driven. The past three decades of empirical research on humans and automation have provided a strong science base that can be used to guide the design of automated systems. This research can be applied to most current and future automated systems.

  10. Rasch-Master's Partial Credit Model in the assessment of children's creativity in drawings.

    PubMed

    Nakano, Tatiana de Cássia; Primi, Ricardo

    2014-01-01

    The purpose of the present study was to use the Partial Credit Model to study the factors of the Test of Creativity in Children and identify which characteristics of the creative person would be most effective at differentiating subjects according to their ability level. A sample of 1426 students from first to eighth grades answered the instrument. The Partial Credit Model was used to estimate the abilities of the subjects and the item difficulties on a common scale for each of the four factors, indicating which items required a higher level of creativity to be scored and which would differentiate the more creative individuals. The results demonstrated that the greater part of the characteristics showed good fit indices, with both infit and outfit values between 0.80 and 1.30, indicating a response pattern consistent with the model. The characteristics of Unusual Perspective, Expression of Emotion and Originality were identified as better predictors of creative performance because they require a greater ability level (usually above two standard deviations). These results may be used in the future development of a reduced form of the instrument or a simplification of the current correction model.

  11. Identifying and locating surface defects in wood: Part of an automated lumber processing system

    Treesearch

    Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa

    1983-01-01

    Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...

  12. Simplification.

    ERIC Educational Resources Information Center

    George, H. V.

    This article discusses language simplification as one aspect of a person's speech activity, and relates simplification to second language learning. Translation from language to language and translation within one language are processes through which a person, as decoder, decontextualizes a message form-sequence through perception of its…

  13. Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields.

    PubMed

    Skraba, Primoz; Bei Wang; Guoning Chen; Rosen, Paul

    2015-08-01

    Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.

  14. Dynamic simulation solves process control problem in Oman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-11-16

    A dynamic simulation study solved the process control problems for a Saih Rawl, Oman, gas compressor station operated by Petroleum Development of Oman (PDO). PDO encountered persistent compressor failure that caused frequent facility shutdowns, oil production deferment, and gas flaring. It commissioned MSE (Consultants) Ltd., U.K., to find a solution for the problem. Saih Rawl, about 40 km from Qarn Alam, produces oil and associated gas from a large number of low- and high-pressure wells. Oil and gas are separated in three separators. The oil is pumped to Qarn Alam for treatment and export. Associated gas is compressed in two parallel trains. Train K-1115 is a 350,000 standard cu m/day, four-stage reciprocating compressor driven by a fixed-speed electric motor. Train K-1120 is a 1 million standard cu m/day, four-stage centrifugal compressor driven by a variable-speed motor. The paper describes tripping and surging problems with the gas compressor and the control simplifications that solved the problem.

  15. Just like the rest of evolution in Mother Nature, the evolution of cancers may be driven by natural selection, and not by haphazard mutations

    PubMed Central

    Zhang, Ju; Lou, Xiaomin; Zellmer, Lucas; Liu, Siqi; Xu, Ningzhi; Liao, D. Joshua

    2014-01-01

    Sporadic carcinogenesis starts from immortalization of a differentiated somatic cell or an organ-specific stem cell. The immortalized cell incepts a new or quasinew organism that lives like a parasite in the patient and usually proceeds to progressive simplification, constantly engendering intermediate organisms that are simpler than normal cells. Like organismal evolution in Mother Nature, this cellular simplification is a process of Darwinian selection of those mutations with growth or survival advantages from the numerous ones that occur randomly and stochastically. Therefore, functional gain of growth- or survival-sustaining oncogenes and functional loss of differentiation-sustaining tumor suppressor genes, which are hallmarks of cancer cells and contribute to phenotypes of greater malignancy, are not drivers of carcinogenesis but results of natural selection of advantageous mutations. Besides this mutation-load-dependent survival mechanism, which is evolutionarily low and of an asexual nature, cancer cells may also use cell fusion for survival, which is an evolutionarily higher mechanism and is of a sexual nature. Assigning oncogenes or tumor suppressor genes or their mutants as drivers to induce cancer in animals may somewhat coerce them to create man-made oncogenic pathways that may not reflect the actual course of sporadic cancer formation in humans. PMID:25594068

  16. Effects on Text Simplification: Evaluation of Splitting up Noun Phrases

    PubMed Central

    Leroy, Gondy; Kauchak, David; Hogue, Alan

    2016-01-01

    To help increase health literacy, we are developing a text simplification tool that creates more accessible patient education materials. Tool development is guided by data-driven feature analysis comparing simple and difficult text. In the present study, we focus on the common advice to split long noun phrases. Our previous corpus analysis showed that easier texts contained shorter noun phrases. Subsequently, we conducted a user study to measure the difficulty of sentences containing noun phrases of different lengths (2-gram, 3-gram and 4-gram), conditions (split or not) and, to simulate unknown terms, use of pseudowords (present or not). We gathered 35 evaluations for 30 sentences in each condition (3×2×2 conditions) on Amazon's Mechanical Turk (N=12,600). We conducted a three-way ANOVA for perceived and actual difficulty. Splitting noun phrases had a positive effect on perceived difficulty but a negative effect on actual difficulty. The presence of pseudowords increased perceived and actual difficulty. Without pseudowords, longer noun phrases led to increased perceived and actual difficulty. A follow-up study using the phrases (N = 1,350) showed that measuring awkwardness may indicate when to split noun phrases. We conclude that splitting noun phrases benefits perceived difficulty, but hurts actual difficulty when the phrasing becomes less natural. PMID:27043754

  17. How to assess driver's interaction with partially automated driving systems - A framework for early concept assessment.

    PubMed

    van den Beukel, Arie P; van der Voort, Mascha C

    2017-03-01

    The introduction of partially automated driving systems changes the driving task into supervising the automation with an occasional need to intervene. To develop interface solutions that adequately support drivers in this new role, this study proposes and evaluates an assessment framework that allows designers to evaluate driver support within relevant real-world scenarios. Aspects identified as requiring assessment in terms of driver support within the proposed framework are Accident Avoidance, gained Situation Awareness (SA) and Concept Acceptance. Measurement techniques selected to operationalise these aspects and the associated framework were pilot-tested with twenty-four participants in a driving simulator experiment. The objective of the test was to determine the reliability of the applied measurements for the assessment of the framework and whether the proposed framework is effective in predicting the level of support offered by the concepts. Based on the congruency between measurement scores produced in the test and scores with predefined differences in concept support, this study demonstrates the framework's reliability. A remaining concern is the framework's weak sensitivity to small differences in offered support. The article concludes that applying the framework is especially advantageous for evaluating early design phases and can contribute to the efficient development of interfaces that keep drivers in control and able to operate partially automated vehicles safely. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Simplifications of the mini nutritional assessment short-form are predictive of mortality among hospitalized young and middle-aged adults.

    PubMed

    Asiimwe, Stephen B

    2016-01-01

    Measuring malnutrition in hospitalized patients is difficult in all settings. I evaluated associations of items in the mini nutritional assessment short-form (MNA-sf), a nutritional-risk screening tool previously validated in the elderly, with malnutrition among hospitalized patients in Uganda. I used results to construct two simplifications of this tool that may be applicable to young and middle-aged adults. I assessed the association of each MNA-sf item with the mid-upper arm circumference (MUAC), a specific measure of malnutrition at appropriate cut-offs. I incorporated only malnutrition-specific items into the proposed simplifications scoring each item according to its association with malnutrition. I assessed numbers classified to different score-levels by the simplifications and, via proportional hazards regression, how the simplifications predicted in-hospital mortality. I analyzed 318 patients (median age 37, interquartile range 27 to 56). Variables making it into the simplifications were: reduced food intake, weight loss, mobility, and either BMI in kg/m(2) (categorized as <16, 16 to 16.9, and ≥17) or MUAC in centimeters (categorized as <16 or <17, 16 to 18.9 or 17 to 19.9, and ≥19 or ≥20 for females and males respectively). Compared to the traditional MNA-sf, the simplifications classified fewer patients as malnourished, yet remained strongly predictive of in-hospital mortality. In the MUAC-incorporating simplification, malnourished patients had 3.8-fold (95% CI 1.9 to 7.8) higher risk of in-hospital death than those not malnourished; adjusting for age, sex, and HIV status. The MNA-sf simplifications described may provide an improved measure of malnutrition in hospitalized young and middle-aged adults. Copyright © 2016 Elsevier Inc. All rights reserved.
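
    A sketch of how the abstract's MUAC categorization could be coded; the cutoffs follow the abstract, but the function name and the numeric score levels are hypothetical, since the scoring weights are not given:

      def muac_category(muac_cm, sex):
          # Cutoffs from the abstract: <16 / 16-18.9 / >=19 cm (females),
          # <17 / 17-19.9 / >=20 cm (males); lower MUAC = worse nutrition.
          low, high = (16.0, 19.0) if sex == "F" else (17.0, 20.0)
          if muac_cm < low:
              return 2    # hypothetical score: most malnourished band
          if muac_cm < high:
              return 1
          return 0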

  19. Fluctuating hydrodynamics, current fluctuations, and hyperuniformity in boundary-driven open quantum chains

    NASA Astrophysics Data System (ADS)

    Carollo, Federico; Garrahan, Juan P.; Lesanovsky, Igor; Pérez-Espigares, Carlos

    2017-11-01

    We consider a class of either fermionic or bosonic noninteracting open quantum chains driven by dissipative interactions at the boundaries and study the interplay of coherent transport and dissipative processes, such as bulk dephasing and diffusion. Starting from the microscopic formulation, we show that the dynamics on large scales can be described in terms of fluctuating hydrodynamics. This is an important simplification as it allows us to apply the methods of macroscopic fluctuation theory to compute the large deviation (LD) statistics of time-integrated currents. In particular, this permits us to show that fermionic open chains display a third-order dynamical phase transition in LD functions. We show that this transition is manifested in a singular change in the structure of trajectories: while typical trajectories are diffusive, rare trajectories associated with atypical currents are ballistic and hyperuniform in their spatial structure. We confirm these results by numerically simulating ensembles of rare trajectories via the cloning method, and by exact numerical diagonalization of the microscopic quantum generator.
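
    For context, the generic fluctuating-hydrodynamics equation used in macroscopic fluctuation theory for a one-dimensional diffusive system of size L reads, in LaTeX (a standard textbook form; the paper derives the chain-specific diffusivity D and mobility sigma):

      \partial_t \rho = \partial_x\bigl( D(\rho)\,\partial_x \rho \bigr)
        + \frac{1}{\sqrt{L}}\,\partial_x\bigl( \sqrt{\sigma(\rho)}\,\xi \bigr),
      \qquad
      \langle \xi(x,t)\,\xi(x',t') \rangle = \delta(x-x')\,\delta(t-t').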

  20. Fluctuating hydrodynamics, current fluctuations, and hyperuniformity in boundary-driven open quantum chains.

    PubMed

    Carollo, Federico; Garrahan, Juan P; Lesanovsky, Igor; Pérez-Espigares, Carlos

    2017-11-01

    We consider a class of either fermionic or bosonic noninteracting open quantum chains driven by dissipative interactions at the boundaries and study the interplay of coherent transport and dissipative processes, such as bulk dephasing and diffusion. Starting from the microscopic formulation, we show that the dynamics on large scales can be described in terms of fluctuating hydrodynamics. This is an important simplification as it allows us to apply the methods of macroscopic fluctuation theory to compute the large deviation (LD) statistics of time-integrated currents. In particular, this permits us to show that fermionic open chains display a third-order dynamical phase transition in LD functions. We show that this transition is manifested in a singular change in the structure of trajectories: while typical trajectories are diffusive, rare trajectories associated with atypical currents are ballistic and hyperuniform in their spatial structure. We confirm these results by numerically simulating ensembles of rare trajectories via the cloning method, and by exact numerical diagonalization of the microscopic quantum generator.

  1. Simplification: A Viewpoint in Outline. Appendix.

    ERIC Educational Resources Information Center

    Tickoo, Makhan L.

    This essay examines language simplification for second language learners as a linguistic and a pedagogic phenomenon, posing questions for further study by considering past research. It discusses linguistic simplification (LS) in relation to the development of artificial languages, such as Esperanto, "pidgin" languages, Basic English,…

  2. 7 CFR 3015.311 - Simplification, consolidation, or substitution of State plans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Simplification, consolidation, or substitution of... (Continued) OFFICE OF THE CHIEF FINANCIAL OFFICER, DEPARTMENT OF AGRICULTURE UNIFORM FEDERAL ASSISTANCE... Simplification, consolidation, or substitution of State plans. (a) As used in this section: (1) Simplify means...

  3. Simulation of an ammonia-water heat pump water heater with combustion products-driven evaporator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Blanco, Horacio; Gluesenkamp, K.; Ally, Moonis Raza

    Here, the objective of this work is to simulate a single-effect (SE) ammonia-water heat pump for domestic water heating, with innovative aspects for cycle simulation and eventual implementation. Seasonal temperature variations demand verification of distillation column viability. For the given application and temperature ranges, it is found that some variables need to be controlled if the same column is to be used all year round. In addition, a number of simplifications are considered in this work: an advanced evaporator requiring minimal gas flow and surface area, subcooling at two crucial spots of the cycle, and the viability of some pump designs to assuage cavitation issues.

  4. Simulation of an ammonia-water heat pump water heater with combustion products-driven evaporator

    DOE PAGES

    Perez-Blanco, Horacio; Gluesenkamp, K.; Ally, Moonis Raza

    2016-12-19

    Here, the objective of this work is to simulate a single-effect (SE) ammonia-water heat pump for domestic water heating, with innovative aspects for cycle simulation and eventual implementation. Seasonal temperature variations demand verification of distillation column viability. For the given application and temperature ranges, it is found that some variables need to be controlled if the same column is to be used all year round. In addition, a number of simplifications are considered in this work: an advanced evaporator requiring minimal gas flow and surface area, subcooling at two crucial spots of the cycle, and the viability of some pump designs to assuage cavitation issues.

  5. Results from the first fully automated PBS-mask process and pelliclization

    NASA Astrophysics Data System (ADS)

    Oelmann, Andreas B.; Unger, Gerd M.

    1994-02-01

    Automation is widely discussed in IC- and mask-manufacturing and partially realized everywhere. The idea for the automation goes back to 1978, when it turned out that the operators for the then newly installed PBS-process-line (the first in Europe) should be trained to behave like robots for particle reduction gaining lower defect densities on the masks. More than this goal has been achieved. It turned out recently, that the automation with its dedicated work routes and detailed documentation of every lot (individual mask or reticle) made it easy to obtain the CEEC certificate which includes ISO 9001.

  6. Diffusion tensor driven contour closing for cell microinjection targeting.

    PubMed

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2010-01-01

    This article introduces a novel approach to robust automatic detection of unstained living cells in bright-field (BF) microscope images, with the goal of producing a target list for an automated microinjection system. The overall image analysis process is described and includes preprocessing, ridge enhancement, image segmentation, shape analysis and injection point definition. The developed algorithm implements a new version of anisotropic contour completion (ACC) based on the partial differential equation (PDE) for heat diffusion, which improves the cell segmentation process by elongating the edges only along their tangent direction. The developed ACC algorithm is equivalent to a dilation of the binary edge image with a continuous elliptic structural element that takes into account the local orientation of the contours, preventing extension in the normal direction. Experiments carried out on real images of 10-50 μm CHO-K1 adherent cells show remarkable reliability, with up to 85% success for cell detection and injection point definition.
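
    A rough Python approximation of the tangent-only dilation idea using rotated line-shaped structuring elements; the paper's ACC is a PDE formulation, so this discrete sketch (function name and parameters hypothetical) only mimics its effect:

      import cv2
      import numpy as np

      def tangent_dilate(edges, length=7, n_orient=8):
          # Estimate local edge orientation, then dilate each edge pixel
          # only along its tangent with a line kernel rotated to match.
          e = edges.astype(np.float32)
          gx = cv2.Sobel(e, cv2.CV_32F, 1, 0, ksize=3)
          gy = cv2.Sobel(e, cv2.CV_32F, 0, 1, ksize=3)
          tangent = (np.arctan2(gy, gx) + np.pi / 2) % np.pi
          out = np.zeros(edges.shape, np.uint8)
          c = (length - 1) / 2
          for k in range(n_orient):
              theta = k * np.pi / n_orient
              sel = (edges > 0) & (np.abs(tangent - theta) < np.pi / n_orient)
              kern = np.zeros((length, length), np.uint8)
              dx, dy = np.cos(theta) * c, np.sin(theta) * c
              cv2.line(kern, (int(round(c - dx)), int(round(c - dy))),
                       (int(round(c + dx)), int(round(c + dy))), 1, 1)
              out |= cv2.dilate(sel.astype(np.uint8), kern)
          return out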

  7. NMR spectroscopic properties (1H at 500 MHz) of deuterated* ribonucleotide-dimers ApU*, GpC*, partially deuterated 2'-deoxyribonucleotide-dimers d(TpA*), d(ApT*), d(GpC*) and their comparison with natural counterparts (1H-NMR window).

    PubMed

    Földesi, A; Nilson, F P; Glemarec, C; Gioeli, C; Chattopadhyaya, J

    1993-02-01

    Pure 1'#,2',3',4'#,5',5''-2H6-ribonucleoside derivatives 10-14, 1'#,2',2'',3',4'#,5',5''-2H7-2'-deoxynucleoside blocks 15-18 and their natural-abundance counterparts were used to assemble partially deuterated ribonucleotide-dimers (* indicates deuteration at 1'#,2',3',4'#,5',5''(2H6)): ApU* 21, GpC* 22 and partially deuterated 2'-deoxyribonucleotide-dimers d(TpA*) 23, d(ApT*) 25, d(GpC*) 26 (* indicates deuteration at 1'#,2',2'',3',4'#,5',5''(2H7)) according to the procedure described by Földesi et al. (Tetrahedron, in press). These five partially deuterated oligonucleotides were subsequently compared with their corresponding natural-abundance counterparts by 500 MHz 1H-NMR spectroscopy to evaluate the actual NMR simplifications achieved in the non-deuterated part (1H-NMR window) as a result of specific deuterium incorporation. Detailed one-dimensional 1H-NMR (500 MHz), two-dimensional correlation spectra (DQF-COSY and TOCSY) and deuterium isotope effect on the chemical shifts of oligonucleotides have been presented.

  8. Utilization of paramagnetic microparticles for automated isolation of free circulating mRNA as a new tool in prostate cancer diagnostics.

    PubMed

    Fojtu, Michaela; Gumulec, Jaromir; Balvan, Jan; Raudenska, Martina; Sztalmachova, Marketa; Polanska, Hana; Smerkova, Kristyna; Adam, Vojtech; Kizek, Rene; Masarik, Michal

    2014-02-01

    Determination of serum mRNA has gained a lot of attention in recent years, particularly from the perspective of disease markers. Streptavidin-modified paramagnetic particles (SMPs) are an interesting technique, mainly due to the possibility of automated isolation and high efficiency. The aim of this study was to optimize a serum isolation protocol to reduce the consumption of chemicals and sample volume. The following factors were optimized: amounts of (i) paramagnetic particles, (ii) oligo(dT)20 probe, (iii) serum, and (iv) the binding sequence (SMPs, oligo(dT)20, serum vs. oligo(dT)20, serum and SMPs). RNA content was measured, and the expression of metallothionein-2A as a possible prostate cancer marker was analyzed to demonstrate measurable RNA content suitable for RT-PCR detection. Isolation is possible across a serum volume range of 10-200 μL without altering efficiency or purity. The amount of SMPs can be reduced to as little as 5 μL, with optimal results within 10-30 μL of SMPs. The volume of oligo(dT)20 does not affect efficiency when used within 0.1-0.4 μL. This optimized protocol was also modified to fit the needs of automated one-step single-tube analysis, with efficiency identical to the conventional setup. The one-step analysis protocol is considered a promising simplification, making RNA isolation suitable for an automatable process. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Influence of prepreg characteristics on stamp consolidation

    NASA Astrophysics Data System (ADS)

    Slange, T. K.; Warnet, L. L.; Grouve, W. J. B.; Akkerman, R.

    2017-10-01

    Stamp forming is a rapid manufacturing technology used to shape flat blanks of thermoplastic composite material into three-dimensional components. The development of automated lay-up technologies further extends the applicability of stamp forming by allowing rapid lay-up of tailored blanks and partial preconsolidation. This partial preconsolidation makes the influence of the prepreg more critical compared to conventional preconsolidation methods, which provide full preconsolidation. This paper aims to highlight consolidation challenges that can appear when stamp forming blanks manufactured by automated lay-up. Important prepreg characteristics were identified based on an experimental study in which various prepregs were compared in their as-received, deconsolidated and stamp-consolidated states. It was found that small thickness variations across the width of a prepreg can add up when plies are stacked into a blank by automated lay-up, causing non-uniform consolidation. Additionally, deconsolidation of the prepreg does not seem to obstruct interlaminar bonding, while intralaminar voids initially present in a prepreg cannot be removed during stamp forming. An additional preconsolidation step after automated lay-up seems necessary to remove blank thickness variations and intralaminar voids for the current prepregs. Eliminating this process step and successfully combining rapid automated lay-up with stamp forming requires prepregs which are void-free and have less thickness variation.

  10. Building Flexible User Interfaces for Solving PDEs

    NASA Astrophysics Data System (ADS)

    Logg, Anders; Wells, Garth N.

    2010-09-01

    FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
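
    A minimal sketch of the kind of high-level specification the note describes, using the legacy DOLFIN Python interface; the nonlinear coefficient and source term are illustrative choices, not necessarily the note's exact model problem:

      from dolfin import *

      mesh = UnitSquareMesh(32, 32)
      V = FunctionSpace(mesh, "Lagrange", 1)
      u = Function(V)            # unknown; nonlinear problem, so not TrialFunction
      v = TestFunction(V)
      f = Expression("x[0]*sin(x[1])", degree=2)

      # Nonlinear Poisson: -div((1 + u^2) grad u) = f, with u = 0 on the boundary
      F = (1 + u**2) * inner(grad(u), grad(v)) * dx - f * v * dx
      bc = DirichletBC(V, Constant(0.0), "on_boundary")
      solve(F == 0, u, bc)       # Newton iteration handled automatically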

  11. Pilot workload, performance and aircraft control automation

    NASA Technical Reports Server (NTRS)

    Hart, S. G.; Sheridan, T. B.

    1984-01-01

    Conceptual and practical issues associated with the design, operation, and performance of advanced systems and the impact of such systems on the human operators are reviewed. The development of highly automated systems is driven by the availability of new technology and the requirement that operators safely and economically perform more and more activities in increasingly difficult and hostile environments. It is noted that operator workload may become a major area of concern in future design considerations. Little research has been done to determine how automation and workload relate to each other, although it is assumed that the abstract, supervisory, or management roles performed by operators of highly automated systems will impose increased mental workload. The relationship between performance and workload is discussed in relation to highly complex and automated environments.

  12. Explicitly correlated coupled-cluster theory using cusp conditions. II. Treatment of connected triple excitations.

    PubMed

    Köhn, Andreas

    2010-11-07

    The coupled-cluster singles and doubles method augmented with single Slater-type correlation factors (CCSD-F12) determined by the cusp conditions (also denoted as the SP ansatz) yields results close to the basis set limit with only small overhead compared to conventional CCSD. Quantitative calculations on many-electron systems, however, require the inclusion of at least the effect of connected triple excitations. In this contribution, the recently proposed [A. Köhn, J. Chem. Phys. 130, 131101 (2009)] extended SP ansatz and its application to the noniterative triples correction CCSD(T) are reviewed. The approach allows explicit correlation to be included in connected triple excitations without introducing additional unknown parameters. The explicit expressions are presented and analyzed, and possible simplifications to arrive at a computationally efficient scheme are suggested. Numerical tests based on an implementation obtained by an automated approach are presented. Using a partial wave expansion for the neon atom, we can show that the proposed ansatz indeed leads to the expected (Lmax+1)^(-7) convergence of the noniterative triples correction, where Lmax is the maximum angular momentum in the orbital expansion. Further results are reported for a test set of 29 molecules, employing Peterson's F12-optimized basis sets. We find that the customary approach of using the conventional noniterative triples correction on top of a CCSD-F12 calculation leads to significant basis set errors. This, however, is not always directly visible in total CCSD(T) energies due to fortuitous error compensation. The new approach offers a thoroughly explicitly correlated CCSD(T)-F12 method with improved basis set convergence of the triples contributions to both total and relative energies.

  13. Reflection of a Year Long Model-Driven Business and UI Modeling Development Project

    NASA Astrophysics Data System (ADS)

    Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha

    Model-driven software development enables users to specify an application at a high level - a level that better matches the problem domain. It also promises users better analysis and automation. Our work brings together two collaborating domains - business processes and human interactions - to build an application. Business modeling expresses business operations and flows and then creates the business flow implementation. Human interaction modeling expresses a UI design and its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year spent building a procurement outsourcing contract application with it; the result was deployed in December 2008. The paper discusses the happy endings and some heartache in multiple areas. We end with insights on how a model-driven approach could do better for the humans in the process.

  14. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for the development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real

  15. VEST: Abstract Vector Calculus Simplification in Mathematica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Squire, J. Burby and H. Qin

    2013-03-12

    We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce scalar and vector expressions of a very general type using a systematic canonicalization procedure. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by canonicalization, subsequently applying these to simplify large expressions. In a companion paper [1], we employ VEST in the automation of the calculation of Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.

  16. Toward the automated analysis of plasma physics problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynick, H.E.

    1989-04-01

    A program (CALC) is described, which carries out nontrivial plasma physics calculations in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base, and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decompositions of the full problem into subproblems, and other simplifications in form, which renders the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.

  17. VEST: Abstract vector calculus simplification in Mathematica

    NASA Astrophysics Data System (ADS)

    Squire, J.; Burby, J.; Qin, H.

    2014-01-01

    We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce three-dimensional scalar and vector expressions of a very general type to a well defined standard form. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by reduction, subsequently applying these to simplify large expressions. In a companion paper Burby et al. (2013) [12], we employ VEST in the automation of the calculation of high-order Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
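
    VEST itself is a Mathematica package, but the flavor of Levi-Civita-based identity derivation can be sketched in Python with SymPy; this verifies the BAC-CAB identity by explicit index summation (all names here are illustrative and unrelated to VEST's own API):

      import sympy as sp
      from sympy import LeviCivita

      a, b, c = sp.symbols('a1:4'), sp.symbols('b1:4'), sp.symbols('c1:4')

      def dot(u, v):
          return sum(u[i] * v[i] for i in range(3))

      def cross(u, v):
          # (u x v)_i = epsilon_{ijk} u_j v_k, written as an explicit index sum
          return [sum(LeviCivita(i, j, k) * u[j] * v[k]
                      for j in range(3) for k in range(3)) for i in range(3)]

      lhs = cross(a, cross(b, c))
      rhs = [b[i] * dot(a, c) - c[i] * dot(a, b) for i in range(3)]
      assert all(sp.simplify(l - r) == 0 for l, r in zip(lhs, rhs))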

  18. Test particle propagation in magnetostatic turbulence. 2: The local approximation method

    NASA Technical Reports Server (NTRS)

    Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.

    1976-01-01

    An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The n'th level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0 the most local first order partial differential equation which governs the Markovian limit is regained.

  19. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  20. Simplifications and Idealizations in High School Physics in Mechanics: A Study of Slovenian Curriculum and Textbooks

    ERIC Educational Resources Information Center

    Forjan, Matej; Sliško, Josip

    2014-01-01

    This article presents the results of an analysis of three Slovenian textbooks for high school physics, from the point of view of simplifications and idealizations in the field of mechanics. In modeling of physical systems, making simplifications and idealizations is important, since one ignores minor effects and focuses on the most important…

  1. Complicity as Infiltration: The (Im)possibilities of Research with/in NSF Engineering Grants in the Age of Neoliberal Scientism

    ERIC Educational Resources Information Center

    Daza, Stephanie Lynn

    2012-01-01

    In this article, I use a Spivakian decolonizing perspective to take simplification to task in two ways--the simplification of methodology with/in grants and the simplification of critique that skirts the impossibility of noncomplicitous research and researchers. I posit that "neoliberal scientism"'s grants culture is colonizing research--narrowly…

  2. A Comprehensive Optimization Strategy for Real-time Spatial Feature Sharing and Visual Analytics in Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Li, W.; Shao, H.

    2017-12-01

    For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response and decision-making. Especially for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications, their rich geometry and property information facilitates the development of interactive, efficient and intelligent data analysis and visualization applications. However, big-data issues have hindered the wide adoption of vector datasets in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: 1) pre- and on-the-fly generalization, which automatically determines the proper simplification level through the introduction of an appropriate distance tolerance (ADT) to meet various visualization requirements while speeding up simplification; 2) a progressive attribute transmission method to reduce data size and therefore the service response time; 3) compressed data transmission and dynamic adoption of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to widen the use of web services providing vector data to support real-time spatial feature sharing, visual analytics and decision-making.
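
    The generalization step rests on distance-tolerance line simplification; a minimal sketch with Shapely's Douglas-Peucker-style simplify (the coordinates and tolerances are made up, and the abstract's ADT selection logic is not reproduced):

      from shapely.geometry import LineString

      line = LineString([(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 5.1), (5, 5)])
      for tol in (0.2, 1.0, 5.0):    # coarser tolerance -> fewer vertices kept
          simplified = line.simplify(tol, preserve_topology=True)
          print(tol, len(simplified.coords))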

  3. Mechanizing the Merc: The Chicago Mercantile Exchange and the Rise of High-Frequency Trading.

    PubMed

    MacKenzie, Donald

    2015-07-01

    This article investigates one important strand in the evolution of today's high-frequency trading or HFT (the fast, automated trading of large numbers of financial securities). That strand is the history of the automation of trading on what has become the world's most prominent futures exchange, the Chicago Mercantile Exchange or Merc. The process of the automation of the Merc was episodic, often driven by responses to perceived external threats, and involved both "local" politics and transnational considerations. The article discusses the relationship between the Merc's automation and the embodied, deeply social trading practices of the Merc's open-outcry trading pits, and compares how the Merc was mechanized with the quite different-and in a sense more explicitly "social"-project of automation launched by the Merc's rival, the Chicago Board of Trade.

  4. A data-driven multiplicative fault diagnosis approach for automation processes.

    PubMed

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.

  5. Gaseous Viscous Peeling of Linearly Elastic Substrates

    NASA Astrophysics Data System (ADS)

    Elbaz, Shai; Jacob, Hila; Gat, Amir

    2017-11-01

    We study pressure-driven propagation of gas into a micron-scale gap between two linearly elastic substrates. Applying the lubrication approximation, the governing nonlinear evolution equation describes the interaction between elasticity and viscosity, as well as weak rarefaction and low-Mach-number compressibility, characteristic to gaseous microflows. Several physical limits allow simplification of the evolution equation and enable solution by self-similarity. During the peeling process the flow-field transitions between the different limits and the respective approximate solutions. The sequence of limits occurring during the propagation dynamics can be related to the thickness of the prewetting layer of the configuration at rest, yielding an approximate description of the entire peeling dynamics. The results are validated by numerical solutions of the evolution equation. Israel Science Foundation 818/13.

  6. Automating concept identification in the electronic medical record: an experiment in extracting dosage information.

    PubMed Central

    Evans, D. A.; Brownlow, N. D.; Hersh, W. R.; Campbell, E. M.

    1996-01-01

    We discuss the development and evaluation of an automated procedure for extracting drug-dosage information from clinical narratives. The process was developed rapidly using existing technology and resources, including categories of terms from UMLS96. Evaluations over a large training set and a smaller test set of medical records demonstrate an approximately 80% rate of exact and partial matches on target phrases, with few false positives and a modest rate of false negatives. The results suggest a strategy for automating general concept identification in electronic medical records. PMID:8947694
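
    A toy pattern-based extractor in Python for flavor; the published process used UMLS term categories rather than a single regular expression, so this pattern and example are purely illustrative:

      import re

      # Hypothetical pattern: drug name, numeric dose, unit
      DOSAGE = re.compile(
          r"(?P<drug>[A-Za-z][A-Za-z-]+)\s+"
          r"(?P<dose>\d+(?:\.\d+)?)\s*"
          r"(?P<unit>mg|mcg|g|mL|units?)\b",
          re.IGNORECASE,
      )

      text = "Patient was started on metoprolol 25 mg twice daily."
      for m in DOSAGE.finditer(text):
          print(m.group("drug"), m.group("dose"), m.group("unit"))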

  7. Good vibrations: tactile feedback in support of attention allocation and human-automation coordination in event-driven domains.

    PubMed

    Sklar, A E; Sarter, N B

    1999-12-01

    Observed breakdowns in human-machine communication can be explained, in part, by the nature of current automation feedback, which relies heavily on focal visual attention. Such feedback is not well suited for capturing attention in case of unexpected changes and events or for supporting the parallel processing of large amounts of data in complex domains. As suggested by multiple-resource theory, one possible solution to this problem is to distribute information across various sensory modalities. A simulator study was conducted to compare the effectiveness of visual, tactile, and redundant visual and tactile cues for indicating unexpected changes in the status of an automated cockpit system. Both tactile conditions resulted in higher detection rates for, and faster response times to, uncommanded mode transitions. Tactile feedback did not interfere with, nor was its effectiveness affected by, the performance of concurrent visual tasks. The observed improvement in task-sharing performance indicates that the introduction of tactile feedback is a promising avenue toward better supporting human-machine communication in event-driven, information-rich domains.

  8. Inspection apparatus for evaluating a partially completed weld

    DOEpatents

    Smartt, Herschel B.; Larsen, Eric D.; Johnson, Jonn A.

    2001-01-01

    An inspection apparatus for evaluating a partially completed weld is described which is used in combination with an automated movable welder that moves across a supporting surface. The inspection apparatus includes a coupling member mounted on the welder; a frame member mounted on the coupling member; an ultrasonic sensor mounted on the frame member and disposed in ultrasonic sound transmitting relation relative to the partially completed weld; and a drive assembly for adjusting the position of the ultrasonic sensor relative to the partially completed weld.

  9. Visual texture for automated characterisation of geological features in borehole televiewer imagery

    NASA Astrophysics Data System (ADS)

    Al-Sit, Waleed; Al-Nuaimy, Waleed; Marelli, Matteo; Al-Ataby, Ali

    2015-08-01

    Detailed characterisation of the structure of subsurface fractures is greatly facilitated by digital borehole logging instruments, the interpretation of which is typically time-consuming and labour-intensive. Despite recent advances towards autonomy and automation, the final interpretation remains heavily dependent on the skill, experience, alertness and consistency of a human operator. Existing computational tools fail to detect layers between rocks that do not exhibit distinct fracture boundaries, and often struggle to characterise cross-cutting layers and partial fractures. This paper presents a novel approach to the characterisation of planar rock discontinuities from digital images of borehole logs. Multi-resolution texture segmentation and pattern recognition techniques utilising Gabor filters are combined with an iterative adaptation of the Hough transform to enable non-distinct, partial, distorted and steep fractures and layers to be accurately identified and characterised in a fully automated fashion. This approach has successfully detected fractures and layers with high detection accuracy at a relatively low computational cost.
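
    A minimal Gabor filter-bank texture sketch with scikit-image, of the kind such an approach builds on; the frequency, orientation count, and random stand-in image are illustrative assumptions:

      import numpy as np
      from skimage.filters import gabor

      rng = np.random.default_rng(0)
      image = rng.random((128, 128))   # stand-in for an unwrapped televiewer log

      # Planar fractures appear as sinusoidal traces in unwrapped imagery;
      # they respond strongly at the filter orientation matching local dip.
      responses = []
      for theta in np.linspace(0, np.pi, 6, endpoint=False):
          real, imag = gabor(image, frequency=0.2, theta=theta)
          responses.append(real**2 + imag**2)
      energy = np.max(responses, axis=0)   # per-pixel dominant texture energy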

  10. Oversimplifying quantum factoring.

    PubMed

    Smolin, John A; Smith, Graeme; Vargo, Alexander

    2013-07-11

    Shor's quantum factoring algorithm exponentially outperforms known classical methods. Previous experimental implementations have used simplifications dependent on knowing the factors in advance. However, as we show here, all composite numbers admit simplification of the algorithm to a circuit equivalent to flipping coins. The difficulty of a particular experiment therefore depends on the level of simplification chosen, not the size of the number factored. Valid implementations should not make use of the answer sought.

  11. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices

    PubMed Central

    2018-01-01

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario. PMID:29748468

  12. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices.

    PubMed

    Muñoz, Sergio; Araque, Oscar; Sánchez-Rada, J Fernando; Iglesias, Carlos A

    2018-05-10

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.

  13. MEthods of ASsessing blood pressUre: identifying thReshold and target valuEs (MeasureBP): a review & study protocol.

    PubMed

    Blom, Kimberly C; Farina, Sasha; Gomez, Yessica-Haydee; Campbell, Norm R C; Hemmelgarn, Brenda R; Cloutier, Lyne; McKay, Donald W; Dawes, Martin; Tobe, Sheldon W; Bolli, Peter; Gelfer, Mark; McLean, Donna; Bartlett, Gillian; Joseph, Lawrence; Featherstone, Robin; Schiffrin, Ernesto L; Daskalopoulou, Stella S

    2015-04-01

    Despite progress in automated blood pressure measurement (BPM) technology, there is limited research linking hard outcomes to automated office BPM (OBPM) treatment targets and thresholds. Equivalences for automated BPM devices have been estimated from approximations of standardized manual measurements of 140/90 mmHg. Until outcome-driven targets and thresholds become available for automated measurement methods, deriving evidence-based equivalences between automated methods and standardized manual OBPM is the next best solution. The MeasureBP study group was initiated by the Canadian Hypertension Education Program to close this critical knowledge gap. MeasureBP aims to define evidence-based equivalent values between standardized manual OBPM and automated BPM methods by synthesizing available evidence using a systematic review and individual subject-level data meta-analyses. This manuscript provides a review of the literature and the MeasureBP study protocol. These results will lay the evidence-based foundation to resolve uncertainties within blood pressure guidelines which, in turn, will improve the management of hypertension.

  14. Automated cellular pathology in noninvasive confocal microscopy

    NASA Astrophysics Data System (ADS)

    Ting, Monica; Krueger, James; Gareau, Daniel

    2014-03-01

    A computer algorithm was developed to automatically identify and count melanocytes and keratinocytes in 3D reflectance confocal microscopy (RCM) images of the skin. Computerized pathology increases our understanding of, and enables prevention of, superficial spreading melanoma (SSM). Machine learning involved examining the images to measure the size of cells through a 2-D Fourier transform and developing an appropriate mask based on the erf() function to model the cells. Implementation involved processing the images to identify cells whose image segments gave the smallest difference when subtracted from the mask. With further simplification of the algorithm, the program may be applied directly to the RCM images to indicate the presence of keratinocytes in seconds and to quantify keratinocyte size in the en face plane as a function of depth. Using this system, the algorithm can identify irregularities in the maturation and differentiation of keratinocytes, thereby signaling the possible presence of cancer.
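
    A sketch of the erf()-mask matching idea in Python; the mask size, edge width, and the sum-of-squared-differences criterion are assumptions standing in for the paper's exact procedure:

      import numpy as np
      from scipy.special import erf

      def cell_mask(size, radius, edge=2.0):
          # Soft-edged disk: erf() rolls intensity off across the cell border
          y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
          return 0.5 * (1.0 - erf((np.hypot(x, y) - radius) / edge))

      def match_scores(image, mask, stride=4):
          # Sum of squared differences between the mask and each patch;
          # low scores mark candidate cell centers.
          m = mask.shape[0]
          scores = {}
          for i in range(0, image.shape[0] - m, stride):
              for j in range(0, image.shape[1] - m, stride):
                  patch = image[i:i + m, j:j + m]
                  scores[(i + m // 2, j + m // 2)] = float(np.sum((patch - mask) ** 2))
          return scores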

  15. An Improved Representation of Regional Boundaries on Parcellated Morphological Surfaces

    PubMed Central

    Hao, Xuejun; Xu, Dongrong; Bansal, Ravi; Liu, Jun; Peterson, Bradley S.

    2010-01-01

    Establishing the correspondences of brain anatomy with function is important for understanding neuroimaging data. Regional delineations on morphological surfaces define anatomical landmarks and help to visualize and interpret both functional data and morphological measures mapped onto the cortical surface. We present an efficient algorithm that accurately delineates the morphological surface of the cerebral cortex in real time during generation of the surface using information from parcellated 3D data. With this accurate region delineation, we then develop methods for boundary-preserved simplification and smoothing, as well as procedures for the automated correction of small, misclassified regions to improve the quality of the delineated surface. We demonstrate that our delineation algorithm, together with a new method for double-snapshot visualization of cortical regions, can be used to establish a clear correspondence between brain anatomy and mapped quantities, such as morphological measures, across groups of subjects. PMID:21144708

  16. A natural language processing pipeline for pairing measurements uniquely across free-text CT reports.

    PubMed

    Sevenster, Merlijn; Bozeman, Jeffrey; Cowhy, Andrea; Trost, William

    2015-02-01

    To standardize and objectivize treatment response assessment in oncology, guidelines have been proposed that are driven by radiological measurements, which are typically communicated in free-text reports that defy automated processing. Through inter-annotator agreement analysis and natural language processing (NLP) algorithm development, we study the task of pairing measurements that quantify the same finding across consecutive radiology reports, such that each measurement is paired with at most one other ("partial uniqueness"). Ground truth is created based on 283 abdomen and 311 chest CT reports of 50 patients each. A pre-processing engine segments reports and extracts measurements. Thirteen features are developed based on volumetric similarity between measurements, semantic similarity between their respective narrative contexts, and structural properties of their report positions. A Random Forest classifier (RF) integrates all features. A "mutual best match" (MBM) post-processor ensures partial uniqueness. In an end-to-end evaluation, RF has precision 0.841, recall 0.807, F-measure 0.824 and AUC 0.971; with MBM, which performs above chance level (P<0.001), it has precision 0.899, recall 0.776, F-measure 0.833 and AUC 0.935. RF (RF+MBM) has error-free performance on 52.7% (57.4%) of report pairs. Inter-annotator agreement of three domain specialists with the ground truth (κ>0.960) indicates that the task is well defined. Domain properties and inter-section differences are discussed to explain the superior performance in abdomen. Enforcing partial uniqueness has mixed but minor effects on performance. A combined machine learning-filtering approach is proposed for pairing measurements, which can support prospective (supporting treatment response assessment) and retrospective purposes (data mining). Copyright © 2014 Elsevier Inc. All rights reserved.
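
    The "mutual best match" post-processor admits a compact sketch in Python; the data layout is an assumption, but the pairing rule follows the abstract's description of partial uniqueness:

      def mutual_best_match(scores):
          # scores: {(left_id, right_id): similarity}; keep a pair only if
          # each measurement is the other's best-scoring partner, so every
          # measurement ends up paired with at most one other.
          best_left, best_right = {}, {}
          for (l, r), s in scores.items():
              if s > best_left.get(l, (None, -1.0))[1]:
                  best_left[l] = (r, s)
              if s > best_right.get(r, (None, -1.0))[1]:
                  best_right[r] = (l, s)
          return [(l, r) for l, (r, _) in best_left.items()
                  if best_right[r][0] == l]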

  17. Fabric circuits and method of manufacturing fabric circuits

    NASA Technical Reports Server (NTRS)

    Chu, Andrew W. (Inventor); Dobbins, Justin A. (Inventor); Scully, Robert C. (Inventor); Trevino, Robert C. (Inventor); Lin, Greg Y. (Inventor); Fink, Patrick W. (Inventor)

    2011-01-01

    A flexible, fabric-based circuit comprises a non-conductive flexible layer of fabric and a conductive flexible layer of fabric adjacent thereto. A non-conductive thread, an adhesive, and/or other means may be used for attaching the conductive layer to the non-conductive layer. In some embodiments, the layers are attached by a computer-driven embroidery machine at pre-determined portions or locations in accordance with a pre-determined attachment layout before automated cutting. In some other embodiments, an automated milling machine or a computer-driven laser using a pre-designed circuit trace as a template cuts the conductive layer so as to separate an undesired portion of the conductive layer from a desired portion of the conductive layer. Additional layers of conductive fabric may be attached in some embodiments to form a multi-layer construct.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covington, E; Younge, K; Chen, X

    Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated with data extracted with the PCT. These include checks for prescription, reference point and machine scheduling errors which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
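    A schematic sketch of one such check; the names (CheckResult, check_prescribed_dose) and the tolerance are illustrative, and the real PCT drives the Eclipse Scripting API, which is not reproduced here.

      from dataclasses import dataclass

      @dataclass
      class CheckResult:
          name: str
          passed: bool
          message: str

      def check_prescribed_dose(planned_dose_gy, prescribed_dose_gy, tol_gy=0.01):
          """Flag the user when the planned and prescribed doses disagree."""
          ok = abs(planned_dose_gy - prescribed_dose_gy) <= tol_gy
          msg = ("OK" if ok else
                 f"planned {planned_dose_gy:.2f} Gy != prescribed {prescribed_dose_gy:.2f} Gy")
          return CheckResult("prescribed_dose", ok, msg)

    A checklist is then simply a list of such callables whose results are collected into the PDF report that is uploaded to the treatment management system.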

  19. Flight deck disturbance management: a simulator study of diagnosis and recovery from breakdowns in pilot-automation coordination.

    PubMed

    Nikolic, Mark I; Sarter, Nadine B

    2007-08-01

    To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.

  20. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  1. AFTI/F16 Automated Maneuvering Attack System Test Reports/Special Technologies and Outlook.

    DTIC Science & Technology

    1986-07-11

    [Garbled text extraction from the report's abbreviation list and design discussion. Recoverable abbreviations: A-A = Air-to-Air; A-S = Air-to-Surface; AFTI = Advanced Fighter Technology Integration; AGL = Above-Ground-Level; AMAS = Automated Maneuvering Attack System; AMUX = Avionics Multiplex Data Bus. Recoverable fragment: design requirements for the AFTI/F-16 are driven by realistic air combat scenarios.]

  2. Investigating mode errors on automated flight decks: illustrating the problem-driven, cumulative, and interdisciplinary nature of human factors research.

    PubMed

    Sarter, Nadine

    2008-06-01

    The goal of this article is to illustrate the problem-driven, cumulative, and highly interdisciplinary nature of human factors research by providing a brief overview of the work on mode errors on modern flight decks over the past two decades. Mode errors on modern flight decks were first reported in the late 1980s. Poor feedback, inadequate mental models of the automation, and the high degree of coupling and complexity of flight deck systems were identified as main contributors to these breakdowns in human-automation interaction. Various improvements of design, training, and procedures were proposed to address these issues. The author describes when and why the problem of mode errors surfaced, summarizes complementary research activities that helped identify and understand the contributing factors to mode errors, and describes some countermeasures that have been developed in recent years. This brief review illustrates how one particular human factors problem in the aviation domain enabled various disciplines and methodological approaches to contribute to a better understanding of, as well as provide better support for, effective human-automation coordination. Converging operations and interdisciplinary collaboration over an extended period of time are hallmarks of successful human factors research. The reported body of research can serve as a model for future research and as a teaching tool for students in this field of work.

  3. Experiments on vibration-driven stick-slip locomotion: A sliding bifurcation perspective

    NASA Astrophysics Data System (ADS)

    Du, Zhouwei; Fang, Hongbin; Zhan, Xiong; Xu, Jian

    2018-05-01

    Dry friction appears at the contact interface between two surfaces and is the source of stick-slip vibrations. Rather than being a negative factor, dry friction is essential for a vibration-driven locomotion system to take effect. However, dry-friction-induced stick-slip locomotion has not been fully understood in previous research, especially in terms of experiments. In this paper, we experimentally study the stick-slip dynamics of a vibration-driven locomotion system from a sliding bifurcation perspective. To this end, we first design and build a vibration-driven locomotion prototype based on an internal piezoelectric cantilever. By utilizing mechanical resonance, the small piezoelectric deformation is significantly amplified to drive the prototype to achieve effective locomotion. Through identifying the stick-slip characteristics in velocity histories, we could categorize the system's locomotion into four types and obtain a stick-slip categorization diagram. In each zone of the diagram the locomotion exhibits qualitatively different stick-slip dynamics. Such a categorization diagram is in fact a sliding bifurcation diagram; crossing from one stick-slip zone to another corresponds to the triggering of a sliding bifurcation. In addition, a simplified single degree-of-freedom model is established, with the rationale for the simplification explained theoretically and numerically. Based on the equivalent model, a numerical stick-slip categorization is also obtained, which shows good agreement with the experiments both qualitatively and quantitatively. To the best of our knowledge, this is the first work that experimentally generates a sliding bifurcation diagram. The obtained stick-slip categorizations deepen our understanding of stick-slip dynamics in vibration-driven systems and could serve as a basis for system design and optimization.
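    For illustration only, a single degree-of-freedom stick-slip sketch in the spirit of the simplified model (the drive term, parameters and sticking threshold are invented, not the authors'): Coulomb friction holds the body still whenever the internal harmonic drive cannot exceed the static friction limit, and the velocity history then shows alternating stick and slip episodes.

      import numpy as np

      def simulate_stick_slip(m=0.1, mu=0.3, g=9.81, F0=0.5, freq=50.0,
                              dt=1e-5, t_end=0.2, v_eps=1e-4):
          n = int(t_end / dt)
          x, v = 0.0, 0.0
          f_lim = mu * m * g                       # maximum (static) friction force
          xs, vs = np.empty(n), np.empty(n)
          for i in range(n):
              drive = F0 * np.sin(2 * np.pi * freq * i * dt)
              if abs(v) < v_eps and abs(drive) <= f_lim:
                  v, a = 0.0, 0.0                  # stick: friction balances the drive
              else:
                  a = (drive - f_lim * np.sign(v if v else drive)) / m   # slip
              v += a * dt
              x += v * dt
              xs[i], vs[i] = x, v
          return xs, vs                            # vs == 0 marks the stick phases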

  4. A consistent positive association between landscape simplification and insecticide use across the Midwestern US from 1997 through 2012

    DOE PAGES

    Meehan, Timothy D.; Gratton, Claudio

    2015-10-27

    During 2007, counties across the Midwestern US with relatively high levels of landscape simplification (i.e., widespread replacement of seminatural habitats with cultivated crops) had relatively high crop-pest abundances which, in turn, were associated with relatively high insecticide application. These results suggested a positive relationship between landscape simplification and insecticide use, mediated by landscape effects on crop pests or their natural enemies. A follow-up study, in the same region but using different statistical methods, explored the relationship between landscape simplification and insecticide use between 1987 and 2007, and concluded that the relationship varied substantially in sign and strength across years. Here, we explore this relationship from 1997 through 2012, using a single dataset and two different analytical approaches. We demonstrate that, when using ordinary least squares (OLS) regression, the relationship between landscape simplification and insecticide use is, indeed, quite variable over time. However, the residuals from OLS models show strong spatial autocorrelation, indicating spatial structure in the data not accounted for by explanatory variables, and violating a standard assumption of OLS. When modeled using spatial regression techniques, relationships between landscape simplification and insecticide use were consistently positive between 1997 and 2012, and model fits were dramatically improved. We argue that spatial regression methods are more appropriate for these data, and conclude that there remains compelling correlative support for a link between landscape simplification and insecticide use in the Midwestern US. We discuss the limitations of inference from this and related studies, and suggest improved data collection campaigns for better understanding links between landscape structure, crop-pest pressure, and pest-management practices.
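    A hedged sketch of the diagnostic at issue, on synthetic data (the distance cutoff and all variables are illustrative): fit OLS, then compute Moran's I of the residuals with a row-standardized distance-band weights matrix; a value far from its expectation of -1/(n-1) signals spatial structure that a spatial regression should absorb.

      import numpy as np

      def morans_i(residuals, W):
          """Moran's I of residuals for a spatial weights matrix W."""
          z = residuals - residuals.mean()
          n = len(z)
          return (n / W.sum()) * (z @ W @ z) / (z @ z)

      rng = np.random.default_rng(0)
      coords = rng.uniform(0, 100, size=(200, 2))            # county centroids
      d = np.linalg.norm(coords[:, None] - coords[None], axis=-1)
      W = ((d > 0) & (d < 15)).astype(float)                 # distance-band neighbors
      W /= np.maximum(W.sum(axis=1, keepdims=True), 1)       # row-standardize
      X = np.column_stack([np.ones(200), rng.normal(size=200)])   # simplification metric
      y = X @ np.array([1.0, 0.5]) + rng.normal(size=200)    # synthetic insecticide use
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("Moran's I of OLS residuals:", morans_i(y - X @ beta, W))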

  6. Unmanned Airlift: How Should We Proceed?

    DTIC Science & Technology

    2002-04-01

    [Garbled text extraction from the report's front matter. Recoverable content: a research report submitted to the faculty in partial fulfillment of the graduation requirements (advisor: Lt Col Vincent T. Jovene, Maxwell Air Force Base), with acknowledgments to Mr. Bob Peak of the Southeast SATS Lab. The substantive fragment states that demands on the National Aerospace System (NAS) are driving the advancement of automating technologies, and that the report shows how this drive toward automation is laying the ...]

  7. Status of thermalhydraulic modelling and assessment: Open issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.

    1997-07-01

    This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP5, CATHARE, ATHLET, ...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate-effect tests and integral-effect tests. It is also interesting to list all the assumptions and simplifications which were made in the establishment of the system of equations and of the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Recommendations are then proposed to extend the capabilities of these codes.

  8. Simplification of the laser absorption process in the particle simulation for the laser-induced shockwave processing

    NASA Astrophysics Data System (ADS)

    Shimamura, Kohei

    2016-09-01

    To reduce the computational cost of particle-method simulations of laser plasma, we examined a simplification of the laser absorption process. Because the laser frequency is much higher than the collision frequency between electrons and heavy particles, we assumed that each electron gains a constant energy from the laser irradiation. First, the simplified laser absorption process was verified by comparing the EEDF and the laser absorptivity with the PIC-FDTD method. Second, the laser plasma induced by a TEA CO2 laser in an argon atmosphere was modeled using the 1D3V DSMC method with the simplified laser absorption. As a result, the LSDW was observed with the typical electron and neutral density distributions.
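    A minimal sketch of the simplification as we read it (all names and values are illustrative assumptions, not from the paper's DSMC code): instead of resolving the oscillatory laser field, every simulated electron receives a fixed energy increment per time step while the pulse is on.

      import numpy as np

      def deposit_laser_energy(energies_eV, laser_on, delta_eV=0.05):
          """Constant-energy-gain stand-in for detailed laser absorption physics."""
          if laser_on:
              energies_eV += delta_eV        # the same increment for every electron
          return energies_eV

      energies = np.full(10_000, 1.0)        # 10k simulated electrons at 1 eV
      for step in range(100):                # 100 time steps with the pulse on
          energies = deposit_laser_energy(energies, laser_on=True)
      print("mean electron energy after the pulse:", energies.mean(), "eV")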

  9. A simplified formalism of the algebra of partially transposed permutation operators with applications

    NASA Astrophysics Data System (ADS)

    Mozrzymas, Marek; Studziński, Michał; Horodecki, Michał

    2018-03-01

    Herein we continue the study of the representation theory of the algebra of permutation operators acting on the n-fold tensor product space, partially transposed on the last subsystem. We develop the concept of partially reduced irreducible representations, which allows us to significantly simplify previously proved theorems and, most importantly, derive new results for irreducible representations of the mentioned algebra. In our analysis we are able to reduce the complexity of the central expressions by getting rid of sums over all permutations from the symmetric group, obtaining equations which are much more handy in practical applications. We also find relatively simple matrix representations for the generators of the underlying algebra. The obtained simplifications and developments are applied to derive the characteristics of a deterministic port-based teleportation scheme written purely in terms of irreducible representations of the studied algebra. We solve an eigenproblem for the generators of the algebra, which is the first step towards a hybrid port-based teleportation scheme and gives us new proofs of the asymptotic behaviour of teleportation fidelity. We also show a connection between the density operator characterising port-based teleportation and a particular matrix composed of an irreducible representation of the symmetric group, which encodes properties of the investigated algebra.

  10. Scattering of plane evanescent waves by buried cylinders: Modeling the coupling to guided waves and resonances

    NASA Astrophysics Data System (ADS)

    Marston, Philip L.

    2003-04-01

    The coupling of sound to buried targets can be associated with acoustic evanescent waves when the sea bottom is smooth. To understand the excitation of guided waves on buried fluid cylinders and shells by acoustic evanescent waves and the associated target resonances, the two-dimensional partial wave series for the scattering is found for normal incidence in an unbounded medium. The shell formulation uses the simplifications of thin-shell dynamics. The expansion of the incident wave becomes a double summation with products of modified and ordinary Bessel functions [P. L. Marston, J. Acoust. Soc. Am. 111, 2378 (2002)]. Unlike the case of an ordinary incident wave, the counterpropagating partial waves of the same angular order have unequal magnitudes when the incident wave is evanescent. This is a consequence of the exponential dependence of the incident wave amplitude on depth. Some consequences of this imbalance of partial-wave amplitudes are given by modifying previous ray theory for the scattering [P. L. Marston and N. H. Sun, J. Acoust. Soc. Am. 97, 777-783 (1995)]. The exponential dependence of the scattering on the location of a scatterer was previously demonstrated in air [T. J. Matula and P. L. Marston, J. Acoust. Soc. Am. 93, 1192-1195 (1993)].

  11. Prosthetic implant treatment of the edentulous maxilla with overdenture.

    PubMed

    Inversini, M

    2006-10-01

    The literature concerning the success of prosthetic implant treatment with maxillary overdenture is reviewed, and variables affecting treatment from the implant and prosthetic standpoint are analysed. Guidelines for the fabrication of maxillary overimplants comparable to those for mandibular overimplants are still lacking, as are evidence-based prosthetic design concepts. Individual methods and techniques have been described, but evaluation standards for the outcome of maxillary prosthetic-implant treatment are lacking or individually interpreted. The biomechanics involved in the proposed system are described, together with the advantages of telescopic crowns for the retention of removable partial prosthesis supported by maxillary implants. The positive influence, in terms of long-term prognosis, of the perio-protective design of removable partial prostheses supported by maxillary overimplants is also discussed. The proposed system not only provides stability, support and retention for removable partial prostheses supported by maxillary implants, but also enables implant survival rates to be improved, both for biomechanical reasons and due to improved oral hygiene. The simplification of fabrication procedures, repair, rebasing and re-operating also reduce the cost of follow-up and improve the cost/benefit ratio. However, additional studies are needed to clarify the number and most appropriate distribution of implants, as well as the most favourable prosthetic designs for maxillary overimplants.

  12. Domain specific software architectures: Command and control

    NASA Technical Reports Server (NTRS)

    Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave

    1992-01-01

    GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.

  13. Understanding and enhancing user acceptance of computer technology

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Morris, Nancy M.

    1986-01-01

    Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance, or begrudging acceptance, by the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.

  14. Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB): An Investigation of a Long Duration, Partial Gravity, Autonomous Rodent Colony

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Simon, Matthew A.; Antol, Jeffrey; Chai, Patrick R.; Jones, Christopher A.; Klovstad, Jordan J.; Neilan, James H.; Stillwagen, Frederic H.; Williams, Phillip A.; Bednara, Michael

    2015-01-01

    The path from Earth to Mars requires exploration missions to be increasingly Earth-independent as the foundation is laid for a sustained human presence in the following decades. NASA pioneering of Mars will expand the boundaries of human exploration, as a sustainable presence on the surface requires humans to successfully reproduce in a partial gravity environment independent of Earth intervention. Before significant investment is made in capabilities leading to such pioneering efforts, the challenges of multigenerational mammalian reproduction in a partial gravity environment need to be investigated. The Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior health (MICEHAB) is designed to study these challenges. The proposed concept is a long-duration, autonomous habitat designed to house rodents in a partial gravity environment with the goal of understanding the effects of partial gravity on mammalian reproduction over multiple generations, and how to effectively design such a facility to operate autonomously while keeping the rodents healthy enough to achieve multiple generations. All systems are designed to feed forward directly to full-scale human missions to Mars. This paper presents the baseline design concept formulated after considering challenges in the mission and vehicle architectures such as: vehicle automation, automated crew health management/medical care, unique automated waste disposal and hygiene, handling of deceased crew members, reliable long-duration crew support systems, and radiation protection. This concept was selected from an architectural trade space considering the balance between mission science return and robotic and autonomy capabilities. The baseline design is described in detail, including: transportation and facility operation constraints, artificial gravity system design, habitat design, and a full-scale mock-up demonstration of autonomous rodent care facilities. The proposed concept has the potential to integrate into existing mission architectures in order to achieve exploration objectives, and to demonstrate and mature common capabilities that enable a range of destinations and missions.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buche, D. L.; Perry, S.

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  16. Combining Offline and Online Computation for Solving Partially Observable Markov Decision Process

    DTIC Science & Technology

    2015-03-06

    [Garbled citation-list extraction. Recoverable references: David Hsu and Wee Sun Lee, "Monte Carlo Bayesian Reinforcement Learning," International Conference on Machine Learning (ICML), 2012; Zhan Wei Lim, David Hsu, and Wee Sun Lee, "Adaptive Informative Path Planning in Metric Spaces," submitted to Int. J. ...; Bai, H., Hsu, D., Kochenderfer, M. J., and Lee, W. S., "Unmanned aircraft collision avoidance using continuous state POMDPs"; further items at the International Conference on Robotics and Automation (ICRA), 2015.]

  17. Modeling of shallow and inefficient convection in the outer layers of the Sun using realistic physics

    NASA Technical Reports Server (NTRS)

    Kim, Yong-Cheol; Fox, Peter A.; Sofia, Sabatino; Demarque, Pierre

    1995-01-01

    In an attempt to understand the properties of convective energy transport in the solar convective zone, a numerical model has been constructed for turbulent flows in a compressible, radiation-coupled, nonmagnetic, gravitationally stratified medium using a realistic equation of state and realistic opacities. The time-dependent, three-dimensional hydrodynamic equations are solved with minimal simplifications. The statistical information obtained from the present simulation provides an improved understanding of solar photospheric convection. The characteristics of solar convection in shallow regions are parameterized and compared with the results of Chan & Sofia's (1989) simulations of deep and efficient convection. We assess the importance of the zones of partial ionization in the simulation and confirm that the radiative energy transfer is negligible throughout the region except in the uppermost scale heights of the convection zone, a region of very high superadiabaticity. When the effects of partial ionization are included, the dynamics of the flows are altered significantly. However, we confirm the Chan & Sofia result that kinetic energy flux is nonnegligible and can have a negative value in the convection zone.

  18. LED-based near infrared sensor for cancer diagnostics

    NASA Astrophysics Data System (ADS)

    Bogomolov, Andrey; Ageev, Vladimir; Zabarylo, Urszula; Usenov, Iskander; Schulte, Franziska; Kirsanov, Dmitry; Belikova, Valeria; Minet, Olaf; Feliksberger, E.; Meshkovsky, I.; Artyushenko, Viacheslav

    2016-03-01

    Optical spectroscopic technologies are increasingly used for cancer diagnostics. The feasibility of differentiating between malignant and healthy samples of human kidney using fluorescence, Raman, MIR and NIR spectroscopy has recently been reported. In the present work, a simplification of the NIR spectroscopy method has been studied. Traditional high-resolution NIR spectrometry was replaced by an optical sensor based on a set of light-emitting diodes at selected wavelengths as light sources and a photodiode. Two prototypes of the sensor were developed and tested using 14 in-vitro samples from seven kidney tumor patients. Statistical evaluation of the results using principal component analysis and partial least-squares discriminant analysis was performed. Although the presented technique achieved only partial discrimination between tumor and healthy tissue, the results demonstrate the benefits of LED-based near-infrared sensing for oncological diagnostics.
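    A minimal sketch of the reported chemometric workflow with synthetic numbers (14 samples and 8 LED channels are placeholders, not the study's data): PCA gives the exploratory view, and PLS-DA is implemented, as is conventional, as PLS regression against 0/1 class labels.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(14, 8))           # 14 samples x 8 LED wavelengths (synthetic)
      y = np.repeat([0, 1], 7)               # 0 = healthy, 1 = tumor

      scores = PCA(n_components=2).fit_transform(X)             # exploratory structure
      plsda = PLSRegression(n_components=2).fit(X, y)
      predicted = (plsda.predict(X).ravel() > 0.5).astype(int)  # PLS-DA decision rule
      print("training accuracy:", (predicted == y).mean())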

  19. Generation of longitudinal vibrations in piano strings: From physics to sound synthesis

    NASA Astrophysics Data System (ADS)

    Bank, Balázs; Sujbert, László

    2005-04-01

    Longitudinal vibration of piano strings greatly contributes to the distinctive character of low piano notes. In this paper a simplified modal model is developed, which describes the generation of phantom partials and longitudinal free modes jointly. The model is based on the simplification that the coupling from the transverse vibration to the longitudinal polarization is unidirectional. The modal formulation makes it possible to predict the prominent components of longitudinal vibration as a function of transverse modal frequencies. This provides a qualitative insight into the generation of longitudinal vibration, while the model is still capable of explaining the empirical results of earlier works. The semi-quantitative agreement with measurement results implies that the main source of phantom partials is the transverse to longitudinal coupling, while the string termination and the longitudinal to transverse coupling have only small influence. The results suggest that the longitudinal component of the tone can be treated as a quasi-harmonic spectrum with formantlike peaks at the longitudinal modal frequencies. The model is further simplified and applied for the real-time synthesis of piano sound with convincing sonic results.
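    Under the unidirectional-coupling simplification, the quadratic transverse-to-longitudinal coupling generates phantom partials at sums (and differences) of transverse modal frequencies; the sketch below enumerates the candidate sum components for an illustrative stiff-string partial series (f0 and the inharmonicity coefficient B are invented values).

      import numpy as np

      def transverse_partials(f0=55.0, B=3e-4, n=12):
          """Stiff-string partials: f_k = k * f0 * sqrt(1 + B * k**2)."""
          k = np.arange(1, n + 1)
          return k * f0 * np.sqrt(1.0 + B * k**2)

      f = transverse_partials()
      phantoms = sorted({round(f[i] + f[j], 1)
                         for i in range(len(f)) for j in range(i, len(f))})
      print("lowest predicted phantom partials (Hz):", phantoms[:8])

    Components falling near a longitudinal modal frequency are resonantly boosted, which produces the formant-like peaks described above.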

  20. Landscape simplification reduces classical biological control and crop yield.

    PubMed

    Grab, Heather; Danforth, Bryan; Poveda, Katja; Loeb, Greg

    2018-03-01

    Agricultural intensification resulting in the simplification of agricultural landscapes is known to negatively impact the delivery of key ecosystem services such as the biological control of crop pests. Both conservation and classical biological control may be influenced by the landscape context in which they are deployed; yet studies examining the role of landscape structure in the establishment and success of introduced natural enemies and their interactions with native communities are lacking. In this study, we investigated the relationship between landscape simplification, classical and conservation biological control services and importantly, the outcome of these interactions for crop yield. We showed that agricultural simplification at the landscape scale is associated with an overall reduction in parasitism rates of crop pests. Additionally, only introduced parasitoids were identified, and no native parasitoids were found in crop habitat, irrespective of agricultural landscape simplification. Pest densities in the crop were lower in landscapes with greater proportions of semi-natural habitats. Furthermore, farms with less semi-natural cover in the landscape and consequently, higher pest numbers, had lower yields than farms in less agriculturally dominated landscapes. Our study demonstrates the importance of landscape scale agricultural simplification in mediating the success of biological control programs and highlights the potential risks to native natural enemies in classical biological control programs against native insects. Our results represent an important contribution to an understanding of the landscape-mediated impacts on crop yield that will be essential to implementing effective policies that simultaneously conserve biodiversity and ecosystem services. © 2018 by the Ecological Society of America.

  1. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly-automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor known as the H-mode as applied to aircraft. The fundamentals of the H-metaphor are reviewed followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described as are two planned evaluations.

  2. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
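    A hypothetical sketch of a throttling workload manager of the kind described (the class and its interface are illustrative, not CATALYST's actual design): jobs declare an I/O weight, and new jobs are admitted only while the total in-flight I/O stays under a cap, keeping the cluster's disks busy without thrashing.

      import heapq

      class ThrottlingManager:
          def __init__(self, io_capacity=100):
              self.io_capacity = io_capacity
              self.in_flight = 0
              self.queue = []                     # (priority, job_id, io_weight)

          def submit(self, job_id, io_weight, priority=0):
              heapq.heappush(self.queue, (priority, job_id, io_weight))

          def dispatch(self):
              """Start every queued job that still fits under the I/O cap."""
              started = []
              while self.queue and self.in_flight + self.queue[0][2] <= self.io_capacity:
                  _, job_id, w = heapq.heappop(self.queue)
                  self.in_flight += w
                  started.append(job_id)
              return started

          def finish(self, io_weight):
              self.in_flight -= io_weight         # frees capacity for the next dispatch()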

  3. Tackling the x-ray cargo inspection challenge using machine learning

    NASA Astrophysics Data System (ADS)

    Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.

    2016-05-01

    The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.

  4. Experiments in cooperative-arm object manipulation with a two-armed free-flying robot. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Koningstein, Ross

    1990-01-01

    Developing computed-torque controllers for complex manipulator systems using current techniques and tools is difficult because those tools address the issues pertinent to simulation, as opposed to control. A new formulation of computed-torque (CT) control that leads to an automated computed-torque robot controller program is presented. This automated tool is used for simulations and experimental demonstrations of endpoint and object control from a free-flying robot. The new computed-torque formulation states the multibody control problem in an elegant, homogeneous, and practical form. A recursive dynamics algorithm is presented that numerically evaluates kinematics and dynamics terms for multibody systems given a topological description. Manipulators may be free-flying and may have closed-chain constraints. With the exception of object squeeze-force control, the algorithm does not deal with actuator redundancy. The algorithm is used to implement an automated 2D computed-torque dynamics and control package that allows joint, endpoint, orientation, momentum, and object squeeze-force control. This package obviates the need for hand-derivation of kinematics and dynamics, and is used for both simulation and experimental control. Endpoint control experiments are performed on a laboratory robot that has two arms to manipulate payloads and uses an air bearing to achieve very-low-drag characteristics. Simulations and experimental data for endpoint and object controllers are presented for the experimental robot - a complex dynamic system. There is a rather wide set of conditions under which CT endpoint controllers can neglect robot base accelerations (but not motions) and achieve performance comparable to controllers that include base accelerations in the model. The regime over which this simplification holds is explored by simulation and experiment.

  5. Automation, Miniature Robotics and Sensors for Nondestructive Testing and Evaluation, Volume 4

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Y.; Baumgartner, E.; Backes, P.; Sherrit, S.; Bao, X.; Leary, S.; Kennedy, B.; Mavroidis, C.; Pfeiffer, C.; Culbert, C.

    1999-01-01

    The development of NDE techniques has always been driven by the ongoing need for low-cost, rapid, user-friendly, reliable and efficient methods of detecting and characterizing flaws as well as determining material properties.

  6. An ontology-driven, diagnostic modeling system.

    PubMed

    Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason

    2013-06-01

    To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems. To illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.

  7. Automated segmentation of murine lung tumors in x-ray micro-CT images

    NASA Astrophysics Data System (ADS)

    Swee, Joshua K. Y.; Sheridan, Clare; de Bruin, Elza; Downward, Julian; Lassailly, Francois; Pizarro, Luis

    2014-03-01

    Recent years have seen micro-CT emerge as a means of providing imaging analysis in pre-clinical studies, with in-vivo micro-CT having been shown to be particularly applicable to the examination of murine lung tumors. Despite this, existing studies have involved substantial human intervention during the image analysis process, with the use of fully-automated aids found to be almost non-existent. We present a new approach to automate the segmentation of murine lung tumors designed specifically for in-vivo micro-CT-based pre-clinical lung cancer studies that addresses the specific requirements of such studies, as well as the limitations human-centric segmentation approaches experience when applied to such micro-CT data. Our approach consists of three distinct stages, and begins by utilizing edge-enhancing and vessel-enhancing non-linear anisotropic diffusion filters to extract anatomy masks (lung/vessel structure) in a pre-processing stage. Initial candidate detection is then performed through ROI reduction utilizing the obtained masks and a two-step automated segmentation approach that aims to extract all disconnected objects within the ROI, consisting of Otsu thresholding, mathematical morphology and marker-driven watershed. False positive reduction is finally performed on initial candidates through random-forest-driven classification using the shape, intensity, and spatial features of candidates. We provide validation of our approach using data from an associated lung cancer study, showing favorable results both in terms of detection (sensitivity=86%, specificity=89%) and structural recovery (Dice Similarity=0.88) when compared against manual specialist annotation.
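    A hedged sketch of the two-step candidate extraction named above, using scikit-image (the pre-processing by anisotropic diffusion and the anatomy masks are omitted; parameter values are illustrative): Otsu thresholding and morphology produce a foreground mask, and a marker-driven watershed then splits touching objects.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.filters import threshold_otsu
      from skimage.morphology import binary_opening, disk
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def extract_candidates(roi):
          mask = roi > threshold_otsu(roi)                  # global Otsu threshold
          mask = binary_opening(mask, disk(2))              # remove speckle
          distance = ndi.distance_transform_edt(mask)
          peaks = peak_local_max(distance, labels=mask, min_distance=5)
          markers = np.zeros(mask.shape, dtype=int)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          return watershed(-distance, markers, mask=mask)   # one label per candidate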

  8. Netwar

    NASA Astrophysics Data System (ADS)

    Keen, Arthur A.

    2006-04-01

    This paper describes technology being developed at 21st Century Technologies to automate Computer Network Operations (CNO). CNO refers to DoD activities related to Attacking and Defending Computer Networks (CNA & CND). Next generation cyber threats are emerging in the form of powerful Internet services and tools that automate intelligence gathering, planning, testing, and surveillance. We will focus on "Search-Engine Hacks", queries that can retrieve lists of router/switch/server passwords, control panels, accessible cameras, software keys, VPN connection files, and vulnerable web applications. Examples include "Titan Rain" attacks against DoD facilities and the Santy worm, which identifies vulnerable sites by searching Google for URLs containing application-specific strings. This trend will result in increasingly sophisticated and automated intelligence-driven cyber attacks coordinated across multiple domains that are difficult to defeat or even understand with current technology. One traditional method of CNO relies on surveillance detection as an attack predictor. Unfortunately, surveillance detection is difficult because attackers can perform search engine-driven surveillance such as with Google Hacks, and avoid touching the target site. Therefore, attack observables represent only about 5% of the attacker's total attack time, and are inadequate to provide warning. In order to predict attacks and defend against them, CNO must also employ more sophisticated techniques and work to understand the attacker's Motives, Means and Opportunities (MMO). CNO must use automated reconnaissance tools, such as Google, to identify information vulnerabilities, and then utilize Internet tools to observe the intelligence gathering, planning, testing, and collaboration activities that represent 95% of the attacker's effort.

  9. Thermodynamics and Mechanism of the Interaction of Willardiine Partial Agonists with a Glutamate Receptor: Implications for Drug Development

    PubMed Central

    2015-01-01

    Understanding the thermodynamics of binding of a lead compound to a receptor can provide valuable information for drug design. The binding of compounds, particularly partial agonists, to subtypes of the α-amino-3-hydroxy-5-methyl-4-isoxazole-propionic acid (AMPA) receptor is, in some cases, driven by increases in entropy. Using a series of partial agonists based on the structure of the natural product, willardiine, we show that the charged state of the ligand determines the enthalpic contribution to binding. Willardiines have uracil rings with pKa values ranging from 5.5 to 10. The binding of the charged form is largely driven by enthalpy, while that of the uncharged form is largely driven by entropy. This is due at least in part to changes in the hydrogen bonding network within the binding site involving one water molecule. This work illustrates the importance of charge to the thermodynamics of binding of agonists and antagonists to AMPA receptors and provides clues for further drug discovery. PMID:24850223
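    For reference, the textbook relations behind phrases such as "driven by enthalpy" or "driven by entropy" (standard thermodynamics, not a result of this paper) connect the measured association constant K_a at temperature T to its enthalpic and entropic parts:

      \Delta G^{\circ} \;=\; \Delta H^{\circ} \;-\; T\,\Delta S^{\circ} \;=\; -RT \ln K_a

    Binding is enthalpy-driven when the \Delta H^{\circ} term dominates and entropy-driven when the -T\Delta S^{\circ} term dominates, which is the distinction drawn here between the charged and uncharged willardiine forms.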

  10. Production of a national 1:1,000,000-scale hydrography dataset for the United States: feature selection, simplification, and refinement

    USGS Publications Warehouse

    Gary, Robin H.; Wilson, Zachary D.; Archuleta, Christy-Ann M.; Thompson, Florence E.; Vrabel, Joseph

    2009-01-01

    During 2006-09, the U.S. Geological Survey, in cooperation with the National Atlas of the United States, produced a 1:1,000,000-scale (1:1M) hydrography dataset comprising streams and waterbodies for the entire United States, including Puerto Rico and the U.S. Virgin Islands, for inclusion in the recompiled National Atlas. This report documents the methods used to select, simplify, and refine features in the 1:100,000-scale (1:100K) (1:63,360-scale in Alaska) National Hydrography Dataset to create the national 1:1M hydrography dataset. Custom tools and semi-automated processes were created to facilitate generalization of the 1:100K National Hydrography Dataset (1:63,360-scale in Alaska) to 1:1M on the basis of existing small-scale hydrography datasets. The first step in creating the new 1:1M dataset was to address feature selection and optimal data density in the streams network. Several existing methods were evaluated. The production method that was established for selecting features for inclusion in the 1:1M dataset uses a combination of the existing attributes and network in the National Hydrography Dataset and several of the concepts from the methods evaluated. The process for creating the 1:1M waterbodies dataset required a similar approach to that used for the streams dataset. Geometric simplification of features was the next step. Stream reaches and waterbodies indicated in the feature selection process were exported as new feature classes and then simplified using a geographic information system tool. The final step was refinement of the 1:1M streams and waterbodies. Refinement was done through the use of additional geographic information system tools.
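    The report used custom GIS tools; as a hedged illustration of the geometric simplification step, the standard Douglas-Peucker algorithm is available off the shelf via shapely (the coordinates below are invented):

      from shapely.geometry import LineString

      reach = LineString([(0, 0), (1.0, 0.1), (2.0, -0.1), (3.0, 0.05), (4.0, 0.0)])

      # Tolerance is in map units; larger values remove more vertices, which is
      # the essence of generalizing 1:100K hydrography toward a 1:1M product.
      simplified = reach.simplify(tolerance=0.2, preserve_topology=True)
      print(len(reach.coords), "->", len(simplified.coords), "vertices")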

  11. A Fast Surrogate-facilitated Data-driven Bayesian Approach to Uncertainty Quantification of a Regional Groundwater Flow Model with Structural Error

    NASA Astrophysics Data System (ADS)

    Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.

    2016-12-01

    Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. In this study, Bayesian inference is facilitated using high performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by the classical Bayesian calibration that does not account for model structural error. In addition, the Bayesian with error model method gives significantly more accurate prediction along with reasonable credible intervals.
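    A schematic sketch of the idea, not the paper's implementation (the linear "surrogate", the constant bias term and all priors are toy choices): a model parameter theta and a structural-error term b are inferred jointly by random-walk Metropolis, so that b, rather than theta, absorbs the model discrepancy.

      import numpy as np

      rng = np.random.default_rng(2)
      x = np.linspace(0.0, 10.0, 50)                     # observation locations
      surrogate = lambda theta: theta * x                # cheap stand-in emulator
      y_obs = 1.5 * x + 0.4 + rng.normal(0, 0.1, 50)     # truth carries a 0.4 bias

      def log_post(theta, b, sigma=0.1):
          resid = y_obs - (surrogate(theta) + b)         # data minus (model + error term)
          return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * (theta**2 + b**2) / 10.0

      theta, b, samples = 0.0, 0.0, []
      for _ in range(20_000):                            # random-walk Metropolis
          tp, bp = theta + rng.normal(0, 0.05), b + rng.normal(0, 0.05)
          if np.log(rng.uniform()) < log_post(tp, bp) - log_post(theta, b):
              theta, b = tp, bp
          samples.append((theta, b))
      theta_s, b_s = np.array(samples[5_000:]).T
      print(f"theta ~ {theta_s.mean():.2f}, bias ~ {b_s.mean():.2f}")

    Calibrating without b would tilt theta upward to compensate for the bias, which is precisely the over-adjustment of parameters the abstract warns about.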

  12. 46 CFR 12.15-15 - Engineman.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... satisfactorily at least 2 weeks indoctrination and training in the engine department of a partially automated steam vessel of 4,000 horsepower or over; or (3) Satisfactory completion of a course of training for...

  13. Automation of Hubble Space Telescope Mission Operations

    NASA Technical Reports Server (NTRS)

    Burley, Richard; Goulet, Gregory; Slater, Mark; Huey, William; Bassford, Lynn; Dunham, Larry

    2012-01-01

    On June 13, 2011, after more than 21 years, 115 thousand orbits, and nearly 1 million exposures taken, the operation of the Hubble Space Telescope successfully transitioned from 24x7x365 staffing to 8x5 staffing. This required the automation of routine mission operations including telemetry and forward link acquisition, data dumping and solid-state recorder management, stored command loading, and health and safety monitoring of both the observatory and the HST Ground System. These changes were driven by budget reductions, and required ground system and onboard spacecraft enhancements across the entire operations spectrum, from planning and scheduling systems to payload flight software. Changes in personnel and staffing were required in order to adapt to the new roles and responsibilities required in the new automated operations era. This paper will provide a high level overview of the obstacles to automating nominal HST mission operations, both technical and cultural, and how those obstacles were overcome.

  14. Direct-methods structure determination of a trypanosome RNA-editing substrate fragment with translational pseudosymmetry

    DOE PAGES

    Mooers, Blaine H. M.

    2016-03-24

    Using direct methods starting from random phases, the crystal structure of a 32-base-pair RNA (675 non-H RNA atoms in the asymmetric unit) was determined using only the native diffraction data (resolution limit 1.05 Å) and the computer program SIR2014. The almost three helical turns of the RNA in the asymmetric unit introduced partial or imperfect translational pseudosymmetry (TPS) that modulated the intensities when averaged by the l Miller indices but still escaped automated detection. Almost six times as many random phase sets had to be tested on average to reach a correct structure compared with a similar-sized RNA hairpin (27 nucleotides, 580 non-H RNA atoms) without TPS. Lastly, more sensitive methods are needed for the automated detection of partial TPS.

  16. A fully-automated multiscale kernel graph cuts based particle localization scheme for temporal focusing two-photon microscopy

    NASA Astrophysics Data System (ADS)

    Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei

    2017-03-01

    The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, due to strong, non-negligible noise and the diffraction rings surrounding particles, further research is extremely difficult without a precise particle localization technique. In this paper, we developed a fully-automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.
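    A hedged sketch of the temporal denoising idea (the paper's "hybrid" variant and its tuning are not reproduced): a per-pixel scalar Kalman filter across the frame sequence, treating each pixel's true intensity as a slowly varying state observed with noise.

      import numpy as np

      def kalman_denoise(frames, process_var=1e-3, meas_var=5e-2):
          """frames: array (T, H, W); returns the temporally filtered sequence."""
          x = frames[0].astype(float)           # state estimate per pixel
          P = np.ones_like(x)                   # estimate variance per pixel
          out = np.empty(frames.shape, dtype=float)
          out[0] = x
          for t in range(1, len(frames)):
              P = P + process_var               # predict (random-walk state model)
              K = P / (P + meas_var)            # Kalman gain
              x = x + K * (frames[t] - x)       # update toward the new frame
              P = (1.0 - K) * P
              out[t] = x
          return out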

  17. Studies of planning behavior of aircraft pilots in normal, abnormal and emergency situations

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.; Hillmann, K.

    1981-01-01

    A methodology for the study of planning is presented, and the results of applying the methodology within two experimental investigations of planning behavior of aircraft pilots in normal, abnormal, and emergency situations are discussed. Beyond showing that the methodology yields consistent results, these experiments also lead to concepts in terms of a dichotomy between event-driven and time-driven planning, subtle effects of automation on planning, and the relationship of planning to workload and flight performance.

  18. A Balancing Act.

    ERIC Educational Resources Information Center

    Bilal, Dania; Barry, Jeff; Penniman, W. David

    1999-01-01

    Reviews automated-systems activities in libraries during the past year and profiles major vendors. Topics include new partnership arrangements driven by competition; library-systems revenues; Y2K issues; Windows-based interfaces; consulting; outsourcing; development trends; global system sales; and sales by type of library. (LRW)

  19. Officials nationwide give a green light to automated traffic enforcement

    DOT National Transportation Integrated Search

    2000-03-11

    There has been resistance to using cameras to automatically identify vehicles driven by motorists who run red lights and drive faster than the posted speed limits. Fairness, privacy, and "big brother" have been cited as reasons. The article examines ...

  20. Analysis of Slug Tests in Formations of High Hydraulic Conductivity

    USGS Publications Warehouse

    Butler, J.J.; Garnett, E.J.; Healey, J.M.

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.
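
    As an illustration of the kind of model such a spreadsheet implements, the sketch below fits the generic linearized underdamped (damped-oscillation) response to a normalized head record with SciPy; the model form and parameter names (natural frequency, damping ratio) are the standard second-order idealization, not necessarily the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

def underdamped_head(t, wn, zeta):
    """Normalized head response h(t)/h0 of a linear, underdamped
    oscillator (a common idealization for high-K slug tests);
    wn is the natural frequency, zeta the damping ratio (< 1)."""
    wd = wn * np.sqrt(1.0 - zeta**2)
    return np.exp(-zeta * wn * t) * (np.cos(wd * t)
                                     + zeta * wn / wd * np.sin(wd * t))

# Hypothetical field record: times in seconds, heads normalized by h0.
t = np.linspace(0, 30, 300)
h = underdamped_head(t, 0.9, 0.25) + 0.01 * np.random.randn(t.size)
(wn, zeta), _ = curve_fit(underdamped_head, t, h, p0=(1.0, 0.3),
                          bounds=([0.01, 0.01], [10.0, 0.99]))
print(f"omega_n = {wn:.3f} 1/s, zeta = {zeta:.3f}")
```

    The fitted pair (omega_n, zeta) would then be converted to hydraulic conductivity via the chosen high-K slug-test model, a step that depends on well geometry and is omitted here.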

  1. Nonlinear model of a rotating hub-beams structure: Equations of motion

    NASA Astrophysics Data System (ADS)

    Warminski, Jerzy

    2018-01-01

    Dynamics of a rotating structure composed of a rigid hub and flexible beams is presented in the paper. A nonlinear model of a beam takes into account bending, extension and nonlinear curvature. The influence of geometric nonlinearity and of nonconstant angular velocity on the dynamics of the rotating structure is presented. The exact equations of motion and associated boundary conditions are derived on the basis of Hamilton's principle. A simplification of the exact nonlinear mathematical model is proposed, taking into account a second-order approximation. The reduced partial differential equations of motion, together with the associated boundary conditions, can be used to study natural or forced vibrations of a rotating structure considering constant or nonconstant angular speed of the rigid hub and an arbitrary number of flexible blades.
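
    For reference, the variational statement invoked above is the extended Hamilton's principle; in its generic form (the paper's specific energy functionals are not given in the abstract):

```latex
% Extended Hamilton's principle: the true motion renders the action
% stationary, with non-conservative virtual work \delta W_{nc} included.
\int_{t_1}^{t_2} \left( \delta T - \delta V + \delta W_{nc} \right) \mathrm{d}t = 0
```

    Here T is the kinetic energy of hub and beams, V the elastic (strain) energy, and δW_nc the virtual work of non-conservative forces; the beam equations and boundary conditions follow from the usual integrations by parts.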

  2. Diffusion of Charged Species in Liquids

    NASA Astrophysics Data System (ADS)

    Del Río, J. A.; Whitaker, S.

    2016-11-01

    In this study the laws of mechanics for multi-component systems are used to develop a theory for the diffusion of ions in the presence of an electrostatic field. The analysis begins with the governing equation for the species velocity and it leads to the governing equation for the species diffusion velocity. Simplification of this latter result provides a momentum equation containing three dominant forces: (a) the gradient of the partial pressure, (b) the electrostatic force, and (c) the diffusive drag force that is a central feature of the Maxwell-Stefan equations. For ideal gas mixtures we derive the classic Nernst-Planck equation. For liquid-phase diffusion we encounter a situation in which the Nernst-Planck contribution to diffusion differs by several orders of magnitude from that obtained for ideal gases.
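
    The classic equation referred to above has the standard dilute-solution form (stated here for reference; symbols follow common usage rather than the paper's notation):

```latex
% Nernst-Planck flux of ionic species i in the ideal (dilute) limit:
\mathbf{N}_i \;=\; -\,D_i \nabla c_i
\;-\; \frac{z_i F}{R T}\, D_i c_i \nabla \phi
\;+\; c_i \mathbf{v}
```

    with D_i the diffusivity, c_i the concentration, z_i the valence, F Faraday's constant, φ the electrostatic potential and v the mixture velocity.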

  3. Diffusion of Charged Species in Liquids.

    PubMed

    Del Río, J A; Whitaker, S

    2016-11-04

    In this study the laws of mechanics for multi-component systems are used to develop a theory for the diffusion of ions in the presence of an electrostatic field. The analysis begins with the governing equation for the species velocity and it leads to the governing equation for the species diffusion velocity. Simplification of this latter result provides a momentum equation containing three dominant forces: (a) the gradient of the partial pressure, (b) the electrostatic force, and (c) the diffusive drag force that is a central feature of the Maxwell-Stefan equations. For ideal gas mixtures we derive the classic Nernst-Planck equation. For liquid-phase diffusion we encounter a situation in which the Nernst-Planck contribution to diffusion differs by several orders of magnitude from that obtained for ideal gases.

  4. Diffusion of Charged Species in Liquids

    PubMed Central

    del Río, J. A.; Whitaker, S.

    2016-01-01

    In this study the laws of mechanics for multi-component systems are used to develop a theory for the diffusion of ions in the presence of an electrostatic field. The analysis begins with the governing equation for the species velocity and it leads to the governing equation for the species diffusion velocity. Simplification of this latter result provides a momentum equation containing three dominant forces: (a) the gradient of the partial pressure, (b) the electrostatic force, and (c) the diffusive drag force that is a central feature of the Maxwell-Stefan equations. For ideal gas mixtures we derive the classic Nernst-Planck equation. For liquid-phase diffusion we encounter a situation in which the Nernst-Planck contribution to diffusion differs by several orders of magnitude from that obtained for ideal gases. PMID:27811959

  5. Analysis of slug tests in formations of high hydraulic conductivity.

    PubMed

    Butler, James J; Garnett, Elizabeth J; Healey, John M

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  6. Work Simplification

    ERIC Educational Resources Information Center

    Ross, Lynne

    1970-01-01

    Excerpts from a talk by Mrs. Ross at the 23rd annual convention of the American School Food Service Association in Detroit, August 5, 1969. A book on work simplification by Mrs. Ross will be available in June from the Iowa State University Press, Ames, Iowa. (Editor)

  7. 76 FR 27753 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ... collection requirements related to Simplification of Entity Classification Rules. DATES: Written comments....gov . SUPPLEMENTARY INFORMATION: Title: Simplification of Entity Classification Rules. OMB Number... partnerships for federal tax purposes. The election is made by filing Form 8832, Entity Classification Election...

  8. DNA G-segment bending is not the sole determinant of topology simplification by type II DNA topoisomerases.

    PubMed

    Thomson, Neil H; Santos, Sergio; Mitchenall, Lesley A; Stuchinskaya, Tanya; Taylor, James A; Maxwell, Anthony

    2014-08-21

    DNA topoisomerases control the topology of DNA. Type II topoisomerases exhibit topology simplification, whereby products of their reactions are simplified beyond that expected based on thermodynamic equilibrium. The molecular basis for this process is unknown, although DNA bending has been implicated. To investigate the role of bending in topology simplification, the DNA bend angles of four enzymes of different types (IIA and IIB) were measured using atomic force microscopy (AFM). The enzymes tested were Escherichia coli topo IV and yeast topo II (type IIA enzymes that exhibit topology simplification), and Methanosarcina mazei topo VI and Sulfolobus shibatae topo VI (type IIB enzymes, which do not). Bend angles were measured using the manual tangent method from topographical AFM images taken with a novel amplitude-modulated imaging mode: small amplitude small set-point (SASS), which optimises resolution for a given AFM tip size and minimises tip convolution with the sample. This gave improved accuracy and reliability and revealed that all 4 topoisomerases bend DNA by a similar amount: ~120° between the DNA entering and exiting the enzyme complex. These data indicate that DNA bending alone is insufficient to explain topology simplification and that the 'exit gate' may be an important determinant of this process.

  9. DNA G-segment bending is not the sole determinant of topology simplification by type II DNA topoisomerases

    NASA Astrophysics Data System (ADS)

    Thomson, Neil H.; Santos, Sergio; Mitchenall, Lesley A.; Stuchinskaya, Tanya; Taylor, James A.; Maxwell, Anthony

    2014-08-01

    DNA topoisomerases control the topology of DNA. Type II topoisomerases exhibit topology simplification, whereby products of their reactions are simplified beyond that expected based on thermodynamic equilibrium. The molecular basis for this process is unknown, although DNA bending has been implicated. To investigate the role of bending in topology simplification, the DNA bend angles of four enzymes of different types (IIA and IIB) were measured using atomic force microscopy (AFM). The enzymes tested were Escherichia coli topo IV and yeast topo II (type IIA enzymes that exhibit topology simplification), and Methanosarcina mazei topo VI and Sulfolobus shibatae topo VI (type IIB enzymes, which do not). Bend angles were measured using the manual tangent method from topographical AFM images taken with a novel amplitude-modulated imaging mode: small amplitude small set-point (SASS), which optimises resolution for a given AFM tip size and minimises tip convolution with the sample. This gave improved accuracy and reliability and revealed that all 4 topoisomerases bend DNA by a similar amount: ~120° between the DNA entering and exiting the enzyme complex. These data indicate that DNA bending alone is insufficient to explain topology simplification and that the `exit gate' may be an important determinant of this process.

  10. Simplification of the kinematic model of human movement

    NASA Astrophysics Data System (ADS)

    Dusza, Jacek J.; Wawrzyniak, Zbigniew M.; del Prado Martinez, David

    2013-10-01

    The paper presents a method of simplification of the human gait model. The experimental data were obtained in the laboratory of the SATI group in the Electronics Engineering Department of the University of Valencia. As a result of the Mean Double Step (MDS) procedure, human motion was described by a matrix containing the Cartesian coordinates of 26 markers placed on the human body, recorded at 100 time points. With these data it was possible to develop a software application which performs a wide diversity of tasks, such as array simplification, mask calculation for the simplification, and error calculation, as well as tools for signal comparison and movement animation of the markers. Simplifications were made by spectral analysis of the signals and by calculating the standard deviation of the differences between each signal and its approximation. Using this method, the displacement signals can be written as time series limited to a small number of harmonic components, which allows a high degree of data compression. The model presented in this work can be applied in the context of medical diagnostics or rehabilitation, because a large number of harmonics required for a given approximation error may indicate abnormalities (orthopaedic symptoms) in the gait cycle.
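
    The harmonic-truncation idea is easy to reproduce. The sketch below (synthetic data; not the SATI laboratory's software) keeps the first n Fourier harmonics of a periodic marker signal and reports the standard deviation of the residual, the same error criterion the abstract describes.

```python
import numpy as np

def truncate_harmonics(signal, n_harmonics):
    """Approximate a periodic gait signal (one mean double step) by its
    first n_harmonics Fourier components; return the approximation and
    the standard deviation of the residual."""
    spectrum = np.fft.rfft(signal)
    spectrum[n_harmonics + 1:] = 0.0          # keep DC + n harmonics
    approx = np.fft.irfft(spectrum, n=len(signal))
    return approx, np.std(signal - approx)

# Hypothetical marker trajectory sampled at the 100 time points of the
# MDS cycle (here synthetic: two harmonics plus noise).
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
x = 3 * np.sin(t) + 0.5 * np.sin(3 * t) + 0.05 * np.random.randn(100)
for n in (1, 2, 3, 4):
    _, err = truncate_harmonics(x, n)
    print(n, round(err, 4))   # error drops sharply once n covers harmonic 3
```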

  11. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, Donald W.; Johnson, John A.; Smartt, Herschel B.

    1987-01-01

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws.

  12. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, D.W.; Johnson, J.A.; Smartt, H.B.

    1985-09-04

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws.

  13. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, D.W.; Johnson, J.A.; Smartt, H.B.

    1987-12-15

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder is disclosed. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws. 5 figs.

  14. Hierarchically Parallelized Constrained Nonlinear Solvers with Automated Substructuring

    NASA Technical Reports Server (NTRS)

    Padovan, Joe; Kwang, Abel

    1994-01-01

    This paper develops a parallelizable multilevel multiple constrained nonlinear equation solver. The substructuring process is automated to yield appropriately balanced partitioning of each succeeding level. Due to the generality of the procedure, sequential, as well as partially and fully parallel environments can be handled. This includes both single and multiprocessor assignment per individual partition. Several benchmark examples are presented. These illustrate the robustness of the procedure as well as its capability to yield significant reductions in memory utilization and calculational effort due both to updating and inversion.

  15. Automation - Changes in cognitive demands and mental workload

    NASA Technical Reports Server (NTRS)

    Tsang, Pamela S.; Johnson, Walter W.

    1987-01-01

    The effect of partial automation on mental workload in man/machine tasks is investigated experimentally. Subjective workload measures are obtained from six subjects after performance of a task battery comprising two manual tasks (flight-path control, FC, and target acquisition, TA) and one decisionmaking task (engine failure, EF); the FC task was performed both in a fully manual mode (altitude and lateral control) and in a semiautomated mode (automatic altitude control). The performance results and subjective evaluations are presented in graphs and characterized in detail. The automation is shown to improve objective performance and lower subjective workload significantly in the combined FC/TA task, but not in the FC task alone or in the FC/EF task.

  16. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    NASA Astrophysics Data System (ADS)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives remains an open question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged-prediction effect of stand-alone data-driven models. The multigene ingredient tends to identify the underlying nonlinear system with expressions simpler than those of classical monolithic GP, and the simplification component eventually exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem by examining the evolved models and picking the best-performing programs for further analysis.
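
    Two of the three ingredients are simple enough to sketch. The code below shows a moving-average pre-filter and a brute-force Pareto front over hypothetical (complexity, error) pairs; it illustrates the selection idea, not the MA-MGGP implementation.

```python
import numpy as np

def moving_average(x, window):
    """Simple moving-average filter, the MA pre-processing step."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

def pareto_front(points):
    """Indices of non-dominated (complexity, error) pairs: no other
    model is at least as simple and at least as accurate."""
    idx = []
    for i, (ci, ei) in enumerate(points):
        dominated = any(cj <= ci and ej <= ei and (cj, ej) != (ci, ei)
                        for cj, ej in points)
        if not dominated:
            idx.append(i)
    return idx

# Hypothetical evolved models: (number of tree nodes, validation RMSE).
models = [(5, 0.90), (9, 0.55), (14, 0.52), (22, 0.50), (23, 0.58)]
print(pareto_front(models))   # -> [0, 1, 2, 3]; (23, 0.58) is dominated
```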

  17. Vehicle automation: a remedy for driver stress?

    PubMed

    Funke, G; Matthews, G; Warm, J S; Emo, A K

    2007-08-01

    The present study addressed the effects of stress, vehicle automation and subjective state on driver performance and mood in a simulated driving task. A total of 168 college students participated. Participants in the stress-induction condition completed a 'winter' drive, which included periodic loss of control episodes. Participants in the no-stress-induction condition were not exposed to loss of control. An additional, independent manipulation of vehicle speed was also conducted, consisting of two control conditions requiring manual speed regulation and a third in which vehicle speed was automatically regulated by the simulation. Stress and automation both influenced subjective distress, but the two factors did not interact. Driver performance data indicated that vehicle automation impacted performance similarly in the stress and no-stress conditions. Individual differences in subjective stress response and performance were also investigated. Resource theory provides a framework that partially but not completely explains the relationship between vehicle automation and driver stress. Implications for driver workload, safety and training are discussed.

  18. Manual versus automated γ-H2AX foci analysis across five European laboratories: can this assay be used for rapid biodosimetry in a large scale radiation accident?

    PubMed

    Rothkamm, Kai; Barnard, Stephen; Ainsbury, Elizabeth A; Al-Hafidh, Jenna; Barquinero, Joan-Francesc; Lindholm, Carita; Moquet, Jayne; Perälä, Marjo; Roch-Lefèvre, Sandrine; Scherthan, Harry; Thierens, Hubert; Vral, Anne; Vandersickel, Veerle

    2013-08-30

    The identification of severely exposed individuals and reassurance of the 'worried well' are of prime importance for initial triage following a large scale radiation accident. We aim to develop the γ-H2AX foci assay into a rapid biomarker tool for use in accidents. Here, five laboratories established a standard operating procedure and analysed 100 ex vivo γ-irradiated, 4 or 24 h incubated and overnight-shipped lymphocyte samples from four donors to generate γ-H2AX reference data, using manual and/or automated foci scoring strategies. In addition to acute, homogeneous exposures to 0, 1, 2 and 4 Gy, acute simulated partial body (4 Gy to 50% of cells) and protracted exposures (4 Gy over 24 h) were analysed. Data from all laboratories could be satisfactorily fitted with linear dose response functions. Average yields observed at 4 h post exposure were 2-4 times higher than at 24 h and varied considerably between laboratories. Automated scoring caused larger uncertainties than manual scoring and was unable to identify partial exposures, which were detectable in manually scored samples due to their overdispersed foci distributions. Protracted exposures were detectable but doses could not be accurately estimated with the γ-H2AX assay. We conclude that the γ-H2AX assay may be useful for rapid triage following a recent acute radiation exposure. The potentially higher speed and convenience of automated relative to manual foci scoring needs to be balanced against its compromised accuracy and inability to detect partial body exposures. Regular re-calibration or inclusion of reference samples may be necessary to ensure consistent results between laboratories or over long time periods. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
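
    A minimal numerical sketch of the two analysis ideas, calibration against a linear dose response and an overdispersion screen for partial-body exposure, is given below; the reference yields and counts are invented for illustration and are not the five laboratories' data.

```python
import numpy as np

# Hypothetical reference data: mean gamma-H2AX foci per cell at 4 h.
doses = np.array([0.0, 1.0, 2.0, 4.0])            # Gy
yields = np.array([0.4, 3.1, 6.0, 11.8])          # foci/cell (illustrative)

slope, intercept = np.polyfit(doses, yields, 1)   # linear calibration
def estimate_dose(mean_foci):
    return (mean_foci - intercept) / slope

# Overdispersion screen: for a homogeneous (Poisson-like) exposure the
# variance-to-mean ratio of per-cell foci counts is near 1; values well
# above 1 suggest a mixed exposed/unexposed cell population. Formal
# tests (e.g., Papworth's u) refine this.
foci = np.concatenate([np.random.poisson(8.0, 50),
                       np.random.poisson(0.4, 50)])  # 50% partial body
print(estimate_dose(foci.mean()))
print(foci.var(ddof=1) / foci.mean())  # >> 1: overdispersed
```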

  19. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    PubMed

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. © 2015 by John Wiley & Sons, Inc. Copyright © 2015 John Wiley & Sons, Inc.

  20. Semantics and types of cough

    PubMed Central

    Chung, Kian Fan; Bolser, Don; Davenport, Paul; Fontana, Giovanni; Morice, Alyn; Widdicombe, John

    2010-01-01

    The panel considered the different types of cough in terms of basic mechanisms and clinical manifestations; both experimentally and clinically cough could occur in single efforts and as ‘bouts’ or ‘epochs’. There were different definitions of cough but, provided the definition used was clear, this did not seem to be a major concern. The methods available for determining the nature or type of clinical cough were discussed, in particular automated cough counting in the clinic and more sophisticated methods available in the laboratory. With regard to semantics, there has been great variation in the names used; this applies to nervous sensors for cough, to cough reflexes and epochs, to clinical names for cough, and to cough sounds. Some simplification and uniformity of nomenclature seemed desirable although, provided the use of a name was clear, little confusion probably existed. The panel felt that the cough nomenclature would evolve with time and would prove to be useful for investigators, clinicians and coughers. PMID:19136069

  1. Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems.

    PubMed

    Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick

    2013-01-01

    Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to the excessive computation time. Simplifications are consequently required. A lumped conceptual modelling approach results in a much faster calculation. The process of identifying and calibrating the conceptual model structure could, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested for the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink(®) tool was developed to guide the modeller through the step-wise model construction, reducing significantly the time required for the conceptual modelling process.
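
    The abstract does not specify the conceptual model structure, but a common building block in lumped sewer models is the linear reservoir, sketched below as a hypothetical illustration (the actual tool also accounts for backwater effects, which this element alone cannot represent).

```python
import numpy as np

def linear_reservoir(inflow, k, dt=1.0):
    """Single linear reservoir, a common conceptual sewer-model element:
    storage S obeys dS/dt = q_in - q_out with q_out = S / k."""
    S, out = 0.0, np.empty_like(inflow, dtype=float)
    for i, q_in in enumerate(inflow):
        S += dt * (q_in - S / k)   # explicit Euler step
        out[i] = S / k
    return out

# Hypothetical rainfall-driven inflow hydrograph (m^3/s), unit steps.
inflow = np.concatenate([np.zeros(10), np.full(12, 2.0), np.zeros(40)])
print(linear_reservoir(inflow, k=6.0).round(2))
```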

  2. A universal piezo-driven ultrasonic cell microinjection system.

    PubMed

    Huang, Haibo; Mills, James K; Lu, Cong; Sun, Dong

    2011-08-01

    Over the past decade, the rapid development of biotechnologies such as gene injection, in-vitro fertilization, intracytoplasmic sperm injection (ICSI) and drug development has led to great demand for highly automated, high-precision equipment for microinjection. Recently a new cell injection technology using piezo-driven pipettes with a very small mercury column was proposed and successfully applied in ICSI to a variety of mammalian species. Although this technique significantly improves the survival rates of the ICSI process, shortcomings due to the toxicity of mercury and damage to the cell membrane due to large lateral tip oscillations of the injector pipette may limit its application. In this paper, a new cell injection system for automatic batch injection of suspended cells is developed. A new design of the piezo-driven cell injector is proposed for automated suspended-cell injection. This new design relocates the piezo oscillation actuator to the injector pipette, which eliminates the vibration effect on other parts of the micromanipulator. A small piezo stack is sufficient to perform the cell injection process. Harmful lateral tip oscillations of the injector pipette are reduced substantially without the use of a mercury column. Furthermore, ultrasonic vibration micro-dissection (UVM) theory is utilized to analyze the piezo-driven cell injection process, and the source of the lateral oscillations of the injector pipette is investigated. In preliminary experiments on the injection of a large number of zebrafish embryos (n = 200), the injector pipette easily pierced through the cell membrane at a low injection speed with almost no deformation of the cell wall, and with a high success rate (96%) and survival rate (80.7%). This new injection approach shows good potential for precision injection with less damage to the injected cells.

  3. Flavourzyme, an Enzyme Preparation with Industrial Relevance: Automated Nine-Step Purification and Partial Characterization of Eight Enzymes.

    PubMed

    Merz, Michael; Eisele, Thomas; Berends, Pieter; Appel, Daniel; Rabe, Swen; Blank, Imre; Stressler, Timo; Fischer, Lutz

    2015-06-17

    Flavourzyme is sold as a peptidase preparation from Aspergillus oryzae. The enzyme preparation is widely and diversely used for protein hydrolysis in industrial and research applications. However, detailed information about the composition of this mixture is still missing due to the complexity. The present study identified eight key enzymes by mass spectrometry and partially by activity staining on native polyacrylamide gels or gel zymography. The eight enzymes identified were two aminopeptidases, two dipeptidyl peptidases, three endopeptidases, and one α-amylase from the A. oryzae strain ATCC 42149/RIB 40 (yellow koji mold). Various specific marker substrates for these Flavourzyme enzymes were ascertained. An automated, time-saving nine-step protocol for the purification of all eight enzymes within 7 h was designed. Finally, the purified Flavourzyme enzymes were biochemically characterized with regard to pH and temperature profiles and molecular sizes.

  4. Medical Evidence Influence on Inpatients and Nurses Pain Ratings Agreement

    PubMed Central

    Samolsky Dekel, Boaz Gedaliahu; Gori, Alberto; Vasarri, Alessio; Sorella, Maria Cristina; Di Nino, Gianfranco; Melotti, Rita Maria

    2016-01-01

    Biased pain evaluation due to automated heuristics driven by symptom uncertainty may undermine pain treatment; medical evidence moderators are thought to play a role in such circumstances. We explored, in this cross-sectional survey, the effect of such moderators (e.g., nurse awareness of patients' pain experience and treatment) on the agreement between n = 862 inpatients' self-reported pain and n = 115 nurses' pain ratings using a numerical rating scale. We assessed the mean absolute difference, agreement (κ-statistics), and correlation (Spearman rank) of inpatients' and nurses' pain ratings and analyzed the congruence categories' (CCs: underestimation, congruence, and overestimation) proportions and their dependence upon pain categories for each medical evidence moderator (χ² analysis). Pain ratings agreement and correlation were limited; the CCs proportions were further modulated by the studied moderators. Medical evidence promoted in nurses overestimation of low and underestimation of high inpatients' self-reported pain. Knowledge of the negative influence of automated heuristics driven by symptom uncertainty and medical-evidence moderators on pain evaluation may render pain assessment more accurate. PMID:27445633

  5. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-01-01

    Currently available robotic systems provide limited support for CAD-based, model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three-dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  6. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Astrophysics Data System (ADS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-02-01

    Currently available robotic systems provide limited support for CAD-based, model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three-dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  7. White-light parametric instabilities in plasmas.

    PubMed

    Santos, J E; Silva, L O; Bingham, R

    2007-06-08

    Parametric instabilities driven by partially coherent radiation in plasmas are described by a generalized statistical Wigner-Moyal set of equations, formally equivalent to the full wave equation, coupled to the plasma fluid equations. A generalized dispersion relation for stimulated Raman scattering driven by a partially coherent pump field is derived, revealing a growth-rate dependence on the coherence width σ of the radiation field that scales as 1/σ for backscattering (a three-wave process) and as 1/σ^(1/2) for direct forward scattering (a four-wave process). Our results demonstrate the possibility of controlling the growth rates of these instabilities by properly using broadband pump radiation fields.
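
    The quoted scalings can be summarized compactly (restating the abstract, with γ the instability growth rate):

```latex
% Growth-rate scaling with the pump coherence width \sigma:
\gamma_{\text{backscatter}} \propto \sigma^{-1} \quad \text{(three-wave)},
\qquad
\gamma_{\text{forward}} \propto \sigma^{-1/2} \quad \text{(four-wave)}
```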

  8. A test matrix sequencer for research test facility automation

    NASA Technical Reports Server (NTRS)

    Mccartney, Timothy P.; Emery, Edward F.

    1990-01-01

    The hardware and software configuration of the Test Matrix Sequencer, a general-purpose test matrix profiler that was developed for research test facility automation at the NASA Lewis Research Center, is described. The system provides set points to controllers and contact closures to data systems during the course of a test. The Test Matrix Sequencer consists of a microprocessor-controlled system which is operated from a personal computer. The software program, which is the main element of the overall system, is interactive and menu-driven, with pop-up windows and help screens. Analog and digital input/output channels can be controlled from a personal computer using the software program. The Test Matrix Sequencer provides more efficient use of aeronautics test facilities by automating repetitive tasks that were once done manually.

  9. TRIAGE DOSE ASSESSMENT FOR PARTIAL-BODY EXPOSURE: DICENTRIC ANALYSIS

    PubMed Central

    Moroni, Maria; Pellmar, Terry C.

    2009-01-01

    Partial-body biodosimetry is likely to be required after a radiological or nuclear exposure. Clinical signs and symptoms, the distribution of dicentrics in circulating blood cells, organ-specific biomarkers, and physical signals in teeth and nails can all provide indications of non-homogeneous exposures. Organ-specific biomarkers may provide early warning regarding physiological systems at risk after radiation injury. A combination of markers and symptoms will be needed to gain clinical insights for therapeutic approaches. Analysis of dicentrics, a marker specific for radiation injury, is the “gold standard” of biodosimetry and can reveal partial-body exposures. Automation of sample processing for dicentric analysis can increase throughput through customization of off-the-shelf technologies for cytogenetic sample processing and information management. Automated analysis of the metaphase spreads is currently limited, but improvements are in development. Our efforts bridge the technological gaps to allow the use of the dicentric chromosome assay (DCA) for risk-based stratification of mass casualties. This article summarizes current knowledge on partial-body cytogenetic dose assessment, synthesizing information that leads to the proposal of an approach to triage dose prediction in radiation mass casualties based on equivalent whole-body doses under partial-body exposure conditions, and assesses the validity of using this model. An initial screening using only 20 metaphase spreads per subject can confirm irradiation above 2 Gy. A subsequent increase to 50 metaphases improves dose determination enough to allow risk stratification for clinical triage. Metaphases evaluated for an inhomogeneous distribution of dicentrics can reveal partial-body exposures. We tested the validity of this approach in an in vitro model that simulates partial-body irradiation by mixing irradiated and un-irradiated lymphocytes in various proportions. Our preliminary results support the notion that this approach will be effective under a range of conditions, including some partial-body exposures, but may have limitations with low doses or small proportions of irradiated body. Our studies address an important problem in the diagnosis of partial-body irradiation and dose assessment in mass casualties and propose a solution. However, additional work is needed to fully develop and validate the application of the DCA to partial-body exposures. PMID:20065689
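
    The inhomogeneity argument can be made concrete with the textbook contaminated-Poisson picture (in the spirit of Dolphin's classical method, though not necessarily the authors' protocol): if a fraction f of cells received a dicentric yield Y and the rest received none, the observed mean and the observed fraction of dicentric-free cells determine both unknowns. The counts below are invented.

```python
import numpy as np
from scipy.optimize import brentq

def contaminated_poisson(mean_dic, zero_frac):
    """Estimate the per-cell yield Y in the irradiated fraction f,
    assuming dicentrics are Poisson(Y) in exposed cells and zero
    elsewhere:  mean = f*Y,  P(0) = (1 - f) + f*exp(-Y)."""
    g = lambda Y: 1.0 - (mean_dic / Y) * (1.0 - np.exp(-Y)) - zero_frac
    Y = brentq(g, mean_dic + 1e-9, 50.0)   # Y must exceed the mean
    return Y, mean_dic / Y

# Illustrative scoring of 50 metaphases: 30 cells with 0 dicentrics.
counts = np.array([0]*30 + [1]*8 + [2]*7 + [3]*5)
Y, f = contaminated_poisson(counts.mean(), np.mean(counts == 0))
print(f"yield in exposed cells = {Y:.2f}, exposed fraction = {f:.2f}")
```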

  10. Increased Resistance to Flow and Ventilator Failure Secondary to Faulty CO2 Absorbent Insert Not Detected During Automated Anesthesia Machine Check: A Case Report.

    PubMed

    Moreno-Duarte, Ingrid; Montenegro, Julio; Balonov, Konstantin; Schumann, Roman

    2017-04-15

    Most modern anesthesia workstations provide automated checkout, which indicates the readiness of the anesthesia machine. In this case report, an anesthesia machine passed the automated machine checkout. Minutes after the induction of general anesthesia, we observed a mismatch between the selected and delivered tidal volumes in the volume auto flow mode with increased inspiratory resistance during manual ventilation. Endotracheal tube kinking, circuit obstruction, leaks, and patient-related factors were ruled out. Further investigation revealed a broken internal insert within the CO2 absorbent canister that allowed absorbent granules to cause a partial obstruction to inspiratory and expiratory flow triggering contradictory alarms. We concluded that even when the automated machine checkout indicates machine readiness, unforeseen equipment failure due to unexpected events can occur and require providers to remain vigilant.

  11. Estimated Bounds and Important Factors for Fuel Use and Consumer Costs of Connected and Automated Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, T. S.; Gonder, Jeff; Chen, Yuche

    This report details a study of the potential effects of connected and automated vehicle (CAV) technologies on vehicle miles traveled (VMT), vehicle fuel efficiency, and consumer costs. Related analyses focused on a range of light-duty CAV technologies in conventional powertrain vehicles -- from partial automation to full automation, with and without ridesharing -- compared to today's base-case scenario. Analysis results revealed widely disparate upper- and lower-bound estimates for fuel use and VMT, ranging from a tripling of fuel use to decreasing light-duty fuel use to below 40% of today's level. This wide range reflects uncertainties in the ways that CAV technologies can influence vehicle efficiency and use through changes in vehicle designs, driving habits, and travel behavior. The report further identifies the most significant potential impacting factors, the largest areas of uncertainty, and where further research is particularly needed.

  12. Treatment simplification in HIV-infected adults as a strategy to prevent toxicity, improve adherence, quality of life and decrease healthcare costs

    PubMed Central

    Nachega, Jean B; Mugavero, Michael J; Zeier, Michele; Vitória, Marco; Gallant, Joel E

    2011-01-01

    Since the advent of highly active antiretroviral therapy (HAART), the treatment of human immunodeficiency virus (HIV) infection has become more potent and better tolerated. While the current treatment regimens still have limitations, they are more effective, more convenient, and less toxic than regimens used in the early HAART era, and new agents, formulations and strategies continue to be developed. Simplification of therapy is an option for many patients currently being treated with antiretroviral therapy (ART). The main goals are to reduce pill burden, improve quality of life and enhance medication adherence, while minimizing short- and long-term toxicities, reducing the risk of virologic failure and maximizing cost-effectiveness. ART simplification strategies that are currently used or are under study include the use of once-daily regimens, less toxic drugs, fixed-dose coformulations and induction-maintenance approaches. Improved adherence and persistence have been observed with the adoption of some of these strategies. The role of regimen simplification has implications not only for individual patients, but also for health care policy. With increased interest in ART regimen simplification, it is critical to study not only implications for individual tolerability, toxicity, adherence, persistence and virologic efficacy, but also cost, scalability, and potential for dissemination and implementation, such that limited human and financial resources are optimally allocated for maximal efficiency, coverage and sustainability of global HIV/AIDS treatment. PMID:21845035

  13. Effects of supervisory train control technology on operator attention.

    DOT National Transportation Integrated Search

    2005-07-31

    This report describes an experiment evaluating the effects of supervisory control automation on attention allocation while operating : a train. The study compared two levels of supervisory control (partial and full) to manual control, in terms of how...

  14. Ontology-Driven Information Integration

    NASA Technical Reports Server (NTRS)

    Tissot, Florence; Menzel, Chris

    2005-01-01

    Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.

  15. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  16. Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Donnell, James T.; Maile, Tobias; Rose, Cody

    Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit (“Revit Architecture,” 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. The Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has since been superseded by the Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report. The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model is used as a predictive benchmark during operation. Developing BIM-based criteria to support the semi-automated process should result in significant, reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC-based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide-scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.

  17. Automated full-3D digitization system for documentation of paintings

    NASA Astrophysics Data System (ADS)

    Karaszewski, Maciej; Adamczyk, Marcin; Sitnik, Robert; Michoński, Jakub; Załuski, Wojciech; Bunsch, Eryk; Bolewicki, Paweł

    2013-05-01

    In this paper, a fully automated 3D digitization system for documentation of paintings is presented. It consists of a specially designed frame system for secure fixing of the painting and a custom-designed, structured-light-based, high-resolution measurement head with no IR or UV emission. This device is automatically positioned in two axes (parallel to the surface of the digitized painting), with additional manual positioning in the third, perpendicular axis. Manual change of the observation angle is also possible around two axes to re-measure even partially shadowed areas. The whole system is built in a way which provides full protection of the digitized object (moving elements cannot reach its vicinity) and is driven by computer-controlled, highly precise servomechanisms. It can be used for automatic (without any user attention) and fast measurement of paintings, with some limitations on their properties: the maximum size of the picture is 2000 mm x 2000 mm (with deviation from flatness smaller than 20 mm). The measurement head is automatically calibrated by the system, and its working volume ranges from 50 mm x 50 mm x 20 mm (10000 points per square mm) to 120 mm x 80 mm x 60 mm (2500 points per square mm). The directional measurements obtained with this system are automatically pre-aligned using the measurement head's position coordinates known from the servomechanisms. After the whole painting is digitized, the measurements are fine-aligned with a color-based ICP algorithm to remove any influence of possible inaccuracy of the positioning devices. We present exemplary digitization results along with a discussion of the analysis opportunities opened up by such high-resolution 3D computer models of paintings.

  18. Icing research tunnel rotating bar calibration measurement system

    NASA Technical Reports Server (NTRS)

    Gibson, Theresa L.; Dearmon, John M.

    1993-01-01

    In order to measure icing patterns across a test section of the Icing Research Tunnel, an automated rotating bar measurement system was developed at the NASA Lewis Research Center. In comparison with the previously used manual measurement system, this system provides a number of improvements: increased accuracy and repeatability, increased number of data points, reduced tunnel operating time, and improved documentation. The automated system uses a linear variable differential transformer (LVDT) to measure ice accretion. This instrument is driven along the bar by means of an intelligent stepper motor which also controls data recording. This paper describes the rotating bar calibration measurement system.

  19. Electronic states of carbon alloy catalysts and nitrogen substituent effects on catalytic activity

    NASA Astrophysics Data System (ADS)

    Hata, Tomoyuki; Ushiyama, Hiroshi; Yamashita, Koichi

    2013-03-01

    In recent years, Carbon Alloy Catalysts (CACs) have attracted attention as candidates for non-platinum-based cathode catalysts in fuel cells. Oxygen reduction reactions at the cathode divide into two elementary processes: electron transfer and oxygen adsorption. The electron transfer reaction is rate-determining, and by comparison of energy levels its catalytic activity can be evaluated quantitatively. The adsorption mechanism, on the other hand, remains obscure. The purpose of this study is to understand the effect of nitrogen substitution and the oxygen adsorption mechanism through first-principles electronic structure calculations on nitrogen-substituted models. To reproduce the elementary processes of oxygen adsorption, we assumed initial structures formed according to the Pauling model: a CAC model and nitrogen-substituted CAC models in which various sites are replaced with nitrogen. Focusing on the DOS peaks of oxygen, we found a characteristic partially occupied state in some substituted models with high adsorption activity. We conclude that this state affects the adsorption activity, and discuss why partially occupied states appear, with simplification by using an orbital correlation diagram.

  20. Relationship of scattering phase shifts to special radiation force conditions for spheres in axisymmetric wave-fields.

    PubMed

    Marston, Philip L; Zhang, Likun

    2017-05-01

    When investigating the radiation forces on spheres in complicated wave-fields, the interpretation of analytical results can be simplified by retaining the s-function notation and associated phase shifts imported into acoustics from quantum scattering theory. For situations in which dissipation is negligible, as taken to be the case in the present investigation, there is an additional simplification in that partial-wave phase shifts become real numbers that vanish when the partial-wave index becomes large and when the wave-number-sphere-radius product vanishes. By restricting attention to monopole and dipole phase shifts, transitions in the axial radiation force for axisymmetric wave-fields are found to be related to wave-field parameters for traveling and standing Bessel wave-fields by considering the ratio of the phase shifts. For traveling waves, the special force conditions concern negative forces while for standing waves, the special force conditions concern vanishing radiation forces. An intermediate step involves considering the functional dependence on phase shifts. An appendix gives an approximation for zero-force plane standing wave conditions. Connections with early investigations of acoustic levitation are mentioned and some complications associated with viscosity are briefly noted.
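
    For readers unfamiliar with the imported notation: for a lossless sphere the n-th partial-wave s-function is a pure phase,

```latex
% s-function of the n-th partial wave for lossless scattering:
s_n = e^{2 i \delta_n}, \qquad \delta_n \in \mathbb{R}, \qquad
\delta_n \to 0 \ \text{as } n \to \infty \ \text{or } ka \to 0
```

    where ka is the wave-number-sphere-radius product; the transitions discussed above are organized by the ratio of the monopole and dipole phase shifts δ0 and δ1.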

  1. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  2. Use of Syntactic Elaboration Techniques to Enhance Comprehensibility of EST Texts

    ERIC Educational Resources Information Center

    Rahimi, Mohammad Ali; Rezaei, Amir

    2011-01-01

    The current study examined differential effects of two pre-modification types, syntactic elaboration and syntactic simplification (at the level of syntax and irrespective of problematic lexis), on EST students' reading comprehension. The purpose was to see whether a priori syntactic elaborative adjustment, given its advantages over simplification,…

  3. 77 FR 66361 - Reserve Requirements of Depository Institutions: Reserves Simplification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ... simplifications related to the administration of reserve requirements: 1. Create a common two-week maintenance... earlier than it would implement the common maintenance period and the penalty-free band. The Board... effective July 12, 2012. The Board announced January 24, 2013, as the implementation date for the common two...

  4. Towards Just-In-Time Partial Evaluation of Prolog

    NASA Astrophysics Data System (ADS)

    Bolz, Carl Friedrich; Leuschel, Michael; Rigo, Armin

    We introduce a just-in-time specializer for Prolog. Just-in-time specialization attempts to unify the concepts and benefits of partial evaluation (PE) and just-in-time (JIT) compilation. It is a variant of PE that occurs purely at runtime, lazily generates residual code, and is constantly driven by runtime feedback.
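
    The flavor of runtime-driven specialization can be conveyed in a few lines. The Python sketch below (deliberately toy-sized, and not Prolog or the paper's specializer) generates and caches residual code for exponentiation the first time each exponent is observed:

```python
def make_power(n):
    """Residual-code generation: specialize pow(x, n) for a now-known n
    by unrolling the recursion at specialization time."""
    if n == 0:
        return lambda x: 1
    rest = make_power(n - 1)
    return lambda x: x * rest(x)

specialized = {}
def power_jit(x, n):
    """Lazily generate and cache residual code the first time each
    exponent is encountered, mimicking runtime-driven specialization."""
    if n not in specialized:
        specialized[n] = make_power(n)
    return specialized[n](x)

print(power_jit(3, 4))   # 81; the n == 4 residual is now cached
```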

  5. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    PubMed

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  6. Discovering Hidden Controlling Parameters using Data Analytics and Dimensional Analysis

    NASA Astrophysics Data System (ADS)

    Del Rosario, Zachary; Lee, Minyong; Iaccarino, Gianluca

    2017-11-01

    Dimensional Analysis is a powerful tool, one which takes a priori information and produces important simplifications. However, if this a priori information - the list of relevant parameters - is missing a relevant quantity, then the conclusions from Dimensional Analysis will be incorrect. In this work, we present novel conclusions in Dimensional Analysis, which provide a means to detect this failure mode of missing or hidden parameters. These results are based on a restated form of the Buckingham Pi theorem that reveals a ridge function structure underlying all dimensionless physical laws. We leverage this structure by constructing a hypothesis test based on sufficient dimension reduction, allowing for an experimental data-driven detection of hidden parameters. Both theory and examples will be presented, using classical turbulent pipe flow as the working example. Keywords: experimental techniques, dimensional analysis, lurking variables, hidden parameters, buckingham pi, data analysis. First author supported by the NSF GRFP under Grant Number DGE-114747.
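
    The null-space construction behind the Buckingham Pi theorem is mechanical and worth seeing once. The sketch below (a standard textbook computation, not the authors' hypothesis test) recovers the single dimensionless group for pipe flow from the dimension matrix; note that if a relevant variable such as the viscosity were omitted, the null space would be empty and no dimensionless law could be formed, which is the kind of inconsistency that hidden-parameter detection must instead catch from data.

```python
from sympy import Matrix

# Columns: rho, V, D, mu; rows: exponents of M, L, T.
# rho = M L^-3, V = L T^-1, D = L, mu = M L^-1 T^-1.
dim = Matrix([[ 1,  0, 0,  1],    # M
              [-3,  1, 1, -1],    # L
              [ 0, -1, 0, -1]])   # T
for vec in dim.nullspace():
    print(vec.T)   # one group: exponents proportional to (1, 1, 1, -1),
                   # i.e. rho*V*D/mu, the Reynolds number (up to sign)
```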

  7. SU-E-T-502: Initial Results of a Comparison of Treatment Plans Produced From Automated Prioritized Planning Method and a Commercial Treatment Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiwari, P; Chen, Y; Hong, L

    2015-06-15

    Purpose We developed an automated treatment planning system based on a hierarchical goal programming approach. To demonstrate the feasibility of our method, we report the comparison of prostate treatment plans produced from the automated treatment planning system with those produced by a commercial treatment planning system. Methods In our approach, we prioritized the goals of the optimization and solved one goal at a time. The purpose of prioritization is to ensure that higher priority dose-volume planning goals are not sacrificed to improve lower priority goals. The algorithm has four steps. The first step optimizes dose to the target structures, while sparing key sensitive organs from radiation. In the second step, the algorithm finds the best beamlet weights to reduce toxicity risks to normal tissue while holding the objective function achieved in the first step as a constraint, with a small amount of allowed slip. Likewise, the third and fourth steps introduce lower priority normal tissue goals and beam smoothing. We compared our results with prostate treatment plans from Memorial Sloan Kettering Cancer Center developed using Eclipse, with a prescription dose of 72 Gy. A combination of linear, quadratic, and gEUD objective functions was used with a modified open source solver code (IPOPT). Results Initial plan results on 3 different cases show that the automated planning system is capable of competing with or improving on expert-driven Eclipse plans. Compared to the Eclipse planning system, the automated system produced up to 26% less mean dose to rectum and 24% less mean dose to bladder while having the same D95 (after matching) to the target. Conclusion We have demonstrated that Pareto optimal treatment plans can be generated automatically without a trial-and-error process. The solver finds an optimal plan for the given patient, as opposed to database-driven approaches that set parameters based on geometry and population modeling.
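    A toy sketch of the prioritized structure described above, using linear programming from scipy with invented coefficients (the authors use a modified IPOPT solver with linear, quadratic, and gEUD objectives): each stage optimizes a lower-priority objective while the previous stage's achieved value, relaxed by a small slip, is held as a constraint.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    A_ub = np.array([[1.0, 2.0], [3.0, 1.0]])   # toy feasibility constraints
    b_ub = np.array([10.0, 15.0])
    c1 = np.array([-1.0, -1.0])   # priority 1: maximize a target-coverage surrogate
    c2 = np.array([0.0, 1.0])     # priority 2: minimize a normal-tissue surrogate

    s1 = linprog(c1, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)   # stage 1
    slip = 1e-3                               # allowed degradation of stage-1 value
    A2 = np.vstack([A_ub, c1])                # add "hold stage-1 objective" row
    b2 = np.append(b_ub, s1.fun + slip)
    s2 = linprog(c2, A_ub=A2, b_ub=b2, bounds=[(0, None)] * 2)       # stage 2
    print(s2.x)
    ```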

  8. Automated interpretation of 3D laserscanned point clouds for plant organ segmentation.

    PubMed

    Wahabzada, Mirwaes; Paulus, Stefan; Kersting, Kristian; Mahlein, Anne-Katrin

    2015-08-08

    Plant organ segmentation from 3D point clouds is a relevant task for plant phenotyping and plant growth observation. Automated solutions are required to increase the efficiency of recent high-throughput plant phenotyping pipelines. However, plant geometrical properties vary with time, among observation scales and different plant types. The main objective of the present research is to develop a fully automated, fast and reliable data-driven approach for plant organ segmentation. The automated segmentation of plant organs using unsupervised clustering methods is crucial in cases where the goal is to get fast insights into the data, or where labeled data is unavailable or costly to obtain. To this end, we propose and compare data-driven approaches that are easy to realize and make the use of standard algorithms possible. Since normalized histograms, acquired from 3D point clouds, can be seen as samples from a probability simplex, we propose to map the data from the simplex space into Euclidean space using Aitchison's log-ratio transformation, or into the positive quadrant of the unit sphere using the square root transformation. This, in turn, paves the way to a wide range of commonly used analysis techniques that are based on measuring the similarities between data points using Euclidean distance. We investigate the performance of the resulting approaches in the practical context of grouping 3D point clouds and demonstrate empirically that they lead to clustering results with high accuracy for monocotyledonous and dicotyledonous plant species with diverse shoot architecture. An automated segmentation of 3D point clouds is demonstrated in the present work. Within seconds, first insights into plant data can be derived, even from non-labelled data. This approach is applicable to different plant species with high accuracy. The analysis cascade can be implemented in future high-throughput phenotyping scenarios and will support the evaluation of the performance of different plant genotypes exposed to stress or in different environmental scenarios.
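    A short sketch of the two mappings on toy histograms (not the authors' laser-scan data; the centered log-ratio used here is one common variant of Aitchison's log-ratio transformations):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Rows of H are normalized histograms, i.e. points on the probability simplex.
    H = np.random.dirichlet(alpha=np.ones(8), size=100)

    eps = 1e-9                                       # guard against log(0)
    logH = np.log(H + eps)
    clr = logH - logH.mean(axis=1, keepdims=True)    # centered log-ratio map
    sphere = np.sqrt(H)                              # square-root map onto unit sphere

    # Either representation makes Euclidean-distance methods directly applicable.
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(clr)
    ```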

  9. Computer vision for microscopy diagnosis of malaria.

    PubMed

    Tek, F Boray; Dempster, Andrew G; Kale, Izzet

    2009-07-13

    This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria infection in microscope images of thin blood film smears. Existing works interpret the diagnosis problem differently or propose partial solutions to the problem. A critique of these works is furnished. In addition, a general pattern recognition framework to perform diagnosis, which includes image acquisition, pre-processing, segmentation, and pattern classification components, is described. The open problems are addressed and a perspective of the future work for realization of automated microscopy diagnosis of malaria is provided.

  10. Automated spectrophotometric bicarbonate analysis in duodenal juice compared to the back titration method.

    PubMed

    Erchinger, Friedemann; Engjom, Trond; Gudbrandsen, Oddrun Anita; Tjora, Erling; Gilja, Odd H; Dimcevski, Georg

    2016-01-01

    We have recently evaluated a short endoscopic secretin test for exocrine pancreatic function. Bicarbonate concentration in duodenal juice is an important parameter in this test. Measurement of bicarbonate by back titration as the gold standard method is time-consuming, expensive and technically difficult, thus a simplified method is warranted. We aimed to evaluate an automated spectrophotometric method in samples spanning the effective range of bicarbonate concentrations in duodenal juice. We also evaluated whether freezing of samples before analysis would affect the results. Patients routinely examined with the short endoscopic secretin test, suspected for various reasons to have decreased pancreatic function, were included. Bicarbonate in duodenal juice was quantified by back titration and automated spectrophotometry. Both fresh and thawed samples were analysed spectrophotometrically. 177 samples from 71 patients were analysed. The correlation coefficient of all measurements was r = 0.98 (p < 0.001); the correlation coefficient of fresh versus frozen samples analysed with automated spectrophotometry (n = 25) was r = 0.96 (p < 0.001). Conclusions: The measurement of bicarbonate in fresh and thawed samples by automated spectrophotometric analysis correlates excellently with the back titration gold standard. This is a major simplification of direct pancreas function testing, and allows a wider distribution of bicarbonate testing in duodenal juice. Extreme values for bicarbonate concentration obtained by the autoanalyser method have to be interpreted with caution. Copyright © 2016 IAP and EPC. Published by Elsevier India Pvt Ltd. All rights reserved.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buche, D. L.

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series detailing this effort.

  12. Technical College Graduate Perceptions of College and Career Readiness

    ERIC Educational Resources Information Center

    Hanson, Dale M.

    2013-01-01

    The United States workplace requires increased levels of postsecondary education to support workforce development for an economy driven by technology, automation and global competition. By 2018, 63% of new jobs created will require postsecondary education (Carnevale, Smith, & Strohl, 2010). Currently, one in four graduates earns a bachelor's…

  13. Technology-Driven Resource Sharing: Paying for Improvement.

    ERIC Educational Resources Information Center

    Rush, James E.

    1993-01-01

    Addresses the inadequacies of traditional methods of library financing and proposes a strategy to be implemented in an environment supported by automation and networks. Publisher pricing of data rather than publications, options for charging library users with debit cards, and the role of regional library networks are discussed. (EAM)

  14. Dangerous intersections? A review of studies of fatigue and distraction in the automated vehicle.

    PubMed

    Matthews, Gerald; Neubauer, Catherine; Saxby, Dyani J; Wohleber, Ryan W; Lin, Jinchao

    2018-04-10

    The impacts of fatigue on the vehicle driver may change with technological advancements including automation and the increasing prevalence of potentially distracting in-car systems. This article reviews the authors' simulation studies of how fatigue, automation, and distraction may intersect as threats to safety. Distinguishing between states of active and passive fatigue supports understanding of fatigue and the development of countermeasures. Active fatigue is a stress-like state driven by overload of cognitive capabilities. Passive fatigue is produced by underload and monotony, and is associated with loss of task engagement and alertness. Our studies show that automated driving reliably elicits subjective symptoms of passive fatigue and also loss of alertness that persists following manual takeover. Passive fatigue also impairs attention and automation use in operators of Remotely Piloted Vehicles (RPVs). Use of in-vehicle media has been proposed as a countermeasure to fatigue, but such media may also be distracting. Studies tested whether various forms of phone-based media interacted with automation-induced fatigue, but effects were complex and dependent on task configuration. Selection of fatigue countermeasures should be guided by an understanding of the form of fatigue confronting the operator. System design, regulation of level of automation, managing distraction, and selection of fatigue-resilient personnel are all possible interventions for passive fatigue, but careful evaluation of interventions is necessary prior to deployment. Copyright © 2018. Published by Elsevier Ltd.

  15. Shear Banding in a Partially Molten Mantle

    NASA Astrophysics Data System (ADS)

    Alisic, L.; Rudge, J. F.; Wells, G.; Katz, R. F.; Rhebergen, S.

    2013-12-01

    We investigate the nonlinear behaviour of partially molten mantle material under shear. Numerical models of compaction and advection-diffusion of a porous matrix with a spherical inclusion are built using the automated code generation package FEniCS. The time evolution of melt distribution with increasing shear in these models is compared to laboratory experiments that show high-porosity shear banding in the medium and pressure shadows around the inclusion. We focus on understanding the interaction between these shear bands and pressure shadows as a function of rheological parameters.

  16. A little anthropomorphism goes a long way: Effects of oxytocin on trust, compliance and team performance with automated agents

    PubMed Central

    de Visser, Ewart J.; Monfort, Samuel S.; Goodyear, Kimberly; Lu, Li; O’Hara, Martin; Lee, Mary R.; Parasuraman, Raja; Krueger, Frank

    2017-01-01

    Objective We investigated the effects of exogenous oxytocin on trust, compliance, and team decision making with agents varying in anthropomorphism (computer, avatar, human) and reliability (100%, 50%). Background Recent work has explored psychological similarities in how we trust human-like automation compared to how we trust other humans. Exogenous administration of oxytocin, a neuropeptide associated with trust among humans, offers a unique opportunity to probe the anthropomorphism continuum of automation to infer when agents are trusted like another human or merely a machine. Method Eighty-four healthy male participants collaborated with automated agents varying in anthropomorphism that provided recommendations in a pattern recognition task. Results Under placebo, participants exhibited less trust and compliance with automated aids as the anthropomorphism of those aids increased. Under oxytocin, participants interacted with aids on the extremes of the anthropomorphism continuum similarly to placebos, but increased their trust, compliance, and performance with the avatar, an agent on the midpoint of the anthropomorphism continuum. Conclusion This study provided the first evidence that administration of exogenous oxytocin affected trust, compliance, and team decision making with automated agents. These effects provide support for the premise that oxytocin increases affinity for social stimuli in automated aids. Application Designing automation to mimic basic human characteristics is sufficient to elicit behavioral trust outcomes that are driven by neurological processes typically observed in human-human interactions. Designers of automated systems should consider the task, the individual, and the level of anthropomorphism to achieve the desired outcome. PMID:28146673

  17. Aspects of a Theory of Simplification, Debugging, and Coaching.

    ERIC Educational Resources Information Center

    Fischer, Gerhard; And Others

    This paper analyses new methods of teaching skiing in terms of a computational paradigm for learning called increasingly complex microworlds (ICM). Examining the factors that underlie the dramatic enhancement of the learning of skiing led to the focus on the processes of simplification, debugging, and coaching. These three processes are studied in…

  18. Legal Language: What Is It and What Can We Do About It?

    ERIC Educational Resources Information Center

    Charrow, Veda R.; Crandall, Jo Ann

    The simplification of legal language is required by President Carter's Executive Order requiring "clear and simple English" in government regulations. A major problem in the simplification process is the absence of any adequate description or classification of legal language. This paper defines some specific features of legal language,…

  19. Between-Word Simplification Patterns in the Continuous Speech of Children with Speech Sound Disorders

    ERIC Educational Resources Information Center

    Klein, Harriet B.; Liu-Shea, May

    2009-01-01

    Purpose: This study was designed to identify and describe between-word simplification patterns in the continuous speech of children with speech sound disorders. It was hypothesized that word combinations would reveal phonological changes that were unobserved with single words, possibly accounting for discrepancies between the intelligibility of…

  20. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details the architecture of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing system providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data, and for supporting automation for many mission operations processes.

  1. Elasticity-driven partial demixing in cholesteric liquid crystal films.

    PubMed

    Schmidtke, Jürgen; Coles, Harry J

    2009-07-01

    We discuss the partial demixing of a chiral nematic mixture of a chiral and an achiral compound, induced by inhomogeneous confinement between substrates. While the effect is tiny in low molar mass mixtures, it is predicted to be noticeable in polymeric systems. The potential of the effect for improving performance of liquid crystal based photonic devices is discussed.

  2. Long-term response of oak-hickory regeneration to partial harvest and repeated fires: influence of light and moisture

    Treesearch

    Louis R. Iverson; Todd F. Hutchinson; Matthew P. Peters; Daniel A. Yaussy

    2017-01-01

    By tracking oak-hickory (Quercus-Carya) regeneration for 13 yr across management-manipulated light and topographically driven moisture gradients after partial harvest and three prescribed fires, we document best-case conditions to promote advanced oak regeneration, and thereby provide a promising management tool to reverse the downward spiral in...

  3. Effects of non-neuronal components for functional connectivity analysis from resting-state functional MRI toward automated diagnosis of schizophrenia

    NASA Astrophysics Data System (ADS)

    Kim, Junghoe; Lee, Jong-Hwan

    2014-03-01

    Functional connectivity (FC) analysis from resting-state functional MRI (rsfMRI) is gaining popularity for clinical applications such as the diagnosis of neuropsychiatric disease. To delineate brain networks from rsfMRI data, non-neuronal components, including head motion and physiological artifacts mainly observed in cerebrospinal fluid (CSF) and white matter (WM), along with a global brain signal, have been regarded as nuisance variables in calculating the FC level. However, it is still unclear how the non-neuronal components affect performance toward diagnosis of neuropsychiatric disease. In this study, a systematic comparison of the classification performance for schizophrenia patients was provided, employing partial correlation coefficients (CCs) as feature elements. Pair-wise partial CCs were calculated between brain regions, in which six combinatorial sets of nuisance variables were considered. The partial CCs were used as candidate feature elements, followed by feature selection based on the statistical significance test between the two groups in the training set. Once a linear support vector machine was trained using the selected features from the training set, the classification performance was evaluated using the features from the test set (i.e., a leave-one-out cross-validation scheme). From the results, the error rate using all non-neuronal components as nuisance variables (12.4%) was significantly lower than those using the remaining combinations of non-neuronal components as nuisance variables (13.8-20.0%). In conclusion, the non-neuronal components substantially degraded the automated diagnosis performance, which supports our hypothesis that the non-neuronal components are crucial in controlling the automated diagnosis performance of neuropsychiatric disease using an fMRI modality.
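    A compact sketch of this pipeline on synthetic data (invented dimensions; the study starts from nuisance-regressed ROI time series, and per-fold feature selection is omitted here for brevity): partial CCs are read off the precision matrix, and a linear SVM is scored with leave-one-out cross-validation.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)

    def partial_cc(ts):
        # Partial correlations from the precision (inverse covariance) matrix.
        prec = np.linalg.pinv(np.cov(ts, rowvar=False))
        d = np.sqrt(np.diag(prec))
        return -prec / np.outer(d, d)

    # One feature vector per subject: upper triangle of the partial-CC matrix
    # computed from a (timepoints x ROIs) series.
    X = np.array([partial_cc(rng.normal(size=(120, 10)))[np.triu_indices(10, k=1)]
                  for _ in range(40)])
    y = np.array([0] * 20 + [1] * 20)   # e.g. controls vs patients

    # With random data this hovers near chance; real group differences raise it.
    acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
    print("leave-one-out accuracy:", acc)
    ```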

  4. Discovery informatics in biological and biomedical sciences: research challenges and opportunities.

    PubMed

    Honavar, Vasant

    2015-01-01

    New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze data, and to construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery, or assisting humans in discovery, through advances in (i) understanding, formalization, and information-processing accounts of the entire scientific process; (ii) design, development, and evaluation of the computational artifacts (representations, processes) that embody such understanding; and (iii) application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).

  5. Application-driven ground motion prediction equation for seismic hazard assessments in non-cratonic moderate-seismicity areas

    NASA Astrophysics Data System (ADS)

    Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.

    2017-09-01

    We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of those affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increment of the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.

  6. Automation of electromagnetic compatability (EMC) test facilities

    NASA Technical Reports Server (NTRS)

    Harrison, C. A.

    1986-01-01

    Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desk-top computer-controller. Near real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedures, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.

  7. A Methodology for Model Comparison Using the Theater Simulation of Airbase Resources and All Mobile Tactical Air Force Models

    DTIC Science & Technology

    1992-09-01

    The ease with which a model is employed may depend on several factors, among them the users' past experience in modeling and preferences for menu driven... partially on our knowledge of important logistics factors, partially on the past work of Diener (12), and partially on the assumption that comparison of... flexibility in output report selection. The minimum output was used in each instance to conserve computer storage and to minimize the consumption of paper.

  8. Encountering Productive Forms of Complexity in Learning Modern Physics

    ERIC Educational Resources Information Center

    Levrini, Olivia; Fantini, Paola

    2013-01-01

    This paper aims at supporting the claim that some forms of hyper-simplification, by making physics seem easy, are at risk of dangerously distorting the content as well as the process of learning physics. The paper presents examples of dangerous simplifications in the teaching of quantum physics. Then, examples of productive forms of complexity are…

  9. 76 FR 40457 - Administrative Simplification: Adoption of Operating Rules for Eligibility for a Health Plan and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-08

    ... Plan and Health Care Claim Status Transactions; Interim Final Rule. Federal Register / Vol. 76, No... Simplification: Adoption of Operating Rules for Eligibility for a Health Plan and Health Care Claim Status... rules for the health care industry and directed the Secretary of Health and Human Services to "adopt a...

  10. A novel pressure-driven piezodispenser for nanoliter volumes.

    PubMed

    McGuire, Shawn; Fisher, Charles; Holl, Mark; Meldrum, Deirdre

    2008-08-01

    A successful dispensing device has been built for use in biotechnology applications requiring nanoliter volume liquid transfer. Air pressure is used as the primary driving force and is controlled via a high speed miniature solenoid valve as opposed to many existing systems that use a valve in line with constantly pressurized fluid to start and stop the dispensing action. This automated pressure-driven system is used to improve a typical piezodriven microdispenser. The resulting system is much less prone to failures resulting from air entrainment and can dispense much higher viscosity fluids than the microdispenser alone.

  11. Behavior driven testing in ALMA telescope calibration software

    NASA Astrophysics Data System (ADS)

    Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang

    2016-07-01

    The ALMA software development cycle includes well-defined testing stages that involve developers, testers and scientists. We adapted Behavior Driven Development (BDD) to testing activities applied to the Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases in natural language to specify features and scenarios, which allows participants to share a common language and provides a high-level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it and proposals to expand this technique to other subsystems.
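    Step definitions in the style of the behave BDD framework convey the flavor of the approach (the feature and steps below are invented for illustration, not taken from the actual TELCAL suite):

    ```python
    # Gherkin feature file the steps below would match:
    #   Feature: Telescope calibration
    #     Scenario: Compute a phase calibration
    #       Given an antenna array with 3 antennas
    #       When a phase calibration is requested
    #       Then a calibration result is produced for each antenna

    from behave import given, when, then

    @given("an antenna array with {n:d} antennas")
    def step_setup_array(context, n):
        context.antennas = list(range(n))

    @when("a phase calibration is requested")
    def step_run_calibration(context):
        context.results = [0.0 for _ in context.antennas]  # stand-in for the real call

    @then("a calibration result is produced for each antenna")
    def step_check_results(context):
        assert len(context.results) == len(context.antennas)
    ```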

  12. Simplification of irreversible Markov chains by removal of states with fast leaving rates.

    PubMed

    Jia, Chen

    2016-07-07

    In the recent work of Ullah et al. (2012a), the authors developed an effective method to simplify reversible Markov chains by removal of states with low equilibrium occupancies. In this paper, we extend this result to irreversible Markov chains. We show that an irreversible chain can be simplified by removal of states with fast leaving rates. Moreover, we reveal that the irreversibility of the chain will always decrease after model simplification. This suggests that although model simplification can retain almost all the dynamic information of the chain, it will lose some thermodynamic information as a trade-off. Examples from biology are also given to illustrate the main results of this paper. Copyright © 2016 Elsevier Ltd. All rights reserved.
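    A small numerical sketch of the reduction, assuming a continuous-time chain with generator matrix Q whose rows sum to zero (the fold-in formula below is the standard rapid-equilibrium elimination; see the paper for the rigorous treatment and the thermodynamic trade-off):

    ```python
    import numpy as np

    def remove_state(Q, k):
        # Remove fast-leaving state k; transitions through k become direct
        # rates weighted by k's exit probabilities Q[k, j] / (-Q[k, k]).
        n = Q.shape[0]
        keep = [i for i in range(n) if i != k]
        Qr = Q[np.ix_(keep, keep)].astype(float)
        leave = -Q[k, k]                        # total leaving rate of state k
        for a, i in enumerate(keep):
            for b, j in enumerate(keep):
                if i != j:
                    Qr[a, b] += Q[i, k] * Q[k, j] / leave
        np.fill_diagonal(Qr, 0.0)
        np.fill_diagonal(Qr, -Qr.sum(axis=1))   # restore generator property
        return Qr

    Q = np.array([[-2.0,    2.0,   0.0],
                  [ 1.0, -101.0, 100.0],        # state 1 leaves very fast
                  [ 0.0,    3.0,  -3.0]])
    print(remove_state(Q, 1))
    ```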

  13. A knowledge-driven probabilistic framework for the prediction of protein-protein interaction networks.

    PubMed

    Browne, Fiona; Wang, Haiying; Zheng, Huiru; Azuaje, Francisco

    2010-03-01

    This study applied a knowledge-driven data integration framework to the inference of protein-protein interactions (PPI). Evidence from diverse genomic features is integrated using a knowledge-driven Bayesian network (KD-BN). Receiver operating characteristic (ROC) curves may not be the optimal assessment method to evaluate a classifier's performance in PPI prediction, as the majority of the area under the curve (AUC) may not represent biologically meaningful results. It may be of benefit to interpret the AUC of a partial ROC curve whereby biologically interesting results are represented. Therefore, the novel application of the assessment method referred to as the partial ROC has been employed in this study to assess the predictive performance of PPI predictions, along with calculation of the true positive/false positive rate and true positive/positive rate. By incorporating domain knowledge into the construction of the KD-BN, we demonstrate improvement in predictive performance compared with previous studies based upon the Naive Bayesian approach. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
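    A minimal sketch of a partial-ROC evaluation on toy scores (invented class balance; the study's scores come from the KD-BN): scikit-learn's max_fpr argument restricts the AUC to the low false-positive region where PPI predictions are biologically interesting.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    y_true = np.array([1] * 100 + [0] * 900)             # true interactions are rare
    scores = np.concatenate([rng.normal(1.0, 1.0, 100),  # positives score higher
                             rng.normal(0.0, 1.0, 900)])

    print("full AUC:", roc_auc_score(y_true, scores))
    print("partial AUC (FPR <= 0.1, standardized):",
          roc_auc_score(y_true, scores, max_fpr=0.1))
    ```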

  14. A frictionally and hydraulically constrained model of the convectively driven mean flow in partially enclosed seas

    NASA Astrophysics Data System (ADS)

    Maxworthy, T.

    1997-08-01

    A simple three-layer model of the dynamics of partially enclosed seas, driven by a surface buoyancy flux, is presented. It contains two major elements, a hydraulic constraint at the exit contraction and friction in the interior of the main body of the sea; both together determine the vertical structure and magnitudes of the interior flow variables, i.e. velocity and density. Application of the model to the large-scale dynamics of the Red Sea gives results that are not in disagreement with observation once the model is applied, also, to predict the dense outflow from the Gulf of Suez. The latter appears to be the agent responsible for the formation of dense bottom water in this system. Also, the model is reasonably successful in predicting the density of the outflow from the Persian Gulf, and can be applied to any number of other examples of convectively driven flow in long, narrow channels, with or without sills and constrictions at their exits.

  15. An Ada programming support environment

    NASA Technical Reports Server (NTRS)

    Tyrrill, AL; Chan, A. David

    1986-01-01

    The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example, a pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit level testing. A Code Auditor/Static Analyzer permits the Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.

  16. Strategies for Selecting Vendors and Evaluating Their Performance--From the Vendor's Perspective.

    ERIC Educational Resources Information Center

    Born, Kathleen

    1992-01-01

    Presents the vendor's perspective on how libraries choose and evaluate a vendor's performance. Characteristics of vendors are described, including being profit-oriented, market-driven, and highly competitive; issues in the three basic areas of service, price, and automation are discussed; and the influence of personality is considered. (LRW)

  17. Lewis Online Travel System: Preparer's/Traveler's Manual, Release 1.0

    NASA Technical Reports Server (NTRS)

    Seese, Michael

    1992-01-01

    The Lewis Online Travel System (LOTS) is a menu-driven interactive application that automates nearly all of the functions associated with government travel. The purpose of this manual is to provide LOTS users with concise instructions for using the computerized application. As such, it will not go into the details of travel regulations.

  18. From Intuition to Evidence: A Data-Driven Approach to Transforming CS Education

    ERIC Educational Resources Information Center

    Allevato, Anthony J.

    2012-01-01

    Educators in many disciplines are too often forced to rely on intuition about how students learn and the effectiveness of teaching to guide changes and improvements to their curricula. In computer science, systems that perform automated collection and assessment of programming assignments are seeing increased adoption, and these systems generate a…

  19. Using AGWA and the KINEROS2 Model to Model Green Infrastructure in Two Typical Residential Lots in Prescott, AZ

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) Urban tool provides a step-by-step process to model subdivisions using the KINEROS2 model, with and without Green Infrastructure (GI) practices. AGWA utilizes the Kinematic Runoff and Erosion (KINEROS2) model, an event driven, ...

  20. The automated radiosynthesis and purification of the opioid receptor antagonist, [6-O-methyl-11C]diprenorphine on the GE TRACERlab FXFE radiochemistry module.

    PubMed

    Fairclough, Michael; Prenant, Christian; Brown, Gavin; McMahon, Adam; Lowe, Jonathan; Jones, Anthony

    2014-05-15

    [6-O-Methyl-(11)C]diprenorphine ([(11)C]diprenorphine) is a positron emission tomography ligand used to probe the endogenous opioid system in vivo. Diprenorphine acts as an antagonist at all of the opioid receptor subtypes, that is, μ (mu), κ (kappa) and δ (delta). The radiosynthesis of [(11)C]diprenorphine using [(11)C]methyl iodide produced via the 'wet' method on a home-built automated radiosynthesis set-up has been described previously. Here, we describe a modified synthetic method to [(11)C]diprenorphine performed using [(11)C]methyl iodide produced via the gas phase method on a GE TRACERlab FXFE radiochemistry module. Also described is the use of [(11)C]methyl triflate as the carbon-11 methylating agent for the [(11)C]diprenorphine syntheses. [(11)C]Diprenorphine was produced to good manufacturing practice standards for use in a clinical setting. In comparison to the previously reported [(11)C]diprenorphine radiosynthesis, the method described herein gives a higher specific activity product, which is advantageous for receptor occupancy studies. The radiochemical purity of [(11)C]diprenorphine is similar to what has been reported previously, although the radiochemical yield produced in the method described herein is reduced, an issue that is inherent in the gas phase radiosynthesis of [(11)C]methyl iodide. The yields of [(11)C]diprenorphine are nonetheless sufficient for clinical research applications. Other advantages of the method described herein are an improvement to both the reproducibility and reliability of the production, as well as simplification of the purification and formulation steps. We suggest that our automated radiochemistry route to [(11)C]diprenorphine should be the method of choice for routine [(11)C]diprenorphine productions for positron emission tomography studies, and the production process could easily be transferred to other radiochemistry modules such as the TRACERlab FX C pro. Copyright © 2014 John Wiley & Sons, Ltd.

  1. Interactive Classification of Construction Materials: Feedback Driven Framework for Annotation and Analysis of 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Hess, M. R.; Petrovic, V.; Kuester, F.

    2017-08-01

    Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.

  2. Nitrous oxide and methane dynamics in a coral reef lagoon driven by pore water exchange: Insights from automated high-frequency observations

    NASA Astrophysics Data System (ADS)

    O'Reilly, Chiara; Santos, Isaac R.; Cyronak, Tyler; McMahon, Ashly; Maher, Damien T.

    2015-04-01

    Automated cavity ring down spectroscopy was used to make continuous measurements of dissolved methane, nitrous oxide, and carbon dioxide in a coral reef lagoon for 2 weeks (Heron Island, Great Barrier Reef). Radon (222Rn) was used to trace the influence of tidally driven pore water exchange on greenhouse gas dynamics. Clear tidal variation was observed for CH4, which correlated to 222Rn in lagoon waters. N2O correlated to 222Rn during the day only, which appears to be a response to coupled nitrification-denitrification in oxic sediments, fueled by nitrate derived from bird guano. The lagoon was a net source of CH4 and N2O to the atmosphere and a sink for atmospheric CO2. The estimated pore water-derived CH4 and N2O fluxes were 3.2-fold and 24.0-fold greater than the fluxes to the atmosphere. Overall, pore water and/or groundwater exchange were the only important sources of CH4 and major controls of N2O in the coral reef lagoon.

  3. Driving out errors through tight integration between software and automation.

    PubMed

    Reifsteck, Mark; Swanson, Thomas; Dallas, Mary

    2006-01-01

    A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.

  4. Summary of a Crew-Centered Flight Deck Design Philosophy for High-Speed Civil Transport (HSCT) Aircraft

    NASA Technical Reports Server (NTRS)

    Palmer, Michael T.; Rogers, William H.; Press, Hayes N.; Latorella, Kara A.; Abbott, Terence S.

    1995-01-01

    Past flight deck design practices used within the U.S. commercial transport aircraft industry have been highly successful in producing safe and efficient aircraft. However, recent advances in automation have changed the way pilots operate aircraft, and these changes make it necessary to reconsider overall flight deck design. Automated systems have become more complex and numerous, and often their inner functioning is partially or fully opaque to the flight crew. Recent accidents and incidents involving autoflight system mode awareness (Dornheim, 1995) are an example. This increase in complexity raises pilot concerns about the trustworthiness of automation, and makes it difficult for the crew to be aware of all the intricacies of operation that may impact safe flight. While pilots remain ultimately responsible for mission success, performance of flight deck tasks has been more widely distributed across human and automated resources. Advances in sensor and data integration technologies now make far more information available than may be prudent to present to the flight crew.

  5. Improving single molecule force spectroscopy through automated real-time data collection and quantification of experimental conditions

    PubMed Central

    Scholl, Zackary N.; Marszalek, Piotr E.

    2013-01-01

    The benefits of single molecule force spectroscopy (SMFS) clearly outweigh the challenges which include small sample sizes, tedious data collection and introduction of human bias during the subjective data selection. These difficulties can be partially eliminated through automation of the experimental data collection process for atomic force microscopy (AFM). Automation can be accomplished using an algorithm that triages usable force-extension recordings quickly with positive and negative selection. We implemented an algorithm based on the windowed fast Fourier transform of force-extension traces that identifies peaks using force-extension regimes to correctly identify usable recordings from proteins composed of repeated domains. This algorithm excels as a real-time diagnostic because it involves <30 ms computational time, has high sensitivity and specificity, and efficiently detects weak unfolding events. We used the statistics provided by the automated procedure to clearly demonstrate the properties of molecular adhesion and how these properties change with differences in the cantilever tip and protein functional groups and protein age. PMID:24001740
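    A rough sketch of a windowed-FFT triage criterion of this kind (the unfolding period, band width, and thresholds are assumed for illustration; the published algorithm additionally checks force-extension regimes):

    ```python
    import numpy as np

    def looks_usable(force, dx, period_nm=28.0, tol=0.3, window=512, thresh=5.0):
        """Flag a force-extension trace whose windowed spectra show energy at
        the spatial frequency of repeated-domain unfolding (1 / period_nm)."""
        f_target = 1.0 / period_nm
        hits = 0
        for start in range(0, len(force) - window, window // 2):   # 50% overlap
            seg = force[start:start + window]
            seg = seg - seg.mean()
            spec = np.abs(np.fft.rfft(seg * np.hanning(window)))
            freqs = np.fft.rfftfreq(window, d=dx)
            band = (freqs > f_target * (1 - tol)) & (freqs < f_target * (1 + tol))
            if band.any() and spec[band].max() > thresh * np.median(spec[1:]):
                hits += 1
        return hits >= 2   # require the signature in at least two windows
    ```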

  6. Solving Partial Differential Equations in a data-driven multiprocessor environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaudiot, J.L.; Lin, C.M.; Hosseiniyar, M.

    1988-12-31

    Partial differential equations can be found in a host of engineering and scientific problems. The emergence of new parallel architectures has spurred research in the definition of parallel PDE solvers. Concurrently, highly programmable systems such as data-flow architectures have been proposed for the exploitation of large scale parallelism. The implementation of some partial differential equation solvers (such as the Jacobi method) on a tagged-token data-flow graph is demonstrated here. Asynchronous methods (chaotic relaxation) are studied and new scheduling approaches (the Token No-Labeling scheme) are introduced in order to support the implementation of the asynchronous methods in a data-driven environment. New high-level data-flow language program constructs are introduced in order to handle chaotic operations. Finally, the performance of the program graphs is demonstrated by a deterministic simulation of a message-passing data-flow multiprocessor. An analysis of the overhead in the data-flow graphs is undertaken to demonstrate the limits of parallel operations in data-flow PDE program graphs.
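    For reference, the Jacobi method mentioned above, sketched for the 2-D Laplace equation: each interior point is updated from its neighbours' previous values, which is what makes the method embarrassingly parallel; the single global convergence check below is exactly the synchronization point that chaotic (asynchronous) relaxation removes.

    ```python
    import numpy as np

    n = 50
    u = np.zeros((n, n))
    u[0, :] = 1.0                    # fixed boundary value on one edge

    for _ in range(2000):
        u_new = u.copy()
        # Jacobi update: average of the four neighbours, all from the old sweep.
        u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                    u[1:-1, :-2] + u[1:-1, 2:])
        if np.max(np.abs(u_new - u)) < 1e-6:
            break
        u = u_new
    ```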

  7. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we used continuous hydrological simulation to investigate the key variables responsible for flood peak formation. With this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP, Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag-time. Results suggest interesting simplifications for the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  8. Booster propulsion/vehicle impact study

    NASA Technical Reports Server (NTRS)

    Weldon, Vincent; Dunn, Michael; Fink, Lawrence; Phillips, Dwight; Wetzel, Eric

    1988-01-01

    The use of hydrogen, RP-1, propane, and methane as fuels for booster engines of launch vehicles is discussed. An automated procedure for integrated launch vehicle and engine sizing and design optimization was used to define two-stage and single-stage concepts for minimum dry weight. The two-stage vehicles were unmanned and used a flyback booster and partially reusable orbiter. The single-stage designs were fully reusable, manned flyback vehicles. Comparisons of these vehicle designs, showing the effects of using different fuels, as well as sensitivity and trending data, are presented. In addition, the automated design technique utilized for the study is described.

  9. An automated procedure for calculating system matrices from perturbation data generated by an EAI Pacer and 100 hybrid computer system

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Krosel, S. M.

    1977-01-01

    Techniques are presented for determining the elements of the A, B, C, and D state variable matrices for systems simulated on an EAI Pacer 100 hybrid computer. An automated procedure systematically generates disturbance data necessary to linearize the simulation model and stores these data on a floppy disk. A separate digital program verifies this data, calculates the elements of the system matrices, and prints these matrices appropriately labeled. The partial derivatives forming the elements of the state variable matrices are approximated by finite difference calculations.
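    A generic sketch of the same finite-difference idea (hypothetical Python interface; the report's procedure perturbs a hybrid-computer simulation rather than Python functions): f(x, u) returns the state derivative, g(x, u) the outputs, and central differences approximate the matrix elements.

    ```python
    import numpy as np

    def linearize(f, g, x0, u0, eps=1e-5):
        # f(x, u) -> x_dot, g(x, u) -> y; both return NumPy arrays.
        nx, nu = len(x0), len(u0)
        ny = len(g(x0, u0))
        A = np.zeros((nx, nx)); B = np.zeros((nx, nu))
        C = np.zeros((ny, nx)); D = np.zeros((ny, nu))
        for i in range(nx):
            dx = np.zeros(nx); dx[i] = eps
            A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)  # central difference
            C[:, i] = (g(x0 + dx, u0) - g(x0 - dx, u0)) / (2 * eps)
        for j in range(nu):
            du = np.zeros(nu); du[j] = eps
            B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
            D[:, j] = (g(x0, u0 + du) - g(x0, u0 - du)) / (2 * eps)
        return A, B, C, D
    ```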

  10. Magnetic Heat Pump Containing Flow Diverters

    NASA Technical Reports Server (NTRS)

    Howard, Frank S.

    1995-01-01

    Proposed magnetic heat pump contains flow diverters for suppression of undesired flows. If left unchecked, undesired flows mix substantial amounts of partially heated and partially cooled portions of working fluid, effectively causing leakage of heat from heated side to cooled side. By reducing leakage of heat, flow diverters increase energy efficiency of magnetic heat pump, potentially offering efficiency greater than compressor-driven refrigerator.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This paper reports on an automated metering/proving system for custody transfer of crude oil at the Phillips 66 Co. tanker unloading terminal in Freeport, Texas. It is described as one of the most sophisticated systems developed. The menu-driven, one-button automation removes the proving sequence entirely from manual control. The system is also said to be cost-effective and versatile compared to a dedicated flow computer with API calculation capabilities. Developed by Puffer-Sweiven, systems integrators, the new technology additionally is thought to be the first custody transfer system to employ a programmable logic controller (PLC). The PLC provides the automation, gathers and stores all raw data, and prints alarms. The system also uses a personal computer operator interface (OI) that runs on the Intel iRMX real-time operating system. The OI is loaded with Puffer-Sweiven application software that performs API meter factor and volume correction calculations, as well as presenting color graphics and generating reports.

  12. Data-Driven Surface Traversability Analysis for Mars 2020 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Rothrock, Brandon; Almeida, Eduardo; Ansar, Adnan; Otero, Richard; Huertas, Andres; Heverly, Matthew

    2015-01-01

    The objective of this paper is three-fold: 1) to describe the engineering challenges in the surface mobility of the Mars 2020 Rover mission that are considered in the landing site selection process, 2) to introduce new automated traversability analysis capabilities, and 3) to present the preliminary analysis results for top candidate landing sites. The analysis capabilities presented in this paper include automated terrain classification, automated rock detection, digital elevation model (DEM) generation, and multi-ROI (region of interest) route planning. These capabilities make it possible to fully utilize the vast volume of high-resolution orbiter imagery, to quantitatively evaluate surface mobility requirements for each candidate site, and to remove subjectivity from the comparison between sites in terms of engineering considerations. The analysis results supported the discussion in the Second Landing Site Workshop held in August 2015, which resulted in selecting eight candidate sites that will be considered in the third workshop.

  13. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot.

    PubMed

    Kitson, Philip J; Glatzel, Stefan; Cronin, Leroy

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic 'programs' which can run on similar low cost, user-constructed robotic platforms towards an 'open-source' regime in the area of chemical synthesis.

  14. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot

    PubMed Central

    Kitson, Philip J; Glatzel, Stefan

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic ‘programs’ which can run on similar low cost, user-constructed robotic platforms towards an ‘open-source’ regime in the area of chemical synthesis. PMID:28144350

  15. The BAARA (Biological AutomAted RAdiotracking) System: A New Approach in Ecological Field Studies

    PubMed Central

    Řeřucha, Šimon; Bartonička, Tomáš; Jedlička, Petr; Čížek, Martin; Hlouša, Ondřej; Lučan, Radek; Horáček, Ivan

    2015-01-01

    Radiotracking is an important and often the only possible method to explore specific habits and the behaviour of animals, but it has proven to be very demanding and time-consuming, especially when frequent positioning of a large group is required. Our aim was to address this issue by making the process partially automated, to mitigate the demands and related costs. This paper presents a novel automated tracking system that consists of a network of automated tracking stations deployed within the target area. Each station reads the signals from telemetry transmitters, estimates the bearing and distance of the tagged animals and records their position. The station is capable of tracking a theoretically unlimited number of transmitters on different frequency channels with a period of 5–15 seconds per channel. An ordinary transmitter that fits within the supported frequency band can be used with BAARA (Biological AutomAted RAdiotracking); an extra option is the use of a custom-programmable transmitter with configurable operational parameters, such as the precise frequency channel or the transmission parameters. This new approach to a tracking system was tested for its applicability in a series of field and laboratory tests. BAARA has been tested within fieldwork explorations of Rousettus aegyptiacus during field trips to Dakhla oasis in Egypt. The results illustrate the novel perspective which automated radiotracking opens for the study of spatial behaviour, particularly in addressing topics in the domain of population ecology. PMID:25714910

  16. The effect of varying levels of vehicle automation on drivers' lane changing behaviour.

    PubMed

    Madigan, Ruth; Louw, Tyron; Merat, Natasha

    2018-01-01

    Much of the Human Factors research into vehicle automation has focused on driver responses to critical scenarios where a crash might occur. However, there is less knowledge about the effects of vehicle automation on drivers' behaviour during non-critical take-over situations, such as driver-initiated lane-changing or overtaking. The current driving simulator study, conducted as part of the EC-funded AdaptIVe project, addresses this issue. It uses a within-subjects design to compare drivers' lane-changing behaviour in conventional manual driving, partially automated driving (PAD) and conditionally automated driving (CAD). In PAD, drivers were required to re-take control from an automated driving system in order to overtake a slow moving vehicle, while in CAD, the driver used the indicator lever to initiate a system-performed overtaking manoeuvre. Results showed that while drivers' acceptance of both the PAD and CAD systems was high, they generally preferred CAD. A comparison of overtaking positions showed that drivers initiated overtaking manoeuvres slightly later in PAD than in manual driving or CAD. In addition, when compared to conventional driving, drivers had higher deviations in lane positioning and speed, along with higher lateral accelerations during lane changes following PAD. These results indicate that even in situations which are not time-critical, drivers' vehicle control after automation is degraded compared to conventional driving.

  18. Deconstructing the simplification of jury instructions: How simplifying the features of complexity affects jurors' application of instructions.

    PubMed

    Baguley, Chantelle M; McKimmie, Blake M; Masser, Barbara M

    2017-06-01

    Research consistently shows that techniques currently used to simplify jury instructions do not always improve mock jurors' comprehension. If improvements are observed, these are limited and overall comprehension remains low. It is unclear, however, why this occurs. It is possible that current simplification techniques do not effectively simplify the features of complexity, present in standardized instructions, which have the greatest effect on jurors' comprehension. It is not yet known, however, how much each feature of complexity individually affects jurors' comprehension. To investigate this, the authors used existing data from published empirical studies to examine how simplifying each feature of complexity affects mock jurors' application of instructions, as jurors can only apply instructions to the extent they understand them. The results suggest that reducing the conceptual complexity and the proportion of supplementary information was associated with increased application of the instructions; however, reducing both the linguistic complexity and the amount of information, and providing the instructions in a written format, was not. In addition, results showed an unexpected adverse effect of simplification: reducing the amount of information was associated with an increase in the punitiveness of mock jurors' verdicts, independently of the instruction content. Together, these results suggest a need to make jury instructions comprehensible, highlight the key principles in the decision process, and identify a way to eliminate the negative effect of reducing the amount of information. Addressing these needs is essential for developing a simplification technique that maximizes jurors' comprehension and application of instructions, while minimizing the previously overlooked negative effects of simplification. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Financial Aid Administrators' Views on Simplifying Financial Aid: NASFAA's 2008 Financial Aid Simplification Survey Report

    ERIC Educational Resources Information Center

    National Association of Student Financial Aid Administrators (NJ1), 2008

    2008-01-01

    Despite a decade of simplification efforts, students and families are often still baffled by the student aid process and cringe at the sight of financial aid application forms. Contrary to its purpose of helping students to access college, the student aid application process causes families frustration and confusion that has been cited as an…

  20. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    PubMed Central

    2011-01-01

    Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
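
    As a flavor of the kind of multi-relational directed graph SDDM builds, the short Python sketch below links hypothetical genetics and media resources in RDF using the rdflib library. The namespace and resource names are invented for illustration; they are not the paper's actual NDM-1 schema or RDFizers.

    ```python
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    # Hypothetical namespace and terms, for illustration only
    EX = Namespace("http://example.org/ndm1/")

    g = Graph()
    g.bind("ex", EX)

    # Link a resistance gene, a bacterial isolate, and a news report in one
    # multi-relational directed graph, in the spirit of the NDM-1 scenario
    g.add((EX.NDM1, RDF.type, EX.ResistanceGene))
    g.add((EX.KlebsiellaIsolate42, EX.carriesGene, EX.NDM1))
    g.add((EX.NewsReport17, EX.mentions, EX.KlebsiellaIsolate42))
    g.add((EX.NewsReport17, EX.publishedOn, Literal("2010-08-11")))

    print(g.serialize(format="turtle"))
    ```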

  1. Completely automated estimation of prostate volume for 3-D side-fire transrectal ultrasound using shape prior approach

    NASA Astrophysics Data System (ADS)

    Li, Lu; Narayanan, Ramakrishnan; Miller, Steve; Shen, Feimo; Barqawi, Al B.; Crawford, E. David; Suri, Jasjit S.

    2008-02-01

    Real-time knowledge of the capsule volume of an organ provides a valuable clinical tool for 3D biopsy applications. It is challenging to estimate this capsule volume in real time due to the presence of speckle, shadow artifacts, the partial volume effect and patient motion during image scans, which are all inherent in medical ultrasound imaging. The volumetric ultrasound prostate images are sliced in a rotational manner every three degrees. The automated segmentation method employs a shape model, which is obtained from training data, to delineate the middle slices of volumetric prostate images. Then a "DDC" algorithm is applied to the rest of the images with the initial contour obtained. The volume of the prostate is estimated from the segmentation results. Our database consists of 36 prostate volumes acquired on a Philips ultrasound machine with a side-fire transrectal ultrasound (TRUS) probe. We compare our automated method with the semi-automated approach. The mean volumes using the semi-automated and completely automated techniques were 35.16 cc and 34.86 cc, with errors of 7.3% and 7.6% compared to the volume obtained from the human-estimated boundary (ideal boundary), respectively. The overall system, which was developed using Microsoft Visual C++, is real-time and accurate.

  2. Coactivation index of children with congenital upper limb reduction deficiencies before and after using a wrist-driven 3D printed partial hand prosthesis.

    PubMed

    Zuniga, Jorge M; Dimitrios, Katsavelis; Peck, Jean L; Srivastava, Rakesh; Pierce, James E; Dudley, Drew R; Salazar, David A; Young, Keaton J; Knarr, Brian A

    2018-06-08

    Co-contraction is the simultaneous activation of agonist and antagonist muscles that produces forces around a joint. It is unknown whether the use of a wrist-driven 3D printed transitional prosthesis has any influence on the neuromuscular motor control strategies of the affected hand of children with unilateral upper-limb reduction deficiencies. Thus, the purpose of the current investigation was to examine the coactivation index (CI) of children with congenital upper-limb reduction deficiencies before and after 6 months of using a wrist-driven 3D printed partial hand prosthesis. Electromyographic activity of wrist flexors and extensors (flexor carpi ulnaris and extensor digitorum) was recorded during maximal voluntary contraction of the affected and non-affected wrists. Co-contraction was calculated using the coactivation index and was expressed as percent activation of antagonist over agonist. Nine children (two girls and seven boys, 6 to 16 years of age) with congenital upper-limb deficiencies participated in this study and were fitted with a wrist-driven 3D printed prosthetic hand. From the nine children, five (two girls and three boys, 7 to 10 years of age) completed a second visit after using the wrist-driven 3D printed partial hand prosthesis for 6 months. Separate two-way repeated measures ANOVAs were performed to analyze the coactivation index and strength data. There was a significant main effect for hand, with the affected hand resulting in a higher coactivation index for flexion and extension than the non-affected hand. For wrist flexion there was a significant main effect for time, indicating that the affected and non-affected hand had a significantly lower coactivation index after a period of 6 months. The use of a wrist-driven 3D printed hand prosthesis lowered the coactivation index by 70% in children with congenital upper limb reduction deficiencies. This reduction in coactivation and possible improvement in motor control strategies can potentially improve prosthetic rehabilitation outcomes.
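
    A minimal Python sketch of the quoted definition, computing CI as percent activation of the antagonist over the agonist. The RMS normalization and the synthetic signals are assumptions made here for illustration; the paper's exact EMG processing pipeline may differ.

    ```python
    import numpy as np

    def coactivation_index(antagonist_emg, agonist_emg):
        """CI as percent activation of antagonist over agonist, per the
        definition quoted in the abstract. RMS amplitude is one common
        normalization choice (an assumption, not stated in the paper)."""
        rms = lambda x: np.sqrt(np.mean(np.square(x)))
        return 100.0 * rms(antagonist_emg) / rms(agonist_emg)

    # Synthetic EMG signals for illustration only
    rng = np.random.default_rng(0)
    agonist = rng.normal(0.0, 1.0, 2000)      # e.g., flexor during flexion MVC
    antagonist = rng.normal(0.0, 0.45, 2000)  # e.g., extensor co-contracting
    print(f"CI = {coactivation_index(antagonist, agonist):.1f}%")
    ```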

  3. Digital control and data acquisition for high-value GTA welding

    NASA Astrophysics Data System (ADS)

    George, T. G.; Franco-Ferreira, E. A.

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTG's) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of Pu-238 oxide, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally-controlled welding system. Baseline design considerations for system automation and strategies employed to maximize process yield, improve process consistency, and generate required quality assurance information are discussed. Design of the automated girth welding system was driven by a number of factors which called for precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE order 5700.6B, which took effect on 23 Sep. 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  4. Computer transformation of partial differential equations into any coordinate system

    NASA Technical Reports Server (NTRS)

    Sullivan, R. D.

    1977-01-01

    The use of tensors to provide a compact way of writing partial differential equations in a form valid in all coordinate systems is discussed. In order to find solutions to the equations with their boundary conditions, they must be expressed in terms of the coordinate system under consideration. The process of arriving at these expressions from the tensor formulation was automated by a software system, TENSR. An allied system that analyzes the resulting expressions term by term and drops those that are negligible is also described.
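
    The tensor-driven expansion that TENSR automated can be reproduced today with a computer algebra system. The Python sketch below (not TENSR itself, just an illustration of the idea) applies the coordinate-free Laplace-Beltrami formula to the polar-coordinate metric with sympy and recovers the familiar polar form of the Laplacian.

    ```python
    import sympy as sp

    r, th = sp.symbols('r theta', positive=True)
    u = sp.Function('u')(r, th)

    # Metric tensor for polar coordinates (from x = r cos(theta), y = r sin(theta))
    g = sp.Matrix([[1, 0], [0, r**2]])
    g_inv = g.inv()
    sqrt_g = sp.sqrt(g.det())  # simplifies to r since r > 0

    coords = [r, th]
    # Laplace-Beltrami operator: (1/sqrt(g)) * d_i( sqrt(g) * g^{ij} * d_j u )
    lap = sum(sp.diff(sqrt_g * g_inv[i, j] * sp.diff(u, coords[j]), coords[i])
              for i in range(2) for j in range(2)) / sqrt_g

    # Prints u_rr + u_r / r + u_theta_theta / r**2
    print(sp.simplify(lap))
    ```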

  5. Streaming simplification of tetrahedral meshes.

    PubMed

    Vo, Huy T; Callahan, Steven P; Lindstrom, Peter; Pascucci, Valerio; Silva, Cláudio T

    2007-01-01

    Unstructured tetrahedral meshes are commonly used in scientific computing to represent scalar, vector, and tensor fields in three dimensions. Visualization of these meshes can be difficult to perform interactively due to their size and complexity. By reducing the size of the data, we can accomplish real-time visualization necessary for scientific analysis. We propose a two-step approach for streaming simplification of large tetrahedral meshes. Our algorithm arranges the data on disk in a streaming, I/O-efficient format that allows coherent access to the tetrahedral cells. A quadric-based simplification is sequentially performed on small portions of the mesh in-core. Our output is a coherent streaming mesh which facilitates future processing. Our technique is fast, produces high quality approximations, and operates out-of-core to process meshes too large for main memory.
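
    The quadric error metric at the core of such simplification can be sketched in a few lines. The Python below shows the Garland-Heckbert surface form of the idea (the tetrahedral variant generalizes the same plane quadrics to cells); the face and vertex data are invented, and this is a didactic sketch, not the authors' out-of-core implementation.

    ```python
    import numpy as np

    def plane_quadric(p0, p1, p2):
        """Fundamental error quadric K = p p^T for the supporting plane of a
        triangle, with p = (nx, ny, nz, d), |n| = 1, and n.x + d = 0 on the plane."""
        n = np.cross(p1 - p0, p2 - p0)
        n = n / np.linalg.norm(n)
        p = np.append(n, -np.dot(n, p0))
        return np.outer(p, p)

    def collapse_cost(Q, v):
        """Quadric error v_h^T Q v_h of placing the merged vertex at v."""
        vh = np.append(v, 1.0)
        return float(vh @ Q @ vh)

    # One face stands in for the sum of quadrics over all incident faces
    p0 = np.array([0.0, 0.0, 0.0])
    p1 = np.array([1.0, 0.0, 0.0])
    p2 = np.array([0.0, 1.0, 0.0])
    Q = plane_quadric(p0, p1, p2)

    print(collapse_cost(Q, 0.5 * (p0 + p1)))            # 0.0: point on the plane
    print(collapse_cost(Q, np.array([0.0, 0.0, 2.0])))  # 4.0: squared distance
    ```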

  6. Challenges and complications in neighborhood mapping: from neighborhood concept to operationalization

    NASA Astrophysics Data System (ADS)

    Deng, Yongxin

    2016-07-01

    This paper examines complications in neighborhood mapping and corresponding challenges for the GIS community, taking both a conceptual and a methodological perspective. It focuses on the social and spatial dimensions of the neighborhood concept and highlights their relationship in neighborhood mapping. Following a brief summary of neighborhood definitions, five interwoven factors are identified to be origins of neighborhood mapping difficulties: conceptual vagueness, uncertainty of various sources, GIS representation, scale, and neighborhood homogeneity or continuity. Existing neighborhood mapping methods are grouped into six categories to be assessed: perception based, physically based, inference based, preexisting, aggregated, and automated. Mapping practices in various neighborhood-related disciplines and applications are cited as examples to demonstrate how the methods work, as well as how they should be evaluated. A few mapping strategies for the improvement of neighborhood mapping are prescribed from a GIS perspective: documenting simplifications employed in the mapping procedure, addressing uncertainty sources, developing new data solutions, and integrating complementary mapping methods. Incorporation of high-resolution data and introduction of more GIS ideas and methods (such as fuzzy logic) are identified to be future opportunities.

  7. Understanding the Data Complexity continuum to reduce data management costs and increase data usability through partnerships with the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Mesick, S.; Weathers, K. W.

    2017-12-01

    Data complexity can be seen as a continuum from complex to simple. The term data complexity refers to data collections that are disorganized, poorly documented, and generally do not follow best data management practices. Complex data collections are challenging and expensive to manage. Simplified collections readily support automated archival processes, enhanced discovery and data access, as well as production of services that make data easier to reuse. In this session, NOAA NCEI scientific data stewards will discuss the data complexity continuum. This talk will explore data simplification concepts, methods, and tools that data managers can employ which may offer more control over data management costs and processes, while achieving policy goals for open data access and ready reuse. Topics will include guidance for data managers on best allocation of limited data management resources; models for partnering with NCEI to accomplish shared data management goals; and will demonstrate through case studies the benefits of investing in documentation, accessibility, and services to increase data value and return on investment.

  8. RF-based power distribution system for optogenetic experiments

    NASA Astrophysics Data System (ADS)

    Filipek, Tomasz A.; Kasprowicz, Grzegorz H.

    2017-08-01

    In this paper, a wireless power distribution system for optogenetic experiments is demonstrated. The design and analysis of the power transfer system development are described in detail. The architecture is outlined in the context of the performance requirements that had to be met. We show how to design a wireless power transfer system using resonant coupling circuits which consist of a number of receivers and one transmitter covering the entire cage area with a specific power density. The transmitter design with a fully automated protection stage is described, with detailed consideration of the specification and the construction of the transmitting loop antenna. In addition, the design of the receiver is described, including simplification of the implementation and the minimization of the impact of component tolerances on the performance of the distribution system. The conducted analysis has been confirmed by calculations and measurement results. The presented distribution system was designed to provide a 100 mW power supply to each of the ten possible receivers in a limited 490 x 350 mm cage space while using a single transmitter working at the coupling resonant frequency of 27 MHz.
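
    Tuning such resonant coupling circuits follows from the LC resonance condition. A small Python sketch, with a hypothetical loop-antenna inductance (the paper does not give component values here), shows how a tuning capacitance for the 27 MHz coupling frequency would be chosen.

    ```python
    import math

    def resonant_capacitance(L_henry, f_hz):
        """LC resonance: f = 1 / (2*pi*sqrt(L*C))  =>  C = 1 / ((2*pi*f)**2 * L)."""
        return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * L_henry)

    f = 27e6      # coupling resonant frequency from the abstract
    L = 1.2e-6    # hypothetical loop-antenna inductance (assumption)
    C = resonant_capacitance(L, f)
    print(f"Tuning capacitance = {C * 1e12:.1f} pF")  # ~29 pF for these values
    ```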

  9. Making Information Available to Partially Sighted and Blind Clients.

    ERIC Educational Resources Information Center

    Long, C. A.

    1993-01-01

    Provides an empirical review of problems facing library users with visual impairments using computers, and reviews some of the technology that can help alleviate these problems. Highlights include software; GUI (Graphical User Interfaces); advising and training; library automation; and appendices that list further sources of relevant information.…

  10. The Design and Development of an Evaluation System for Online Instruction.

    ERIC Educational Resources Information Center

    Wentling, Tim L.; Johnson, Scott D.

    This paper describes the conceptualization and development of an evaluation system that can be used to monitor and evaluate online instructional efforts. The evaluation system addresses concerns of both program administrators and course instructors. Computer technology is used to provide partial automation to reduce respondent burden and to…

  11. Scattering of plane evanescent waves by cylindrical shells and wave vector coupling conditions for exciting flexural waves

    NASA Astrophysics Data System (ADS)

    Marston, Philip L.

    2002-05-01

    The coupling of sound to buried targets can be associated with acoustic evanescent waves when the sea bottom is smooth. To understand the excitation of flexural waves on buried shells by acoustic evanescent waves, the partial wave series for the scattering is found for cylindrical shells at normal incidence in an unbounded medium. The formulation uses the simplifications of thin-shell dynamics. In the case of ordinary waves incident on a shell, a ray formulation is available to describe the coupling to subsonic flexural waves [P. L. Marston and N. H. Sun, J. Acoust. Soc. Am. 97, 777-783 (1995)]. When the incident wave is evanescent, the distance between propagating plane wavefronts is smaller than the ordinary acoustical wavelength at the same frequency and the coupling condition for the excitation of flexural waves on shells or plates is modified. Instead of matching the flexural wave number with the propagating part of the acoustic wave number only at the coincidence frequency, a second low-frequency wave number matching condition is found for highly evanescent waves. Numerical evaluation of the modified partial-wave-series appropriate for an evanescent wave is used to investigate the low-frequency coupling of evanescent waves with flexural wave resonances of shells.
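
    The wavenumber-matching condition at the heart of this coupling argument is easy to evaluate numerically. The Python sketch below uses the thin flat-plate flexural dispersion relation, a simplification relative to the cylindrical shell treated in the paper, with illustrative steel-in-water parameters, to locate the coincidence frequency where the flexural and acoustic trace wavenumbers match.

    ```python
    import numpy as np

    # Thin steel plate in water; parameters are illustrative assumptions
    E, nu, rho_s, h = 200e9, 0.3, 7800.0, 0.01   # Pa, -, kg/m^3, m
    c = 1480.0                                    # sound speed in water, m/s

    D = E * h**3 / (12.0 * (1.0 - nu**2))        # bending stiffness

    def k_flexural(omega):
        """Thin-plate flexural dispersion: k_f = (omega^2 rho_s h / D)^(1/4)."""
        return (omega**2 * rho_s * h / D) ** 0.25

    def k_acoustic(omega):
        return omega / c

    # Coincidence: k_f(omega) = omega/c  =>  omega_c = c^2 * sqrt(rho_s*h/D)
    omega_c = c**2 * np.sqrt(rho_s * h / D)
    print(f"coincidence frequency = {omega_c / (2*np.pi):.0f} Hz")
    print(np.isclose(k_flexural(omega_c), k_acoustic(omega_c)))  # True
    ```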

  12. Rule-driven defect detection in CT images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2000-01-01

    This paper deals with automated detection and identification of internal defects in hardwood logs using computed tomography (CT) images. We have developed a system that employs artificial neural networks to perform tentative classification of logs on a pixel-by-pixel basis. This approach achieves a high level of classification accuracy for several hardwood species (...

  13. Dialogue as Data in Learning Analytics for Productive Educational Dialogue

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    This paper provides a novel, conceptually driven stance on the state of the contemporary analytic challenges faced in the treatment of dialogue as a form of data across on- and offline sites of learning. In prior research, preliminary steps have been taken to detect occurrences of such dialogue using automated analysis techniques. Such advances…

  14. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
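
    To make the MC/DC obligation concrete, the Python sketch below checks whether a test set achieves unique-cause MC/DC for a toy three-condition decision. The decision function and tests are invented for illustration and have no connection to the paper's preprocessing tool.

    ```python
    from itertools import product

    def decision(a, b, c):
        """Example decision with three conditions (illustrative only)."""
        return (a or b) and c

    def has_mcdc(tests, n_conditions, decision):
        """True if, for every condition, some pair of tests differs only in
        that condition while flipping the decision outcome (unique-cause MC/DC)."""
        covered = set()
        for t1, t2 in product(tests, repeat=2):
            diff = [i for i in range(n_conditions) if t1[i] != t2[i]]
            if len(diff) == 1 and decision(*t1) != decision(*t2):
                covered.add(diff[0])
        return covered == set(range(n_conditions))

    tests = [(True, False, True), (False, False, True),
             (False, True, True), (False, True, False)]
    print(has_mcdc(tests, 3, decision))  # True: each condition acts independently
    ```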

  15. Data fusion for automated non-destructive inspection

    PubMed Central

    Brierley, N.; Tippetts, T.; Cawley, P.

    2014-01-01

    In industrial non-destructive evaluation (NDE), it is increasingly common for data acquisition to be automated, driving a recent substantial increase in the availability of data. The collected data need to be analysed, typically necessitating the painstaking manual labour of a skilled operator. Moreover, in automated NDE a region of an inspected component is typically interrogated several times, be it within a single data channel due to multiple probe passes, across several channels acquired simultaneously or over the course of repeated inspections. The systematic combination of these diverse readings is recognized to offer an opportunity to improve the reliability of the inspection, but is not achievable in a manual analysis. This paper describes a data-fusion-based software framework providing a partial automation capability, allowing component regions to be declared defect-free to a very high probability while readily identifying defect indications, thereby optimizing the use of the operator's time. The system is designed to be applicable to a wide range of automated NDE scenarios, but the processing is exemplified using the industrial ultrasonic immersion inspection of aerospace turbine discs. Results obtained for industrial datasets demonstrate an orders-of-magnitude reduction in false-call rates, for a given probability of detection, achievable using the developed software system. PMID:25002828
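
    One generic way to combine several independent readings of the same region is Bayesian fusion of per-pass likelihood ratios, sketched below in Python. The prior and ratios are invented, and this simple rule is only an illustration of the fusion idea; the paper's actual framework is more elaborate.

    ```python
    import numpy as np

    def fuse_posterior(likelihood_ratios, prior=1e-4):
        """Combine independent readings of one region via Bayes' rule:
        posterior odds = prior odds * product(LR_i). Values are illustrative."""
        odds = prior / (1.0 - prior) * np.prod(likelihood_ratios)
        return odds / (1.0 + odds)

    # Three passes over the same region: two weak indications, one clear one
    lrs = np.array([3.0, 2.5, 40.0])
    print(f"P(defect | all passes) = {fuse_posterior(lrs):.4f}")
    # Regions whose fused posterior stays below threshold can be declared
    # defect-free with quantified confidence, focusing the operator's time.
    ```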

  16. View-Dependent Simplification of Arbitrary Polygonal Environments

    DTIC Science & Technology

    2006-01-01

    [No abstract available; the indexed excerpt is fragmentary. Recoverable content mentions culling of backfacing nodes [Kumar 96], Section 4.3 "Triangle-Budget Simplification" (a screen-space error threshold and silhouette test available to the user), acknowledgements to Greg Turk and Dinesh Manocha, DARPA funding, and a citation to Proceedings Visualization '95, IEEE Computer Society Press (Atlanta, GA), 1995, pp. 296-303.]

  17. Perceptual Recovery from Consonant-Cluster Simplification in Korean Using Language-Specific Phonological Knowledge

    ERIC Educational Resources Information Center

    Cho, Taehong; McQueen, James M.

    2011-01-01

    Two experiments examined whether perceptual recovery from Korean consonant-cluster simplification is based on language-specific phonological knowledge. In tri-consonantal C1C2C3 sequences such as /lkt/ and /lpt/ in Seoul Korean, either C1 or C2 can be completely deleted. Seoul Koreans monitored for C2 targets (/p/ or /k/, deleted or preserved) in…

  18. The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment. NBER Working Paper No. 15361

    ERIC Educational Resources Information Center

    Bettinger, Eric P.; Long, Bridget Terry; Oreopoulos, Philip; Sanbonmatsu, Lisa

    2009-01-01

    Growing concerns about low awareness and take-up rates for government support programs like college financial aid have spurred calls to simplify the application process and enhance visibility. This project examines the effects of two experimental treatments designed to test the importance of simplification and information using a random…

  19. WWC Quick Review of the Report "The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment" Revised

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    This paper presents an updated WWC (What Works Clearinghouse) Quick Review of the report "The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment". The study examined whether assistance in filling out the Free Application for Federal Student Aid (FAFSA) increases the likelihood of…

  20. WWC Quick Review of the Report "The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment"

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    This paper presents a WWC (What Works Clearinghouse) Quick Review of the report "The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment". The study examined whether assistance in filling out the Free Application for Federal Student Aid (FAFSA) increases the likelihood of filing…

  1. The Persistence of Erroneous Familiarity in an Epileptic Male: Challenging Perceptual Theories of Deja Vu Activation

    ERIC Educational Resources Information Center

    O'Connor, Akira R.; Moulin, Christopher J. A.

    2008-01-01

    We report the case of a 39-year-old temporal lobe epileptic male, MH. Prior to his complex partial seizures, experienced up to three times a day, MH often has an aura in the form of a persistent sensation of deja vu. Data-driven theories of deja vu formation suggest that partial familiarity for the perceived stimulus is responsible for the…

  2. Development of Test Rig for Robotization of Mining Technological Processes - Oversized Rock Breaking Process Case

    NASA Astrophysics Data System (ADS)

    Pawel, Stefaniak; Jacek, Wodecki; Jakubiak, Janusz; Zimroz, Radoslaw

    2017-12-01

    The production chain (PCh) in an underground copper ore mine consists of several subprocesses. From our perspective, implementation of the so-called ZEPA approach (Zero Entry Production Area) might be very interesting [16]. In practice, it leads to automation/robotization of subprocesses in the production area. This paper investigates a specific part of the PCh, i.e., the place where cyclic transport by LHDs is replaced with continuous transport by a conveying system. Such a place is called a dumping point. The objective of dumping points with a screen is primary classification of the material (into coarse and fine material) and breaking oversized rocks with a hydraulic hammer. Current challenges for underground mining include safety improvement as well as production optimization related to bottlenecks, stoppages and operational efficiency of the machines. As a first step, remote control of the hydraulic hammer has been introduced, which not only transferred the operator to a safe workplace, but also allowed for a more comfortable work environment and control over multiple technical objects by a single person. A literature analysis shows that the mining industry around the world is oriented towards automation and robotization of mining processes and reveals technological readiness for the 4th industrial revolution. The paper focuses on a preliminary analysis of possibilities for the use of a robotic system in the rock-breaking process. A prototype test rig has been proposed and experimental works have been carried out. Automatic algorithms for detection of oversized rocks, crushing them, as well as sweeping and loosening of material have been formulated. Obviously, many simplifications have been assumed. Some near-future works have been proposed.

  3. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  4. The Ultimate Private Observatory

    NASA Astrophysics Data System (ADS)

    Aymond, J.

    2009-03-01

    An amateur astronomer from Washington Parish, Southeast Louisiana, USA has designed and built an amazing observatory. It is not only an astronomical observatory, but also a home theater and a tornado shelter designed to take a direct hit from an F5 tornado. The facility is fully equipped and automated, with a hydraulically driven roof that weighs 20,571 lbs., which lifts up, then rolls away to the end of the tracks. This leaves the user sitting inside of four 14-foot high walls open to the night sky. It has two premium quality telescopes for viewing deep space and objects inside the solar system. The chair that the observer sits on is also hydraulically driven.

  5. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empiric in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
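
    As a flavor of such semi-automated extraction, the Python sketch below harvests candidate vocabulary terms from a BPMN 2.0 XML fragment by collecting the names of tasks and data objects. The process content is invented and the mapping to SBVR concepts is left to the analyst; this is not the authors' tool, only an illustration of the first step.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal BPMN 2.0 fragment (the namespace is from the OMG spec; the
    # process content is invented for illustration)
    BPMN = """<definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
      <bpmn:process id="p1">
        <bpmn:userTask id="t1" name="Approve Purchase Order"/>
        <bpmn:serviceTask id="t2" name="Check Customer Credit"/>
        <bpmn:dataObject id="d1" name="Purchase Order"/>
      </bpmn:process>
    </definitions>"""

    NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}
    root = ET.fromstring(BPMN)

    terms = set()
    for tag in ("task", "userTask", "serviceTask", "dataObject", "lane"):
        for el in root.findall(f".//bpmn:{tag}", NS):
            if el.get("name"):
                terms.add(el.get("name").strip())

    # Candidate entries to review against SBVR noun and verb concepts
    for term in sorted(terms):
        print(term)
    ```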

  6. Crystallographic analysis of the solid-state dewetting of polycrystalline gold film using automated indexing in a transmission electron microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, S. A.; Lee, H. J.; Oh, Y. J., E-mail: yjoh@hanbat.ac.kr

    We analyzed the effect of crystallographic anisotropy on the morphological evolution of a 12-nm-thick gold film during solid-state dewetting at high temperatures using an automated indexing tool in a transmission electron microscope. Dewetting initiated at grain-boundary triple junctions adjacent to large grains resulting from abnormal grain growth driven by (111) texture development. Voids at the junctions developed shapes with faceted edges bounded by low-index crystal planes. The kinetic mobility of the edges varied with the crystal orientation normal to the edges, with a predominance of specific edges with the slowest retraction rates as the annealing time was increased.

  7. Effects of simplifying fracture network representation on inert chemical migration in fracture-controlled aquifers

    USGS Publications Warehouse

    Wellman, Tristan; Shapiro, Allen M.; Hill, Mary C.

    2009-01-01

    While it is widely recognized that highly permeable 'large-scale' fractures dominate chemical migration in many fractured aquifers, recent studies suggest that the pervasive 'small-scale' fracturing once considered of less significance can be equally important for characterizing the spatial extent and residence time associated with transport processes. A detailed examination of chemical migration through fracture-controlled aquifers is used to advance this conceptual understanding. The influence of fracture structure is evaluated by quantifying the effects to transport caused by a systematic removal of fractures from three-dimensional discrete fracture models whose attributes are derived from geologic and hydrologic conditions at multiple field sites. Results indicate that the effects to transport caused by network simplification are sensitive to the fracture network characteristics, degree of network simplification, and plume travel distance, but primarily in an indirect sense since correlation to individual attributes is limited. Transport processes can be 'enhanced' or 'restricted' from network simplification meaning that the elimination of fractures may increase or decrease mass migration, mean travel time, dispersion, and tailing of the concentration plume. The results demonstrate why, for instance, chemical migration may not follow the classic advection-dispersion equation where dispersion approximates the effect of the ignored geologic structure as a strictly additive process to the mean flow. The analyses further reveal that the prediction error caused by fracture network simplification is reduced by at least 50% using the median estimate from an ensemble of simplified fracture network models, and that the error from network simplification is at least 70% less than the stochastic variability from multiple realizations. Copyright 2009 by the American Geophysical Union.

  8. Theories of Simplification and Scaling of Spatially Distributed Processes. Chapter 12

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Pacala, Stephen W.

    1997-01-01

    The problem of scaling is at the heart of ecological theory, the essence of understanding and of the development of a predictive capability. The description of any system depends on the spatial, temporal, and organizational perspective chosen; hence it is essential to understand not only how patterns and dynamics vary with scale, but also how patterns at one scale are manifestations of processes operating at other scales. Evolution has shaped the characteristics of species in ways that result in scale displacement: Each species experiences the environment at its own unique set of spatial and temporal scales and interfaces the biota through unique assemblages of phenotypes. In this way, coexistence becomes possible, and biodiversity is enhanced. By averaging over space, time, and biological interactions, a genotype filters variation at fine scales and selects the arena in which it will face the vicissitudes of nature. Variation at finer scales is then noise, of minor importance to the survival and dynamics of the species, and consequently of minor importance in any attempt at description. In attempting to model ecological interactions in space, contributors throughout this book have struggled with a trade-off between simplification and "realistic" complexity and detail. Although the challenge of simplification is widely recognized in ecology, less appreciated is the intertwining of scaling questions and scaling laws with the process of simplification. In the context of this chapter simplification will in general mean the use of spatial or ensemble means and low-order moments to capture more detailed interactions by integrating over given areas. In this way, one can derive descriptions of the system at different spatial scales, which provides the essentials for the extraction of scaling laws by examination of how system properties vary with scale.
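
    The "averaging over space" simplification and the resulting scaling laws can be demonstrated in a few lines of Python: block-averaging a field and watching a low-order moment change with observation scale. The field below is synthetic uncorrelated noise standing in for any spatially distributed process, so its block-mean variance falls as the square of the block size.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    field = rng.normal(size=(256, 256))   # stand-in for a spatial process

    def block_mean(f, b):
        """Replace the field by spatial means over b x b blocks (the
        'averaging over space' simplification discussed in the chapter)."""
        n = f.shape[0] // b
        return f[:n*b, :n*b].reshape(n, b, n, b).mean(axis=(1, 3))

    for b in (1, 2, 4, 8, 16, 32):
        print(f"block {b:2d}x{b:<2d}  variance = {block_mean(field, b).var():.5f}")
    # For uncorrelated noise the variance falls as 1/b**2: a scaling law read
    # off directly from how a low-order moment varies with scale.
    ```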

  9. Local, smooth, and consistent Jacobi set simplification

    DOE PAGES

    Bhatia, Harsh; Wang, Bei; Norgard, Gregory; ...

    2014-10-31

    The relation between two Morse functions defined on a smooth, compact, and orientable 2-manifold can be studied in terms of their Jacobi set. The Jacobi set contains points in the domain where the gradients of the two functions are aligned. Both the Jacobi set itself as well as the segmentation of the domain it induces have been shown to be useful in various applications. In practice, unfortunately, functions often contain noise and discretization artifacts, causing their Jacobi set to become unmanageably large and complex. Although there exist techniques to simplify Jacobi sets, they are unsuitable for most applications as they lack fine-grained control over the process, and heavily restrict the type of simplifications possible. In this paper, we introduce a new framework that generalizes critical point cancellations in scalar functions to the Jacobi set in two dimensions. We present a new interpretation of Jacobi set simplification based on the perspective of domain segmentation. Generalizing the cancellation of critical points from scalar functions to Jacobi sets, we focus on simplifications that can be realized by smooth approximations of the corresponding functions, and show how these cancellations imply simultaneous simplification of contiguous subsets of the Jacobi set. Using these extended cancellations as atomic operations, we introduce an algorithm to successively cancel subsets of the Jacobi set with minimal modifications to some user-defined metric. We show that for simply connected domains, our algorithm reduces a given Jacobi set to its minimal configuration, that is, one with no birth-death points (a birth-death point is a specific type of singularity within the Jacobi set where the level sets of the two functions and the Jacobi set have a common normal direction).

  10. Direct-method SAD phasing with partial-structure iteration: towards automation.

    PubMed

    Wang, J W; Chen, J R; Gu, Y X; Zheng, C D; Fan, H F

    2004-11-01

    The probability formula of direct-method SAD (single-wavelength anomalous diffraction) phasing proposed by Fan & Gu (1985, Acta Cryst. A41, 280-284) contains partial-structure information in the form of a Sim-weighting term. Previously, only the substructure of anomalous scatterers has been included in this term. In the case that the subsequent density modification and model building yields only structure fragments, which do not straightforwardly lead to the complete solution, the partial structure can be fed back into the Sim-weighting term of the probability formula in order to strengthen its phasing power and to benefit the subsequent automatic model building. The procedure has been tested with experimental SAD data from two known proteins with copper and sulfur as the anomalous scatterers.

  11. Optimal Control of Stochastic Systems Driven by Fractional Brownian Motions

    DTIC Science & Technology

    2014-10-09

    Optimal control problems for stochastic partial differential equations driven by fractional Brownian motions are explicitly solved. For the control of a continuous-time linear system with Brownian motion or a discrete-time linear system with white Gaussian noise and costs… [remainder of the indexed abstract lost to report-form residue; indexed keywords: stochastic optimal control, fractional Brownian motion]

  12. Rage against the machine? Google's self-driving cars versus human drivers.

    PubMed

    Teoh, Eric R; Kidd, David G

    2017-12-01

    Automated driving represents both challenges and opportunities in highway safety. Google has been developing self-driving cars and testing them under employee supervision on public roads since 2009. These vehicles have been involved in several crashes, and it is of interest how this testing program compares to human drivers in terms of safety. Google car crashes were coded by type and severity based on narratives released by Google. Crash rates per million vehicle miles traveled (VMT) were computed for crashes deemed severe enough to be reportable to police. These were compared with police-reported crash rates for human drivers. Crash types also were compared. Google cars had a much lower rate of police-reportable crashes per million VMT than human drivers in Mountain View, Calif., during 2009-2015 (2.19 vs 6.06), but the difference was not statistically significant. The most common type of collision involving Google cars was when they got rear-ended by another (human-driven) vehicle. Google cars shared responsibility for only one crash. These results suggest Google self-driving cars, while a test program, are safer than conventional human-driven passenger vehicles; however, currently there is insufficient information to fully examine the extent to which disengagements affected these results. Results suggest that highly-automated vehicles can perform more safely than human drivers in certain conditions, but will continue to be involved in crashes with conventionally-driven vehicles. Copyright © 2017. Published by Elsevier Ltd.
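
    The rate computation and a standard significance check can be sketched in Python. The crash counts below are illustrative back-calculations consistent with the quoted rates (the abstract reports only the rates themselves), and the conditional binomial test for comparing two Poisson rates is a generic textbook choice, not necessarily the authors' method.

    ```python
    from scipy.stats import binomtest

    # Hypothetical counts chosen to reproduce the quoted rates
    google_crashes, google_vmt = 3, 1.37e6
    human_crashes, human_vmt = 606, 1.0e8

    rate = lambda k, vmt: 1e6 * k / vmt
    print(f"Google: {rate(google_crashes, google_vmt):.2f} per million VMT")
    print(f"Human : {rate(human_crashes, human_vmt):.2f} per million VMT")

    # Conditional test for equal Poisson rates: given the total count, the
    # Google share is Binomial with p = its share of exposure under the null
    p_null = google_vmt / (google_vmt + human_vmt)
    result = binomtest(google_crashes, google_crashes + human_crashes, p_null)
    print(f"p-value = {result.pvalue:.3f}")
    # Not significant at the 5% level for these illustrative counts, matching
    # the abstract's conclusion.
    ```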

  13. Evaluation of the toroidal torque driven by external non-resonant non-axisymmetric magnetic field perturbations in a tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasilov, Sergei V.; Institute of Plasma Physics National Science Center “Kharkov Institute of Physics and Technology” ul. Akademicheskaya 1, 61108 Kharkov; Kernbichler, Winfried

    2014-09-15

    The toroidal torque driven by external non-resonant magnetic perturbations (neoclassical toroidal viscosity) is an important momentum source affecting the toroidal plasma rotation in tokamaks. The well-known force-flux relation directly links this torque to the non-ambipolar neoclassical particle fluxes arising due to the violation of the toroidal symmetry of the magnetic field. Here, a quasilinear approach for the numerical computation of these fluxes is described, which reduces the dimension of a standard neoclassical transport problem by one without model simplifications of the linearized drift kinetic equation. The only limiting condition is that the non-axisymmetric perturbation field is small enough such that the effect of the perturbation field on particle motion within the flux surface is negligible. Therefore, in addition to most of the transport regimes described by the banana (bounce averaged) kinetic equation also such regimes as, e.g., ripple-plateau and resonant diffusion regimes are naturally included in this approach. Based on this approach, a quasilinear version of the code NEO-2 [W. Kernbichler et al., Plasma Fusion Res. 3, S1061 (2008).] has been developed and benchmarked against a few analytical and numerical models. Results from NEO-2 stay in good agreement with results from these models in their pertinent range of validity.

  14. A new practice-driven approach to develop software in a cyber-physical system environment

    NASA Astrophysics Data System (ADS)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

    Cyber-physical systems (CPS) are an emerging area which cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) often used to develop CPS software needs some improvements to match the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved in software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures at every stage, making it better suited to event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to operate in practice owing to some simplifications. The running result illustrates the validity of this approach.

  15. The non-independence discussion about cycle structure in the computer language: the final simplification of computer language in the structural design

    NASA Astrophysics Data System (ADS)

    Yang, Peilu

    2013-03-01

    The article first discusses the theory, content, development, and open questions of structured program design. Extending this foundation, it observes that structured programming treats the sequence structure, the branch structure, and the cycle structure of a computer language as mutually independent. Through further research, the author finds that the cycle structure is not independent and thereby reaches a final simplification of computer language in structural design: the author proposes the linear structure (I structure) and the curvilinear structure (Y structure), which give the computer language high efficiency and simplicity during program development. The research corresponds with the dualistic structure widely used in the computer field and, moreover, promotes the evolution of computer languages.
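
    One direction of the non-independence claim, that a cycle can be expressed without any loop construct, can be illustrated with only sequence, branch, and a self-call, as in the short Python sketch below. It is a generic illustration of that claim, not the article's I and Y structures.

    ```python
    def count_down(n):
        """A 'cycle' built from sequence and branch structures plus a self-call:
        no loop construct appears, illustrating (in one direction) the claim
        that the cycle structure is not an independent primitive."""
        if n <= 0:          # branch structure
            return
        print(n)            # sequence structure
        count_down(n - 1)   # re-entry replaces the loop back-edge

    count_down(3)  # prints 3, 2, 1
    ```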

  16. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    PubMed

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact on the task of protein-protein interaction (PPI) extraction: it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
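
    The "shot-gun" idea of emitting many sentence versions by combining constituent alternatives can be sketched generically in Python, as below. The constituent variants are hand-built here, whereas BioSimplify derives them from parsing; this is an illustration of the approach, not the tool's actual API.

    ```python
    from itertools import product

    # Hand-built variants of a sentence's constituents; in the real tool these
    # come from parsing. Every combination yields one candidate version, many
    # of them simpler than the original sentence.
    subject_variants = ["The kinase, a signalling enzyme,", "The kinase"]
    verb_variants = ["is known to phosphorylate", "phosphorylates"]
    object_variants = ["the receptor in vitro", "the receptor"]

    candidates = [" ".join(parts) + "."
                  for parts in product(subject_variants, verb_variants,
                                       object_variants)]
    for s in candidates:
        print(s)
    # Downstream extractors run on all variants; a relation missed in the
    # original sentence is often recovered from a simpler version (higher recall).
    ```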

  17. Emerging Technologies for the Clinical Microbiology Laboratory

    PubMed Central

    Buchan, Blake W.

    2014-01-01

    SUMMARY In this review we examine the literature related to emerging technologies that will help to reshape the clinical microbiology laboratory. These topics include nucleic acid amplification tests such as isothermal and point-of-care molecular diagnostics, multiplexed panels for syndromic diagnosis, digital PCR, next-generation sequencing, and automation of molecular tests. We also review matrix-assisted laser desorption ionization–time of flight (MALDI-TOF) and electrospray ionization (ESI) mass spectrometry methods and their role in identification of microorganisms. Lastly, we review the shift to liquid-based microbiology and the integration of partial and full laboratory automation that are beginning to impact the clinical microbiology laboratory. PMID:25278575

  18. Automated identification and indexing of dislocations in crystal interfaces

    DOE PAGES

    Stukowski, Alexander; Bulatov, Vasily V.; Arsenlis, Athanasios

    2012-10-31

    Here, we present a computational method for identifying partial and interfacial dislocations in atomistic models of crystals with defects. Our automated algorithm is based on a discrete Burgers circuit integral over the elastic displacement field and is not limited to specific lattices or dislocation types. Dislocations in grain boundaries and other interfaces are identified by mapping atomic bonds from the dislocated interface to an ideal template configuration of the coherent interface to reveal incompatible displacements induced by dislocations and to determine their Burgers vectors. Additionally, the algorithm generates a continuous line representation of each dislocation segment in the crystal and also identifies dislocation junctions.
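
    The continuum analogue of the Burgers circuit is easy to demonstrate: integrating the displacement increments of a textbook edge-dislocation field around a closed loop leaves a closure failure equal to the Burgers vector. The Python sketch below does this with illustrative parameters; the paper's discrete method instead maps atomic bonds to an ideal template configuration.

    ```python
    import numpy as np

    b, nu = 1.0, 0.3   # Burgers vector magnitude and Poisson ratio (illustrative)

    def u_x(theta):
        """x-displacement of a straight edge dislocation in isotropic
        elasticity (textbook field, e.g. Hirth & Lothe), written with a
        continuously increasing angle so its multivaluedness is explicit;
        for these terms the dependence on circuit radius drops out."""
        return (b / (2 * np.pi)) * (
            theta + np.sin(theta) * np.cos(theta) / (2 * (1 - nu)))

    # Walk a closed circuit around the dislocation line: the displacement
    # increments telescope, so the closure failure is the end-point difference
    # of the unwrapped field, and it recovers the Burgers vector.
    closure_failure = u_x(2 * np.pi) - u_x(0.0)
    print(f"closure failure = {closure_failure:.6f}  (expected b = {b})")
    ```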

  19. Comparison of DNA decatenation by Escherichia coli topoisomerase IV and topoisomerase III: implications for non-equilibrium topology simplification

    PubMed Central

    Seol, Yeonee; Hardin, Ashley H.; Strub, Marie-Paule; Charvin, Gilles; Neuman, Keir C.

    2013-01-01

    Type II topoisomerases are essential enzymes that regulate DNA topology through a strand-passage mechanism. Some type II topoisomerases relax supercoils, unknot and decatenate DNA to below thermodynamic equilibrium. Several models of this non-equilibrium topology simplification phenomenon have been proposed. The kinetic proofreading (KPR) model postulates that strand passage requires a DNA-bound topoisomerase to collide twice in rapid succession with a second DNA segment, implying a quadratic relationship between DNA collision frequency and relaxation rate. To test this model, we used a single-molecule assay to measure the unlinking rate as a function of DNA collision frequency for Escherichia coli topoisomerase IV (topo IV) that displays efficient non-equilibrium topology simplification activity, and for E. coli topoisomerase III (topo III), a type IA topoisomerase that unlinks and unknots DNA to equilibrium levels. Contrary to the predictions of the KPR model, topo IV and topo III unlinking rates were linearly related to the DNA collision frequency. Furthermore, topo III exhibited decatenation activity comparable with that of topo IV, supporting proposed roles for topo III in DNA segregation. This study enables us to rule out the KPR model for non-equilibrium topology simplification. More generally, we establish an experimental approach to systematically control DNA collision frequency. PMID:23460205
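
    The discriminating power of the measurement comes down to a fitted exponent. The Python sketch below, on synthetic data invented for illustration, fits the log-log slope of unlinking rate versus collision frequency: a slope near 1 favors simple strand passage, while a slope near 2 would favor kinetic proofreading.

    ```python
    import numpy as np

    # Synthetic (frequency, rate) data for illustration; the paper measures
    # single-molecule unlinking rates at controlled DNA collision frequencies
    freq = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    rate = 0.3 * freq * (1 + 0.05 * np.random.default_rng(2).normal(size=5))

    # KPR predicts rate ~ freq**2; simple strand passage predicts rate ~ freq.
    # The log-log slope discriminates between the two models.
    slope, _ = np.polyfit(np.log(freq), np.log(rate), 1)
    print(f"fitted exponent = {slope:.2f}  (~1: linear, ~2: kinetic proofreading)")
    ```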

  20. Sleep deprivation affects reactivity to positive but not negative stimuli.

    PubMed

    Pilcher, June J; Callan, Christina; Posey, J Laura

    2015-12-01

    The current study examined the effects of partial and total sleep deprivation on emotional reactivity. Twenty-eight partially sleep-deprived participants and 31 totally sleep-deprived participants rated their valence and arousal responses to positive and negative pictures across four testing sessions during the day following partial sleep deprivation or during the night under total sleep deprivation. The results suggest that valence and arousal ratings decreased under both sleep deprivation conditions. In addition, partial and total sleep deprivation had a greater negative effect on positive events than negative events. These results suggest that sleep-deprived persons are more likely to respond less to positive events than negative events. One explanation for the current findings is that negative events could elicit more attentive behavior and thus stable responding under sleep deprivation conditions. As such, sleep deprivation could impact reactivity to emotional stimuli through automated attentional and self-regulatory processes. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Partial pneumonectomy of telomerase null mice carrying shortened telomeres initiates cell growth arrest resulting in a limited compensatory growth response

    PubMed Central

    Jackson, Sha-Ron; Lee, Jooeun; Reddy, Raghava; Williams, Genevieve N.; Kikuchi, Alexander; Freiberg, Yael; Warburton, David

    2011-01-01

    Telomerase mutations and significantly shortened chromosomal telomeres have recently been implicated in human lung pathologies. Natural telomere shortening is an inevitable consequence of aging, which is also a risk factor for development of lung disease. However, the impact of shortened telomeres and telomerase dysfunction on the ability of lung cells to respond to significant challenge is still largely unknown. We have previously shown that lungs of late generation, telomerase null B6.Cg-Terctm1Rdp mice feature alveolar simplification and chronic stress signaling at baseline, a phenocopy of aged lung. To determine the role telomerase plays when the lung is challenged, B6.Cg-Terctm1Rdp mice carrying shortened telomeres and wild-type controls were subjected to partial pneumonectomy. We found that telomerase activity was strongly induced in alveolar epithelial type 2 cells (AEC2) of the remaining lung immediately following surgery. Eighty-six percent of wild-type animals survived the procedure and exhibited a burst of early compensatory growth marked by upregulation of proliferation, stress response, and DNA repair pathways in AEC2. In B6.Cg-Terctm1Rdp mice carrying shortened telomeres, response to pneumonectomy was characterized by decreased survival, diminished compensatory lung growth, attenuated distal lung progenitor cell response, persistent DNA damage, and cell growth arrest. Overall, survival correlated strongly with telomere length. We conclude that functional telomerase and properly maintained telomeres play key roles in both long-term survival and the early phase of compensatory lung growth following partial pneumonectomy. PMID:21460122

  2. Pressure- and buoyancy-driven thermal convection in a rectangular enclosure

    NASA Technical Reports Server (NTRS)

    Spradley, L. W.; Churchill, S. W.

    1975-01-01

    Results are presented for unsteady laminar thermal convection in compressible fluids at various reduced levels of gravity in a rectangular enclosure which is heated on one side and cooled on the opposite side. The results were obtained by solving numerically the equations of conservation for a viscous, compressible, heat-conducting, ideal gas in the presence of a gravitational body force. The formulation differs from the Boussinesq simplification in that the effects of variable density are completely retained. A conservative, explicit, time-dependent, finite-difference technique was used and good agreement was found for the limited cases where direct comparison with previous investigations was possible. The solutions show that the thermally induced motion is acoustic in nature at low levels of gravity and that the unsteady-state rate of heat transfer is thereby greatly enhanced relative to pure conduction. The nonlinear variable density profile skews the streamlines towards the cooler walls but is shown to have little effect on the steady-state isotherms.

  3. Time Scale for Adiabaticity Breakdown in Driven Many-Body Systems and Orthogonality Catastrophe

    NASA Astrophysics Data System (ADS)

    Lychkovskiy, Oleg; Gamayun, Oleksandr; Cheianov, Vadim

    2017-11-01

    The adiabatic theorem is a fundamental result in quantum mechanics, which states that a system can be kept arbitrarily close to the instantaneous ground state of its Hamiltonian if the latter varies in time slowly enough. The theorem has an impressive record of applications ranging from foundations of quantum field theory to computational molecular dynamics. In light of this success it is remarkable that a practicable quantitative understanding of what "slowly enough" means is limited to a modest set of systems mostly having a small Hilbert space. Here we show how this gap can be bridged for a broad natural class of physical systems, namely, many-body systems where a small move in the parameter space induces an orthogonality catastrophe. In this class, the conditions for adiabaticity are derived from the scaling properties of the parameter-dependent ground state without a reference to the excitation spectrum. This finding constitutes a major simplification of a complex problem, which otherwise requires solving nonautonomous time evolution in a large Hilbert space.
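
    As a point of reference (the standard textbook criterion, not the authors' result), the conventional adiabatic condition bounds the driving rate through matrix elements and spectral gaps,

        $$\max_{t}\,\frac{\big|\langle m(t)\,|\,\partial_t H(t)\,|\,0(t)\rangle\big|}{\big(E_m(t)-E_0(t)\big)^2} \ll 1,$$

    which requires knowledge of the excitation spectrum. The approach described above replaces this with the scaling, in system size, of the ground-state overlap $\big|\langle 0(\lambda)\,|\,0(\lambda+\delta\lambda)\rangle\big|$, the quantity whose decay defines the orthogonality catastrophe.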

  4. Energy landscape-driven non-equilibrium evolution of inherent structure in disordered material

    DOE PAGES

    Fan, Yue; Iwashita, Takuya; Egami, Takeshi

    2017-05-19

    Complex states in glasses can be neatly expressed by the potential energy landscape (PEL). But because the PEL is highly multi-dimensional, it is difficult to describe how the system moves around in it. We demonstrate that it is possible to predict the evolution of the macroscopic state of a metallic glass, such as ageing and rejuvenation, through a set of simple equations describing excitations in the PEL. The key to this simplification is the realization that the step of activation from the initial state to the saddle point in the PEL and the following step of relaxation to the final state are essentially decoupled. Furthermore, the model shows that the interplay between activation and relaxation in the PEL is the key driving force that simultaneously explains both the equilibrium of the supercooled liquid and the thermal hysteresis observed in experiments. It further predicts anomalous peaks in truncated thermal scanning, validated by independent molecular dynamics simulation.

  5. Recent advances and plans in processing and geocoding of SAR data at the DFD

    NASA Technical Reports Server (NTRS)

    Noack, W.

    1993-01-01

    Because of the needs of future projects like ENVISAT and the experience gained with the current operational ERS-1 facilities, a radical change in synthetic aperture radar (SAR) processing scenarios can be predicted for the coming years. At the German PAF, several new developments were initiated, driven mainly either by user needs or by system and operational constraints ('lessons learned'). The end result will be a major simplification and unification of all computer systems used. In particular, the following changes are likely to be implemented at the German PAF: transcription before archiving, processing of all standard products with high throughput directly at the receiving stations, processing of special 'high-valued' products at the PAF, usage of a single type of processor hardware, implementation of a large and fast on-line data archive, and an improved and unified fast data network between the processing and archiving facilities. A short description of the current operational SAR facilities as well as the future implementations is given.

  6. Two consecutive microtubule-based epithelial seaming events mediate dorsal closure in the scuttle fly Megaselia abdita.

    PubMed

    Fraire-Zamora, Juan Jose; Jaeger, Johannes; Solon, Jérôme

    2018-03-14

    Evolution of morphogenesis is generally associated with changes in genetic regulation. Here, we report evidence indicating that dorsal closure, a conserved morphogenetic process in dipterans, evolved as the consequence of rearrangements in epithelial organization rather than signaling regulation. In Drosophila melanogaster, dorsal closure consists of a two-tissue system where the contraction of extraembryonic amnioserosa and a JNK/Dpp-dependent epidermal actomyosin cable result in microtubule-dependent seaming of the epidermis. We find that dorsal closure in Megaselia abdita, a three-tissue system comprising serosa, amnion and epidermis, differs in morphogenetic rearrangements despite conservation of JNK/Dpp signaling. In addition to an actomyosin cable, M. abdita dorsal closure is driven by the rupture and contraction of the serosa and the consecutive microtubule-dependent seaming of amnion and epidermis. Our study indicates that the evolutionary transition to a reduced system of dorsal closure involves simplification of the seaming process without changing the signaling pathways of closure progression. © 2018, Fraire-Zamora et al.

  7. Expanding the informational chemistries of life: peptide/RNA networks

    NASA Astrophysics Data System (ADS)

    Taran, Olga; Chen, Chenrui; Omosun, Tolulope O.; Hsieh, Ming-Chien; Rha, Allisandra; Goodwin, Jay T.; Mehta, Anil K.; Grover, Martha A.; Lynn, David G.

    2017-11-01

    The RNA world hypothesis simplifies the complex biopolymer networks underlying the informational and metabolic needs of living systems to a single biopolymer scaffold. This simplification requires abiotic reaction cascades for the construction of RNA, and this chemistry remains the subject of active research. Here, we explore a complementary approach involving the design of dynamic peptide networks capable of amplifying encoded chemical information and setting the stage for mutualistic associations with RNA. Peptide conformational networks are known to be capable of evolution in disease states and of co-opting metal ions, aromatic heterocycles and lipids to extend their emergent behaviours. The coexistence and association of dynamic peptide and RNA networks appear to have driven the emergence of higher-order informational systems in biology that are not available to either scaffold independently, and such mutualistic interdependence poses critical questions regarding the search for life across our Solar System and beyond. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  8. Two consecutive microtubule-based epithelial seaming events mediate dorsal closure in the scuttle fly Megaselia abdita

    PubMed Central

    Jaeger, Johannes

    2018-01-01

    Evolution of morphogenesis is generally associated with changes in genetic regulation. Here, we report evidence indicating that dorsal closure, a conserved morphogenetic process in dipterans, evolved as the consequence of rearrangements in epithelial organization rather than signaling regulation. In Drosophila melanogaster, dorsal closure consists of a two-tissue system where the contraction of extraembryonic amnioserosa and a JNK/Dpp-dependent epidermal actomyosin cable result in microtubule-dependent seaming of the epidermis. We find that dorsal closure in Megaselia abdita, a three-tissue system comprising serosa, amnion and epidermis, differs in morphogenetic rearrangements despite conservation of JNK/Dpp signaling. In addition to an actomyosin cable, M. abdita dorsal closure is driven by the rupture and contraction of the serosa and the consecutive microtubule-dependent seaming of amnion and epidermis. Our study indicates that the evolutionary transition to a reduced system of dorsal closure involves simplification of the seaming process without changing the signaling pathways of closure progression. PMID:29537962

  9. Task-focused modeling in automated agriculture

    NASA Astrophysics Data System (ADS)

    Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack

    1993-01-01

    Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.

  10. RCrane: semi-automated RNA model building.

    PubMed

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  11. The Problem of Labels in E-Assessment of Diagrams

    ERIC Educational Resources Information Center

    Jayal, Ambikesh; Shepperd, Martin

    2009-01-01

    In this article we explore a problematic aspect of automated assessment of diagrams. Diagrams have partial and sometimes inconsistent semantics. Typically much of the meaning of a diagram resides in the labels; however, the choice of labeling is largely unrestricted. This means a correct solution may utilize differing yet semantically equivalent…

  12. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or a cam-driven system, provides high speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are increasingly being replaced by multi-degree-of-freedom, multi-actuator systems driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARMs), or 'programmable mechanisms', as a middle ground between high speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms towards cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARMs and lays the theoretical foundation for synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and a computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphic-intensive, user-friendly computer program. A physical prototype of a general purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.
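
    For readers unfamiliar with dyad synthesis, the classical Burmester-type loop-closure equation in standard complex-number form (a generic statement of the technique the dissertation extends, not its generalized adjustable version) reads

        $$\mathbf{W}\left(e^{i\beta_j} - 1\right) + \mathbf{Z}\left(e^{i\alpha_j} - 1\right) = \boldsymbol{\delta}_j, \qquad j = 2,\dots,n,$$

    where $\mathbf{W}$ and $\mathbf{Z}$ are the unknown dyad vectors, $\beta_j$ and $\alpha_j$ their rotations, and $\boldsymbol{\delta}_j$ the prescribed displacements of the precision positions. Adjustable synthesis, as described above, solves such systems simultaneously for multiple motion specifications that share some link parameters.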

  13. Advances on the constitutive characterization of composites via multiaxial robotic testing and design optimization

    Treesearch

    John G. Michopoulos; John Hermanson; Athanasios Iliopoulos

    2014-01-01

    The research areas of multiaxial robotic testing and design optimization have recently been utilized for the purpose of data-driven constitutive characterization of anisotropic material systems. This effort has been enabled by both the progress in the areas of computers and information in engineering as well as the progress in computational automation. Although our...

  14. Lights Out: Foreseeable Catastrophic Effects of Geomagnetic Storms on the North American Power Grid and How to Mitigate Them

    DTIC Science & Technology

    2011-08-21

    poultry, pork, beef, fish, and other meat products also are typically automated operations, done on electrically driven processing lines. ... Power Outage Impact on Infrastructure ... Power Outage Impact on Consumables (Food, Water, Medication) ... transportation, consumables (food, water, and medication), and emergency services are so highly dependent on reliable power supply from the grid, a

  15. SnoMAP: Pioneering the Path for Clinical Coding to Improve Patient Care.

    PubMed

    Lawley, Michael; Truran, Donna; Hansen, David; Good, Norm; Staib, Andrew; Sullivan, Clair

    2017-01-01

    The increasing demand for healthcare and the static resources available necessitate data-driven improvements in healthcare at large scale. The SnoMAP tool was rapidly developed to provide an automated solution that transforms and maps clinician-entered data to provide data fit for both administrative and clinical purposes. Accuracy of data mapping was maintained.

  16. Closing the Loop: Automated Data-Driven Cognitive Model Discoveries Lead to Improved Instruction and Learning Gains

    ERIC Educational Resources Information Center

    Liu, Ran; Koedinger, Kenneth R.

    2017-01-01

    As the use of educational technology becomes more ubiquitous, an enormous amount of learning process data is being produced. Educational data mining seeks to analyze and model these data, with the ultimate goal of improving learning outcomes. The most firmly grounded and rigorous evaluation of an educational data mining discovery is whether it…

  17. Analyzing Vehicle Fuel Saving Opportunities through Intelligent Driver Feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Earleywine, M.; Sparks, W.

    2012-06-01

    Driving style changes, e.g., improving driver efficiency and motivating driver behavior changes, could deliver significant petroleum savings. This project examines eliminating stop-and-go driving and unnecessary idling, and also adjusting acceleration rates and cruising speeds to ideal levels, to quantify fuel savings. Such extreme adjustments can result in dramatic fuel savings of over 30%, but would in reality only be achievable through automated control of vehicles and traffic flow. In real-world driving, efficient driving behaviors could reduce fuel use by 20% on aggressively driven cycles and by 5-10% on more moderately driven trips. A literature survey was conducted of driver behavior influences, and pertinent factors from on-road experiments with different driving styles were observed. This effort highlighted important driver influences such as surrounding vehicle behavior, anxiety over trying to get somewhere quickly, and the power/torque available from the vehicle. Existing feedback approaches often deliver efficiency information and instruction. Three recommendations for maximizing fuel savings from potential drive cycle improvement are: (1) leveraging applications with enhanced incentives, (2) using an approach that is easy and widely deployable to motivate drivers, and (3) utilizing connected vehicle and automation technologies to achieve large and widespread efficiency improvements.

  18. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    PubMed Central

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  19. A roadmap to defining the clinical reportable ranges of chemistry analytes: Increasing automation efficiency and decreasing manual dilutions.

    PubMed

    Lo, Sheng-Ying; Baird, Geoffrey S; Greene, Dina N

    2015-12-07

    Proper utilization of resources is an important operational objective for clinical laboratories. To reduce unnecessary manual interventions on automated instruments, we conducted a workflow analysis that optimized dilution parameters and reporting of abnormally high chemistry results for the Beckman AU series of chemistry analyzers while maintaining clinically acceptable reportable ranges. Workflow analysis for the Beckman AU680/5812 and DxC800 chemistry analyzers was performed using historical data. Clinical reportable ranges for 53 chemistry analytes were evaluated. Optimized dilution parameters and upper limit of reportable ranges for the AU680/5812 instruments were derived and validated to meet these reportable ranges. The number of specimens that required manual dilutions before and after optimization was determined for both the AU680/5812 and DxC800, with the DxC800 serving as the reference instrument. Retrospective data analysis revealed that 7700 specimens required manual dilutions on the DxC over a 2-y period. Using our optimized AU-specific dilution and reporting parameters, the data-driven simulation analysis showed a 61% reduction in manual dilutions. For the specimens that required manual dilutions on the AU680/5812, we developed standardized dilution procedures to further streamline workflow. We provide a data-driven, practical outline for clinical laboratories to efficiently optimize their use of automated chemistry analyzers. The outcomes can be used to assist laboratories wishing to improve their existing procedures or to facilitate transitioning into a new line of instrumentation, regardless of the instrument model or manufacturer. Copyright © 2015 Elsevier B.V. All rights reserved.
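
    The kind of reporting logic being optimized can be illustrated with a minimal sketch (hypothetical function and threshold names; the paper's validated parameters are instrument- and analyte-specific):

        # Minimal sketch of automated dilution selection for a result above
        # the analytical measurement range (AMR). All names are illustrative.
        def choose_dilution(raw_result, amr_high, onboard_dilutions=(2, 5, 10)):
            """Return 1 if the result is reportable as measured, the smallest
            on-board dilution that brings it back on scale, or None when a
            manual dilution is still required."""
            if raw_result <= amr_high:
                return 1
            for d in sorted(onboard_dilutions):
                if raw_result / d <= amr_high:
                    return d
            return None

        # e.g. a result of 1200 against an AMR top of 700 triggers a 2x dilution
        print(choose_dilution(1200, 700))  # -> 2

    Widening the validated on-board dilution list is exactly what reduces the manual-dilution workload described above.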

  20. A method for automated control of belt velocity changes with an instrumented treadmill.

    PubMed

    Hinkel-Lipsker, Jacob W; Hahn, Michael E

    2016-01-04

    Increased practice difficulty during asymmetrical split-belt treadmill rehabilitation has been shown to improve gait outcomes during retention and transfer tests. However, research in this area has been limited by manual treadmill operation. In the case of variable practice, which requires stride-by-stride changes to treadmill belt velocities, the treadmill control must be automated. This paper presents a method for automation of asymmetrical split-belt treadmill walking, and evaluates how well this method performs with regard to timing of gait events. One participant walked asymmetrically for 100 strides, where the non-dominant limb was driven at their self-selected walking speed, while the other limb was driven randomly on a stride-by-stride basis. In the control loop, the key factors to ensure that the treadmill belt had accelerated to its new velocity safely during the swing phase were the sampling rate of the A/D converter, processing time within the controller software, and acceleration of the treadmill belt. The combination of these three factors resulted in a total control loop time during each swing phase that satisfied these requirements with a factor of safety that was greater than 4. Further, a polynomial fit indicated that belt acceleration was the largest contributor to changes in this total time. This approach appears to be safe and reliable for stride-by-stride adjustment of treadmill belt speed, making it suitable for future asymmetrical split-belt walking studies. Further, it can be incorporated into virtual reality rehabilitation paradigms that utilize split-belt treadmill walking. Copyright © 2015 Elsevier Ltd. All rights reserved.
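
    The timing argument can be made concrete with a small sketch (illustrative numbers, not the paper's measured values):

        # Worst-case control-loop time for a mid-swing belt speed change:
        # one A/D sample period to detect toe-off, controller processing,
        # then the time for the belt to reach its new velocity.
        def control_loop_time(delta_v, belt_accel, sample_period, processing_time):
            return sample_period + processing_time + abs(delta_v) / belt_accel

        # Illustrative values: 1 kHz sampling, 10 ms processing,
        # a 0.5 m/s speed change at 6 m/s^2 belt acceleration.
        loop = control_loop_time(0.5, 6.0, 0.001, 0.010)
        swing_time = 0.45             # illustrative swing duration, s
        print(swing_time / loop)      # factor of safety ~ 4.8

    As the paper notes, the belt-acceleration term dominates this total, so it is the main lever for keeping the factor of safety high.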

  1. Digital control and data acquisition for high-value GTA welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, T.G.; Franco-Ferreira, E.A.

    1993-10-01

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTGs) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of ²³⁸PuO₂, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally-controlled welding system. This paper discusses baseline design considerations for system automation and strategies employed to maximize process yield, improve process consistency, and generate required quality assurance information. Design of the automated girth welding system was driven by a number of factors which called for precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE order 5700.6B, which took effect on 23 September 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  2. Potentials for Platooning in U.S. Highway Freight Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muratori, Matteo; Holden, Jacob; Lammert, Michael

    2017-03-28

    Smart technologies enabling connection among vehicles and between vehicles and infrastructure, as well as vehicle automation to assist human operators, are receiving significant attention as a means for improving road transportation systems by reducing fuel consumption - and related emissions - while also providing additional benefits through improving overall traffic safety and efficiency. For truck applications, which are currently responsible for nearly three-quarters of the total U.S. freight energy use and greenhouse gas (GHG) emissions, platooning has been identified as an early feature for connected and automated vehicles (CAVs) that could provide significant fuel savings and improved traffic safety and efficiency without radical design or technology changes compared to existing vehicles. A statistical analysis was performed based on a large collection of real-world U.S. truck usage data to estimate the fraction of total miles that are technically suitable for platooning. In particular, our analysis focuses on estimating 'platoonable' mileage based on overall highway vehicle use and prolonged high-velocity traveling, and established that about 65% of the total miles driven by combination trucks from this data sample could be driven in platoon formation, leading to a 4% reduction in total truck fuel consumption. This technical potential for 'platoonable' miles in the United States provides an upper bound for scenario analysis considering fleet willingness and convenience to platoon as an estimate of overall benefits of early adoption of connected and automated vehicle technologies. A benefit analysis is proposed to assess the overall potential for energy savings and emissions mitigation by widespread implementation of highway platooning for trucks.
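
    The headline numbers are consistent with a simple back-of-the-envelope combination (the per-platooned-mile saving below is an assumed illustrative figure, not taken from the study):

        platoonable_fraction = 0.65       # share of combination-truck miles (from the analysis)
        saving_while_platooning = 0.065   # assumed average fuel saving on platooned miles
        fleet_saving = platoonable_fraction * saving_while_platooning
        print(f"{fleet_saving:.1%}")      # ~4.2%, in line with the ~4% total reduction quoted

    Scenario analyses that discount the platoonable fraction for fleet willingness simply scale this product down.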

  3. Simplification of the helical TEN2 laser

    NASA Astrophysics Data System (ADS)

    Krahn, K.-H.

    1980-04-01

    It is shown that the helical TEN2 laser can be simplified effectively by giving up the use of decoupling elements and by abolishing the segmentation of the electrode structure. Although the operating pressure range was slightly decreased as a consequence of this simplification, the output power was improved by roughly 30%, a result attributed to the new electrode geometry, which exhibits lower inductance and lower damping losses.

  4. Rightsizing LISA

    NASA Technical Reports Server (NTRS)

    Stebbins, Robin T.

    2009-01-01

    The LISA science requirements and conceptual design have been fairly stable for over a decade. In the interest of reducing costs, the LISA Project at NASA has looked at simplifications of the architecture, downsizing of subsystems, and descopes of the entire mission. This is a natural activity of the formulation phase, and one that is particularly timely in the current NASA budgetary context. There is, and will continue to be, enormous pressure for cost reduction from both ESA and NASA, reviewers, and the broader research community. Here, the rationale for the baseline architecture is reviewed, and recent efforts to find simplifications and other reductions that might lead to savings are reported. A few possible simplifications have been found in the LISA baseline architecture. In the interest of exploring cost sensitivity, one moderate and one aggressive descope have been evaluated; the cost savings are modest and the loss of science is not.

  5. Electric Power Distribution System Model Simplification Using Segment Substitution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). In contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.

  6. An alternative Biot's displacement formulation for porous materials.

    PubMed

    Dazel, Olivier; Brouard, Bruno; Depollier, Claude; Griffiths, Stéphane

    2007-06-01

    This paper proposes an alternative displacement formulation of Biot's linear model for poroelastic materials. Its advantage is a simplification of the formalism without making any additional assumptions. The main difference between the method proposed in this paper and the original one is the choice of the generalized coordinates. In the present approach, the generalized coordinates are chosen in order to simplify the expression of the strain energy, which is expressed as the sum of two decoupled terms. Hence, new equations of motion are obtained whose elastic forces are decoupled. The simplification of the formalism is extended to Biot and Willis thought experiments, and simpler expressions of the parameters of the three Biot waves are also provided. A rigorous derivation of equivalent and limp models is then proposed. It is finally shown that, for the particular case of sound-absorbing materials, additional simplifications of the formalism can be obtained.
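
    To make the notion of decoupling concrete: in the classical Biot formulation the strain energy contains a cross term between the solid and fluid dilatations $e$ and $\epsilon$,

        $$W = \tfrac{1}{2}P\,e^2 + Q\,e\,\epsilon + \tfrac{1}{2}R\,\epsilon^2,$$

    with elastic coefficients $P$, $Q$, $R$. The alternative generalized coordinates described above are chosen so that $W$ reduces to a sum of two quadratic terms with no cross term, which is what decouples the elastic forces in the resulting equations of motion (the classical coupled form is quoted here only for contrast; the new coordinates themselves are defined in the paper).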

  7. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
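
    A minimal sketch of the kind of automated file handling such a workflow enables (hypothetical column names and layout; the project guidelines define the actual format):

        import pathlib
        import pandas as pd

        def collect_results(root, sheet="results"):
            """Aggregate concentration-response data from standardized Excel
            workbooks into one tidy table ready for scripted analysis."""
            frames = []
            for path in sorted(pathlib.Path(root).glob("**/*.xlsx")):
                df = pd.read_excel(path, sheet_name=sheet,
                                   usecols=["compound", "concentration", "response"])
                df["source_file"] = path.name   # keep provenance for auditing
                frames.append(df)
            return pd.concat(frames, ignore_index=True)

    The point of the standardization guidelines is precisely that a loop this simple works across hundreds of files from different partner laboratories.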

  8. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most used marker for the progression of atherosclerosis and the onset of cardiovascular disease. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We call our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 ± 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. CARES processed 96% of the images, leading to a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing of large datasets in multicenter studies involving atherosclerosis.
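
    The validation statistics reported here are the usual bias and 95% limits of agreement, which can be computed as in this short sketch (generic formulas, not CARES code):

        import numpy as np

        def bland_altman(auto_mm, manual_mm):
            """Bias and 95% limits of agreement between paired automated
            and manual IMT measurements (arrays in mm)."""
            diff = np.asarray(auto_mm) - np.asarray(manual_mm)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, bias - half_width, bias + half_width

    A bias of 0.032 mm with a standard deviation of 0.141 mm, as reported above, corresponds to limits of agreement of roughly -0.24 to 0.31 mm.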

  9. Active machine learning-driven experimentation to determine compound effects on protein patterns.

    PubMed

    Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F

    2016-02-03

    High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance.
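
    The core loop of such a learner can be sketched in a few lines (a generic pool-based uncertainty sampler, not the authors' algorithm, with an oracle array standing in for the robotic assay):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def active_learning(X_pool, y_oracle, n_init=8, n_rounds=10, batch=4):
            """Start from a few random experiments, then repeatedly run the
            batch the current model is least certain about."""
            rng = np.random.default_rng(0)
            labeled = list(rng.choice(len(X_pool), n_init, replace=False))
            model = None
            for _ in range(n_rounds):
                model = RandomForestClassifier(n_estimators=100, random_state=0)
                model.fit(X_pool[labeled], y_oracle[labeled])
                uncertainty = 1.0 - model.predict_proba(X_pool).max(axis=1)
                uncertainty[labeled] = -1.0      # never repeat an experiment
                labeled += list(np.argsort(uncertainty)[-batch:])
            return model, labeled

    Stopping once model predictions stabilize is what lets such a learner cover a 48 x 48 condition-target matrix with a fraction of the possible experiments.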

  10. Automated parking garage system model

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1975-01-01

    A one-twenty-fifth scale model of the key components of an automated parking garage system is described. The design of the model required transferring a vehicle from an entry level, vertically (+Z, -Z), to a storage location at any one of four storage positions (+X, -X, +Y, -Y) on the storage levels. There are three primary subsystems: (1) a screw jack to provide the vertical motion of the elevator, (2) a cam-driven track-switching device to provide X to Y motion, and (3) a transfer cart to provide horizontal travel and a small amount of vertical motion for transfer to the storage location. Motive power is provided by dc permanent magnet gear motors, one each for the elevator and the track-switching device and two for the transfer cart drive system (one driving the cart horizontally and the other providing the vertical transfer). The control system, through the use of a microprocessor, provides complete automation through a feedback system which utilizes sensing devices.

  11. A plan for time-phased incorporation of automation and robotics on the US space station

    NASA Technical Reports Server (NTRS)

    Purves, R. B.; Lin, P. S.; Fisher, E. M., Jr.

    1988-01-01

    A plan for the incorporation of Automation and Robotics technology on the Space Station is presented. The time-phased introduction of twenty-two selected candidates is set forth in accordance with a technology development forecast. Twenty candidates were chosen primarily for their potential to relieve the crew of mundane or dangerous operations and maintenance burdens, thus freeing crew time for mission duties and enhancing safety. Two candidates were chosen based on a potential for increasing the productivity of laboratory experiments and thus directly enhancing the scientific value of the Space Station. A technology assessment for each candidate investigates the present state of the art, development timelines including space qualification considerations, and potential for technology transfer to earth applications. Each candidate is evaluated using a crew workload model driven by crew size, number of pressurized U.S. modules, and external payloads, which makes it possible to assess the impact of automation during a growth scenario. Costs for each increment of implementation are estimated and accumulated.

  12. Using generic tool kits to build intelligent systems

    NASA Technical Reports Server (NTRS)

    Miller, David J.

    1994-01-01

    The Intelligent Systems and Robots Center at Sandia National Laboratories is developing technologies for the automation of processes associated with environmental remediation and information-driven manufacturing. These technologies, which focus on automated planning and programming and sensor-based and model-based control, are used to build intelligent systems which are able to generate plans of action, program the necessary devices, and use sensors to react to changes in the environment. By automating tasks through the use of programmable devices tied to computer models which are augmented by sensing, requirements for faster, safer, and cheaper systems are being satisfied. However, because of the need for rapid, cost-effective prototyping and multi-laboratory teaming, it is also necessary to define a consistent approach to the construction of controllers for such systems. As a result, the Generic Intelligent System Controller (GISC) concept has been developed. This concept promotes the philosophy of producing generic tool kits which can be used and reused to build intelligent control systems.

  13. Certification of tactics and strategies in aviation

    NASA Technical Reports Server (NTRS)

    Koelman, Hartmut

    1994-01-01

    The paper suggests that the 'tactics and strategies' notion is a highly suitable paradigm to describe the cognitive involvement of human operators in advanced aviation systems (far more suitable than classical functional analysis), and that the workload and situational awareness of operators are intimately associated with the planning and execution of their tactics and strategies. If system designers have muddled views about the collective tactics and strategies to be used during operation, they will produce sub-optimum designs. If operators use unproven and/or inappropriate tactics and strategies, the system may fail. The author wants to make a point that, beyond certification of people or system designs, there may be a need to go into more detail and examine (certify?) the set of tactics and strategies (i.e., the Operational Concept) which makes the people and systems perform as expected. The collective tactics and strategies determine the information flows and situational awareness which exist in organizations and composite human-machine systems. The available infrastructure and equipment (automation) enable these information flows and situational awareness, but are at the same time the constraining factor. Frequently, the tactics and strategies are driven by technology, whereas we would rather like to see a system designed to support an optimized Operational Concept, i.e., to support a sufficiently coherent, cooperative and modular set of anticipation and planning mechanisms. Again, in line with the view of MacLeod and Taylor (1993), this technology-driven situation may be caused by the system designer's and operator job designer's over-emphasis on functional analysis (a mechanistic engineering concept), at the expense of a subject which does not seem to be well understood today: the role of the (human cognitive and/or automated) tactics and strategies which are embedded in composite human-machine systems. Research would be needed to arrive at a generally accepted 'planning theory' which can elevate the analysis, description and design of tactics and strategies from today's cottage industry methods to an engineering discipline.

  14. Polarimetry based partial least square classification of ex vivo healthy and basal cell carcinoma human skin tissues.

    PubMed

    Ahmad, Iftikhar; Ahmad, Manzoor; Khan, Karim; Ikram, Masroor

    2016-06-01

    Optical polarimetry was employed for assessment of ex vivo healthy and basal cell carcinoma (BCC) tissue samples from human skin. Polarimetric analyses revealed that depolarization and retardance for the healthy tissue group were significantly higher (p<0.001) compared to the BCC tissue group. Histopathology indicated that these differences partially arise from BCC-related characteristic changes in tissue morphology. Wilks lambda statistics demonstrated the potential of all investigated polarimetric properties for computer-assisted classification of the two tissue groups. Based on differences in polarimetric properties, partial least square (PLS) regression classified the samples with 100% accuracy, sensitivity and specificity. These findings indicate that optical polarimetry together with PLS statistics holds promise for automated pathology classification. Copyright © 2016 Elsevier B.V. All rights reserved.
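
    Classification with PLS regression (often called PLS-DA) works by regressing a binary group label on the measured properties and thresholding the prediction; a minimal sketch with scikit-learn (generic code with hypothetical feature matrices, not the authors' pipeline) follows:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def plsda_predict(X_train, y_train, X_test, n_components=2):
            """Fit PLS on polarimetric features (e.g. depolarization,
            retardance) against a 0/1 label (healthy vs. BCC), then
            classify new samples by thresholding the regression output."""
            pls = PLSRegression(n_components=n_components)
            pls.fit(X_train, np.asarray(y_train, dtype=float))
            return (pls.predict(X_test).ravel() > 0.5).astype(int)

    With groups as well separated as those reported above, such a classifier can reach perfect accuracy, sensitivity, and specificity.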

  15. Case and Administrative Support Tools

    EPA Pesticide Factsheets

    Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help reduce office administrative labor costs while increasing employee effectiveness. CAST supports business functions which rely on and store Privacy Act sensitive data (PII). Specific business processes included in CAST (and respective PII) are: -Civil Rights Case Tracking (name, partial medical history, summary of case, and case correspondence). -Employment Law Case Tracking (name, summary of case). -Federal Tort Claims Act Incident Tracking (name, summary of incidents). -Ethics Program Support Tools and Tracking (name, partial financial history). -Summer Honors Application Tracking (name, home address, telephone number, employment history). -Workforce Flexibility Initiative Support Tools (name, alternative workplace phone number). -Resource and Personnel Management Support Tools (name, partial employment and financial history).

  16. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.

    PubMed

    Payre, William; Cestac, Julien; Delhomme, Patricia

    2016-03-01

    An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To increase the efficient use of partially or highly automated driving and to improve safety, some studies have addressed trust in driving automation and training, but few studies have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD. They were distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicated that to mitigate the negative impact of overtrust on reaction time, more appropriate practice may be needed. Drivers should be trained in how the automated device works so as to improve MCR performance in case of an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.

  17. An automatic experimental apparatus to study arm reaching in New World monkeys.

    PubMed

    Yin, Allen; An, Jehi; Lehew, Gary; Lebedev, Mikhail A; Nicolelis, Miguel A L

    2016-05-01

    Several species of New World monkeys have been used as experimental models in biomedical and neurophysiological research. However, a method for controlled arm-reaching tasks has not been developed for these species. We have developed a fully automated, pneumatically driven, portable, and reconfigurable experimental apparatus for arm-reaching tasks suitable for these small primates. We have utilized the apparatus to train two owl monkeys in a visually cued arm-reaching task. Analysis of neural recordings demonstrates directional tuning of the M1 neurons. Our apparatus allows automated control, freeing the experimenter from running trials manually. The presented apparatus provides a valuable tool for conducting neurophysiological research on New World monkeys. Copyright © 2016. Published by Elsevier B.V.

  18. Orthogonal-blendshape-based editing system for facial motion capture data.

    PubMed

    Li, Qing; Deng, Zhigang

    2008-01-01

    The authors present a novel data-driven 3D facial motion capture data editing system using automated construction of an orthogonal blendshape face model and constrained weight propagation, aiming to bridge the popular facial motion capture technique and the blendshape approach. In this work, a 3D facial-motion-capture-editing problem is transformed into a blendshape-animation-editing problem. Given a collected facial motion capture data set, the authors construct a truncated PCA space spanned by the greatest retained eigenvectors and a corresponding blendshape face model for each anatomical region of the human face. As such, modifying blendshape weights (PCA coefficients) is equivalent to editing their corresponding motion capture sequence. In addition, a constrained weight propagation technique allows animators to balance automation and flexible controls.
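
    The equivalence the authors exploit, editing PCA coefficients instead of raw marker trajectories, can be sketched generically as follows (illustrative code, not the published system):

        import numpy as np

        def pca_basis(frames, n_keep):
            """frames: (n_frames, n_coords) mocap data for one facial region.
            Returns the mean pose and the n_keep retained eigenvectors."""
            mean = frames.mean(axis=0)
            _, _, vt = np.linalg.svd(frames - mean, full_matrices=False)
            return mean, vt[:n_keep]

        # The weights (blendshape coefficients) fully describe the sequence:
        #   weights = (frames - mean) @ basis.T
        # An animator edits a weight track, then the sequence is rebuilt:
        #   edited_frames = mean + edited_weights @ basis

    Constrained weight propagation then spreads an edit to neighboring frames so the reconstructed motion stays smooth.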

  19. The value of the Semantic Web in the laboratory.

    PubMed

    Frey, Jeremy G

    2009-06-01

    The Semantic Web is beginning to impact on the wider chemical and physical sciences, beyond bioinformatics, where it was adopted earlier. While useful in large-scale data-driven science with automated processing, these technologies can also help integrate the work of smaller-scale laboratories producing diverse data. The semantics aid discovery and reliable re-use of data, provide improved provenance, and facilitate automated processing through increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools and collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make their mark once the more general-purpose tools are more widely available.

  20. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  1. Partial Red Blood Cell Exchange in Children and Young Patients with Sickle Cell Disease: Manual Versus Automated Procedure.

    PubMed

    Escobar, Carlos; Moniz, Marta; Nunes, Pedro; Abadesso, Clara; Ferreira, Teresa; Barra, António; Lichtner, Anabela; Loureiro, Helena; Dias, Alexandra; Almeida, Helena

    2017-10-31

    The benefits of manual versus automated red blood cell exchange have rarely been documented and studies in young sickle cell disease patients are scarce. We aim to describe and compare our experience with these two procedures. Young patients (≤ 21 years old) who underwent manual or automated red blood cell exchange for prevention or treatment of sickle cell disease complications were included. Clinical, technical and hematological data were prospectively recorded and analyzed. Ninety-four red blood cell exchange sessions were performed over a period of 68 months, including 57 manual and 37 automated, 63 for prevention of chronic complications, 30 for acute complications and one in the pre-operative setting. The mean decrease in sickle hemoglobin levels was higher in automated red blood cell exchange (p < 0.001), which also permitted a higher sickle hemoglobin decrease per volume removed (p < 0.001), while hemoglobin and hematocrit remained stable. Ferritin levels in chronic patients decreased 54%. The most frequent concern was catheter outflow obstruction in manual red blood cell exchange and access alarms in automated red blood cell exchange. No major complication or alloimmunization was recorded. Automated red blood cell exchange decreased sickle hemoglobin levels more efficiently than the manual procedure in the setting of acute and chronic complications of sickle cell disease, with minor technical concerns mainly due to vascular access. The threshold of sickle hemoglobin should be individualized for clinical and hematological goals. In our cohort of young patients, the need for an acceptable venous access was a limiting factor, but iron overload was avoided. Automated red blood cell exchange is safe and well tolerated. It permits a higher sickle hemoglobin removal efficacy, better volume status control and iron-overload avoidance.

  2. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction: Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  3. A Landscape View of Agricultural Insecticide Use across the Conterminous US from 1997 through 2012

    DOE PAGES

    Meehan, Timothy D.; Gratton, Claudio; Zhang, Youjun

    2016-11-30

    Simplification of agricultural landscapes is expected to have positive effects on many crop pests and negative effects on their natural enemies, potentially leading to increased pest pressure, decreased crop yield, and increased insecticide use. While many intermediate links in this causal chain have empirical support, there is mixed evidence for ultimate relationships between landscape simplification, crop yield, and insecticide use, especially at large spatial and temporal scales. We explored relationships between landscape simplification (proportion of a county in harvested cropland) and insecticide use (proportion of harvested cropland treated with insecticides), using county-level data from the US Census of Agriculture and a variety of standard and spatiotemporal regression techniques. The best model indicated that insecticide use across the US has increased between 1997 and 2012, was strongly dependent on the crops grown in a county, increased with average farm income and size, and increased with annual growing degree days. After accounting for those variables, and other unidentified spatial and temporal structure in the data, there remained a statistically significant, moderate, positive relationship between insecticide use and landscape simplification. Finally, these results lend general support to the causal chain outlined above, and to the notion that a landscape perspective is useful for managing ecosystem services that are provided by mobile organisms and valuable to agriculture.
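
    At its simplest, the kind of model being fit can be sketched as a county-level regression (a plain OLS stand-in for the spatiotemporal models actually compared; file and column names are hypothetical):

        import pandas as pd
        import statsmodels.formula.api as smf

        counties = pd.read_csv("county_insecticide.csv")  # hypothetical extract
        # treated_frac: share of harvested cropland treated with insecticide
        # cropland_frac: share of county in harvested cropland (simplification)
        model = smf.ols(
            "treated_frac ~ cropland_frac + farm_income + farm_size"
            " + growing_degree_days + C(crop_mix) + C(year)",
            data=counties,
        ).fit()
        print(model.params["cropland_frac"])  # the landscape-simplification effect

    The residual spatial and temporal structure mentioned above is what motivates the spatiotemporal extensions of this baseline.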

  4. A Landscape View of Agricultural Insecticide Use across the Conterminous US from 1997 through 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meehan, Timothy D.; Gratton, Claudio; Zhang, Youjun

    Simplification of agricultural landscapes is expected to have positive effects on many crop pests and negative effects on their natural enemies, potentially leading to increased pest pressure, decreased crop yield, and increased insecticide use. While many intermediate links in this causal chain have empirical support, there is mixed evidence for ultimate relationships between landscape simplification, crop yield, and insecticide use, especially at large spatial and temporal scales. We explored relationships between landscape simplification (proportion of a county in harvested cropland) and insecticide use (proportion of harvested cropland treated with insecticides), using county-level data from the US Census of Agriculture and a variety of standard and spatiotemporal regression techniques. The best model indicated that insecticide use across the US has increased between 1997 and 2012, was strongly dependent on the crops grown in a county, increased with average farm income and size, and increased with annual growing degree days. After accounting for those variables, and other unidentified spatial and temporal structure in the data, there remained a statistically significant, moderate, positive relationship between insecticide use and landscape simplification. Finally, these results lend general support to the causal chain outlined above, and to the notion that a landscape perspective is useful for managing ecosystem services that are provided by mobile organisms and valuable to agriculture.

  5. Automated cerebral infarct volume measurement in follow-up noncontrast CT scans of patients with acute ischemic stroke.

    PubMed

    Boers, A M; Marquering, H A; Jochem, J J; Besselink, N J; Berkhemer, O A; van der Lugt, A; Beenen, L F; Majoie, C B

    2013-08-01

    Cerebral infarct volume (CIV) as observed in follow-up CT is an important radiologic outcome measure of the effectiveness of treatment of patients with acute ischemic stroke. However, manual measurement of CIV is time-consuming and operator-dependent. The purpose of this study was to develop and evaluate a robust automated measurement of the CIV. The CIV in early follow-up CT images of 34 consecutive patients with acute ischemic stroke was segmented with an automated intensity-based region-growing algorithm, which includes partial volume effect correction near the skull, midline determination, and ventricle and hemorrhage exclusion. Two observers manually delineated the CIV. Interobserver variability of the manual assessments and the accuracy of the automated method were evaluated using the Pearson correlation, Bland-Altman analysis, and Dice coefficients. Accuracy was defined as the correlation with the manual assessment as the reference standard. The Pearson correlation for the automated method compared with the reference standard was similar to the correlation between the two manual observers (R = 0.98). The accuracy of the automated method was excellent, with a mean difference of 0.5 mL and limits of agreement of -38.0 to 39.1 mL, which were more consistent than the interobserver variability of the 2 observers (-40.9 to 44.1 mL). However, the Dice coefficients were higher for the manual delineation. The automated method showed a strong correlation and accuracy with the manual reference measurement. This approach has the potential to become the standard in assessing infarct volume as a secondary outcome measure for evaluating the effectiveness of treatment.
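
    The intensity-based region growing at the core of such a pipeline can be sketched compactly. The snippet below is a generic 6-connected flood fill within an intensity band, not the published algorithm (which adds partial volume correction, midline determination, and ventricle and hemorrhage exclusion); the seed and Hounsfield band are illustrative values.

        from collections import deque
        import numpy as np

        def region_grow(volume, seed, lo, hi):
            """Collect 6-connected voxels whose intensity lies in [lo, hi]."""
            mask = np.zeros(volume.shape, dtype=bool)
            queue = deque([seed])
            while queue:
                z, y, x = queue.popleft()
                if mask[z, y, x] or not (lo <= volume[z, y, x] <= hi):
                    continue
                mask[z, y, x] = True
                for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                   (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                    nz, ny, nx = z + dz, y + dy, x + dx
                    if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                            and 0 <= nx < volume.shape[2]):
                        queue.append((nz, ny, nx))
            return mask

        # Toy volume: an infarct-like hypodense block around 25 HU.
        vol = np.full((20, 64, 64), 40.0)
        vol[8:12, 20:30, 20:30] = 25.0
        infarct = region_grow(vol, (10, 25, 25), 20.0, 30.0)
        print("voxels:", int(infarct.sum()))  # times voxel volume (mm^3) / 1000 -> mL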

  6. A Procedural Content Generation-Based Framework for Educational Games: Toward a Tailored Data-Driven Game for Developing Early English Reading Skills

    ERIC Educational Resources Information Center

    Hooshyar, Danial; Yousefi, Moslem; Lim, Heuiseok

    2018-01-01

    Automated content generation for educational games has become an emerging research problem, as manual authoring is often time consuming and costly. In this article, we present a procedural content generation framework that intends to produce educational game content from the viewpoint of both designer and user. This framework generates content by…

  7. Giant Volume Change of Active Gels under Continuous Flow

    DTIC Science & Technology

    2014-04-21

    harnessing chemical energy to produce motion, for example, using the energy released by ATP hydrolysis to power the directed movement of muscle fibers or micro...microfluidic systems to generate capsules of biopolymer hydrogels, Herr demonstrated the use of gels for automated microfluidic protein blotting, Wu...active gels driven by the Belousov-Zhabotinsky reaction. These results demonstrate that microfluidics offers a useful and facile experimental

  8. Data-Driven Property Estimation for Protective Clothing

    DTIC Science & Technology

    2014-09-01

    reliable predictions falls under the rubric "machine learning". Inspired by the applications of machine learning in pharmaceutical drug design and...using genetic algorithms, for instance, descriptor selection can be automated as well. A well-known structured learning technique, Artificial Neural...descriptors automatically, by iteration, e.g., using a genetic algorithm [49]. 4.2.4 Avoiding Overfitting: A peril of all regression, least squares as

  9. Model Driven Development of Web Services and Dynamic Web Services Composition

    DTIC Science & Technology

    2005-01-01

    2.4.1 Feature-Oriented Domain Analysis (FODA)...2.4.2 The need of automation for Feature-Oriented...Diagram Algebra; FDL, Feature Description Language; FODA, Feature-Oriented Domain Analysis; FSM, Finite State Machine; GDM, Generative Domain...Oriented Domain Analysis (FODA) in Section 2.4 and Aspect-Oriented Generative Domain Modeling (AOGDM) in Section 2.5, which not only represent two

  10. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  11. Automating Partial Period Bond Valuation with Excel's Day Counting Functions

    ERIC Educational Resources Information Center

    Vicknair, David; Spruell, James

    2009-01-01

    An Excel model for calculating the actual price of bonds under a 30 day/month, 360 day/year day counting assumption by nesting the DAYS360 function within the PV function is developed. When programmed into an Excel spreadsheet, the model can accommodate annual and semiannual payment bonds sold on or between interest dates using six fundamental…
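
    For readers without Excel at hand, the same partial-period logic can be sketched in Python. This is a simplified illustration of the US 30/360 convention and street-convention pricing between coupon dates, not the authors' spreadsheet; the function names and example figures are invented, and end-of-February corner cases of the day count are ignored.

        from datetime import date

        def days360(d1, d2):
            """US 30/360 day count, the convention behind Excel's DAYS360
            (February end-of-month corner cases omitted in this sketch)."""
            day1 = min(d1.day, 30)
            day2 = min(d2.day, 30) if day1 == 30 else d2.day
            return ((d2.year - d1.year) * 360 + (d2.month - d1.month) * 30
                    + (day2 - day1))

        def bond_price(face, coupon_rate, ytm, settlement, next_coupon,
                       n_coupons, freq=2):
            """Dirty price between coupon dates: each remaining cash flow is
            discounted by a period count offset by the 30/360 fraction of the
            current coupon period still to run."""
            c = face * coupon_rate / freq
            y = ytm / freq
            w = days360(settlement, next_coupon) / (360 / freq)
            pv = sum(c / (1 + y) ** (w + k) for k in range(n_coupons))
            pv += face / (1 + y) ** (w + n_coupons - 1)
            return pv

        # 6% semiannual-pay bond, 5 coupons remaining, settling mid-period:
        print(round(bond_price(100, 0.06, 0.05, date(2021, 4, 10),
                               date(2021, 7, 1), 5), 4))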

  12. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…

  13. Automated volume of interest delineation and rendering of cone beam CT images in interventional cardiology

    NASA Astrophysics Data System (ADS)

    Lorenz, Cristian; Schäfer, Dirk; Eshuis, Peter; Carroll, John; Grass, Michael

    2012-02-01

    Interventional C-arm systems allow the efficient acquisition of 3D cone beam CT images. They can be used for intervention planning, navigation, and outcome assessment. We present a fast and completely automated volume of interest (VOI) delineation for cardiac interventions, covering the whole visceral cavity, including mediastinum and lungs, but leaving out rib-cage and spine. The problem is addressed in a model-based approach. The procedure has been evaluated on 22 patient cases and achieves an average surface error below 2 mm. The method is able to cope with varying image intensities, varying truncations due to the limited reconstruction volume, and, in part, with heavy metal and motion artifacts.

  14. MIQuant – Semi-Automation of Infarct Size Assessment in Models of Cardiac Ischemic Injury

    PubMed Central

    Esteves, Tiago; de Pina, Maria de Fátima; Guedes, Joana G.; Freire, Ana; Quelhas, Pedro; Pinto-do-Ó, Perpétua

    2011-01-01

    Background: The cardiac regenerative potential of newly developed therapies is traditionally evaluated in rodent models of surgically induced myocardial ischemia. A generally accepted key parameter for determining the success of the applied therapy is the infarct size. Although regarded as a gold standard method for infarct size estimation in heart ischemia, histological planimetry is time-consuming and highly variable amongst studies. The purpose of this work is to contribute towards the standardization and simplification of infarct size assessment by providing free access to a novel semi-automated software tool, to which the acronym MIQuant was attributed. Methodology/Principal Findings: Mice were subjected to permanent coronary artery ligation and the size of chronic infarcts was estimated by area and midline-length methods, using manual planimetry and MIQuant. Repeatability and reproducibility of MIQuant scores were verified. The validation showed high correlation (r = 0.981 for midline length; r = 0.970 for area) and agreement (Bland-Altman analysis), free from bias for midline length and with negligible bias of 1.21% to 3.72% for area quantification. Further analysis demonstrated that MIQuant reduced the time spent on the analysis 4.5-fold and, importantly, that MIQuant effectiveness is independent of user proficiency. The results indicate that MIQuant can be regarded as a better alternative to manual measurement. Conclusions: We conclude that MIQuant is a reliable and easy-to-use software for infarct size quantification. The widespread use of MIQuant will contribute towards the standardization of infarct size assessment across studies and, therefore, to the systematization of the evaluation of the cardiac regenerative potential of emerging therapies. PMID:21980376
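
    The agreement statistics used in the validation (Bland-Altman bias and 95% limits of agreement) are simple to reproduce for any pair of measurement methods; a generic sketch with invented paired measurements:

        import numpy as np

        def bland_altman(a, b):
            """Mean bias and 95% limits of agreement between two methods."""
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, bias - half_width, bias + half_width

        manual = [12.1, 18.4, 25.0, 31.7, 40.2]   # e.g., infarct size, manual planimetry
        software = [12.5, 18.1, 25.8, 31.0, 41.0]  # same hearts, semi-automated tool
        print(bland_altman(manual, software))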

  15. Simplification of a scoring system maintained overall accuracy but decreased the proportion classified as low risk.

    PubMed

    Sanders, Sharon; Flaws, Dylan; Than, Martin; Pickering, John W; Doust, Jenny; Glasziou, Paul

    2016-01-01

    Scoring systems are developed to assist clinicians in making a diagnosis. However, their uptake is often limited because they are cumbersome to use, requiring information on many predictors or complicated calculations. We examined whether, and how, simplifications affected the performance of a validated score for identifying adults with chest pain in an emergency department who have low risk of major adverse cardiac events. We simplified the Emergency Department Assessment of Chest pain Score (EDACS) by three methods: (1) giving equal weight to each predictor included in the score, (2) reducing the number of predictors, and (3) using both methods, giving equal weight to a reduced number of predictors. The diagnostic accuracy of the simplified scores was compared with the original score in the derivation (n = 1,974) and validation (n = 909) data sets. There was no difference in the overall accuracy of the simplified versions of the score compared with the original EDACS, as measured by the area under the receiver operating characteristic curve (0.74 to 0.75 for simplified versions vs. 0.75 for the original score in the validation cohort). With score cut-offs set to maintain the sensitivity of the combination of score and tests (electrocardiogram and cardiac troponin) at a level acceptable to clinicians (99%), simplification reduced the proportion of patients classified as low risk from 50% with the original score to between 22% and 42%. Simplification of a clinical score thus resulted in similar overall accuracy but reduced the proportion classified as low risk and therefore eligible for early discharge. Whether the trade-off is acceptable will depend on the context in which the score is to be used. Developers of clinical scores should consider simplification as a method to increase uptake, but further studies are needed to determine the best methods of deriving and evaluating simplified scores. Copyright © 2016 Elsevier Inc. All rights reserved.
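
    The first simplification method, equal weighting, amounts to replacing each predictor's derived weight with one point and re-checking discrimination. A schematic comparison on synthetic data follows; the predictors, weights, and outcome model are invented for illustration and are not the EDACS variables.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(1000, 8))        # eight binary predictors
        weights = np.array([6, 4, 4, 2, 2, 2, 1, 1])  # illustrative score weights
        logit = 0.4 * (X @ weights) - 5
        y = (rng.random(1000) < 1 / (1 + np.exp(-logit))).astype(int)

        original = X @ weights          # weighted sum, as in a derived score
        simplified = X.sum(axis=1)      # method 1: every predictor scores 1 point

        for name, s in (("original", original), ("equal-weight", simplified)):
            print(name, "AUC =", round(roc_auc_score(y, s), 3))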

  16. Simplification and its consequences in biological modelling: conclusions from a study of calcium oscillations in hepatocytes.

    PubMed

    Hetherington, James P J; Warner, Anne; Seymour, Robert M

    2006-04-22

    Systems Biology requires that biological modelling is scaled up from small components to system level. This can produce exceedingly complex models, which obscure understanding rather than facilitate it. The successful use of highly simplified models would resolve many of the current problems faced in Systems Biology. This paper questions whether the conclusions of simple mathematical models of biological systems are trustworthy. The simplification of a specific model of calcium oscillations in hepatocytes is examined in detail, and the conclusions drawn from this scrutiny generalized. We formalize our choice of simplification approach through the use of functional 'building blocks'. A collection of models is constructed, each a progressively more simplified version of a well-understood model. The limiting model is a piecewise linear model that can be solved analytically. We find that, as expected, in many cases the simpler models produce incorrect results. However, when we make a sensitivity analysis, examining which aspects of the behaviour of the system are controlled by which parameters, the conclusions of the simple model often agree with those of the richer model. The hypothesis that the simplified model retains no information about the real sensitivities of the unsimplified model can be very strongly ruled out by treating the simplification process as a pseudo-random perturbation on the true sensitivity data. We conclude that sensitivity analysis is, therefore, of great importance to the analysis of simple mathematical models in biology. Our comparisons reveal which results of the sensitivity analysis regarding calcium oscillations in hepatocytes are robust to the simplifications necessarily involved in mathematical modelling. For example, we find that if a treatment is observed to strongly decrease the period of the oscillations while increasing the proportion of the cycle during which cellular calcium concentrations are rising, without affecting the inter-spike or maximum calcium concentrations, then it is likely that the treatment is acting on the plasma membrane calcium pump.
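
    The sensitivity comparison at the heart of this argument can be made concrete: for any scalar model output f(p), such as the oscillation period, normalized sensitivities can be approximated by central differences and then compared between the full and the simplified model. A generic sketch, in which a quadratic toy function stands in for an ODE solver:

        import numpy as np

        def normalized_sensitivities(f, params, rel_step=1e-4):
            """S_i = (p_i / f(p)) * df/dp_i, via central differences."""
            base = f(params)
            sens = np.empty(len(params))
            for i, p in enumerate(params):
                up, dn = params.copy(), params.copy()
                up[i] = p * (1 + rel_step)
                dn[i] = p * (1 - rel_step)
                sens[i] = (p / base) * (f(up) - f(dn)) / (2 * p * rel_step)
            return sens

        # Toy stand-in for "oscillation period as a function of parameters".
        period = lambda p: p[0] ** 2 / p[1]
        p0 = np.array([2.0, 0.5])
        print(normalized_sensitivities(period, p0))  # expect about [2, -1]

    Agreement in the sign and ranking of such sensitivities between the full and simplified models is the kind of robustness the study tests.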

  17. Closed-form solutions in stress-driven two-phase integral elasticity for bending of functionally graded nano-beams

    NASA Astrophysics Data System (ADS)

    Barretta, Raffaele; Fabbrocino, Francesco; Luciano, Raimondo; Sciarra, Francesco Marotti de

    2018-03-01

    Strain-driven and stress-driven integral elasticity models are formulated for the analysis of the structural behaviour of functionally graded nano-beams. An innovative stress-driven two-phase constitutive mixture, defined by a convex combination of local and nonlocal phases, is presented. The analysis reveals that the Eringen strain-driven fully nonlocal model cannot be used in Structural Mechanics since it is ill-posed, and that local-nonlocal mixtures based on the Eringen integral model only partially resolve the ill-posedness: a singular behaviour of continuous nano-structures appears as the local fraction tends to vanish, so the ill-posedness of the Eringen integral model is not eliminated. On the contrary, local-nonlocal mixtures based on the stress-driven theory are mathematically and mechanically appropriate for nanosystems. Exact solutions of inflected functionally graded nanobeams of technical interest are established by adopting the new local-nonlocal mixture stress-driven integral relation. Effectiveness of the new nonlocal approach is tested by comparing the contributed results with the ones corresponding to the mixture Eringen theory.

  18. Sediment transport through self-adjusting, bedrock-walled waterfall plunge pools

    NASA Astrophysics Data System (ADS)

    Scheingross, Joel S.; Lamb, Michael P.

    2016-05-01

    Many waterfalls have deep plunge pools that are often partially or fully filled with sediment. Sediment fill may control plunge-pool bedrock erosion rates, partially determine habitat availability for aquatic organisms, and affect sediment routing and debris flow initiation. Currently, there exists no mechanistic model to describe sediment transport through waterfall plunge pools. Here we develop an analytical model to predict steady-state plunge-pool depth and sediment-transport capacity by combining existing jet theory with sediment transport mechanics. Our model predicts plunge-pool sediment-transport capacity increases with increasing river discharge, flow velocity, and waterfall drop height and decreases with increasing plunge-pool depth, radius, and grain size. We tested the model using flume experiments under varying waterfall and plunge-pool geometries, flow hydraulics, and sediment size. The model and experiments show that through morphodynamic feedbacks, plunge pools aggrade to reach shallower equilibrium pool depths in response to increases in imposed sediment supply. Our theory for steady-state pool depth matches the experiments with an R2 value of 0.8, with discrepancies likely due to model simplifications of the hydraulics and sediment transport. Analysis of 75 waterfalls suggests that the water depths in natural plunge pools are strongly influenced by upstream sediment supply, and our model provides a mass-conserving framework to predict sediment and water storage in waterfall plunge pools for sediment routing, habitat assessment, and bedrock erosion modeling.

  19. Impedance-based overcharging and gassing model for VRLA/AGM batteries

    NASA Astrophysics Data System (ADS)

    Thele, M.; Karden, E.; Surewaard, E.; Sauer, D. U.

    This paper presents for the first time an impedance-based non-linear model for lead-acid batteries that is applicable in all operational modes. An overcharging model describes the accumulation and depletion of the dissolved Pb2+ ions. This physical model has been added to the previously presented model to extend its validity. To properly represent the charge acceptance during dynamic operation, a concept of "hardening crystals" has been introduced in the model. Moreover, a detailed gassing and oxygen recombination model has been integrated. A realistic simulation of the overcharging behavior is now possible. The mathematical description is given in the paper. Simplifications are introduced that allow for an efficient implementation and for model parameterization in the time domain. A comparison between experimental data and simulation results demonstrates the achieved accuracy. The model enhancement is of major importance for analyzing charging strategies, especially in partial-cycling operation with limited charging time, e.g. in electrically assisted or hybrid cars and autonomous power supply systems.

  20. SP-100 - The national space reactor power system program in response to future needs

    NASA Astrophysics Data System (ADS)

    Armijo, J. S.; Josloff, A. T.; Bailey, H. S.; Matteo, D. N.

    The SP-100 system has been designed to meet comprehensive and demanding NASA/DOD/DOE requirements. The key requirements include: nuclear safety for all mission phases, scalability from 10's to 100's of kWe, reliable performance at full power for seven years or at partial power for ten years, survivability in civil or military threat environments, capability to operate autonomously for up to six months, capability to protect payloads from excessive radiation, and compatibility with shuttle and expendable launch vehicles. The authors address major progress in terms of design, flexibility/scalability, survivability, and development. These areas, with the exception of survivability, are discussed in detail. There has been significant improvement in the generic flight system design, with substantial mass savings and simplification that enhance performance and reliability. Design activity has confirmed the scalability and flexibility of the system and the ability to efficiently meet NASA, AF, and SDIO needs. SP-100 development continues to make significant progress in all key technology areas.

  1. Vibroacoustic test plan evaluation: Parameter variation study

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloef, H. R.

    1976-01-01

    Statistical decision models are shown to provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology developed provides a major step toward a realistic tool for quantitatively tailoring test programs to specific payloads. Testing is considered at the no-test, component, subassembly, or system level of assembly. Component redundancy and partial loss of flight data are considered. Most and probabilistic costs are considered, and incipient failures resulting from ground tests are treated. Optima defining both component and assembly test levels are indicated for the modified test plans considered. Modeling simplifications must be considered when interpreting the results relative to a particular payload. New parameters introduced were a no-test option, flight-by-flight failure probabilities, and a cost to design components for higher vibration requirements. Parameters varied were the shuttle payload bay internal acoustic environment, the STS launch cost, the component retest/repair cost, and the amount of redundancy in the housekeeping section of the payload reliability model.

  2. Noninvasive imaging of oral mucosae with optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lee, Cheng-Yu; Chen, Wei-Chuan; Tsai, Meng-Tsan

    2017-04-01

    In this study, a swept-source optical coherence tomography (OCT) system is developed for in vivo visualization of the structural and vascular morphology of oral mucosa. To simplify optical probe fabrication, reduce probe weight, and simplify the system setup, the body of the scanning probe is fabricated by a 3D printer to hold the optical components and the mechanical scanning device, and a partially reflective slide is attached at the output end of the probe to achieve a common-path configuration. Aside from providing 3D structural imaging, the developed system can simultaneously obtain 3D vascular images of oral mucosa. Different locations of oral mucosa were then scanned with common-path OCT. The results show that epithelium and lamina propria layers as well as fungiform papillae can be identified, and microvascular images can be acquired. With the proposed probe, the system cost and volume can be greatly reduced. Experimental results indicate that such a common-path OCT system could be further implemented for oral cancer diagnosis.

  3. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis

    PubMed Central

    Garrison, Kathleen A.; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J.; Aziz-Zadeh, Lisa S.

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant’s structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant’s non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design. PMID:26441816

  4. Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology

    NASA Astrophysics Data System (ADS)

    Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.

    2015-03-01

    In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High quality components, such as center-turned focusing units, as well as suitable assembly strategies, are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are at an economic disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the equipment and software needed to enable hybrid assembly processes. Micromanipulator technology with high step-resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment and to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures consistently high quality, documents the assembly process, and is a basis for further improvement. The hybrid assembly technology has been applied in several applications, achieving efficiencies above 80%, and will be discussed in this paper. High coupling efficiency has been achieved with a minimized assembly effort as a result of semi-automated alignment. This paper will focus on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.
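
    The automated part of such an alignment is, in essence, a search over manipulator axes for the coupling-efficiency peak. The following coordinate-scan sketch is only schematic; the measurement callback, step sizes, and shrink factor are placeholders, and a production system would add backlash handling and travel limits.

        def align(measure, position, steps, shrink=0.5, rounds=4):
            """Greedy per-axis scan: step each axis while the measured coupling
            efficiency improves, then refine with smaller steps."""
            position = list(position)
            best = measure(position)
            for _ in range(rounds):
                for axis in range(len(position)):
                    for direction in (+1, -1):
                        while True:
                            trial = list(position)
                            trial[axis] += direction * steps[axis]
                            value = measure(trial)
                            if value > best:
                                position, best = trial, value
                            else:
                                break
                steps = [s * shrink for s in steps]
            return position, best

        # Toy efficiency surface peaking at (0.2, -0.1, 0.0):
        peak = (0.2, -0.1, 0.0)
        eff = lambda p: 0.85 * 2.718 ** (-sum((a - b) ** 2 for a, b in zip(p, peak)))
        print(align(eff, [0.0, 0.0, 0.0], [0.05, 0.05, 0.05]))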

  5. Structural insights into the light-driven auto-assembly process of the water-oxidizing Mn 4CaO 5-cluster in photosystem II

    DOE PAGES

    Zhang, Miao; Bommer, Martin; Chatterjee, Ruchira; ...

    2017-07-18

    In plants, algae and cyanobacteria, Photosystem II (PSII) catalyzes the light-driven splitting of water at a protein-bound Mn4CaO5-cluster, the water-oxidizing complex (WOC). In the photosynthetic organisms, the light-driven formation of the WOC from dissolved metal ions is a key process because it is essential in both initial activation and continuous repair of PSII. Structural information is required for understanding of this chaperone-free metal-cluster assembly. For the first time, we obtained a structure of PSII from Thermosynechococcus elongatus without the Mn4CaO5-cluster. Surprisingly, cluster-removal leaves the positions of all coordinating amino acid residues and most nearby water molecules largely unaffected, resulting in a pre-organized ligand shell for kinetically competent and error-free photo-assembly of the Mn4CaO5-cluster. First experiments initiating (i) partial disassembly and (ii) partial re-assembly after complete depletion of the Mn4CaO5-cluster agree with a specific bi-manganese cluster, likely a di-µ-oxo bridged pair of Mn(III) ions, as an assembly intermediate.

  6. Structural insights into the light-driven auto-assembly process of the water-oxidizing Mn4CaO5-cluster in photosystem II.

    PubMed

    Zhang, Miao; Bommer, Martin; Chatterjee, Ruchira; Hussein, Rana; Yano, Junko; Dau, Holger; Kern, Jan; Dobbek, Holger; Zouni, Athina

    2017-07-18

    In plants, algae and cyanobacteria, Photosystem II (PSII) catalyzes the light-driven splitting of water at a protein-bound Mn4CaO5-cluster, the water-oxidizing complex (WOC). In the photosynthetic organisms, the light-driven formation of the WOC from dissolved metal ions is a key process because it is essential in both initial activation and continuous repair of PSII. Structural information is required for understanding of this chaperone-free metal-cluster assembly. For the first time, we obtained a structure of PSII from Thermosynechococcus elongatus without the Mn4CaO5-cluster. Surprisingly, cluster-removal leaves the positions of all coordinating amino acid residues and most nearby water molecules largely unaffected, resulting in a pre-organized ligand shell for kinetically competent and error-free photo-assembly of the Mn4CaO5-cluster. First experiments initiating (i) partial disassembly and (ii) partial re-assembly after complete depletion of the Mn4CaO5-cluster agree with a specific bi-manganese cluster, likely a di-µ-oxo bridged pair of Mn(III) ions, as an assembly intermediate.

  7. Capillary-Driven Microfluidic Chips for Miniaturized Immunoassays: Efficient Fabrication and Sealing of Chips Using a "Chip-Olate" Process.

    PubMed

    Temiz, Yuksel; Delamarche, Emmanuel

    2017-01-01

    The fabrication of silicon-based microfluidic chips is invaluable in supporting the development of many microfluidic concepts for research in the life sciences and in vitro diagnostic applications, such as the realization of miniaturized immunoassays using capillary-driven chips. While extremely abundant, the literature covering microfluidic chip fabrication and assay development may not have properly addressed the challenge of fabricating microfluidic chips at the wafer level, or the need to dice wafers to release chips that must then be further processed, cleaned, rinsed, and dried one by one. Here, we describe the "chip-olate" process, wherein microfluidic structures are formed on a silicon wafer, followed by partial dicing, cleaning, and drying steps. Then, integration of reagents (if any) can be done, followed by lamination of a sealing cover. Breaking the partially diced wafer by hand yields individual chips ready for use.

  8. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Summary: Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328

  9. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jose Reyes

    In recent years it has been recognized that the application of passive safety systems (i.e., those whose operation takes advantage of natural forces such as convection and gravity) can contribute to simplification and potentially to improved economics of new nuclear power plant designs. In 1991 the IAEA Conference on "The Safety of Nuclear Power: Strategy for the Future" noted that for new plants "the use of passive safety features is a desirable method of achieving simplification and increasing the reliability of the performance of essential safety functions, and should be used wherever appropriate".

  11. Novel limiter pump topologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, J.H.

    1981-01-01

    The use of limiter pumps as the principal plasma exhaust system of a magnetic confinement fusion device promises significant simplification when compared to previously investigated divertor-based systems. Further simplifications, such as the integration of the exhaust system with a radio frequency heating system and with the main reactor shield and structure, are investigated below. The integrity of limiters in a reactor environment is threatened by many mechanisms, the most severe of which may be erosion by sputtering. Two novel topologies are suggested which allow high erosion without limiter failure.

  12. An Agent Based Collaborative Simplification of 3D Mesh Model

    NASA Astrophysics Data System (ADS)

    Wang, Li-Rong; Yu, Bo; Hagiwara, Ichiro

    Large-volume mesh models face challenges in fast rendering and transmission over the Internet. The mesh models currently obtained by using three-dimensional (3D) scanning technology are usually very large in data volume. This paper develops a mobile-agent-based collaborative environment on the development platform of mobile-C. Communication among distributed agents includes grasping images of the visualized mesh model, annotation of grasped images, and instant messaging. Remote and collaborative simplification can be efficiently conducted over the Internet.

  13. An Automated Test of Rat Forelimb Supination Quantifies Motor Function Loss and Recovery After Corticospinal Injury.

    PubMed

    Sindhurakar, Anil; Butensky, Samuel D; Meyers, Eric; Santos, Joshua; Bethea, Thelma; Khalili, Ashley; Sloan, Andrew P; Rennaker, Robert L; Carmel, Jason B

    2017-02-01

    Rodents are the primary animal model of corticospinal injury and repair, yet current behavioral tests do not show the large deficits after injury observed in humans. Forearm supination is critical for hand function and is highly impaired by corticospinal injury in both humans and rats. Current tests of rodent forelimb function do not measure this movement. To determine if quantification of forelimb supination in rats reveals large-scale functional loss and partial recovery after corticospinal injury. We developed a knob supination device that quantifies supination using automated and objective methods. Rats in a reaching box have to grasp and turn a knob in supination in order to receive a food reward. Performance on this task and the single pellet reaching task were measured before and after 2 manipulations of the pyramidal tract: a cut lesion of 1 pyramid and inactivation of motor cortex using 2 different drug doses. A cut lesion of the corticospinal tract produced a large deficit in supination. In contrast, there was no change in pellet retrieval success. Supination function recovered partially over 6 weeks after injury, and a large deficit remained. Motor cortex inactivation produced a dose-dependent loss of knob supination; the effect on pellet reaching was more subtle. The knob supination task reveals in rodents 3 signature hand function changes observed in humans with corticospinal injury: (1) large-scale loss with injury, (2) partial recovery in the weeks after injury, and (3) loss proportional to degree of dysfunction.

  14. Laser-driven acceleration of electrons in a partially ionized plasma channel.

    PubMed

    Rowlands-Rees, T P; Kamperidis, C; Kneip, S; Gonsalves, A J; Mangles, S P D; Gallacher, J G; Brunetti, E; Ibbotson, T; Murphy, C D; Foster, P S; Streeter, M J V; Budde, F; Norreys, P A; Jaroszynski, D A; Krushelnick, K; Najmudin, Z; Hooker, S M

    2008-03-14

    The generation of quasimonoenergetic electron beams, with energies up to 200 MeV, by a laser-plasma accelerator driven in a hydrogen-filled capillary discharge waveguide is investigated. Injection and acceleration of electrons is found to depend sensitively on the delay between the onset of the discharge current and the arrival of the laser pulse. A comparison of spectroscopic and interferometric measurements suggests that injection is assisted by laser ionization of atoms or ions within the channel.

  15. Kinetics and mechanics of clot contraction are governed by the molecular and cellular composition of the blood.

    PubMed

    Tutwiler, Valerie; Litvinov, Rustem I; Lozhkin, Andrey P; Peshkova, Alina D; Lebedeva, Tatiana; Ataullakhanov, Fazoil I; Spiller, Kara L; Cines, Douglas B; Weisel, John W

    2016-01-07

    Platelet-driven blood clot contraction (retraction) is thought to promote wound closure and secure hemostasis while preventing vascular occlusion. Notwithstanding its importance, clot contraction remains a poorly understood process, partially because of the lack of methodology to quantify its dynamics and requirements. We used a novel automated optical analyzer to continuously track in vitro changes in the size of contracting clots in whole blood and in variously reconstituted samples. Kinetics of contraction was complemented with dynamic rheometry to characterize the viscoelasticity of contracting clots. This combined approach enabled investigation of the coordinated mechanistic impact of platelets, including nonmuscle myosin II, red blood cells (RBCs), fibrin(ogen), factor XIIIa (FXIIIa), and thrombin on the kinetics and mechanics of the contraction process. Clot contraction is composed of 3 sequential phases, each characterized by a distinct rate constant. Thrombin, Ca(2+), the integrin αIIbβ3, myosin IIa, FXIIIa cross-linking, and platelet count all promote 1 or more phases of the clot contraction process. In contrast, RBCs impair contraction and reduce elasticity, while increasing the overall contractile stress generated by the platelet-fibrin meshwork. A better understanding of the mechanisms by which blood cells, fibrin(ogen), and platelet-fibrin interactions modulate clot contraction may generate novel approaches to reveal and to manage thrombosis and hemostatic disorders. © 2016 by The American Society of Hematology.

  16. Kinetics and mechanics of clot contraction are governed by the molecular and cellular composition of the blood

    PubMed Central

    Tutwiler, Valerie; Litvinov, Rustem I.; Lozhkin, Andrey P.; Peshkova, Alina D.; Lebedeva, Tatiana; Ataullakhanov, Fazoil I.; Spiller, Kara L.; Cines, Douglas B.

    2016-01-01

    Platelet-driven blood clot contraction (retraction) is thought to promote wound closure and secure hemostasis while preventing vascular occlusion. Notwithstanding its importance, clot contraction remains a poorly understood process, partially because of the lack of methodology to quantify its dynamics and requirements. We used a novel automated optical analyzer to continuously track in vitro changes in the size of contracting clots in whole blood and in variously reconstituted samples. Kinetics of contraction was complemented with dynamic rheometry to characterize the viscoelasticity of contracting clots. This combined approach enabled investigation of the coordinated mechanistic impact of platelets, including nonmuscle myosin II, red blood cells (RBCs), fibrin(ogen), factor XIIIa (FXIIIa), and thrombin on the kinetics and mechanics of the contraction process. Clot contraction is composed of 3 sequential phases, each characterized by a distinct rate constant. Thrombin, Ca2+, the integrin αIIbβ3, myosin IIa, FXIIIa cross-linking, and platelet count all promote 1 or more phases of the clot contraction process. In contrast, RBCs impair contraction and reduce elasticity, while increasing the overall contractile stress generated by the platelet-fibrin meshwork. A better understanding of the mechanisms by which blood cells, fibrin(ogen), and platelet-fibrin interactions modulate clot contraction may generate novel approaches to reveal and to manage thrombosis and hemostatic disorders. PMID:26603837

  17. Design and Fabrication of a Real-Time Measurement System for the Capsaicinoid Content of Korean Red Pepper (Capsicum annuum L.) Powder by Visible and Near-Infrared Spectroscopy.

    PubMed

    Lim, Jongguk; Kim, Giyoung; Mo, Changyeun; Kim, Moon S

    2015-10-29

    This research aims to design and fabricate a system to measure the capsaicinoid content of red pepper powder in a non-destructive and rapid manner using visible and near-infrared (VNIR) spectroscopy. The developed system scans a well-leveled powder surface continuously to minimize the influence of the placenta distribution, thus acquiring stable and representative reflectance spectra. The system incorporates flat belts driven by a sample input hopper and stepping motor, a powder surface leveler, a charge-coupled device (CCD) image sensor-embedded VNIR spectrometer, a fiber optic probe and tungsten halogen lamp, and an automated reference measuring unit with a reference panel to measure the standard spectrum. The operation program includes the device interface, standard reflectivity measurement, and a graphical user interface to measure the capsaicinoid content. A partial least squares regression (PLSR) model was developed to predict the capsaicinoid content; 44 red pepper powder samples, whose measured capsaicinoid content ranged from 13.45 to 159.48 mg/100 g by high-performance liquid chromatography (HPLC), and 1242 VNIR absorbance spectra acquired by the pungency measurement system were used. The determination coefficient of validation (RV2) and standard error of prediction (SEP) for the model with the first-order derivative pretreatment method for Korean red pepper powder were 0.8484 and ±13.6388 mg/100 g, respectively.
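
    The calibration step pairs each spectrum with its HPLC reference value. A schematic PLSR calibration using scikit-learn is sketched below; the synthetic arrays, shapes, and component count are invented stand-ins for the real spectra and reference data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        # Placeholder data: 44 samples, 256 wavelengths, reference values
        # loosely tied to the first few spectral channels for illustration.
        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(44, 256))
        capsaicinoid = 80 + 5 * spectra[:, :10].sum(axis=1) + rng.normal(0, 2, 44)

        X_cal, X_val, y_cal, y_val = train_test_split(
            spectra, capsaicinoid, random_state=0)
        pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
        print("validation R2 =", round(pls.score(X_val, y_val), 3))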

  18. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious demands on application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
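
    The step from a formal state machine to abstract test cases can be illustrated by a transition-coverage walk: enumerate every transition once and record the (input, expected output, expected state) triples. A toy sketch with an invented machine; real model-based testing tools additionally construct paths to reach each start state.

        # Toy Mealy machine: {state: {input: (next_state, expected_output)}}
        machine = {
            "IDLE":    {"start": ("RUNNING", "ack")},
            "RUNNING": {"stop": ("IDLE", "done"), "tick": ("RUNNING", "data")},
        }

        def transition_tests(machine):
            """One abstract test case per transition; reaching the given state
            is left to a setup step in the concrete test."""
            tests = []
            for state, edges in machine.items():
                for inp, (nxt, out) in edges.items():
                    tests.append({"given": state, "when": inp,
                                  "expect_state": nxt, "expect_output": out})
            return tests

        for case in transition_tests(machine):
            print(case)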

  19. A Digital Microfluidics Platform for Loop-Mediated Isothermal Amplification Detection

    PubMed Central

    Veigas, Bruno; Águas, Hugo; Fortunato, Elvira; Martins, Rodrigo; Baptista, Pedro Viana; Igreja, Rui

    2017-01-01

    Digital microfluidics (DMF) arises as the next step in the fast-evolving field of operation platforms for molecular diagnostics. Moreover, isothermal schemes, such as loop-mediated isothermal amplification (LAMP), allow for further simplification of amplification protocols. Integrating DMF with LAMP will be at the core of a new generation of detection devices for effective molecular diagnostics at point-of-care (POC), providing simple, fast, and automated nucleic acid amplification with exceptional integration capabilities. Here, we demonstrate for the first time the coupling of DMF and LAMP in a dedicated device that allows straightforward mixing of LAMP reagents and target DNA, as well as optimum temperature control (reaction droplets undergo a temperature variation of just 0.3 °C for 65 °C at the bottom plate). This device is produced using low-temperature and low-cost production processes, adaptable to disposable and flexible substrates. DMF-LAMP is performed with enhanced sensitivity without compromising reaction efficacy or losing reliability and efficiency, LAMP-amplifying 0.5 ng/µL of target DNA in just 45 min. Moreover, on-chip LAMP was performed in 1.5 µL, a considerably lower volume than standard bench-top reactions. PMID:29144379

  20. Automated platelet collection using the latest apheresis devices in an Indian setting.

    PubMed

    Agarwal, Prashant; Verma, Anupam

    2009-10-01

    In a developing nation like India, where there is a scarcity of resources and voluntary donors, provision of safe and good quality blood and its components is a huge challenge. The demand for platelets is increasing constantly due to better management of various patient categories, specifically hemato-oncological cases, where there is an increased demand for platelet transfusion. The use of apheresis single donor platelets (SDPs) has been attributed to the increased gap between demand and supply of whole blood derived random donor platelets (RDPs). Moreover, the other benefits of SDPs, such as decreased donor exposure and simplification of inventory management, cannot be overlooked. However, the increased costs and logistic problems, compounded by a lack of awareness, limit donor recruitment and procedures for SDPs. In the Indian scenario, there are no specific guidelines or standards available which can be followed while simultaneously addressing the associated problems. In this review, we have tried to analyze the various problems of donor selection, donor safety, and the quality issues regarding plateletpheresis. Based on this, we give certain recommendations which might help centers in resolving the problems related to plateletpheresis.

  1. Tsunamigenic Gravity Waves in the Thermosphere-Ionosphere System: Challenges and Opportunities (Invited)

    NASA Astrophysics Data System (ADS)

    Hickey, M. P.

    2010-12-01

    There has been a recent resurgence of interest in the association between tsunamis and traveling ionospheric disturbances (TIDs), fueled in part by the use of GPS satellite technologies to remotely monitor the ionosphere. The TID observations have also triggered a renewed interest in the modeling of such events. To date, the various model simulations have incorporated various simplifications, some of which are briefly described. A future challenge is to bring together suites of models that each realistically describe one of the subsystems. In this talk I will describe the results of using a linear spectral full-wave model to simulate the propagation of a gravity wave disturbance from the sea surface to the thermosphere. In the model this disturbance is driven by a lower boundary perturbation that mimics a tsunami. A linear model describing the response of the ionosphere to neutral atmosphere perturbations, and airglow perturbations driven by ionosphere and neutral atmosphere fluctuations, is also described. Additionally, the gravity wave disturbance carries wave momentum, which will be deposited in the thermosphere as wave energy is viscously dissipated, leading to accelerations of the mean state. In spite of the simplicity of these models, much can be learned from them. It is suggested that these rare events offer a fairly unique opportunity to test models describing such processes. Model predictions of total electron content (TEC) fluctuations are also briefly compared with TEC measurements obtained following some recent major tsunamis.

  2. A study into the automation of cognitive assessment tasks for delivery via the telephone: lessons for developing remote monitoring applications for the elderly.

    PubMed

    D'Arcy, Shona; Rapcan, Viliam; Gali, Alessandra; Burke, Nicola; O'Connell, Gloria Crispino; Robertson, Ian H; Reilly, Richard B

    2013-01-01

    Cognitive assessments are valuable tools in assessing neurological conditions. They are critical in measuring deficits in cognitive function in an array of neurological disorders and during the ageing process. Automation of cognitive assessments is one way to address the increasing burden on medical resources from an ever increasing ageing population. This study investigated the suitability of using automated Interactive Voice Response (IVR) technology to deliver a suite of cognitive assessments to older adults using speech as the input modality. Several clinically valid and gold-standard cognitive assessments were selected for implementation in the IVR application. The IVR application was designed using human-centred design principles to make the experience as user friendly as possible. Sixty-one participants completed two IVR assessments and one face-to-face (FF) assessment with a neuropsychologist. Completion rates for individual tests were inspected to identify those tests that are most suitable for administration via IVR technology. Interclass correlations were calculated to assess the reliability of the automated administration of the cognitive assessments across delivery modes. While all participants successfully completed all automated assessments, variability in the completion rates for different cognitive tests was observed. Statistical analysis found significant interclass correlations for certain cognitive tests between the different modes of administration. Analysis also suggests that an initial FF assessment reduces the variability in cognitive test scores when introducing automation into such an assessment. This study has demonstrated the functional and cognitive reliability of administering specific cognitive tests using an automated, speech-driven application. It has defined the characteristics of existing cognitive tests that are suitable for such an automated delivery system and also informs on the limitations of other cognitive tests for this modality. This study presents recommendations for developing future large-scale cognitive assessments.

  3. High-content imaging for automated determination of host-cell infection rate by the intracellular parasite Trypanosoma cruzi.

    PubMed

    Nohara, L L; Lema, C; Bader, J O; Aguilera, R J; Almeida, I C

    2010-12-01

    Chagas disease affects 8-11 million people, mostly in Latin America. Sequelae include cardiac, peripheral nervous and/or gastrointestinal disorders, thus placing a large economic and social burden on endemic countries. The pathogenesis and the evolutive pattern of the disease are not fully clarified. Moreover, available drugs are partially effective and toxic, and there is no vaccine. Therefore, there is an urgent need to speed up basic and translational research in the field. Here, we applied automated high-content imaging to generate multiparametric data on a cell-by-cell basis to precisely and quickly determine several parameters associated with in vitro infection of host cell by Trypanosoma cruzi, the causative agent of Chagas disease. Automated and manual quantifications were used to determine the percentage of T. cruzi-infected cells in a 96-well microplate format and the data generated was statistically evaluated. Most importantly, this automated approach can be widely applied for discovery of potential drugs as well as molecular pathway elucidation not only in T. cruzi but also in other human intracellular pathogens. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
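
    Per-cell measurements from high-content imaging reduce to an infection rate by thresholding a parasite signal per host cell and aggregating per well. A schematic pandas reduction is shown below; the table and column names are invented for illustration, not the platform's actual export format.

        import pandas as pd

        # One row per segmented host cell, as exported by an imaging platform.
        cells = pd.DataFrame({
            "well":           ["B2", "B2", "B2", "B3", "B3"],
            "parasite_spots": [0, 3, 5, 0, 0],   # intracellular parasites per cell
        })

        cells["infected"] = cells["parasite_spots"] > 0
        rates = cells.groupby("well")["infected"].mean() * 100
        print(rates)   # percent infected host cells per well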

  4. Psychosocial factors associated with intended use of automated vehicles: A simulated driving study.

    PubMed

    Buckley, Lisa; Kaye, Sherrie-Anne; Pradhan, Anuj K

    2018-06-01

    This study applied the Theory of Planned Behavior (TPB) and the Technology Acceptance Model (TAM) to assess drivers' intended use of automated vehicles (AVs) after undertaking a simulated driving task. In addition, this study explored the potential for trust to account for variance additional to the psychosocial factors in TPB and TAM. Seventy-four participants (51% female) aged between 25 and 64 years (M = 42.8, SD = 12.9) undertook a 20 min simulated experimental drive in which participants experienced periods of automated driving and manual control. A survey task followed. A hierarchical regression analysis revealed that the TPB constructs (attitude toward the behavior, subjective norms, and perceived behavioral control) were significant predictors of intentions to use AVs. In addition, there was partial support for the test of TAM, with ease of use (but not usefulness) predicting intended use of AVs (SAE Level 3). Trust contributed variance to both models beyond the TPB and TAM constructs. The findings provide an important insight into factors that might reflect intended use of vehicles that are primarily automated (longitudinal, lateral, and manoeuvre controls) but require and allow drivers to have periods of manual control. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Augmenting SCA project management and automation framework

    NASA Astrophysics Data System (ADS)

    Iyapparaja, M.; Sharma, Bhanupriya

    2017-11-01

    In daily work, records of materials and processes must be kept in order to manage them efficiently and properly. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells an intermediate product obtained during manufacturing; better management of the entire process therefore requires keeping track records of all the entities involved. Materials and Methods: To overcome this problem, a framework was developed for project maintenance and for automation testing. The project management framework provides an architecture which supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years. Through this, the quality of the project can be maintained. Results: The automation framework provides an architecture which supports the development and implementation of automated test scripts for the software testing process. Conclusion: To implement the project management framework, HP's Application Lifecycle Management is used, which provides a central repository to maintain the project.

  6. Electric Power Distribution System Model Simplification Using Segment Substitution

    DOE PAGES

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; ...

    2017-09-20

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). Finally, in contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.
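
    In spirit, segment substitution replaces a detailed multi-bus segment with a black-box surrogate fitted from terminal measurements. A minimal sketch of fitting such a surrogate by least squares is shown below; the linear form, per-unit figures, and variable names are illustrative assumptions, not the paper's exact formulation.

        import numpy as np

        # Terminal measurements of a feeder segment across several QSTS steps:
        # sending-end voltage V_in, segment current I, receiving-end voltage V_out.
        V_in  = np.array([1.02, 1.01, 1.00, 0.99, 0.98])   # per-unit, illustrative
        I     = np.array([0.10, 0.20, 0.35, 0.50, 0.60])
        V_out = np.array([1.015, 1.000, 0.983, 0.964, 0.952])

        # Fit a linear surrogate V_out ~ a*V_in + b*I to stand in for the segment.
        A = np.column_stack([V_in, I])
        (a, b), *_ = np.linalg.lstsq(A, V_out, rcond=None)
        print(f"V_out ~ {a:.3f}*V_in + {b:.3f}*I")  # b approximates -(segment impedance)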

  7. Landscape simplification filters species traits and drives biotic homogenization

    PubMed Central

    Gámez-Virués, Sagrario; Perović, David J.; Gossner, Martin M.; Börschig, Carmen; Blüthgen, Nico; de Jong, Heike; Simons, Nadja K.; Klein, Alexandra-Maria; Krauss, Jochen; Maier, Gwen; Scherber, Christoph; Steckel, Juliane; Rothenwöhrer, Christoph; Steffan-Dewenter, Ingolf; Weiner, Christiane N.; Weisser, Wolfgang; Werner, Michael; Tscharntke, Teja; Westphal, Catrin

    2015-01-01

    Biodiversity loss can affect the viability of ecosystems by decreasing the ability of communities to respond to environmental change and disturbances. Agricultural intensification is a major driver of biodiversity loss and has multiple components operating at different spatial scales: from in-field management intensity to landscape-scale simplification. Here we show that landscape-level effects dominate functional community composition and can even buffer the effects of in-field management intensification on functional homogenization, and that animal communities in real-world managed landscapes show a unified response (across orders and guilds) to both landscape-scale simplification and in-field intensification. Adults and larvae with specialized feeding habits, species with shorter activity periods and relatively small body sizes are selected against in simplified landscapes with intense in-field management. Our results demonstrate that the diversity of land cover types at the landscape scale is critical for maintaining communities, which are functionally diverse, even in landscapes where in-field management intensity is high. PMID:26485325

  8. Simplification of the DPPH assay for estimating the antioxidant activity of wine and wine by-products.

    PubMed

    Carmona-Jiménez, Yolanda; García-Moreno, M Valme; Igartuburu, Jose M; Garcia Barroso, Carmelo

    2014-12-15

    The DPPH assay is one of the most commonly employed methods for measuring antioxidant activity. Even though this method is considered very simple and efficient, it does present various limitations which make it complicated to perform. The range of linearity between the DPPH inhibition percentage and sample concentration has been studied with a view to simplifying the method for characterising samples of wine origin. It has been concluded that all the samples are linear in a range of inhibition below 40%, which allows the analysis to be simplified. A new parameter more appropriate for the simplification, the EC20, has been proposed to express the assay results. Additionally, the reaction time was analysed with the object of avoiding the need for kinetic studies in the method. The simplifications considered offer a more functional method, without significant errors, which could be used for routine analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
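
    The proposed simplification amounts to fitting only the linear region of the inhibition curve and reading off the concentration at 20% inhibition. A minimal sketch follows, assuming inhibition remains linear below the 40% bound reported above; the dilution-series data are illustrative, not the paper's.

      import numpy as np

      def ec20(concentrations, inhibition_pct, linear_limit=40.0):
          """Estimate EC20 (concentration giving 20% DPPH inhibition)
          from a linear fit restricted to the reported linear range."""
          c = np.asarray(concentrations, dtype=float)
          i = np.asarray(inhibition_pct, dtype=float)
          mask = i < linear_limit                  # keep the linear region only
          slope, intercept = np.polyfit(c[mask], i[mask], 1)
          return (20.0 - intercept) / slope        # invert I = slope*c + intercept

      # Illustrative dilution series of a wine extract (mg/mL vs % inhibition).
      print(ec20([0.1, 0.2, 0.4, 0.8, 1.6], [4.8, 9.6, 19.5, 38.2, 61.0]))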

  9. Simplification of Markov chains with infinite state space and the mathematical theory of random gene expression bursts.

    PubMed

    Jia, Chen

    2017-09-01

    Here we develop an effective approach to simplify two-time-scale Markov chains with infinite state spaces by removal of states with fast leaving rates, which improves the simplification method of finite Markov chains. We introduce the concept of fast transition paths and show that the effective transitions of the reduced chain can be represented as the superposition of the direct transitions and the indirect transitions via all the fast transition paths. Furthermore, we apply our simplification approach to the standard Markov model of single-cell stochastic gene expression and provide a mathematical theory of random gene expression bursts. We give the precise mathematical conditions for the bursting kinetics of both mRNAs and proteins. It turns out that random bursts exactly correspond to the fast transition paths of the Markov model. This helps us gain a better understanding of the physics behind the bursting kinetics as an emergent behavior from the fundamental multiscale biochemical reaction kinetics of stochastic gene expression.
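
    For a finite chain, the superposition of direct transitions and indirect transitions via a single fast state reduces to the standard state-elimination formula, sketched below; this illustrates the idea only and is not the paper's infinite-state-space construction.

      import numpy as np

      def eliminate_state(Q, k):
          """Remove state k from a CTMC generator matrix Q, redistributing
          its flux: q'_ij = q_ij + q_ik * q_kj / sum_{l != k} q_kl.
          Indirect transitions i -> k -> j become effective i -> j rates."""
          Q = np.asarray(Q, dtype=float)
          keep = [s for s in range(Q.shape[0]) if s != k]
          out = Q[k, keep].sum()                      # total leaving rate of k
          R = Q[np.ix_(keep, keep)] + np.outer(Q[keep, k], Q[k, keep]) / out
          np.fill_diagonal(R, 0.0)
          np.fill_diagonal(R, -R.sum(axis=1))         # restore generator property
          return R

      # Three-state chain with a fast intermediate state 1 (rates in 1/s).
      Q = np.array([[-1.0, 1.0, 0.0],
                    [50.0, -100.0, 50.0],
                    [0.0, 2.0, -2.0]])
      print(eliminate_state(Q, 1))                    # effective 2-state generator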

  10. Simplification of Markov chains with infinite state space and the mathematical theory of random gene expression bursts

    NASA Astrophysics Data System (ADS)

    Jia, Chen

    2017-09-01

    Here we develop an effective approach to simplify two-time-scale Markov chains with infinite state spaces by removal of states with fast leaving rates, which improves the simplification method of finite Markov chains. We introduce the concept of fast transition paths and show that the effective transitions of the reduced chain can be represented as the superposition of the direct transitions and the indirect transitions via all the fast transition paths. Furthermore, we apply our simplification approach to the standard Markov model of single-cell stochastic gene expression and provide a mathematical theory of random gene expression bursts. We give the precise mathematical conditions for the bursting kinetics of both mRNAs and proteins. It turns out that random bursts exactly correspond to the fast transition paths of the Markov model. This helps us gain a better understanding of the physics behind the bursting kinetics as an emergent behavior from the fundamental multiscale biochemical reaction kinetics of stochastic gene expression.

  11. Progressive simplification and transmission of building polygons based on triangle meshes

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu

    2010-11-01

    Digital Earth is a virtual representation of our planet and a data integration platform which aims at harnessing multi-source, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multi-resolution data, from coarse to fine, as key scales and increments between them, which is not available in the traditional generalization framework. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons are visualized with more detail. Progressive generalization and transmission of building polygons are demonstrated in the paper.
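
    The record-and-replay structure of this scheme (simplification operations logged server-side, replayed in reverse as refinements client-side) can be sketched as follows. For brevity the sketch removes polygon vertices by smallest displaced area rather than encoding triangle-mesh edge collapses, so it illustrates the transmission pattern only, not the paper's encoding.

      import numpy as np

      def tri_area(a, b, c):
          """Area of the triangle spanned by three 2-D points."""
          return 0.5 * abs((b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1]))

      def simplify_progressive(ring, n_coarse):
          """Repeatedly drop the vertex whose removal changes the polygon
          least, recording (index, vertex) increments for later refinement."""
          ring = [tuple(p) for p in ring]
          increments = []                          # transmitted after the coarse map
          while len(ring) > n_coarse:
              areas = [tri_area(ring[i-1], ring[i], ring[(i+1) % len(ring)])
                       for i in range(len(ring))]
              i = int(np.argmin(areas))
              increments.append((i, ring.pop(i)))  # analogous to an edge collapse
          return ring, increments

      def refine(ring, increments):
          """Client-side replay: vertex splits restore detail progressively."""
          ring = list(ring)
          for i, v in reversed(increments):
              ring.insert(i, v)
          return ring

      coarse, inc = simplify_progressive([(0,0), (4,0), (4,3), (2,3.1), (0,3)], 4)
      assert refine(coarse, inc) == [(0,0), (4,0), (4,3), (2,3.1), (0,3)]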

  12. Electric Power Distribution System Model Simplification Using Segment Substitution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). Finally, in contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.

  13. Method and apparatus for controlling gas evolution from chemical reactions

    DOEpatents

    Skorpik, James R.; Dodson, Michael G.

    1999-01-01

    The present invention is directed toward monitoring a thermally driven, gas-evolving chemical reaction with an acoustic apparatus. Signals from the acoustic apparatus are used to control a heater to prevent a run-away condition. A digestion module in combination with a robotic arm further automates physical handling of sample material reaction vessels. The invention is especially useful for carrying out sample procedures defined in EPA Methods SW-846.

  14. Research into language concepts for the mission control center

    NASA Technical Reports Server (NTRS)

    Dellenback, Steven W.; Barton, Timothy J.; Ratner, Jeremiah M.

    1990-01-01

    A final report is given on research into language concepts for the Mission Control Center (MCC). The Specification Driven Language research is described. The state of the image processing field and how image processing techniques could be applied toward automating the generation of the language known as COmputation Development Environment (CODE or Comp Builder) are discussed. Also described is the development of a flight certified compiler for Comps.

  15. Validation of a Sensor-Driven Modeling Paradigm for Multiple Source Reconstruction with FFT-07 Data

    DTIC Science & Technology

    2009-05-01

    operational warning and reporting (information) systems that combine automated data acquisition, analysis, source reconstruction, display and distribution of ... report and to incorporate this operational capability into the integrative multiscale urban modeling system implemented in the computational ...

  16. A peristaltic pump driven 89Zr separation module

    NASA Astrophysics Data System (ADS)

    Siikanen, J.; Peterson, M.; Tran, T. A.; Roos, P.; Ohlsson, T.; Sandell, A.

    2012-12-01

    To facilitate the separation of 89Zr produced in yttrium foils, an automated separation module was designed and assembled. The module separates more than 85% of the 89Zr activity produced in 3 g foils in less than 90 min; about 10% remains in the dissolving vial. The quality of the separated 89Zr activity was investigated by labeling the HER2-binding monoclonal antibody fragment trastuzumab-Fab.

  17. Autonomously Generating Operations Sequences for a Mars Rover Using Artificial Intelligence-Based Planning

    NASA Astrophysics Data System (ADS)

    Sherwood, R.; Mutz, D.; Estlin, T.; Chien, S.; Backes, P.; Norris, J.; Tran, D.; Cooper, B.; Rabideau, G.; Mishkin, A.; Maxwell, S.

    2001-07-01

    This article discusses a proof-of-concept prototype for ground-based automatic generation of validated rover command sequences from high-level science and engineering activities. This prototype is based on ASPEN, the Automated Scheduling and Planning Environment. This artificial intelligence (AI)-based planning and scheduling system will automatically generate a command sequence that will execute within resource constraints and satisfy flight rules. An automated planning and scheduling system encodes rover design knowledge and uses search and reasoning techniques to automatically generate low-level command sequences while respecting rover operability constraints, science and engineering preferences, environmental predictions, and also adhering to hard temporal constraints. This prototype planning system has been field-tested using the Rocky 7 rover at JPL and will be field-tested on more complex rovers to prove its effectiveness before transferring the technology to flight operations for an upcoming NASA mission. Enabling goal-driven commanding of planetary rovers greatly reduces the requirements for highly skilled rover engineering personnel. This in turn greatly reduces mission operations costs. In addition, goal-driven commanding permits a faster response to changes in rover state (e.g., faults) or science discoveries by removing the time-consuming manual sequence validation process, allowing rapid "what-if" analyses, and thus reducing overall cycle times.

  18. Database-driven web interface automating gyrokinetic simulations for validation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.

    2010-11-01

    We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSplus over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].

  19. Monitoring Progress and Adherence with Positive Airway Pressure Therapy for Obstructive Sleep Apnea: The Roles of Telemedicine and Mobile Health Applications.

    PubMed

    Hwang, Dennis

    2016-06-01

    Technology is changing the way health care is delivered and how patients are approaching their own health. Given the challenge within sleep medicine of optimizing adherence to continuous positive airway pressure (CPAP) therapy in patients with obstructive sleep apnea (OSA), implementation of telemedicine-based mechanisms is a critical component toward developing a comprehensive and cost-effective solution for OSA management. Key elements include the use of electronic messaging, remote monitoring, automated care mechanisms, and patient self-management platforms. Current practical sleep-related telemedicine platforms include Web-based educational programs, automated CPAP follow-up platforms that promote self-management, and peer-based patient-driven Internet support forums. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Pilot factors guidelines for the operational inspection of navigation systems

    NASA Technical Reports Server (NTRS)

    Sadler, J. F.; Boucek, G. P.

    1988-01-01

    A computerized, human-engineered inspection technique is developed for use by FAA inspectors in evaluating the pilot factors aspects of aircraft navigation systems. The short title for this project is Nav Handbook. A menu-driven checklist, computer program, and database (Human Factors Design Criteria) were developed and merged to form a self-contained, portable, human factors inspection checklist tool for use in a laboratory or field setting. The automated checklist is tailored for general aviation navigation systems and can be expanded for use with other aircraft systems, transports, or military aircraft. The Nav Handbook inspection concept was demonstrated using a lap-top computer and an Omega/VLF CDU. The program generates standardized inspection reports. Automated checklists for LORAN/C and RNAV were also developed. A Nav Handbook User's Guide is included.

  1. Bioculture System Validation

    NASA Technical Reports Server (NTRS)

    Sato, Kevin Y.

    2012-01-01

    The first flight of the Bioculture System will validate the performance of the hardware and its automated and manual operational capabilities in the space flight environment of the International Space Station. Biology, engineering, and operations tests will be conducted to fully characterize the Bioculture System's automated and manual functions supporting cell culturing for short and long durations. No hypothesis-driven research will be conducted with biological samples, and the science leads have all provided their concurrence that none of the data they collect will be considered proprietary and can be freely distributed to the science community. The outcome of the validation flight will be to commission the hardware for use by the science community. This presentation will provide non-proprietary details about the Bioculture System and information about the activities for the first flight.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    John J. Gangloff Jr; Shatil Sinha; Suresh G. Advani

    The formation and transport of voids in composite materials remains a key research area in composite manufacturing science. Knowledge of how voids, resin, and fiber reinforcement propagate throughout a composite material continuum from green state to cured state during an automated tape layup process is key to minimizing defects induced by void-initiated stress concentrations under applied loads for a wide variety of composite applications. This paper focuses on modeling resin flow in a deforming fiber tow during an automated process of partially impregnated thermoset prepreg composite material tapes. In this work, a tow unit cell based model has been presented that determines the consolidation and impregnation of a thermoset prepreg tape under an input pressure profile. A parametric study has been performed to characterize the behavior of varying tow speed and compaction forces on the degree of consolidation. Results indicate that increased tow consolidation is achieved with slower tow speeds and higher compaction forces although the relationship is not linear. The overall modeling of this project is motivated to address optimization of the 'green state' composite properties and processing parameters to reduce or eliminate 'cured state' defects, such as porosity and de-lamination. This work is partially funded by the Department of Energy under Award number DE-EE0001367.

  3. Ground Processing of Data From the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Sturdevant, Kathryn; Noble, David

    2006-01-01

    A computer program implements the Earth side of the protocol that governs the transfer of data files generated by the Mars Exploration Rovers. It also provides tools for viewing data in these files and integrating data-product files into automated and manual processes. It reconstitutes files from telemetry data packets. Even if only one packet is received, metadata provide enough information to enable this program to identify and use partial data products. This software can generate commands to acknowledge received files and retransmit missed parts of files, or it can feed a manual process to make decisions about retransmission. The software uses an Extensible Markup Language (XML) data dictionary to provide a generic capability for displaying files of basic types, and uses external "plug-in" application programs to provide more sophisticated displays. This program makes data products available with very low latency, and can trigger automated actions when complete or partial products are received. The software is easy to install and use. The only system requirement for installing the software is a Java J2SE 1.4 platform. Several instances of the software can be executed simultaneously on the same machine.
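
    The reassembly-and-retransmission bookkeeping described above can be sketched as follows; the packet representation (a mapping from byte offset to payload) is hypothetical, not the mission protocol's actual format.

      def missing_ranges(received, file_size):
          """Given {offset: data} packets of a file, return the byte ranges
          still outstanding, suitable for a retransmission request."""
          gaps, pos = [], 0
          for off in sorted(received):
              if off > pos:
                  gaps.append((pos, off))          # bytes [pos, off) never arrived
              pos = max(pos, off + len(received[off]))
          if pos < file_size:
              gaps.append((pos, file_size))
          return gaps

      def reassemble(received, file_size, fill=b"\x00"):
          """Build a (possibly partial) data product; per-packet metadata
          lets even a single packet yield a usable partial product."""
          buf = bytearray(fill * file_size)
          for off, data in received.items():
              buf[off:off + len(data)] = data
          return bytes(buf)

      packets = {0: b"NAVCAM", 10: b"IMAGE"}       # hypothetical downlink packets
      print(missing_ranges(packets, 20))           # -> [(6, 10), (15, 20)]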

  4. Investigation of a novel approach to scoring Giemsa-stained malaria-infected thin blood films.

    PubMed

    Proudfoot, Owen; Drew, Nathan; Scholzen, Anja; Xiang, Sue; Plebanski, Magdalena

    2008-04-21

    Daily assessment of the percentage of erythrocytes that are infected ('percent-parasitaemia') across a time-course is a necessary step in many experimental studies of malaria, but represents a time-consuming and unpopular task among researchers. The most common method is extensive microscopic examination of Giemsa-stained thin blood-films. This study explored a method for the assessment of percent-parasitaemia that does not require extended periods of microscopy and results in a descriptive and permanent record of parasitaemia data that is highly amenable to subsequent 'data-mining'. Digital photography was utilized in conjunction with a basic purpose-written computer programme to test the viability of the concept. Partial automation of the determination of percent parasitaemia was then explored, resulting in the successful customization of commercially available broad-spectrum image analysis software towards this aim. Lastly, automated discrimination between infected and uninfected RBCs based on analysis of digital parameters of individual cell images was explored in an effort to completely automate the calculation of an accurate percent-parasitaemia.
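
    The final automation step, classifying each cell image by digital parameters and computing percent parasitaemia, might look like the sketch below; the mean-intensity feature and threshold are illustrative stand-ins for the discrimination criteria explored in the study.

      import numpy as np

      def percent_parasitaemia(cell_images, intensity_threshold):
          """Classify each cropped RBC image as infected if its mean stain
          intensity exceeds a threshold, then return the infected
          fraction as a percentage."""
          infected = sum(
              1 for img in cell_images
              if np.asarray(img, dtype=float).mean() > intensity_threshold
          )
          return 100.0 * infected / len(cell_images)

      # Illustrative 8-bit grayscale crops; a real pipeline might instead use
      # inverted intensity or texture features, since parasites stain darker.
      rng = np.random.default_rng(1)
      uninfected = [rng.integers(40, 60, (20, 20)) for _ in range(90)]
      infected = [rng.integers(90, 120, (20, 20)) for _ in range(10)]
      print(percent_parasitaemia(uninfected + infected, 75.0))  # -> 10.0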

  5. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography

    NASA Astrophysics Data System (ADS)

    Makarycheva, A. I.; Faerman, V. A.

    2017-02-01

    The analysis of automation patterns is performed, and a programming solution for processing chromatographic data and for further information storage is developed with the help of a software package, Mathcad, and MS Excel spreadsheets. The offered approach allows modification of the data processing algorithm and does not require the participation of programming experts. The approach provides measurement of retention times and retention volumes, specific retention volumes, differential molar free energy of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and was tested on a series of new gas chromatography sorbents. Retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are provided in a form accessible to comparative analysis and make it possible to find sorbents with the most favorable properties for solving specific analytical problems.
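
    As one concrete example of the thermodynamic quantities listed, the isosteric heat of adsorption is conventionally obtained from the temperature dependence of the specific retention volume via a van 't Hoff style fit; the sketch below assumes that convention and illustrative data, not the authors' actual worksheets.

      import numpy as np

      R = 8.314  # J/(mol K)

      def isosteric_heat(temps_K, specific_retention_volumes):
          """Estimate the isosteric heat of adsorption (J/mol) from the
          slope of ln(Vg) versus 1/T: ln Vg = q_st/(R*T) + const."""
          x = 1.0 / np.asarray(temps_K, dtype=float)
          y = np.log(np.asarray(specific_retention_volumes, dtype=float))
          slope, _ = np.polyfit(x, y, 1)
          return slope * R

      # Illustrative retention of one analyte at four column temperatures.
      print(isosteric_heat([393.15, 403.15, 413.15, 423.15],
                           [48.2, 36.5, 28.1, 22.0]) / 1000, "kJ/mol")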

  6. Human Papillomavirus (HPV) Genotyping: Automation and Application in Routine Laboratory Testing

    PubMed Central

    Torres, M; Fraile, L; Echevarria, JM; Hernandez Novoa, B; Ortiz, M

    2012-01-01

    A large number of assays designed for genotyping human papillomaviruses (HPV) have been developed in recent years. They perform within a wide range of analytical sensitivity and specificity values for the different viral types, and are used for diagnosis, epidemiological studies, evaluation of vaccines, and implementation and monitoring of vaccination programs. Methods for specific genotyping of HPV-16 and HPV-18 are also useful for the prevention of cervical cancer in screening programs. Some commercial tests are, in addition, fully or partially automated. Automation of HPV genotyping presents advantages such as the simplicity of the testing procedure for the operator, the ability to process a large number of samples in a short time, and the reduction of human error from manual operations, allowing better quality assurance and a reduction of cost. The present review collects information about current HPV genotyping tests, with special attention to practical aspects influencing their use in clinical laboratories. PMID:23248734

  7. Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH.

    PubMed

    Volk, Jochen; Herrmann, Torsten; Wüthrich, Kurt

    2008-07-01

    MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness.
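
    A generic memetic-algorithm skeleton of the kind MATCH builds on is sketched below: a population-based global search in which each offspring is refined by local optimization. The assignment-specific scoring and the dynamic transition and inherent mutation operators are not reproduced, and the toy bit-string problem is purely illustrative.

      import random

      def memetic_optimize(init, score, mutate, local_search,
                           pop_size=20, generations=50):
          """Generic memetic loop: global evolutionary search in which
          every offspring is locally refined before selection."""
          population = [local_search(init(), score) for _ in range(pop_size)]
          for _ in range(generations):
              elite = sorted(population, key=score, reverse=True)[:pop_size // 2]
              children = [local_search(mutate(random.choice(elite)), score)
                          for _ in range(pop_size)]
              population = sorted(elite + children, key=score,
                                  reverse=True)[:pop_size]
          return max(population, key=score)

      def hill_climb(s, f):
          """First-improvement bit flips: the 'local' half of the memetic pair."""
          s = list(s)
          for i in range(len(s)):
              t = s[:i] + [1 - s[i]] + s[i + 1:]
              if f(t) > f(s):
                  s = t
          return s

      # Toy stand-in for assignment scoring: maximize 1-bits in a 30-bit string.
      best = memetic_optimize(
          init=lambda: [random.randint(0, 1) for _ in range(30)],
          score=sum,
          mutate=lambda s: [b ^ (random.random() < 0.05) for b in s],
          local_search=hill_climb)
      print(sum(best))  # 30 after a few generations on this easy landscape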

  8. Estimation of urinary stone composition by automated processing of CT images.

    PubMed

    Chevreau, Grégoire; Troccaz, Jocelyne; Conort, Pierre; Renard-Penna, Raphaëlle; Mallet, Alain; Daudon, Michel; Mozer, Pierre

    2009-10-01

    The objective of this article was to develop an automated tool for routine clinical practice to estimate urinary stone composition from CT images based on the density of all constituent voxels. A total of 118 stones for which the composition had been determined by infrared spectroscopy were placed in a helical CT scanner. A standard acquisition and low-dose and high-dose acquisitions were performed. All voxels constituting each stone were automatically selected. A dissimilarity index evaluating variations of density around each voxel was created in order to minimize partial volume effects: stone composition was established on the basis of the voxel density of homogeneous zones. Stone composition was determined in 52% of cases. Sensitivities for each compound were: uric acid, 65%; struvite, 19%; cystine, 78%; carbapatite, 33.5%; calcium oxalate dihydrate, 57%; calcium oxalate monohydrate, 66.5%; brushite, 75%. Low-dose acquisition did not lower performance (P < 0.05). This entirely automated approach eliminates manual intervention on the images by the radiologist while providing identical performance, including for low-dose protocols.
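
    The homogeneity-masked classification idea (score each voxel by local density variation, keep homogeneous zones, classify their mean density) might be sketched as below; the HU windows and the local-standard-deviation cut-off are illustrative only and would need calibration.

      import numpy as np
      from scipy.ndimage import generic_filter

      def classify_stone(volume_hu, stone_mask, hu_windows, max_local_std=30.0):
          """Score each stone voxel by the density variation in its 3x3x3
          neighbourhood, keep only homogeneous voxels (little partial-volume
          mixing), and classify the mean density of what remains."""
          local_std = generic_filter(volume_hu.astype(float), np.std, size=3)
          homogeneous = stone_mask & (local_std < max_local_std)
          mean_hu = volume_hu[homogeneous].mean()
          for name, (lo, hi) in hu_windows.items():
              if lo <= mean_hu < hi:
                  return name, mean_hu
          return "undetermined", mean_hu

      # Illustrative density windows (HU); real cut-offs must be calibrated.
      windows = {"uric acid": (300, 600), "cystine": (600, 900),
                 "calcium oxalate": (900, 1400), "brushite": (1400, 2000)}
      rng = np.random.default_rng(2)
      vol = rng.normal(1100, 10, (12, 12, 12))      # synthetic oxalate-like stone
      print(classify_stone(vol, np.ones(vol.shape, bool), windows))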

  9. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

    A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography (CT). This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated, requiring no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Interested readers may inquire with the presenting author. This software is distinguished from other possible re-slicing solutions by its complete automation and advanced processing and analysis capabilities.
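
    The unwrap-and-re-slice step can be sketched as a polar resampling of each axial slice, so a cylinder wall at a given radius becomes a flat (slice, angle) sheet; the sketch below assumes a (z, y, x) volume and does not reproduce the NASA tool's edge-detection-driven surface following.

      import numpy as np
      from scipy.ndimage import map_coordinates

      def unwrap_cylinder(volume, center, radius, n_theta=720):
          """Sample each axial slice of a CT volume (z, y, x) on a circle
          of the given radius, producing a flat (z, theta) sheet of the
          cylindrical object at that wall depth."""
          cy, cx = center
          theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
          ys = cy + radius * np.sin(theta)
          xs = cx + radius * np.cos(theta)
          sheet = np.stack([
              map_coordinates(sl, [ys, xs], order=1)   # bilinear resampling
              for sl in volume
          ])
          return sheet                                  # shape (n_slices, n_theta)

      # Re-slicing through the wall: one sheet per radius of interest.
      vol = np.zeros((8, 64, 64)); vol[:, 20:44, 20:44] = 1.0
      print(unwrap_cylinder(vol, center=(32, 32), radius=10).shape)  # (8, 720)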

  10. Evaluation of a Broad-Spectrum Partially Automated Adverse Event Surveillance System: A Potential Tool for Patient Safety Improvement in Hospitals With Limited Resources.

    PubMed

    Saikali, Melody; Tanios, Alain; Saab, Antoine

    2017-11-21

    The aim of the study was to evaluate the sensitivity and resource efficiency of a partially automated adverse event (AE) surveillance system for routine patient safety efforts in hospitals with limited resources. Twenty-eight automated triggers from the hospital information system's clinical and administrative databases identified cases that were then filtered by exclusion criteria per trigger and then reviewed by an interdisciplinary team. The system, developed and implemented using in-house resources, was applied for 45 days of surveillance, for all hospital inpatient admissions (N = 1107). Each trigger was evaluated for its positive predictive value (PPV). Furthermore, the sensitivity of the surveillance system (overall and by AE category) was estimated relative to incidence ranges in the literature. The surveillance system identified a total of 123 AEs among 283 reviewed medical records, yielding an overall PPV of 52%. The tool showed variable levels of sensitivity across and within AE categories when compared with the literature, with a relatively low overall sensitivity estimated between 21% and 44%. Adverse events were detected in 23 of the 36 AE categories defined by an established harm classification system. Furthermore, none of the detected AEs were voluntarily reported. The surveillance system showed variable sensitivity levels across a broad range of AE categories with an acceptable PPV, overcoming certain limitations associated with other harm detection methods. The number of cases captured was substantial, and none had been previously detected or voluntarily reported. For hospitals with limited resources, this methodology provides valuable safety information from which interventions for quality improvement can be formulated.

  11. Three-dimensional modeling of the cochlea by use of an arc fitting approach.

    PubMed

    Schurzig, Daniel; Lexow, G Jakob; Majdani, Omid; Lenarz, Thomas; Rau, Thomas S

    2016-12-01

    A cochlea modeling approach is presented allowing for a user defined degree of geometry simplification which automatically adjusts to the patient specific anatomy. Model generation can be performed in a straightforward manner due to error estimation prior to the actual generation, thus minimizing modeling time. Therefore, the presented technique is well suited for a wide range of applications including finite element analyses where geometrical simplifications are often inevitable. The method is presented for n=5 cochleae which were segmented using a custom software for increased accuracy. The linear basilar membrane cross sections are expanded to areas while the scalae contours are reconstructed by a predefined number of arc segments. Prior to model generation, geometrical errors are evaluated locally for each cross section as well as globally for the resulting models and their basal turn profiles. The final combination of all reconditioned features to a 3D volume is performed in Autodesk Inventor using the loft feature. Due to the volume generation based on cubic splines, low errors could be achieved even for low numbers of arc segments and provided cross sections, both of which correspond to a strong degree of model simplification. Model generation could be performed in a time efficient manner. The proposed simplification method was proven to be well suited for the helical cochlea geometry. The generated output data can be imported into commercial software tools for various analyses representing a time efficient way to create cochlea models optimally suited for the desired task.

  12. Wetting Behavior in Colloid-Polymer Mixtures at Different Substrates.

    PubMed

    Wijting, Willem K; Besseling, Nicolaas A M; Cohen Stuart, Martien A

    2003-09-25

    We present experimental observations on wetting phenomena in depletion interaction driven, phase separated colloidal dispersions. The contact angle of the colloidal liquid-gas interface at a solid substrate was determined for a series of compositions. Upon approach to the critical point, a transition occurs from partial to complete wetting. The interaction with the substrate was manipulated by modifying the substrate with a polymer. In that case, a transition from partial to complete drying is observed upon approach to the critical point.

  13. Help or hindrance? The travel, energy and carbon impacts of highly automated vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadud, Zia; MacKenzie, Don; Leiby, Paul

    In 5-10 years, experts predict that new automobiles will be capable of driving themselves under limited conditions and under most conditions within 10–20 years. Automation may affect road vehicle energy consumption and greenhouse gas (GHG) emissions in a host of ways, positive and negative, by causing changes in travel demand, vehicle design, vehicle operating profiles, and choices of fuels. In this paper, we identify specific mechanisms through which automation may affect travel and energy demand and resulting GHG emissions and bring them together using a coherent energy decomposition framework. Here, we review the literature for estimates of the energy impacts of each mechanism and, where the literature is lacking, develop our own estimates using engineering and economic analysis. We consider how widely applicable each mechanism is, and quantify the potential impact of each mechanism on a common basis: the percentage change it is expected to cause in total GHG emissions from light-duty or heavy-duty vehicles in the U.S. Our primary focus is travel-related energy consumption and emissions, since potential lifecycle impacts are generally smaller in magnitude. We also explore the net effects of automation on emissions through several illustrative scenarios, finding that automation might plausibly reduce road transport GHG emissions and energy use by nearly half – or nearly double them – depending on which effects come to dominate. We also find that many potential energy-reduction benefits may be realized through partial automation, while the major energy/emission downside risks appear more likely at full automation. Finally, we present some implications for policymakers and identify priority areas for further research.

  14. Help or hindrance? The travel, energy and carbon impacts of highly automated vehicles

    DOE PAGES

    Wadud, Zia; MacKenzie, Don; Leiby, Paul

    2016-02-26

    In 5-10 years, experts predict that new automobiles will be capable of driving themselves under limited conditions and under most conditions within 10–20 years. Automation may affect road vehicle energy consumption and greenhouse gas (GHG) emissions in a host of ways, positive and negative, by causing changes in travel demand, vehicle design, vehicle operating profiles, and choices of fuels. In this paper, we identify specific mechanisms through which automation may affect travel and energy demand and resulting GHG emissions and bring them together using a coherent energy decomposition framework. Here, we review the literature for estimates of the energy impacts of each mechanism and, where the literature is lacking, develop our own estimates using engineering and economic analysis. We consider how widely applicable each mechanism is, and quantify the potential impact of each mechanism on a common basis: the percentage change it is expected to cause in total GHG emissions from light-duty or heavy-duty vehicles in the U.S. Our primary focus is travel-related energy consumption and emissions, since potential lifecycle impacts are generally smaller in magnitude. We also explore the net effects of automation on emissions through several illustrative scenarios, finding that automation might plausibly reduce road transport GHG emissions and energy use by nearly half – or nearly double them – depending on which effects come to dominate. We also find that many potential energy-reduction benefits may be realized through partial automation, while the major energy/emission downside risks appear more likely at full automation. Finally, we present some implications for policymakers and identify priority areas for further research.
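
    In its simplest multiplicative form, a decomposition framework of this kind combines mechanism-level percentage changes into a net effect, as in the sketch below; the scenario numbers are purely illustrative and are not the paper's estimates.

      def net_energy_impact(mechanism_effects):
          """Combine per-mechanism fractional changes in road-transport
          energy use multiplicatively, returning the net fractional change."""
          factor = 1.0
          for effect in mechanism_effects.values():
              factor *= 1.0 + effect
          return factor - 1.0

      # Illustrative scenario values (fractions, not the paper's estimates).
      scenario = {"platooning": -0.10, "eco-driving": -0.15,
                  "ride-sharing": -0.12, "faster-highway-speeds": +0.07,
                  "increased-travel-demand": +0.30}
      print(f"net change: {net_energy_impact(scenario):+.0%}")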

  15. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    PubMed

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  16. Rebalancing Spatial Attention: Endogenous Orienting May Partially Overcome the Left Visual Field Bias in Rapid Serial Visual Presentation.

    PubMed

    Śmigasiewicz, Kamila; Hasan, Gabriel Sami; Verleger, Rolf

    2017-01-01

    In dynamically changing environments, spatial attention is not equally distributed across the visual field. For instance, when two streams of stimuli are presented left and right, the second target (T2) is better identified in the left visual field (LVF) than in the right visual field (RVF). Recently, it has been shown that this bias is related to weaker stimulus-driven orienting of attention toward the RVF: The RVF disadvantage was reduced with salient task-irrelevant valid cues and increased with invalid cues. Here we studied if also endogenous orienting of attention may compensate for this unequal distribution of stimulus-driven attention. Explicit information was provided about the location of T1 and T2. Effectiveness of the cue manipulation was confirmed by EEG measures: decreasing alpha power before stream onset with informative cues, earlier latencies of potentials evoked by T1-preceding distractors at the right than at the left hemisphere when T1 was cued left, and decreasing T1- and T2-evoked N2pc amplitudes with informative cues. Importantly, informative cues reduced (though did not completely abolish) the LVF advantage, indicated by improved identification of right T2, and reflected by earlier N2pc latency evoked by right T2 and larger decrease in alpha power after cues indicating right T2. Overall, these results suggest that endogenously driven attention facilitates stimulus-driven orienting of attention toward the RVF, thereby partially overcoming the basic LVF bias in spatial attention.

  17. Total ozone observation by sun photometry at Arosa, Switzerland

    NASA Astrophysics Data System (ADS)

    Staehelin, Johannes; Schill, Herbert; Hoegger, Bruno; Viatte, Pierre; Levrat, Gilbert; Gamma, Adrian

    1995-07-01

    The method used for ground-based total ozone observations and the design of two instruments used to monitor atmospheric total ozone at Arosa (Dobson spectrophotometer and Brewer spectrometer) are briefly described. Two different procedures for the calibration of the Dobson spectrometer, both based on the Langley plot method, are presented. Data quality problems that occurred in recent years in the measurements of one Dobson instrument at Arosa are discussed, and two different methods to reassess total ozone observations are compared. Two partially automated Dobson spectrophotometers and two completely automated Brewer spectrometers are currently in operation at Arosa. Careful comparison of the results of the measurements of the different instruments yields valuable information on possible small long-term drifts of the instruments involved in the operational measurements.
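
    The Langley plot method that both calibration procedures build on fits the Beer-Lambert law to measurements taken over a range of air masses and extrapolates to zero air mass. A minimal sketch, with illustrative signal values rather than Arosa data:

      import numpy as np

      def langley_calibration(air_mass, signal):
          """Fit ln(S) = ln(S0) - tau*m (Beer-Lambert law) and extrapolate
          to air mass m = 0, giving the extraterrestrial constant S0 used
          to calibrate the instrument, plus the mean optical depth tau."""
          m = np.asarray(air_mass, dtype=float)
          y = np.log(np.asarray(signal, dtype=float))
          slope, intercept = np.polyfit(m, y, 1)
          return np.exp(intercept), -slope          # S0, tau

      # Illustrative measurements over a stable, clear morning.
      m = np.linspace(1.2, 4.0, 15)
      s = 1000.0 * np.exp(-0.35 * m)
      s0, tau = langley_calibration(m, s)
      print(round(s0, 1), round(tau, 3))            # -> 1000.0 0.35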

  18. An evaluation of data-driven motion estimation in comparison to the usage of external-surrogates in cardiac SPECT imaging

    PubMed Central

    Mukherjee, Joyeeta Mitra; Hutton, Brian F; Johnson, Karen L; Pretorius, P Hendrik; King, Michael A

    2014-01-01

    Motion estimation methods in single photon emission computed tomography (SPECT) can be classified into methods which depend on just the emission data (data-driven), or those that use some other source of information such as an external surrogate. The surrogate-based methods estimate the motion exhibited externally, which may not correlate exactly with the movement of organs inside the body. The accuracy of data-driven strategies, on the other hand, is affected by the type and timing of motion occurrence during acquisition, the source distribution, and various degrading factors such as attenuation, scatter, and system spatial resolution. The goal of this paper is to investigate the performance of two data-driven motion estimation schemes based on the rigid-body registration of projections of motion-transformed source distributions to the acquired projection data for cardiac SPECT studies. Comparison is also made of six intensity-based registration metrics to an external-surrogate-based method. In the data-driven schemes, a partially reconstructed heart is used as the initial source distribution. The partially reconstructed heart has inaccuracies due to limited-angle artifacts resulting from using only a part of the SPECT projections acquired while the patient maintained the same pose. The performance of different cost functions in quantifying consistency with the SPECT projection data in the data-driven schemes was compared for clinically realistic patient motion occurring as discrete pose changes, one or two times during acquisition. The six intensity-based metrics studied were mean-squared difference (MSD), mutual information (MI), normalized mutual information (NMI), pattern intensity (PI), normalized cross-correlation (NCC) and entropy of the difference (EDI). Quantitative and qualitative analysis of the performance is reported using Monte-Carlo simulations of a realistic heart phantom including degradation factors such as attenuation, scatter and system spatial resolution. Further, the visual appearance of motion-corrected images using data-driven motion estimates was compared to that obtained using the external motion-tracking system in patient studies. Pattern intensity and normalized mutual information cost functions were observed to have the best performance in terms of lowest average position error and stability with degradation of image quality of the partial reconstruction in simulations. In all patients, the visual quality of PI-based estimation was either significantly better or comparable to NMI-based estimation. Best visual quality was obtained with PI-based estimation in 1 of the 5 patient studies, and with external-surrogate-based correction in 3 out of 5 patients. In the remaining patient study there was little motion and all methods yielded similar visual image quality. PMID:24107647
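
    Two of the six compared cost functions are written out below for projection images; the full scheme registers reprojections of the motion-transformed partial reconstruction to the acquired data, which is not reproduced here, and the image data are synthetic.

      import numpy as np

      def msd(a, b):
          """Mean-squared difference: lower is better."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          return np.mean((a - b) ** 2)

      def ncc(a, b):
          """Normalized cross-correlation: higher is better (max 1)."""
          a, b = np.asarray(a, float).ravel(), np.asarray(b, float).ravel()
          a, b = a - a.mean(), b - b.mean()
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      # Score a candidate rigid-body motion by comparing a reprojection of
      # the motion-transformed partial heart to a measured projection.
      rng = np.random.default_rng(3)
      measured = rng.random((64, 64))
      reprojection = measured + 0.05 * rng.standard_normal((64, 64))
      print(msd(measured, reprojection), ncc(measured, reprojection))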

  19. The impact of artificial vehicle sounds for pedestrians on driver stress.

    PubMed

    Cottrell, Nicholas D; Barton, Benjamin K

    2012-01-01

    Electrically based vehicles have raised some concern over their lack of sound, but the impact of the artificial sounds now being implemented has not been examined with respect to its effects on the driver. The impact of two different implementations of vehicle sound on driver stress in electric vehicles was examined. A Nissan HEV running in electric vehicle mode was driven by participants in an area of congestion using three sound implementations: (1) no artificial sounds, (2) manually engaged sounds, and (3) automatically engaged sounds. Physiological and self-report questionnaire measures were collected to determine stress and acceptance of the automated sound protocol. Driver stress was significantly higher in the manually activated warning condition compared to both no artificial sounds and automatically engaged sounds. Implications for automation usage and measurement methods are discussed and future research directions suggested. The advent of hybrid and all-electric vehicles has created a need for artificial warning signals for pedestrian safety that place task demands on drivers. We investigated drivers' stress differences in response to varying conditions of warning signals for pedestrians. Driver stress was lower when noises were automated.

  20. Intelligent Transportation Systems: Automated Guided Vehicle Systems in Changing Logistics Environments

    NASA Astrophysics Data System (ADS)

    Schulze, L.; Behling, S.; Buhrs, S.

    2008-06-01

    The usage of Automated Guided Vehicle Systems (AGVS) is growing. This has not always been the case in the past. Record sales numbers are the result of inventive developments, new applications and modern thinking. One market that AGVS have not yet been able to conquer thoroughly is rapidly changing logistics environments. The advantages of AGVS in recurrent transportation used to be hindered by the need for flexibility. When managers nowadays talk about Flexible Manufacturing Systems (FMS), there is no reason not to consider AGVS. Fixed guidelines, permanent transfer stations and static routes are no longer a necessity for most AGVS producers. Flexible Manufacturing Systems can raise profitability with AGVS. When robots start saving billions in production costs, the next step at such plants is automated materials handling. Today, there are hundreds of instances of computer-controlled systems designed to handle and transport materials, many of which have replaced conventional human-driven platform trucks. Reduced costs due to damages and failures, tracking and tracing, and improved production scheduling, on top of lower personnel needs, are only some of the advantages.

  1. A data driven method for estimation of B(avail) and appK(D) using a single injection protocol with [¹¹C]raclopride in the mouse.

    PubMed

    Wimberley, Catriona J; Fischer, Kristina; Reilhac, Anthonin; Pichler, Bernd J; Gregoire, Marie Claude

    2014-10-01

    The partial saturation approach (PSA) is a simple, single injection experimental protocol that will estimate both B(avail) and appK(D) without the use of blood sampling. This makes it ideal for use in longitudinal studies of neurodegenerative diseases in the rodent. The aim of this study was to increase the range and applicability of the PSA by developing a data driven strategy for determining reliable regional estimates of receptor density (B(avail)) and in vivo affinity (1/appK(D)), and validate the strategy using a simulation model. The data driven method uses a time window guided by the dynamic equilibrium state of the system as opposed to using a static time window. To test the method, simulations of partial saturation experiments were generated and validated against experimental data. The experimental conditions simulated included a range of receptor occupancy levels and three different B(avail) and appK(D) values to mimic diseases states. Also the effect of using a reference region and typical PET noise on the stability and accuracy of the estimates was investigated. The investigations showed that the parameter estimates in a simulated healthy mouse, using the data driven method were within 10±30% of the simulated input for the range of occupancy levels simulated. Throughout all experimental conditions simulated, the accuracy and robustness of the estimates using the data driven method were much improved upon the typical method of using a static time window, especially at low receptor occupancy levels. Introducing a reference region caused a bias of approximately 10% over the range of occupancy levels. Based on extensive simulated experimental conditions, it was shown the data driven method provides accurate and precise estimates of B(avail) and appK(D) for a broader range of conditions compared to the original method. Copyright © 2014 Elsevier Inc. All rights reserved.
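
    At equilibrium during a partial saturation experiment, bound and free concentrations follow B = B_avail * F / (appK_D + F); once the data-driven time window has selected the equilibrium frames, estimation reduces to a fit of this relation. A minimal sketch with illustrative values (scipy assumed available):

      import numpy as np
      from scipy.optimize import curve_fit

      def binding(free, b_avail, app_kd):
          """Equilibrium binding: B = B_avail * F / (appK_D + F)."""
          return b_avail * free / (app_kd + free)

      # Illustrative regional (bound, free) pairs taken from the time window
      # in which the system is judged to be at dynamic equilibrium.
      free = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])     # e.g. nM
      bound = binding(free, b_avail=20.0, app_kd=3.0)
      bound_noisy = bound * (1 + 0.03 * np.random.default_rng(4).standard_normal(6))

      (b_avail, app_kd), _ = curve_fit(binding, free, bound_noisy, p0=(10.0, 1.0))
      print(round(b_avail, 1), round(app_kd, 2))           # near 20.0 and 3.0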

  2. Simplification of the time-dependent generalized self-interaction correction method using two sets of orbitals: Application of the optimized effective potential formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messud, J.; Dinh, P. M.; Suraud, Eric

    2009-10-15

    We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent 'generalized SIC-OEP'. A straightforward approximation, using the spatial localization of one set of orbitals, leads to the 'generalized SIC-Slater' formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.

  3. Simplification of the time-dependent generalized self-interaction correction method using two sets of orbitals: Application of the optimized effective potential formalism

    NASA Astrophysics Data System (ADS)

    Messud, J.; Dinh, P. M.; Reinhard, P.-G.; Suraud, Eric

    2009-10-01

    We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent “generalized SIC-OEP.” A straightforward approximation, using the spatial localization of one set of orbitals, leads to the “generalized SIC-Slater” formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
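
    For reference, the self-interaction correction that these schemes simplify is, in its standard Perdew-Zunger form, the orbital-wise removal of the self-Hartree and self-exchange-correlation energy (a textbook statement of the correction, not the paper's generalized two-set formulation):

      E_{xc}^{\mathrm{SIC}}[\{n_i\}]
        = E_{xc}[n_\uparrow, n_\downarrow]
        - \sum_{i}^{\mathrm{occ}} \Big( E_H[n_i] + E_{xc}[n_i, 0] \Big)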

  4. A data-driven method to enhance vibration signal decomposition for rolling bearing fault analysis

    NASA Astrophysics Data System (ADS)

    Grasso, M.; Chatterton, S.; Pennacchi, P.; Colosimo, B. M.

    2016-12-01

    Health condition analysis and diagnostics of rotating machinery requires the capability of properly characterizing the information content of sensor signals in order to detect and identify possible fault features. Time-frequency analysis plays a fundamental role, as it allows determining both the existence and the causes of a fault. The separation of components belonging to different time-frequency scales, associated with either healthy or faulty conditions, represents a challenge that motivates the development of effective methodologies for multi-scale signal decomposition. In this framework, the Empirical Mode Decomposition (EMD) is a flexible tool, thanks to its data-driven and adaptive nature. However, the EMD usually yields an over-decomposition of the original signals into a large number of intrinsic mode functions (IMFs). The selection of the most relevant IMFs is a challenging task, and the reference literature lacks automated methods to achieve a synthetic decomposition into a few physically meaningful modes while avoiding the generation of spurious or meaningless modes. The paper proposes a novel automated approach aimed at generating a decomposition into a minimal number of relevant modes, called Combined Mode Functions (CMFs), each consisting of a sum of adjacent IMFs that share similar properties. The final number of CMFs is selected in a fully data-driven way, leading to an enhanced characterization of the signal content without any information loss. A novel criterion to assess the dissimilarity between adjacent CMFs is proposed, based on probability density functions of frequency spectra. The method is suitable for analyzing vibration signals that may be periodically acquired within the operating life of rotating machinery. A rolling element bearing fault analysis based on experimental data is presented to demonstrate the performance of the method and the benefits it provides.
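
    The merging step might look like the sketch below, with the paper's probability-density dissimilarity criterion replaced by a simple L1 distance between normalized amplitude spectra; the EMD itself (e.g., from a package such as PyEMD) is assumed to have already produced the IMFs, and the threshold is illustrative.

      import numpy as np

      def spectral_pdf(x):
          """Normalized amplitude spectrum of a mode, used as a proxy for
          the probability density function of its frequency content."""
          a = np.abs(np.fft.rfft(np.asarray(x, float)))
          return a / a.sum()

      def combine_modes(imfs, max_dissimilarity=0.5):
          """Merge adjacent IMFs into Combined Mode Functions (CMFs) when
          their spectral content is similar, reducing over-decomposition
          without information loss (a CMF is just the sum of its IMFs)."""
          cmfs = [np.asarray(imfs[0], float)]
          for imf in imfs[1:]:
              imf = np.asarray(imf, float)
              d = np.abs(spectral_pdf(cmfs[-1]) - spectral_pdf(imf)).sum()  # L1
              if d < max_dissimilarity:
                  cmfs[-1] = cmfs[-1] + imf     # same scale: extend current CMF
              else:
                  cmfs.append(imf)              # new scale: start a new CMF
          return cmfs

      # Toy IMFs: two modes sharing one frequency band plus one slow mode.
      t = np.linspace(0, 1, 1024, endpoint=False)
      imfs = [np.sin(2*np.pi*80*t), 0.6*np.cos(2*np.pi*80*t), np.sin(2*np.pi*5*t)]
      print(len(combine_modes(imfs)))           # -> 2 CMFs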

  5. The Coronal Analysis of SHocks and Waves (CASHeW) framework

    NASA Astrophysics Data System (ADS)

    Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste

    2017-11-01

    Coronal bright fronts (CBF) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, holds great promise for early CME-driven shock characterization capability. This characterization can further be automated, and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near-Earth environment early in events. We present a framework for the Coronal Analysis of SHocks and Waves (CASHeW). It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the Interactive Data Language (IDL). In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third-party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves, their importance for energetic particle acceleration, and the ability to better forecast SEP event fluxes.

  6. Maximum Principle for General Controlled Systems Driven by Fractional Brownian Motions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han Yuecai; Hu Yaozhong; Song Jian, E-mail: jsong2@math.rutgers.edu

    2013-04-15

    We obtain a maximum principle for the stochastic control problem of general controlled stochastic differential systems driven by fractional Brownian motions (of Hurst parameter H>1/2). This maximum principle specifies a system of equations that the optimal control must satisfy (a necessary condition for the optimal control). This system of equations consists of a backward stochastic differential equation driven by both the fractional Brownian motions and the corresponding underlying standard Brownian motions. In addition to this backward equation, the maximum principle also involves the Malliavin derivatives. Our approach is to use conditioning and Malliavin calculus. To arrive at our maximum principle we need to develop some new results of stochastic analysis of the controlled systems driven by fractional Brownian motions via fractional calculus. Our approach of conditioning and Malliavin calculus is also applied to classical systems driven by standard Brownian motions while the controller has only partial information. As a straightforward consequence, the classical maximum principle is also deduced in this more natural and simpler way.

  7. Active machine learning-driven experimentation to determine compound effects on protein patterns

    PubMed Central

    Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F

    2016-01-01

    High-throughput screening determines the effects of many conditions on a given biological target. Currently, estimating the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance. DOI: http://dx.doi.org/10.7554/eLife.10047.001 PMID:26840049
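
    A minimal uncertainty-sampling loop in the spirit of this active learner is sketched below, using scikit-learn's random forest as a stand-in model; the pool, oracle and batch sizes are illustrative, not the study's compound-protein setup.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def active_learning(X_pool, oracle, n_init=10, n_rounds=20, batch=5):
          """Iteratively pick the unlabeled experiments the current model
          is least certain about, 'run' them via the oracle, and retrain,
          instead of exhaustively screening the whole pool."""
          rng = np.random.default_rng(0)
          labeled = list(rng.choice(len(X_pool), n_init, replace=False))
          for _ in range(n_rounds):
              model = RandomForestClassifier(n_estimators=100, random_state=0)
              model.fit(X_pool[labeled], [oracle(i) for i in labeled])
              proba = model.predict_proba(X_pool)
              uncertainty = 1.0 - proba.max(axis=1)        # least-confident score
              uncertainty[labeled] = -1.0                  # never re-run experiments
              labeled += list(np.argsort(uncertainty)[-batch:])
          return model, labeled

      # Toy screen: the 'phenotype' is whether a feature sum crosses a threshold.
      X = np.random.default_rng(1).random((500, 8))
      model, done = active_learning(X, oracle=lambda i: int(X[i].sum() > 4.0))
      print(f"labeled {len(done)} of {len(X)} possible experiments")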

  8. Autovalidation and automation of the postanalytical phase of routine hematology and coagulation analyses in a university hospital laboratory.

    PubMed

    Mlinaric, Ana; Milos, Marija; Coen Herak, Désirée; Fucek, Mirjana; Rimac, Vladimira; Zadro, Renata; Rogic, Dunja

    2018-02-23

    The need to satisfy high-throughput demands for laboratory tests continues to be a challenge. Therefore, we aimed to automate the postanalytical phase in a hematology and coagulation laboratory by autovalidation of complete blood count (CBC) and routine coagulation test results (prothrombin time [PT], international normalized ratio [PT-INR], activated partial thromboplastin time [APTT], fibrinogen, antithrombin activity [AT] and thrombin time [TT]), and to compare work efficacy and turnaround time (TAT) before and after implementation of the automated solutions. Ordering panels tailored to specific patient populations were implemented. Rerun and reflex testing rules were set in the respective analyzers' software (Coulter DxH Connectivity 1601, Beckman Coulter, FL, USA; AutoAssistant, Siemens Healthcare Diagnostics, Germany), and sample status information was transferred into the laboratory information system. To evaluate whether the automation improved TAT and efficacy, data from manually verified results in September and October of 2015 were compared with the corresponding period in 2016, when autovalidation was implemented. Autovalidation rates of 63% for CBC and 65% for routine coagulation test results were achieved. At a TAT of 120 min, the percentage of reported results increased substantially for all analyzed tests, being above 90% for CBC, PT, PT-INR and fibrinogen and 89% for APTT. This output was achieved with three fewer laboratory technicians than in the period before the postanalytical phase was automated. Automation allowed an optimized laboratory workflow for specific patient populations, thereby ensuring standardized results reporting. Autovalidation of test results proved to be an efficient tool for improving laboratory work efficacy and TAT.
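
    As a minimal sketch of how such an autovalidation rule operates, consider the following; the analytes, field names, and limits are illustrative assumptions, not the laboratory's actual rules, which are configured in the analyzer middleware.

      # Hypothetical autoverification windows per analyte; results outside
      # a window are held for manual review instead of being auto-released.
      AUTOVERIFICATION_LIMITS = {
          "PT_INR": (0.8, 4.5),
          "APTT_s": (20.0, 100.0),
          "fibrinogen_g_L": (0.8, 7.0),
      }

      def autovalidate(result):
          """Return 'release' if every analyte is within its window,
          otherwise 'manual review'."""
          for analyte, value in result.items():
              low, high = AUTOVERIFICATION_LIMITS[analyte]
              if not (low <= value <= high):
                  return "manual review"
          return "release"

      print(autovalidate({"PT_INR": 1.1, "APTT_s": 32.0, "fibrinogen_g_L": 3.2}))
      print(autovalidate({"PT_INR": 5.2, "APTT_s": 32.0, "fibrinogen_g_L": 3.2}))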

  9. Specific Activation of K-RasG12D Allele in the Bladder Urothelium Results in Lung Alveolar and Vascular Defects

    PubMed Central

    Kanasaki, Megumi; Vong, Sylvia; Rovira, Carlota; Kalluri, Raghu

    2014-01-01

    K-ras is essential for embryogenesis, and its mutations are involved in human developmental syndromes and cancer. To determine the consequences of K-ras activation in the urothelium, we used uroplakin-II (UPK II) promoter-driven Cre recombinase mice and generated mice with a mutated K-rasG12D allele in the urothelium (UPK II-Cre;LSL-K-rasG12D). The UPK II-Cre;LSL-K-rasG12D mice died neonatally due to lung morphogenesis defects consisting of simplification with enlargement of terminal air spaces and dysmorphic pulmonary vasculature. A significant alteration in epithelial and vascular basement membranes, together with fragmentation of laminin, points to extracellular matrix degradation as the causative mechanism of the alveolar and vascular defects. Our data also suggest that altered protease activity in the amniotic fluid might be associated with the matrix defects in the lungs of UPK II-Cre;LSL-K-rasG12D mice. These defects resemble those observed in early-stage human neonatal bronchopulmonary dysplasia (BPD), although the relevance of this new mouse model for BPD study needs further investigation. PMID:24760005

  10. Transient thermal driven bubble's surface and its potential ultrasound-induced damage

    NASA Astrophysics Data System (ADS)

    Movahed, Pooya; Freund, Jonathan B.

    2017-11-01

    Ultrasound-induced bubble activity in soft tissues is well known to be a potential injury mechanism in therapeutic ultrasound treatments. We consider a hypothetical damage mechanism based on transient thermal phenomena, including viscous dissipation. A spherically symmetric compressible Navier-Stokes discretization is developed to solve the full governing equations, both inside and outside of the bubble, without the usual simplifications of the Rayleigh-Plesset approach to bubble dynamics. The equations are solved in the Lagrangian framework, which provides a sharp and accurate representation of the interface as well as of the viscous dissipation and thermal transport effects that preclude reduction to the usual Rayleigh-Plesset ordinary differential equation. This method is used to study transient thermal effects at different frequencies and pressure amplitudes relevant to therapeutic ultrasound treatments. High temperatures achieved in the surrounding medium during the violent bubble collapse phase, due to viscous dissipation in the surrounding medium and thermal conduction from the bubble, are expected to cause damage. This work was supported by NIH NIDDK Grant P01-DK043881.
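
    For reference, the reduced model that the full compressible simulation avoids is the Rayleigh-Plesset equation for the bubble radius R(t), which in one standard incompressible form reads

      \rho \left( R\ddot{R} + \tfrac{3}{2}\dot{R}^2 \right) = p_B(t) - p_\infty(t) - \frac{4\mu \dot{R}}{R} - \frac{2S}{R},

    with liquid density \rho, viscosity \mu, surface tension S, bubble pressure p_B, and far-field pressure p_\infty; it contains neither the interior gas dynamics nor the thermal transport resolved by the full solver.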

  11. Robust impact of coupled stratospheric ozone chemistry on the response of the Austral circulation to increased greenhouse gases

    NASA Astrophysics Data System (ADS)

    Chiodo, G.; Polvani, L. M.

    2016-12-01

    Due to computational constraints, interactive stratospheric chemistry is commonly neglected in most GCMs participating in inter-comparison projects. The impact of this simplification on the modeled response to external forcings remains largely unexplored. In this work, we examine the impact of stratospheric chemistry coupling on the SH circulation response to an abrupt quadrupling of CO2. We accomplish this with a version of the Whole Atmosphere Community Climate Model (WACCM), which allows stratospheric chemistry to be coupled or de-coupled without altering any other physical parameterization. We find that chemistry coupling in WACCM significantly reduces (by about 20%) the response of both the eddy-driven mid-latitude jet and the Hadley cell strength, without altering the surface temperature response. This behavior is linked to the representation of stratospheric ozone and its effects on the meridional temperature gradient at the extratropical tropopause. Our results imply that neglecting stratospheric ozone chemistry leads to a potential overestimation of the circulation response to GHGs. Hence, stratospheric ozone chemistry produces a substantial negative feedback on the response of the atmospheric circulation to increased greenhouse gases.

  12. Method and apparatus for controlling gas evolution from chemical reactions

    DOEpatents

    Skorpik, J.R.; Dodson, M.G.

    1999-05-25

    The present invention is directed toward monitoring a thermally driven, gas-evolving chemical reaction with an acoustic apparatus. Signals from the acoustic apparatus are used to control a heater to prevent a run-away condition. A digestion module in combination with a robotic arm further automates the physical handling of sample material reaction vessels. The invention is especially useful for carrying out sample procedures defined in EPA Methods SW-846. 8 figs.

  13. A Distributed Value of Information (VoI)-Based Approach for Mission-Adaptive Context-Aware Information Management and Presentation

    DTIC Science & Technology

    2016-05-16

    metrics involve regulating automation of complex systems, such as aircraft. Additionally, adaptive management of content in user interfaces has also...both the user and environmental context would aid in deciding how to present the information to the Warfighter. The prototype system currently...positioning system, and rate sensors can provide user-specific context to disambiguate physiologic data. The consumer "quantified self" market has driven

  14. Special wrench for B-nuts reduces torque stress in tubing

    NASA Technical Reports Server (NTRS)

    Stein, J. A.

    1970-01-01

    Gear-driven torque wrench with bearing support is used to tighten B-nut connection of partially supported fluid line with minimum stress to adjacent tubing and fittings. Wrench is useful for working with weak or brittle lines such as glass tubing.

  15. Probabilistic density function method for nonlinear dynamical systems driven by colored noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    2016-05-01

    We present a probability density function (PDF) method for a system of nonlinear stochastic ordinary differential equations driven by colored noise. The method provides an integro-differential equation for the temporal evolution of the joint PDF of the system's state, which we close by means of a modified large-eddy-diffusivity-type closure. Additionally, we introduce the generalized local linearization (LL) approximation for deriving a computable PDF equation in the form of a second-order partial differential equation (PDE). We demonstrate that the proposed closure and localization accurately describe the dynamics of the PDF in phase space for systems driven by noise with arbitrary auto-correlation time. We apply the proposed PDF method to the analysis of a set of Kramers equations driven by exponentially auto-correlated Gaussian colored noise to study the dynamics and stability of a power grid.
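
    Schematically (notation ours), the setting is a state x driven by exponentially auto-correlated Ornstein-Uhlenbeck noise,

      \dot{x} = f(x) + g(x)\,\xi(t), \qquad \langle \xi(t)\,\xi(s) \rangle = \sigma^2 e^{-|t-s|/\tau},

    and the method seeks a closed, computable PDE for the joint PDF p(x, t); the finite correlation time \tau is what makes the exact PDF equation non-local and necessitates the closure and localization described above.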

  16. Structural insights into the light-driven auto-assembly process of the water-oxidizing Mn4CaO5-cluster in photosystem II

    PubMed Central

    Zhang, Miao; Bommer, Martin; Chatterjee, Ruchira; Hussein, Rana; Yano, Junko; Dau, Holger; Kern, Jan; Dobbek, Holger; Zouni, Athina

    2017-01-01

    In plants, algae and cyanobacteria, Photosystem II (PSII) catalyzes the light-driven splitting of water at a protein-bound Mn4CaO5-cluster, the water-oxidizing complex (WOC). In photosynthetic organisms, the light-driven formation of the WOC from dissolved metal ions is a key process because it is essential in both the initial activation and the continuous repair of PSII. Structural information is required for understanding this chaperone-free metal-cluster assembly. For the first time, we obtained a structure of PSII from Thermosynechococcus elongatus without the Mn4CaO5-cluster. Surprisingly, cluster removal leaves the positions of all coordinating amino acid residues and most nearby water molecules largely unaffected, resulting in a pre-organized ligand shell for kinetically competent and error-free photo-assembly of the Mn4CaO5-cluster. First experiments initiating (i) partial disassembly and (ii) partial re-assembly after complete depletion of the Mn4CaO5-cluster agree with a specific bi-manganese cluster, likely a di-µ-oxo bridged pair of Mn(III) ions, as an assembly intermediate. DOI: http://dx.doi.org/10.7554/eLife.26933.001 PMID:28718766

  17. Mammographic density estimation with automated volumetric breast density measurement.

    PubMed

    Ko, Su Yeon; Kim, Eun-Kyung; Kim, Min Jung; Moon, Hee Jung

    2014-01-01

    This study compared automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS) and identified factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of masses > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed technical failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p = 0.001 to 0.015). There is fair to moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.
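
    As a minimal sketch of the agreement computation described above, the kappa statistics can be reproduced with scikit-learn; the grades below are made-up examples, not study data.

      from sklearn.metrics import cohen_kappa_score

      radiologist = ["D1", "D2", "D2", "D3", "D4", "D3", "D2", "D1"]
      vbdm        = ["D1", "D2", "D3", "D3", "D4", "D2", "D2", "D2"]

      # Four-grade agreement (D1/D2/D3/D4)
      kappa_4 = cohen_kappa_score(radiologist, vbdm)

      # Binary agreement: fatty (D1-2) vs. dense (D3-4)
      to_binary = lambda g: "fatty" if g in ("D1", "D2") else "dense"
      kappa_2 = cohen_kappa_score([to_binary(g) for g in radiologist],
                                  [to_binary(g) for g in vbdm])

      print(f"4-grade kappa: {kappa_4:.2f}, binary kappa: {kappa_2:.2f}")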

  18. Landscape Simplification Constrains Adult Size in a Native Ground-Nesting Bee

    PubMed Central

    Renauld, Miles; Hutchinson, Alena; Loeb, Gregory; Poveda, Katja; Connelly, Heather

    2016-01-01

    Bees provide critical pollination services to 87% of angiosperm plants; however, the reliability of these services may become threatened as bee populations decline. Agricultural intensification, resulting in the simplification of environments at the landscape scale, greatly changes the quality and quantity of resources available for female bees to provision their offspring. These changes may alter or constrain the tradeoffs in maternal investment allocation between offspring size, number and sex required to maximize fitness. Here we investigate the relationship between landscape-scale agricultural intensification and the size and number of individuals within a wild ground-nesting bee species, Andrena nasonii. We show that agricultural intensification at the landscape scale was associated with a reduction in the average size of field-collected A. nasonii adults in highly agricultural landscapes but not with the number of individuals collected. Small females carried significantly smaller (by 40%) pollen loads than large females, which is likely to have consequences for subsequent offspring production and fitness. Thus, landscape simplification is likely to constrain the allocation of resources to offspring through a reduction in the overall quantity, quality and distribution of resources. PMID:26943127

  19. Landscape Simplification Constrains Adult Size in a Native Ground-Nesting Bee.

    PubMed

    Renauld, Miles; Hutchinson, Alena; Loeb, Gregory; Poveda, Katja; Connelly, Heather

    2016-01-01

    Bees provide critical pollination services to 87% of angiosperm plants; however, the reliability of these services may become threatened as bee populations decline. Agricultural intensification, resulting in the simplification of environments at the landscape scale, greatly changes the quality and quantity of resources available for female bees to provision their offspring. These changes may alter or constrain the tradeoffs in maternal investment allocation between offspring size, number and sex required to maximize fitness. Here we investigate the relationship between landscape-scale agricultural intensification and the size and number of individuals within a wild ground-nesting bee species, Andrena nasonii. We show that agricultural intensification at the landscape scale was associated with a reduction in the average size of field-collected A. nasonii adults in highly agricultural landscapes but not with the number of individuals collected. Small females carried significantly smaller (by 40%) pollen loads than large females, which is likely to have consequences for subsequent offspring production and fitness. Thus, landscape simplification is likely to constrain the allocation of resources to offspring through a reduction in the overall quantity, quality and distribution of resources.

  20. Data-Driven Learning of Total and Local Energies in Elemental Boron

    NASA Astrophysics Data System (ADS)

    Deringer, Volker L.; Pickard, Chris J.; Csányi, Gábor

    2018-04-01

    The allotropes of boron continue to challenge structural elucidation and solid-state theory. Here we use machine learning combined with random structure searching (RSS) algorithms to systematically construct an interatomic potential for boron. Starting from ensembles of randomized atomic configurations, we use alternating single-point quantum-mechanical energy and force computations, Gaussian approximation potential (GAP) fitting, and GAP-driven RSS to iteratively generate a representation of the element's potential-energy surface. Beyond the total energies of the very different boron allotropes, our model readily provides atom-resolved, local energies and thus deepened insight into the frustrated β-rhombohedral boron structure. Our results open the door for the efficient and automated generation of GAPs, and other machine-learning-based interatomic potentials, and suggest their usefulness as a tool for materials discovery.

  1. Data-Driven Learning of Total and Local Energies in Elemental Boron.

    PubMed

    Deringer, Volker L; Pickard, Chris J; Csányi, Gábor

    2018-04-13

    The allotropes of boron continue to challenge structural elucidation and solid-state theory. Here we use machine learning combined with random structure searching (RSS) algorithms to systematically construct an interatomic potential for boron. Starting from ensembles of randomized atomic configurations, we use alternating single-point quantum-mechanical energy and force computations, Gaussian approximation potential (GAP) fitting, and GAP-driven RSS to iteratively generate a representation of the element's potential-energy surface. Beyond the total energies of the very different boron allotropes, our model readily provides atom-resolved, local energies and thus deepened insight into the frustrated β-rhombohedral boron structure. Our results open the door for the efficient and automated generation of GAPs, and other machine-learning-based interatomic potentials, and suggest their usefulness as a tool for materials discovery.
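
    The iterate-fit-search protocol can be illustrated with a deliberately toy, runnable sketch: a 1D stand-in energy surface plays the role of the quantum-mechanical labeller and a polynomial fit plays the role of the GAP; only the workflow is represented, not the method itself (the real potentials are fitted with dedicated GAP tooling).

      import numpy as np

      def toy_dft_energy(x):
          """Stand-in for a quantum-mechanical single point (1D toy surface)."""
          return (x**2 - 1.0)**2 + 0.1 * x

      def fit_toy_potential(xs, es, deg=6):
          """Stand-in for GAP fitting: least-squares polynomial regression."""
          return np.polynomial.Polynomial.fit(xs, es, deg)

      def relax(x0, pot, step=0.01, n=200):
          """Stand-in for potential-driven structure search: gradient descent."""
          x = x0
          for _ in range(n):
              x -= step * pot.deriv()(x)
          return x

      rng = np.random.default_rng(0)
      xs = list(rng.uniform(-2, 2, 20))            # randomized configurations
      es = [toy_dft_energy(x) for x in xs]         # "DFT" labels
      for cycle in range(3):                       # iterative GAP-RSS cycles
          pot = fit_toy_potential(xs, es)          # refit on all data so far
          new = [relax(x, pot) for x in rng.uniform(-2, 2, 10)]
          xs += new
          es += [toy_dft_energy(x) for x in new]   # label the relaxed candidates
          print(f"cycle {cycle}: best energy {min(es):.3f}")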

  2. Pivotal role of computers and software in mass spectrometry - SEQUEST and 20 years of tandem MS database searching.

    PubMed

    Yates, John R

    2015-11-01

    Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.

  3. WIND-DRIVEN LATERAL VARIABILITY IN A PARTIALLY MIXED ESTUARY. (R828677C001)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. Bayesian automated cortical segmentation for neonatal MRI

    NASA Astrophysics Data System (ADS)

    Chou, Zane; Paquette, Natacha; Ganesh, Bhavana; Wang, Yalin; Ceschin, Rafael; Nelson, Marvin D.; Macyszyn, Luke; Gaonkar, Bilwaj; Panigrahy, Ashok; Lepore, Natasha

    2017-11-01

    Several attempts have been made in the past few years to develop and implement automated segmentation of neonatal brain structural MRI. However, accurate automated MRI segmentation remains challenging in this population because of the low signal-to-noise ratio, large partial volume effects and inter-individual anatomical variability of the neonatal brain. In this paper, we propose a learning method for segmenting the whole-brain cortical grey matter on neonatal T2-weighted images. We trained our algorithm using a neonatal dataset composed of 3 full-term and 4 preterm infants scanned at term-equivalent age. Our segmentation pipeline combines the FAST algorithm from the FSL library software and a Bayesian segmentation approach to create a threshold matrix that minimizes the error of mislabeling brain tissue types. Our method shows promising results with our pilot training set. In both preterm and full-term neonates, automated Bayesian segmentation generates a smoother and more consistent parcellation compared to FAST, while successfully removing the subcortical structures and cleaning the edges of the cortical grey matter. This method shows promising refinement of the FAST segmentation, considerably reducing the manual input and editing required from the user and improving the reliability and processing time of neonatal MR image analysis. Further improvements will include a larger dataset of training images acquired on scanners from different manufacturers.

  5. A Data-Driven Response Virtual Sensor Technique with Partial Vibration Measurements Using Convolutional Neural Network.

    PubMed

    Sun, Shan-Bin; He, Yuan-Yuan; Zhou, Si-Da; Yue, Zhen-Jiang

    2017-12-12

    Measurement of dynamic responses plays an important role in structural health monitoring, damage detection and other fields of research. However, in aerospace engineering, physical sensors are limited under the operational conditions of spacecraft due to the severe environment of outer space. This paper proposes a virtual sensor model with partial vibration measurements using a convolutional neural network. The transmissibility function is employed as prior knowledge. A four-layer neural network with two convolutional layers, one fully connected layer, and an output layer is proposed as the predicting model. Numerical examples of two different structural dynamic systems demonstrate the performance of the proposed approach. The advantage of the novel technique is further demonstrated in a simply supported beam experiment, in comparison with a modal-model-based virtual sensor, which uses modal parameters, such as mode shapes, for estimating the responses of the faulty sensors. The results show that the presented data-driven response virtual sensor technique can predict structural response with high accuracy.
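
    A minimal sketch of the four-layer architecture named above (two convolutional layers, one fully connected layer, and an output layer) in Keras; the window length, channel count, filter sizes, and the synthetic data are assumptions for illustration only.

      import numpy as np
      import tensorflow as tf

      WINDOW, N_MEASURED = 128, 4      # samples per window, measured channels

      model = tf.keras.Sequential([
          tf.keras.Input(shape=(WINDOW, N_MEASURED)),
          tf.keras.layers.Conv1D(16, 5, activation="relu"),   # conv layer 1
          tf.keras.layers.Conv1D(32, 5, activation="relu"),   # conv layer 2
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(64, activation="relu"),       # fully connected
          tf.keras.layers.Dense(WINDOW),                      # predicted response
      ])
      model.compile(optimizer="adam", loss="mse")

      # Synthetic stand-in data: measured channels -> response at the
      # unmeasured ("virtual") sensor location
      x = np.random.randn(256, WINDOW, N_MEASURED).astype("float32")
      y = np.random.randn(256, WINDOW).astype("float32")
      model.fit(x, y, epochs=2, batch_size=32, verbose=0)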

  6. Group invariant solution for a pre-existing fracture driven by a power-law fluid in impermeable rock

    NASA Astrophysics Data System (ADS)

    Fareo, A. G.; Mason, D. P.

    2013-12-01

    The effect of power-law rheology on hydraulic fracturing is investigated. The evolution of a two-dimensional fracture with non-zero initial length, driven by a power-law fluid, is analyzed. Only fluid injection into the fracture is considered. The surrounding rock mass is impermeable. With the aid of lubrication theory and the PKN approximation, a partial differential equation for the fracture half-width is derived. Using a linear combination of the Lie-point symmetry generators of the partial differential equation, the group invariant solution is obtained and the problem is reduced to a boundary value problem for an ordinary differential equation. Exact analytical solutions are derived for hydraulic fractures with constant volume and with constant propagation speed. The asymptotic solution near the fracture tip is found. The numerical solution for general working conditions is obtained by transforming the boundary value problem to a pair of initial value problems. Throughout the paper, hydraulic fracturing with shear-thinning, Newtonian and shear-thickening fluids is compared.
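
    In dimensionless variables, the lubrication/PKN derivation sketched above leads to a nonlinear diffusion equation for the half-width h(x, t) of the form (scaling constants absorbed; this is our schematic rendering, with n the power-law index)

      \frac{\partial h}{\partial t} = \frac{\partial}{\partial x} \left( h^{(2n+1)/n} \left| \frac{\partial h}{\partial x} \right|^{(1-n)/n} \frac{\partial h}{\partial x} \right),

    which recovers the familiar Newtonian equation \partial_t h = \partial_x (h^3 \partial_x h) for n = 1, with n < 1 and n > 1 corresponding to shear-thinning and shear-thickening fluids, respectively.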

  7. A Data-Driven Response Virtual Sensor Technique with Partial Vibration Measurements Using Convolutional Neural Network

    PubMed Central

    Sun, Shan-Bin; He, Yuan-Yuan; Zhou, Si-Da; Yue, Zhen-Jiang

    2017-01-01

    Measurement of dynamic responses plays an important role in structural health monitoring, damage detection and other fields of research. However, in aerospace engineering, the physical sensors are limited in the operational conditions of spacecraft, due to the severe environment in outer space. This paper proposes a virtual sensor model with partial vibration measurements using a convolutional neural network. The transmissibility function is employed as prior knowledge. A four-layer neural network with two convolutional layers, one fully connected layer, and an output layer is proposed as the predicting model. Numerical examples of two different structural dynamic systems demonstrate the performance of the proposed approach. The excellence of the novel technique is further indicated using a simply supported beam experiment comparing to a modal-model-based virtual sensor, which uses modal parameters, such as mode shapes, for estimating the responses of the faulty sensors. The results show that the presented data-driven response virtual sensor technique can predict structural response with high accuracy. PMID:29231868

  8. Volumetric gain of the human pancreas after left partial pancreatic resection: A CT-scan based retrospective study.

    PubMed

    Phillip, Veit; Zahel, Tina; Danninger, Assiye; Erkan, Mert; Dobritz, Martin; Steiner, Jörg M; Kleeff, Jörg; Schmid, Roland M; Algül, Hana

    2015-01-01

    Regeneration of the pancreas has been well characterized in animal models. However, there are conflicting data on the regenerative capacity of the human pancreas. The aim of the present study was to assess the regenerative capacity of the human pancreas. In a retrospective study, data from patients undergoing left partial pancreatic resection at a single center were eligible for inclusion (n = 185). Volumetry was performed based on 5 mm CT-scans acquired through a 256-slice CT-scanner using semi-automated software. Data from 24 patients (15 males/9 females) were included. Mean ± SD age was 68 ± 11 years (range, 40-85 years). Median time between surgery and the 1st postoperative CT was 9 days (range, 0-27 days; IQR, 7-13), 55 days (range, 21-141 days; IQR, 34-105) until the 2nd CT, and 191 days (range, 62-1902; IQR, 156-347) until the 3rd CT. The pancreatic volumes differed significantly between the first and second postoperative CT scans (median volume 25.6 mL and 30.6 mL, respectively; p = 0.008) and had increased significantly further by the 3rd CT scan (median volume 37.9 mL; p = 0.001 for comparison with the 1st CT scan and p = 0.003 for comparison with the 2nd CT scan). The human pancreas shows a measurable and considerable potential for volumetric gain after partial resection. Multidetector CT-based semi-automated volume analysis is a feasible method for follow-up of the volume of the remaining pancreatic parenchyma after partial pancreatectomy. Effects on exocrine and endocrine pancreatic function have to be evaluated in a prospective manner. Copyright © 2015 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  9. Computer program for the automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices, for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Developed especially for high schools, the system is intended to relieve both teachers and office personnel of the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software, which is described.

  10. EMU processing - A myth dispelled

    NASA Technical Reports Server (NTRS)

    Peacock, Paul R.; Wilde, Richard C.; Lutz, Glenn C.; Melgares, Michael A.

    1991-01-01

    The refurbishment-and-checkout 'processing' activities entailed by the Space Shuttle Extravehicular Mobility Units (EMUs) are currently significantly more modest, at 1050 man-hours, than when Space Shuttle services began (when they involved about 4000 man-hours). This great improvement in efficiency is due to the design or modification of test rigs to simplify procedures, the standardization of those procedures, and an increase in hardware confidence that has allowed inspection, service, and testing intervals to be extended. Recent simplification of the hardware-processing sequence could reduce EMU processing requirements to 600 man-hours in the near future.

  11. An automated method for identifying artifact in independent component analysis of resting-state FMRI.

    PubMed

    Bhaganagarapu, Kaushik; Jackson, Graeme D; Abbott, David F

    2013-01-01

    An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: does not require temporal information about an fMRI paradigm; does not require the user to train the algorithm; requires only the fMRI images (additional acquisition of anatomical imaging not required); is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; can be applied to resting-state fMRI; is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26 and 72% of the components as artifact (mean 55%). About 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available.

  12. An Automated Method for Identifying Artifact in Independent Component Analysis of Resting-State fMRI

    PubMed Central

    Bhaganagarapu, Kaushik; Jackson, Graeme D.; Abbott, David F.

    2013-01-01

    An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: does not require temporal information about an fMRI paradigm; does not require the user to train the algorithm; requires only the fMRI images (additional acquisition of anatomical imaging not required); is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; can be applied to resting-state fMRI; is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26 and 72% of the components as artifact (mean 55%). About 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available. PMID:23847511

  13. How Do Mobile Phone Diabetes Programs Drive Behavior Change?

    PubMed Central

    Nundy, Shantanu; Mishra, Anjuli; Hogan, Patrick; Lee, Sang Mee; Solomon, Marla C.; Peek, Monica E.

    2015-01-01

    Purpose The purpose of this study was to investigate the behavioral effects of a theory-driven, mobile phone-based intervention combining remote nursing with an automated, interactive text messaging system. Methods This was a mixed-methods observational cohort study. Study participants were members of the University of Chicago Health Plan (UCHP) who largely reside in a working-class, urban African American community. Surveys were conducted at baseline, 3 months (mid-intervention), and 6 months (postintervention) to test the hypothesis that the intervention would be associated with improvements in self-efficacy, social support, health beliefs, and self-care. In addition, in-depth individual interviews were conducted with 14 participants and then analyzed using the constant comparative method to identify new behavioral constructs affected by the intervention. Results The intervention was associated with improvements in 5 of 6 domains of self-care (medication taking, glucose monitoring, foot care, exercise, and healthy eating) and improvements in 1 or more measures of self-efficacy, social support, and health beliefs (perceived control). Qualitatively, participants reported that knowledge, attitudes, and ownership were also affected by the program. Together these findings were used to construct a new behavioral model. Conclusions This study's findings challenge the prevailing assumption that mobile phones largely affect behavior change through reminders, and support the idea that behaviorally driven mobile health interventions can address multiple behavioral pathways associated with sustained behavior change. PMID:25278512

  14. Automated segmentation of middle hepatic vein in non-contrast x-ray CT images based on an atlas-driven approach

    NASA Astrophysics Data System (ADS)

    Kitagawa, Teruhiko; Zhou, Xiangrong; Hara, Takeshi; Fujita, Hiroshi; Yokoyama, Ryujiro; Kondo, Hiroshi; Kanematsu, Masayuki; Hoshi, Hiroaki

    2008-03-01

    In order to support the diagnosis of hepatic diseases, understanding the anatomical structures of the hepatic lobes and hepatic vessels is necessary. Although viewing and understanding the hepatic vessels in contrast-enhanced CT images is easy, observing the hepatic vessels in non-contrast X-ray CT images, which are widely used for screening purposes, is difficult. We are developing a computer-aided diagnosis (CAD) system to support liver diagnosis based on non-contrast X-ray CT images. This paper proposes a new approach to segment the middle hepatic vein (MHV), a key structure (landmark) for separating the liver region into left and right lobes. Extraction and classification of hepatic vessels are difficult in non-contrast X-ray CT images because the contrast between hepatic vessels and other liver tissues is low. Our approach uses an atlas-driven method with the following three stages: (1) construction of liver atlases of the left and right hepatic lobes using a learning dataset; (2) fully automated enhancement and extraction of hepatic vessels in liver regions; (3) extraction of the MHV based on the results of (1) and (2). The proposed approach was applied to 22 normal liver cases in non-contrast X-ray CT images. The preliminary results show that the proposed approach succeeded in extracting the MHV in 14 of the 22 cases.

  15. European Science Notes. Volume 40, number 1

    NASA Astrophysics Data System (ADS)

    Shaffer, L. E.

    1986-01-01

    European Science Notes is a monthly publication with brief articles on recent developments in European scientific research. Partial contents: Shallow Water Acoustics at the UK's Royal Aircraft Establishment; Immunology Research in Israel; Life Sciences Research at the Weizmann Institute of Science; Symposium on Macromolecules; Mass Spectrometry Meetings; Automation and Robotization in the Welding Industry; 3rd International Conference on Composite Structures; Conference on Quantum Electronics; Very Short Laser Pulse Research.

  16. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    DTIC Science & Technology

    2015-03-27

    Air Force Institute of Technology, Air University, Air Education and Training Command, In Partial Fulfillment of the Requirements for the Degree of...Master of Science in Engineering and Environmental Management, Patrick J. Grandsaert, BS, Captain, USAF, March 2015. DISTRIBUTION STATEMENT A...reliable data is necessary to make informed decisions on how to best manage aging road assets. This research explores a new method to automate the

  17. Report on the FY 1986 Activities of the Defense Science Study Group. Volume 1.

    DTIC Science & Technology

    1987-05-01

    Reactors; Improved Techniques for Wavefront Sensing and Correction in Adaptive Optics; Hypervelocity Launchers; Underground Facilities; Automated...oceanography and sound propagation in partially coherent media such as the turbulent ocean. There are large fixed arrays such as the Sound Surveillance System...Aircraft; Continuous Patrol Aircraft; Miscellaneous Studies; Review of the Plutonium Special Isotope Separation Program of the DOE; Fusion Fission Hybrid

  18. Kinematic and Dynamic Analysis of High-Speed Intermittent-Motion Mechanisms.

    DTIC Science & Technology

    1984-01-16

    intermittent-motion mechanisms which have potential application to the high-speed automatic weapon system, and an investigation on the workspace of a robotic manipulator system. The problems of this investigation belong to a selected group of unsolved or partially solved problems which are relevant and...design of high-speed machinery and automated manufacturing systems.

  19. Inspection of thick welded joints using laser-ultrasonic SAFT.

    PubMed

    Lévesque, D; Asaumi, Y; Lord, M; Bescond, C; Hatanaka, H; Tagami, M; Monchalin, J-P

    2016-07-01

    The detection of defects in thick butt joints in the early phases of multi-pass arc welding would be very valuable for reducing the cost and time of reworking. As a non-contact method, the laser-ultrasonic technique (LUT) has the potential for the automated inspection of welds, ultimately online during manufacturing. In this study, testing has been carried out using LUT combined with the synthetic aperture focusing technique (SAFT) on 25 and 50 mm thick butt-welded joints of steel, both completed and partially welded. EDM slits of 2 or 3 mm height were inserted at different depths in the multi-pass welding process to simulate a lack of fusion. Line scans transverse to the weld are performed with the generation and detection laser spots superimposed directly on the surface of the weld bead. A CCD line camera is used to simultaneously acquire the surface profile for correction in the SAFT processing. All artificial defects, as well as real defects, are visualized in the investigated thick butt weld specimens, either completed or partially welded after a given number of passes. The results obtained clearly show the potential of using LUT with SAFT for the automated inspection of arc welds or hybrid laser-arc welds during manufacturing. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  20. Time-saving impact of an algorithm to identify potential surgical site infections.

    PubMed

    Knepper, B C; Young, H; Jenkins, T C; Price, C S

    2013-10-01

    To develop and validate a partially automated algorithm that identifies surgical site infections (SSIs) using commonly available electronic data, thereby reducing manual chart review. Retrospective cohort study of patients undergoing specific surgical procedures over a 4-year period from 2007 through 2010 (algorithm development cohort) or over a 3-month period from January 2011 through March 2011 (algorithm validation cohort), at a single academic safety-net hospital in a major metropolitan area, including patients undergoing at least 1 included surgical procedure during the study period. Procedures were identified in the National Healthcare Safety Network; SSIs were identified by manual chart review. Commonly available electronic data, including microbiologic, laboratory, and administrative data, were identified via a clinical data warehouse. Algorithms using combinations of these electronic variables were constructed and assessed for their ability to identify SSIs and reduce chart review. The most efficient algorithm identified in the development cohort combined microbiologic data with postoperative procedure and diagnosis codes. This algorithm resulted in 100% sensitivity and 85% specificity. The algorithm saved almost 600 person-hours of chart review. The algorithm demonstrated similar sensitivity on application to the validation cohort. A partially automated algorithm to identify potential SSIs was highly sensitive and dramatically reduced the amount of manual chart review required of infection control personnel during SSI surveillance.
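
    A minimal sketch of this kind of flagging rule, combining microbiologic data with postoperative codes; the code sets and field names are illustrative assumptions, not the validated algorithm.

      # Hypothetical code sets suggesting a surgical site infection
      SSI_DX_CODES = {"998.59"}        # e.g., other postoperative infection
      SSI_PROC_CODES = {"86.22"}       # e.g., excisional wound debridement

      def flag_for_review(case):
          """Flag a surgical case for manual chart review if any electronic
          signal of a potential SSI is present."""
          if case.get("positive_wound_culture"):
              return True
          if SSI_DX_CODES & set(case.get("postop_dx_codes", [])):
              return True
          if SSI_PROC_CODES & set(case.get("postop_proc_codes", [])):
              return True
          return False                  # no signal: skip manual review

      case = {"positive_wound_culture": False,
              "postop_dx_codes": ["998.59"], "postop_proc_codes": []}
      print(flag_for_review(case))      # True -> chart goes to manual review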

  1. Repeated Induction of Inattentional Blindness in a Simulated Aviation Environment

    NASA Technical Reports Server (NTRS)

    Kennedy, Kellie D.; Stephens, Chad L.; Williams, Ralph A.; Schutte, Paul C.

    2017-01-01

    The study reported herein is a subset of a larger investigation of the role of automation in the context of the flight deck and used a fixed-based, human-in-the-loop simulator. This paper explored the relationship between automation and inattentional blindness (IB) occurrences in a repeated induction paradigm using two types of runway incursions. The critical stimuli for both runway incursions were directly relevant to primary task performance. Sixty non-pilot participants performed the final five minutes of a landing scenario twice in one of three automation conditions: full automation (FA), partial automation (PA), and no automation (NA). The first induction resulted in a 70 percent (42 of 60) detection failure rate, with those in the PA condition significantly more likely to detect the incursion than those in the FA or NA conditions. The second induction yielded a 50 percent detection failure rate. Although detection improved (detection failure rates declined) in all conditions, those in the FA condition demonstrated the greatest improvement, with doubled detection rates. Detection behavior in the first trial did not preclude a failed detection in the second induction. Participants in the FA condition showed a greater improvement between inductions than those in the NA condition and rated the Mental Demand and Effort subscales of the NASA-TLX (NASA Task Load Index) significantly higher at Time 2 compared with Time 1. Participants in the FA condition used the experience of IB exposure to improve task performance whereas those in the NA condition did not, indicating the availability and reallocation of attentional resources in the FA condition. These findings support the role of engagement in operational attention detriment and the consideration of attentional failure causation to determine appropriate mitigation strategies.

  2. Were they in the loop during automated driving? Links between visual attention and crash potential.

    PubMed

    Louw, Tyron; Madigan, Ruth; Carsten, Oliver; Merat, Natasha

    2017-08-01

    A proposed advantage of vehicle automation is that it relieves drivers of the moment-to-moment demands of driving, allowing them to engage in other, non-driving-related tasks. However, it is important to gain an understanding of drivers' capacity to resume manual control should such a need arise. As automation removes vehicle-control-based measures as a performance indicator, other metrics must be explored. This driving simulator study, conducted under the European Commission (EC) funded AdaptIVe project, assessed drivers' gaze fixations during partially automated (SAE Level 2) driving on approach to critical and non-critical events. Using a between-participants design, 75 drivers experienced automation with one of five out-of-the-loop (OOTL) manipulations, which used different levels of screen visibility and secondary tasks to induce varying levels of engagement with the driving task: 1) no manipulation, 2) manipulation by light fog, 3) manipulation by heavy fog, 4) manipulation by heavy fog plus a visual task, 5) no manipulation plus an n-back task. The OOTL manipulations influenced drivers' first point of gaze fixation after they were asked to attend to an evolving event. These differences resolved within one second, and visual attention allocation adapted with repeated events, yet crash outcome did not differ between OOTL manipulation groups. Drivers who crashed in the first critical event showed an erratic pattern of eye fixations towards the road centre on approach to the event, while those who did not crash demonstrated a more stable pattern. Automated driving systems should be able to direct drivers' attention to hazards no less than 6 seconds in advance of an adverse outcome.

  3. Market Model of Financing Higher Education in Sub-Saharan Africa: Examples from Kenya.

    ERIC Educational Resources Information Center

    Oketch, Moses O.

    2003-01-01

    Examines some of the rationales for financial diversification and partial privatization of state universities in Kenya and the different manifestations of market-driven approaches to university education. Explores whether the market model can address increased demand while maintaining educational quality. (EV)

  4. Kinematics of Tape Recording.

    ERIC Educational Resources Information Center

    Coleman, J. J.

    1982-01-01

    Describes the mathematics of the nonlinear relationship between a constant-speed, capstan-driven magnetic tape transport mechanism and a constant-angular-velocity take-up reel. The relationship, derived from partial sums of a series in recognition of finite tape thickness, can serve as an example of rotational kinematics. (Author/SK)

  5. High-Throughput Screening Enhances Kidney Organoid Differentiation from Human Pluripotent Stem Cells and Enables Automated Multidimensional Phenotyping.

    PubMed

    Czerniecki, Stefan M; Cruz, Nelly M; Harder, Jennifer L; Menon, Rajasree; Annis, James; Otto, Edgar A; Gulieva, Ramila E; Islas, Laura V; Kim, Yong Kyun; Tran, Linh M; Martins, Timothy J; Pippin, Jeffrey W; Fu, Hongxia; Kretzler, Matthias; Shankland, Stuart J; Himmelfarb, Jonathan; Moon, Randall T; Paragas, Neal; Freedman, Benjamin S

    2018-05-15

    Organoids derived from human pluripotent stem cells are a potentially powerful tool for high-throughput screening (HTS), but the complexity of organoid cultures poses a significant challenge for miniaturization and automation. Here, we present a fully automated, HTS-compatible platform for enhanced differentiation and phenotyping of human kidney organoids. The entire 21-day protocol, from plating to differentiation to analysis, can be performed automatically by liquid-handling robots, or alternatively by manual pipetting. High-content imaging analysis reveals both dose-dependent and threshold effects during organoid differentiation. Immunofluorescence and single-cell RNA sequencing identify previously undetected parietal, interstitial, and partially differentiated compartments within organoids and define conditions that greatly expand the vascular endothelium. Chemical modulation of toxicity and disease phenotypes can be quantified for safety and efficacy prediction. Screening in gene-edited organoids in this system reveals an unexpected role for myosin in polycystic kidney disease. Organoids in HTS formats thus establish an attractive platform for multidimensional phenotypic screening. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Utilization of BIM for automation of quantity takeoffs and cost estimation in transport infrastructure construction projects in the Czech Republic

    NASA Astrophysics Data System (ADS)

    Vitásek, Stanislav; Matějka, Petr

    2017-09-01

    The article deals with problematic aspects of the automated processing of quantity takeoffs (QTO) from data generated in a BIM model. It focuses on models of road construction and uses the volumes and dimensions of excavation work to create an estimate of construction costs. The article uses a case study and explorative methods to discuss the possibilities and problems of transferring data from a model to a price system of construction production when such a transfer is used for price estimates of construction works. Current QTOs and price tenders are prepared from 2D documents. This process is becoming obsolete because more modern tools can be used. The BIM phenomenon enables partial automation in processing the volumes and dimensions of construction units and matching the data to units in a given price scheme. The price of construction can therefore be estimated and structured without lengthy and often imprecise manual calculations. The use of BIM for QTO is highly dependent on local market budgeting systems, so a proper push/pull strategy is required. It also requires proper requirements specification, a compatible pricing database, and suitable software.

  7. NASA Tech Briefs, December 2006

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Topics include: Inferring Gear Damage from Oil-Debris and Vibration Data; Forecasting of Storm-Surge Floods Using ADCIRC and Optimized DEMs; User Interactive Software for Analysis of Human Physiological Data; Representation of Serendipitous Scientific Data; Automatic Locking of Laser Frequency to an Absorption Peak; Self-Passivating Lithium/Solid Electrolyte/Iodine Cells; Four-Quadrant Analog Multipliers Using G4-FETs; Noise Source for Calibrating a Microwave Polarimeter; Hybrid Deployable Foam Antennas and Reflectors; Coating MCPs with AlN and GaN; Domed, 40-cm-Diameter Ion Optics for an Ion Thruster; Gesture-Controlled Interfaces for Self-Service Machines; Dynamically Alterable Arrays of Polymorphic Data Types; Identifying Trends in Deep Space Network Monitor Data; Predicting Lifetime of a Thermomechanically Loaded Component; Partial Automation of Requirements Tracing; Automated Synthesis of Architecture of Avionic Systems; SSRL Emergency Response Shore Tool; Wholly Aromatic Ether-Imides as n-Type Semiconductors; Carbon-Nanotube-Carpet Heat-Transfer Pads; Pulse-Flow Microencapsulation System; Automated Low-Gravitation Facility Would Make Optical Fibers; Alignment Cube with One Diffractive Face; Graphite Composite Booms with Integral Hinges; Tool for Sampling Permafrost on a Remote Planet; and Special Semaphore Scheme for UHF Spacecraft Communications.

  8. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.

  9. The Center for Optimized Structural Studies (COSS) platform for automation in cloning, expression, and purification of single proteins and protein-protein complexes.

    PubMed

    Mlynek, Georg; Lehner, Anita; Neuhold, Jana; Leeb, Sarah; Kostan, Julius; Charnagalov, Alexej; Stolt-Bergner, Peggy; Djinović-Carugo, Kristina; Pinotsis, Nikos

    2014-06-01

    Expression in Escherichia coli represents the simplest and most cost-effective means for the production of recombinant proteins. This is a routine task in structural biology and biochemistry, where milligrams of the target protein are required in high purity and monodispersity. To achieve these criteria, the user often needs to screen several constructs in different expression and purification conditions in parallel. We describe a pipeline, implemented in the Center for Optimized Structural Studies, that enables the systematic screening of expression and purification conditions for recombinant proteins and relies on a series of logical decisions. We first use bioinformatics tools to design a series of protein fragments, which we clone in parallel and subsequently screen at small scale for optimal expression and purification conditions. Based on a scoring system that assesses soluble expression, we then select the top-ranking targets for large-scale purification. In establishing our pipeline, emphasis was put on streamlining the processes so that they can be easily, though not necessarily, automated. In a typical run of about 2 weeks, we are able to prepare and perform small-scale expression screens for 20-100 different constructs, followed by large-scale purification of at least 4-6 proteins. The major advantage of our approach is its flexibility, which allows for easy adoption, either partially or entirely, by any average hypothesis-driven laboratory in a manual or robot-assisted manner.

  10. Conceptual biology, hypothesis discovery, and text mining: Swanson's legacy.

    PubMed

    Bekhuis, Tanja

    2006-04-03

    Innovative biomedical librarians and information specialists who want to expand their roles as expert searchers need to know about profound changes in biology and parallel trends in text mining. In recent years, conceptual biology has emerged as a complement to empirical biology. This is partly in response to the availability of massive digital resources, such as the network of databases for molecular biologists at the National Center for Biotechnology Information. Developments in text mining and hypothesis discovery systems based on the early work of Swanson, a mathematician and information scientist, are coincident with the emergence of conceptual biology. Very little has been written to introduce biomedical digital librarians to these new trends. In this paper, background on data and text mining, as well as on knowledge discovery in databases (KDD) and in text (KDT), is presented, followed by a brief review of Swanson's ideas and a discussion of recent approaches to hypothesis discovery and testing. 'Testing' in the context of text mining involves partially automated methods for finding evidence in the literature to support hypothetical relationships. Concluding remarks follow regarding (a) the limits of current strategies for evaluation of hypothesis discovery systems and (b) the role of literature-based discovery in concert with empirical research. A report of an informatics-driven literature review for biomarkers of systemic lupus erythematosus is also mentioned. Swanson's vision of the hidden value in the literature of science and, by extension, in biomedical digital databases, is still remarkably generative for information scientists, biologists, and physicians.

  11. Image segmentation by hierarchical agglomeration of polygons using ecological statistics

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-04-23

    A method for rapid hierarchical image segmentation based on perceptually driven contour completion and scene statistics is disclosed. The method begins with an initial fine-scale segmentation of an image, such as obtained by perceptual completion of partial contours into polygonal regions using region-contour correspondences established by Delaunay triangulation of edge pixels as implemented in VISTA. The resulting polygons are analyzed with respect to their size and color/intensity distributions and the structural properties of their boundaries. Statistical estimates of granularity of size, similarity of color, texture, and saliency of intervening boundaries are computed and formulated into logical (Boolean) predicates. The combined satisfiability of these Boolean predicates by a pair of adjacent polygons at a given segmentation level qualifies them for merging into a larger polygon representing a coarser, larger-scale feature of the pixel image and collectively obtains the next level of polygonal segments in a hierarchy of fine-to-coarse segmentations. The iterative application of this process precipitates textured regions as polygons with highly convolved boundaries and helps distinguish them from objects which typically have more regular boundaries. The method yields a multiscale decomposition of an image into constituent features that enjoy a hierarchical relationship with features at finer and coarser scales. This provides a traversable graph structure from which feature content and context in terms of other features can be derived, aiding in automated image understanding tasks. The method disclosed is highly efficient and can be used to decompose and analyze large images.
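
    One level of the hierarchy can be sketched as predicate-driven merging over a region adjacency structure; here region statistics are reduced to a mean intensity and a size, boundary saliency to a stored edge weight, and all thresholds are placeholders for the ecological statistics described above.

      regions = {1: {"mean": 10.0, "size": 40}, 2: {"mean": 12.0, "size": 35},
                 3: {"mean": 90.0, "size": 50}}
      adjacency = {(1, 2): 0.1, (2, 3): 0.9}   # edge -> boundary saliency

      def mergeable(a, b, saliency, d_tol=5.0, s_tol=0.3):
          """Boolean conjunction of color-similarity and weak-boundary predicates."""
          return (abs(regions[a]["mean"] - regions[b]["mean"]) < d_tol
                  and saliency < s_tol)

      def merge_level():
          for (a, b), sal in list(adjacency.items()):
              if a in regions and b in regions and mergeable(a, b, sal):
                  ra, rb = regions.pop(a), regions.pop(b)
                  tot = ra["size"] + rb["size"]
                  regions[a] = {"mean": (ra["mean"] * ra["size"] +
                                         rb["mean"] * rb["size"]) / tot,
                                "size": tot}
                  for (p, q) in list(adjacency):   # rewire edges of absorbed b
                      if b in (p, q):
                          s = adjacency.pop((p, q))
                          other = p if q == b else q
                          if other != a:
                              adjacency[tuple(sorted((a, other)))] = s

      merge_level()
      print(regions)   # 1 and 2 merged; 3 kept behind its salient boundary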

  12. Development and Testing of an LED-Based Near-Infrared Sensor for Human Kidney Tumor Diagnostics

    PubMed Central

    Zabarylo, Urszula; Kirsanov, Dmitry; Belikova, Valeria; Ageev, Vladimir; Usenov, Iskander; Galyanin, Vladislav; Minet, Olaf; Sakharova, Tatiana; Danielyan, Georgy; Feliksberger, Elena; Artyushenko, Viacheslav

    2017-01-01

    Optical spectroscopy is increasingly used for cancer diagnostics. Tumor detection feasibility in human kidney samples using mid- and near-infrared (NIR) spectroscopy, fluorescence spectroscopy, and Raman spectroscopy has been reported (Artyushenko et al., Spectral fiber sensors for cancer diagnostics in vitro. In Proceedings of the European Conference on Biomedical Optics, Munich, Germany, 21–25 June 2015). In the present work, a simplification of the NIR spectroscopic analysis for cancer diagnostics was studied. The conventional high-resolution NIR spectroscopic method of kidney tumor diagnostics was replaced by a compact optical sensing device consisting of four light-emitting diodes (LEDs) emitting at selected wavelengths and one detecting photodiode. Two sensor prototypes were tested on 14 in vitro clinical samples from 7 different patients. Statistical evaluation of the data using principal component analysis (PCA) and partial least-squares discriminant analysis (PLS-DA) confirmed the general applicability of the LED-based sensing approach to kidney tumor detection. The results were additionally validated by means of sample permutation. PMID:28825612
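
    For readers unfamiliar with PLS-DA, a common way to implement it is PLS regression against a binary class indicator, thresholding the prediction. The sketch below runs this on synthetic 4-channel LED readings; the data, wavelength shifts, and threshold are fabricated for illustration and bear no relation to the paper's measurements.

        # Illustrative PLS-DA on simulated 4-wavelength LED sensor readings.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # 14 samples x 4 LED channels; tumor readings shifted to mimic
        # altered NIR absorption (purely synthetic).
        healthy = rng.normal(loc=1.0, scale=0.05, size=(7, 4))
        tumor = rng.normal(loc=1.1, scale=0.05, size=(7, 4))
        X = np.vstack([healthy, tumor])
        y = np.array([0] * 7 + [1] * 7)      # 0 = healthy, 1 = tumor

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)
        y_pred = (pls.predict(X).ravel() > 0.5).astype(int)  # threshold at 0.5
        print("training accuracy:", (y_pred == y).mean())

    With only 14 samples, as here, permutation testing of the labels (as the authors performed) is an important guard against chance-level separation.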

  13. Fixed-Charge Atomistic Force Fields for Molecular Dynamics Simulations in the Condensed Phase: An Overview.

    PubMed

    Riniker, Sereina

    2018-03-26

    In molecular dynamics or Monte Carlo simulations, the interactions between the particles (atoms) in the system are described by a so-called force field. The empirical functional form of classical fixed-charge force fields dates back to 1969 and remains essentially unchanged. In a fixed-charge force field, the polarization is not modeled explicitly, i.e. the effective partial charges do not change depending on conformation and environment. This simplification allows, however, a dramatic reduction in computational cost compared to polarizable force fields and in particular quantum-chemical modeling. The past decades have shown that simulations employing carefully parametrized fixed-charge force fields can provide useful insights into biological and chemical questions. This overview focuses on the four major force-field families, i.e. AMBER, CHARMM, GROMOS, and OPLS, which are based on the same classical functional form and are continuously improved to the present day. The overview is aimed at readers entering the field of (bio)molecular simulations. More experienced users may find the comparison and historical development of the force-field families interesting.
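
    The classical functional form shared by these four families can be stated compactly; the version below is a representative textbook form (per-family constants, improper-torsion terms, and combination rules differ):

        V(\mathbf{r}) =
          \sum_{\text{bonds}} k_b\,(r - r_0)^2
        + \sum_{\text{angles}} k_\theta\,(\theta - \theta_0)^2
        + \sum_{\text{dihedrals}} \frac{V_n}{2}\bigl[1 + \cos(n\phi - \gamma)\bigr]
        + \sum_{i<j} \left[ 4\varepsilon_{ij}\!\left(\frac{\sigma_{ij}^{12}}{r_{ij}^{12}}
           - \frac{\sigma_{ij}^{6}}{r_{ij}^{6}}\right)
           + \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}} \right]

    The partial charges q_i are the fixed quantities that give these force fields their name: they do not respond to conformation or environment, which is exactly the simplification that makes the Coulomb term so cheap to evaluate.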

  14. A novel method for identification of lithium-ion battery equivalent circuit model parameters considering electrochemical properties

    NASA Astrophysics Data System (ADS)

    Zhang, Xi; Lu, Jinling; Yuan, Shifei; Yang, Jun; Zhou, Xuan

    2017-03-01

    This paper proposes a novel parameter identification method for the lithium-ion (Li-ion) battery equivalent circuit model (ECM) that takes electrochemical properties into account. An improved pseudo two-dimensional (P2D) model is established on the basis of partial differential equations (PDEs), in which the electrolyte potential is simplified from a nonlinear to a linear expression and the terminal voltage is decomposed into the electrolyte potential, open circuit voltage (OCV), electrode overpotentials, internal resistance drop, and so on. Model order reduction is carried out by simplifying the PDEs using the Laplace transform, inverse Laplace transform, Pade approximation, etc. A unified second-order transfer function between cell voltage and current is obtained for comparability with that of the ECM. The final objective is to obtain the relationship between the ECM resistances/capacitances and the electrochemical parameters, so that ECM precision can be improved under various conditions by incorporating the battery's interior properties for further applications, e.g., SOC estimation. Finally, simulation and experimental results confirm the correctness and validity of the proposed methodology.
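
    The ECM side of such a second-order mapping is typically a 2RC Thevenin circuit. The sketch below simulates a generic 2RC model in discrete time, consistent with a second-order voltage-current transfer function; all parameter values and the constant OCV are placeholders, not values identified by the paper's method.

        # Minimal discrete-time simulation of a generic second-order (2RC)
        # Thevenin equivalent circuit model. All parameters are placeholders.
        import numpy as np

        def simulate_ecm(current, dt, ocv=3.7, r0=0.01,
                         r1=0.015, c1=2000.0, r2=0.03, c2=20000.0):
            """Terminal voltage for a current profile (A, discharge positive)."""
            v1 = v2 = 0.0                    # RC branch voltages
            voltage = []
            # Exact discretization of dv/dt = -v/(RC) + i/C over one step
            a1, a2 = np.exp(-dt / (r1 * c1)), np.exp(-dt / (r2 * c2))
            for i in current:
                v1 = a1 * v1 + r1 * (1 - a1) * i
                v2 = a2 * v2 + r2 * (1 - a2) * i
                voltage.append(ocv - i * r0 - v1 - v2)
            return np.array(voltage)

        # 1 A discharge pulse for 100 s, then rest: exposes the two RC time scales
        profile = np.concatenate([np.ones(100), np.zeros(200)])
        v = simulate_ecm(profile, dt=1.0)
        print(v[0], v[99], v[-1])  # instantaneous drop, pulse end, relaxation

    Identification then amounts to relating r0, r1, c1, r2, c2 back to the electrochemical parameters of the reduced P2D model, which is the paper's contribution.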

  15. On the slow dynamics of near-field acoustically levitated objects under high excitation frequencies

    NASA Astrophysics Data System (ADS)

    Ilssar, Dotan; Bucher, Izhak

    2015-10-01

    This paper introduces a simplified analytical model describing the governing dynamics of near-field acoustically levitated objects. The simplification converts the equation of motion, coupled with the partial differential equation of a compressible fluid, into a compact, second-order ordinary differential equation in which the local stiffness and damping are transparent. The simplified model allows one to analyse and design near-field acoustic levitation based systems more easily, and it also helps to devise closed-loop control algorithms for such systems. Near-field acoustic levitation employs fast ultrasonic vibrations of a driving surface and exploits the viscosity and compressibility of a gaseous medium to achieve an average, load-carrying pressure. It is demonstrated that the slow dynamics dominates the transient behaviour, while the time scale associated with the fast, ultrasonic excitation has only a small presence in the oscillations of the levitated object. Indeed, the present paper formulates the slow dynamics under ultrasonic excitation without the need to consider the latter explicitly. The simplified model is compared with a numerical scheme based on the Reynolds equation and with experiments, both showing reasonably good agreement.
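
    In spirit, such a reduced model is a damped second-order ODE for the levitation gap, with the squeeze-film force linearized about equilibrium into a local stiffness and damping. The sketch below integrates a generic model of this kind; all constants are placeholders, not the paper's derived coefficients.

        # Generic sketch of a reduced slow-dynamics model: a levitated mass
        # with the squeeze-film force linearized to local stiffness k and
        # damping c about the equilibrium gap h_eq. Constants are placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, g = 0.01, 9.81     # levitated mass (kg), gravity (m/s^2)
        k, c = 5e3, 2.0       # film stiffness (N/m) and damping (N s/m)
        h_eq = 50e-6          # equilibrium levitation gap (m)

        def slow_dynamics(t, y):
            h, hdot = y
            # Linearized film force balances gravity at the equilibrium gap
            f_film = m * g - k * (h - h_eq) - c * hdot
            return [hdot, (f_film - m * g) / m]

        sol = solve_ivp(slow_dynamics, (0.0, 0.1), [80e-6, 0.0], max_step=1e-4)
        print(f"final gap: {sol.y[0, -1] * 1e6:.1f} um")  # settles near 50 um

    Note that the fast ultrasonic excitation never appears explicitly: its effect is absorbed into k, c, and h_eq, which is exactly the point of the slow-dynamics formulation.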

  16. Simultaneous Bistability of a Qubit and Resonator in Circuit Quantum Electrodynamics

    NASA Astrophysics Data System (ADS)

    Mavrogordatos, Th. K.; Tancredi, G.; Elliott, M.; Peterer, M. J.; Patterson, A.; Rahamim, J.; Leek, P. J.; Ginossar, E.; Szymańska, M. H.

    2017-01-01

    We explore the joint activated dynamics exhibited by two quantum degrees of freedom: a cavity-mode oscillator strongly coupled to a superconducting qubit in the strongly, coherently driven dispersive regime. Dynamical simulations and complementary measurements show a range of parameters in which both the cavity and the qubit exhibit sudden simultaneous switching between two metastable states. This manifests as a partial coherent cancellation in the ensemble-averaged amplitudes of both the cavity and the qubit. Transmission measurements of driven microwave cavities coupled to transmon qubits show detailed features that agree with the theory in the regime of simultaneous switching.
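
    For orientation, the driven dispersive regime referred to here is commonly modeled by the Jaynes-Cummings Hamiltonian expanded to lowest order in the coupling; the standard textbook form is given below, not necessarily the exact model used in the paper:

        H/\hbar = \Delta_c\, a^\dagger a
                + \tfrac{1}{2}\,\Delta_q\, \sigma_z
                + \chi\, a^\dagger a\, \sigma_z
                + \varepsilon\,(a + a^\dagger)

    Here \Delta_c and \Delta_q are the cavity and qubit detunings from the drive, \chi is the dispersive shift that entangles photon number with the qubit state, and \varepsilon is the coherent drive amplitude; the qubit-state-dependent cavity frequency in the \chi term is what allows the two degrees of freedom to switch together.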

  17. Theory of advection-driven long range biotic transport

    USDA-ARS?s Scientific Manuscript database

    We propose a simple mechanistic model to examine the effects of advective flow on the spread of fungal diseases dispersed by wind-blown spores. The model is defined by a set of two coupled non-linear partial differential equations for spore densities. One equation describes the long-distance advectiv...
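
    The truncated abstract points to advection-dominated transport PDEs. As a purely generic illustration of that equation class, the sketch below advances a single 1D advection-diffusion equation for a spore density with an explicit upwind scheme; it is a one-field stand-in, not the authors' coupled two-equation model.

        # Generic 1D advection-diffusion update for a spore density u:
        #   du/dt = -v du/dx + D d2u/dx2
        # First-order upwind for advection (assumes wind speed v > 0),
        # explicit centered differences for diffusion, periodic domain.
        import numpy as np

        def step(u, v, D, dx, dt):
            adv = -v * (u - np.roll(u, 1)) / dx
            dif = D * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
            return u + dt * (adv + dif)

        nx, dx = 200, 1.0
        u = np.zeros(nx); u[10] = 1.0      # point release of spores
        v, D, dt = 2.0, 0.5, 0.2           # wind speed, diffusivity, time step
        # Stability for this explicit scheme: v*dt/dx <= 1 and D*dt/dx^2 <= 0.5
        for _ in range(300):
            u = step(u, v, D, dx, dt)
        print("plume peak location:", int(np.argmax(u)))  # drifted downwind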

  18. Cue Effectiveness in Communicatively Efficient Discourse Production

    ERIC Educational Resources Information Center

    Qian, Ting; Jaeger, T. Florian

    2012-01-01

    Recent years have seen a surge in accounts motivated by information theory that consider language production to be partially driven by a preference for communicative efficiency. Evidence from discourse production (i.e., production beyond the sentence level) has been argued to suggest that speakers distribute information across discourse so as to…

  19. Intention-to-Treat Analysis in Partially Nested Randomized Controlled Trials with Real-World Complexity

    ERIC Educational Resources Information Center

    Schweig, Jonathan David; Pane, John F.

    2016-01-01

    Demands for scientific knowledge of what works in educational policy and practice have driven interest in quantitative investigations of educational outcomes, and randomized controlled trials (RCTs) have proliferated under these conditions. In educational settings, even when individuals are randomized, both experimental and control students are…

  20. Hypothesis-Driven Story Building: Counteracting Human Cognitive Biases to Improve Medical Diagnosis Support

    ERIC Educational Resources Information Center

    Zhu, Shizhuo

    2010-01-01

    Clinical decision-making is challenging mainly because of two factors: (1) patient conditions are often complicated, with partial and changing information; (2) people have cognitive biases in their decision-making and information-seeking. Consequently, misdiagnoses and ineffective use of resources may happen. To better support clinical…
