Sample records for integration time required

  1. Variable Structure PID Control to Prevent Integrator Windup

    NASA Technical Reports Server (NTRS)

    Hall, C. E.; Hodel, A. S.; Hung, J. Y.

    1999-01-01

    PID controllers are frequently used to control systems requiring zero steady-state error while maintaining requirements for settling time and robustness (gain/phase margins). PID controllers suffer significant loss of performance due to short-term integrator wind-up when used in systems with actuator saturation. We examine several existing and proposed methods for the prevention of integrator wind-up in both continuous and discrete time implementations.
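The anti-windup problem summarized above can be sketched with conditional integration, one of the standard remedies such surveys cover. This is a generic illustration, not the authors' variable-structure controller; all gains and limits are hypothetical.

```python
def pid_step(error, integral, prev_error, dt, kp, ki, kd, u_min, u_max):
    """One discrete PID update with conditional-integration anti-windup:
    the integrator is frozen while the actuator is saturated, unless the
    error would drive the output back toward the linear range."""
    derivative = (error - prev_error) / dt
    u_unsat = kp * error + ki * integral + kd * derivative
    u = max(u_min, min(u_max, u_unsat))  # actuator saturation
    saturated = (u != u_unsat)
    recovering = (u_unsat > u_max and error < 0) or (u_unsat < u_min and error > 0)
    if not saturated or recovering:
        integral += error * dt           # accumulate only when safe
    return u, integral
```

With a large error the output clamps at `u_max` and the integral stays put, so the controller recovers quickly once the error changes sign instead of unwinding a large accumulated term.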

  2. 40 CFR 147.3109 - Timing of mechanical integrity test.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...Timing of mechanical integrity test... Certain Oklahoma Indian Tribes § 147.3109 Timing of mechanical integrity test. The demonstrations of mechanical integrity required by § 146.14(b)(2) of this chapter prior to approval for the operation of a...

  3. A new generation of real-time DOS technology for mission-oriented system integration and operation

    NASA Technical Reports Server (NTRS)

    Jensen, E. Douglas

    1988-01-01

    Information is given on system integration and operation (SIO) requirements and a new generation of technical approaches for SIO. Real-time, distribution, survivability, and adaptability requirements and technical approaches are covered. An Alpha operating system program management overview is outlined.

  4. Internal Cargo Integration

    NASA Technical Reports Server (NTRS)

    Hart, Angela

    2006-01-01

    A description of internal cargo integration is presented. The topics include: 1) Typical Cargo for Launch/Disposal; 2) Cargo Delivery Requirements; 3) Cargo Return Requirements; and 4) Vehicle On-Orbit Stay Time.

  5. Real-Time Simulation of Ares I Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick; Matras, Alex; Wilson, Heath; Alday, Nathan; Walker, David; Betts, Kevin; Hughes, Ryan; Turbe, Michael

    2009-01-01

    The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory (SIL) at the Marshall Space Flight Center (MSFC). The primary purpose of the Ares SIL is to test the vehicle avionics hardware and software in a hardware-in-the-loop (HWIL) environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time software backbone to stimulate all required Ares components through high-fidelity simulation. ARTEMIS has been designed to take full advantage of the advances in underlying computational power now available to support HWIL testing. A modular real-time design relying on a fully distributed computing architecture has been achieved. Two fundamental requirements drove ARTEMIS to pursue the use of high-fidelity simulation models in a real-time environment. First, ARTEMIS must be used to test a man-rated integrated avionics hardware and software system, thus requiring a wide variety of nominal and off-nominal simulation capabilities to certify system robustness. The second driving requirement - derived from a nationwide review of current state-of-the-art HWIL facilities - was that preserving digital model fidelity significantly reduced overall vehicle lifecycle cost by reducing testing time for certification runs and increasing flight tempo through an expanded operational envelope. These two driving requirements necessitated the use of high-fidelity models throughout the ARTEMIS simulation. The nature of the Ares mission profile imposed a variety of additional requirements on the ARTEMIS simulation. The Ares I vehicle is composed of multiple elements, including the First Stage Solid Rocket Booster (SRB), the Upper Stage powered by the J-2X engine, the Orion Crew Exploration Vehicle (CEV) which houses the crew, the Launch Abort System (LAS), and various secondary elements that separate from the vehicle.
At launch, the integrated vehicle stack is composed of these stages, and throughout the mission, various elements separate from the integrated stack and tumble back towards the earth. ARTEMIS must be capable of simulating the integrated stack through the flight as well as propagating each individual element after separation. In addition, abort sequences can lead to other unique configurations of the integrated stack as the timing and sequence of the stage separations are altered.

  6. The importance of decision onset

    PubMed Central

    Grinband, Jack; Ferrera, Vincent

    2015-01-01

    The neural mechanisms of decision making are thought to require the integration of evidence over time until a response threshold is reached. Much work suggests that response threshold can be adjusted via top-down control as a function of speed or accuracy requirements. In contrast, the time of integration onset has received less attention and is believed to be determined mostly by afferent or preprocessing delays. However, a number of influential studies over the past decade challenge this assumption and begin to paint a multifaceted view of the phenomenology of decision onset. This review highlights the challenges involved in initiating the integration of evidence at the optimal time and the potential benefits of adjusting integration onset to task demands. The review outlines behavioral and electrophysiological studies suggesting that the onset of the integration process may depend on properties of the stimulus, the task, attention, and response strategy. Most importantly, the aggregate findings in the literature suggest that integration onset may be amenable to top-down regulation, and may be adjusted much like response threshold to exert cognitive control and strategically optimize the decision process to fit immediate behavioral requirements. PMID:26609111
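The review's central variable, integration onset, can be made concrete with a toy accumulator model (purely illustrative; the drift, threshold, and onset values here are made up, not taken from the paper):

```python
def decision_time(drift, threshold, onset, dt=0.001, t_max=10.0):
    """Deterministic evidence accumulator: integration starts at `onset`,
    and the decision time is when accumulated evidence first reaches
    `threshold`. Returns None if the threshold is never reached."""
    t, evidence = 0.0, 0.0
    while t < t_max:
        if t >= onset:                 # evidence integration begins here
            evidence += drift * dt
        if evidence >= threshold:
            return t
        t += dt
    return None
```

Delaying onset by 0.2 s shifts the decision time by the same 0.2 s, so onset trades off directly against threshold in determining response time.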

  7. Integrated Analysis Tools for Determination of Structural Integrity and Durability of High temperature Polymer Matrix Composites

    DTIC Science & Technology

    2008-08-18

    fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. Tools and...developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, time varying...degradation processes.

  8. Space station needs, attributes, and architectural options study. Volume 1: Missions and requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Science and applications, NOAA environmental observation, commercial resource observations, commercial space processing, commercial communications, national security, technology development, and GEO servicing are addressed. Approach to time phasing of mission requirements, system sizing summary, time-phased user mission payload support, space station facility requirements, and integrated time-phased system requirements are also addressed.

  9. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  10. Two-dimensional Euler and Navier-Stokes Time accurate simulations of fan rotor flows

    NASA Technical Reports Server (NTRS)

    Boretti, A. A.

    1990-01-01

    Two numerical methods are presented which describe the unsteady flow field in the blade-to-blade plane of an axial fan rotor. These methods solve the compressible, time-dependent, Euler and the compressible, turbulent, time-dependent, Navier-Stokes conservation equations for mass, momentum, and energy. The Navier-Stokes equations are written in Favre-averaged form and are closed with an approximate two-equation turbulence model with low Reynolds number and compressibility effects included. The unsteady aerodynamic component is obtained by superposing inflow or outflow unsteadiness on the steady conditions through time-dependent boundary conditions. The integration in space is performed by using a finite volume scheme, and the integration in time is performed by using k-stage Runge-Kutta schemes, k = 2, ..., 5. The numerical integration algorithm allows the reduction of the computational cost of an unsteady simulation involving high frequency disturbances in both CPU time and memory requirements. Less than 200 sec of CPU time are required to advance the Euler equations on a computational grid made up of about 2000 grid points through 10,000 time steps on a CRAY Y-MP computer, with a required memory of less than 0.3 megawords.
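The k-stage Runge-Kutta schemes mentioned in this abstract are commonly written in the low-storage Jameson form u^(j) = u^n + (Δt/(k-j+1)) f(u^(j-1)). A hedged scalar sketch of that form (not the paper's flow solver):

```python
def rk_k_stage(f, u, dt, k):
    """One k-stage Runge-Kutta step with Jameson coefficients
    alpha_j = 1/(k-j+1); only the previous stage value is stored."""
    u0, uj = u, u
    for j in range(1, k + 1):
        alpha = 1.0 / (k - j + 1)
        uj = u0 + alpha * dt * f(uj)   # stage update from the base state
    return uj
```

For a linear right-hand side the k = 4 scheme reproduces the classical fourth-order stability polynomial, which is why such schemes are popular for advancing flow equations at low memory cost.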

  11. IDENTIFICATION OF TIME-INTEGRATED SAMPLING AND MEASUREMENT TECHNIQUES TO SUPPORT HUMAN EXPOSURE STUDIES

    EPA Science Inventory

    Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Long-term, time-integrated exposure measures would be desirable to address the problem of developing appropriate residential childhood exposure classifications. ...

  12. Cluster of Sound Speed Fields by an Integral Measure

    DTIC Science & Technology

    2010-06-01

    the same cost in time. Increasing the number of sensor depths does not cause execution time to increase. And finally assume that the time required...to be P = Z − ∫₀ᵇ [∂C(ρ, θ, λ)/∂ρ]² dρ (2), where (ρ, θ, λ) are the usual geocentric spherical coordinates, and the limits of integration...but using spherical coordinates requires that the horizontal (θ, λ) terms be normalized by the radius. In the case of geocentric coordinates this

  13. On time discretizations for spectral methods. [numerical integration of Fourier and Chebyshev methods for dynamic partial differential equations

    NASA Technical Reports Server (NTRS)

    Gottlieb, D.; Turkel, E.

    1980-01-01

    New methods are introduced for the time integration of the Fourier and Chebyshev methods of solution for dynamic differential equations. These methods are unconditionally stable, even though no matrix inversions are required. Time steps are chosen by accuracy requirements alone. For the Fourier method both leapfrog and Runge-Kutta methods are considered. For the Chebyshev method only Runge-Kutta schemes are tested. Numerical calculations are presented to verify the analytic results. Applications to the shallow water equations are presented.

  14. Integrity Analysis of Real-Time Ppp Technique with Igs-Rts Service for Maritime Navigation

    NASA Astrophysics Data System (ADS)

    El-Diasty, M.

    2017-10-01

    Open sea and inland waterways are the most widely used mode for transporting goods worldwide. It is the International Maritime Organization (IMO) that defines the requirements for position fixing equipment for a worldwide radio-navigation system, in terms of accuracy, integrity, continuity, availability and coverage for the various phases of navigation. Satellite positioning systems can contribute to meeting these requirements, as well as optimize marine transportation. Marine navigation usually consists of three major phases identified as Ocean/Coastal/Port approach/Inland waterway, in-port navigation, and automatic docking, with alert limits ranging from 25 m to 0.25 m. GPS positioning is widely used for many applications and is currently recognized by IMO for future maritime navigation. With the advancement in autonomous GPS positioning techniques such as Precise Point Positioning (PPP) and with the advent of new real-time GNSS correction services such as the IGS Real-Time Service (RTS), it is necessary to investigate the integrity of the PPP-based positioning technique along with the IGS-RTS service in terms of availability and reliability for safe navigation in maritime applications. This paper monitors the integrity of an autonomous real-time PPP-based GPS positioning system using the IGS real-time service (RTS) for maritime applications that require a minimum availability of integrity of 99.8 % to fulfil the IMO integrity standards. To examine the integrity of the real-time IGS-RTS PPP-based technique for maritime applications, kinematic data from a dual-frequency GPS receiver were collected onboard a vessel and processed with the real-time IGS-RTS PPP-based GPS positioning technique.
    It is shown that the availability of integrity of the real-time IGS-RTS PPP-based GPS solution is 100 % for all navigation phases and therefore fulfils the IMO integrity standards (99.8 % availability) immediately (after 1 second), after 2 minutes, and after 42 minutes of convergence time for Ocean/Coastal/Port approach/Inland waterway, in-port navigation, and automatic docking, respectively. Moreover, misleading information occurs for about 2 % of epochs in all navigation phases; such epochs are considered less safe but not an immediate danger, because the horizontal position error remains below the navigation alert limits.

  15. Compensation for Blur Requires Increase in Field of View and Viewing Time

    PubMed Central

    Kwon, MiYoung; Liu, Rong; Chien, Lillian

    2016-01-01

    Spatial resolution is an important factor for human pattern recognition. In particular, low resolution (blur) is a defining characteristic of low vision. Here, we examined spatial (field of view) and temporal (stimulus duration) requirements for blurry object recognition. The spatial resolution of an image, such as a letter or face, was manipulated with a low-pass filter. In experiment 1, studying the spatial requirement, observers viewed a fixed-size object through a window of varying sizes, which was repositioned until object identification (moving window paradigm). The field of view requirement, quantified as the number of “views” (window repositions) for correct recognition, was obtained for three blur levels, including no blur. In experiment 2, studying the temporal requirement, we determined threshold viewing time, the stimulus duration yielding criterion recognition accuracy, at six blur levels, including no blur. For letter and face recognition, we found blur significantly increased the number of views, suggesting a larger field of view is required to recognize blurry objects. We also found blur significantly increased threshold viewing time, suggesting longer temporal integration is necessary to recognize blurry objects. The temporal integration reflects the tradeoff between stimulus intensity and time. While humans excel at recognizing blurry objects, our findings suggest compensating for blur requires an increased field of view and viewing time. The need for larger spatial and longer temporal integration for recognizing blurry objects may further challenge object recognition in low vision. Thus, interactions between blur and field of view should be considered when developing low vision rehabilitation or assistive aids. PMID:27622710
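The low-pass manipulation described in this abstract can be illustrated in one dimension (an assumed stand-in, not the study's exact filter or parameters): convolving with a normalized Gaussian kernel removes high-frequency detail, and a larger sigma means more blur.

```python
import math

def gaussian_blur_1d(signal, sigma, radius=None):
    """Low-pass a 1-D signal by convolution with a normalized Gaussian
    kernel; edges are handled by clamping to the boundary samples."""
    radius = radius if radius is not None else int(3 * sigma)
    kernel = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]     # normalize so flat regions are preserved
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(signal) - 1)  # clamp edges
            acc += k * signal[idx]
        out.append(acc)
    return out
```

Applied to a sharp step, the filter spreads the edge over roughly ±3 sigma samples, which is the sense in which blur demands a larger spatial window for recognition.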

  16. 7 CFR 372.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and by promoting the effective, efficient integration of all relevant environmental requirements under... parts 1500-1508), especially provisions pertaining to timing (§ 1502.5), integration (§ 1502.25), and...

  17. Development and application of a local linearization algorithm for the integration of quaternion rate equations in real-time flight simulation problems

    NASA Technical Reports Server (NTRS)

    Barker, L. E., Jr.; Bowles, R. L.; Williams, L. H.

    1973-01-01

    High angular rates encountered in real-time flight simulation problems may require a more stable and accurate integration method than the classical methods normally used. A study was made to develop a general local linearization procedure of integrating dynamic system equations when using a digital computer in real-time. The procedure is specifically applied to the integration of the quaternion rate equations. For this application, results are compared to a classical second-order method. The local linearization approach is shown to have desirable stability characteristics and gives significant improvement in accuracy over the classical second-order integration methods.
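For a body rate held constant over one step, the quaternion rate equation q̇ = ½ q ⊗ ω integrates exactly to a rotation by |ω|Δt, which is the flavor of locally linearized update this abstract contrasts with a classical second-order method. A hedged sketch (not the authors' algorithm; the (w, x, y, z) component order is an assumption):

```python
import math

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def propagate(q, w, dt):
    """Advance attitude quaternion q by constant body rate w (rad/s) over dt,
    using the closed-form rotation quaternion for the step."""
    wx, wy, wz = w
    mag = math.sqrt(wx*wx + wy*wy + wz*wz)
    if mag == 0.0:
        return q
    half = 0.5 * mag * dt
    s = math.sin(half) / mag
    return quat_mul(q, (math.cos(half), wx*s, wy*s, wz*s))
```

Unlike a plain Euler step, this update keeps the quaternion on the unit sphere regardless of how large |ω|Δt is, which is the stability property that matters at high angular rates.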

  18. Integration of Temporal and Ordinal Information During Serial Interception Sequence Learning

    PubMed Central

    Gobel, Eric W.; Sanchez, Daniel J.; Reber, Paul J.

    2011-01-01

    The expression of expert motor skills typically involves learning to perform a precisely timed sequence of movements (e.g., language production, music performance, athletic skills). Research examining incidental sequence learning has previously relied on a perceptually-cued task that gives participants exposure to repeating motor sequences but does not require timing of responses for accuracy. Using a novel perceptual-motor sequence learning task, learning a precisely timed cued sequence of motor actions is shown to occur without explicit instruction. Participants learned a repeating sequence through practice and showed sequence-specific knowledge via a performance decrement when switched to an unfamiliar sequence. In a second experiment, the integration of representation of action order and timing sequence knowledge was examined. When either action order or timing sequence information was selectively disrupted, performance was reduced to levels similar to completely novel sequences. Unlike prior sequence-learning research that has found timing information to be secondary to learning action sequences, when the task demands require accurate action and timing information, an integrated representation of these types of information is acquired. These results provide the first evidence for incidental learning of fully integrated action and timing sequence information in the absence of an independent representation of action order, and suggest that this integrative mechanism may play a material role in the acquisition of complex motor skills. PMID:21417511

  19. Integrable Time-Dependent Quantum Hamiltonians

    NASA Astrophysics Data System (ADS)

    Sinitsyn, Nikolai A.; Yuzbashyan, Emil A.; Chernyak, Vladimir Y.; Patra, Aniket; Sun, Chen

    2018-05-01

    We formulate a set of conditions under which the nonstationary Schrödinger equation with a time-dependent Hamiltonian is exactly solvable analytically. The main requirement is the existence of a non-Abelian gauge field with zero curvature in the space of system parameters. Known solvable multistate Landau-Zener models satisfy these conditions. Our method provides a strategy to incorporate time dependence into various quantum integrable models while maintaining their integrability. We also validate some prior conjectures, including the solution of the driven generalized Tavis-Cummings model.

  20. ITOHealth: a multimodal middleware-oriented integrated architecture for discovering medical entities.

    PubMed

    Alor-Hernández, Giner; Sánchez-Cervantes, José Luis; Juárez-Martínez, Ulises; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Aguilar-Laserre, Alberto

    2012-03-01

    Emergency healthcare is one of the emerging application domains for information services, and it requires highly multimodal information services. The time consumed by the pre-hospital emergency process is critical; therefore, minimizing the time required to provide primary care and consultation to patients is one of the crucial factors in improving healthcare delivery in emergency situations. In this sense, the dynamic location of medical entities is a complex, time-consuming process, and it can be critical when a person requires medical attention. This work presents a multimodal location-based system for locating and assigning medical entities called ITOHealth. ITOHealth provides a multimodal middleware-oriented integrated architecture based on a service-oriented architecture, delivering information about medical entities to mobile devices and web browsers through enriched, multimodality-enabled interfaces. ITOHealth's multimodality is based on the use of Microsoft Agent Characters, the integration of natural-language voice with the characters, and multi-language and multi-character support, providing an advantage for users with visual impairments.

  1. Integrating Science in Agricultural Education: Attitudes of Indiana Agricultural Science and Business Teachers.

    ERIC Educational Resources Information Center

    Balschweid, Mark A.; Thompson, Gregory W.

    2002-01-01

    In a survey of Indiana agriscience and business teachers (n=170), one-half reported their students receive science credit for agriscience and business courses; they felt prepared to teach integrated biological sciences; and integration required more preparation time. They needed appropriate equipment and adequate funding to support integration.…

  2. Design of time-pulse coded optoelectronic neuronal elements for nonlinear transformation and integration

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2008-03-01

    The paper demonstrates the relevance of neurophysiologically motivated neuron arrays with flexibly programmable functions and operations, with the possibility of selecting the required accuracy and type of nonlinear transformation and learning. We consider neuron designs and simulation results for multichannel spatio-temporal algebraic accumulation and integration of optical signals, and show the advantages for nonlinear transformation and summation-integration. The proposed circuits are simple and can have intelligent properties such as learning and adaptation. The integrator-neuron is based on CMOS current mirrors and comparators. Performance figures: power consumption 100...500 μW; signal period 0.1...1 ms; input optical signal power 0.2...20 μW; time delays less than 1 μs; number of optical signals 2...10; integration time 10...100 signal periods; integration accuracy (error) about 1%. Various modifications of the neuron-integrators with improved performance for different applications are considered in the paper.

  3. Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.

    2014-12-01

    Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. 
We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.

  4. Multi-objective group scheduling optimization integrated with preventive maintenance

    NASA Astrophysics Data System (ADS)

    Liao, Wenzhu; Zhang, Xiufang; Jiang, Min

    2017-11-01

    This article proposes a single-machine-based integration model to meet the requirements of production scheduling and preventive maintenance in group production. To describe the production of identical/similar and different jobs, this integrated model considers learning and forgetting effects. Based on machine degradation, the deterioration effect is also considered. Moreover, perfect maintenance and minimal repair are adopted in this integrated model. A multi-objective formulation that minimizes total completion time and maintenance cost is used to meet the dual requirements of delivery date and cost. Finally, a genetic algorithm is developed to solve this optimization model, and the computational results demonstrate that the integrated model is effective and reliable.

  5. Semi-implicit time integration of atmospheric flows with characteristic-based flux partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Debojyoti; Constantinescu, Emil M.

    2016-06-23

    Here, this paper presents a characteristic-based flux partitioning for the semi-implicit time integration of atmospheric flows. Nonhydrostatic models require the solution of the compressible Euler equations. The acoustic time scale is significantly faster than the advective scale, yet it is typically not relevant to atmospheric and weather phenomena. The acoustic and advective components of the hyperbolic flux are separated in the characteristic space. High-order, conservative additive Runge-Kutta methods are applied to the partitioned equations so that the acoustic component is integrated in time implicitly with an unconditionally stable method, while the advective component is integrated explicitly. The time step of the overall algorithm is thus determined by the advective scale. Benchmark flow problems are used to demonstrate the accuracy, stability, and convergence of the proposed algorithm. The computational cost of the partitioned semi-implicit approach is compared with that of explicit time integration.
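The implicit-explicit partitioning can be illustrated on a scalar toy ODE u' = λ_fast·u + g(u), treating the stiff linear ("acoustic-like") term implicitly and the slow ("advective-like") term explicitly. All values are hypothetical, and this first-order IMEX Euler stands in for the paper's high-order additive Runge-Kutta methods:

```python
def imex_euler_step(u, dt, lam_fast, g):
    """One IMEX Euler step for u' = lam_fast*u + g(u):
    backward Euler on the stiff linear term, forward Euler on g.
    Solves (u_next - u)/dt = lam_fast*u_next + g(u) for u_next."""
    return (u + dt * g(u)) / (1.0 - dt * lam_fast)
```

With λ_fast = −1000, a fully explicit step would need Δt < 0.002 for stability, while the IMEX step stays stable and decaying at Δt = 0.1, so the slow dynamics alone set the step size, mirroring the paper's design goal.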

  6. Integrals for IBS and beam cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A. (Fermilab)

    Simulation of beam cooling usually requires performing certain integral transformations every time step or so, which is a significant burden on the CPU. Examples are the dispersion integrals (Hilbert transforms) in the stochastic cooling, wake fields and IBS integrals. An original method is suggested for fast and sufficiently accurate computation of the integrals. This method is applied for the dispersion integral. Some methodical aspects of the IBS analysis are discussed.
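The dispersion (Hilbert-transform) integrals mentioned above are a classic case where a transform-domain evaluation beats repeated quadrature: via the FFT the cost is O(N log N) per time step. A hedged sketch of the analytic-signal construction, using a plain O(N²) DFT so the example stays dependency-free (this is a standard method, not necessarily the author's "original method"):

```python
import cmath

def dft(x, sign=-1):
    """Naive DFT (sign=-1 forward, sign=+1 inverse without 1/n scaling);
    an FFT would be used in practice."""
    n = len(x)
    return [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def hilbert(x):
    """Discrete Hilbert transform of a real sequence via the analytic
    signal: zero the negative frequencies, double the positive ones."""
    n = len(x)
    X = dft(x, sign=-1)
    H = [0.0] * n
    H[0] = 1.0                      # keep DC
    for j in range(1, n // 2):
        H[j] = 2.0                  # double positive frequencies
    if n % 2 == 0:
        H[n // 2] = 1.0             # keep Nyquist
    Xa = [Xj * Hj for Xj, Hj in zip(X, H)]
    xa = [v / n for v in dft(Xa, sign=+1)]
    return [v.imag for v in xa]     # imaginary part of the analytic signal
```

For x[k] = cos(2πk/N) the routine returns sin(2πk/N), the textbook Hilbert pair.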

  8. Time Integrating Optical Signal Processing

    DTIC Science & Technology

    1981-07-01

    advantage of greatly reducing the bandwidth requirement for the memory feeding the second cell. For a system composed of a PbMoO4 and a (TeO2)s Bragg cell...bounds. (TeO2)L and (TeO2)s represent, respectively, the longitudinal and slow shear modes of TeO2...could be implemented with a 25 mm TeO2 device operated in the longitudinal mode in a hybrid system. A purely time-integrating system would require about

  9. Advancements in Wind Integration Study Data Modeling: The Wind Integration National Dataset (WIND) Toolkit; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draxl, C.; Hodge, B. M.; Orwig, K.

    2013-10-01

    Regional wind integration studies in the United States require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind data sets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.

  10. 49 CFR Appendix E to Part 192 - Guidance on Determining High Consequence Areas and on Carrying out Requirements in the Integrity...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... addressing time dependent and independent threats for a transmission pipeline operating below 30% SMYS not in... pipeline system are covered for purposes of the integrity management program requirements, an operator must... system, or an operator may apply one method to individual portions of the pipeline system. (Refer to...

  11. The Impact of Training on the Time Required to Implement Technology in the Classroom

    ERIC Educational Resources Information Center

    Stevens, Troy

    2014-01-01

    Many teachers are using technology to improve student achievement, but only a few are attaining an improvement in student achievement. The purpose of this quantitative study was to identify: (1) how much time teachers spend integrating technology into their classroom, (2) how much time teachers believe is required to maximize the effectiveness of…

  12. Image/Time Series Mining Algorithms: Applications to Developmental Biology, Document Processing and Data Streams

    ERIC Educational Resources Information Center

    Tataw, Oben Moses

    2013-01-01

    Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…

  13. Formulation of an explicit-multiple-time-step time integration method for use in a global primitive equation grid model

    NASA Technical Reports Server (NTRS)

    Chao, W. C.

    1982-01-01

    With appropriate modifications, a recently proposed explicit-multiple-time-step scheme (EMTSS) is incorporated into the UCLA model. In this scheme, the linearized terms in the governing equations that generate the gravity waves are split into different vertical modes. Each mode is integrated with an optimal time step, and at periodic intervals these modes are recombined. The other terms are integrated with a time step dictated by the CFL condition for low-frequency waves. This large time step requires a special modification of the advective terms in the polar region to maintain stability. Test runs for 72 h show that EMTSS is a stable, efficient and accurate scheme.
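
    The split integration idea above can be illustrated with a toy multirate scheme: the CFL-limited (fast) term is sub-cycled with a small step while the remaining (slow) tendency is held frozen over a large step. This is a hedged sketch of the general multiple-time-step idea, not the EMTSS algorithm itself; the test equation and step sizes are illustrative.

```python
def multirate_euler(f_slow, f_fast, y0, t_end, H, m):
    """One-way multirate forward Euler: the slow tendency is evaluated
    once per macro step H and held frozen while the fast (CFL-limited)
    term is sub-cycled with m small steps h = H/m."""
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        s = f_slow(t, y)           # frozen slow tendency for this macro step
        h = H / m
        for i in range(m):
            y += h * (s + f_fast(t + i * h, y))
        t += H
    return y

# stiff fast term lam = -50 (explicit stability needs h < 2/50), slow forcing +5
y = multirate_euler(lambda t, y: 5.0, lambda t, y: -50.0 * y,
                    y0=0.0, t_end=1.0, H=0.1, m=50)
print(y)  # near the steady state 5/50 = 0.1
```

    With m = 1 (plain forward Euler at the macro step) the same problem is unstable, which is exactly why the fast modes need their own smaller step.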

  14. hp-Adaptive time integration based on the BDF for viscous flows

    NASA Astrophysics Data System (ADS)

    Hay, A.; Etienne, S.; Pelletier, D.; Garon, A.

    2015-06-01

    This paper presents a procedure based on the Backward Differentiation Formulas of order 1 to 5 to obtain efficient time integration of the incompressible Navier-Stokes equations. The adaptive algorithm performs both stepsize and order selections to control respectively the solution accuracy and the computational efficiency of the time integration process. The stepsize selection (h-adaptivity) is based on a local error estimate and an error controller to guarantee that the numerical solution accuracy is within a user-prescribed tolerance. The order selection (p-adaptivity) relies on the idea that low-accuracy solutions can be computed efficiently by low order time integrators, while accurate solutions require high order time integrators to keep computational time low. The selection is based on a stability test that detects growing numerical noise and deems a method of order p stable if there is no method of lower order that delivers the same solution accuracy for a larger stepsize. Hence, it guarantees both that (1) the method of integration in use operates inside its stability region and (2) the time integration procedure is computationally efficient. The proposed time integration procedure also features time-step rejection and quarantine mechanisms, a modified Newton method with a predictor, and dense output techniques to compute the solution at off-step points.
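
    The h-adaptive part of such a procedure can be sketched with BDF1 (backward Euler), a step-doubling local error estimate, and a standard error controller. This is a minimal illustration of stepsize control only, not the paper's order-adaptive BDF1-5 algorithm; the tolerance, controller constants, and test problem are assumptions.

```python
import math

def backward_euler_step(f, dfdy, t, y, h, iters=8):
    """One BDF1 (backward Euler) step via Newton iteration on
    g(Y) = Y - y - h*f(t+h, Y) = 0 (scalar problem)."""
    Y = y
    for _ in range(iters):
        g = Y - y - h * f(t + h, Y)
        dg = 1.0 - h * dfdy(t + h, Y)
        Y -= g / dg
    return Y

def integrate_adaptive(f, dfdy, t0, y0, t_end, h0=0.1, tol=1e-4):
    """h-adaptive BDF1: the local error is estimated by comparing one
    full step with two half steps (step doubling), and the stepsize is
    rescaled as h*(tol/err)**(1/2) since BDF1 is first order."""
    t, y, h = t0, y0, h0
    while t < t_end:
        h = min(h, t_end - t)
        y_full = backward_euler_step(f, dfdy, t, y, h)
        y_half = backward_euler_step(f, dfdy, t, y, h / 2)
        y_half = backward_euler_step(f, dfdy, t + h / 2, y_half, h / 2)
        err = abs(y_full - y_half)
        if err <= tol:           # accept the step
            t, y = t + h, y_half
        # rescale the stepsize (a rejected step is simply retried)
        h *= min(5.0, max(0.2, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
    return y

# test problem y' = -y, y(0) = 1, exact solution exp(-t)
y = integrate_adaptive(lambda t, y: -y, lambda t, y: -1.0, 0.0, 1.0, 1.0)
print(y)  # close to exp(-1)
```

    The controller exponent would change to 1/(p+1) for a method of order p, which is where the p-adaptivity of the paper plugs in.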

  15. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    PubMed Central

    Ehsan, Shoaib; Clark, Adrian F.; ur Rehman, Naveed; McDonald-Maier, Klaus D.

    2015-01-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image consists only of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems. PMID:26184211
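
    For reference, the serial recursion that such hardware algorithms decompose is the standard integral image relation ii(x,y) = i(x,y) + ii(x-1,y) + ii(x,y-1) - ii(x-1,y-1), after which any rectangular sum needs only four lookups. A minimal software sketch (not the proposed row-parallel hardware design):

```python
def integral_image(img):
    """Compute the integral image ii, where ii[y][x] is the sum of all
    pixels img[j][i] with j <= y and i <= x (serial recursion)."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def box_sum(ii, x0, y0, x1, y1):
    """Sum over the rectangle [x0..x1] x [y0..y1] from four ii lookups,
    in constant time regardless of the rectangle (filter) size."""
    s = ii[y1][x1]
    if x0 > 0: s -= ii[y1][x0 - 1]
    if y0 > 0: s -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0: s += ii[y0 - 1][x0 - 1]
    return s

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
ii = integral_image(img)
print(box_sum(ii, 1, 1, 2, 2))  # 5+6+8+9 = 28
```

    The serial data dependence of `ii[y][x]` on its left and upper neighbors is exactly what makes the row-parallel decomposition in the paper nontrivial.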

  16. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    PubMed

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image consists only of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  17. Comparisons of discrete and integrative sampling accuracy in estimating pulsed aquatic exposures.

    PubMed

    Morrison, Shane A; Luttbeg, Barney; Belden, Jason B

    2016-11-01

    Most current-use pesticides have short half-lives in the water column, and thus the most relevant exposure scenarios for many aquatic organisms are pulsed exposures. Quantifying exposure using discrete water samples may not be accurate, as few studies are able to sample frequently enough to determine time-weighted average (TWA) concentrations of short aquatic exposures. Integrative sampling methods that continuously sample freely dissolved contaminants over time intervals (such as integrative passive samplers) have been demonstrated to be a promising measurement technique. We conducted several modeling scenarios to test the assumption that integrative methods may require far fewer samples for accurate estimation of peak 96-h TWA concentrations. We compared the accuracies of discrete point samples and integrative samples while varying sampling frequencies across a range of contaminant water half-lives (t50 = 0.5, 2, and 8 d). Differences in the predictive accuracy of discrete point samples and integrative samples were greatest at low sampling frequencies. For example, when the half-life was 0.5 d, discrete point samples required 7 sampling events to ensure median values > 50% of the true 96-h TWA and no sampling events reporting highly inaccurate results (defined as < 10% of the true 96-h TWA). Across all water half-lives investigated, integrative sampling required only two samples to prevent highly inaccurate results and to yield median values > 50% of the true concentration. Regardless, the need for integrative sampling diminished as water half-life increased. For an 8-d water half-life, two discrete samples produced accurate estimates and median values greater than those obtained from two integrative samples. Overall, integrative methods are the more accurate approach for monitoring contaminants with short water half-lives due to the reduced frequency of extreme values, especially with uncertainties around the timing of pulsed events. However, the acceptability of discrete sampling methods for providing accurate concentration measurements increases with increasing aquatic half-lives. Copyright © 2016 Elsevier Ltd. All rights reserved.
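
    The contrast between grab samples and integrative samplers can be sketched for a single pulse with first-order decay. The concentration, half-life, and the assumption of an ideal integrative sampler (each deployment accumulates the exact integral over its interval) are illustrative, not the study's actual model:

```python
import math

HALF_LIFE = 12.0          # hours (hypothetical, short-lived contaminant)
K = math.log(2) / HALF_LIFE

def concentration(t, c0=100.0):
    """Water concentration after a single pulse at t = 0 decaying
    first order: C(t) = C0 * exp(-k t)."""
    return c0 * math.exp(-K * t)

def true_twa(t_end=96.0, c0=100.0):
    """Exact 96-h time-weighted average: (1/T) * integral of C(t) dt."""
    return c0 * (1 - math.exp(-K * t_end)) / (K * t_end)

def discrete_twa(n, t_end=96.0):
    """TWA estimated as the mean of n evenly spaced grab samples."""
    times = [i * t_end / (n - 1) for i in range(n)]
    return sum(concentration(t) for t in times) / n

def integrative_twa(n, t_end=96.0):
    """TWA from n consecutive ideal integrative samplers, each reporting
    the exact integral of C over its deployment interval."""
    edges = [i * t_end / n for i in range(n + 1)]
    total = sum((concentration(a) - concentration(b)) / K
                for a, b in zip(edges, edges[1:]))
    return total / t_end

print(true_twa(), discrete_twa(2), integrative_twa(2))
```

    Because each integrative interval contributes its exact integral, the interval sums telescope to the true 96-h TWA even with only two samplers, whereas two grab samples (t = 0 and t = 96 h) badly misestimate the TWA of a short pulse.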

  18. A 3D image sensor with adaptable charge subtraction scheme for background light suppression

    NASA Astrophysics Data System (ADS)

    Shin, Jungsoon; Kang, Byongmin; Lee, Keechang; Kim, James D. K.

    2013-02-01

    We present a 3D ToF (Time-of-Flight) image sensor with an adaptive charge subtraction scheme for background light suppression. The proposed sensor can alternately capture a high resolution color image and a high quality depth map in each frame. In depth mode, the sensor requires a sufficiently long integration time for accurate depth acquisition, but saturation will occur under strong background illumination. We propose to divide the integration time into N sub-integration times adaptively. In each sub-integration time, our sensor captures an image without saturation and subtracts the charge to prevent the pixel from saturating. The subtraction results are then accumulated over the N sub-integrations, yielding a final image free of background illumination at the full integration time. Experimental results with our own ToF sensor show high background suppression performance. We also propose in-pixel storage and a column-level subtraction circuit for chip-level implementation of the proposed method. We believe the proposed scheme will enable 3D sensors to be used in outdoor environments.
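
    The sub-integration idea can be sketched with a toy pixel model. The charge rates, well capacity, and the assumption that the background charge per sub-frame is known exactly are illustrative, not the sensor's actual circuit behavior:

```python
def capture(signal_rate, bg_rate, t_int, well_capacity, n_sub):
    """Accumulate charge over t_int split into n_sub sub-integrations.
    After each sub-integration the (separately measured) background
    charge is subtracted before the pixel can saturate; the residuals
    are summed to give a background-free value for the full t_int."""
    dt = t_int / n_sub
    result = 0.0
    for _ in range(n_sub):
        q = min((signal_rate + bg_rate) * dt, well_capacity)  # pixel fills
        if q >= well_capacity:
            return None                # saturated: value unusable
        result += q - bg_rate * dt     # subtract background charge
    return result

# hypothetical pixel: 10 e-/ms signal, 200 e-/ms background, 1000 e- well
print(capture(10.0, 200.0, 10.0, 1000.0, 1))  # single integration saturates
print(capture(10.0, 200.0, 10.0, 1000.0, 4))  # 4 sub-integrations: signal only
```

    With one integration the well overflows at (10+200)*10 = 2100 e-; with four sub-integrations each sub-frame stays under the well capacity and the accumulated residual equals the pure signal charge.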

  19. Flow Control and Routing in an Integrated Voice and Data Communication Network

    DTIC Science & Technology

    1981-08-01

    require continuous and almost real-time delivery; they are very sensitive to delay. Data conversations, on the other hand, are generally intolerant of...packets arrive in time to be delivered to the sink. However, this is not the solution we seek. We have noted that voice conversations require almost real...by long messages that require continuous real-time delivery; e.g. voice facsimile, video. Class II: characterized by short discrete messages that

  20. Supporting BPMN choreography with system integration artefacts for enterprise process collaboration

    NASA Astrophysics Data System (ADS)

    Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2014-07-01

    Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.

  1. Validation of an Integrated Airframe and Turbofan Engine Simulation for Evaluation of Propulsion Control Modes

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Sowers, T Shane; Liu, Yuan; Owen, A. Karl; Guo, Ten-Huei

    2015-01-01

    The National Aeronautics and Space Administration (NASA) has developed independent airframe and engine models that have been integrated into a single real-time aircraft simulation for piloted evaluation of propulsion control algorithms. In order to have confidence in the results of these evaluations, the integrated simulation must be validated to demonstrate that its behavior is realistic and that it meets the appropriate Federal Aviation Administration (FAA) certification requirements for aircraft. The paper describes the test procedures and results, demonstrating that the integrated simulation generally meets the FAA requirements and is thus a valid testbed for evaluation of propulsion control modes.

  2. GEODYN programmer's guide, volume 2, part 2. [computer program for estimation of orbit and geodetic parameters

    NASA Technical Reports Server (NTRS)

    Mullins, N. E.; Dao, N. C.; Martin, T. V.; Goad, C. C.; Boulware, N. L.; Chin, M. M.

    1972-01-01

    A computer program for executive control routine for orbit integration of artificial satellites is presented. At the beginning of each arc, the program initializes required constants as well as the variational partials at epoch. If epoch needs to be reset to a previous time, the program negates the stepsize and calls for integration backward to the desired time. After backward integration is completed, the program resets the stepsize to the proper positive quantity.

  3. Computation of type curves for flow to partially penetrating wells in water-table aquifers

    USGS Publications Warehouse

    Moench, Allen F.

    1993-01-01

    Evaluation of Neuman's analytical solution for flow to a well in a homogeneous, anisotropic, water-table aquifer commonly requires large amounts of computation time and can produce inaccurate results for selected combinations of parameters. Large computation times occur because the integrand of a semi-infinite integral involves the summation of an infinite series. Each term of the series requires evaluation of the roots of equations, and the series itself is sometimes slowly convergent. Inaccuracies can result from lack of computer precision or from the use of improper methods of numerical integration. In this paper it is proposed to use a method of numerical inversion of the Laplace transform solution, provided by Neuman, to overcome these difficulties. The solution in Laplace space is simpler in form than the real-time solution; that is, the integrand of the semi-infinite integral does not involve an infinite series or the need to evaluate roots of equations. Because the integrand is evaluated rapidly, advanced methods of numerical integration can be used to improve accuracy with an overall reduction in computation time. The proposed method of computing type curves, for which a partially documented computer program (WTAQ1) was written, was found to reduce computation time by factors of 2 to 20 over the time needed to evaluate the closed-form, real-time solution.
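
    One widely used method for this kind of numerical Laplace inversion is the Gaver-Stehfest algorithm, which needs only real-valued evaluations of the transform. Whether WTAQ1 uses this particular scheme is not stated in the abstract, so the sketch below is illustrative:

```python
import math

def stehfest_weights(n):
    """Gaver-Stehfest coefficients V_k for even n."""
    V = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n // 2) + 1):
            s += (j ** (n // 2) * math.factorial(2 * j) /
                  (math.factorial(n // 2 - j) * math.factorial(j) *
                   math.factorial(j - 1) * math.factorial(k - j) *
                   math.factorial(2 * j - k)))
        V.append((-1) ** (k + n // 2) * s)
    return V

def invert_laplace(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s) by evaluating
    F at n real points s = k*ln(2)/t (Gaver-Stehfest)."""
    ln2 = math.log(2)
    V = stehfest_weights(n)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, n + 1))

# sanity check: F(s) = 1/(s+1) inverts to f(t) = exp(-t)
print(abs(invert_laplace(lambda s: 1.0 / (s + 1.0), 1.0) - math.exp(-1.0)))
```

    Since the method only samples F(s) on the real axis, each evaluation is cheap compared with the real-time integrand, which mirrors the speedup reported in the abstract; the method does assume a smooth, non-oscillatory f(t).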

  4. Validation of Passive Sampling Devices for Monitoring of Munitions Constituents in Underwater Environments

    DTIC Science & Technology

    2017-09-01

    this project, we launched at Esperanza pier (Figure 5-4), which required a minimum of 2 hours of travel time, including transit from Camp Garcia to the...concentrations of emerging contaminants by providing a time-integrated sample with low detection limits and in situ extraction. PSDs are fairly well...A continuous sampling approach allows detection and quantification of chemicals in an integrated manner, providing time-weighted average (TWA

  5. Adaptive Implicit Non-Equilibrium Radiation Diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Philip, Bobby; Wang, Zhen; Berrill, Mark A

    2013-01-01

    We describe methods for accurate and efficient long term time integration of non-equilibrium radiation diffusion systems: implicit time integration for efficient long term time integration of stiff multiphysics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian-free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level-independent solver convergence.

  6. Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System

    NASA Astrophysics Data System (ADS)

    Lee, Chang Jae; Yun, Jae Hee

    2017-06-01

    Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.

  7. Implicit integration methods for dislocation dynamics

    DOE PAGES

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; ...

    2015-01-20

    In dislocation dynamics, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. This paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
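
    The stiffness issue that motivates Newton-type solvers can be seen on a scalar test equation: for the backward Euler update of y' = λy, plain fixed-point iteration diverges once |hλ| ≥ 1, while Newton's method (exact for this linear case) is unaffected. The numbers below are illustrative:

```python
def fixed_point_solve(y, h, lam, iters=50):
    """Fixed-point iteration for the backward Euler update
    Y = y + h*lam*Y; diverges when |h*lam| >= 1 (stiff regime)."""
    Y = y
    for _ in range(iters):
        Y_new = y + h * lam * Y
        if abs(Y_new) > 1e12:
            return None            # iteration blew up
        Y = Y_new
    return Y

def newton_solve(y, h, lam):
    """Newton's method solves the same linear update in one step:
    g(Y) = Y - y - h*lam*Y = 0  =>  Y = y / (1 - h*lam)."""
    return y / (1.0 - h * lam)

# stiff test problem y' = -1000 y with step h = 0.01, so h*lam = -10
print(fixed_point_solve(1.0, 0.01, -1000.0))  # None: plain fixed point fails
print(newton_solve(1.0, 0.01, -1000.0))       # 1/11
```

    Acceleration techniques such as Anderson mixing extend the fixed-point approach into this stiff regime, which is the trade-off the paper quantifies.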

  8. Removal of Gross Air Embolization from Cardiopulmonary Bypass Circuits with Integrated Arterial Line Filters: A Comparison of Circuit Designs.

    PubMed

    Reagor, James A; Holt, David W

    2016-03-01

    Advances in technology, the desire to minimize blood product transfusions, and concerns relating to inflammatory mediators have led many practitioners and manufacturers to minimize cardiopulmonary bypass (CPB) circuit designs. The oxygenator and arterial line filter (ALF) have been integrated into one device as a method of attaining a reduction in prime volume and surface area. The instructions for use of a currently available oxygenator with integrated ALF recommend incorporating a recirculation line distal to the oxygenator. However, according to an unscientific survey, 70% of respondents utilize CPB circuits incorporating integrated ALFs without a path of recirculation distal to the oxygenator outlet. Considering this circuit design, the ability to quickly remove a gross air bolus in the blood path distal to the oxygenator may be compromised. This in vitro study was designed to determine if the time required to remove a gross air bolus from a CPB circuit without a path of recirculation distal to the oxygenator would be significantly longer than that of a circuit with a path of recirculation distal to the oxygenator. A significant difference was found in the mean time required to remove a gross air bolus between the circuit designs (p = .0003). Additionally, there was a statistically significant difference in the mean time required to remove a gross air bolus between Trial 1 and Trials 4 (p = .015) and 5 (p = .014), irrespective of the circuit design. Under the parameters of this study, a recirculation line distal to an oxygenator with an integrated ALF significantly decreases the time it takes to remove an air bolus from the CPB circuit and may be safer for clinical use than the same circuit without a recirculation line.

  9. TIME-INTEGRATED EXPOSURE MEASURES TO IMPROVE THE PREDICTIVE POWER OF EXPOSURE CLASSIFICATION FOR EPIDEMIOLOGIC STUDIES

    EPA Science Inventory

    Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Although long-term integrated exposure measurements are a critical component of exposure assessment, the ability to include these measurements into epidemiologic...

  10. Implicit time accurate simulation of unsteady flow

    NASA Astrophysics Data System (ADS)

    van Buuren, René; Kuerten, Hans; Geurts, Bernard J.

    2001-03-01

    Implicit time integration was studied in the context of unsteady shock-boundary layer interaction flow. With an explicit second-order Runge-Kutta scheme, a reference solution to compare with the implicit second-order Crank-Nicolson scheme was determined. The time step in the explicit scheme is restricted by both temporal accuracy as well as stability requirements, whereas in the A-stable implicit scheme, the time step has to obey temporal resolution requirements and numerical convergence conditions. The non-linear discrete equations for each time step are solved iteratively by adding a pseudo-time derivative. The quasi-Newton approach is adopted and the linear systems that arise are approximately solved with a symmetric block Gauss-Seidel solver. As a guiding principle for properly setting numerical time integration parameters that yield an efficient time accurate capturing of the solution, the global error caused by the temporal integration is compared with the error resulting from the spatial discretization. Focus is on the sensitivity of properties of the solution in relation to the time step. Numerical simulations show that the time step needed for acceptable accuracy can be considerably larger than the explicit stability time step; typical ratios range from 20 to 80. At large time steps, convergence problems that are closely related to a highly complex structure of the basins of attraction of the iterative method may occur.

  11. Execution environment for intelligent real-time control systems

    NASA Technical Reports Server (NTRS)

    Sztipanovits, Janos

    1987-01-01

    Modern telerobot control technology requires the integration of symbolic and non-symbolic programming techniques, different models of parallel computations, and various programming paradigms. The Multigraph Architecture, which has been developed for the implementation of intelligent real-time control systems is described. The layered architecture includes specific computational models, integrated execution environment and various high-level tools. A special feature of the architecture is the tight coupling between the symbolic and non-symbolic computations. It supports not only a data interface, but also the integration of the control structures in a parallel computing environment.

  12. DITTY - a computer program for calculating population dose integrated over ten thousand years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.

    The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
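
    The core quantity, a time integral of dose over a long horizon, can be sketched with a trapezoidal rule. The release function and unit dose factor below are hypothetical stand-ins, not DITTY's models:

```python
import math

def collective_dose_integral(release_rate, dose_factor, t_end=10000.0, n=10000):
    """Trapezoidal time integral of population dose for a time-variant
    release: D = integral over [0, t_end] of release_rate(t)*dose_factor dt.
    (Illustrative stand-in for the kind of ten-thousand-year integral
    DITTY evaluates; release_rate and dose_factor are hypothetical.)"""
    h = t_end / n
    total = 0.0
    for i in range(n):
        a, b = i * h, (i + 1) * h
        total += 0.5 * h * (release_rate(a) + release_rate(b))
    return total * dose_factor

# hypothetical release decaying with a 1000-year effective half-life
rate = lambda t: math.exp(-math.log(2) * t / 1000.0)
print(collective_dose_integral(rate, 1.0))
```

    A real code would sum such integrals over radionuclides and pathways (surface water, wells, atmosphere), but the time-integration step itself reduces to quadrature of a time-variant release curve.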

  13. Connectivity of wetlands to downstream waters: Conceptual framework and review

    EPA Science Inventory

    A river represents the time-integrated combination of all waters contributing to it. Understanding the factors that influence a river’s health and sustainability, as well as its degradation, requires an integrated systems perspective. This considers all the components of the ri...

  14. An integrated gateway for various PHDs in U-healthcare environments.

    PubMed

    Park, KeeHyun; Pak, JuGeon

    2012-01-01

    We propose an integrated gateway for various personal health devices (PHDs). This gateway receives measurements from various PHDs and conveys them to a remote monitoring server (MS). It provides two kinds of transmission modes: immediate transmission and integrated transmission. The former mode operates if a measurement exceeds a predetermined threshold or in the case of an emergency. In the latter mode, the gateway retains the measurements instead of forwarding them. When the reporting time comes, the gateway extracts all the stored measurements, integrates them into one message, and transmits the integrated message to the MS. Through this mechanism, the transmission overhead can be reduced. On the basis of the proposed gateway, we construct a u-healthcare system comprising an activity monitor, a medication dispenser, and a pulse oximeter. The evaluation results show that the size of separate messages from various PHDs is reduced through the integration process, and the process does not require much time; the integration time is negligible.
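
    The two transmission modes can be sketched as a small event-driven class. The names, threshold, and callback are illustrative, not the paper's interface:

```python
class Gateway:
    """Minimal sketch of the two transmission modes described above:
    immediate transmission on threshold/emergency, integrated (batched)
    transmission at the reporting time."""
    def __init__(self, threshold, send):
        self.threshold = threshold   # immediate-mode trigger (illustrative)
        self.send = send             # callback to the monitoring server
        self.buffer = []

    def on_measurement(self, device_id, value):
        if value > self.threshold:   # emergency: forward right away
            self.send([(device_id, value)])
        else:                        # retain for integrated transmission
            self.buffer.append((device_id, value))

    def on_report_time(self):
        if self.buffer:              # one integrated message, fewer sends
            self.send(self.buffer)
            self.buffer = []

sent = []
gw = Gateway(threshold=120, send=sent.append)
gw.on_measurement("pulse-oximeter", 97)
gw.on_measurement("activity-monitor", 80)
gw.on_measurement("pulse-oximeter", 135)   # exceeds threshold
gw.on_report_time()
print(len(sent))  # 2 messages: one immediate, one integrated batch of 2
```

    Batching the routine measurements into one message per reporting interval is what reduces the transmission overhead the abstract describes.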

  15. An Integrated Gateway for Various PHDs in U-Healthcare Environments

    PubMed Central

    Park, KeeHyun; Pak, JuGeon

    2012-01-01

    We propose an integrated gateway for various personal health devices (PHDs). This gateway receives measurements from various PHDs and conveys them to a remote monitoring server (MS). It provides two kinds of transmission modes: immediate transmission and integrated transmission. The former mode operates if a measurement exceeds a predetermined threshold or in the case of an emergency. In the latter mode, the gateway retains the measurements instead of forwarding them. When the reporting time comes, the gateway extracts all the stored measurements, integrates them into one message, and transmits the integrated message to the MS. Through this mechanism, the transmission overhead can be reduced. On the basis of the proposed gateway, we construct a u-healthcare system comprising an activity monitor, a medication dispenser, and a pulse oximeter. The evaluation results show that the size of separate messages from various PHDs is reduced through the integration process, and the process does not require much time; the integration time is negligible. PMID:22899891

  16. An integral nuclear power and propulsion system concept

    NASA Astrophysics Data System (ADS)

    Choong, Phillip T.; Teofilo, Vincent L.; Begg, Lester L.; Dunn, Charles; Otting, William

    An integral space power concept provides both the electrical power and propulsion from a common heat source and offers superior performance capabilities over conventional orbital insertion using chemical propulsion systems. This paper describes a hybrid (bimodal) system concept based on a proven, inherently safe solid fuel form for the high temperature reactor core operation and rugged planar thermionic energy converter for long-life steady state electric power production combined with NERVA-based rocket technology for propulsion. The integral system is capable of long-life power operation and multiple propulsion operations. At an optimal thrust level, the integral system can maintain the minimal delta-V requirement while minimizing the orbital transfer time. A trade study comparing the overall benefits in placing large payloads to GEO with the nuclear electric propulsion option shows superiority of nuclear thermal propulsion. The resulting savings in orbital transfer time and the substantial reduction of overall lift requirement enables the use of low-cost launchers for several near-term military satellite missions.

  17. Time-integrated directional detection of dark matter

    NASA Astrophysics Data System (ADS)

    O'Hare, Ciaran A. J.; Kavanagh, Bradley J.; Green, Anne M.

    2017-10-01

    The analysis of signals in directional dark matter (DM) detectors typically assumes that the directions of nuclear recoils can be measured in the Galactic rest frame. However, this is not possible with all directional detection technologies. In nuclear emulsions, for example, the recoil events must be detected and measured after the exposure time of the experiment. Unless the entire detector is mounted and rotated with the sidereal day, the recoils cannot be reoriented in the Galactic rest frame. We examine the effect of this "time integration" on the primary goals of directional detection, namely: (1) confirming that the recoils are anisotropic; (2) measuring the median recoil direction to confirm their Galactic origin; and (3) probing below the neutrino floor. We show that after time integration the DM recoil distribution retains a preferred direction and is distinct from that of Solar neutrino-induced recoils. Many of the advantages of directional detection are therefore preserved and it is not crucial to mount and rotate the detector. Rejecting isotropic backgrounds requires a factor of 2 more signal events compared with an experiment with event time information, whereas a factor of 1.5-3 more events are needed to measure a median direction in agreement with the expectation for DM. We also find that there is still effectively no neutrino floor in a time-integrated directional experiment. However to reach a cross section an order of magnitude below the floor, a factor of ˜8 larger exposure is required than with a conventional directional experiment. We also examine how the sensitivity is affected for detectors with only 2D recoil track readout, and/or no head-tail measurement. As for non-time-integrated experiments, 2D readout is not a major disadvantage, though a lack of head-tail sensitivity is.

  18. Integrating a geographic information system, a scientific visualization system and an orographic precipitation model

    USGS Publications Warehouse

    Hay, L.; Knapp, L.

    1996-01-01

    Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.

  19. Dynamic combination of sensory and reward information under time pressure

    PubMed Central

    Farashahi, Shiva; Kao, Chang-Hao

    2018-01-01

    When making choices, collecting more information is beneficial but comes at the cost of sacrificing time that could be allocated to making other potentially rewarding decisions. To investigate how the brain balances these costs and benefits, we conducted a series of novel experiments in humans and simulated various computational models. Under six levels of time pressure, subjects made decisions either by integrating sensory information over time or by dynamically combining sensory and reward information over time. We found that during sensory integration, time pressure reduced performance as the deadline approached, and choice was more strongly influenced by the most recent sensory evidence. By fitting performance and reaction time with various models we found that our experimental results are more compatible with leaky integration of sensory information with an urgency signal or a decision process based on stochastic transitions between discrete states modulated by an urgency signal. When combining sensory and reward information, subjects spent less time on integration than optimally prescribed when reward decreased slowly over time, and the most recent evidence did not have the maximal influence on choice. The suboptimal pattern of reaction time was partially mitigated in an equivalent control experiment in which sensory integration over time was not required, indicating that the suboptimal response time was influenced by the perception of imperfect sensory integration. Meanwhile, during combination of sensory and reward information, performance did not drop as the deadline approached, and response time was not different between correct and incorrect trials. These results indicate a decision process different from what is involved in the integration of sensory information over time. 
Together, our results not only reveal limitations in sensory integration over time but also illustrate how these limitations influence dynamic combination of sensory and reward information. PMID:29584717

  20. Higher Order Time Integration Schemes for the Unsteady Navier-Stokes Equations on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Jothiprasad, Giridhar; Mavriplis, Dimitri J.; Caughey, David A.

    2002-01-01

    The rapid increase in available computational power over the last decade has enabled higher resolution flow simulations and more widespread use of unstructured grid methods for complex geometries. While much of this effort has been focused on steady-state calculations in the aerodynamics community, the need to accurately predict off-design conditions, which may involve substantial amounts of flow separation, points to the need to efficiently simulate unsteady flow fields. Accurate unsteady flow simulations can easily require several orders of magnitude more computational effort than a corresponding steady-state simulation. For this reason, techniques for improving the efficiency of unsteady flow simulations are required in order to make such calculations feasible in the foreseeable future. The purpose of this work is to investigate possible reductions in computer time due to the choice of an efficient time-integration scheme from a series of schemes differing in the order of time-accuracy, and by the use of more efficient techniques to solve the nonlinear equations which arise while using implicit time-integration schemes. This investigation is carried out in the context of a two-dimensional unstructured mesh laminar Navier-Stokes solver.
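
    The cost argument can be illustrated with a toy comparison, not the paper's unstructured-mesh solver: a hypothetical sketch of a first-order implicit scheme (backward Euler) against a second-order one (BDF2) on the linear model problem y' = -y, where the higher-order scheme's error shrinks quadratically with the timestep, so a given accuracy is reached with far fewer steps.

```python
import math

def backward_euler(h, t_end=1.0):
    # Implicit Euler for y' = -y; each step solves (1 + h) * y_new = y_old.
    n = round(t_end / h)
    y = 1.0
    for _ in range(n):
        y = y / (1.0 + h)
    return y

def bdf2(h, t_end=1.0):
    # Second-order backward differentiation formula for y' = -y:
    #   y_{n+2} - (4/3) y_{n+1} + (1/3) y_n = (2/3) h * (-y_{n+2})
    n = round(t_end / h)
    y0, y1 = 1.0, math.exp(-h)   # bootstrap the two-step scheme with an exact first step
    for _ in range(n - 1):
        y2 = ((4.0 / 3.0) * y1 - (1.0 / 3.0) * y0) / (1.0 + (2.0 / 3.0) * h)
        y0, y1 = y1, y2
    return y1

exact = math.exp(-1.0)
for h in (0.1, 0.05):
    print(h, abs(backward_euler(h) - exact), abs(bdf2(h) - exact))
```

Halving the step roughly halves the backward Euler error but quarters the BDF2 error, which is the efficiency gain from higher temporal order that the abstract investigates.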

  1. Time and space integrating acousto-optic folded spectrum processing for SETI

    NASA Technical Reports Server (NTRS)

    Wagner, K.; Psaltis, D.

    1986-01-01

    Time and space integrating folded spectrum techniques utilizing acousto-optic devices (AOD) as 1-D input transducers are investigated for a potential application as wideband, high resolution, large processing gain spectrum analyzers in the search for extra-terrestrial intelligence (SETI) program. The space integrating Fourier transform performed by a lens channels the coarse spectral components diffracted from an AOD onto an array of time integrating narrowband fine resolution spectrum analyzers. The pulsing action of a laser diode samples the interferometrically detected output, aliasing the fine resolution components to baseband, as required for the subsequent charge coupled devices (CCD) processing. The raster scan mechanism incorporated into the readout of the CCD detector array is used to unfold the 2-D transform, reproducing the desired high resolution Fourier transform of the input signal.

  2. The impact of integrated water management on the Space Station propulsion system

    NASA Technical Reports Server (NTRS)

    Schmidt, George R.

    1987-01-01

    The water usage of elements in the Space Station integrated water system (IWS) is discussed, and the parameters affecting the overall water balance and the water-electrolysis propulsion-system requirements are considered. With nominal IWS operating characteristics, extra logistic water resupply (LWR) is found to be unnecessary in the satisfaction of the nominal propulsion requirements. With the consideration of all possible operating characteristics, LWR will not be required in 65.5 percent of the cases, and for 17.9 percent of the cases LWR can be eliminated by controlling the stay time of the Shuttle Orbiter.

  3. Considerations for developing technologies for an integrated person-borne IED countermeasure architecture

    NASA Astrophysics Data System (ADS)

    Lombardo, Nicholas J.; Knudson, Christa K.; Rutz, Frederick C.; Pattison, Kerrie J.; Stratton, Rex C.; Wiborg, James C.

    2010-04-01

    Developing an integrated person-borne improvised explosive device (IED) countermeasure to protect unstructured crowds at large public venues is the goal of the Standoff Technology Integration and Demonstration Program (STIDP), sponsored in part by the U.S. Department of Homeland Security (DHS). The architecture being developed includes countermeasure technologies deployed as a layered defense and enabling technologies for operating the countermeasures as an integrated system. In the architecture, early recognition of potentially higher-risk individuals is crucial. Sensors must be able to detect, with high accuracy, explosives' threat signatures in varying environmental conditions, from a variety of approaches and with dense crowds and limited dwell time. Command-and-control technologies are needed to automate sensor operation, reduce staffing requirements, improve situational awareness, and automate/facilitate operator decisions. STIDP is developing technical and operational requirements for standoff and remotely operated sensors and is working with federal agencies and foreign governments to implement these requirements into their research and development programs. STIDP also is developing requirements for a software platform to rapidly integrate and control various sensors; acquire, analyze, and record their data; and present the data in an operationally relevant manner. Requirements also are being developed for spatial analysis, tracking and assessing threats with available screening resources, and data fusion for operator decision-making.

  4. Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael

    2009-01-01

    The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing to achieve real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.

  5. Development of the Semi-implicit Time Integration in KIM-SH

    NASA Astrophysics Data System (ADS)

    NAM, H.

    2015-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. KIM-SH is the KIAPS integrated model, a spectral-element core based on HOMME, which currently employs explicit time-integration schemes. Explicit schemes, however, tend to be unstable and require very small timesteps, whereas semi-implicit schemes are very stable and permit much larger timesteps. We therefore introduce three- and two-time-level semi-implicit schemes as the time integration in KIM-SH. We define the linear terms and reference values and then, following the semi-implicit formulation, apply GMRES as the linear solver. Numerical results from experiments will be presented together with the current development status of the time integration in KIM-SH; several numerical examples confirm the efficiency and reliability of the proposed schemes.
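
    The stability motivation can be seen on a scalar stiff model problem. The sketch below is hypothetical and far simpler than the KIM-SH setting (a scalar linear term needs no GMRES solve): an explicit Euler step blows up at a timestep well above its stability limit, while a semi-implicit step, treating the stiff linear term implicitly and the forcing explicitly, stays bounded at the same timestep.

```python
import math

LAM = -1000.0          # stiff linear coefficient (the "linear/reference" part)

def g(t):              # slow, nonstiff forcing, treated explicitly
    return math.cos(t)

def explicit_euler(h, n):
    y = 1.0
    for k in range(n):
        y = y + h * (LAM * y + g(k * h))
    return y

def semi_implicit_euler(h, n):
    # Stiff linear term taken at the new time level, forcing kept explicit:
    #   (1 - h*LAM) * y_new = y_old + h * g(t_old)
    y = 1.0
    for k in range(n):
        y = (y + h * g(k * h)) / (1.0 - h * LAM)
    return y

h, n = 0.01, 200       # h is 5x the explicit stability limit 2/|LAM| = 0.002
print(abs(explicit_euler(h, n)))       # grows without bound
print(abs(semi_implicit_euler(h, n)))  # stays bounded
```

In a full model the implicit part is a large sparse system rather than a scalar, which is where an iterative solver such as GMRES enters.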

  6. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  7. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Doğa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  8. Demonstrating artificial intelligence for space systems - Integration and project management issues

    NASA Technical Reports Server (NTRS)

    Hack, Edmund C.; Difilippo, Denise M.

    1990-01-01

    As part of its Systems Autonomy Demonstration Project (SADP), NASA has recently demonstrated the Thermal Expert System (TEXSYS). Advanced real-time expert system and human interface technology was successfully developed and integrated with conventional controllers of prototype space hardware to provide intelligent fault detection, isolation, and recovery capability. Many specialized skills were required, and responsibility for the various phases of the project therefore spanned multiple NASA centers, internal departments and contractor organizations. The test environment required communication among many types of hardware and software as well as between many people. The integration, testing, and configuration management tools and methodologies which were applied to the TEXSYS project to assure its safe and successful completion are detailed. The project demonstrated that artificial intelligence technology, including model-based reasoning, is capable of the monitoring and control of a large, complex system in real time.

  9. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doğa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  10. 49 CFR 570.56 - Vacuum brake assist unit and vacuum brake system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... The following requirements apply to vehicles with vacuum brake assist units and vacuum brake systems. (a) Vacuum brake assist unit integrity. The vacuum brake assist unit shall demonstrate integrity as... maintained on the pedal. (1) Inspection procedure. Stop the engine and apply service brake several times to...

  11. 14 CFR 255.4 - Display of information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... with the requirements of this section. (1) Each system must offer an integrated display that uses the.... (2) Each integrated display offered by a system must either use elapsed time as a significant factor in selecting service options from the database or give single-plane flights a preference over...

  12. Non-functional Avionics Requirements

    NASA Astrophysics Data System (ADS)

    Paulitsch, Michael; Ruess, Harald; Sorea, Maria

    Embedded systems in aerospace are becoming more and more integrated in order to reduce the weight, volume/size, and power of hardware for greater fuel-efficiency. Such integration tendencies change the architectural approaches of system architectures, which subsequently change non-functional requirements for platforms. This paper provides some insight into the state of the practice of non-functional requirements for developing ultra-critical embedded systems in the aerospace industry, including recent changes and trends. In particular, formal requirement capture and formal analysis of non-functional requirements of avionic systems - including hard real-time, fault-tolerance, reliability, and performance - are exemplified by means of recent developments in SAL and HiLiTE.

  13. Parallel Multi-Step/Multi-Rate Integration of Two-Time Scale Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Chang, Johnny T.; Ploen, Scott R.; Sohl, Garett A.; Martin, Bryan J.

    2004-01-01

    Increasing demands on the fidelity of real-time and high-fidelity simulations are stressing the capacity of modern processors. New integration techniques are required that provide maximum efficiency for systems that are parallelizable; however, many current techniques make assumptions that are at odds with non-cascadable systems. A new serial multi-step/multi-rate integration algorithm for dual-timescale continuous-state systems is presented which applies to these systems, and is extended to a parallel multi-step/multi-rate algorithm. The superior performance of both algorithms is demonstrated through a representative example.
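
    As a minimal illustration of the multi-rate idea only (not the paper's algorithm, which additionally addresses non-cascadable systems and parallel execution), the sketch below integrates a hypothetical two-time-scale linear system: the fast state takes several small substeps per macro step while the slow state advances once, so slow-dynamics evaluations are greatly reduced. All coefficients and step sizes are invented for illustration.

```python
# Two-time-scale toy system:
#   slow: x' = -0.5*x + 0.1*y      (time scale ~ 2 s)
#   fast: y' = -50.0*(y - x)       (time scale ~ 0.02 s)

def f_slow(x, y):
    return -0.5 * x + 0.1 * y

def f_fast(x, y):
    return -50.0 * (y - x)

def single_rate(h, n, x=1.0, y=0.0):
    # Baseline: both states advanced together with the small fast step.
    for _ in range(n):
        x, y = x + h * f_slow(x, y), y + h * f_fast(x, y)
    return x, y

def multi_rate(H, m, n_macro, x=1.0, y=0.0):
    # The fast state takes m substeps of size H/m with the slow state frozen;
    # the slow state is then advanced once per macro step of size H.
    h = H / m
    for _ in range(n_macro):
        for _ in range(m):
            y = y + h * f_fast(x, y)
        x = x + H * f_slow(x, y)
    return x, y

xs, ys = single_rate(0.01, 500)      # 500 fine steps, T = 5
xm, ym = multi_rate(0.1, 10, 50)     # 50 macro steps, same fast substep size
print(xs, xm)
```

The multi-rate run calls the slow right-hand side 50 times instead of 500, yet tracks the single-rate baseline closely because the slow state barely changes during one macro step.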

  14. Developing a Crew Time Model for Human Exploration Missions to Mars

    NASA Technical Reports Server (NTRS)

    Battfeld, Bryan; Stromgren, Chel; Shyface, Hilary; Cirillo, William; Goodliff, Kandyce

    2015-01-01

    Candidate human missions to Mars require mission lengths that could extend beyond those that have previously been demonstrated during crewed Lunar (Apollo) and International Space Station (ISS) missions. The nature of the architectures required for deep space human exploration will likely necessitate major changes in how crews operate and maintain the spacecraft. The uncertainties associated with these shifts in mission constructs - including changes to habitation systems, transit durations, and system operations - raise concerns as to the ability of the crew to complete required overhead activities while still having time to conduct a set of robust exploration activities. This paper will present an initial assessment of crew operational requirements for human missions to the Mars surface. The presented results integrate assessments of crew habitation, system maintenance, and utilization to present a comprehensive analysis of potential crew time usage. Destination operations were assessed for a short (approx. 50 day) and long duration (approx. 500 day) surface habitation case. Crew time allocations are broken out by mission segment, and the availability of utilization opportunities was evaluated throughout the entire mission progression. To support this assessment, the integrated crew operations model (ICOM) was developed. ICOM was used to parse overhead, maintenance and system repair, and destination operations requirements within each mission segment - outbound transit, Mars surface duration, and return transit - to develop a comprehensive estimation of exploration crew time allocations. Overhead operational requirements included daily crew operations, health maintenance activities, and down time. Maintenance and repair operational allocations are derived using the Exploration Maintainability and Analysis Tool (EMAT) to develop a probabilistic estimation of crew repair time necessary to maintain systems functionality throughout the mission.

  15. Alternative Procedure of Heat Integration Technique Selection between Two Unit Processes to Improve Energy Saving

    NASA Astrophysics Data System (ADS)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy use system in a production process, in this case heat exchanger networks (HENs), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing heat exchanger networks from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration becomes an important requirement. In a plant, heat integration can be carried out internally or in combination between process units. However, determining the most suitable heat integration technique conventionally involves lengthy calculations and considerable time. In this paper, we propose an alternative procedure for selecting the heat integration technique by investigating six hypothetical units using a Pinch Analysis approach, with the energy target and total annual cost target as objective functions. The six hypothetical units, A through F, each place the process streams in a different location relative to the pinch temperature. The result is a potential heat integration (ΔH') formula that trims the conventional procedure from seven steps to just three. The preferred heat integration technique is then determined by calculating the potential heat integration (ΔH') between the hypothetical process units. The calculations were implemented in MATLAB.

  16. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data is becoming so crucial for various organizations that many of them have begun producing spatial data themselves. In some circumstances, obtaining integrated data in real time requires a sustainable mechanism for real-time integration; a case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the main challenges in such situations is the high degree of heterogeneity between different organizations' data. To address this issue, we introduce an ontology-based method that provides sharing and integration capabilities for existing databases; in addition to resolving semantic heterogeneity, the proposed method also provides better access to information. Our approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, the resulting ontology is inserted into the database and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data through ontology mapping. The proposed approach, in addition to being fast and low-cost, makes the process of data integration easy while leaving the data unchanged, and thus takes advantage of existing legacy applications.
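
    The mapping step can be caricatured with a tiny, purely hypothetical sketch: two relational sources with different column names and units are mapped to shared ontology concepts, after which a service layer could serve the merged records. All table names, concepts, and mappings below are invented for illustration.

```python
# Two hypothetical organizational databases using different column names
# (and units) for the same real-world concept.
db_a = [{"road_name": "Elm St", "len_km": 1.2}]
db_b = [{"street": "Oak Ave", "length_m": 800}]

# Each database's local terms are mapped to shared ontology concepts;
# a tuple carries a unit-conversion function alongside the concept name.
ontology_map_a = {"road_name": "Road.name", "len_km": "Road.length_km"}
ontology_map_b = {"street": "Road.name",
                  "length_m": ("Road.length_km", lambda v: v / 1000.0)}

def to_ontology(rows, mapping):
    # Rewrite each record's keys (and values, if a converter is given)
    # into the shared ontology vocabulary.
    out = []
    for row in rows:
        rec = {}
        for col, target in mapping.items():
            if isinstance(target, tuple):        # concept plus unit conversion
                concept, convert = target
                rec[concept] = convert(row[col])
            else:
                rec[target] = row[col]
        out.append(rec)
    return out

# A service-oriented layer would expose the union of the mapped records.
integrated = to_ontology(db_a, ontology_map_a) + to_ontology(db_b, ontology_map_b)
print(integrated)
```

Because the mapping is applied at query time, the source tables themselves remain unchanged, mirroring the abstract's point about preserving legacy applications.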

  17. Integrating an object system into CLIPS: Language design and implementation issues

    NASA Technical Reports Server (NTRS)

    Auburn, Mark

    1990-01-01

    This paper describes the reasons why an object system with integrated pattern-matching and object-oriented programming facilities is desirable for CLIPS and how it is possible to integrate such a system into CLIPS while maintaining the run-time performance and the low memory usage for which CLIPS is known. The requirements for an object system in CLIPS that includes object-oriented programming and integrated pattern-matching are discussed and various techniques for optimizing the object system and its integration with the pattern-matcher are presented.

  18. Part-Time Occupational Faculty: A Contribution to Excellence. Information Series No. 300.

    ERIC Educational Resources Information Center

    Parsons, Michael H.

    Part-time faculty are essential to the accomplishment of the mission of postsecondary occupational education institutions. A commitment to excellence requires a comprehensive, systematic design for part-time faculty recruitment, development, assessment, and integration into the institution's delivery system. Careful attention to recruitment…

  19. Lessons on corporate "sustainability" disclosure from Deepwater Horizon.

    PubMed

    Lewis, Sanford

    2011-01-01

    The BP oil spill highlighted shortcomings of current financial and sustainability reporting standards and practice. "Integrated reporting" aims to combine financial and social/environmental information into a single annual corporate report. But without more stringent standards, integrated reports would neglect substantial risks and, as BP's sustainability reports demonstrate, create false impressions of good practice. To be of value, integration must: 1. Require timely disclosure of enforcement notices, orders and allegations issued by regulators. 2. Require disclosure of credible scientific reports and concerns indicative of potentially catastrophic risks of a company's products and activities, regardless of scientific uncertainty. 3. Require review and disclosures of a firm's safety culture. 4. Require disclosure of any facts or circumstances needed to ensure that management's self-portrait of its sustainability strategies, goals and progress is not materially misleading. In conducting its misleading reporting, BP largely followed Global Reporting Initiative (GRI) guidelines. GRI is soliciting input, beginning in summer 2011, on how to revise those guidelines. Since GRI may prove a leading source for sustainability disclosure rules in integrated reporting, lessons learned from the BP experience must be applied to the next GRI revisions.

  20. Accentra Pharmaceuticals: Thrashing through ERP Systems

    ERIC Educational Resources Information Center

    Bradds, Nathan; Hills, Emily; Masters, Kelly; Weiss, Kevin; Havelka, Douglas

    2017-01-01

    Implementing and integrating an Enterprise Resource Planning (ERP) system into an organization is an enormous undertaking that requires substantial cash outlays, time commitments, and skilled IT and business personnel. It requires careful and detailed planning, thorough testing and training, and a change management process that creates a…

  1. Minimizing the area required for time constants in integrated circuits

    NASA Technical Reports Server (NTRS)

    Lyons, J. C.

    1972-01-01

    When a medium- or large-scale integrated circuit is designed, efforts are usually made to avoid the use of resistor-capacitor time constant generators. The capacitor needed for this circuit usually takes up more surface area on the chip than several resistors and transistors. When the use of this network is unavoidable, the designer usually makes an effort to see that the choice of resistor and capacitor combinations is such that a minimum amount of surface area is consumed. The optimum ratio of resistance to capacitance that will result in this minimum area is equal to the ratio of resistance to capacitance which may be obtained from a unit of surface area for the particular process being used. The minimum area required is proportional to the square root of the reciprocal of the product of the resistance and capacitance obtainable per unit area. This minimum occurs when the area required by the resistor is equal to the area required by the capacitor.
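
    The stated optimum is easy to verify numerically. The sketch below uses made-up per-unit-area process constants r and c: minimizing the total area A(R) = R/r + (tau/R)/c subject to R·C = tau gives R_opt = sqrt(tau·r/c), an optimum R/C ratio equal to r/c, equal resistor and capacitor areas, and A_min = 2·sqrt(tau/(r·c)).

```python
import math

# Hypothetical process constants: resistance and capacitance per unit area.
r = 200.0    # resistance obtainable per unit of surface area
c = 0.05     # capacitance obtainable per unit of surface area (scaled units)
tau = 1.0    # required time constant, tau = R * C

def total_area(R):
    # Resistor area R/r plus capacitor area C/c, with C = tau/R.
    return R / r + (tau / R) / c

# Closed form: dA/dR = 1/r - tau/(c*R^2) = 0  =>  R_opt = sqrt(tau * r / c)
R_opt = math.sqrt(tau * r / c)
C_opt = tau / R_opt
A_min = 2.0 * math.sqrt(tau / (r * c))

print(R_opt, C_opt, A_min)
```

At the optimum the R/C ratio equals r/c (the per-unit-area ratio the process provides) and the resistor and capacitor occupy equal areas, exactly as the abstract states.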

  2. Symmetric and arbitrarily high-order Birkhoff-Hermite time integrators and their long-time behaviour for solving nonlinear Klein-Gordon equations

    NASA Astrophysics Data System (ADS)

    Liu, Changying; Iserles, Arieh; Wu, Xinyuan

    2018-03-01

    The Klein-Gordon equation with nonlinear potential occurs in a wide range of application areas in science and engineering. Its computation represents a major challenge. The main theme of this paper is the construction of symmetric and arbitrarily high-order time integrators for the nonlinear Klein-Gordon equation by integrating Birkhoff-Hermite interpolation polynomials. To this end, under the assumption of periodic boundary conditions, we begin with the formulation of the nonlinear Klein-Gordon equation as an abstract second-order ordinary differential equation (ODE) and its operator-variation-of-constants formula. We then derive a symmetric and arbitrarily high-order Birkhoff-Hermite time integration formula for the nonlinear abstract ODE. Accordingly, the stability, convergence and long-time behaviour are rigorously analysed once the spatial differential operator is approximated by an appropriate positive semi-definite matrix, subject to suitable temporal and spatial smoothness. A remarkable characteristic of this new approach is that the requirement of temporal smoothness is reduced compared with the traditional numerical methods for PDEs in the literature. Numerical results demonstrate the advantage and efficiency of our time integrators in comparison with the existing numerical approaches.
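
    The paper's Birkhoff-Hermite integrators are arbitrarily high-order; as a minimal, hypothetical illustration of the symmetric two-step structure only, the leapfrog (Störmer-Verlet) sketch below advances a sine-Gordon-type nonlinear Klein-Gordon equation on a small periodic grid, and its time symmetry is checked by stepping backwards to recover the initial state. All grid parameters are made up.

```python
import math

N = 32                                  # periodic grid points on [0, 2*pi)
DX = 2.0 * math.pi / N
DT = 0.05                               # well under the CFL-type limit ~DX

def accel(u):
    # Right-hand side u_xx - sin(u): centered difference with periodic wrap
    # (sine-Gordon, a standard nonlinear Klein-Gordon potential).
    return [(u[(i - 1) % N] - 2.0 * u[i] + u[(i + 1) % N]) / DX**2
            - math.sin(u[i]) for i in range(N)]

def leapfrog(u_prev, u_cur, steps):
    # Symmetric two-step update: u_next = 2*u_cur - u_prev + DT^2 * accel(u_cur)
    for _ in range(steps):
        a = accel(u_cur)
        u_next = [2.0 * u_cur[i] - u_prev[i] + DT**2 * a[i] for i in range(N)]
        u_prev, u_cur = u_cur, u_next
    return u_prev, u_cur

u0 = [0.5 * math.sin(i * DX) for i in range(N)]
a0 = accel(u0)
u1 = [u0[i] + 0.5 * DT**2 * a0[i] for i in range(N)]   # start from rest, u_t = 0

up, uc = leapfrog(u0, u1, 200)
# Time symmetry: swap the two stored levels and step back to the start.
bp, bc = leapfrog(uc, up, 200)
err = max(abs(bc[i] - u0[i]) for i in range(N))
print(err)
```

The reversal recovers the initial data to round-off, a hallmark of symmetric integrators that underlies their good long-time behaviour.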

  3. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on eMEGASim™ Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the required tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems as a basis for performing more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimPowerSystems™ DFIG wind turbine, and a detailed DFIG-based wind turbine using ARTEMiS™ components. The platform model implemented here consists of a high-voltage transmission system with four integrated wind farm models consisting in total of 65 DFIG-based wind turbines, and it was developed and tested on OPAL-RT's eMEGASim™ Real-Time Digital Simulator.

  4. Self-aligned blocking integration demonstration for critical sub-40nm pitch Mx level patterning

    NASA Astrophysics Data System (ADS)

    Raley, Angélique; Mohanty, Nihar; Sun, Xinghua; Farrell, Richard A.; Smith, Jeffrey T.; Ko, Akiteru; Metz, Andrew W.; Biolsi, Peter; Devilliers, Anton

    2017-04-01

    Multipatterning has enabled continued scaling of chip technology at the 28nm node and beyond. Self-aligned double patterning (SADP) and self-aligned quadruple patterning (SAQP) as well as Litho-Etch/Litho-Etch (LELE) iterations are widely used in the semiconductor industry to enable patterning below 193nm immersion lithography resolution for layers such as FIN, Gate and critical Metal lines. Multipatterning requires the use of multiple masks, which is costly and increases process complexity as well as edge placement error variation, driven mostly by overlay. To mitigate the strict overlay requirements for advanced technology nodes (7nm and below), a self-aligned blocking integration is desirable. This integration trades off the overlay requirement for an etch selectivity requirement and enables the cut mask overlay tolerance to be relaxed from half pitch to three times half pitch. Self-alignment has become the latest trend to enable scaling, and self-aligned integrations are being pursued and investigated for various critical layers such as contact, via, and metal patterning. In this paper we propose and demonstrate a low-cost, flexible self-aligned blocking strategy for critical metal layer patterning for 7nm and beyond, from mask assembly to low-k dielectric etch. The integration is based on a 40nm pitch SADP flow with 2 cut masks compatible with either cut or block integration and employs dielectric films widely used in the back end of the line. As a consequence, this approach is compatible with traditional etch, deposition and cleans tools that are optimized for dielectric etches. We will review the critical steps and selectivities required to enable this integration along with benchmarking of each integration option (cut vs. block).

  5. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  6. Integrated watershed analysis: adapting to changing times

    Treesearch

    Gordon H. Reeves

    2013-01-01

    Resource managers are increasingly required to conduct integrated analyses of aquatic and terrestrial ecosystems before undertaking any activities. There are a number of research studies on the impacts of management actions on these ecosystems, as well as a growing body of knowledge about ecological processes that affect them, particularly aquatic ecosystems, which...

  7. A Model for an Integrated Learning Community.

    ERIC Educational Resources Information Center

    Van Sickle, Shaila; Mehs, Doreen

    Fort Lewis College (Colorado) developed a 17 credit, multidisciplinary learning program for first-time freshmen. The Integrated Learning Program (ILP) meets several of the college's general education requirements, is issue-oriented, and is taught by a team of five faculty members. The goals of the program include getting students to learn how to…

  8. Point Cloud-Based Automatic Assessment of 3D Computer Animation Courseworks

    ERIC Educational Resources Information Center

    Paravati, Gianluca; Lamberti, Fabrizio; Gatteschi, Valentina; Demartini, Claudio; Montuschi, Paolo

    2017-01-01

    Computer-supported assessment tools can bring significant benefits to both students and teachers. When integrated in traditional education workflows, they may help to reduce the time required to perform the evaluation and consolidate the perception of fairness of the overall process. When integrated within on-line intelligent tutoring systems,…

  9. Emphasizing the "Literacy" in "Scientific Literacy": A Concise Blueprint for Integrating Writing into Biology Classes

    ERIC Educational Resources Information Center

    Yule, Jeffrey V.; Wolf, William C.; Young, Nolan L.

    2010-01-01

    Effectively integrating writing into biology classes gives students the opportunity to develop a better understanding of and engagement with course content. Yet many instructors remain reluctant to emphasize writing. Some are concerned about the time commitment writing assessment requires. Others shy away from emphasizing writing in their classes…

  10. Faculty Perspectives on Effective Integration of Simulation into a Baccalaureate Nursing Curriculum

    ERIC Educational Resources Information Center

    Howell, Linda Jane

    2017-01-01

    Research shows that use of high fidelity simulation (HFS) as a teaching strategy requires extensive amounts of faculty time and financial resources for faculty development and equipment. This project study addressed the challenges encountered in the integration of HFS into a Midwestern metropolitan baccalaureate nursing program. The purpose of…

  11. Ramp Technology and Intelligent Processing in Small Manufacturing

    NASA Technical Reports Server (NTRS)

    Rentz, Richard E.

    1992-01-01

    To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by a factor of 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and data bases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.
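
    As a toy sketch of the kind of feature-to-process mapping the RAMP rule base performs (the features, rules, and machine names below are invented for illustration; the actual ICAD knowledge base is far richer):

```python
# Toy rule-based process planner in the spirit of the RAMP
# knowledge-based shell described above. All feature names, rules,
# and machine routings are invented for illustration.
RULES = {
    "hole":   {"operation": "drill", "machine": "drill_press"},
    "pocket": {"operation": "mill",  "machine": "3axis_mill"},
    "thread": {"operation": "tap",   "machine": "drill_press"},
}

def plan(features):
    """Map part features (e.g. extracted from a PDES/STEP file) to an
    ordered list of (feature, operation, machine) routing steps."""
    steps = []
    for feature in features:
        rule = RULES.get(feature)
        if rule is None:
            raise ValueError(f"no rule for feature {feature!r}")
        steps.append((feature, rule["operation"], rule["machine"]))
    return steps

for step in plan(["hole", "pocket", "thread"]):
    print(step)
```

    The output of such a planner, ordered operations routed to specific machines, is what becomes part of the shop work order.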

  12. Ramp technology and intelligent processing in small manufacturing

    NASA Astrophysics Data System (ADS)

    Rentz, Richard E.

    1992-04-01

    To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by a factor of 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and data bases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.

  13. Integrating amplifiers using cooled JFETs

    NASA Technical Reports Server (NTRS)

    Low, F. J.

    1984-01-01

    It is shown how a simple integrating amplifier based on commercially available JFET and MOSFET switches can be used to measure photocurrents from detectors with noise levels as low as 1.6 × 10⁻¹⁸ A/√Hz (10 electrons/sec). A figure shows the basic circuit, along with the waveform at the output. The readout is completely nondestructive; the reset noise does not contribute since sampling of the accumulated charge occurs between resets, which are required only when the stored charge has reached a very high level. The storage capacity ranges from 10⁶ to 10⁹ electrons, depending on detector parameters and linearity requirements. Data taken with an Si:Sb detector operated at 24 microns are presented. The responsivity agrees well with the value obtained by Young et al. (1981) in the transimpedance amplifier circuit. The data are seen as indicating that extremely low values of NEP can be obtained for integration times of 1 sec and that longer integrations continue to improve the SNR at a rate faster than the square root of time when background noise is not present.
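
    The claim that SNR improves faster than √t in the absence of background noise follows from a read-noise-limited model: the signal charge grows linearly with integration time while the per-sample read noise is a fixed charge uncertainty. A deliberately simplified sketch (this model ignores the signal's own shot noise; all numbers are illustrative):

```python
import math

def snr(photocurrent_e_per_s, t, read_noise_e=10.0, background_e_per_s=0.0):
    """SNR of a nondestructive integrating readout (illustrative model).

    Signal charge grows linearly with integration time t; shot noise
    on any background grows as sqrt(t); the read noise is a fixed
    charge uncertainty per sample. With zero background the SNR scales
    as t, i.e. faster than sqrt(t), matching the abstract.
    """
    signal = photocurrent_e_per_s * t
    noise = math.sqrt(read_noise_e**2 + background_e_per_s * t)
    return signal / noise

# With zero background, doubling the integration time doubles the SNR:
print(snr(10.0, 1.0), snr(10.0, 2.0))
```

    Once background shot noise dominates the read noise, the scaling reverts to the familiar √t.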

  14. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches. PMID:24982987
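
    As a rough illustration of producing two separate prioritized lists of functional and nonfunctional requirements, the sketch below uses a naive stakeholder-score sum. This scoring scheme is an assumption for illustration only, not the paper's method (which is evaluated against AHP and HAM):

```python
# Hypothetical requirement data: (name, kind, stakeholder ratings).
def prioritize(requirements):
    """Return (functional, nonfunctional) requirement names, each list
    ordered by descending total stakeholder score."""
    ranked = sorted(requirements, key=lambda r: -sum(r[2]))
    functional = [r[0] for r in ranked if r[1] == "FR"]
    nonfunctional = [r[0] for r in ranked if r[1] == "NFR"]
    return functional, nonfunctional

reqs = [
    ("login",       "FR",  [5, 4, 5]),
    ("export",      "FR",  [2, 3, 1]),
    ("response<1s", "NFR", [4, 5, 4]),
    ("encryption",  "NFR", [5, 5, 5]),
]
print(prioritize(reqs))  # (['login', 'export'], ['encryption', 'response<1s'])
```

    A single ranking pass that is then split by requirement kind keeps the two lists mutually consistent, one plausible motivation for integrating rather than separating the two prioritization processes.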

  15. An approach for integrating the prioritization of functional and nonfunctional requirements.

    PubMed

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.

  16. Computer-aided resource planning and scheduling for radiological services

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS, and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.
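
    As a toy illustration of the multi-exam turnaround concern described above (not the Constrained Resource Planning model itself), a greedy earliest-available-slot scheduler for one patient's exams might look like this; all modality names and times are invented:

```python
def schedule(exams, availability):
    """Greedily order a patient's exams by modality availability.

    exams: list of (exam, duration_h); availability: dict mapping
    exam -> next free time (h) of its modality.
    Returns (plan, turnaround) where plan is a list of
    (exam, start, end) and turnaround is the total elapsed time.
    """
    t, plan = 0.0, []
    for exam, duration in sorted(exams, key=lambda e: availability[e[0]]):
        start = max(t, availability[exam])  # patient and modality both free
        plan.append((exam, start, start + duration))
        t = start + duration
    return plan, t

exams = [("MRI", 0.75), ("X-ray", 0.25), ("CT", 0.5)]
availability = {"MRI": 1.0, "X-ray": 0.0, "CT": 0.25}
plan, turnaround = schedule(exams, availability)
print(plan)
print(turnaround)
```

    Even this naive heuristic shows why exam ordering matters: starting with the busiest modality would leave the patient idle while free modalities wait.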

  17. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions.

    PubMed

    Cendagorta, Joseph R; Bačić, Zlatko; Tuckerman, Mark E

    2018-03-14

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.

  18. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions

    NASA Astrophysics Data System (ADS)

    Cendagorta, Joseph R.; Bačić, Zlatko; Tuckerman, Mark E.

    2018-03-01

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.
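
    The starting point named in both versions of this record, the symmetrized correlation function expressed in complex time, can be written in standard notation (details may differ from the paper's conventions):

```latex
G_{AB}(t) \;=\; \frac{1}{Z}\,\operatorname{Tr}\!\left[\hat{A}\,
  e^{\,i\hat{H}t_c^{*}/\hbar}\,\hat{B}\,e^{-i\hat{H}t_c/\hbar}\right],
\qquad t_c \;=\; t - \frac{i\beta\hbar}{2},
\qquad Z \;=\; \operatorname{Tr}\,e^{-\beta\hat{H}}.
```

    The forward and backward propagators act at the complex times $t_c$ and $t_c^{*}$, which is what makes the sum-and-difference change of variables between the two propagation paths natural, and what leaves an imaginary-time (open-chain) path integral after the difference variables are integrated out.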

  19. Software as a service approach to sensor simulation software deployment

    NASA Astrophysics Data System (ADS)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem-domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision and Electronic Sensors Directorate (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and benefit the domain community through immediate deployment of lessons learned.

  20. Free nitrous acid serving as a pretreatment method for alkaline fermentation to enhance short-chain fatty acid production from waste activated sludge.

    PubMed

    Zhao, Jianwei; Wang, Dongbo; Li, Xiaoming; Yang, Qi; Chen, Hongbo; Zhong, Yu; Zeng, Guangming

    2015-07-01

    Alkaline condition (especially pH 10) has been demonstrated to be a promising method for short-chain fatty acid (SCFA) production from waste activated sludge anaerobic fermentation, because it can effectively inhibit the activities of methanogens. However, due to the limit of sludge solubilization rate, long fermentation time is required but SCFA yield is still limited. This paper reports a new pretreatment method for alkaline fermentation, i.e., using free nitrous acid (FNA) to pretreat sludge for 2 d, by which the fermentation time is remarkably shortened and meanwhile the SCFA production is significantly enhanced. Experimental results showed the highest SCFA production of 370.1 mg COD/g VSS (volatile suspended solids) was achieved at 1.54 mg FNA/L pretreatment integration with 2 d of pH 10 fermentation, which was 4.7- and 1.5-fold of that in the blank (uncontrolled) and sole pH 10 systems, respectively. The total time of this integration system was only 4 d, whereas the corresponding time was 15 d in the blank and 8 d in the sole pH 10 systems. The mechanism study showed that compared with pH 10, FNA pretreatment accelerated disruption of both extracellular polymeric substances and cell envelope. After FNA pretreatment, pH 10 treatment (1 d) caused 38.0% higher substrate solubilization than the sole FNA, which indicated that FNA integration with pH 10 could cause positive synergy on sludge solubilization. It was also observed that this integration method benefited hydrolysis and acidification processes. Therefore, more SCFA was produced, but less fermentation time was required in the integrated system. Copyright © 2015 Elsevier Ltd. All rights reserved.
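
    As a quick consistency check of the fold-change figures quoted above (the yields of the blank and sole pH 10 systems are back-computed here from the reported ratios, so they are approximate):

```python
# Reported: 370.1 mg COD/g VSS in the FNA + pH 10 integrated system,
# stated to be 4.7-fold the blank and 1.5-fold the sole pH 10 system.
integrated = 370.1
blank = integrated / 4.7        # ~78.7 mg COD/g VSS (back-computed)
sole_ph10 = integrated / 1.5    # ~246.7 mg COD/g VSS (back-computed)
print(round(blank, 1), round(sole_ph10, 1))
```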

  1. Simplified filtered Smith predictor for MIMO processes with multiple time delays.

    PubMed

    Santos, Tito L M; Torrico, Bismark C; Normey-Rico, Julio E

    2016-11-01

    This paper proposes a simplified tuning strategy for the multivariable filtered Smith predictor. It is shown that offset-free control can be achieved with step references and disturbances regardless of the poles of the primary controller, i.e., integral action is not explicitly required. This strategy reduces the number of design parameters and simplifies tuning procedure because the implicit integrative poles are not considered for design purposes. The simplified approach can be used to design continuous-time or discrete-time controllers. Three case studies are used to illustrate the advantages of the proposed strategy if compared with the standard approach, which is based on the explicit integrative action. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  2. High-resolution time series of Pseudomonas aeruginosa gene expression and rhamnolipid secretion through growth curve synchronization.

    PubMed

    van Ditmarsch, Dave; Xavier, João B

    2011-06-17

    Online spectrophotometric measurements allow monitoring of dynamic biological processes with high time resolution. In contrast, numerous other methods require laborious treatment of samples and can only be carried out offline. Integrating both types of measurement would allow biological processes to be analyzed more comprehensively. A typical example of this problem is acquiring quantitative data on rhamnolipid secretion by the opportunistic pathogen Pseudomonas aeruginosa. P. aeruginosa cell growth can be measured by optical density (OD600) and gene expression can be measured using reporter fusions with a fluorescent protein, allowing high time resolution monitoring. However, measuring the secreted rhamnolipid biosurfactants requires laborious sample processing, which makes this an offline measurement. Here, we propose a method to integrate growth curve data with endpoint measurements of secreted metabolites that is inspired by a model of exponential cell growth. If serially diluting an inoculum gives reproducible time series shifted in time, then time series of endpoint measurements can be reconstructed using the calculated time shifts between dilutions. We illustrate the method using measured rhamnolipid secretion by P. aeruginosa as endpoint measurements, and we integrate these measurements with high-resolution growth curves measured by OD600 and expression of rhamnolipid synthesis genes monitored using a reporter fusion. Two-fold serial dilution allowed integrating rhamnolipid measurements at a ~0.4 h⁻¹ frequency with highly time-resolved data measured at a 6 h⁻¹ frequency. We show how this simple method can be used in combination with mutants lacking specific genes in the rhamnolipid synthesis or quorum sensing regulation to acquire rich dynamic data on P. aeruginosa virulence regulation.
Additionally, the linear relation between the ratio of inocula and the time-shift between curves produces high-precision measurements of maximum specific growth rates, which were determined with a precision of ~5.4%. Growth curve synchronization allows integration of rich time-resolved data with endpoint measurements to produce time-resolved quantitative measurements. Such data can be valuable to unveil the dynamic regulation of virulence in P. aeruginosa. More generally, growth curve synchronization can be applied to many biological systems thus helping to overcome a key obstacle in dynamic regulation: the scarceness of quantitative time-resolved data.
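
    The time-shift relation underlying growth curve synchronization follows directly from exponential growth: diluting the inoculum by a factor d shifts the curve in time by ln(d)/μ, so the measured shift between serial dilutions recovers the maximum specific growth rate μ. A minimal sketch (the growth rate below is an assumed value, not from the paper):

```python
import math

def time_shift(dilution_factor, mu):
    """Expected time shift (h) between two cultures whose inocula
    differ by dilution_factor, growing exponentially at specific
    rate mu (1/h): OD(t) = OD0 * exp(mu * t)."""
    return math.log(dilution_factor) / mu

mu = 0.6  # assumed maximum specific growth rate, 1/h
shift = time_shift(2, mu)
print(shift)                 # shift per two-fold dilution, in hours

# Conversely, a measured shift between two-fold dilutions gives mu:
print(math.log(2) / shift)   # recovers mu
```

    Inverting the relation over many dilution pairs is what yields the high-precision growth-rate estimates (~5.4%) reported above.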

  3. Sequential ranging integration times in the presence of CW interference in the ranging channel

    NASA Technical Reports Server (NTRS)

    Mathur, Ashok; Nguyen, Tien

    1986-01-01

    The Deep Space Network (DSN), managed by the Jet Propulsion Laboratory for NASA, is used primarily for communication with interplanetary spacecraft. The high sensitivity required to achieve planetary communications makes the DSN very susceptible to radio-frequency interference (RFI). In this paper, an analytical model is presented of the performance degradation of the DSN sequential ranging subsystem in the presence of downlink CW interference in the ranging channel. A trade-off between the ranging component integration times and the ranging signal-to-noise ratio to achieve a desired level of range measurement accuracy and the probability of error in the code components is also presented. Numerical results presented illustrate the required trade-offs under various interference conditions.
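
    The trade-off described above can be sketched with a toy model in which the detected SNR of a ranging code component grows linearly with integration time, so interference that raises the effective noise floor lengthens the integration required to hit a target error probability. All numbers below are assumptions for illustration, not DSN specifications:

```python
def required_integration_time(target_snr_db, pr_over_n0_dbhz,
                              interference_db=0.0):
    """Integration time (s) to reach a target component SNR, given a
    received ranging power-to-noise spectral density Pr/N0 (dB-Hz).
    CW interference is modeled crudely as a dB penalty on Pr/N0.
    Toy model: SNR = (Pr/N0) * T."""
    target = 10 ** (target_snr_db / 10)
    pr_n0 = 10 ** ((pr_over_n0_dbhz - interference_db) / 10)
    return target / pr_n0

print(required_integration_time(20.0, 25.0))                       # clean channel
print(required_integration_time(20.0, 25.0, interference_db=3.0))  # with CW interference
```

    In this model a 3 dB interference penalty doubles the required component integration time, which is the qualitative shape of the trade-offs the paper quantifies.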

  4. Intelligent systems technology infrastructure for integrated systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry, Jr.

    1991-01-01

    Significant advances have occurred during the last decade in intelligent systems technologies (a.k.a. knowledge-based systems, KBS) including research, feasibility demonstrations, and technology implementations in operational environments. Evaluation and simulation data obtained to date in real-time operational environments suggest that cost-effective utilization of intelligent systems technologies can be realized for Automated Rendezvous and Capture applications. The successful implementation of these technologies involve a complex system infrastructure integrating the requirements of transportation, vehicle checkout and health management, and communication systems without compromise to systems reliability and performance. The resources that must be invoked to accomplish these tasks include remote ground operations and control, built-in system fault management and control, and intelligent robotics. To ensure long-term evolution and integration of new validated technologies over the lifetime of the vehicle, system interfaces must also be addressed and integrated into the overall system interface requirements. An approach for defining and evaluating the system infrastructures including the testbed currently being used to support the on-going evaluations for the evolutionary Space Station Freedom Data Management System is presented and discussed. Intelligent system technologies discussed include artificial intelligence (real-time replanning and scheduling), high performance computational elements (parallel processors, photonic processors, and neural networks), real-time fault management and control, and system software development tools for rapid prototyping capabilities.

  5. Integrated Human-Robotic Missions to the Moon and Mars: Mission Operations Design Implications

    NASA Technical Reports Server (NTRS)

    Mishkin, Andrew; Lee, Young; Korth, David; LeBlanc, Troy

    2007-01-01

    For most of the history of space exploration, human and robotic programs have been independent, and have responded to distinct requirements. The NASA Vision for Space Exploration calls for the return of humans to the Moon, and the eventual human exploration of Mars; the complexity of this range of missions will require an unprecedented use of automation and robotics in support of human crews. The challenges of human Mars missions, including roundtrip communications time delays of 6 to 40 minutes, interplanetary transit times of many months, and the need to manage lifecycle costs, will require the evolution of a new mission operations paradigm far less dependent on real-time monitoring and response by an Earthbound operations team. Robotic systems and automation will augment human capability, increase human safety by providing means to perform many tasks without requiring immediate human presence, and enable the transfer of traditional mission control tasks from the ground to crews. Developing and validating the new paradigm and its associated infrastructure may place requirements on operations design for nearer-term lunar missions. The authors, representing both the human and robotic mission operations communities, assess human lunar and Mars mission challenges, and consider how human-robot operations may be integrated to enable efficient joint operations, with the eventual emergence of a unified exploration operations culture.

  6. Integrated Human-Robotic Missions to the Moon and Mars: Mission Operations Design Implications

    NASA Technical Reports Server (NTRS)

    Korth, David; LeBlanc, Troy; Mishkin, Andrew; Lee, Young

    2006-01-01

    For most of the history of space exploration, human and robotic programs have been independent, and have responded to distinct requirements. The NASA Vision for Space Exploration calls for the return of humans to the Moon, and the eventual human exploration of Mars; the complexity of this range of missions will require an unprecedented use of automation and robotics in support of human crews. The challenges of human Mars missions, including roundtrip communications time delays of 6 to 40 minutes, interplanetary transit times of many months, and the need to manage lifecycle costs, will require the evolution of a new mission operations paradigm far less dependent on real-time monitoring and response by an Earthbound operations team. Robotic systems and automation will augment human capability, increase human safety by providing means to perform many tasks without requiring immediate human presence, and enable the transfer of traditional mission control tasks from the ground to crews. Developing and validating the new paradigm and its associated infrastructure may place requirements on operations design for nearer-term lunar missions. The authors, representing both the human and robotic mission operations communities, assess human lunar and Mars mission challenges, and consider how human-robot operations may be integrated to enable efficient joint operations, with the eventual emergence of a unified exploration operations culture.

  7. Solar System Chaos and Orbital Solutions for Paleoclimate Studies: Limits and New Results

    NASA Astrophysics Data System (ADS)

    Zeebe, R. E.

    2017-12-01

    I report results from accurate numerical integrations of Solar System orbits over the past 100 Myr. The simulations used different integrator algorithms, step sizes, and initial conditions (NASA, INPOP), and included effects from general relativity, different models of the Moon, the Sun's quadrupole moment, and up to ten asteroids. In one simulation, I probed the potential effect of a hypothetical Planet 9 on the dynamics of the system. The most expensive integration required 4 months wall-clock time (Bulirsch-Stoer algorithm) and showed a maximum relative energy error < 2.5 × 10⁻¹³ over the past 100 Myr. The difference in Earth's eccentricity (ΔeE) was used to track the difference between two solutions, which were considered to diverge at time tau when ΔeE irreversibly crossed 10% of Earth's mean eccentricity (0.028 × 0.1). My results indicate that finding a unique orbital solution is limited by initial conditions from current ephemerides to 54 Myr. Bizarrely, the 4-month Bulirsch-Stoer integration and a different integration scheme that required only 5 hours wall-clock time (symplectic, 12-day time step, Moon as a simple quadrupole perturbation) agree to 63 Myr. Solutions including 3 and 10 asteroids diverge at tau ≈ 48 Myr. The effect of a hypothetical Planet 9 on ΔeE becomes discernible at 66 Myr. Using tau as a criterion, the current state-of-the-art solutions all differ from previously published results beyond 50 Myr. The current study provides new orbital solutions for application in geological studies. I will also comment on the prospect of constraining astronomical solutions by geologic data.
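
    The relative energy error quoted above is the standard bookkeeping for validating long integrations. A minimal sketch of the idea, using a leapfrog (symplectic) integrator on the two-body problem in normalized units (GM = 1); the step size and duration are illustrative only:

```python
import math

def energy(x, y, vx, vy):
    """Specific orbital energy in normalized units (GM = 1)."""
    r = math.hypot(x, y)
    return 0.5 * (vx * vx + vy * vy) - 1.0 / r

def leapfrog(x, y, vx, vy, dt, steps):
    """Kick-drift-kick leapfrog for the two-body problem."""
    def acc(x, y):
        r3 = math.hypot(x, y) ** 3
        return -x / r3, -y / r3
    ax, ay = acc(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        x += dt * vx; y += dt * vy
        ax, ay = acc(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
    return x, y, vx, vy

e0 = energy(1.0, 0.0, 0.0, 1.0)  # circular orbit: E = -0.5
xf, yf, vxf, vyf = leapfrog(1.0, 0.0, 0.0, 1.0, 0.01, 10000)
print(abs(energy(xf, yf, vxf, vyf) - e0) / abs(e0))  # stays bounded and small
```

    The bounded (rather than secularly growing) energy error is the property that makes symplectic schemes competitive with far more expensive integrators over 100 Myr timescales, as the abstract's Bulirsch-Stoer vs. symplectic comparison illustrates.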

  8. Mission Engineering of a Rapid Cycle Spacecraft Logistics Fleet

    NASA Technical Reports Server (NTRS)

    Holladay, Jon; McClendon, Randy (Technical Monitor)

    2002-01-01

    The requirement for logistics re-supply of the International Space Station has provided a unique opportunity for engineering the implementation of NASA's first dedicated pressurized logistics carrier fleet. The NASA fleet comprises three Multi-Purpose Logistics Modules (MPLM) provided to NASA by the Italian Space Agency in return for operations time aboard the International Space Station. Marshall Space Flight Center was responsible for oversight of the hardware development from preliminary design through acceptance of the third flight unit, and currently manages the flight hardware sustaining engineering and mission engineering activities. The MPLM mission actually began prior to NASA acceptance of the first flight unit in 1999 and will continue until the decommissioning of the International Space Station, planned for 20xx. Mission engineering of the MPLM program requires a broad focus on three distinct yet inter-related operations processes: pre-flight, flight operations, and post-flight turn-around. Within each primary area exist several complex subsets of distinct and inter-related activities. Pre-flight processing includes the evaluation of carrier hardware readiness for space flight. This includes integration of payload into the carrier, integration of the carrier into the launch vehicle, and integration of the carrier onto the orbital platform. Flight operations include the actual carrier operations during flight and any required real-time ground support. Post-flight processing includes de-integration of the carrier hardware from the launch vehicle, de-integration of the payload, and preparation for returning the carrier to pre-flight staging. Typical space operations are engineered around the requirements and objectives of a dedicated mission on a dedicated operational platform (i.e., Launch or Orbiting Vehicle).
The MPLM, however, has expanded this envelope by requiring operations with both vehicles during flight as well as pre-launch and post-landing operations. These unique requirements combined with a success-oriented schedule of four flights within a ten-month period have provided numerous opportunities for understanding and improving operations processes. Furthermore, it has increased the knowledge base of future Payload Carrier and Launch Vehicle hardware and requirement developments. Discussion of the process flows and target areas for process improvement are provided in the subject paper. Special emphasis is also placed on supplying guidelines for hardware development. The combination of process knowledge and hardware development knowledge will provide a comprehensive overview for future vehicle developments as related to integration and transportation of payloads.

  9. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
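
    As a hypothetical illustration of the multi-horizon conflict probe in item (iii), a predicted time-to-conflict can be binned into the 10/5/3/<3-minute horizons used in the scenario; the alert names below are assumptions, not the paper's terminology:

```python
# Hypothetical sketch: bin a predicted time-to-conflict (minutes) into the
# 10/5/3/<3-minute horizons mentioned above. Alert names are illustrative.
def alert_level(time_to_conflict_min):
    if time_to_conflict_min > 10:
        return "monitor"
    if time_to_conflict_min > 5:
        return "advisory"
    if time_to_conflict_min > 3:
        return "caution"
    return "warning"

levels = [alert_level(t) for t in (12.0, 7.0, 4.0, 2.0)]
```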

  10. Assessing Backwards Integration as a Method of KBO Family Finding

    NASA Astrophysics Data System (ADS)

    Benfell, Nathan; Ragozzine, Darin

    2018-04-01

    The age of young asteroid collisional families can sometimes be determined by using backwards n-body integrations of the solar system. This method is not used for discovering young asteroid families and is limited by the unpredictable influence of the Yarkovsky effect on individual asteroids over time. Since these limitations are not as important for objects in the Kuiper belt, Marcus et al. 2011 suggested that backwards integration could be used to discover and characterize collisional families in the outer solar system. However, various challenges arise when running precise and accurate 4+ Gyr integrations of Kuiper Belt objects. We have created simulated families of Kuiper Belt Objects with identical starting locations and velocity distributions, based on the Haumea family. We then ran several long-term test integrations to observe the effect of various simulation parameters on integration results. These integrations were then used to investigate which parameters are significant enough to require inclusion in the integration. We thereby determined how to construct long-term integrations that both yield significant results and require manageable processing power. Additionally, we have tested the use of backwards integration as a method of discovering potential young families in the Kuiper Belt.
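
    A toy sketch of why symplectic, time-reversible integrators are attractive for such backwards integrations (this is generic textbook leapfrog on a two-body orbit, not the authors' setup): integrating forward and then backward with the same scheme recovers the initial state to near machine precision.

```python
# Toy sketch: kick-drift-kick leapfrog is symplectic and exactly
# time-reversible in exact arithmetic, so running an orbit forward and then
# backward returns (nearly) to the starting state. Units chosen so G*M = 1.

def accel(x, y):
    # Point-mass gravity toward the origin
    r3 = (x * x + y * y) ** 1.5
    return -x / r3, -y / r3

def leapfrog(x, y, vx, vy, dt, steps):
    for _ in range(steps):
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax          # half kick
        vy += 0.5 * dt * ay
        x += dt * vx                 # drift
        y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax          # half kick
        vy += 0.5 * dt * ay
    return x, y, vx, vy

x0, y0, vx0, vy0 = 1.0, 0.0, 0.0, 1.0        # circular orbit at r = 1
fwd = leapfrog(x0, y0, vx0, vy0, 0.01, 5000)  # integrate forward
x, y, vx, vy = leapfrog(*fwd, -0.01, 5000)    # integrate backward
err = max(abs(x - x0), abs(y - y0), abs(vx - vx0), abs(vy - vy0))
```

    Non-symplectic schemes accumulate secular energy drift over Gyr timescales, which would swamp the small relative drifts used to identify family members; reversibility makes the round-trip error purely a roundoff effect.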

  11. The Wind Integration National Dataset (WIND) toolkit (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draxl, Caroline (NREL)

    2014-01-01

    Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, and must be time-synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production, and forecast dataset.

  12. Technical integration of hippocampus, Basal Ganglia and physical models for spatial navigation.

    PubMed

    Fox, Charles; Humphries, Mark; Mitchinson, Ben; Kiss, Tamas; Somogyvari, Zoltan; Prescott, Tony

    2009-01-01

    Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.
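
    As an illustration of the rate-coded leaky-integrator units mentioned above (a generic textbook form with assumed parameters, not the authors' exact equations), a single unit relaxes toward its input with time constant tau:

```python
# Generic leaky-integrator rate unit: tau * da/dt = -a + input, with the
# output rate rectified at zero. Parameters (dt, tau) are assumptions.
def step_leaky_integrator(a, inp, dt=0.001, tau=0.02):
    """One forward-Euler step of the leaky-integrator dynamics."""
    return a + dt / tau * (inp - a)

a = 0.0
for _ in range(1000):                 # 1 s of constant drive
    a = step_leaky_integrator(a, 1.0)
rate = max(0.0, a)                    # converges toward the input (1.0)
```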

  13. Impact of integration of sexual and reproductive health services on consultation duration times: results from the Integra Initiative.

    PubMed

    Siapka, Mariana; Obure, Carol Dayo; Mayhew, Susannah H; Sweeney, Sedona; Fenty, Justin; Vassall, Anna

    2017-11-01

    The lack of human resources is a key challenge in scaling up of HIV services in Africa's health care system. Integrating HIV services could potentially increase their effectiveness and optimize the use of limited resources and clinical staff time. We examined the impact of integration of provider initiated HIV counselling and testing (PITC) and family planning (FP counselling and FP provision) services on duration of consultation to assess the impact of PITC and FP integration on staff workload. This study was conducted in 24 health facilities in Kenya under the Integra Initiative, a non-randomized, pre/post intervention trial to evaluate the impact of integrated HIV and sexual and reproductive health services on health and service outcomes. We compared the time spent providing PITC-only services, FP-only services and integrated PITC/FP services. We used log-linear regression to assess the impact of plausible determinants on the duration of clients' consultation times. Median consultation duration times were highest for PITC-only services (30 min), followed by integrated services (10 min) and FP-only services (8 min). Times for PITC-only and FP-only services were 69.7% higher (95% Confidence Intervals (CIs) 35.8-112.0) and 43.9% lower (95% CIs -55.4 to - 29.6) than times spent on these services when delivered as an integrated service, respectively. The reduction in consultation times with integration suggests a potential reduction in workload. The higher consultation time for PITC-only could be because more pre- and post-counselling is provided at these stand-alone services. In integrated PITC/FP services, the duration of the visit fell below that required by HIV testing guidelines, and service mix between counselling and testing substantially changed. 
Integration of HIV with FP services may compromise the quality of services delivered and care must be taken to clearly specify and monitor appropriate consultation duration times and procedures during the process of integrating HIV and FP services. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
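
    The reported percentage effects follow the usual convention for log-linear models, where a coefficient b on log(duration) corresponds to a (exp(b) - 1) x 100% difference. The sketch below back-solves illustrative coefficients from the paper's reported effects; the coefficient values themselves are assumptions for illustration.

```python
# Assumed reporting convention for log-transformed outcomes: a log-linear
# coefficient b implies a percentage difference of (exp(b) - 1) * 100%.
import math

def pct_diff(b):
    return (math.exp(b) - 1.0) * 100.0

# Coefficients back-solved from the reported effects (illustrative only).
b_pitc_only = math.log(1.697)   # PITC-only vs. integrated: +69.7%
b_fp_only   = math.log(0.561)   # FP-only  vs. integrated: -43.9%
pitc_effect = pct_diff(b_pitc_only)
fp_effect = pct_diff(b_fp_only)
```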

  14. Impact of integration of sexual and reproductive health services on consultation duration times: results from the Integra Initiative

    PubMed Central

    Siapka, Mariana; Obure, Carol Dayo; Mayhew, Susannah H; Fenty, Justin; Initiative, Integra; Vassall, Anna

    2017-01-01

    The lack of human resources is a key challenge in scaling up of HIV services in Africa’s health care system. Integrating HIV services could potentially increase their effectiveness and optimize the use of limited resources and clinical staff time. We examined the impact of integration of provider initiated HIV counselling and testing (PITC) and family planning (FP counselling and FP provision) services on duration of consultation to assess the impact of PITC and FP integration on staff workload. This study was conducted in 24 health facilities in Kenya under the Integra Initiative, a non-randomized, pre/post intervention trial to evaluate the impact of integrated HIV and sexual and reproductive health services on health and service outcomes. We compared the time spent providing PITC-only services, FP-only services and integrated PITC/FP services. We used log-linear regression to assess the impact of plausible determinants on the duration of clients’ consultation times. Median consultation duration times were highest for PITC-only services (30 min), followed by integrated services (10 min) and FP-only services (8 min). Times for PITC-only and FP-only services were 69.7% higher (95% Confidence Intervals (CIs) 35.8–112.0) and 43.9% lower (95% CIs −55.4 to − 29.6) than times spent on these services when delivered as an integrated service, respectively. The reduction in consultation times with integration suggests a potential reduction in workload. The higher consultation time for PITC-only could be because more pre- and post-counselling is provided at these stand-alone services. In integrated PITC/FP services, the duration of the visit fell below that required by HIV testing guidelines, and service mix between counselling and testing substantially changed. 
Integration of HIV with FP services may compromise the quality of services delivered and care must be taken to clearly specify and monitor appropriate consultation duration times and procedures during the process of integrating HIV and FP services. PMID:29194545

  15. Real-time simulations for automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Cuseo, John A.

    1991-01-01

    Although the individual technologies for automated rendezvous and capture (AR&C) exist, they have not yet been integrated to produce a working system in the United States. Thus, real-time integrated systems simulations are critical to the development and pre-flight demonstration of an AR&C capability. Real-time simulations require a level of development more typical of a flight system compared to purely analytical methods, thus providing confidence in derived design concepts. This presentation will describe Martin Marietta's Space Operations Simulation (SOS) Laboratory, a state-of-the-art real-time simulation facility for AR&C, along with an implementation for the Satellite Servicer System (SSS) Program.

  16. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

    As information & communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data from heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7 based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271

  17. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    PubMed

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function, and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. 
Function fit parameters and their standard error estimated by using SAAM numerical and NUKFIT showed differences of <1%. The differences for the time-integrated activity coefficients were also <1% (standard error between 0.4% and 3%). In general, the application of the software is user-friendly and the results are mathematically correct and reproducible. An application of NUKFIT is presented for three different clinical examples. The software tool with its underlying methodology can be employed to objectively and reproducibly estimate the time integrated activity coefficient and its standard error for most time activity data in molecular radiotherapy.
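
    The core pipeline — fit a sum of exponentials to time-activity data, then integrate the fitted function analytically — can be sketched for the mono-exponential case. This is a simplified illustration, not NUKFIT's algorithm, which additionally handles error models, Bayesian priors, starting-value search, and model selection via the corrected Akaike information criterion.

```python
# Simplified sketch: fit a(t) = A * exp(-lam * t) by log-linear least squares
# on noise-free synthetic data, then integrate analytically over [0, inf),
# giving the time-integrated activity coefficient A / lam.
import math

def fit_monoexp(t, a):
    """Log-linear least-squares fit of a(t) = A * exp(-lam * t)."""
    y = [math.log(v) for v in a]
    n = len(t)
    tm, ym = sum(t) / n, sum(y) / n
    slope = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y))
             / sum((ti - tm) ** 2 for ti in t))
    return math.exp(ym - slope * tm), -slope   # A, lam

# Synthetic activity curve: A = 0.8, lam = 0.1 per hour
t = [1.0, 2.0, 4.0, 8.0, 24.0, 48.0]
a = [0.8 * math.exp(-0.1 * ti) for ti in t]
A, lam = fit_monoexp(t, a)
coeff = A / lam   # analytic integral of A*exp(-lam*t) over [0, inf)
```

    Analytic integration of the fitted model avoids the truncation error of numerically integrating sparse late time points, which is one reason the paper integrates the fitted functions rather than the raw data.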

  18. Module generation for self-testing integrated systems

    NASA Astrophysics Data System (ADS)

    Vanriessen, Ronald Pieter

    Hardware used for self-test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. Because of the increase in test times, the use of self-test techniques has become practically and economically viable for VLSI systems. Besides the reduction in test times and costs, self-test also provides testing at operational speeds. Therefore, a suitable combination of scan path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from a system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to solve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler. The definition of a self-test compiler is given: a self-test compiler is a software tool that selects an appropriate test method for every macro in a design. The hardware to control a macro test will be included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described. The design of a demonstrator circuit by the self-test compiler is described. This circuit consists of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary scan standard.

  19. Non-electrical-power temperature-time integrating sensor for RFID based on microfluidics

    NASA Astrophysics Data System (ADS)

    Schneider, Mike; Hoffmann, Martin

    2011-06-01

    The integration of RFID tags into packages offers the opportunity to combine the logistic advantages of the technology with monitoring different parameters from inside the package at the same time. An essential demand for enhanced product safety, especially in the pharmaceutical or food industry, is the monitoring of the time-temperature integral. Thus, completely passive time-temperature integrators (TTI) requiring no battery, microprocessor, or data-logging devices are developed. A TTI representing the sterilization process inside an autoclave system is a demanding challenge: a temperature of at least 120 °C has to be maintained over 45 minutes to assure that no unwanted organism remains. As temperature increases, the viscosity of the fluid changes and thus the speed of the fluid inside the channel increases. The filled length of the channel represents the time-temperature integral affecting the system. Measurements as well as simulations allow drawing conclusions about the influence of the geometrical parameters of the system and provide the possibility of adaptation. Thus a completely passive sensor element for monitoring an integral parameter, with no external electrical power supply or data processing technology, is demonstrated. Furthermore, it is shown how to adjust the specific TTI parameters of the sensor to different applications and needs by modifying the geometrical parameters of the system.
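
    The principle can be sketched numerically: flow speed in the channel scales inversely with viscosity, and viscosity drops with temperature, so the filled length records a time-temperature integral. The Arrhenius viscosity model and all parameter values below are assumptions; real capillary filling (Washburn-type) also slows as the channel fills, which this sketch ignores.

```python
# Hedged sketch of a microfluidic TTI: filled length = integral over time of
# v_ref / relative_viscosity(T). Arrhenius model and parameters are assumed.
import math

def viscosity(T_celsius, mu_ref=1.0, E_over_R=2000.0, T_ref=20.0):
    """Arrhenius-type relative viscosity, equal to mu_ref at T_ref (degC)."""
    T, Tr = T_celsius + 273.15, T_ref + 273.15
    return mu_ref * math.exp(E_over_R * (1.0 / T - 1.0 / Tr))

def filled_length(temps, dt_min, v_ref=1.0):
    """Channel length filled over a temperature profile sampled every dt_min."""
    return sum(v_ref / viscosity(T) * dt_min for T in temps)

hold = filled_length([121.0] * 45, 1.0)     # 45 min sterilization hold
ambient = filled_length([20.0] * 45, 1.0)   # 45 min at room temperature
```

    A sterilization hold fills several times more channel than the same duration at ambient temperature, which is exactly why a threshold mark on the channel can certify that the required time-temperature integral was reached.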

  20. Canonical fluid thermodynamics

    NASA Technical Reports Server (NTRS)

    Schmid, L. A.

    1972-01-01

    The space-time integral of the thermodynamic pressure plays the role of the thermodynamic potential for compressible, adiabatic flow in the sense that the pressure integral for stable flow is less than for all slightly different flows. This stability criterion can be converted into a variational minimum principle by requiring the molar free-enthalpy and the temperature, which are the arguments of the pressure function, to be generalized velocities, that is, the proper-time derivatives of scalar space-time functions which are generalized coordinates in the canonical formalism. In a fluid context, proper-time differentiation must be expressed in terms of three independent quantities that specify the fluid velocity. This can be done in several ways, all of which lead to different variants (canonical transformations) of the same constraint-free action integral whose Euler-Lagrange equations are just the well-known equations of motion for adiabatic compressible flow.

  1. 76 FR 44961 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Alternative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... of the IRAs such as rates of return and any restrictions on withdrawals. Moreover, general... requirements, and any withdrawal restrictions; and the tax treatment of the SEP-related IRA. Furthermore... requirements to participants and information regarding social security integration (if applicable); and timely...

  2. The Power Dynamics and Politics of Survey Design: Measuring Workload Associated with Teaching, Administering and Supporting Work-Integrated Learning Courses

    ERIC Educational Resources Information Center

    Clark, Lindie; Rowe, Anna; Cantori, Alex; Bilgin, Ayse; Mukuria, Valentine

    2016-01-01

    Work-integrated learning (WIL) courses can be more time consuming and resource intensive to design, teach, administer and support than classroom-based courses, as they generally require different curricula and pedagogical approaches as well as additional administrative and pastoral responsibilities. Workload and resourcing issues are reported as…

  3. Using assistive robots to promote inclusive education.

    PubMed

    Encarnação, P; Leite, T; Nunes, C; Nunes da Ponte, M; Adams, K; Cook, A; Caiado, A; Pereira, J; Piedade, G; Ribeiro, M

    2017-05-01

    This paper describes the development and test of physical and virtual integrated augmentative manipulation and communication assistive technologies (IAMCATs) that enable children with motor and speech impairments to manipulate educational items by controlling a robot with a gripper, while communicating through a speech generating device. Nine children with disabilities, nine regular and nine special education teachers participated in the study. Teachers adapted academic activities so they could also be performed by the children with disabilities using the IAMCAT. An inductive content analysis of the teachers' interviews before and after the intervention was performed. Teachers considered the IAMCAT to be a useful resource that can be integrated into the regular class dynamics respecting their curricular planning. It had a positive impact on children with disabilities and on the educational community. However, teachers pointed out the difficulties in managing the class, even with another adult present, due to the extra time required by children with disabilities to complete the activities. The developed assistive technologies enable children with disabilities to participate in academic activities, but full inclusion would require another adult in class and strategies to deal with the additional time required by children to complete the activities. Implications for Rehabilitation Integrated augmentative manipulation and communication assistive technologies are useful resources to promote the participation of children with motor and speech impairments in classroom activities. Virtual tools, running on a computer screen, may be easier to use, but further research is needed in order to evaluate their effectiveness when compared to physical tools. 
Full participation of children with motor and speech impairments in academic activities using these technologies requires another adult in class and adequate strategies to manage the extra time the child with disabilities may require to complete the activities.

  4. Performance Enhancements Under Dual-task Conditions

    NASA Technical Reports Server (NTRS)

    Kramer, A. F.; Wickens, C. D.; Donchin, E.

    1984-01-01

    Research on dual-task performance has been concerned with delineating the antecedent conditions which lead to dual-task decrements. Capacity models of attention, which propose that a hypothetical resource structure underlies performance, have been employed as predictive devices. These models predict that tasks which require different processing resources can be more successfully time shared than tasks which require common resources. The conditions under which such dual-task integrality can be fostered were assessed in a study in which three factors likely to influence the integrality between tasks were manipulated: inter-task redundancy, the physical proximity of tasks and the task relevant objects. Twelve subjects participated in three experimental sessions in which they performed both single and dual-tasks. The primary task was a pursuit step tracking task. The secondary tasks required the discrimination between different intensities or different spatial positions of a stimulus. The results are discussed in terms of a model of dual-task integrality.

  5. Program Manager: Journal of the Defense Systems Management College. Volume 19, Number 1, January-February 1990

    DTIC Science & Technology

    1990-02-01

    ...proposal to the HQ USAF... While the report indicated that this recommended a long-range, integrated... optimum communication between the user and the... (SRA), conduct of tradeoff studies, and development of... The three functional Working Groups are separated into the... Integrated logistics program planning... as well as a rapid retargeting capability. V. SYSTEM REQUIREMENTS... Working Conditions... At the same time, SAC realized that... VI. SYSTEM

  6. Object Representations in Human Visual Cortex Formed Through Temporal Integration of Dynamic Partial Shape Views.

    PubMed

    Orlov, Tanya; Zohary, Ehud

    2018-01-17

    We typically recognize visual objects using the spatial layout of their parts, which are present simultaneously on the retina. Therefore, shape extraction is based on integration of the relevant retinal information over space. The lateral occipital complex (LOC) can represent shape faithfully in such conditions. However, integration over time is sometimes required to determine object shape. To study shape extraction through temporal integration of successive partial shape views, we presented human participants (both men and women) with artificial shapes that moved behind a narrow vertical or horizontal slit. Only a tiny fraction of the shape was visible at any instant at the same retinal location. However, observers perceived a coherent whole shape instead of a jumbled pattern. Using fMRI and multivoxel pattern analysis, we searched for brain regions that encode temporally integrated shape identity. We further required that the representation of shape should be invariant to changes in the slit orientation. We show that slit-invariant shape information is most accurate in the LOC. Importantly, the slit-invariant shape representations matched the conventional whole-shape representations assessed during full-image runs. Moreover, when the same slit-dependent shape slivers were shuffled, thereby preventing their spatiotemporal integration, slit-invariant shape information was reduced dramatically. The slit-invariant representation of the various shapes also mirrored the structure of shape perceptual space as assessed by perceptual similarity judgment tests. Therefore, the LOC is likely to mediate temporal integration of slit-dependent shape views, generating a slit-invariant whole-shape percept. These findings provide strong evidence for a global encoding of shape in the LOC regardless of integration processes required to generate the shape percept. 
SIGNIFICANCE STATEMENT Visual objects are recognized through spatial integration of features available simultaneously on the retina. The lateral occipital complex (LOC) represents shape faithfully in such conditions even if the object is partially occluded. However, shape must sometimes be reconstructed over both space and time. Such is the case in anorthoscopic perception, when an object is moving behind a narrow slit. In this scenario, spatial information is limited at any moment so the whole-shape percept can only be inferred by integration of successive shape views over time. We find that LOC carries shape-specific information recovered using such temporal integration processes. The shape representation is invariant to slit orientation and is similar to that evoked by a fully viewed image. Existing models of object recognition lack such capabilities. Copyright © 2018 the authors 0270-6474/18/380659-20$15.00/0.

  7. Teacher perceptions of usefulness of mobile learning devices in rural secondary science classrooms

    NASA Astrophysics Data System (ADS)

    Tighe, Lisa

    The internet and easy accessibility to a wide range of digital content have created the necessity for teachers to embrace and integrate digital media in their curricula. Although current learning standards call for digital media integration in the curriculum, rural schools continue to have access to fewer resources due to limited budgets, potentially preventing teachers from having access to the most current technology and science instructional materials. This dissertation identifies the perceptions rural secondary science teachers have of the usefulness of mobile learning devices in the science classroom. The successes and challenges in using mobile learning devices in the secondary classroom were also explored. Throughout this research, teachers generally supported the integration of mobile devices in the classroom, while harboring some concerns relating to student distractibility and the time required to integrate mobile devices into existing curricula. Quantitative and qualitative data collected through surveys, interviews, and classroom observations revealed that teachers perceive that mobile devices bring benefits such as ease of communication and easy access to digital information. However, there are perceived challenges with the ability to effectively communicate complex scientific information via mobile devices, the distractibility of students, and the time required to develop effective curriculum that integrates digital media into the secondary science classroom.

  8. High-efficiency non-uniformity correction for wide dynamic linear infrared radiometry system

    NASA Astrophysics Data System (ADS)

    Li, Zhou; Yu, Yi; Tian, Qi-Jie; Chang, Song-Tao; He, Feng-Yun; Yin, Yan-He; Qiao, Yan-Feng

    2017-09-01

    A wide-dynamic-range linear infrared radiometry system with continuously variable integration time is typically operated at several different integration times; traditional calibration-based non-uniformity correction (NUC) must therefore be conducted for each integration time in turn, and requires several calibration sources, making calibration and the NUC process time-consuming. In this paper, the difference in NUC coefficients between different integration times is discussed, and a novel NUC method called high-efficiency NUC, which builds on traditional calibration-based NUC, is proposed. It obtains the correction coefficients for all integration times across the whole linear dynamic range by recording only three images of a standard blackbody. The mathematical procedure of the proposed non-uniformity correction method is first validated, and its performance is then demonstrated on a 400 mm diameter ground-based infrared radiometry system. Experimental results show that the proposed method reduces the mean Normalized Root Mean Square (NRMS) from 3.78% to 0.24%. In addition, results at 4 ms and 70 °C show that this method achieves higher accuracy than traditional calibration-based NUC, while a good correction effect is maintained at other integration times and temperatures. Moreover, it greatly reduces the number of integration-time and temperature sampling points required, offers good real-time performance, and is suitable for field measurement.
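
As a point of reference for the abstract above, the classic two-point calibration that calibration-based NUC methods build on can be sketched as follows. The linear detector model, noise levels, and flux values are illustrative assumptions; this is not the paper's three-image, multi-integration-time method.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)

# Simulated per-pixel gain/offset non-uniformity of a detector array
gain = rng.normal(1.0, 0.05, shape)
offset = rng.normal(0.0, 2.0, shape)

def raw(flux):
    # assumed linear detector response model
    return gain * flux + offset

# Two-point calibration: image two blackbody flux levels and solve
# per-pixel coefficients that map each pixel's response onto the
# frame-averaged response.
low, high = raw(100.0), raw(500.0)
g = (high.mean() - low.mean()) / (high - low)
o = low.mean() - g * low

corrected = g * raw(300.0) + o                     # correct a third flux level
nonuniformity = corrected.std() / corrected.mean() # residual non-uniformity
```

For a truly linear detector the residual non-uniformity at any intermediate flux drops to floating-point noise; the cost the paper addresses is that this calibration must normally be repeated per integration time.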

  9. A point implicit time integration technique for slow transient flow problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kadioglu, Samet Y.; Berry, Ray A.; Martineau, Richard C.

    2015-05-01

    We introduce a point implicit time integration technique for slow transient flow problems. The method treats the solution variables of interest (which can be located at cell centers, cell edges, or cell nodes) implicitly, while the rest of the information related to the same or other variables is handled explicitly. The method does not require implicit iteration; instead it advances the solution in time in a similar spirit to explicit methods, except that it involves a few additional function evaluation steps. Moreover, the method is unconditionally stable, as a fully implicit method would be. This new approach exhibits the simplicity of implementation of explicit methods and the stability of implicit methods. It is specifically designed for slow transient flow problems of long duration wherein one would like to perform time integration with very large time steps. Because the method can be time inaccurate for fast transient problems, particularly with larger time steps, an appropriate solution strategy for a problem that evolves from a fast to a slow transient would be to integrate the fast transient with an explicit or semi-implicit technique and then switch to this point implicit method as soon as the time variation slows sufficiently. We have solved several test problems that result from scalar or systems of flow equations. Our findings indicate the new method can integrate slow transient problems very efficiently, and its implementation is very robust.
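
The point-implicit idea of treating only the target variable implicitly can be illustrated on the model stiff equation dy/dt = -k*y + f(t). This is a minimal sketch under assumed parameters, not the authors' flow solver:

```python
import numpy as np

def point_implicit_step(y, t, dt, k, forcing):
    """One point-implicit step for dy/dt = -k*y + forcing(t).

    The stiff term in the target variable y is treated implicitly and
    the forcing explicitly, so no iteration is required:
        y_new = (y + dt*forcing(t)) / (1 + dt*k)
    """
    return (y + dt * forcing(t)) / (1.0 + dt * k)

def integrate(y0, t_end, dt, k, forcing):
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        y = point_implicit_step(y, t, dt, k, forcing)
        t += dt
    return y

# Stiff relaxation toward a slowly varying forcing, taken with a time
# step far beyond the explicit stability limit (here dt*k = 50).
y = integrate(1.0, 10.0, 0.5, k=100.0,
              forcing=lambda t: 100.0 * np.sin(0.1 * t))
```

Because the stiff term is inverted pointwise, the step remains stable at dt*k = 50, where explicit Euler would diverge, while the slow transient is still tracked accurately.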

  10. ELT-scale Adaptive Optics real-time control with the Intel Xeon Phi Many Integrated Core Architecture

    NASA Astrophysics Data System (ADS)

    Jenkins, David R.; Basden, Alastair; Myers, Richard M.

    2018-05-01

    We propose a solution to the increased computational demands of Extremely Large Telescope (ELT) scale adaptive optics (AO) real-time control with the Intel Xeon Phi Knights Landing (KNL) Many Integrated Core (MIC) Architecture. The computational demands of an AO real-time controller (RTC) scale with the fourth power of telescope diameter and so the next generation ELTs require orders of magnitude more processing power for the RTC pipeline than existing systems. The Xeon Phi contains a large number (≥64) of low power x86 CPU cores and high bandwidth memory integrated into a single socketed server CPU package. The increased parallelism and memory bandwidth are crucial to providing the performance for reconstructing wavefronts with the required precision for ELT scale AO. Here, we demonstrate that the Xeon Phi KNL is capable of performing ELT scale single conjugate AO real-time control computation at over 1.0kHz with less than 20μs RMS jitter. We have also shown that with a wavefront sensor camera attached the KNL can process the real-time control loop at up to 966Hz, the maximum frame-rate of the camera, with jitter remaining below 20μs RMS. Future studies will involve exploring the use of a cluster of Xeon Phis for the real-time control of the MCAO and MOAO regimes of AO. We find that the Xeon Phi is highly suitable for ELT AO real time control.

  11. Development of Sensors for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro

    2005-01-01

    Advances in technology have led to the availability of smaller and more accurate sensors. Computer power to process large amounts of data is no longer the prevailing issue; thus multiple and redundant sensors can be used to obtain more accurate and comprehensive measurements in a space vehicle. The successful integration and commercialization of micro- and nanotechnology for aerospace applications require that a close and interactive relationship be developed between the technology provider and the end user early in the project. Close coordination between the developers and the end users is critical since qualification for flight is time-consuming and expensive. The successful integration of micro- and nanotechnology into space vehicles requires a coordinated effort throughout the design, development, installation, and integration processes.

  12. Rapid Calculation of Spacecraft Trajectories Using Efficient Taylor Series Integration

    NASA Technical Reports Server (NTRS)

    Scott, James R.; Martini, Michael C.

    2011-01-01

    A variable-order, variable-step Taylor series integration algorithm was implemented in NASA Glenn's SNAP (Spacecraft N-body Analysis Program) code. SNAP is a high-fidelity trajectory propagation program that can propagate the trajectory of a spacecraft about virtually any body in the solar system. The Taylor series algorithm's very high order accuracy and excellent stability properties lead to large reductions in computer time relative to the code's existing 8th order Runge-Kutta scheme. Head-to-head comparison on near-Earth, lunar, Mars, and Europa missions showed that Taylor series integration is 15.8 times faster than Runge-Kutta on average, and is more accurate. These speedups were obtained for calculations involving central body, other body, thrust, and drag forces. Similar speedups have been obtained for calculations that include the J2 spherical harmonic for central body gravitation. The algorithm includes a step size selection method that directly calculates the step size and never requires a repeat step. High-order Taylor series integration algorithms have been shown to provide major reductions in computer time over conventional integration methods in numerous scientific applications. The objective here was to directly implement Taylor series integration in an existing trajectory analysis code and demonstrate that large reductions in computer time (order of magnitude) could be achieved while simultaneously maintaining high accuracy. This software greatly accelerates the calculation of spacecraft trajectories. At each time level, the spacecraft position, velocity, and mass are expanded in a high-order Taylor series whose coefficients are obtained through efficient differentiation arithmetic. This makes it possible to take very large time steps at minimal cost, resulting in large savings in computer time.
The Taylor series algorithm is implemented primarily through three subroutines: (1) a driver routine that automatically introduces auxiliary variables and sets up initial conditions and integrates; (2) a routine that calculates system reduced derivatives using recurrence relations for quotients and products; and (3) a routine that determines the step size and sums the series. The order of accuracy used in a trajectory calculation is arbitrary and can be set by the user. The algorithm directly calculates the motion of other planetary bodies and does not require ephemeris files (except to start the calculation). The code also runs with Taylor series and Runge-Kutta used interchangeably for different phases of a mission.
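
The recurrence-based generation of Taylor coefficients described above can be sketched for the scalar model problem dy/dt = y^2, whose product nonlinearity yields the convolution recurrence (k+1)*c[k+1] = sum_j c[j]*c[k-j]. The order and step size are illustrative assumptions, and this is not the SNAP implementation:

```python
def taylor_step(y0, h, order=15):
    """Advance dy/dt = y**2 one step of size h using Taylor
    coefficients built from the product recurrence
    (k+1)*c[k+1] = sum_j c[j]*c[k-j]."""
    c = [y0]
    for k in range(order):
        conv = sum(c[j] * c[k - j] for j in range(k + 1))
        c.append(conv / (k + 1))
    # Horner summation of the truncated series at h
    y = 0.0
    for ck in reversed(c):
        y = y * h + ck
    return y

def integrate(y0, t_end, h):
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        y = taylor_step(y, h)
        t += h
    return y

# Exact solution is y(t) = 1/(2 - t), so y(1) = 1.
y = integrate(0.5, 1.0, 0.25)
```

With a 15th-order series, four large steps already reproduce the exact solution to near machine precision, which is the mechanism behind the large step sizes cited in the abstract.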

  13. Risk/Requirements Trade-off Guidelines for Low Cost Satellite Systems

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Man, Kin F.

    1996-01-01

    The accelerating trend toward faster, better, cheaper missions places increasing emphasis on the trade-offs between requirements and risk to reduce cost and development times, while still improving quality and reliability. The Risk/Requirement Trade-off Guidelines discussed in this paper are part of an integrated approach to address the main issues by focusing on the sum of prevention, analysis, control, or test (PACT) processes.

  14. A portable integrated system to control an active needle

    NASA Astrophysics Data System (ADS)

    Konh, Bardia; Motalleb, Mahdi; Ashrafiuon, Hashem

    2017-04-01

    The primary objective of this work is to introduce an integrated portable system to operate a flexible active surgical needle with actuation capabilities. The smart needle uses the robust actuation capabilities of shape memory alloy (SMA) wires to drastically improve the accuracy of medical procedures such as brachytherapy. This, however, requires an integrated system designed to control the insertion of the needle via a linear motor and its deflection by the SMA wire in real time. The integrated system includes a flexible needle prototype, a Raspberry Pi computer, a linear stage motor, an SMA wire actuator, a power supply, an electromagnetic tracking system, and various communication supplies. The linear stage motor guides the needle into tissue. The power supply provides appropriate current to the SMA actuator. The tracking system measures tip movement for feedback. The Raspberry Pi is the central controller that receives the tip movement feedback and controls the linear stage motor and the SMA actuator via the power supply. The implemented algorithms required for communication and feedback control are also described. This paper demonstrates that the portable integrated system may be a viable solution for more effective procedures requiring surgical needles.

  15. 45 CFR 225.2 - State plan requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: (1) Such methods of recruitment and selection as will offer opportunity for full-time or part-time... personnel of which subprofessional staff are an integral part; (3) A career service plan permitting persons... provide for: (1) A position in which rests responsibility for the development, organization, and...

  16. Dynamic implicit 3D adaptive mesh refinement for non-equilibrium radiation diffusion

    NASA Astrophysics Data System (ADS)

    Philip, B.; Wang, Z.; Berrill, M. A.; Birke, M.; Pernice, M.

    2014-04-01

    The time dependent non-equilibrium radiation diffusion equations are important for solving the transport of energy through radiation in optically thick regimes and find applications in several fields including astrophysics and inertial confinement fusion. The associated initial boundary value problems that are encountered often exhibit a wide range of scales in space and time and are extremely challenging to solve. To efficiently and accurately simulate these systems we describe our research on combining techniques that will also find use more broadly for long term time integration of nonlinear multi-physics systems: implicit time integration for efficient long term time integration of stiff multi-physics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian Free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level independent solver convergence.

  17. A parallel time integrator for noisy nonlinear oscillatory systems

    NASA Astrophysics Data System (ADS)

    Subber, Waad; Sarkar, Abhijit

    2018-06-01

    In this paper, we adapt a parallel time integration scheme to track the trajectories of noisy nonlinear dynamical systems. Specifically, we formulate a parallel algorithm to generate the sample paths of a nonlinear oscillator defined by stochastic differential equations (SDEs) using the so-called parareal method for ordinary differential equations (ODEs). The presence of the Wiener process in SDEs causes difficulties in the direct application of any numerical integration technique for ODEs, including the parareal algorithm. The parallel implementation of the algorithm involves two SDE solvers, namely a fine-level scheme to integrate the system in parallel and a coarse-level scheme to generate and correct the required initial conditions to start the fine-level integrators. For the numerical illustration, a randomly excited Duffing oscillator is investigated in order to study the performance of the stochastic parallel algorithm with respect to a range of system parameters. The distributed implementation of the algorithm exploits the Message Passing Interface (MPI).
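
A minimal deterministic sketch of the coarse/fine parareal correction described above, applied to dy/dt = -y for clarity; in the stochastic setting the fine solver would additionally carry the Wiener increments of the SDE, and the window solves would run in parallel via MPI:

```python
def coarse(u, dt):
    # one explicit Euler step: the cheap propagator G
    return u + dt * (-u)

def fine(u, dt, substeps=20):
    # many small Euler steps: the accurate propagator F
    h = dt / substeps
    for _ in range(substeps):
        u = u + h * (-u)
    return u

def parareal(u0, t_end, n_windows, n_iters):
    dt = t_end / n_windows
    # serial coarse sweep gives initial guesses at window boundaries
    U = [u0]
    for _ in range(n_windows):
        U.append(coarse(U[-1], dt))
    for _ in range(n_iters):
        # fine solves over each window (parallel in a real implementation)
        F = [fine(U[n], dt) for n in range(n_windows)]
        G_old = [coarse(U[n], dt) for n in range(n_windows)]
        U_new = [u0]
        for n in range(n_windows):
            # parareal correction: G(new) + F(old) - G(old)
            U_new.append(coarse(U_new[-1], dt) + F[n] - G_old[n])
        U = U_new
    return U[-1]

u = parareal(1.0, 1.0, n_windows=10, n_iters=3)  # exact value is e**-1
```

After a few iterations the corrected solution converges to the serial fine solution, so the fine work dominates and can be distributed across the time windows.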

  18. Learning from Transitioning to New Technology That Supports Online and Blended Learning: A Case Study

    ERIC Educational Resources Information Center

    Lock, Jennifer; Johnson, Carol

    2017-01-01

    Transitioning from one technology to another within educational institutions is complex and multi-faceted, and requires time. Such a transition involves more than making the new technology available for use. It requires knowing the people involved, designing differentiated support structures, and integrating various resources to meet their…

  19. Integrating Aggregate Exposure Pathway (AEP) and Adverse Outcome Pathway (AOP) Frameworks to Estimate Exposure-relevant Responses

    EPA Science Inventory

    High throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have no data due to the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vi...

  20. Avionics System Architecture for the NASA Orion Vehicle

    NASA Technical Reports Server (NTRS)

    Baggerman, Clint; McCabe, Mary; Verma, Dinesh

    2009-01-01

    It has been 30 years since the National Aeronautics and Space Administration (NASA) last developed a crewed spacecraft capable of launch, on-orbit operations, and landing. During that time, aerospace avionics technologies have greatly advanced in capability, and these technologies have enabled integrated avionics architectures for aerospace applications. The inception of NASA's Orion Crew Exploration Vehicle (CEV) spacecraft offers the opportunity to leverage the latest integrated avionics technologies into crewed space vehicle architecture. The outstanding question is to what extent to implement these advances in avionics while still meeting the unique crewed spaceflight requirements for safety, reliability and maintainability. Historically, aircraft and spacecraft have very similar avionics requirements. Both aircraft and spacecraft must have high reliability. They also must have as much computing power as possible and provide low latency between user control and effector response while minimizing weight, volume, and power. However, there are several key differences between aircraft and spacecraft avionics. Typically, the overall spacecraft operational time is much shorter than aircraft operation time, but the typical mission time (and hence, the time between preventive maintenance) is longer for a spacecraft than an aircraft. Also, the radiation environment is typically more severe for spacecraft than aircraft. A "loss of mission" scenario (i.e. - the mission is not a success, but there are no casualties) arguably has a greater impact on a multi-million dollar spaceflight mission than a typical commercial flight. Such differences need to be weighed when determining if an aircraft-like integrated modular avionics (IMA) system is suitable for a crewed spacecraft.
This paper will explore the preliminary design process of the Orion vehicle avionics system by first identifying the Orion driving requirements and the difference between Orion requirements and those of other previous crewed spacecraft avionics systems. Common systems engineering methods will be used to evaluate the value propositions, or the factors that weight most heavily in design consideration, of Orion and other aerospace systems. Then, the current Orion avionics architecture will be presented and evaluated.

  1. High-quality weather data for grid integration studies

    NASA Astrophysics Data System (ADS)

    Draxl, C.

    2016-12-01

    As variable renewable power penetration levels increase in power systems worldwide, renewable integration studies are crucial to ensure continued economic and reliable operation of the power grid. In this talk we will shed light on requirements for grid integration studies as far as wind and solar energy are concerned. Because wind and solar plants are strongly impacted by weather, high-resolution and high-quality weather data are required to drive power system simulations. Future data sets will have to push the limits of numerical weather prediction to yield these high-resolution data sets, and wind data will have to be time-synchronized with solar data. Current wind and solar integration data sets will be presented. The Wind Integration National Dataset (WIND) Toolkit is the largest and most complete grid integration data set publicly available to date. A meteorological data set, wind power production time series, and simulated forecasts created using the Weather Research and Forecasting Model run on a 2-km grid over the continental United States at a 5-min resolution are now publicly available for more than 126,000 land-based and offshore wind power production sites. The Solar Integration National Dataset (SIND) is time-synchronized with the WIND Toolkit and will allow for combined wind-solar grid integration studies. The National Solar Radiation Database (NSRDB) is a similar high temporal- and spatial-resolution database of 18 years of solar resource data for North America and India. Various countries also carry out grid integration studies, aiming to increase their wind and solar penetration through combined wind and solar integration data sets. We will present a multi-year effort to directly support India's 24x7 energy access goal through a suite of activities aimed at enabling large-scale deployment of clean energy and energy efficiency.
Another current effort is the North-American-Renewable-Integration-Study, with the aim of providing a seamless data set across borders for a whole continent, to simulate and analyze the impacts of potential future large wind and solar power penetrations on bulk power system operations.

  2. Sink or Swim: Learning by Doing in a Supply Chain Integration Activity*

    ERIC Educational Resources Information Center

    Harnowo, Akhadian S.; Calhoun, Mikelle A.; Monteiro, Heather

    2016-01-01

    Studies show that supply chain integration (SCI) is important to organizations. This article describes an activity that places students in the middle of an SCI scenario. The highly interactive hands-on simulation requires only 50 to 60 minutes of classroom time, may be used with 18 to about 36 students, and involves minimal instructor preparation.…

  3. OSTA data systems planning workshop recommendations

    NASA Technical Reports Server (NTRS)

    Desjardins, R.

    1981-01-01

    The Integrated Discipline Requirements are presented, including the following needs: (1) quality data sets, (2) a systematic treatment of problems with present data, (3) a single integrated catalog or master directory, (4) continuity of data formats, (5) a standard geographic and time basis, (6) data delivery in terms of easy rather than immediate accessibility, (7) data archives, and (8) cooperation with user agencies.

  4. Self-consistent predictor/corrector algorithms for stable and efficient integration of the time-dependent Kohn-Sham equation

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Herbert, John M.

    2018-01-01

    The "real time" formulation of time-dependent density functional theory (TDDFT) involves integration of the time-dependent Kohn-Sham (TDKS) equation in order to describe the time evolution of the electron density following a perturbation. This approach, which is complementary to the more traditional linear-response formulation of TDDFT, is more efficient for computation of broad-band spectra (including core-excited states) and for systems where the density of states is large. Integration of the TDKS equation is complicated by the time-dependent nature of the effective Hamiltonian, and we introduce several predictor/corrector algorithms to propagate the density matrix, one of which can be viewed as a self-consistent extension of the widely used modified-midpoint algorithm. The predictor/corrector algorithms facilitate larger time steps and are shown to be more efficient despite requiring more than one Fock build per time step, and furthermore can be used to detect a divergent simulation on the fly, which can then be halted or else the time step modified.
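
The self-consistent midpoint idea can be sketched on a toy state-dependent Hamiltonian: predict the midpoint Hamiltonian, propagate with a unitary exponential, and correct until the midpoint is self-consistent. The 2x2 model matrix, time step, and thresholds are assumptions standing in for the Kohn-Sham Fock build, not the authors' implementation:

```python
import numpy as np

def expm_hermitian(H, dt):
    """exp(-i*H*dt) for a Hermitian matrix via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(-1j * w * dt)) @ V.conj().T

def hamiltonian(C):
    # hypothetical density-dependent model Hamiltonian
    n0 = abs(C[0]) ** 2
    return np.array([[1.0 + 0.5 * n0, 0.2],
                     [0.2, -1.0]], dtype=complex)

def scm_step(C, dt, tol=1e-10, max_iter=50):
    """Self-consistent midpoint step: predict the midpoint Hamiltonian
    from the current state, propagate, then correct the midpoint until
    it stops changing."""
    H_mid = hamiltonian(C)                       # predictor
    for _ in range(max_iter):
        C_new = expm_hermitian(H_mid, dt) @ C
        H_next = hamiltonian(0.5 * (C + C_new))  # corrector midpoint
        if np.linalg.norm(H_next - H_mid) < tol:
            break
        H_mid = H_next
    return C_new

C = np.array([1.0 + 0j, 0.0 + 0j])
for _ in range(100):
    C = scm_step(C, dt=0.05)
```

Because each step applies a unitary propagator, the norm of the state is conserved regardless of step size, mirroring the density-matrix idempotency preserved by the exponential midpoint schemes discussed in the abstract.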

  5. A comparative study of Rosenbrock-type and implicit Runge-Kutta time integration for discontinuous Galerkin method for unsteady 3D compressible Navier-Stokes equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiaodong; Xia, Yidong; Luo, Hong

    A comparative study of two classes of third-order implicit time integration schemes is presented for a third-order hierarchical WENO reconstructed discontinuous Galerkin (rDG) method to solve the 3D unsteady compressible Navier-Stokes equations: 1) the explicit first stage, single diagonally implicit Runge-Kutta (ESDIRK3) scheme, and 2) the Rosenbrock-Wanner (ROW) schemes based on the differential algebraic equations (DAEs) of index-2. Compared with the ESDIRK3 scheme, a remarkable feature of the ROW schemes is that they only require one approximate Jacobian matrix calculation every time step, thus considerably reducing the overall computational cost. A variety of test cases, ranging from inviscid flows to DNS of turbulent flows, are presented to assess the performance of these schemes. Here, numerical experiments demonstrate that the third-order ROW scheme for the DAEs of index-2 can not only achieve the designed formal order of temporal convergence accuracy in a benchmark test, but also require significantly less computing time than its ESDIRK3 counterpart to converge to the same level of discretization errors in all of the flow simulations in this study, indicating that the ROW methods provide an attractive alternative for the higher-order time-accurate integration of the unsteady compressible Navier-Stokes equations.

  6. A comparative study of Rosenbrock-type and implicit Runge-Kutta time integration for discontinuous Galerkin method for unsteady 3D compressible Navier-Stokes equations

    DOE PAGES

    Liu, Xiaodong; Xia, Yidong; Luo, Hong; ...

    2016-10-05

    A comparative study of two classes of third-order implicit time integration schemes is presented for a third-order hierarchical WENO reconstructed discontinuous Galerkin (rDG) method to solve the 3D unsteady compressible Navier-Stokes equations: 1) the explicit first stage, single diagonally implicit Runge-Kutta (ESDIRK3) scheme, and 2) the Rosenbrock-Wanner (ROW) schemes based on the differential algebraic equations (DAEs) of index-2. Compared with the ESDIRK3 scheme, a remarkable feature of the ROW schemes is that they only require one approximate Jacobian matrix calculation every time step, thus considerably reducing the overall computational cost. A variety of test cases, ranging from inviscid flows to DNS of turbulent flows, are presented to assess the performance of these schemes. Here, numerical experiments demonstrate that the third-order ROW scheme for the DAEs of index-2 can not only achieve the designed formal order of temporal convergence accuracy in a benchmark test, but also require significantly less computing time than its ESDIRK3 counterpart to converge to the same level of discretization errors in all of the flow simulations in this study, indicating that the ROW methods provide an attractive alternative for the higher-order time-accurate integration of the unsteady compressible Navier-Stokes equations.

  7. Design and implementation of the ATLAS TRT front end electronics

    NASA Astrophysics Data System (ADS)

    Newcomer, Mitch; Atlas TRT Collaboration

    2006-07-01

    The ATLAS TRT subsystem is comprised of 380,000 4 mm straw tube sensors ranging in length from 30 to 80 cm. Polypropylene plastic layers between straws and a xenon-based gas mixture in the straws allow the straws to be used for both tracking and transition radiation detection. Detector-mounted electronics with data sparsification were chosen to minimize the cable plant inside the superconducting solenoid of the ATLAS inner tracker. The "on detector" environment required a small footprint, low noise, low power and radiation-tolerant readout capable of triggering at rates up to 20 MHz with an analog signal dynamic range of >300 times the discriminator setting. For tracking, a position resolution better than 150 μm requires leading edge trigger timing with ˜1 ns precision and for transition radiation detection, a charge collection time long enough to integrate the direct and reflected signal from the unterminated straw tube is needed for position-independent energy measurement. These goals have been achieved employing two custom application-specific integrated circuits (ASICs) and board design techniques that successfully separate analog and digital functionality while providing an integral part of the straw tube shielding.

  8. An integrated decision support system for diagnosing and managing patients with community-acquired pneumonia.

    PubMed Central

    Aronsky, D.; Haug, P. J.

    1999-01-01

    Decision support systems that integrate guidelines have become popular applications to reduce variation and deliver cost-effective care. However, adverse characteristics of decision support systems, such as additional and time-consuming data entry or manually identifying eligible patients, result in a "behavioral bottleneck" that prevents decision support systems from becoming part of the clinical routine. This paper describes the design and the implementation of an integrated decision support system that explores a novel approach for bypassing the behavioral bottleneck. The real-time decision support system does not require health care providers to enter additional data and consists of a diagnostic and a management component. PMID:10566348

  9. Elevation-relief ratio, hypsometric integral, and geomorphic area-altitude analysis.

    NASA Technical Reports Server (NTRS)

    Pike, R. J.; Wilson, S. E.

    1971-01-01

    Mathematical proof establishes identity of hypsometric integral and elevation-relief ratio, two quantitative topographic descriptors developed independently of one another for entirely different purposes. Operationally, values of both measures are in excellent agreement for arbitrarily bounded topographic samples, as well as for low-order fluvial watersheds. By using a point-sampling technique rather than planimetry, elevation-relief ratio (defined as mean elevation minus minimum elevation divided by relief) is calculated manually in about a third of the time required for the hypsometric integral.
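
The elevation-relief ratio defined above is straightforward to compute from point-sampled elevations; the sample grid values below are illustrative:

```python
import numpy as np

def elevation_relief_ratio(elevations):
    """E = (mean - min) / (max - min), which Pike & Wilson show is
    identical to the hypsometric integral."""
    z = np.asarray(elevations, dtype=float)
    return (z.mean() - z.min()) / (z.max() - z.min())

# Point-sampled grid of elevations in metres, e.g. read off a map
# or a DEM (hypothetical values)
dem = np.array([[120.0, 150.0, 200.0],
                [180.0, 260.0, 310.0],
                [240.0, 330.0, 420.0]])
E = elevation_relief_ratio(dem)
```

Only three summary statistics of the point sample are needed, which is why the manual calculation is so much faster than planimetering a hypsometric curve.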

  10. Space Station - An integrated approach to operational logistics support

    NASA Technical Reports Server (NTRS)

    Hosmer, G. J.

    1986-01-01

    Development of an efficient and cost effective operational logistics system for the Space Station will require logistics planning early in the program's design and development phase. This paper will focus on Integrated Logistics Support (ILS) Program techniques and their application to the Space Station program design, production and deployment phases to assure the development of an effective and cost efficient operational logistics system. The paper will provide the methodology and time-phased programmatic steps required to establish a Space Station ILS Program that will provide an operational logistics system based on planned Space Station program logistics support.

  11. Integrating perioperative information from divergent sources.

    PubMed

    Frost, Elizabeth A M

    2012-01-01

    The enormous diversity of physician practices, including specialists, and of patient requirements and comorbidities makes integration of appropriate perioperative information difficult. Lack of communicating computer systems adds to the difficulty of assembling data. Meta-analyses and evidence-based studies indicate that far too many tests are performed perioperatively. Guidelines for appropriate perioperative management have been formulated by several specialties. Current findings and requirements should be better communicated to surgeons, consultants, and patients to improve healthcare while decreasing costs. Means of improving communication through interpersonal collaboration are outlined. © 2012 Mount Sinai School of Medicine.

  12. Data base architecture for instrument characteristics critical to spacecraft conceptual design

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Allen, Cheryl L.

    1990-01-01

    Spacecraft designs are driven by the payloads and mission requirements that they support. Many of the payload characteristics, such as mass, power requirements, communication requirements, moving parts, and so forth directly affect the choices for the spacecraft structural configuration and its subsystem design and component selection. The conceptual design process, which translates mission requirements into early spacecraft concepts, must be tolerant of frequent changes in the payload complement and resource requirements. A computer data base was designed and implemented for the purposes of containing the payload characteristics pertinent for spacecraft conceptual design, tracking the evolution of these payloads over time, and enabling the integration of the payload data with engineering analysis programs for improving the efficiency in producing spacecraft designs. In-house tools were used for constructing the data base and for performing the actual integration with an existing program for optimizing payload mass locations on the spacecraft.

  13. Earth orbital experiment program and requirements study, volume 1, sections 1 - 6

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A reference manual for planners of manned earth-orbital research activity is presented. The manual serves as a systems approach to experiment and mission planning based on an integrated consideration of candidate research programs and the appropriate vehicle, mission, and technology development requirements. Long range goals and objectives for NASA activities during the 1970 to 1980 time period are analyzed. The useful and proper roles of manned and automated spacecraft for implementing NASA experiments are described. An integrated consideration of NASA long range goals and objectives, the system and mission requirements, and the alternative implementation plans are developed. Specific areas of investigation are: (1) manned space flight requirements, (2) space biology, (3) spaceborne astronomy, (4) space communications and navigation, (5) earth observation, (6) supporting technology development requirements, (7) data management system matrices, (8) instrumentation matrices, and (9) biotechnology laboratory experiments.

  14. Variational symplectic algorithm for guiding center dynamics in the inner magnetosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Jinxing; Pu Zuyin; Xie Lun

    Charged particle dynamics in the magnetosphere is multiscale in both time and space; therefore, numerical accuracy over a long integration time is required. A variational symplectic integrator (VSI) [H. Qin and X. Guan, Phys. Rev. Lett. 100, 035006 (2008) and H. Qin, X. Guan, and W. M. Tang, Phys. Plasmas 16, 042510 (2009)] for the guiding-center motion of charged particles in a general magnetic field is applied to study the dynamics of charged particles in the magnetosphere. Instead of discretizing the differential equations of the guiding-center motion, the action of the guiding-center motion is discretized and minimized to obtain the iteration rules for advancing the dynamics. The VSI conserves exactly a discrete Lagrangian symplectic structure and has better numerical properties over a long integration time, compared with standard integrators such as the standard and adaptive fourth-order Runge-Kutta (RK4) methods. Applying the VSI method to guiding-center dynamics in the inner magnetosphere, we can accurately calculate the particles' orbits for an arbitrarily long simulation time with good conservation properties. When a time-independent convection and corotation electric field is considered, the VSI method gives the accurate single-particle orbit, while the RK4 method gives an incorrect orbit due to its intrinsic error accumulation over a long integration time.
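
    The long-time behavior the abstract describes can be illustrated with a minimal example: a symplectic method keeps the error in a conserved quantity bounded over arbitrarily long runs, while a non-symplectic method accumulates it. The sketch below uses a leapfrog integrator on a harmonic oscillator (not the paper's variational guiding-center scheme) and contrasts it with explicit Euler; the paper's baseline is RK4, whose drift is far slower, so Euler is used here only to make the effect visible in a few lines.

```python
import math

def dVdq(q):
    # Harmonic potential V(q) = q^2/2, so dV/dq = q.
    return q

def leapfrog(q, p, dt, steps):
    # Symplectic Stormer-Verlet (leapfrog): kick-drift-kick.
    for _ in range(steps):
        p -= 0.5 * dt * dVdq(q)
        q += dt * p
        p -= 0.5 * dt * dVdq(q)
    return q, p

def explicit_euler(q, p, dt, steps):
    # Non-symplectic baseline: its energy error grows without bound.
    for _ in range(steps):
        q, p = q + dt * p, p - dt * dVdq(q)
    return q, p

def energy(q, p):
    return 0.5 * (p * p + q * q)

E0 = energy(1.0, 0.0)
qs, ps = leapfrog(1.0, 0.0, 0.1, 10_000)
qe, pe = explicit_euler(1.0, 0.0, 0.1, 10_000)
# Leapfrog's energy error stays bounded (of order dt^2 for all time);
# explicit Euler's energy blows up over the same interval.
print(abs(energy(qs, ps) - E0), abs(energy(qe, pe) - E0))
```

The same qualitative comparison holds for RK4, except that its (non-symplectic) error accumulates slowly enough that very long integration times are needed before the orbit visibly degrades, which is exactly the regime the abstract addresses.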

  15. Reliable Viscosity Calculation from Equilibrium Molecular Dynamics Simulations: A Time Decomposition Method.

    PubMed

    Zhang, Yong; Otani, Akihito; Maginn, Edward J

    2015-08-11

    Equilibrium molecular dynamics is often used in conjunction with a Green-Kubo integral of the pressure tensor autocorrelation function to compute the shear viscosity of fluids. This approach is computationally expensive and is subject to a large amount of variability because the plateau region of the Green-Kubo integral is difficult to identify unambiguously. Here, we propose a time decomposition approach for computing the shear viscosity using the Green-Kubo formalism. Instead of one long trajectory, multiple independent trajectories are run and the Green-Kubo relation is applied to each trajectory. The averaged running integral as a function of time is fit to a double-exponential function with a weighting function derived from the standard deviation of the running integrals. Such a weighting function minimizes the uncertainty of the estimated shear viscosity and provides an objective means of estimating the viscosity. While the formal Green-Kubo integral requires an integration to infinite time, we suggest an integration cutoff time tcut, which can be determined by the relative values of the running integral and the corresponding standard deviation. This approach for computing the shear viscosity can be easily automated and used in computational screening studies where human judgment and intervention in the data analysis are impractical. The method has been applied to the calculation of the shear viscosity of a relatively low-viscosity liquid (ethanol) and a relatively high-viscosity ionic liquid, 1-n-butyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([BMIM][Tf2N]), over a range of temperatures. These test cases show that the method is robust and yields reproducible and reliable shear viscosity values.
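
    The time-decomposition idea can be sketched with synthetic data: run several independent realizations, accumulate the running Green-Kubo integral for each, and truncate where the spread across realizations overwhelms the mean. The snippet below substitutes a noisy decaying exponential for the true pressure-tensor autocorrelation function and a simple relative-uncertainty cutoff for the paper's weighted double-exponential fit, so it is an illustration of the principle, not the published procedure.

```python
import math
import random

random.seed(0)
T, dt = 10.0, 0.01
n = int(T / dt)

def running_integral():
    # Synthetic stand-in for one trajectory's autocorrelation function:
    # exp(-t) plus Gaussian noise, integrated with the trapezoid rule.
    acf = [math.exp(-i * dt) + 0.05 * random.gauss(0, 1) for i in range(n + 1)]
    out, acc = [], 0.0
    for i in range(1, n + 1):
        acc += 0.5 * dt * (acf[i] + acf[i - 1])
        out.append(acc)
    return out

trajs = [running_integral() for _ in range(20)]

# Average the running integrals and track their spread across trajectories.
mean_int = [sum(tr[i] for tr in trajs) / len(trajs) for i in range(n)]
var_int = [sum((tr[i] - mean_int[i]) ** 2 for tr in trajs) / (len(trajs) - 1)
           for i in range(n)]

# Truncate the formally infinite integral at the first time where the
# standard deviation across trajectories exceeds 40% of the mean (a
# hypothetical tolerance); if that never happens, use the full interval.
i_cut = next((i for i in range(n)
              if math.sqrt(var_int[i]) > 0.4 * abs(mean_int[i])), n - 1)
estimate = mean_int[i_cut]
# For this synthetic ACF the exact Green-Kubo integral is 1 - exp(-T) ~ 1.
print(estimate)
```

In the real method the averaged running integral is fit to a double exponential with an uncertainty-derived weight; the cutoff heuristic above merely shows why the trajectory-to-trajectory standard deviation is the natural control variable.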

  16. Temporal resolution for the perception of features and conjunctions.

    PubMed

    Bodelón, Clara; Fallah, Mazyar; Reynolds, John H

    2007-01-24

    The visual system decomposes stimuli into their constituent features, represented by neurons with different feature selectivities. How the signals carried by these feature-selective neurons are integrated into coherent object representations is unknown. To constrain the set of possible integrative mechanisms, we quantified the temporal resolution of perception for color, orientation, and conjunctions of these two features. We find that temporal resolution is measurably higher for each feature than for their conjunction, indicating that time is required to integrate features into a perceptual whole. This finding places temporal limits on the mechanisms that could mediate this form of perceptual integration.

  17. Optical integrator for optical dark-soliton detection and pulse shaping.

    PubMed

    Ngo, Nam Quoc

    2006-09-10

    The design and analysis of an Nth-order optical integrator using the digital filter technique are presented. The optical integrator is synthesized using planar-waveguide technology. It is shown that a first-order optical integrator can be used as an optical dark-soliton detector by converting an optical dark-soliton pulse into an optical bell-shaped pulse for ease of detection. The optical integrators can generate an optical step function, staircase function, and parabolic-like functions from input optical Gaussian pulses. The optical integrators may be potentially used as basic building blocks of all-optical signal processing systems because the time integrals of signals may sometimes be required for further use or analysis. Furthermore, an optical integrator may be used for the shaping of optical pulses or in an optical feedback control system.
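
    A discrete-time analogue of the first-order integrator is easy to sketch with the digital-filter formalism the abstract invokes: the bilinear (trapezoidal) transform of 1/s gives a first-order IIR accumulator, and feeding it a Gaussian pulse produces the step-like output described. This is an illustrative filter model, not the planar-waveguide design itself.

```python
import math

def trapezoidal_integrator(x, dt):
    # First-order discrete-time integrator, H(z) = (dt/2)(1 + z^-1)/(1 - z^-1),
    # i.e. the bilinear transform of the ideal integrator 1/s.
    y = [0.0] * len(x)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + 0.5 * dt * (x[i] + x[i - 1])
    return y

dt = 0.01
t = [i * dt for i in range(1000)]
sigma = 0.25
gauss = [math.exp(-((ti - 5.0) ** 2) / (2 * sigma ** 2)) for ti in t]

step = trapezoidal_integrator(gauss, dt)
# Integrating a Gaussian pulse yields a smooth step whose final height
# equals the pulse area, sqrt(2*pi)*sigma, here about 0.6267.
print(step[-1])
```

Cascading two such stages on a train of Gaussian pulses would produce the staircase and parabolic-like outputs mentioned in the abstract.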

  18. The general 2-D moments via integral transform method for acoustic radiation and scattering

    NASA Astrophysics Data System (ADS)

    Smith, Jerry R.; Mirotznik, Mark S.

    2004-05-01

    The moments via integral transform method (MITM) is a technique to analytically reduce the 2-D method of moments (MoM) impedance double integrals into single integrals. By using a special integral representation of the Green's function, the impedance integral can be analytically simplified to a single integral in terms of transformed shape and weight functions. The reduced expression requires fewer computations and reduces the fill times of the MoM impedance matrix. Furthermore, the resulting integral is analytic for nearly arbitrary shape and weight function sets. The MITM technique is developed for mixed boundary conditions and predictions with basic shape and weight function sets are presented. Comparisons of accuracy and speed between MITM and brute force are presented. [Work sponsored by ONR and NSWCCD ILIR Board.]

  19. Integrating an MR head into a peak detection channel

    NASA Astrophysics Data System (ADS)

    Curland, Nathan; Machelski, Russell J.

    1994-03-01

    Integrating a magnetoresistive (MR) head into a peak detection channel requires the engineer to deal with basic differences between MR and thin film heads. These differences result from nonlinear sensor response, separate write and read elements, and having an active element at the air bearing surface (ABS). A simple model for flux superposition can adequately address nonlinear effects and be used for equalization design. Timing budgets can be developed which demonstrate the dominance of media noise for present day systems. Single threshold qualification can handle most current system requirements. Separate read/write elements mean that more attention needs to be paid to offtrack equalization design and head dimensional tolerancing. An active element at the ABS requires better control of the head-disc potential and leakage currents.

  20. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving.
It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  1. Simplifying operations with an uplink/downlink integration toolkit

    NASA Astrophysics Data System (ADS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-11-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving.
It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  2. AUTOPLAN: A PC-based automated mission planning tool

    NASA Technical Reports Server (NTRS)

    Paterra, Frank C.; Allen, Marc S.; Lawrence, George F.

    1987-01-01

    A PC-based automated mission and resource planning tool, AUTOPLAN, is described, with application to small-scale planning and scheduling systems in the Space Station program. The input is a proposed mission profile, including mission duration, number of allowable slip periods, and requirement profiles for one or more resources as a function of time. A corresponding availability profile is also entered for each resource over the whole time interval under study. AUTOPLAN determines all integrated schedules which do not require more than the available resources.
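
    The core feasibility check AUTOPLAN performs, comparing a mission's resource requirement profile against an availability profile over candidate start times, can be sketched as follows. This is a single-resource, unit-interval illustration; the tool itself handles multiple resources and allowable slip periods.

```python
def feasible_starts(requirement, availability):
    """Return all start offsets at which a mission's per-interval resource
    requirement profile fits under the availability profile
    (illustrative sketch of the core scheduling check, single resource)."""
    span = len(requirement)
    return [s for s in range(len(availability) - span + 1)
            if all(requirement[i] <= availability[s + i] for i in range(span))]

# A mission needing 3, 5, 2 units of a resource over three consecutive
# intervals, checked against a seven-interval availability profile:
req = [3, 5, 2]
avail = [4, 6, 3, 2, 6, 6, 5]
print(feasible_starts(req, avail))  # -> [0, 4]
```

Intersecting the feasible-start sets computed for each resource would give the "integrated schedules which do not require more than the available resources" that the abstract describes.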

  3. Conversion-Integration of MSFC Nonlinear Signal Diagnostic Analysis Algorithms for Realtime Execution of MSFC's MPP Prototype System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1996-01-01

    NASA's advanced propulsion system, the Space Shuttle Main Engine/Advanced Technology Development (SSME/ATD), has been undergoing extensive flight certification and developmental testing, which involves large numbers of health monitoring measurements. To enhance engine safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess its dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce the risk of catastrophic system failures, expedite the evaluation of both flight and ground test data, and thereby reduce launch turnaround time. During the development of the SSME, ASRI participated in the research and development of several advanced nonlinear signal diagnostic methods for health monitoring and failure prediction in turbomachinery components. However, due to the intensive computational requirements associated with such advanced analysis tasks, current SSME dynamic data analysis and diagnostic evaluation is performed off-line following flight or ground test, with a typical diagnostic turnaround time of one to two days. The objective of MSFC's MPP Prototype System is to eliminate such 'diagnostic lag time' by achieving signal processing and analysis in real time. Such an on-line diagnostic system can provide sufficient lead time to initiate corrective action and also enable efficient scheduling of inspection, maintenance, and repair activities. The major objective of this project was to convert and implement a number of advanced nonlinear diagnostic DSP algorithms in a format consistent with that required for integration into the Vanderbilt Multigraph Architecture (MGA) Model Based Programming environment. This effort will allow the real-time execution of these algorithms using the MSFC MPP Prototype System. 
ASRI has completed the software conversion and integration of a sequence of nonlinear signal analysis techniques specified in the SOW for real-time execution on MSFC's MPP Prototype. This report documents and summarizes the results of the contract tasks and provides the complete computer source code, including all FORTRAN/C utilities and all other supporting software libraries required for operation.

  4. Real-time realizations of the Bayesian Infrasonic Source Localization Method

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.

    2015-12-01

    The Bayesian Infrasonic Source Localization method (BISL), introduced by Modrak et al. (2010) and upgraded by Marcillo et al. (2014), is intended for accurate estimation of atmospheric event origins at local, regional, and global scales by seismic and infrasonic networks and arrays. BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time, and azimuth estimate, merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions, multiplied by a prior probability density function of celerity, over the multivariate parameter space. The present BISL realization is a generally time-consuming procedure based on numerical integration. The computational scheme proposed here simplifies the target function so that the integrals can be taken exactly and represented via standard functions. This makes the procedure much faster and realizable in real time without practical loss of accuracy. The procedure, implemented as Python/FORTRAN code, demonstrates high performance on a set of model and real data.
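
    The speed-up comes from replacing quadrature with closed-form integrals of Gaussian-like likelihood factors. A toy illustration of the principle (not the BISL target function itself): the integral of a single Gaussian factor has a standard-function answer that a numerical scheme can only approximate, at far greater cost per source-location hypothesis.

```python
import math

def numeric_gaussian_integral(a, m, lo=-50.0, hi=50.0, n=20_000):
    # Midpoint-rule quadrature of exp(-a*(x - m)^2): the kind of numerical
    # integration a closed-form expression makes unnecessary.
    dx = (hi - lo) / n
    return dx * sum(math.exp(-a * (lo + (i + 0.5) * dx - m) ** 2)
                    for i in range(n))

def analytic_gaussian_integral(a):
    # Standard-function answer: the integral over the real line is sqrt(pi/a),
    # independent of the mean m.
    return math.sqrt(math.pi / a)

print(numeric_gaussian_integral(2.0, 1.0), analytic_gaussian_integral(2.0))
```

In BISL the analogous reduction is applied to the product of source-station likelihoods, so the grid search over hypothetical locations evaluates standard functions instead of running a multivariate quadrature at every grid point.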

  5. Spacelab Mission Implementation Cost Assessment (SMICA)

    NASA Technical Reports Server (NTRS)

    Guynes, B. V.

    1984-01-01

    A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center is revised, and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operation at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendations for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.

  6. The calculation of viscosity of liquid n-decane and n-hexadecane by the Green-Kubo method

    NASA Astrophysics Data System (ADS)

    Cui, S. T.; Cummings, P. T.; Cochran, H. D.

    This short commentary presents the results of long molecular dynamics calculations of the shear viscosity of liquid n-decane and n-hexadecane using the Green-Kubo integration method. The relaxation time of the stress-stress correlation function is compared with those of rotation and diffusion. The rotational and diffusional relaxation times, which are easy to calculate, provide useful guides for the required simulation time in viscosity calculations. Also, the computational time required for viscosity calculations of these systems by the Green-Kubo method is compared with the time required for previous non-equilibrium molecular dynamics calculations of the same systems. The method of choice for a particular calculation is determined largely by the properties of interest, since the efficiencies of the two methods are comparable for calculation of the zero strain rate viscosity.

  7. Long-Range Educational Policy Planning and the Demand for Educated Manpower in Times of Uncertainty.

    ERIC Educational Resources Information Center

    Bakke, E. K.

    1984-01-01

    There is no good method of regulating the educational system based on specific, numerical measurements of labor requirements, and it will be important to integrate uncertainty into future forecasts. Adjustments in demand and supply of educated labor in Norway require a decentralized authority structure providing incentives for institutions and the…

  8. Advanced FIREFLY Assessment Generalized Mechanization Requirements Report

    DTIC Science & Technology

    1979-06-01

    Keywords: fire control systems; fire control computers; weapon control. The requirements for... airborne digital computer which can be specialized to perform successfully in a variety of tactical aircraft with differing avionics sensors, fire... Time of Flight Computation Using a Modified (China Lake) Numerical Integration Algorithm.

  9. The implementation of multiple interprofessional integrated modules by health sciences faculty in Chile.

    PubMed

    Castillo-Parra, Silvana; Oyarzo Torres, Sandra; Espinoza Barrios, Mónica; Rojas-Serey, Ana María; Maya, Juan Diego; Sabaj Diez, Valeria; Aliaga Castillo, Verónica; Castillo Niño, Manuel; Romero Romero, Luis; Foster, Jennifer; Hawes Barrios, Gustavo

    2017-11-01

    Multiple interprofessional integrated modules (MIIM) 1 and 2 are two required, cross-curricular courses developed by a team of health professions faculty, as well as experts in education, within the Faculty of Medicine of the University of Chile. MIIM 1 focused on virtual cases requiring team decision-making in real time. MIIM 2 focused on a team-based community project. The evaluation of MIIM included student, teacher, and coordinator perspectives. To explore perceptions of this interprofessional experience, quantitative data in the form of standardised course evaluations regarding teaching methodology, interpersonal relations, and course organisation and logistics were gathered. In addition, qualitative perceptions were collected from student focus groups and meetings with tutors and coordinators. Between 2010 and 2014, 881 students enrolled in MIIM. Their evaluation scores rated interpersonal relations most highly, followed by organisation and logistics, and then teaching methodology. A key result was the learning related to interprofessional teamwork by the teaching coordinators, as well as the participating faculty. The strengths of this experience included student integration and construction of new knowledge, skill development in decision-making, and collective self-learning. Challenges included the additional demands on time management and the tutors' role. This work requires valuing an alternative way of learning, which is critical for the performance of future health professionals.

  10. Integrated environmental planning in the Philippines: A case study of the Palawan Integrated Environmental Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganapin, D.J. Jr.

    1987-01-01

    Integrated environmental planning was analyzed using the case of the Palawan Integrated Environmental Program in the Philippines. The study explores the relationship between development and environmental planning and evaluates the importance of organizational coordination and timing in the integration of these two concerns. Factors affecting the accomplishment of the objectives of integrated environmental planning were also studied. Two planning phases of the Palawan Integrated Environmental Program were observed using the case study approach. Observations of various planning participants (consultants, middle level agency personnel, heads of local agencies) were also considered. The integration of environmental considerations in development planning was found to be beneficial to both environmental and development concerns. The experience showed that such integration requires tight organizational coordination and the proper timing of activities and outputs. The success of the Palawan Integrated Environmental Program was also found to depend on effective communication, the political functionality of the government, the leadership of its executives, the presence of appropriate structures of authority, sufficiency of funds and manpower, and the availability of appropriate environmental planning techniques. Recommendations are provided to further strengthen the integration of environmental considerations in development planning and increase the effectiveness of integrated environmental programs.

  11. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment which allows sub-system performance specifications to be analyzed parametrically, and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and nonlinear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. 
The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly links to science requirements.

  12. Efficient and accurate time-stepping schemes for integrate-and-fire neuronal networks.

    PubMed

    Shelley, M J; Tao, L

    2001-01-01

    To avoid the numerical errors associated with resetting the potential following a spike in simulations of integrate-and-fire neuronal networks, Hansel et al. and Shelley independently developed a modified time-stepping method. Their particular scheme consists of second-order Runge-Kutta time-stepping, a linear interpolant to find spike times, and a recalibration of postspike potential using the spike times. Here we show analytically that such a scheme is second order, discuss the conditions under which efficient, higher-order algorithms can be constructed to treat resets, and develop a modified fourth-order scheme. To support our analysis, we simulate a system of integrate-and-fire conductance-based point neurons with all-to-all coupling. For six-digit accuracy, our modified Runge-Kutta fourth-order scheme needs a time-step of Delta t = 0.5 x 10^(-3) seconds, whereas to achieve comparable accuracy using a recalibrated second-order or a first-order algorithm requires time-steps of 10^(-5) seconds or 10^(-9) seconds, respectively. Furthermore, since the cortico-cortical conductances in standard integrate-and-fire neuronal networks do not depend on the value of the membrane potential, we can attain fourth-order accuracy with computational costs normally associated with second-order schemes.
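
    The scheme's key steps, detecting a threshold crossing, locating the spike time with a linear interpolant, and restarting the integration from the reset potential for the remainder of the step, can be sketched for a single leaky integrate-and-fire neuron under constant drive. This is an illustrative reconstruction of the recalibration idea using second-order Runge-Kutta, not the authors' code.

```python
def lif_step(v, t, dt, I, tau=0.02, v_th=1.0, v_reset=0.0):
    """One modified RK2 step for dv/dt = (-v + I)/tau with threshold reset.
    On a crossing, the spike time comes from a linear interpolant and the
    remainder of the step is re-integrated from the reset potential."""
    def f(u):
        return (-u + I) / tau
    k1 = f(v)
    k2 = f(v + dt * k1)
    v_new = v + 0.5 * dt * (k1 + k2)
    if v_new < v_th:
        return v_new, None
    t_spike = t + dt * (v_th - v) / (v_new - v)   # linear interpolant
    rem = (t + dt) - t_spike                      # time left in the step
    k1r = f(v_reset)
    k2r = f(v_reset + rem * k1r)
    return v_reset + 0.5 * rem * (k1r + k2r), t_spike

# Constant suprathreshold drive; the exact interspike interval for this
# model is tau * ln(I / (I - v_th)) = 0.02 * ln(3), about 0.02197 s.
v, t, dt, spikes = 0.0, 0.0, 1e-4, []
for _ in range(10_000):
    v, ts = lif_step(v, t, dt, I=1.5)
    if ts is not None:
        spikes.append(ts)
    t += dt
```

Without the interpolation-and-recalibration step, the spike time would be pinned to the grid and the scheme would degrade to first-order accuracy regardless of the underlying Runge-Kutta order, which is the error the abstract's method removes.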

  13. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

    This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardous, misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
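
    The consistency check itself is conceptually simple: subtract the radar-altimeter height above ground from the DGPS altitude to synthesize a terrain profile, then compare it against the DEM along the flight track. The sketch below uses a mean-absolute-disparity statistic and a hypothetical 30 m alert threshold; the paper's actual test statistic and alert logic differ.

```python
def dem_consistency(gps_alt, radar_agl, dem_profile, threshold=30.0):
    """Compare a terrain profile synthesized from DGPS altitude minus
    radar-altimeter height with the stored DEM profile along the track.
    Returns (mean absolute disparity in meters, within-threshold flag).
    The 30 m threshold is a hypothetical illustration value."""
    synthesized = [g - r for g, r in zip(gps_alt, radar_agl)]
    disparities = [abs(s - d) for s, d in zip(synthesized, dem_profile)]
    metric = sum(disparities) / len(disparities)
    return metric, metric <= threshold

# Four along-track samples: aircraft altitude, height above ground, DEM value.
gps = [1500.0, 1520.0, 1540.0, 1560.0]
agl = [300.0, 310.0, 290.0, 315.0]
dem = [1198.0, 1207.0, 1252.0, 1243.0]
print(dem_consistency(gps, agl, dem))  # -> (2.25, True)
```

A corrupted DEM tile or a large horizontal datum error would inflate the disparity statistic and trip the alert, which is the hazardous-misleading-information case the monitor is designed to catch.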

  14. Deconstruction of spatial integrity in visual stimulus detected by modulation of synchronized activity in cat visual cortex.

    PubMed

    Zhou, Zhiyi; Bernard, Melanie R; Bonds, A B

    2008-04-02

    Spatiotemporal relationships among contour segments can influence synchronization of neural responses in the primary visual cortex. We performed a systematic study to dissociate the impact of spatial and temporal factors in the signaling of contour integration via synchrony. In addition, we characterized the temporal evolution of this process to clarify potential underlying mechanisms. With a 10 x 10 microelectrode array, we recorded the simultaneous activity of multiple cells in the cat primary visual cortex while stimulating with drifting sine-wave gratings. We preserved temporal integrity and systematically degraded spatial integrity of the sine-wave gratings by adding spatial noise. Neural synchronization was analyzed in the time and frequency domains by conducting cross-correlation and coherence analyses. The general association between neural spike trains depends strongly on spatial integrity, with coherence in the gamma band (35-70 Hz) showing greater sensitivity to the change of spatial structure than other frequency bands. Analysis of the temporal dynamics of synchronization in both time and frequency domains suggests that spike timing synchronization is triggered nearly instantaneously by coherent structure in the stimuli, whereas frequency-specific oscillatory components develop more slowly, presumably through network interactions. Our results suggest that, whereas temporal integrity is required for the generation of synchrony, spatial integrity is critical in triggering subsequent gamma band synchronization.

  15. Feasibility study, software design, layout and simulation of a two-dimensional Fast Fourier Transform machine for use in optical array interferometry

    NASA Technical Reports Server (NTRS)

    Boriakoff, Valentin

    1994-01-01

    The goal of this project was the feasibility study of a particular architecture of a digital signal processing machine operating in real time which could do in a pipeline fashion the computation of the fast Fourier transform (FFT) of a time-domain sampled complex digital data stream. The particular architecture makes use of simple identical processors (called inner product processors) in a linear organization called a systolic array. Through computer simulation the new architecture to compute the FFT with systolic arrays was proved to be viable, and computed the FFT correctly and with the predicted particulars of operation. Integrated circuits to compute the operations expected of the vital node of the systolic architecture were proven feasible, and even with a 2 micron VLSI technology can execute the required operations in the required time. Actual construction of the integrated circuits was successful in one variant (fixed point) and unsuccessful in the other (floating point).
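The systolic pipeline itself is a hardware design, but the transform it computes is the ordinary FFT. As a rough illustration only (not the inner-product-processor formulation from the study), a recursive radix-2 Cooley-Tukey FFT can be checked against a direct DFT of the same complex data stream:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the half-size transforms
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def dft(x):
    """Direct O(n^2) DFT, used as the reference result."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]
```

The systolic array computes the same values; its contribution is doing so in a pipeline of identical simple processors rather than by recursion.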

  16. The Mars Technology Program

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A.

    2002-01-01

    Future Mars missions require new capabilities that currently are not available. The Mars Technology Program (MTP) is an integral part of the Mars Exploration Program (MEP). Its sole purpose is to assure that required technologies are developed in time to enable the baselined and future missions. The MTP is a NASA-wide technology development program managed by JPL. It is divided into a Focused Program and a Base Program. The Focused Program is tightly tied to the proposed Mars Program mission milestones. It involves time-critical deliverables that must be developed in time for infusion into the proposed Mars 2005 and 2009 missions. In addition, a technology demonstration mission by AFRL will test a LIDAR as part of a joint NASA/AFRL experiment. This program bridges the gap between technology and projects by vertically integrating the technology work with pre-project development in a project-like environment with critical dates for technology infusion. The Base Technology Program attacks higher-risk/higher-payoff technologies not in the critical path of missions.

  17. Integrating Analysis Goals for EOP, CRF and TRF

    NASA Technical Reports Server (NTRS)

    Ma, Chopo; MacMillan, Daniel; Petrov, Leonid

    2002-01-01

    In a simplified, idealized way the TRF (Terrestrial Reference Frame) can be considered a set of positions at epoch and corresponding linear rates of change while the CRF (Celestial Reference Frame) is a set of fixed directions in space. VLBI analysis can be optimized for CRF and TRF separately while handling some of the complexity of geodetic and astrometric reality. For EOP (Earth Orientation Parameter) time series both CRF and TRF should be accurate at the epoch of interest and well defined over time. The optimal integration of EOP, TRF and CRF in a single VLBI solution configuration requires a detailed consideration of the data set and the possibly conflicting nature of the reference frames. A possible approach for an integrated analysis is described.

  18. Packetized Video On MAGNET

    NASA Astrophysics Data System (ADS)

    Lazar, Aurel A.; White, John S.

    1987-07-01

    Theoretical analysis of an integrated local area network model of MAGNET, an integrated network testbed developed at Columbia University, shows that the bandwidth freed up during video and voice calls (periods of little movement in the images and periods of silence in the speech signals) could be utilized efficiently for graphics and data transmission. Based on these investigations, an architecture supporting adaptive protocols that are dynamically controlled by the requirements of a fluctuating load and changing user environment has been advanced. To further analyze the behavior of the network, a real-time packetized video system has been implemented. This system is embedded in the real-time multimedia workstation EDDY, which integrates video, voice, and data traffic flows. Protocols supporting variable-bandwidth, fixed-quality packetized video transport are described in detail.

  19. The Information Technology Infrastructure for the Translational Genomics Core and the Partners Biobank at Partners Personalized Medicine

    PubMed Central

    Boutin, Natalie; Holzbach, Ana; Mahanta, Lisa; Aldama, Jackie; Cerretani, Xander; Embree, Kevin; Leon, Irene; Rathi, Neeta; Vickers, Matilde

    2016-01-01

    The Biobank and Translational Genomics core at Partners Personalized Medicine requires robust software and hardware. This Information Technology (IT) infrastructure enables the storage and transfer of large amounts of data, drives efficiencies in the laboratory, maintains data integrity from the time of consent to the time that genomic data is distributed for research, and enables the management of complex genetic data. Here, we describe the functional components of the research IT infrastructure at Partners Personalized Medicine and how they integrate with existing clinical and research systems, review some of the ways in which this IT infrastructure maintains data integrity and security, and discuss some of the challenges inherent to building and maintaining such infrastructure. PMID:26805892

  20. Lab-on-chip systems for integrated bioanalyses

    PubMed Central

    Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia

    2016-01-01

    Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042

  1. What Do Students Pay for College? Web Tables. NCES 2012-263

    ERIC Educational Resources Information Center

    Ginder, Scott; Mason, Marcinda

    2012-01-01

    The Integrated Postsecondary Education Data System (IPEDS) has long collected and reported data on tuition, required fees, and room and board charges for first-time, full-time undergraduate (FTFTUG) students. These charges, along with other miscellaneous expenses, constitute what is known as the "published price of attendance" or the…

  2. Integrating MRP (materiel requirements planning) into modern business.

    PubMed

    Lunn, T

    1994-05-01

    Time is the commodity of the '90s. Therefore, we all must learn how to use our manufacturing systems to shorten lead time and increase customer satisfaction. The objective of this article is to discuss practical ways people integrate the techniques of materiel requirements planning (MRP) systems with just-in-time (JIT) execution systems to increase customer satisfaction. Included are examples of new ways people use MRP systems to exemplify the process of continuous improvement--multiple items on work orders, consolidated routings, flexing capacity, and other new developments. Ways that successful companies use MRP II for planning and JIT for execution are discussed. There are many examples of how to apply theory to real life situations and a discussion of techniques that work to keep companies in the mode of continuous improvement. Also included is a look at hands-on, practical methods people use to achieve lead time reduction and simplify bills of material. Total quality management concepts can be applied to the MRP process itself. This in turn helps people improve schedule adherence, which leads to customer satisfaction.

  3. Discontinuous Spectral Difference Method for Conservation Laws on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Liu, Yen; Vinokur, Marcel

    2004-01-01

    A new, high-order, conservative, and efficient discontinuous spectral finite difference (SD) method for conservation laws on unstructured grids is developed. The concept of discontinuous and high-order local representations to achieve conservation and high accuracy is utilized in a manner similar to the Discontinuous Galerkin (DG) and the Spectral Volume (SV) methods, but while these methods are based on the integrated forms of the equations, the new method is based on the differential form to attain a simpler formulation and higher efficiency. Conventional unstructured finite-difference and finite-volume methods require data reconstruction based on the least-squares formulation using neighboring point or cell data. Since each unknown employs a different stencil, one must repeat the least-squares inversion for every point or cell at each time step, or to store the inversion coefficients. In a high-order, three-dimensional computation, the former would involve impractically large CPU time, while for the latter the memory requirement becomes prohibitive. In addition, the finite-difference method does not satisfy the integral conservation in general. By contrast, the DG and SV methods employ a local, universal reconstruction of a given order of accuracy in each cell in terms of internally defined conservative unknowns. Since the solution is discontinuous across cell boundaries, a Riemann solver is necessary to evaluate boundary flux terms and maintain conservation. In the DG method, a Galerkin finite-element method is employed to update the nodal unknowns within each cell. This requires the inversion of a mass matrix, and the use of quadratures of twice the order of accuracy of the reconstruction to evaluate the surface integrals and additional volume integrals for nonlinear flux functions. In the SV method, the integral conservation law is used to update volume averages over subcells defined by a geometrically similar partition of each grid cell. 
As the order of accuracy increases, the partitioning for 3D requires the introduction of a large number of parameters, whose optimization to achieve convergence becomes increasingly more difficult. Also, the number of interior facets required to subdivide non-planar faces, and the additional increase in the number of quadrature points for each facet, increases the computational cost greatly.

  4. Sixth- and eighth-order Hermite integrator for N-body simulations

    NASA Astrophysics Data System (ADS)

    Nitadori, Keigo; Makino, Junichiro

    2008-10-01

    We present sixth- and eighth-order Hermite integrators for astrophysical N-body simulations, which use the derivatives of accelerations up to second order (snap) and third order (crackle). These schemes do not require previous values for the corrector, and require only one previous value to construct the predictor. Thus, they are fairly easy to implement. The additional cost of calculating the higher-order derivatives is not very high. Even for the eighth-order scheme, the number of floating-point operations for the force calculation is only about two times larger than that for the traditional fourth-order Hermite scheme. The sixth-order scheme is better than the traditional fourth-order scheme for most cases. When the required accuracy is very high, the eighth-order one is the best. These high-order schemes have several practical advantages. For example, they allow a larger number of particles to be integrated in parallel than the fourth-order scheme does, resulting in higher execution efficiency on both general-purpose parallel computers and GRAPE systems.
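For context, the traditional fourth-order Hermite predictor-corrector that these schemes extend can be sketched in a few lines. This is a generic textbook version for a two-body Kepler problem, not code from the paper; the force law and step size are illustrative:

```python
import numpy as np

def acc_jerk(x, v):
    """Acceleration and jerk (da/dt) for a Kepler problem with G*M = 1."""
    r2 = np.dot(x, x)
    r = np.sqrt(r2)
    a = -x / r**3
    j = -v / r**3 + 3.0 * np.dot(x, v) * x / r**5
    return a, j

def hermite4_step(x, v, dt):
    """One fourth-order Hermite predictor-corrector step."""
    a0, j0 = acc_jerk(x, v)
    # Predictor: Taylor expansion using acceleration and jerk
    xp = x + v * dt + a0 * dt**2 / 2 + j0 * dt**3 / 6
    vp = v + a0 * dt + j0 * dt**2 / 2
    # Evaluate the force at the predicted state
    a1, j1 = acc_jerk(xp, vp)
    # Corrector: Hermite interpolation of the force polynomial
    v1 = v + (a0 + a1) * dt / 2 + (j0 - j1) * dt**2 / 12
    x1 = x + (v + v1) * dt / 2 + (a0 - a1) * dt**2 / 12
    return x1, v1

def energy(x, v):
    """Specific orbital energy, used as a conservation check."""
    return 0.5 * np.dot(v, v) - 1.0 / np.linalg.norm(x)
```

For a circular orbit (x = (1, 0), v = (0, 1)) integrated over one period with dt = 0.01, the energy drift stays many orders of magnitude below the orbital energy, the usual sanity check for Hermite schemes. The sixth- and eighth-order variants described above add snap and crackle terms to the same predictor-corrector structure.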

  5. Speed of perceptual grouping in acquired brain injury.

    PubMed

    Kurylo, Daniel D; Larkin, Gabriella Brick; Waxman, Richard; Bukhari, Farhan

    2014-09-01

    Evidence exists that damage to white matter connections may contribute to reduced speed of information processing in traumatic brain injury and stroke. Damage to such axonal projections suggests a particular vulnerability to functions requiring integration across cortical sites. To test this prediction, measurements were made of perceptual grouping, which requires integration of stimulus components. A group of traumatic brain injury and cerebral vascular accident patients and a group of age-matched healthy control subjects viewed arrays of dots and indicated the pattern into which stimuli were perceptually grouped. Psychophysical measurements were made of perceptual grouping as well as processing speed. The patient group showed elevated grouping thresholds as well as extended processing time. In addition, most patients showed progressive slowing of processing speed across levels of difficulty, suggesting reduced resources to accommodate increased demands on grouping. These results support the prediction that brain injury results in a particular vulnerability to functions requiring integration of information across the cortex, which may result from dysfunction of long-range axonal connections.

  6. The evolution of meaning: spatio-temporal dynamics of visual object recognition.

    PubMed

    Clarke, Alex; Taylor, Kirsten I; Tyler, Lorraine K

    2011-08-01

    Research on the spatio-temporal dynamics of visual object recognition suggests a recurrent, interactive model whereby an initial feedforward sweep through the ventral stream to prefrontal cortex is followed by recurrent interactions. However, critical questions remain regarding the factors that mediate the degree of recurrent interactions necessary for meaningful object recognition. The novel prediction we test here is that recurrent interactivity is driven by increasing semantic integration demands as defined by the complexity of semantic information required by the task and driven by the stimuli. To test this prediction, we recorded magnetoencephalography data while participants named living and nonliving objects during two naming tasks. We found that the spatio-temporal dynamics of neural activity were modulated by the level of semantic integration required. Specifically, source reconstructed time courses and phase synchronization measures showed increased recurrent interactions as a function of semantic integration demands. These findings demonstrate that the cortical dynamics of object processing are modulated by the complexity of semantic information required from the visual input.

  7. Modeling biological pathway dynamics with timed automata.

    PubMed

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (Analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.
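A heavily simplified sketch of the discretized, single-parameter kinetics idea: each node carries an integer activity level, and each interaction has one rate. This toy loop is an assumption for illustration only; ANIMO itself compiles networks to Timed Automata and analyzes them with UPPAAL rather than stepping them like this:

```python
# Toy discrete-activity network update, loosely inspired by the
# single-parameter kinetics described above (not the ANIMO/UPPAAL engine).
# Activity levels are integers clipped to [0, MAX].

MAX = 100

def step(levels, edges, dt=1.0):
    """Advance all node activities by one synchronous time step.

    levels: dict node -> integer activity in [0, MAX]
    edges:  list of (source, target, rate, sign), sign +1 activates,
            -1 inhibits; one rate parameter per interaction.
    """
    delta = {n: 0.0 for n in levels}
    for src, dst, rate, sign in edges:
        # Each interaction's contribution scales with the source activity
        delta[dst] += sign * rate * levels[src] * dt
    return {n: min(MAX, max(0, round(levels[n] + delta[n])))
            for n in levels}
```

Running a simple activation chain A -> B -> C shows activity propagating down the pathway step by step, which is the kind of qualitative dynamic behavior the tool displays graphically.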

  8. A Wide Dynamic Range Tapped Linear Array Image Sensor

    NASA Astrophysics Data System (ADS)

    Washkurak, William D.; Chamberlain, Savvas G.; Prince, N. Daryl

    1988-08-01

    Detectors for acousto-optic signal processing applications require fast transient response as well as wide dynamic range. There are two major choices of detectors: conductive or integrating mode. Conductive mode detectors have an initial transient period before they reach their equilibrium state. The duration of this period is dependent on light level as well as detector capacitance. At low light levels a conductive mode detector is very slow; response time is typically on the order of milliseconds. Generally, to obtain fast transient response an integrating mode detector is preferred. With integrating mode detectors, the dynamic range is determined by the charge storage capability of the transport shift registers and the noise level of the image sensor. The conventional method used to improve dynamic range is to increase the shift register charge storage capability. To achieve a dynamic range of fifty thousand, assuming two hundred noise-equivalent electrons, a charge storage capability of ten million electrons would be required. In order to accommodate this amount of charge, unrealistic shift register widths would be required. Therefore, with an integrating mode detector it is difficult to achieve a dynamic range of over four orders of magnitude of input light intensity. Another alternative is to solve the problem at the photodetector and not the shift register. DALSA's wide dynamic range detector utilizes an optimized, ion-implant-doped, profiled MOSFET photodetector specifically designed for wide dynamic range. When this new detector operates at high speed and at low light levels, the photons are collected and stored in an integrating fashion. However, at bright light levels, where transient periods are short, the detector switches into a conductive mode. The light intensity is logarithmically compressed into small charge packets, easily carried by the CCD shift register.
As a result of the logarithmic conversion, dynamic ranges of over six orders of magnitude are obtained. To achieve the short integration times necessary in acousto-optic applications, the wide dynamic range detector has been implemented in a tapped array architecture with eight outputs and 256 photoelements. Operation of each output at 16 MHz yields detector integration times of 2 microseconds. Buried-channel, two-phase CCD shift register technology is utilized to minimize image sensor noise, improve video output rates, and increase ease of operation.

  9. Numerical analysis of the asymptotic two-point boundary value solution for N-body trajectories.

    NASA Technical Reports Server (NTRS)

    Lancaster, J. E.; Allemann, R. A.

    1972-01-01

    Previously published asymptotic solutions for lunar and interplanetary trajectories have been modified and combined to formulate a general analytical boundary value solution applicable to a broad class of trajectory problems. In addition, the earlier first-order solutions have been extended to second-order to determine if improved accuracy is possible. Comparisons between the asymptotic solution and numerical integration for several lunar and interplanetary trajectories show that the asymptotic solution is generally quite accurate. Also, since no iterations are required, a solution to the boundary value problem is obtained in a fraction of the time required for numerically integrated solutions.

  10. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and inflight calibration data with MGEANT simulation.

  11. Monte Carlo Simulations and Generation of the SPI Response

    NASA Technical Reports Server (NTRS)

    Sturner, S. J.; Shrader, C. R.; Weidenspointner, G.; Teegarden, B. J.; Attie, D.; Cordier, B.; Diehl, R.; Ferguson, C.; Jean, P.; vonKienlin, A.

    2003-01-01

    In this paper we discuss the methods developed for the production of the INTEGRAL/SPI instrument response. The response files were produced using a suite of Monte Carlo simulation software developed at NASA/GSFC based on the GEANT-3 package available from CERN. The production of the INTEGRAL/SPI instrument response also required the development of a detailed computer mass model for SPI. We discuss our extensive investigations into methods to reduce both the computation time and storage requirements for the SPI response. We also discuss corrections to the simulated response based on our comparison of ground and inflight calibration data with MGEANT simulations.

  12. GPS Integrity Channel RTCA Working Group recommendations

    NASA Astrophysics Data System (ADS)

    Kalafus, Rudolph M.

    Recommendations made by a working group established by the Radio Technical Commission for Aeronautics are presented for the design of a wide-area broadcast service to provide indications on the status of GPS satellites. The integrity channel requirements and operational goals are outlined. Six integrity channel system concepts are considered and system design and time-to-alarm considerations are examined. The recommended system includes the broadcast of a coarse range measurement for each satellite which will enable the on-board GPS receiver to determine whether or not the navigation accuracy is within prescribed limits.

  13. Integrated SeismoGeodetic System with High-Resolution, Real-Time GNSS and Accelerometer Observation for Earthquake Early Warning Application.

    NASA Astrophysics Data System (ADS)

    Passmore, P. R.; Jackson, M.; Zimakov, L. G.; Raczka, J.; Davidson, P.

    2014-12-01

    The key requirements for Earthquake Early Warning and other Rapid Event Notification Systems are: Quick delivery of digital data from a field station to the acquisition and processing center; Data integrity for real-time earthquake notification in order to provide warning prior to significant ground shaking in the given target area. These two requirements are met in the recently developed Trimble SG160-09 SeismoGeodetic System, which integrates both GNSS and acceleration measurements using the Kalman filter algorithm to create a new high-rate (200 sps), real-time displacement with sufficient accuracy and very low latency for rapid delivery of the acquired data to a processing center. The data acquisition algorithm in the SG160-09 System provides output of both acceleration and displacement digital data with 0.2 sec delay. This is a significant reduction in the time interval required for real-time transmission compared to data delivery algorithms available in digitizers currently used in other Earthquake Early Warning networks. Both acceleration and displacement data are recorded and transmitted to the processing site in a specially developed Multiplexed Recording Format (MRF) that minimizes the bandwidth required for real-time data transmission. In addition, a built in algorithm calculates the τc and Pd once the event is declared. The SG160-09 System keeps track of what data has not been acknowledged and re-transmits the data giving priority to current data. Modified REF TEK Protocol Daemon (RTPD) receives the digital data and acknowledges data received without error. It forwards this "good" data to processing clients of various real-time data processing software including Earthworm and SeisComP3. The processing clients cache packets when a data gap occurs due to a dropped packet or network outage. The cache packet time is settable, but should not exceed 0.5 sec in the Earthquake Early Warning network configuration. 
The rapid data transmission algorithm was tested with different communication media, including Internet, DSL, Wi-Fi, GPRS, etc. The test results show that the data latency via most communication media does not exceed a nominal 0.5 sec from the first sample in the data packet. The detailed acquisition algorithm and results of data transmission via different communication media are presented.
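The "re-transmit unacknowledged data while giving priority to current data" policy described above can be illustrated with a hypothetical queue. The class and method names below are invented for this sketch and are not the SG160-09 or RTPD interfaces:

```python
from collections import deque

class RetransmitQueue:
    """Hypothetical sketch of a 'current data first' transmit policy:
    fresh packets are always sent before any unacknowledged backlog."""

    def __init__(self):
        self.fresh = deque()     # packets never sent yet
        self.backlog = deque()   # sent but not yet acknowledged

    def push(self, seq, payload):
        """Queue a newly acquired data packet."""
        self.fresh.append((seq, payload))

    def next_to_send(self):
        """Pick the next packet: current data takes priority, then the
        oldest unacknowledged packet is re-transmitted."""
        if self.fresh:
            pkt = self.fresh.popleft()
        elif self.backlog:
            pkt = self.backlog.popleft()
        else:
            return None
        self.backlog.append(pkt)  # stays pending until acknowledged
        return pkt

    def ack(self, seq):
        """Drop a packet from the backlog once the receiver confirms it."""
        self.backlog = deque(p for p in self.backlog if p[0] != seq)
```

The point of the policy is that a network outage never delays the newest samples, which matters most for early warning; the backlog drains only when spare capacity exists.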

  14. Training & Personnel Systems Technology. R&D Program Description FY 84-85.

    DTIC Science & Technology

    1984-04-01

    performance requirements in terms of rapid response times, high rates of information processing, and complex decision making that tax the capabilities...makers to make linguistic and format changes to texts to enhance general literacy rates, (d) begin integrating human and animal data on stress effects...systems are being integrated into the force at unprecedented rates, arrival of this sophisticated, high-technology equipment will coincide with increased

  15. Implications of Responsive Space on the Flight Software Architecture

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but the development infrastructure and software life-cycle process elements as well. The runtime element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight quality software. Very rapid response times go even further, and imply little or no new software development, requiring instead, using only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually with significant benefits, but it is when they are combined that they can have the greatest impact to Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure and process elements needed for rapid integration with the Core Flight Software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.

  16. Bayes factors for the linear ballistic accumulator model of decision-making.

    PubMed

    Evans, Nathan J; Brown, Scott D

    2018-04-01

    Evidence accumulation models of decision-making have led to advances in several different areas of psychology. These models provide a way to integrate response time and accuracy data, and to describe performance in terms of latent cognitive processes. Testing important psychological hypotheses using cognitive models requires a method to make inferences about different versions of the models which assume different parameters to cause observed effects. The task of model-based inference using noisy data is difficult, and has proven especially problematic with current model selection methods based on parameter estimation. We provide a method for computing Bayes factors through Monte-Carlo integration for the linear ballistic accumulator (LBA; Brown and Heathcote, 2008), a widely used evidence accumulation model. Bayes factors are used frequently for inference with simpler statistical models, and they do not require parameter estimation. In order to overcome the computational burden of estimating Bayes factors via brute force integration, we exploit general purpose graphical processing units; we provide free code for this. This approach allows estimation of Bayes factors via Monte-Carlo integration within a practical time frame. We demonstrate the method using both simulated and real data. We investigate the stability of the Monte-Carlo approximation, and the LBA's inferential properties, in simulation studies.
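The LBA likelihood itself is beyond a short sketch, but the brute-force Monte-Carlo marginal-likelihood idea is simple: sample parameters from the prior, average the likelihood over those samples, and take the ratio between models. A toy Gaussian example (the models and numbers are illustrative assumptions, not the LBA):

```python
import math
import random

def log_lik(data, mu, sigma=1.0):
    """Gaussian log-likelihood with known sigma."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu)**2 / (2 * sigma**2) for x in data)

def marginal_likelihood(data, prior_sampler, n=20000, seed=0):
    """Brute-force Monte Carlo: p(D|M) ~= mean of p(D|theta), theta ~ prior."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += math.exp(log_lik(data, prior_sampler(rng)))
    return total / n

def bayes_factor_10(data):
    """BF for M1 (mu ~ Normal(0, 1)) against M0 (mu fixed at 0).

    M0 has no free parameter, so its marginal likelihood is just the
    likelihood evaluated at mu = 0."""
    m0 = math.exp(log_lik(data, 0.0))
    m1 = marginal_likelihood(data, lambda rng: rng.gauss(0.0, 1.0))
    return m1 / m0
```

Data centered far from zero favor the model with a free mean (BF > 1), while data centered at zero favor the simpler fixed model (BF < 1), since the prior average dilutes the peaked likelihood. The GPU approach in the paper applies this same averaging at a scale where plain loops would be impractical.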

  17. Integrated calibration sphere and calibration step fixture for improved coordinate measurement machine calibration

    DOEpatents

    Clifford, Harry J [Los Alamos, NM

    2011-03-22

    A method and apparatus for mounting a calibration sphere to a calibration fixture for Coordinate Measurement Machine (CMM) calibration and qualification is described, decreasing the time required for such qualification, thus allowing the CMM to be used more productively. A number of embodiments are disclosed that allow for new and retrofit manufacture to perform as integrated calibration sphere and calibration fixture devices. This invention renders unnecessary the removal of a calibration sphere prior to CMM measurement of calibration features on calibration fixtures, thereby greatly reducing the time spent qualifying a CMM.

  18. General Multimechanism Reversible-Irreversible Time-Dependent Constitutive Deformation Model Being Developed

    NASA Technical Reports Server (NTRS)

    Saleeb, A. F.; Arnold, Steven M.

    2001-01-01

    Since most advanced material systems (for example metallic-, polymer-, and ceramic-based systems) being currently researched and evaluated are for high-temperature airframe and propulsion system applications, the required constitutive models must account for both reversible and irreversible time-dependent deformations. Furthermore, since an integral part of continuum-based computational methodologies (be they microscale- or macroscale-based) is an accurate and computationally efficient constitutive model to describe the deformation behavior of the materials of interest, extensive research efforts have been made over the years on the phenomenological representations of constitutive material behavior in the inelastic analysis of structures. From a more recent and comprehensive perspective, the NASA Glenn Research Center in conjunction with the University of Akron has emphasized concurrently addressing three important and related areas: that is, 1) Mathematical formulation; 2) Algorithmic developments for updating (integrating) the external (e.g., stress) and internal state variables; 3) Parameter estimation for characterizing the model. This concurrent perspective to constitutive modeling has enabled the overcoming of the two major obstacles to fully utilizing these sophisticated time-dependent (hereditary) constitutive models in practical engineering analysis. These obstacles are: 1) Lack of efficient and robust integration algorithms; 2) Difficulties associated with characterizing the large number of required material parameters, particularly when many of these parameters lack obvious or direct physical interpretations.

  19. Start-up capabilities of photovoltaic module for the International Space Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hajela, G.; Hague, L.

    1997-12-31

    The International Space Station (ISS) uses four photovoltaic modules (PVMs) to supply electric power for the US On-Orbit Segment (USOS). The ISS is assembled on orbit over a period of about 5 years and over 40 stages. PVMs are launched and integrated with the ISS at different times during the ISS assembly. During early stages, the electric power is provided by the integrated truss segment (ITS) P6; subsequently, ITS P4, S4, and S6 are launched. PVMs are launched into space in the National Space Transportation System (NSTS) cargo bay. Each PVM consists of two independent power channels. After the NSTS docks with the ISS, the PVM is removed from the cargo bay and installed on the ISS. At this stage the PVM is in stowed configuration and its batteries are in a fully discharged state. The start-up consists of initialization and checkout of all hardware, deployment of the solar array wings (SAW) and photovoltaic radiator (PVR), thermal conditioning of the batteries, and charging of the batteries, not necessarily in the same order for all PVMs. PVMs are designed to be capable of on-orbit start-up, within a specified time period, when external power is applied to a specified electrical interface. This paper describes the essential steps required for PVM start-up and how these operations are performed for various PVMs. The integrated operations scenarios (IOS) prepared by NASA Johnson Space Center detail specific procedures and timelines for start-up of each PVM. The paper describes how dormant batteries are brought to their normal operating temperature range and then charged to 100% state of charge (SOC). Total time required to complete start-up is computed and compared to the IOS timelines. External power required during start-up is computed and compared to the requirements and/or available power on the ISS. Also described is how these start-up procedures can be adapted for restart of PVMs when required.

  20. Putting the Pieces Together for Queer Youth: A Model of Integrated Assessment of Need and Program Planning

    ERIC Educational Resources Information Center

    Berberet, Heather M.

    2006-01-01

    Needs assessments require staff with the necessary expertise to design the study, collect the data, analyze the data, and present results. They require money, time, and persistence, because the people one wishes to assess often are difficult to access. This article argues for the centrality of a well-done needs assessment when developing services…

  1. Listening as a Method of Learning a Foreign Language at the Non-Language Faculty of the University

    ERIC Educational Resources Information Center

    Kondrateva, Irina G.; Safina, Minnisa S.; Valeev, Agzam A.

    2016-01-01

    Learning a foreign language is becoming increasingly important with Russia's integration into the world community. In this regard, increased requirements for the educational process and the development of new innovative teaching methods meet the requirements of the time. One of the important aspects of learning a foreign language is listening…

  2. Accuracy and optimal timing of activity measurements in estimating the absorbed dose of radioiodine in the treatment of Graves' disease

    NASA Astrophysics Data System (ADS)

    Merrill, S.; Horowitz, J.; Traino, A. C.; Chipkin, S. R.; Hollot, C. V.; Chait, Y.

    2011-02-01

    Calculation of the therapeutic activity of radioiodine 131I for individualized dosimetry in the treatment of Graves' disease requires an accurate estimate of the thyroid absorbed radiation dose based on a tracer activity administration of 131I. Common approaches (Marinelli-Quimby formula, MIRD algorithm) use, respectively, the effective half-life of radioiodine in the thyroid and the time-integrated activity. Many physicians perform one, two, or at most three tracer dose activity measurements at various times and calculate the required therapeutic activity by ad hoc methods. In this paper, we study the accuracy of estimates of four 'target variables': time-integrated activity coefficient, time of maximum activity, maximum activity, and effective half-life in the gland. Clinical data from 41 patients who underwent 131I therapy for Graves' disease at the University Hospital in Pisa, Italy, are used for analysis. The radioiodine kinetics are described using a nonlinear mixed-effects model. The distributions of the target variables in the patient population are characterized. Using minimum root mean squared error as the criterion, optimal 1-, 2-, and 3-point sampling schedules are determined for estimation of the target variables, and probabilistic bounds are given for the errors under the optimal times. An algorithm is developed for computing the optimal 1-, 2-, and 3-point sampling schedules for the target variables. This algorithm is implemented in a freely available software tool. Taking into consideration 131I effective half-life in the thyroid and measurement noise, the optimal 1-point time for time-integrated activity coefficient is a measurement 1 week following the tracer dose. Additional measurements give only a slight improvement in accuracy.

  3. Flight elements: Fault detection and fault management

    NASA Technical Reports Server (NTRS)

    Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.

    1990-01-01

    Fault management for an intelligent computational system must be developed using a top-down, integrated engineering approach. The proposed approach integrates the overall environment, involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models, including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. A real-time intelligent fault detection and management system will be realized through several objectives: development of fault-tolerant/FDIR requirements and specifications at the systems level that carry through from conceptual design to implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lower development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.

  4. Combined Monte Carlo and path-integral method for simulated library of time-resolved reflectance curves from layered tissue models

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann

    2009-02-01

    Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
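    The scaling step described above can be sketched numerically. This is a minimal illustration, not the authors' code: the function name, the layer time-fractions being supplied directly, and the optical constants are all assumptions.

    ```python
    import numpy as np

    def scale_reflectance(r0, t, mu_a, frac, n=1.4, c0=3e10):
        """Apply a weighted Beer-Lambert factor to a zero-absorption
        time-resolved reflectance curve R0(t) for a layered tissue model.

        r0   : zero-absorption reflectance samples, shape (nt,)
        t    : photon arrival times in seconds, shape (nt,)
        mu_a : absorption coefficient of each layer in 1/cm, shape (nl,)
        frac : fraction of the average classical path spent in each layer,
               shape (nl,) for a time-independent weighting or (nl, nt)
        """
        v = c0 / n                      # photon speed in tissue (cm/s)
        mu_a = np.asarray(mu_a, dtype=float)
        frac = np.asarray(frac, dtype=float)
        # effective absorption along the average path: sum_i mu_a,i * f_i
        mu_eff = mu_a @ frac            # scalar, or shape (nt,)
        return r0 * np.exp(-mu_eff * v * t)
    ```

    Because the absorption enters only through the exponential factor, a single zero-absorption simulation can be reused for an entire library of absorption coefficients, which is the source of the speed-up.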

  5. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  6. Coupled Cryogenic Thermal and Electrical Models for Transient Analysis of Superconducting Power Devices with Integrated Cryogenic Systems

    NASA Astrophysics Data System (ADS)

    Satyanarayana, S.; Indrakanti, S.; Kim, J.; Kim, C.; Pamidi, S.

    2017-12-01

    Benefits of an integrated high temperature superconducting (HTS) power system and the associated cryogenic systems on board an electric ship or aircraft are discussed. A versatile modelling methodology developed to assess the cryogenic thermal behavior of the integrated system with multiple HTS devices, and the various potential configurations, are introduced. The utility and effectiveness of the developed modelling methodology are demonstrated using a case study involving a hypothetical system including an HTS propulsion motor, an HTS generator and an HTS power cable cooled by an integrated cryogenic helium circulation system. Using the methodology, multiple configurations are studied. The required total cooling power and the ability to maintain each HTS device at the required operating temperatures are considered, and the trade-offs are discussed, for each configuration. Transient analysis of temperature evolution in the cryogenic helium circulation loop in case of a system failure is carried out to arrive at the required critical response time. The analysis was also performed for a similar liquid nitrogen circulation loop under isobaric conditions, and the cooling capacity ratio is used to compare the relative merits of the two cryogens.

  7. Programmable logic construction kits for hyper-real-time neuronal modeling.

    PubMed

    Guerrero-Rivera, Ruben; Morrison, Abigail; Diesmann, Markus; Pearce, Tim C

    2006-11-01

    Programmable logic designs are presented that achieve exact integration of leaky integrate-and-fire soma and dynamical synapse neuronal models and incorporate spike-time dependent plasticity and axonal delays. Highly accurate numerical performance has been achieved by modifying simpler forward-Euler-based circuitry requiring minimal circuit allocation, which, as we show, behaves equivalently to exact integration. These designs have been implemented and simulated at the behavioral and physical device levels, demonstrating close agreement with both numerical and analytical results. By exploiting finely grained parallelism and single clock cycle numerical iteration, these designs achieve simulation speeds at least five orders of magnitude faster than the nervous system, termed here hyper-real-time operation, when deployed on commercially available field-programmable gate array (FPGA) devices. Taken together, our designs form a programmable logic construction kit of commonly used neuronal model elements that supports the building of large and complex architectures of spiking neuron networks for real-time neuromorphic implementation, neurophysiological interfacing, or efficient parameter space investigations.
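    The equivalence the authors exploit, namely that a forward-Euler-style update with an exponential propagator integrates the leaky membrane equation exactly when the input is held constant over a step, can be illustrated in software. This is a behavioral sketch with hypothetical names and parameter values; the FPGA designs operate in fixed-point hardware, which is not modeled here.

    ```python
    import math

    def lif_exact_step(v, i_syn, dt, tau=20e-3, r_m=1e7):
        """Exact-integration update of the leaky membrane equation
        tau * dV/dt = -V + R*I, with the input held constant over dt."""
        a = math.exp(-dt / tau)
        return v * a + r_m * i_syn * (1.0 - a)

    def lif_euler_step(v, i_syn, dt, tau=20e-3, r_m=1e7):
        """Plain forward-Euler update of the same equation, for comparison;
        it approaches the exact update as dt/tau shrinks."""
        return v + (dt / tau) * (r_m * i_syn - v)
    ```

    With a constant input current, both updates relax toward the steady state V = R*I, but the exact update does so with no step-size error, which is what lets a single clock cycle per iteration remain numerically faithful.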

  8. Numerical Evaluation of the "Dual-Kernel Counter-flow" Matric Convolution Integral that Arises in Discrete/Continuous (D/C) Control Theory

    NASA Technical Reports Server (NTRS)

    Nixon, Douglas D.

    2009-01-01

    Discrete/Continuous (D/C) control theory is a new generalized theory of discrete-time control that expands the concept of conventional (exact) discrete-time control to create a framework for design and implementation of discrete-time control systems that include a continuous-time command function generator, so that actuator commands need not be constant between control decisions but can be more generally defined and implemented as functions that vary with time across the sample period. Because the plant/control system construct contains two linear subsystems arranged in tandem, a novel dual-kernel counter-flow convolution integral appears in the formulation. As part of the D/C system design and implementation process, numerical evaluation of that integral over the sample period is required. Three fundamentally different evaluation methods and associated algorithms are derived for the constant-coefficient case. Numerical results are matched against three available examples that have closed-form solutions.

  9. SIMULATING ATMOSPHERIC EXPOSURE USING AN INNOVATIVE METEOROLOGICAL SAMPLING SCHEME

    EPA Science Inventory

    Multimedia Risk assessments require the temporal integration of atmospheric concentration and deposition estimates with other media modules. However, providing an extended time series of estimates is computationally expensive. An alternative approach is to substitute long-ter...

  10. Flame hardened snow plow blades.

    DOT National Transportation Integrated Search

    2013-04-15

    Underbody plows and High Speed Ice Blades are an integral part of clearing Iowa roads of snow and ice during winter operations. Changing these blades requires crews to suspend plowing operations and return to the garage, decreasing time spent clea...

  11. A high-order boundary integral method for surface diffusions on elastically stressed axisymmetric rods.

    PubMed

    Li, Xiaofan; Nie, Qing

    2009-07-01

    Many applications in materials involve surface diffusion of elastically stressed solids. Study of singularity formation and long-time behavior of such solid surfaces requires accurate simulations in both space and time. Here we present a high-order boundary integral method for an elastically stressed solid with axisymmetry due to surface diffusion. In this method, the boundary integrals for isotropic elasticity in axisymmetric geometry are approximated through modified alternating quadratures along with an extrapolation technique, leading to an arbitrarily high-order quadrature; in addition, a high-order (temporal) integration factor method, based on explicit representation of the mean curvature, is used to reduce the stability constraint on the time step. To apply this method to a periodic (in the axial direction) and axisymmetric elastically stressed cylinder, we also present a fast and accurate summation method for the periodic Green's functions of isotropic elasticity. Using the high-order boundary integral method, we demonstrate that in the absence of elasticity the cylinder surface pinches in finite time at the axis of symmetry, and the universal cone angle of the pinching is found to be consistent with previous studies based on a self-similar assumption. In the presence of elastic stress, we show that a finite-time geometrical singularity occurs well before the cylindrical solid collapses onto the axis of symmetry, and the angle of the corner singularity on the cylinder surface is also estimated.

  12. Power Hardware-in-the-Loop Evaluation of PV Inverter Grid Support on Hawaiian Electric Feeders: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Austin; Prabakar, Kumaraguru; Nagarajan, Adarsh

    As more grid-connected photovoltaic (PV) inverters become compliant with evolving interconnection requirements, there is increased interest from utilities in understanding how to best deploy advanced grid-support functions (GSF) in the field. One efficient and cost-effective method to examine such deployment options is to leverage power hardware-in-the-loop (PHIL) testing methods. Two Hawaiian Electric feeder models were converted to real-time models in the OPAL-RT real-time digital testing platform and integrated with models of GSF-capable PV inverters that were modeled from characterization test data. The integrated model was subsequently used in PHIL testing to evaluate the effects of different fixed power factor and volt-watt control settings on voltage regulation of the selected feeders. The results of this study were provided as inputs for field deployment and technical interconnection requirements for grid-connected PV inverters on the Hawaiian Islands.

  13. Integrating population health into a family medicine clerkship: 7 years of evolution.

    PubMed

    Unverzagt, Mark; Wallerstein, Nina; Benson, Jeffrey A; Tomedi, Angelo; Palley, Toby B

    2003-01-01

    A population health curriculum using methodologies from community-oriented primary care (COPC) was developed in 1994 as part of a required third-year family medicine clerkship at the University of New Mexico. The curriculum integrates population health/community medicine projects and problem-based tutorials into a community-based, ambulatory clinical experience. By combining a required population health experience with relevant clinical training, student careers have the opportunity to be influenced during the critical third year. Results over a 7-year period describe a three-phase evolution of the curriculum, within the context of changes in medical education and in health care delivery systems in that same period of time. Early evaluation revealed that students viewed the curricular experience as time consuming and peripheral to their training. Later comments on the revised curriculum showed a higher regard for the experience that was described as important for student learning.

  14. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency-checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.

  15. Advanced software integration: The case for ITV facilities

    NASA Technical Reports Server (NTRS)

    Garman, John R.

    1990-01-01

    The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers-of-abstraction concepts popular in networking. Typified in the NASA Space Station Freedom will be the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces, and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.

  16. Optimal Black Start Resource Allocation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu, Feng; Wang, Jianhui; Chen, Chen

    The restoration of the bulk power system after a partial or complete blackout relies on black-start (BS) resources. To prepare for system restoration, it is important to procure the right amount of BS resources at the right locations in the grid so that the total restoration time can be minimized. Achieving this goal requires that resource procurement planning takes the restoration process into account. In this study, we integrate the BS resource procurement decision with a restoration planning model and develop an optimization model that produces a minimal cost procurement plan that satisfies the restoration time requirement.

  17. Integrating Real-Time Antecedent Rubrics via Blackboard™ into a Community College General Psychology Class

    ERIC Educational Resources Information Center

    Goomas, David

    2015-01-01

    Numerous studies have reported on the innovative and effective delivery of online course content by community colleges, but not much has been done on how learning management systems (LMS) can deliver real-time (immediate data delivery) antecedents that inform students of performance requirements. This pilot study used Blackboard's™ interactive…

  18. Picosecond Resolution Time-to-Digital Converter Using Gm-C Integrator and SAR-ADC

    NASA Astrophysics Data System (ADS)

    Xu, Zule; Miyahara, Masaya; Matsuzawa, Akira

    2014-04-01

    A picosecond resolution time-to-digital converter (TDC) is presented. The resolution of a conventional delay chain TDC is limited by the delay of a logic buffer. Various types of recent TDCs are successful in breaking this limitation, but they require a significant calibration effort to achieve picosecond resolution with a sufficient linear range. To address these issues, we propose a simple method to break the resolution limitation without any calibration: a Gm-C integrator followed by a successive approximation register analog-to-digital converter (SAR-ADC). This translates the time interval into charge, and then the charge is quantized. A prototype chip was fabricated in 90 nm CMOS. The measurement results reveal a 1 ps resolution, a -0.6/0.7 LSB differential nonlinearity (DNL), a -1.1/2.3 LSB integral nonlinearity (INL), and a 9-bit range. The measured 11.74 ps single-shot precision is caused by the noise of the integrator. We analyze the noise of the integrator and propose an improved front-end circuit to reduce this noise. The proposal is verified by simulations showing the maximum single-shot precision is less than 1 ps. The proposed front-end circuit can also diminish the mismatch effects.
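    The time-to-charge-to-code signal chain can be sketched behaviorally as follows. The component values are illustrative only, chosen so the arithmetic is easy to follow, and the integrator noise and mismatch effects analyzed in the paper are not modeled.

    ```python
    def tdc_convert(dt, gm=1e-3, c=1e-12, vref=1.0, bits=9):
        """Behavioral model of the Gm-C + SAR-ADC time-to-digital chain:
        the Gm-C stage turns the time interval into charge on C, and the
        SAR-ADC quantizes the resulting voltage by binary search."""
        v = gm * dt / c                  # integrator output: V = Gm*dt/C
        v = min(max(v, 0.0), vref)       # clip to the ADC input range
        code = 0
        for b in reversed(range(bits)):  # successive approximation loop
            trial = code | (1 << b)
            if trial * vref / (1 << bits) <= v:
                code = trial
        return code
    ```

    With these assumed values, one LSB corresponds to C*Vref/(Gm*2^bits), roughly 2 ps of input time, illustrating how translating time into charge sidesteps the logic-buffer delay limit of a delay-chain TDC.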

  19. On the enhanced detectability of GPS anomalous behavior with relative entropy

    NASA Astrophysics Data System (ADS)

    Cho, Jeongho

    2016-10-01

    A standard receiver autonomous integrity monitoring (RAIM) technique for the global positioning system (GPS) has been dedicated to providing an integrity monitoring capability for safety-critical GPS applications, such as in civil aviation for the en-route (ER) phase through non-precision approach (NPA) or lateral navigation (LNAV). The performance of the existing RAIM method, however, may not meet more stringent aviation requirements for availability and integrity during the precision approach and landing phases of flight, due to insufficient observables and/or untimely warning to the user beyond a specified time-to-alert in the event of a significant GPS failure. This has led to an enhanced RAIM architecture ensuring stricter integrity requirements by greatly decreasing the detection time when a satellite failure or a measurement error has occurred. We thus devised a user integrity monitor capable of identifying a GPS failure more rapidly than a standard RAIM scheme by combining RAIM with the relative entropy, a likelihood-ratio measure of the inconsistency between two data streams that is quite different from a Euclidean distance. In addition, a delay-coordinate embedding technique is applied as a preprocessing step to associate the discriminant measure obtained from the RAIM with the relative entropy in the new RAIM design. Simulation results demonstrate that the proposed user integrity monitor outperforms the standard RAIM, with a higher detection rate for anomalies that could be hazardous to users in the approach or landing phase, and is a very promising alternative for detecting deviations in the GPS signal. The comparison also shows that it can catch even small anomalous gradients more rapidly than a typical user integrity monitor.
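    The consistency measure at the heart of this approach is the relative entropy (Kullback-Leibler divergence) between two data streams. A minimal histogram-based estimate is sketched below; the binning and smoothing choices are assumptions, and the delay-coordinate embedding preprocessing used in the paper is omitted.

    ```python
    import numpy as np

    def relative_entropy(p_samples, q_samples, bins=20, eps=1e-9):
        """Estimate D(P||Q) from two sample streams via histograms on a
        shared support; a large value flags inconsistency between the
        monitored stream and the reference."""
        p_samples = np.asarray(p_samples, dtype=float)
        q_samples = np.asarray(q_samples, dtype=float)
        lo = min(p_samples.min(), q_samples.min())
        hi = max(p_samples.max(), q_samples.max())
        p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
        q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
        p = p / p.sum() + eps            # smooth empty bins
        q = q / q.sum() + eps
        return float(np.sum(p * np.log(p / q)))
    ```

    Unlike a Euclidean distance between the streams, this measure is asymmetric and grows sharply when the monitored stream places probability mass where the reference has essentially none, which is what makes small anomalous deviations detectable early.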

  20. Curricular Integration in Pharmacy Education

    PubMed Central

    Pearson, Marion L.; Hubball, Harry T.

    2012-01-01

    This article reviews the concepts of curricular integration and integrative learning. These concepts have reemerged in contemporary higher education reforms and are crucial in pharmacy programs where students are expected to acquire the knowledge, skills, and abilities needed for competent practice in a complex environment. Enhancing integration requires negotiating obstacles, including institutional traditions of disciplinary structures and disciplinary differences in understandings of knowledge and approaches to teaching and learning; investing the time and effort to design and implement integrated curricula; and using learning-centered pedagogical strategies. Evidence supporting the value of such efforts is not compelling, as much because of insufficient research as lackluster findings. Future avenues of scholarly inquiry are suggested to evaluate curricular integration, distinguishing between the curriculum espoused by planners, the curriculum enacted by instructors, and the curriculum experienced by students. PMID:23275669

  1. The Integrated Safety-Critical Advanced Avionics Communication and Control (ISAACC) System Concept: Infrastructure for ISHM

    NASA Technical Reports Server (NTRS)

    Gwaltney, David A.; Briscoe, Jeri M.

    2005-01-01

    Integrated System Health Management (ISHM) architectures for spacecraft will include hard real-time, critical subsystems and soft real-time monitoring subsystems. Interaction between these subsystems will be necessary and an architecture supporting multiple criticality levels will be required. Demonstration hardware for the Integrated Safety-Critical Advanced Avionics Communication & Control (ISAACC) system has been developed at NASA Marshall Space Flight Center. It is a modular system using a commercially available time-triggered protocol, TTP/C, that supports hard real-time distributed control systems independent of the data transmission medium. The protocol is implemented in hardware and provides guaranteed low-latency messaging with inherent fault-tolerance and fault-containment. Interoperability between modules and systems of modules using the TTP/C is guaranteed through definition of messages and the precise message schedule implemented by the master-less Time Division Multiple Access (TDMA) communications protocol. "Plug-and-play" capability for sensors and actuators provides automatically configurable modules supporting sensor recalibration and control algorithm re-tuning without software modification. Modular components of controlled physical system(s) critical to control algorithm tuning, such as pumps or valve components in an engine, can be replaced or upgraded as "plug and play" components without modification to the ISAACC module hardware or software. ISAACC modules can communicate with other vehicle subsystems through time-triggered protocols or other communications protocols implemented over Ethernet, MIL-STD-1553 and RS-485/422. Other communication bus physical layers and protocols can be included as required. In this way, the ISAACC modules can be part of a system-of-systems in a vehicle with multi-tier subsystems of varying criticality.
The goal of the ISAACC architecture development is control and monitoring of safety critical systems of a manned spacecraft. These systems include spacecraft navigation and attitude control, propulsion, automated docking, vehicle health management and life support. ISAACC can integrate local critical subsystem health management with subsystems performing long term health monitoring. The ISAACC system and its relationship to ISHM will be presented.

  2. Comparison of computer-integrated patient-controlled epidural analgesia with no initial basal infusion versus moderate basal infusion for labor and delivery: A randomized controlled trial

    PubMed Central

    Sng, Ban Leong; Woo, David; Leong, Wan Ling; Wang, Hao; Assam, Pryseley Nkouibert; Sia, Alex TH

    2014-01-01

    Background and Aims: Computer-integrated patient-controlled epidural analgesia (CIPCEA) is a novel epidural drug delivery system. It automatically adjusts the basal infusion based on the individual's need for analgesia as labor progresses. Materials and Methods: This study compared time-weighted local anesthetic (LA) consumption between parturients using CIPCEA with no initial basal infusion (CIPCEA0) and parturients using CIPCEA with an initial moderate basal infusion of 5 ml/H (CIPCEA5). We recruited 76 subjects after ethics approval. The computer integration of CIPCEA titrates the basal infusion to 5, 10, 15, or 20 ml/H if the parturient made, respectively, one, two, three, or four patient demands in the previous hour. The basal infusion is reduced by 5 ml/H if there was no demand in the previous hour. The sample size was calculated to show equivalence in LA consumption. Results: Time-weighted LA consumption was similar in both groups, with the CIPCEA0 group (mean [standard deviation (SD)] 8.9 [3.5] mg/H) comparable to the CIPCEA5 group (mean [SD] 9.9 [3.5] mg/H), P = 0.080. Both groups had a similar incidence of breakthrough pain, duration of the second stage, mode of delivery, and patient satisfaction. However, more subjects in the CIPCEA0 group required patient self-bolus. There were no differences in fetal outcomes. Discussion: Both CIPCEA regimens had similar time-weighted LA consumption, and an initial moderate basal infusion with CIPCEA may not be required. PMID:25425774
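    The hourly titration rule described above can be written as a small function. The function name, the cap at four demands, and the zero floor are assumptions made for illustration; the trial protocol governs the actual pump behavior.

    ```python
    def next_basal_rate(current_ml_per_h, demands_last_hour):
        """CIPCEA basal-rate update as described in the trial: one to four
        patient demands in the previous hour set the basal infusion to
        5/10/15/20 ml/h; no demands step it down by 5 ml/h."""
        if demands_last_hour == 0:
            return max(current_ml_per_h - 5, 0)   # assumed floor at 0 ml/h
        return min(demands_last_hour, 4) * 5      # assumed cap at 4 demands
    ```

    Under this rule, the CIPCEA0 arm simply starts the hourly recursion at 0 ml/h and the CIPCEA5 arm at 5 ml/h.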

  3. NY TBO Research: Integrated Demand Management (IDM): IDM Concept, Tools, and Training Package

    NASA Technical Reports Server (NTRS)

    Smith, Nancy

    2016-01-01

    A series of human-in-the-loop simulation sessions was conducted in the Airspace Operations Laboratory (AOL) to evaluate a new traffic management concept called Integrated Demand Management (IDM). The simulation explored how to address chronic equity, throughput, and delay issues associated with New York's high-volume airports by operationally integrating three current and NextGen capabilities: the Collaborative Trajectory Options Program (CTOP), Time-Based Flow Management (TBFM), and Required Time of Arrival (RTA), in order to better manage traffic demand within the National Air Traffic System. A package of presentation slides was developed to describe the concept, tools, and training materials used in the simulation sessions. The package will be used to outbrief our stakeholders, both through oral presentation and by disseminating the materials via email.

  4. Things That Scientists Don't Understand About NASA Spaceflight Research

    NASA Technical Reports Server (NTRS)

    Platts, S. H.; Bauer, Terri; Rogers, Shanna

    2017-01-01

    So you want to conduct human spaceflight research aboard the International Space Station (ISS)? Once your ISS research proposal is funded, the real work begins. Because resources are so limited for ISS research, it is necessary to maximize the work being done while, at the same time, minimizing the resources spent. Astronauts may be presented with over 30 human research experiments and select, on average, approximately 15 in which to participate. In order to conduct this many studies, ISSMP uses the study requirements provided by the principal investigator to integrate all of this work into the astronauts' complement. The most important thing for investigators to convey to the ISSMP team is their RESEARCH REQUIREMENTS. Requirements are captured in the Experiment document. This document is the official record of how, what, where, and when data will be collected. One common mistake that investigators make is not taking this document seriously, but when push comes to shove, if a research requirement is not in this document, it will not get done. The research requirements are then integrated to form a complement of research for each astronaut. What do we mean by integration? Many experiments have overlapping requirements: blood draws, behavioral surveys, heart rate measurement. Where possible, these measures are combined to reduce redundancy and save crew time. Investigators can access these data via data sharing agreements. More examples of how ISS research is integrated will be presented. There are additional limitations commonly associated with human spaceflight research that will also be discussed. Large/heavy hardware, invasive procedures, and toxic reagents are extremely difficult to implement on the ISS. There are strict limits placed on the amount of blood that can be drawn from crew members during (and immediately after) spaceflight. These limits are based on 30-day rolling accumulations. We have recently had to start restricting studies due to this limit. The NASA Human Research Program (HRP) provides extensive support, via ISSMP, to help investigators cope with all of the intricacies of conducting human spaceflight research. This presentation will help you take the best advantage of that support.

  5. VME rollback hardware for time warp multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Robb, Michael J.; Buzzell, Calvin A.

    1992-01-01

    The purpose of the research effort is to develop and demonstrate innovative hardware to implement specific rollback and timing functions required for efficient queue management and precision timekeeping in multiprocessor discrete event simulations. The previously completed phase 1 effort demonstrated the technical feasibility of building hardware modules which eliminate the state saving overhead of the Time Warp paradigm used in distributed simulations on multiprocessor systems. The current phase 2 effort will build multiple pre-production rollback hardware modules integrated with a network of Sun workstations, and the integrated system will be tested by executing a Time Warp simulation. The rollback hardware will be designed to interface with the greatest number of multiprocessor systems possible. The authors believe that the rollback hardware will provide for significant speedup of large scale discrete event simulation problems and allow multiprocessors using Time Warp to dramatically increase performance.

  6. Bessel function expansion to reduce the calculation time and memory usage for cylindrical computer-generated holograms.

    PubMed

    Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko

    2017-07-10

    This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transform of the kernel function appearing in this convolution integral is performed analytically using a Bessel function expansion. The analytical solution can drastically reduce the calculation time and memory usage at no extra cost, compared with the numerical method that uses the fast Fourier transform to transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of the Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.

  7. Effects of Repair on Structural Integrity.

    DOT National Transportation Integrated Search

    1993-12-01

    Commercial aircraft operators are required by FAA regulations to repair damaged aircraft structures. These repairs must be performed in a timely manner to reduce aircraft downtime and loss of revenue. A guiding principle that has been used for many a...

  8. The NCBI BioSystems database.

    PubMed

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  9. Blip decomposition of the path integral: exponential acceleration of real-time calculations on quantum dissipative systems.

    PubMed

    Makri, Nancy

    2014-10-07

    The real-time path integral representation of the reduced density matrix for a discrete system in contact with a dissipative medium is rewritten in terms of the number of blips, i.e., elementary time intervals over which the forward and backward paths are not identical. For a given set of blips, it is shown that the path sum with respect to the coordinates of all remaining time points is isomorphic to that for the wavefunction of a system subject to an external driving term and thus can be summed by an inexpensive iterative procedure. This exact decomposition reduces the number of terms by a factor that increases exponentially with propagation time. Further, under conditions (moderately high temperature and/or dissipation strength) that lead primarily to incoherent dynamics, the "fully incoherent limit" zero-blip term of the series provides a reasonable approximation to the dynamics, and the blip series converges rapidly to the exact result. Retention of only the blips required for satisfactory convergence leads to speedup of full-memory path integral calculations by many orders of magnitude.

  10. When unconscious rewards boost cognitive task performance inefficiently: the role of consciousness in integrating value and attainability information

    PubMed Central

    Zedelius, Claire M.; Veling, Harm; Aarts, Henk

    2012-01-01

    Research has shown that high vs. low value rewards improve cognitive task performance independent of whether they are perceived consciously or unconsciously. However, efficient performance in response to high value rewards also depends on whether or not rewards are attainable. This raises the question of whether unconscious reward processing enables people to take into account such attainability information. Building on a theoretical framework according to which conscious reward processing is required to enable higher-level cognitive processing, the present research tested the hypothesis that conscious but not unconscious reward processing enables integration of reward value with attainability information. In two behavioral experiments, participants were exposed to masked high and low value coins serving as rewards on a working memory (WM) task. The likelihood of conscious processing was manipulated by presenting the coins either relatively briefly (17 ms) or long enough to be clearly visible (300 ms). Crucially, rewards were expected to be attainable or unattainable. Requirements to integrate reward value with attainability information varied across experiments. Results showed that when integration of value and attainability was required (Experiment 1), long reward presentation led to efficient performance, i.e., selectively improved performance for high value attainable rewards. In contrast, in the short presentation condition, performance was increased for high value rewards even when these were unattainable. This difference between the effects of long and short presentation times disappeared when integration of value and attainability information was not required (Experiment 2). Together these findings suggest that unconsciously processed reward information is not integrated with attainability expectancies, causing inefficient effort investment. These findings are discussed in terms of a unique role of consciousness in efficient allocation of effort to cognitive control processes. PMID:22848198

  11. Simulation of a Real-Time Local Data Integration System over East-Central Florida

    NASA Technical Reports Server (NTRS)

    Case, Jonathan

    1999-01-01

    The Applied Meteorology Unit (AMU) simulated a real-time configuration of a Local Data Integration System (LDIS) using data from 15-28 February 1999. The objectives were to assess the utility of a simulated real-time LDIS, evaluate and extrapolate system performance to identify the hardware necessary to run a real-time LDIS, and determine the sensitivities of LDIS. The ultimate goal for running LDIS is to generate analysis products that enhance short-range (less than 6 h) weather forecasts issued in support of the 45th Weather Squadron, Spaceflight Meteorology Group, and Melbourne National Weather Service operational requirements. The simulation used the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) software on an IBM RS/6000 workstation with a 67-MHz processor. This configuration ran in real-time, but not sufficiently fast for operational requirements. Thus, the AMU recommends a workstation with a 200-MHz processor and 512 megabytes of memory to run the AMU's configuration of LDIS in real-time. This report presents results from two case studies and several data sensitivity experiments. ADAS demonstrates utility through its ability to depict high-resolution cloud and wind features in a variety of weather situations. The sensitivity experiments illustrate the influence of disparate data on the resulting ADAS analyses.

  12. Integrated propulsion for near-Earth space missions. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.

    1981-01-01

    Tradeoffs between electric propulsion system mass ratio and transfer time from LEO to GEO were conducted parametrically for various thruster efficiency, specific impulse, and other propulsion parameters. A computer model was developed for performing orbit transfer calculations which included the effects of aerodynamic drag, radiation degradation, and occultation. The tradeoff results showed that thruster technology areas for integrated propulsion should be directed towards improving primary thruster efficiency in the range from 1500 to 2500 seconds, and be continued towards reducing specific mass. Comparison of auxiliary propulsion systems showed large total propellant mass savings with integrated electric auxiliary propulsion. Stationkeeping is the most demanding on orbit propulsion requirement. At area densities above 0.5 sq m/kg, East-West stationkeeping requirements from solar pressure exceed North-South stationkeeping requirements from gravitational forces. A solar array pointing strategy was developed to minimize the effects of atmospheric drag at low altitude, enabling electric propulsion to initiate orbit transfer at Shuttle's maximum cargo carrying altitude. Gravity gradient torques are used during ascent to sustain the spacecraft roll motion required for optimum solar array illumination. A near optimum cover glass thickness of 6 mils was established for LEO to GEO transfer.

  13. Defense Logistics Standard Systems Functional Requirements.

    DTIC Science & Technology

    1987-03-01

    Artificial Intelligence - the development of a machine capability to perform functions normally concerned with human intelligence, such as learning, adapting... On-line, Interactive Access - integrating user input and machine output in a dynamic, real-time, give-and-take process is considered the optimum mode

  14. Event-driven processing for hardware-efficient neural spike sorting

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Pereira, João L.; Constandinou, Timothy G.

    2018-02-01

    Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide a new, efficient means of hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or less to represent the signals, whilst maintaining the signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
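    The continuous-time level-crossing sampling the authors describe replaces clock-driven sampling with events emitted only when the signal crosses fixed amplitude levels. A minimal discrete-time sketch of the idea (not the authors' hardware implementation) is:

```python
def level_crossing_events(signal, delta):
    """Emit (sample_index, direction) events whenever the signal moves a
    full level `delta` away from the last reference value. Quiet
    stretches of the signal produce no events at all, which is what
    makes the representation activity-dependent."""
    events = []
    ref = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - ref >= delta:   # upward crossings
            ref += delta
            events.append((i, +1))
        while ref - x >= delta:   # downward crossings
            ref -= delta
            events.append((i, -1))
    return events
```

    A spike then appears as a short burst of same-direction events, which is why simple time-domain features of the event stream can suffice for sorting.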

  15. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies these tasks. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save, and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end user and developer perspectives. The first example shows coupling of a rainfall-runoff model, a river flow model, and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.

  16. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    PubMed

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload, expressed as the time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematical functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally. © 2016 John Wiley & Sons Australia, Ltd.
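    The tool's core computation, as described, is a weighted sum of per-activity average times. A sketch under stated assumptions follows; the activity names and minute values are hypothetical, since the abstract does not list the 30 activities or their measured averages:

```python
# Hypothetical average minutes per activity; the real tool uses the
# observed averages for its 30 core activities.
AVG_MINUTES = {"informed_consent": 30.0, "drug_administration": 15.0, "data_entry": 10.0}

def estimated_nursing_time(activity_counts: dict) -> float:
    """Total estimated nursing minutes for one trial: each activity's
    average duration times how often the trial requires it."""
    return sum(AVG_MINUTES[name] * n for name, n in activity_counts.items())
```

    A trial requiring two consent sessions and three data-entry sessions would thus be estimated at 90 minutes under these placeholder values.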

  17. The prediction of the noise of supersonic propellers in time domain - New theoretical results

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1983-01-01

    In this paper, a new formula for the prediction of the noise of supersonic propellers is derived in the time domain which is superior to the previous formulations in several respects. The governing equation is based on the Ffowcs Williams-Hawkings (FW-H) equation with the thickness source term replaced by an equivalent loading source term derived by Isom (1975). Using some results of generalized function theory and simple four-dimensional space-time geometry, the formal solution of the governing equation is manipulated to a form requiring only the knowledge of blade surface pressure data and geometry. The final form of the main result of this paper consists of some surface and line integrals. The surface integrals depend on the surface pressure, time rate of change of surface pressure, and surface pressure gradient. These integrals also involve blade surface curvatures. The line integrals which depend on local surface pressure are along the trailing edge, the shock traces on the blade, and the perimeter of the airfoil section at the inner radius of the blade. The new formulation is for the full blade surface and does not involve any numerical observer time differentiation. The method of implementation on a computer for numerical work is also discussed.

  18. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real-time computer simulation model of the KU band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent simulation/radar simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  19. Research on Lunar Calibration of the GF-4 Satellite

    NASA Astrophysics Data System (ADS)

    Qi, W.; Tan, W.

    2018-04-01

    Starting from the lunar observation requirements of the GF-4 satellite, key parameters such as the resolution, the imaging field, the reflected radiance, and the imaging integration time are analyzed in combination with the imaging features and parameters of the camera. The analysis results show that the lunar observation of the GF-4 satellite has high resolution and a field wide enough to image the whole moon, that the radiance at the pupil reflected by the moon is within the dynamic range of the camera, and that the lunar image quality can be better guaranteed by setting a reasonable integration time. At the same time, the radiation transmission model of the lunar radiometric calibration is traced and the radiometric accuracy is evaluated.

  20. Benchmark measurements and calculations of a 3-dimensional neutron streaming experiment

    NASA Astrophysics Data System (ADS)

    Barnett, D. A., Jr.

    1991-02-01

    An experimental assembly known as the Dog-Legged Void assembly was constructed to measure the effect of neutron streaming in iron and void regions. The primary purpose of the measurements was to provide benchmark data against which various neutron transport calculation tools could be compared. The measurements included neutron flux spectra at four places and integral measurements at two places in the iron streaming path, as well as integral measurements along several axial traverses. These data have been used in the verification of Oak Ridge National Laboratory's three-dimensional discrete ordinates code, TORT. For a base case calculation using one-half inch mesh spacing, finite difference spatial differencing, an S(sub 16) quadrature and P(sub 1) cross sections in the MUFT multigroup structure, the calculated solution agreed to within 18 percent with the spectral measurements and to within 24 percent with the integral measurements. Variations on the base case using a fewgroup energy structure and P(sub 1) and P(sub 3) cross sections showed similar agreement. Calculations using a linear nodal spatial differencing scheme and fewgroup cross sections also showed similar agreement. For the same mesh size, the nodal method was seen to require 2.2 times as much CPU time as the finite difference method. A nodal calculation using a typical mesh spacing of 2 inches, which had approximately 32 times fewer mesh cells than the base case, agreed with the measurements to within 34 percent and yet required only 8 percent of the CPU time.

  1. Research on key technologies of data processing in internet of things

    NASA Astrophysics Data System (ADS)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

    The data of the Internet of Things (IOT) are characterized by polymorphism, heterogeneity, large volume, and real-time processing requirements. Traditional structured, static batch processing methods no longer meet the data processing requirements of the IOT. This paper studies a middleware that can integrate heterogeneous IOT data, converting different data formats into a unified format. A data processing model of the IOT based on the Storm stream-computing architecture is designed, and existing Internet security technology is integrated to build a security system for IOT data processing, providing a reference for the efficient transmission and processing of IOT data.
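    The format-unifying middleware can be sketched as a per-source field mapping. The device formats and unified keys below are hypothetical, since the paper does not specify its schema:

```python
# Hypothetical per-source field mappings; the unified keys are illustrative.
MAPPINGS = {
    "sensor_a": {"t": "timestamp", "v": "value", "id": "device_id"},
    "sensor_b": {"time": "timestamp", "reading": "value", "dev": "device_id"},
}

def normalize(record: dict, source: str) -> dict:
    """Translate one heterogeneous record into the unified format by
    renaming its fields per the source's mapping; unmapped fields drop."""
    mapping = MAPPINGS[source]
    return {unified: record[raw] for raw, unified in mapping.items() if raw in record}
```

    In a Storm-style topology, a bolt performing this normalization would sit between the ingest spout and any downstream processing bolts.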

  2. Integrated Environment for Ubiquitous Healthcare and Mobile IPv6 Networks

    NASA Astrophysics Data System (ADS)

    Cagalaban, Giovanni; Kim, Seoksoo

    The development of Internet technologies based on the IPv6 protocol will allow real-time monitoring of people with health deficiencies and improve the independence of elderly people. This paper proposes a ubiquitous healthcare system for personalized healthcare services with the support of mobile IPv6 networks. Specifically, this paper discusses the integration of ubiquitous healthcare and wireless networks and its functional requirements. This allows an integrated environment where heterogeneous devices such as mobile devices and body sensors can continuously monitor patient status and communicate remotely with healthcare servers, physicians, and family members to effectively deliver healthcare services.

  3. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  4. Integration and Testing of LCS Software

    NASA Technical Reports Server (NTRS)

    Wang, John

    2014-01-01

    Kennedy Space Center is in the midst of developing a command and control system for the launch of the next generation manned space vehicle. The Space Launch System (SLS) will launch using the new Spaceport Command and Control System (SCCS). As a member of the Software Integration and Test (SWIT) team, the author wrote command scripts and bash scripts to assist in integration and testing of the Launch Control System (LCS), which is a component of SCCS. The short-term and mid-term tasks are for the most part completed. The long-term tasks, if time permits, will require a presentation and demonstration.

  5. Integrated geometry and grid generation system for complex configurations

    NASA Technical Reports Server (NTRS)

    Akdag, Vedat; Wulf, Armin

    1992-01-01

    A grid generation system was developed that enables grid generation for complex configurations. The system called ICEM/CFD is described and its role in computational fluid dynamics (CFD) applications is presented. The capabilities of the system include full computer aided design (CAD), grid generation on the actual CAD geometry definition using robust surface projection algorithms, interfacing easily with known CAD packages through common file formats for geometry transfer, grid quality evaluation of the volume grid, coupling boundary condition set-up for block faces with grid topology generation, multi-block grid generation with or without point continuity and block to block interface requirement, and generating grid files directly compatible with known flow solvers. The interactive and integrated approach to the problem of computational grid generation not only substantially reduces manpower time but also increases the flexibility of later grid modifications and enhancements which is required in an environment where CFD is integrated into a product design cycle.

  6. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

    Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required or samples will be lost or corrupted; further, static nanospray-type flowrates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).

  7. From Chains for Mean Value Inequalities to Mitrinovic's Problem II

    ERIC Educational Resources Information Center

    Zhu, Ling

    2005-01-01

    In this note, an integrated form of some significant means with two variables is provided, and some chains for mean value inequalities are obtained. At the same time, a concise family of algebraic functions appears, which satisfy Mitrinovic's requirements.

  8. A transition from using multi-step procedures to a fully integrated system for performing extracorporeal photopheresis: A comparison of costs and efficiencies.

    PubMed

    Azar, Nabih; Leblond, Veronique; Ouzegdouh, Maya; Button, Paul

    2017-12-01

    The Pitié Salpêtrière Hospital Hemobiotherapy Department, Paris, France, has been providing extracorporeal photopheresis (ECP) since November 2011, and started using the Therakos® CELLEX® fully integrated system in 2012. This report summarizes our single-center experience of transitioning from the use of multi-step ECP procedures to the fully integrated ECP system, considering the capacity and cost implications. The total number of ECP procedures performed in 2011-2015 was derived from department records. The time taken to complete a single ECP treatment using a multi-step technique and the fully integrated system at our department was assessed. Resource costs (2014€) were obtained for materials and calculated for personnel time required. Time-driven activity-based costing methods were applied to provide a cost comparison. The number of ECP treatments per year increased from 225 (2012) to 727 (2015). The single multi-step procedure took 270 min compared to 120 min for the fully integrated system. The total calculated per-session cost of performing ECP using the multi-step procedure was greater than with the CELLEX® system (€1,429.37 and €1,264.70 per treatment, respectively). For hospitals considering a transition from multi-step procedures to fully integrated methods for ECP where cost may be a barrier, time-driven activity-based costing should be utilized to gain a more comprehensive understanding of the full benefit that such a transition offers. The example from our department confirmed that there were not just cost and time savings, but that the time efficiencies gained with CELLEX® allow for more patient treatments per year. © 2017 The Authors Journal of Clinical Apheresis Published by Wiley Periodicals, Inc.
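    The time-driven activity-based costing comparison reduces to pricing each session's personnel minutes and adding materials. Only the session durations (270 vs. 120 min) come from the report; the per-minute personnel rate and materials figures below are hypothetical placeholders, so the outputs will not match the published euro amounts:

```python
def per_session_cost(personnel_min: float, rate_per_min: float, materials_eur: float) -> float:
    """Time-driven activity-based cost of one ECP session:
    personnel time priced at a per-minute rate, plus materials."""
    return personnel_min * rate_per_min + materials_eur

# Hypothetical rates: 1.00 EUR/min personnel; illustrative materials costs.
multi_step = per_session_cost(270, 1.00, 1000.0)   # slower multi-step procedure
integrated = per_session_cost(120, 1.00, 1100.0)   # faster, dearer integrated kit
```

    Even with more expensive materials, the shorter session can come out cheaper once personnel time is priced in, which mirrors the direction of the published comparison.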

  9. Integrating Analysis Goals for EOP, CRF and TRF

    NASA Technical Reports Server (NTRS)

    Ma, Chopo; MacMillan, D.; Petrov, L.; Smith, David E. (Technical Monitor)

    2001-01-01

    In a simplified, idealized way, the TRF can be considered a set of positions at epoch with corresponding linear rates of change, while the CRF is a set of fixed directions in space. VLBI analysis can be optimized for the CRF and TRF separately while handling some of the complexity of geodetic and astrometric reality. For EOP time series, both the CRF and TRF should be accurate at the epoch of interest and well defined over time. Optimally integrating EOP, TRF and CRF in a single VLBI solution configuration requires detailed consideration of the data set and of the possibly conflicting nature of the reference frames.

  10. Recalibration of the Multisensory Temporal Window of Integration Results from Changing Task Demands

    PubMed Central

    Mégevand, Pierre; Molholm, Sophie; Nayak, Ashabari; Foxe, John J.

    2013-01-01

    The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands. PMID:23951203
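
    The race-model criterion used above can be sketched directly: multisensory facilitation is inferred at times where the audiovisual reaction-time distribution exceeds the bound obtained by summing the two unisensory distributions (Miller's inequality). The reaction-time samples below are synthetic illustrations, not the study's data:

```python
# Race-model (Miller inequality) check: facilitation is inferred where
# P(RT <= t | AV) exceeds P(RT <= t | A) + P(RT <= t | V).
# All RT samples are synthetic, for illustration only.
import numpy as np

def ecdf(samples, t):
    """Empirical CDF of the samples evaluated at time t."""
    return float(np.mean(np.asarray(samples) <= t))

def race_model_violated(rt_av, rt_a, rt_v, t):
    """True if the audiovisual CDF beats the race-model bound at time t."""
    bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    return ecdf(rt_av, t) > bound

rng = np.random.default_rng(0)
rt_a = rng.normal(320, 40, 500)   # auditory-only RTs (ms), synthetic
rt_v = rng.normal(350, 40, 500)   # visual-only RTs (ms), synthetic
rt_av = rng.normal(260, 30, 500)  # audiovisual RTs, faster than either

for t in (240, 260, 280):
    print(t, race_model_violated(rt_av, rt_a, rt_v, t))
```

    In practice the test is evaluated across a range of quantiles of the RT distributions rather than at fixed times, but the inequality itself is the one shown here.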

  11. Integral Sensor Fault Detection and Isolation for Railway Traction Drive.

    PubMed

    Garramiola, Fernando; Del Olmo, Jon; Poza, Javier; Madina, Patxi; Almandoz, Gaizka

    2018-05-13

    Due to the increasing importance of reliability and availability of electric traction drives in railway applications, early detection of faults has become a key concern for railway traction drive manufacturers. Sensor faults are important sources of failures. Among the different fault diagnosis approaches, this article presents an integral diagnosis strategy for sensors in traction drives. The strategy is composed of an observer-based approach for the direct current (DC)-link voltage and catenary current sensors, a frequency analysis approach for the motor phase current sensors, and a hardware redundancy solution for the speed sensors. None of them requires any hardware change to the actual traction drive. All the fault detection and isolation approaches have been validated on a hardware-in-the-loop platform comprising a real-time simulator and a commercial traction control unit for a tram. In contrast to safety-critical systems in aerospace applications, railway applications do not need instantaneous detection, and the diagnosis is validated over a short time period for a reliable decision. Combining the different approaches and the existing hardware redundancy, an integral fault diagnosis solution is provided to detect and isolate faults in all the sensors installed in the traction drive.

  12. Integral Sensor Fault Detection and Isolation for Railway Traction Drive

    PubMed Central

    del Olmo, Jon; Poza, Javier; Madina, Patxi; Almandoz, Gaizka

    2018-01-01

    Due to the increasing importance of reliability and availability of electric traction drives in railway applications, early detection of faults has become a key concern for railway traction drive manufacturers. Sensor faults are important sources of failures. Among the different fault diagnosis approaches, this article presents an integral diagnosis strategy for sensors in traction drives. The strategy is composed of an observer-based approach for the direct current (DC)-link voltage and catenary current sensors, a frequency analysis approach for the motor phase current sensors, and a hardware redundancy solution for the speed sensors. None of them requires any hardware change to the actual traction drive. All the fault detection and isolation approaches have been validated on a hardware-in-the-loop platform comprising a real-time simulator and a commercial traction control unit for a tram. In contrast to safety-critical systems in aerospace applications, railway applications do not need instantaneous detection, and the diagnosis is validated over a short time period for a reliable decision. Combining the different approaches and the existing hardware redundancy, an integral fault diagnosis solution is provided to detect and isolate faults in all the sensors installed in the traction drive. PMID:29757251

  13. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
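
    The planner/scheduler arrangement described here can be illustrated with a toy discrete-event Monte Carlo loop: a planner queues events on a timeline, a scheduler advances time, and outcomes are aggregated over many trials. Event names and daily probabilities below are hypothetical stand-ins, not IMM values:

```python
# Toy dynamic-PRA sketch: events are pushed onto a timeline queue
# ('planner') as a loop advances down the mission days ('scheduler'),
# and many Monte Carlo trials are aggregated. Event names and
# probabilities are hypothetical.
import heapq
import random

def run_trial(mission_days, daily_event_probs, rng):
    """One trial: walk the timeline and record (day, event) occurrences."""
    timeline = []  # heap of (day, event), maintained by the 'planner'
    for day in range(mission_days):            # 'scheduler' advances time
        for event, p in daily_event_probs.items():
            if rng.random() < p:
                heapq.heappush(timeline, (day, event))
    return [heapq.heappop(timeline) for _ in range(len(timeline))]

rng = random.Random(42)
probs = {"minor_injury": 0.01, "equipment_fault": 0.005}  # hypothetical
trials = [run_trial(180, probs, rng) for _ in range(1000)]
frac_any = sum(bool(t) for t in trials) / len(trials)
print(f"trials with at least one event: {frac_any:.2f}")
```

    A real dPRA adds what this sketch omits: rules that make later event probabilities depend on earlier outcomes (the "progression of dependencies" the abstract describes), rather than independent daily draws.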

  14. Using a water-food-energy nexus approach for optimal irrigation management during drought events in Nebraska

    NASA Astrophysics Data System (ADS)

    Campana, P. E.; Zhang, J.; Yao, T.; Melton, F. S.; Yan, J.

    2017-12-01

    Climate change and drought have severe impacts on the agricultural sector affecting crop yields, water availability, and energy consumption for irrigation. Monitoring, assessing and mitigating the effects of climate change and drought on the agricultural and energy sectors are fundamental challenges that require investigation for water, food, and energy security issues. Using an integrated water-food-energy nexus approach, this study is developing a comprehensive drought management system through integration of real-time drought monitoring with real-time irrigation management. The spatially explicit model developed, GIS-OptiCE, can be used for simulation, multi-criteria optimization and generation of forecasts to support irrigation management. To demonstrate the value of the approach, the model has been applied to one major corn region in Nebraska to study the effects of the 2012 drought on crop yield and irrigation water/energy requirements as compared to a wet year such as 2009. The water-food-energy interrelationships evaluated show that significant water volumes and energy are required to halt the negative effects of drought on the crop yield. The multi-criteria optimization problem applied in this study indicates that the optimal solutions of irrigation do not necessarily correspond to those that would produce the maximum crop yields, depending on both water and economic constraints. In particular, crop pricing forecasts are extremely important to define the optimal irrigation management strategy. The model developed shows great potential in precision agriculture by providing near real-time data products including information on evapotranspiration, irrigation volumes, energy requirements, predicted crop growth, and nutrient requirements.

  15. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed

    Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.

  16. Space Gator: a giant leap for fiber optic sensing

    NASA Astrophysics Data System (ADS)

    Evenblij, R. S.; Leijtens, J. A. P.

    2017-11-01

    Fibre optic sensing is a rapidly growing application field for photonic integrated circuit (PIC) technology. PIC technology is regarded as enabling for the performance and miniaturization required of next-generation fibre optic sensing instrumentation. So far, a number of application-specific photonic integrated circuit (ASPIC) based interrogator systems have been realized as operational system-on-chip devices. These circuits have shown that all basic building blocks work and that complete interrogator-on-chip solutions can be produced. Within the Saristu (FP7) project, several high-reliability solutions for fibre optic sensing in aeronautics are being developed, combining the performance aspects specifically required for the different sensing applications: damage detection, impact detection, load monitoring and shape sensing (including redundancy aspects and time-division features). Further developments based on these devices, taking into account specific space requirements (such as radiation aspects), will lead to the Space Gator, a radiation-tolerant, highly integrated Fibre Bragg Grating (FBG) interrogator on chip. Once developed and qualified, the Space Gator will be a giant leap for fibre optic sensing in future space applications.

  17. Monolithic optoelectronic integrated broadband optical receiver with graphene photodetectors

    NASA Astrophysics Data System (ADS)

    Cheng, Chuantong; Huang, Beiju; Mao, Xurui; Zhang, Zanyun; Zhang, Zan; Geng, Zhaoxin; Xue, Ping; Chen, Hongda

    2017-07-01

    Optical receivers with potentially high operation bandwidth and low cost have received considerable interest due to rapidly growing data traffic and potential Tb/s optical interconnect requirements. Experimental realization of 65 GHz optical signal detection and 262 GHz intrinsic operation speed reveals the significant role of graphene photodetectors (PDs) in the optical interconnect domain. In this work, a novel complementary metal oxide semiconductor (CMOS) post-backend process has been developed for integrating graphene PDs onto silicon integrated circuit chips. A prototype monolithic optoelectronic integrated optical receiver has been successfully demonstrated for the first time. Moreover, this is the first reported broadband optical receiver to benefit from the naturally broadband light absorption of graphene. This work demonstrates the concept of monolithic optoelectronic integration and paves the way toward graphene optoelectronic devices monolithically integrated with silicon ICs for three-dimensional optoelectronic integrated circuit chips.

  18. The NCBI BioSystems database

    PubMed Central

    Geer, Lewis Y.; Marchler-Bauer, Aron; Geer, Renata C.; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H.

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI’s Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets. PMID:19854944

  19. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet-based systems. Internet GIS is emerging as one of the most promising technologies for addressing emergency management. Web services with different privileges play an important role in disseminating emergency services to decision makers. A spatial database is one of the most important components in the successful implementation of emergency management. It contains spatial data, in the form of raster and vector layers, linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. The database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core data sets are the minimum required data, including base, thematic and infrastructure layers, to handle disasters. Disaster-specific information is required to handle a particular disaster situation such as flood, cyclone, forest fire, earthquake, landslide or drought. In addition, emergency management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualization and analysis in disaster management. The database must therefore be interconnected and comprehensive to meet the requirements of emergency management. This kind of integrated, comprehensive and structured database, with appropriate information, is required to deliver the right information at the right time to the right people. However, building a spatial database for emergency management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards and data interoperability.
    Therefore, to facilitate using, sharing and integrating the spatial data, standards need to be defined for building emergency database systems. These include aspects such as (i) data integration procedures, namely a standard coding scheme, schema, metadata format and spatial format; (ii) database organisation mechanisms covering data management, catalogues and data models; and (iii) database dissemination through a suitable environment as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising the multi-scale and multi-source data for effective emergency response using customized user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems enabling emergency-specific functions through geospatial technologies.

  20. An ultra-compact and low-power oven-controlled crystal oscillator design for precision timing applications.

    PubMed

    Lim, Jaehyun; Kim, Hyunsoo; Jackson, Thomas; Choi, Kyusun; Kenny, David

    2010-09-01

    A novel design for a chip-scale miniature oven-controlled crystal oscillator (OCXO) is presented. In this design, all the main components of an OCXO--an oscillator, a temperature sensor, a heater, and temperature-control circuitry--are integrated on a single CMOS chip. The OCXO package size can be reduced significantly with this design, because the resonator does not require a separate package and most of the circuitry is integrated on a single CMOS chip. Other characteristics such as power consumption and warm-up time are also improved. Two different types of quartz resonators, an AT-cut tab mesa-type quartz crystal and a frame-enclosed resonator, allow miniaturization of the OCXO structure. Neither of these quartz resonator types requires a separate package inside the oven structure; therefore, each can be directly integrated with the custom-designed CMOS chip. The miniature OCXO achieves a frequency stability of +/- 0.35 ppm with an AT-cut tab mesa-type quartz crystal in the temperature range of 0 °C to 60 °C. The maximum power consumption of this miniature OCXO is 1.2 W at start-up and 303 mW at steady state. The warm-up time to reach steady state is 190 s. These results match or improve on those of high-frequency commercial OCXOs.

  1. SU-E-P-20: Personnel Lead Apparel Integrity Inspection: Where We Are and What We Need?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, S; Zhang, J; Anaskevich, L

    Purpose: In recent years, tremendous efforts have been devoted to radiation dose reduction, especially for patients who are directly exposed to primary radiation or receive radiopharmaceuticals. Limited effort has been focused on personnel who are exposed to secondary radiation while fulfilling work responsibilities associated with diagnostic imaging and image-guided interventions. Occupational exposure compounds in daily practice and can lead to a significant radiation dose over time. Personnel lead apparel is a well-accepted engineering control to protect healthcare workers when radiation is unavoidable. The question is, do we have a nationally established program to protect personnel? This study investigates lead apparel inspection programs across the USA. Methods: A series of surveys of state regulations, the University Health System Consortium, federal regulations, and regulations determined by accrediting bodies was conducted. The surveys were used to determine the current status of lead apparel integrity inspection programs. Based on the survey results, a thorough program was proposed. Results: Of 50 states, seventeen states and Washington D.C. require lead apparel integrity inspections within their state regulations. Eleven of these states specify that the inspection is required on an annual basis. Two of these states require lead apron integrity checks to be performed semi-annually. Eleven of the two hundred academic medical centers surveyed responded. The results show that the method (visual vs. fluoroscopic) used to conduct lead apparel integrity checks differs greatly among healthcare organizations. The FDA, EPA, CRCPD and NCRP require lead apparel integrity checks; however, the level of these policies differs. A standard program is not well established, and there is a clear lack of standardization.
    Conclusion: A program led by legislation (state or federal government), with specific frequency, methods, tracking and criteria, is needed to ensure the integrity of personnel lead apparel.

  2. Producibility and Production Aspects of the Market Analysis Process

    DTIC Science & Technology

    1989-06-01

    for most TROSCOM general purpose systems and equipment are the U.S. Army Quartermaster Center and School, Fort Lee, VA ( fuels handling and storage...established a Mission Area Proponency Branch staffed with military R&D Coordinator Officers (formerly TRISOs - Technical Requirements Integration Staff...time is spent reacting, rather than acting, i.e., the amount of work required to supply numerous reports on delinquent contractors and on Technical

  3. Stakeholder Definition for Indonesian Integrated Agriculture Information System (IAIS)

    NASA Astrophysics Data System (ADS)

    Budi Santoso, Halim; Delima, Rosa

    2017-03-01

    Stakeholders play an important role in determining system requirements. Stakeholders are people or organizations that have an interest in the enterprise. Timely and effective consultation of relevant stakeholders is of paramount importance in the requirements engineering process. Research and analysis of system stakeholders finds that there are four stakeholder groups in IAIS. Stakeholder analysis is implemented by identifying stakeholders, categorizing them, and analysing the interactions between stakeholders.

  4. Biological production models as elements of coupled, atmosphere-ocean models for climate research

    NASA Technical Reports Server (NTRS)

    Platt, Trevor; Sathyendranath, Shubha

    1991-01-01

    Process models of phytoplankton production are discussed with respect to their suitability for incorporation into global-scale numerical ocean circulation models. Exact solutions are given for integrals, over the mixed layer and over the day, of analytic, wavelength-independent models of primary production. Within this class of model, the bias incurred by using a triangular approximation (rather than a sinusoidal one) to the variation of surface irradiance through the day is computed. Efficient computation algorithms are given for the nonspectral models. More exact calculations require a spectrally sensitive treatment. Such models exist but must be integrated numerically over depth and time. For these integrations, resolution in wavelength, depth, and time is considered and recommendations are made for efficient computation. The extrapolation of the one-(spatial)-dimension treatment to large horizontal scales is discussed.
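
    The triangular-versus-sinusoidal comparison can be made concrete for the daily light integral itself (the production bias computed in the paper additionally depends on the nonlinear photosynthesis-light response). With the same noon maximum and day length, the triangular cycle integrates to π/4 ≈ 0.785 of the sinusoidal one:

```python
# Daily light integral under two surface-irradiance parameterizations
# with the same noon maximum: sinusoidal vs triangular. Day length and
# noon irradiance values are illustrative.
import math

def sinusoidal(t, daylength, i_max):
    return i_max * math.sin(math.pi * t / daylength)

def triangular(t, daylength, i_max):
    half = daylength / 2
    return i_max * (t / half if t <= half else (daylength - t) / half)

def integrate(f, daylength, i_max, n=10000):
    """Midpoint-rule integral of f over the day."""
    dt = daylength / n
    return sum(f((k + 0.5) * dt, daylength, i_max) * dt for k in range(n))

D, I0 = 12.0, 1500.0  # day length (h), noon irradiance (illustrative units)
sin_integral = integrate(sinusoidal, D, I0)  # analytic value: 2*D*I0/pi
tri_integral = integrate(triangular, D, I0)  # analytic value: D*I0/2
print(tri_integral / sin_integral)  # pi/4 ~= 0.785
```

    The ratio is independent of D and I0, which is why the bias of the triangular approximation can be stated once for the whole model class.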

  5. A Fourier spectral-discontinuous Galerkin method for time-dependent 3-D Schrödinger-Poisson equations with discontinuous potentials

    NASA Astrophysics Data System (ADS)

    Lu, Tiao; Cai, Wei

    2008-10-01

    In this paper, we propose a high order Fourier spectral-discontinuous Galerkin method for time-dependent Schrödinger-Poisson equations in 3-D spaces. The Fourier spectral Galerkin method is used for the two periodic transverse directions, and a high order discontinuous Galerkin method for the longitudinal propagation direction. Such a combination results in a diagonal form for the differential operators along the transverse directions and a flexible method to handle the discontinuous potentials present in quantum heterojunction and superlattice structures. As derivative matrices are required for various time integration schemes such as the exponential time differencing and Crank-Nicolson methods, explicit derivative matrices of the discontinuous Galerkin method of various orders are derived. Numerical results, using the proposed method with various time integration schemes, are provided to validate the method.

  6. Optical integration of SPO mirror modules in the ATHENA telescope

    NASA Astrophysics Data System (ADS)

    Valsecchi, G.; Marioni, F.; Bianucci, G.; Zocchi, F. E.; Gallieni, D.; Parodi, G.; Ottolini, M.; Collon, M.; Civitani, M.; Pareschi, G.; Spiga, D.; Bavdaz, M.; Wille, E.

    2017-08-01

    ATHENA (Advanced Telescope for High-ENergy Astrophysics) is the next high-energy astrophysics mission selected by the European Space Agency, for launch in 2028. The X-ray telescope consists of 1062 silicon pore optics (SPO) mirror modules with a target angular resolution of 5 arcsec. Each module must be integrated on a 3 m structure with an accuracy of 1.5 arcsec for alignment and assembly. An industrial and scientific team is developing the alignment and integration process for the SPO mirror modules, based on ultraviolet imaging at the 12 m focal plane. This technique promises to meet the accuracy requirement while, at the same time, allowing an arbitrary integration sequence and mirror module exchangeability. Moreover, it enables monitoring of the telescope point spread function during the planned 3-year integration phase.

  7. Feasibility of using a reliable automated Doppler flow velocity measurements for research and clinical practices

    NASA Astrophysics Data System (ADS)

    Zolgharni, Massoud; Dhutia, Niti M.; Cole, Graham D.; Willson, Keith; Francis, Darrel P.

    2014-03-01

    Echocardiographers are often reluctant to make the considerable time investment required to take multiple additional measurements of Doppler velocity. The main hurdle to obtaining multiple measurements is the time required to manually trace a series of Doppler traces. To make it easier to analyse more beats, we present an automated system for Doppler envelope quantification. It analyses long Doppler strips spanning many heartbeats and does not require the electrocardiogram to isolate individual beats. We tested its measurement of velocity-time integral and peak velocity against a reference standard defined as the average of three experts who each made three separate measurements. The automated measurements of velocity-time integral showed strong correspondence (R2 = 0.94) and good Bland-Altman agreement (SD = 6.92%) with the reference consensus expert values, and indeed performed as well as the individual experts (R2 = 0.90 to 0.96, SD = 5.66% to 7.64%). The same performance was observed for peak velocities: (R2 = 0.98, SD = 2.95%) and (R2 = 0.93 to 0.98, SD = 2.94% to 5.12%). This automated technology allows more than 10 times as many beats to be acquired and analysed compared to the conventional manual approach, with each beat's measurement maintaining its accuracy.
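
    The velocity-time integral itself is simply the area under the traced velocity envelope over one beat. A minimal sketch of that final quantification step, using a synthetic half-sine envelope (the paper's contribution is extracting the envelope automatically, which is not shown here):

```python
# Velocity-time integral (VTI) and peak velocity from a traced Doppler
# envelope, via trapezoidal integration. The half-sine envelope is a
# synthetic illustration, not data from the study.
import numpy as np

def vti_and_peak(t, v):
    """t in seconds, v in cm/s -> (VTI in cm, peak velocity in cm/s)."""
    vti = float(np.sum(0.5 * (v[1:] + v[:-1]) * np.diff(t)))  # trapezoid rule
    return vti, float(np.max(v))

t = np.linspace(0.0, 0.30, 31)          # 300 ms ejection period, 10 ms steps
v = 100.0 * np.sin(np.pi * t / 0.30)    # envelope peaking at 100 cm/s
vti, peak = vti_and_peak(t, v)
print(round(vti, 1), peak)  # analytic VTI = 2 * 0.30 * 100 / pi ~= 19.1 cm
```

    Averaging this quantity over many automatically isolated beats is what the system described above makes practical.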

  8. GEOTAIL Spacecraft historical data report

    NASA Technical Reports Server (NTRS)

    Boersig, George R.; Kruse, Lawrence F.

    1993-01-01

    The purpose of this GEOTAIL Historical Report is to document ground processing operations information gathered on the GEOTAIL mission during processing activities at the Cape Canaveral Air Force Station (CCAFS). It is hoped that this report may aid management analysis, improve integration processing and forecasting of processing trends, and reduce real-time schedule changes. The GEOTAIL payload is the third Delta 2 Expendable Launch Vehicle (ELV) mission to document historical data. Comparisons of planned versus as-run schedule information are displayed. Information will generally fall into the following categories: (1) payload stay times (payload processing facility/hazardous processing facility/launch complex-17A); (2) payload processing times (planned, actual); (3) schedule delays; (4) integrated test times (experiments/launch vehicle); (5) unique customer support requirements; (6) modifications performed at facilities; (7) other appropriate information (Appendices A & B); and (8) lessons learned (reference Appendix C).

  9. SIMULATING ATMOSPHERIC EXPOSURE IN A NATIONAL RISK ASSESSMENT USING AN INNOVATIVE METEOROLOGICAL SAMPLING SCHEME

    EPA Science Inventory

    Multimedia risk assessments require the temporal integration of atmospheric concentration and deposition with other media modules. However, providing an extended time series of estimates is computationally expensive. An alternative approach is to substitute long-term average a...

  10. Temporal and Dose-response Pathway Analysis for Predicting Chronic Chemical Toxicity

    EPA Science Inventory

    Current challenges facing chemical risk assessment are the time and resources required to meet the data standards necessary for a published assessment and the incorporation of modern biological information. The integration of toxicogenomics into the risk assessment paradigm may ...

  11. Icons improve older and younger adults' comprehension of medication information.

    PubMed

    Morrow, D G; Hier, C M; Menard, W E; Leirer, V O

    1998-07-01

    We examined whether timeline icons improved older and younger adults' comprehension of medication information. In Experiment 1, comprehension of instructions with the icon (icon/text format) and without the icon (text-only format) was assessed by questions about information that was (a) implicit in the text but depicted explicitly by the icon (total dose in a 24 hour period), (b) stated and depicted in the icon/text condition (medication dose and times), and (c) stated but not depicted by the icon (e.g., side effects). In a separate task, participants also recalled medication instructions (with or without the icon) after a study period. We found that questions about dose and time information were answered more quickly and accurately when the icon was present in the instructions. Notably, icon benefits were greater for information that was implicit rather than stated in the text. This finding suggests that icons can improve older and younger adults' comprehension by reducing the need to draw some inferences. The icon also reduced effective study time (study time per item recalled). In Experiment 2, icon benefits did not occur for a less integrated version of the timeline icon that, like the text, required participants to integrate dose and time information in order to identify the total daily dose. The integrated version of the icon again improved comprehension, as in Experiment 1, as well as drawing inferences from memory. These findings show that integrated timeline icons improved comprehension primarily by aiding the integration of dose and time information. These findings are discussed in terms of a situation model approach to comprehension.

  12. Empirical Analysis of the Variability of Wind Generation in India: Implications for Grid Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phadke, Amol; Abhyankar, Nikit; Rao, Poorvi

    We analyze variability in load and wind generation in India to assess its implications for grid integration of large-scale wind projects, using actual wind generation and load data from two states in India, Karnataka and Tamil Nadu. We compare the largest variations in load and net load (load minus wind, i.e., load after integrating wind) that the generation fleet has to meet. In Tamil Nadu, where wind capacity is about 53% of the peak demand, we find that the additional variation added due to wind over the current variation in load is modest; if wind penetration reaches 15% and 30% by energy, the additional hourly variation is less than 0.5% and 4.5% of the peak demand, respectively, for 99% of the time. For wind penetration of 15% by energy, the Tamil Nadu system is found to be capable of meeting the additional ramping requirement 98.8% of the time. Potentially higher uncertainty in net load compared to load is found to have limited impact on the ramping capability requirements of the system if coal plants can be ramped down to 50% of their capacity. Load and wind aggregation across Tamil Nadu and Karnataka is found to lower the variation by at least 20%, indicating the benefits of geographic diversification. These findings suggest modest additional flexible capacity requirements and costs for absorbing variation in wind power, and indicate that potential capacity support (if wind does not generate enough during peak periods) may be the issue with more bearing on the economics of integrating wind.
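The ramp comparison described above can be sketched in a few lines: compute hour-to-hour changes in load and in net load (load minus wind), take a high quantile of each, and express the difference as a fraction of peak demand. The series below are toy illustrations, not the paper's Karnataka/Tamil Nadu data.

```python
# Toy sketch of the ramp comparison: hour-to-hour changes in load versus
# net load (load minus wind), compared at a high quantile and expressed as
# a fraction of peak demand. The series below are illustrative only.

def hourly_ramps(series):
    """Absolute hour-to-hour changes of a time series."""
    return [abs(b - a) for a, b in zip(series, series[1:])]

def additional_variation(load, wind, quantile=0.99):
    """Extra ramp introduced by wind at the given quantile, as a fraction
    of peak demand: ramp(net load) minus ramp(load), both at the quantile."""
    net = [l - w for l, w in zip(load, wind)]
    load_ramps = sorted(hourly_ramps(load))
    net_ramps = sorted(hourly_ramps(net))
    idx = min(int(quantile * len(load_ramps)), len(load_ramps) - 1)
    return (net_ramps[idx] - load_ramps[idx]) / max(load)

load = [100, 102, 101, 104, 103, 105, 102, 100]   # MW, toy hourly series
wind = [10, 12, 11, 9, 10, 13, 12, 10]
extra = additional_variation(load, wind)
```

On real data the same computation would run over a year of hourly values, and a small `extra` at the 99% quantile corresponds to the paper's finding of modest additional variation.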

  13. High Quality Data for Grid Integration Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clifton, Andrew; Draxl, Caroline; Sengupta, Manajit

    As variable renewable power penetration levels increase in power systems worldwide, renewable integration studies are crucial to ensure continued economic and reliable operation of the power grid. The existing electric grid infrastructure in the US in particular poses significant limitations on wind power expansion. In this presentation we will shed light on requirements for grid integration studies as far as wind and solar energy are concerned. Because wind and solar plants are strongly impacted by weather, high-resolution and high-quality weather data are required to drive power system simulations. Future data sets will have to push the limits of numerical weather prediction to yield these high-resolution data sets, and wind data will have to be time-synchronized with solar data. Current wind and solar integration data sets are presented. The Wind Integration National Dataset (WIND) Toolkit is the largest and most complete grid integration data set publicly available to date. A meteorological data set, wind power production time series, and simulated forecasts, created using the Weather Research and Forecasting Model run on a 2-km grid over the continental United States at a 5-min resolution, are now publicly available for more than 126,000 land-based and offshore wind power production sites. The National Solar Radiation Database (NSRDB) is a similar high-temporal- and spatial-resolution database of 18 years of solar resource data for North America and India. The need for high-resolution weather data pushes modeling towards finer scales and closer synchronization. We also present how we anticipate such datasets developing in the future, their benefits, and the challenges of using and disseminating such large amounts of data.

  14. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial will guide the audience through the building blocks of an ``integrated'' tokamak simulation, such as magnetic flux diffusion, thermal, momentum and particle transport, external heating and current drive sources, and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of the current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive ``ITER baseline'', it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios, but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.

  15. Automated Test Environment for a Real-Time Control System

    NASA Technical Reports Server (NTRS)

    Hall, Ronald O.

    1994-01-01

    An automated environment with hardware-in-the-loop has been developed by Rocketdyne Huntsville for test of a real-time control system. The target system of application is the man-rated real-time system which controls the Space Shuttle Main Engines (SSME). The primary use of the environment is software verification and validation, but it is also useful for evaluation and analysis of SSME avionics hardware and mathematical engine models. It provides a test bed for the integration of software and hardware. The principles and skills upon which it operates may be applied to other target systems, such as those requiring hardware-in-the-loop simulation and control system development. Potential applications are in problem domains demanding highly reliable software systems requiring testing to formal requirements and verifying successful transition to/from off-nominal system states.

  16. Rain rate and modeled fade distributions at 20 GHz and 30 GHz derived from five years of network rain gauge measurements

    NASA Technical Reports Server (NTRS)

    Goldhirsh, Julius; Krichevsky, Vladimir; Gebo, Norman

    1992-01-01

    Five years of rain rate and modeled slant path attenuation distributions at 20 GHz and 30 GHz, derived from a network of 10 tipping-bucket rain gauges, were examined. The rain gauge network is located within a grid 70 km north-south and 47 km east-west on the Mid-Atlantic coast of the United States in the vicinity of Wallops Island, Virginia. Distributions were derived from the variable integration time data and from one-minute averages. It was demonstrated that for realistic fade margins, the variable integration time results are adequate to estimate slant path attenuations at frequencies above 20 GHz using models which require one-minute averages. An accurate empirical formula was developed to convert the variable integration time rain rates to one-minute averages. Fade distributions at 20 GHz and 30 GHz were derived employing Crane's Global model because it was demonstrated to exhibit excellent accuracy with measured COMSTAR fades at 28.56 GHz.
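Integration-time conversions of the kind the abstract mentions are often expressed as a power law relating rain rates at equal exceedance probability. The sketch below uses that common form with purely illustrative coefficients; it is not the paper's empirical fit.

```python
# Hedged sketch of an integration-time conversion for rain rates. A power
# law relating rates at equal exceedance probability, R1 = a * R_tau**b, is
# a common form in the propagation literature; the coefficients below are
# purely illustrative and are NOT the paper's empirical formula.

def to_one_minute_rate(r_tau, a=0.99, b=1.05):
    """Map a rain rate measured over a longer integration time (mm/h) to
    an equivalent one-minute-average rate (mm/h)."""
    return a * r_tau ** b

converted = to_one_minute_rate(30.0)  # e.g. a 30 mm/h variable-time rate
```

With b > 1 the conversion inflates high rain rates, reflecting the peaks that longer averaging windows smooth out.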

  17. ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications

    NASA Astrophysics Data System (ADS)

    Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.

    2003-12-01

    The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real-time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well off shore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added to or removed from the system; provides for real-time processing and monitoring of data streams, detecting events and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools which must be developed do not exist, although limited prototype systems are available.
This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.

  18. Integration of the response to a dietary potassium load: a paleolithic perspective.

    PubMed

    Kamel, Kamel S; Schreiber, Martin; Halperin, Mitchell L

    2014-05-01

    Our purpose is to integrate new insights in potassium (K(+)) physiology to understand K(+) homeostasis and illustrate some of their clinical implications. Since control mechanisms that are essential for survival were likely developed in Paleolithic times, we think the physiology of K(+) homeostasis can be better revealed when viewed from what was required to avoid threats and achieve balance in Paleolithic times. Three issues will be highlighted. First, we shall consider the integrative physiology of the gastrointestinal tract and the role of lactic acid released from enterocytes following absorption of sugars (fruit and berries) to cause a shift of this K(+) load into the liver. Second, we shall discuss the integrative physiology of WNK kinases and modulation of delivery of bicarbonate to the distal nephron to switch the aldosterone response from sodium chloride retention to K(+) secretion when faced with a K(+) load. Third, we shall emphasize the role of intra-renal recycling of urea in achieving K(+) homeostasis when the diet contains protein and K(+).

  19. GNSS/Electronic Compass/Road Segment Information Fusion for Vehicle-to-Vehicle Collision Avoidance Application

    PubMed Central

    Sun, Rui; Cheng, Qi; Xue, Dabin; Wang, Guanyu; Ochieng, Washington Yotto

    2017-01-01

    The increasing number of vehicles in modern cities brings the problem of increasing crashes. One of the applications or services of Intelligent Transportation Systems (ITS) conceived to improve safety and reduce congestion is collision avoidance. This safety critical application requires sub-meter level vehicle state estimation accuracy with very high integrity, continuity and availability, to detect an impending collision and issue a warning or intervene in the case that the warning is not heeded. Because of the challenging city environment, to date there is no approved method capable of delivering this high level of performance in vehicle state estimation. In particular, the current Global Navigation Satellite System (GNSS) based collision avoidance systems have the major limitation that the real-time accuracy of dynamic state estimation deteriorates during abrupt acceleration and deceleration situations, compromising the integrity of collision avoidance. Therefore, to provide the Required Navigation Performance (RNP) for collision avoidance, this paper proposes a novel Particle Filter (PF) based model for the integration or fusion of real-time kinematic (RTK) GNSS position solutions with electronic compass and road segment data used in conjunction with an Autoregressive (AR) motion model. The real-time vehicle state estimates are used together with distance based collision avoidance algorithms to predict potential collisions. The algorithms are tested by simulation and in the field representing a low density urban environment. The results show that the proposed algorithm meets the horizontal positioning accuracy requirement for collision avoidance and is superior to positioning accuracy of GNSS only, traditional Constant Velocity (CV) and Constant Acceleration (CA) based motion models, with a significant improvement in the prediction accuracy of potential collision. PMID:29186851

  20. GNSS/Electronic Compass/Road Segment Information Fusion for Vehicle-to-Vehicle Collision Avoidance Application.

    PubMed

    Sun, Rui; Cheng, Qi; Xue, Dabin; Wang, Guanyu; Ochieng, Washington Yotto

    2017-11-25

    The increasing number of vehicles in modern cities brings the problem of increasing crashes. One of the applications or services of Intelligent Transportation Systems (ITS) conceived to improve safety and reduce congestion is collision avoidance. This safety critical application requires sub-meter level vehicle state estimation accuracy with very high integrity, continuity and availability, to detect an impending collision and issue a warning or intervene in the case that the warning is not heeded. Because of the challenging city environment, to date there is no approved method capable of delivering this high level of performance in vehicle state estimation. In particular, the current Global Navigation Satellite System (GNSS) based collision avoidance systems have the major limitation that the real-time accuracy of dynamic state estimation deteriorates during abrupt acceleration and deceleration situations, compromising the integrity of collision avoidance. Therefore, to provide the Required Navigation Performance (RNP) for collision avoidance, this paper proposes a novel Particle Filter (PF) based model for the integration or fusion of real-time kinematic (RTK) GNSS position solutions with electronic compass and road segment data used in conjunction with an Autoregressive (AR) motion model. The real-time vehicle state estimates are used together with distance based collision avoidance algorithms to predict potential collisions. The algorithms are tested by simulation and in the field representing a low density urban environment. The results show that the proposed algorithm meets the horizontal positioning accuracy requirement for collision avoidance and is superior to positioning accuracy of GNSS only, traditional Constant Velocity (CV) and Constant Acceleration (CA) based motion models, with a significant improvement in the prediction accuracy of potential collision.
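The fusion scheme described above can be illustrated with a minimal one-dimensional particle filter: predict each particle with an AR motion model, weight by the likelihood of the RTK GNSS position fix, and resample. The AR coefficient, noise levels, and measurement model below are illustrative assumptions, not the authors' tuned system, and the compass and road-segment inputs are omitted for brevity.

```python
import math
import random

# Minimal 1-D particle-filter sketch of the described fusion: an AR motion
# model predicts each particle, an RTK GNSS position fix supplies the
# measurement likelihood, and multinomial resampling concentrates particles
# on likely states. All numbers are illustrative assumptions.

def pf_step(particles, ar_coef, gnss_pos, meas_std, rng):
    # Predict with an AR(1) motion model plus Gaussian process noise.
    pred = [ar_coef * p + rng.gauss(0.0, 0.5) for p in particles]
    # Weight by the Gaussian likelihood of the GNSS fix.
    w = [math.exp(-0.5 * ((p - gnss_pos) / meas_std) ** 2) for p in pred]
    total = sum(w) or 1.0
    w = [x / total for x in w]
    # Resample to focus the particle set on high-likelihood states.
    return rng.choices(pred, weights=w, k=len(pred))

rng = random.Random(0)
particles = [rng.gauss(0.0, 5.0) for _ in range(500)]
for fix in (1.0, 1.1, 1.2):                 # successive GNSS position fixes
    particles = pf_step(particles, 1.0, fix, 1.0, rng)
estimate = sum(particles) / len(particles)  # posterior mean position
```

The posterior mean of the particle cloud is the vehicle state estimate that a distance-based collision check would then consume.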

  1. A reproducible approach to high-throughput biological data acquisition and integration

    PubMed Central

    Rahnavard, Gholamali; Waldron, Levi; McIver, Lauren; Shafquat, Afrah; Franzosa, Eric A.; Miropolsky, Larissa; Sweeney, Christopher

    2015-01-01

    Modern biological research requires rapid, complex, and reproducible integration of multiple experimental results generated both internally and externally (e.g., from public repositories). Although large systematic meta-analyses are among the most effective approaches both for clinical biomarker discovery and for computational inference of biomolecular mechanisms, identifying, acquiring, and integrating relevant experimental results from multiple sources for a given study can be time-consuming and error-prone. To enable efficient and reproducible integration of diverse experimental results, we developed a novel approach for standardized acquisition and analysis of high-throughput and heterogeneous biological data. This allowed, first, novel biomolecular network reconstruction in human prostate cancer, which correctly recovered and extended the NFκB signaling pathway. Next, we investigated host-microbiome interactions. In less than an hour of analysis time, the system retrieved data and integrated six germ-free murine intestinal gene expression datasets to identify the genes most influenced by the gut microbiota, which comprised a set of immune-response and carbohydrate metabolism processes. Finally, we constructed integrated functional interaction networks to compare connectivity of peptide secretion pathways in the model organisms Escherichia coli, Bacillus subtilis, and Pseudomonas aeruginosa. PMID:26157642

  2. Solution of the advection-dispersion equation by a finite-volume eulerian-lagrangian local adjoint method

    USGS Publications Warehouse

    Healy, R.W.; Russell, T.F.

    1992-01-01

    A finite-volume Eulerian-Lagrangian local adjoint method for solution of the advection-dispersion equation is developed and discussed. The method is mass conservative and can solve advection-dominated ground-water solute-transport problems accurately and efficiently. An integrated finite-difference approach is used in the method. A key component of the method is that the integral representing the mass-storage term is evaluated numerically at the current time level. Integration points, and the mass associated with these points, are then forward tracked up to the next time level. The number of integration points required to reach a specified level of accuracy is problem dependent and increases as the sharpness of the simulated solute front increases. Integration points are generally equally spaced within each grid cell. For problems involving variable coefficients it has been found to be advantageous to include additional integration points at strategic locations in each cell. These locations are determined by backtracking. Forward tracking of boundary fluxes by the method alleviates problems that are encountered in the backtracking approaches of most characteristic methods. A test problem is used to illustrate that the new method offers substantial advantages over other numerical methods for a wide range of problems.
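The forward-tracking step at the heart of the method can be sketched for pure 1-D advection on a uniform grid: mass attached to integration points at the current time level is carried along characteristics and deposited in the cells where the points land. This toy sketch uses one point per cell and ignores dispersion and variable coefficients.

```python
# Toy sketch of the forward-tracking step for pure 1-D advection on a
# uniform grid: the mass attached to each integration point (one per cell
# here, at the cell center) is carried forward along its characteristic and
# deposited in the cell where it lands. Mass is conserved for points that
# remain inside the domain; dispersion and variable coefficients are omitted.

def forward_track_advection(cell_mass, velocity, dt, dx):
    n = len(cell_mass)
    new_mass = [0.0] * n
    for i, m in enumerate(cell_mass):
        x = (i + 0.5) * dx            # integration point at the cell center
        x_new = x + velocity * dt     # forward track along the characteristic
        j = int(x_new / dx)           # destination cell index
        if 0 <= j < n:
            new_mass[j] += m          # deposit the point's mass
    return new_mass

mass = [0.0, 1.0, 0.0, 0.0]
moved = forward_track_advection(mass, velocity=1.0, dt=1.0, dx=1.0)
```

Because mass is moved rather than recomputed from fluxes, the scheme stays conservative even for sharp fronts, which is the property the abstract highlights.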

  3. Iterative integral parameter identification of a respiratory mechanics model.

    PubMed

    Schranz, Christoph; Docherty, Paul D; Chiew, Yeong Shiong; Möller, Knut; Chase, J Geoffrey

    2012-07-18

    Patient-specific respiratory mechanics models can support the evaluation of optimal lung protective ventilator settings during ventilation therapy. Clinical application requires that the individual's model parameter values must be identified with information available at the bedside. Multiple linear regression or gradient-based parameter identification methods are highly sensitive to noise and initial parameter estimates. Thus, they are difficult to apply at the bedside to support therapeutic decisions. An iterative integral parameter identification method is applied to a second order respiratory mechanics model. The method is compared to the commonly used regression methods and error-mapping approaches using simulated and clinical data. The clinical potential of the method was evaluated on data from 13 Acute Respiratory Distress Syndrome (ARDS) patients. The iterative integral method converged to error minima 350 times faster than the Simplex Search Method using simulation data sets and 50 times faster using clinical data sets. Established regression methods reported erroneous results due to sensitivity to noise. In contrast, the iterative integral method was effective independent of initial parameter estimations, and converged successfully in each case tested. These investigations reveal that the iterative integral method is beneficial with respect to computing time, operator independence and robustness, and thus applicable at the bedside for this clinical application.
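The integral identification idea can be illustrated on the simpler first-order single-compartment model P(t) = E·V(t) + R·Q(t) + P0 (the paper treats a second-order model): integrating the measured flow turns identification into a linear least-squares problem that needs no initial parameter estimates. The synthetic data below are illustrative.

```python
import math

# Hedged sketch: integral-based identification on the first-order
# single-compartment model P(t) = E*V(t) + R*Q(t) + P0, where V is the
# integral of flow Q. Fitting pressure against (V, Q, 1) by least squares
# needs no gradient descent and no initial estimates.

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral of a sampled signal."""
    out, acc = [0.0], 0.0
    for a, b in zip(y, y[1:]):
        acc += 0.5 * (a + b) * dt
        out.append(acc)
    return out

def identify_e_r_p0(pressure, flow, dt):
    """Solve the 3x3 normal equations for [E, R, P0]."""
    vol = cumtrapz(flow, dt)
    rows = [[v, q, 1.0] for v, q in zip(vol, flow)]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * p for r, p in zip(rows, pressure)) for i in range(3)]
    for k in range(3):                      # forward elimination
        for j in range(k + 1, 3):
            f = A[j][k] / A[k][k]
            A[j] = [aj - f * ak for aj, ak in zip(A[j], A[k])]
            b[j] -= f * b[k]
    x = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):                     # back substitution
        x[k] = (b[k] - sum(A[k][j] * x[j] for j in range(k + 1, 3))) / A[k][k]
    return x

# Synthetic data generated from known parameters E=25, R=10, P0=5.
dt = 0.01
t = [i * dt for i in range(200)]
flow = [0.5 + 0.2 * math.sin(3.0 * ti) for ti in t]
vol = cumtrapz(flow, dt)
pressure = [25.0 * v + 10.0 * q + 5.0 for v, q in zip(vol, flow)]
E, R, P0 = identify_e_r_p0(pressure, flow, dt)
```

Because the fit is linear in the integrated data, it is insensitive to starting guesses, which mirrors the robustness the abstract reports for the iterative integral method.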

  4. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached to begin protective action. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique also has the drawback of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology must be established to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
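The analysis step of the technique reduces to summing the response times allocated to each component on the critical signal path and checking the total against the requirement from the safety analysis. The component names and millisecond allocations below are hypothetical, not APR1400/OPR1000 design values.

```python
# Illustrative sketch of the response time analysis step: sum the response
# time allocated to each component on the critical signal path and verify
# the total against the requirement from the safety analysis. The component
# names and millisecond values are hypothetical, not plant design data.

def total_response_time(allocations_ms):
    """Total analyzed response time along the critical signal path."""
    return sum(allocations_ms.values())

path_allocations_ms = {
    "sensor": 200.0,
    "signal_processing": 350.0,
    "logic_cabinet": 150.0,
    "final_actuation_device": 250.0,
}
requirement_ms = 1000.0
analyzed_ms = total_response_time(path_allocations_ms)
meets_requirement = analyzed_ms <= requirement_ms
```

The companion test technique would replace the allocated numbers with measured ones, combined over the same signal path, and apply the same comparison.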

  5. Sensory processes modulate differences in multi-component behavior and cognitive control between childhood and adulthood.

    PubMed

    Gohil, Krutika; Bluschke, Annet; Roessner, Veit; Stock, Ann-Kathrin; Beste, Christian

    2017-10-01

    Many everyday tasks require executive functions to achieve a certain goal. Quite often, this requires the integration of information derived from different sensory modalities. Children are less likely to integrate information from different modalities and, at the same time, also do not command fully developed executive functions, as compared to adults. Yet still, the role of developmental age-related effects on multisensory integration processes has not been examined within the context of multicomponent behavior until now (i.e., the concatenation of different executive subprocesses). This is problematic because differences in multisensory integration might actually explain a significant amount of the developmental effects that have traditionally been attributed to changes in executive functioning. We therefore examined this question in a systems neurophysiological approach combining electroencephalogram (EEG) recordings and source localization analyses. The results show that differences in how children and adults accomplish multicomponent behavior do not solely depend on developmental differences in executive functioning. Instead, the observed developmental differences in response selection processes (reflected by the P3 ERP) were largely dependent on the complexity of integrating temporally separated stimuli from different modalities. This effect was related to activation differences in medial frontal and inferior parietal cortices. Primary perceptual gating or attentional selection processes (P1 and N1 ERPs) were not affected. The results show that differences in multisensory integration explain parts of transformations in cognitive processes between childhood and adulthood that have traditionally been attributed to changes in executive functioning, especially when these require the integration of multiple modalities during response selection. Hum Brain Mapp 38:4933-4945, 2017. © 2017 Wiley Periodicals, Inc.

  6. Analysis of the glow curve of SrB4O7:Dy compounds employing the GOT model

    NASA Astrophysics Data System (ADS)

    Ortega, F.; Molina, P.; Santiago, M.; Spano, F.; Lester, M.; Caselli, E.

    2006-02-01

    The glow curve of SrB4O7:Dy phosphors has been analysed with the general one-trap (GOT) model. To solve the differential equation describing the GOT model, a novel algorithm has been employed, which reduces the deconvolution time significantly with respect to the time required by usual integration algorithms, such as the Runge-Kutta method.
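A common form of the GOT differential equation is -dn/dt = s·exp(-E/kT)·n²/((N-n)R + n) under linear heating T = T0 + βt. The sketch below integrates it with a plain fixed-step Runge-Kutta scheme, i.e. the slower baseline the paper's algorithm improves on; the parameters are illustrative, not a fit to SrB4O7:Dy.

```python
import math

# Sketch: glow-curve computation with a common form of the general one-trap
# (GOT) model, -dn/dt = s*exp(-E/kT) * n^2 / ((N - n)*R + n), under linear
# heating T = T0 + beta*t. Parameters are illustrative (NOT a SrB4O7:Dy fit)
# and a fixed-step RK4 integrator stands in for the paper's faster algorithm.

K_B = 8.617e-5  # Boltzmann constant in eV/K

def got_rate(n, T, E, s, N, R):
    """dn/dt for the GOT model; the TL intensity is I = -dn/dt."""
    p = s * math.exp(-E / (K_B * T))
    return -p * n * n / ((N - n) * R + n)

def glow_curve(E=1.0, s=1e12, N=1e10, R=0.1, T0=300.0, beta=1.0,
               dt=0.05, steps=3000):
    n, t, curve = N, 0.0, []   # traps initially full (n0 = N)
    for _ in range(steps):
        T = T0 + beta * t
        k1 = got_rate(n, T, E, s, N, R)
        k2 = got_rate(n + 0.5 * dt * k1, T0 + beta * (t + 0.5 * dt), E, s, N, R)
        k3 = got_rate(n + 0.5 * dt * k2, T0 + beta * (t + 0.5 * dt), E, s, N, R)
        k4 = got_rate(n + dt * k3, T0 + beta * (t + dt), E, s, N, R)
        n = max(n + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4), 0.0)
        t += dt
        curve.append((T, -got_rate(n, T, E, s, N, R)))
    return curve

curve = glow_curve()
peak_temperature = max(curve, key=lambda point: point[1])[0]
```

Deconvolution fits this forward model to a measured curve many times over, which is why replacing the step-by-step integrator with a faster algorithm, as the paper does, matters.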

  7. Ab initio molecular dynamics with nuclear quantum effects at classical cost: Ring polymer contraction for density functional theory.

    PubMed

    Marsalek, Ondrej; Markland, Thomas E

    2016-02-07

    Path integral molecular dynamics simulations, combined with an ab initio evaluation of interactions using electronic structure theory, incorporate the quantum mechanical nature of both the electrons and nuclei, which are essential to accurately describe systems containing light nuclei. However, path integral simulations have traditionally required a computational cost around two orders of magnitude greater than treating the nuclei classically, making them prohibitively costly for most applications. Here we show that the cost of path integral simulations can be dramatically reduced by extending our ring polymer contraction approach to ab initio molecular dynamics simulations. By using density functional tight binding as a reference system, we show that our ring polymer contraction scheme gives rapid and systematic convergence to the full path integral density functional theory result. We demonstrate the efficiency of this approach in ab initio simulations of liquid water and the reactive protonated and deprotonated water dimer systems. We find that the vast majority of the nuclear quantum effects are accurately captured using contraction to just the ring polymer centroid, which requires the same number of density functional theory calculations as a classical simulation. Combined with a multiple time step scheme using the same reference system, which allows the time step to be increased, this approach is as fast as a typical classical ab initio molecular dynamics simulation and 35× faster than a full path integral calculation, while still exactly including the quantum sampling of nuclei. This development thus offers a route to routinely include nuclear quantum effects in ab initio molecular dynamics simulations at negligible computational cost.
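The contraction idea can be sketched in one dimension: evaluate the cheap reference force on every bead, evaluate the expensive force once at the ring-polymer centroid, and apply the difference as a correction to all beads. The "expensive" and "cheap" forces below are toy linear stand-ins, not DFT and density functional tight binding.

```python
# 1-D sketch of ring-polymer contraction to the centroid: the cheap
# reference force is evaluated on every bead, the expensive force only once
# (at the centroid), and the difference corrects all beads. The "expensive"
# and "cheap" forces here are toy linear stand-ins, not DFT and DFTB.

def centroid(beads):
    return sum(beads) / len(beads)

def contracted_forces(beads, f_cheap, f_expensive):
    c = centroid(beads)
    # One expensive evaluation per step instead of one per bead.
    correction = f_expensive(c) - f_cheap(c)
    return [f_cheap(q) + correction for q in beads]

f_expensive = lambda q: -1.00 * q    # toy "expensive" force
f_cheap = lambda q: -0.95 * q        # toy cheap reference force
beads = [0.9, 1.0, 1.1, 1.0]         # ring-polymer bead positions
forces = contracted_forces(beads, f_cheap, f_expensive)
```

With the expensive force called once per step rather than once per bead, the cost matches a classical simulation, which is the scaling claim the abstract makes for contraction to the centroid.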

  8. Integration of Slack, a cloud-based team collaboration application, into research coordination.

    PubMed

    Gofine, Miriam; Clark, Sunday

    2017-06-30

    Practitioners of epidemiology require efficient real-time communication and shared access to numerous documents in order to effectively manage a study. Much of this communication involves study logistics and does not require use of Protected Health Information. Slack is a team collaboration app; it archives all direct messages and group conversations, hosts documents internally, and integrates with the Google Docs application. Slack has both desktop and mobile applications, allowing users to communicate in real-time without the need to find email addresses or phone numbers or create contact lists. METHOD: We piloted the integration of Slack into our research team of one faculty member, one research coordinator, and approximately 20 research assistants. Statistics describing the app's usage were calculated twelve months after its implementation. RESULTS: Results indicating heavy usage by both research professionals and assistants are presented. Our Slack group included a cumulative 51 users. Between October 2015 and November 2016, approximately 10,600 messages were sent through Slack; 53% were sent by RAs and 47% were sent by us. Of the 106 files stored on Slack, 82% were uploaded by research staff. In a January 2016 survey, 100% of RAs agreed or strongly agreed that Slack improved communication within the team. CONCLUSION: We demonstrate a model for integration of communication technology into academic activities by research teams. Slack is easily integrated into the workflow at an urban, academic medical center and is adopted by users as a highly effective tool for meeting research teams' communication and document management needs.

  9. Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.

  10. SIM Lite: Ground Alignment of the Instrument

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Goullioud, Renaud; Nicaise, Fabien; Kuan, Gary; Morales, Mauricio

    2010-01-01

    We present the start of the ground alignment plan for the SIM Lite Instrument. We outline the integration and alignment of the individual benches on which all the optics are mounted, and then the alignment of the benches to form the Science and Guide interferometers. The Instrument has a guide interferometer with only a 40 arc-second field of regard and 200 arc-seconds of alignment adjustability. This requires each side of the interferometer to be aligned to a fraction of that while remaining orthogonal to the baseline defined by the External Metrology Truss. The baselines of the Science and Guide interferometers must also be aligned to be parallel. The start of these alignment plans is captured in a SysML Instrument System model, in the form of activity diagrams. These activity diagrams are then related to the hardware design and requirements. We finish with future plans for the alignment and integration activities and requirements.

  11. SIM Lite: ground alignment of the instrument

    NASA Astrophysics Data System (ADS)

    Dekens, Frank G.; Goullioud, Renaud; Nicaise, Fabien; Kuan, Gary; Morales, Mauricio

    2010-07-01

    We present the start of the ground alignment plan for the SIM Lite Instrument. We outline the integration and alignment of the individual benches on which all the optics are mounted, and then the alignment of the benches to form the Science and Guide interferometers. The Instrument has a guide interferometer with only a 40 arc-second field of regard and 200 arc-seconds of alignment adjustability. This requires each side of the interferometer to be aligned to a fraction of that while remaining orthogonal to the baseline defined by the External Metrology Truss. The baselines of the Science and Guide interferometers must also be aligned to be parallel. The start of these alignment plans is captured in a SysML Instrument System model, in the form of activity diagrams. These activity diagrams are then related to the hardware design and requirements. We finish with future plans for the alignment and integration activities and requirements.

  12. On coarse projective integration for atomic deposition in amorphous systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuang, Claire Y.; Sinno, Talid; Han, Sang M.

    2015-10-07

    Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of time scales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity, and computational efficiency. Coarse projective integration, an example application of the “equation-free” framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute time derivatives of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the “lifting” operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. The approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.
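    The coarse projective integration idea itself (a short fine-scale burst that estimates a coarse time derivative, followed by a large extrapolation leap) can be illustrated generically. The sketch below is a hypothetical scalar example; the toy `fine_step` stands in for a lifted molecular-dynamics burst:

```python
def projective_step(u, fine_step, dt_fine, n_burst, dt_leap):
    """One coarse projective (forward-Euler) step.

    u: coarse variable; fine_step(u, dt): one step of the fine simulator;
    dt_fine: fine time step; n_burst: burst length; dt_leap: projection leap.
    """
    # Run a short burst of the fine simulator.
    for _ in range(n_burst - 1):
        u = fine_step(u, dt_fine)
    u_next = fine_step(u, dt_fine)
    # Estimate the coarse time derivative from the last two burst states,
    # then leap forward over a much larger interval.
    slope = (u_next - u) / dt_fine
    return u_next + dt_leap * slope
```

For a slowly decaying coarse variable, a burst of five fine steps followed by a leap ten times larger tracks the exact trajectory to within about a percent, which is the efficiency gain the method trades against burst length.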

  13. TIME-TAG mode of STIS observations using the MAMA detectors

    NASA Astrophysics Data System (ADS)

    Sahu, Kailash; Danks, Anthony; Baum, Stefi; Balzano, Vicki; Kraemer, Steve; Kutina, Ray; Sears, William

    1995-04-01

    We summarize the time-tag mode of STIS observations using the MAMA detectors, both in imaging and spectroscopic modes. After a brief outline of the MAMA detector characteristics and the astronomical applications of the time-tag mode, the general philosophy and details of the data management strategy are described. The GO specifications and the consequent modes of data transfer strategy are outlined. Restrictions on maximum data rates, integration times, and BUFFER-TIME requirements are explained. A few cases where the subarray option would be useful are outlined.

  14. Closed-Form Evaluation of Mutual Coupling in a Planar Array of Circular Apertures

    NASA Technical Reports Server (NTRS)

    Bailey, M. C.

    1996-01-01

    The integral expression for the mutual admittance between circular apertures in a planar array is evaluated in closed form. Very good accuracy is realized when compared with values that were obtained by numerical integration. Utilization of this closed-form expression, for all element pairs that are separated by more than one element spacing, yields extremely accurate results and significantly reduces the computation time that is required to analyze the performance of a large electronically scanning antenna array.

  15. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  16. Debris Examination Using Ballistic and Radar Integrated Software

    NASA Technical Reports Server (NTRS)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul; et al.

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  17. Development and Application of an Integrated Health Impacts Assessment Tool for the Sacramento Region.

    DOT National Transportation Integrated Search

    2017-10-01

    Plans crafted by metropolitan planning organizations (MPOs) lay out how billions of dollars in transportation investments will be made over a 20 to 30-year time horizon. Federal transportation authorizations require MPOs to identify and track key ind...

  18. On Dark Times, Parallel Universes, and Deja Vu.

    ERIC Educational Resources Information Center

    Starnes, Bobby Ann

    2000-01-01

    Effectiveness cannot be found in the mediocrity arising from programs that require lessons, teaching strategies, and precisely executed materials to ensure integrity. Expensive, scripted programs like Success for All are designed not to improve teaching, but to render the art of teaching unnecessary. (MLH)

  19. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  20. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  1. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  2. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  3. 40 CFR 63.7800 - What are my operation and maintenance requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...

  4. Acousto-optic time- and space-integrating spotlight-mode SAR processor

    NASA Astrophysics Data System (ADS)

    Haney, Michael W.; Levy, James J.; Michael, Robert R., Jr.

    1993-09-01

    The technical approach and recent experimental results for the acousto-optic time- and space- integrating real-time SAR image formation processor program are reported. The concept overcomes the size and power consumption limitations of electronic approaches by using compact, rugged, and low-power analog optical signal processing techniques for the most computationally taxing portions of the SAR imaging problem. Flexibility and performance are maintained by the use of digital electronics for the critical low-complexity filter generation and output image processing functions. The results include a demonstration of the processor's ability to perform high-resolution spotlight-mode SAR imaging by simultaneously compensating for range migration and range/azimuth coupling in the analog optical domain, thereby avoiding a highly power-consuming digital interpolation or reformatting operation usually required in all-electronic approaches.

  5. A numerical scheme to solve unstable boundary value problems

    NASA Technical Reports Server (NTRS)

    Kalnay-Rivas, E.

    1977-01-01

    The considered scheme makes it possible to determine an unstable steady state solution in cases in which, because of lack of symmetry, such a solution cannot be obtained analytically, and other time integration or relaxation schemes, because of instability, fail to converge. The iterative solution of a single complex equation is discussed and a nonlinear system of equations is considered. Described applications of the scheme are related to a steady state solution with shear instability, an unstable nonlinear Ekman boundary layer, and the steady state solution of a baroclinic atmosphere with asymmetric forcing. The scheme makes use of forward and backward time integrations of the original spatial differential operators and of an approximation of the adjoint operators. Only two computations of the time derivative per iteration are required.

  6. Functional and real-time requirements of a multisensor data fusion (MSDF) situation and threat assessment (STA) resource management (RM) system

    NASA Astrophysics Data System (ADS)

    Duquet, Jean Remi; Bergeron, Pierre; Blodgett, Dale E.; Couture, Jean; Macieszczak, Maciej; Mayrand, Michel; Chalmers, Bruce A.; Paradis, Stephane

    1998-03-01

    The Research and Development group at Lockheed Martin Canada, in collaboration with the Defence Research Establishment Valcartier, has undertaken a research project in order to capture and analyze the real-time and functional requirements of a next generation Command and Control System (CCS) for the Canadian Patrol Frigates, integrating Multi-Sensor Data Fusion (MSDF), Situation and Threat Assessment (STA) and Resource Management (RM). One important aspect of the project is to define how the use of Artificial Intelligence may optimize the performance of an integrated, real-time MSDF/STA/RM system. A closed-loop simulation environment is being developed to facilitate the evaluation of MSDF/STA/RM concepts, algorithms and architectures. This environment comprises (1) a scenario generator, (2) complex sensor, hardkill and softkill weapon models, (3) a real-time monitoring tool, (4) a distributed Knowledge-Base System (KBS) shell. The latter is being completely redesigned and implemented in-house since no commercial KBS shell could adequately satisfy all the project requirements. The closed-loop capability of the simulation environment, together with its 'simulated real-time' capability, allows the interaction between the MSDF/STA/RM system and the environment targets during the execution of a scenario. This capability is essential to measure the performance of many STA and RM functionalities. Some benchmark scenarios have been selected to demonstrate quantitatively the capabilities of the selected MSDF/STA/RM algorithms. The paper describes the simulation environment and discusses the MSDF/STA/RM functionalities currently implemented and their performance as an automatic CCS.

  7. Thin-film luminescent concentrators for integrated devices: a cookbook.

    PubMed

    Evenson, S A; Rawicz, A H

    1995-11-01

    A luminescent concentrator (LC) is a nonimaging optical device used for collecting light energy. As a result of its unique properties, an LC also offers the possibility of separating different portions of the spectrum and concentrating them at the same time. Hence, LCs can be applied to a whole range of problems requiring the collection, manipulation, and distribution or measurement of light. Furthermore, as described in our previous research, thin-film LC elements can be deposited directly over sensor and processing electronics in the form of integrated LC devices. As an aid to further research, the materials and technology required to fabricate these thin-film LC elements through the use of an ultraviolet-curable photopolymer are documented in detail.

  8. Advanced solar irradiances applied to satellite and ionospheric operational systems

    NASA Astrophysics Data System (ADS)

    Tobiska, W. Kent; Schunk, Robert; Eccles, Vince; Bouwer, Dave

    Satellite and ionospheric operational systems require solar irradiances in a variety of time scales and spectral formats. We describe the development of a system using operational grade solar irradiances that are applied to empirical thermospheric density models and physics-based ionospheric models used by operational systems that require a space weather characterization. The SOLAR2000 (S2K) and SOLARFLARE (SFLR) models developed by Space Environment Technologies (SET) provide solar irradiances from the soft X-rays (XUV) through the Far Ultraviolet (FUV) spectrum. The irradiances are provided as integrated indices for the JB2006 empirical atmosphere density models and as line/band spectral irradiances for the physics-based Ionosphere Forecast Model (IFM) developed by the Space Environment Corporation (SEC). We describe the integration of these irradiances in historical, current epoch, and forecast modes through the Communication Alert and Prediction System (CAPS). CAPS provides real-time and forecast HF radio availability for global and regional users and global total electron content (TEC) conditions.

  9. A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications

    NASA Technical Reports Server (NTRS)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system because the time required for testing and qualification severely limits opportunities to modify and iterate. Manual design techniques, which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable, are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as a framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the new process specific device models. The system has been used in the design of time to digital converters for laser ranging and time-of-flight mass spectrometry to optimize analog, mixed signal and digital circuits such as charge sensitive amplifiers, comparators, delay elements, radiation tolerant dual interlocked (DICE) flip-flops and two of three voter gates.
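    The worst-case-over-corners optimization such a framework automates can be illustrated in miniature: each candidate design is scored by its worst cost across all process/environment corners, and the candidate with the smallest worst case wins. All names below are hypothetical stand-ins, not the MSAG toolbox API:

```python
def worst_case_cost(params, cost_fn, corners):
    """Score a design point by its worst (maximum) cost over all corners."""
    return max(cost_fn(params, c) for c in corners)

def optimize(grid, cost_fn, corners):
    """Exhaustive sweep of a design grid: return the candidate whose
    worst-corner cost is lowest (minimax over process/environment corners)."""
    return min(grid, key=lambda p: worst_case_cost(p, cost_fn, corners))
```

For example, scoring a single device-width parameter against a nominal target under corner multipliers 0.8/1.0/1.2 selects the width whose worst-corner deviation is smallest, rather than the one that is best only at the nominal corner.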

  10. Research Plan of the Department of Systems Engineering and the Operations Research Center for Academic Year 2007

    DTIC Science & Technology

    2006-10-01

    high probability for success. Estimated Time to Complete: 31 May 2007. 4. Support and Upgrade of Armed Forces-CARES to integrate Chaplain ...Excellence (ORCEN) is to provide a small, full-time analytical capability to both the Academy and the United States Army and the Department of...complete significant research projects in this time as they usually require little train-up as they are exposed to many military and academic

  11. Wavefront correction with Kalman filtering for the WFIRST-AFTA coronagraph instrument

    NASA Astrophysics Data System (ADS)

    Riggs, A. J. Eldorado; Kasdin, N. Jeremy; Groff, Tyler D.

    2015-09-01

    The only way to characterize most exoplanets spectrally is via direct imaging. For example, the Coronagraph Instrument (CGI) on the proposed Wide-Field Infrared Survey Telescope-Astrophysics Focused Telescope Assets (WFIRST-AFTA) mission plans to image and characterize several cool gas giants around nearby stars. The integration time on these faint exoplanets will be many hours to days. A crucial assumption for mission planning is that the time required to dig a dark hole (a region of high star-to-planet contrast) with deformable mirrors is small compared to science integration time. The science camera must be used as the wavefront sensor to avoid non-common path aberrations, but this approach can be quite time intensive. Several estimation images are required to build an estimate of the starlight electric field before it can be partially corrected, and this process is repeated iteratively until high contrast is reached. Here we present simulated results of batch process and recursive wavefront estimation schemes. In particular, we test a Kalman filter and an iterative extended Kalman filter (IEKF) to reduce the total exposure time and improve the robustness of wavefront correction for the WFIRST-AFTA CGI. An IEKF or other nonlinear filter also allows recursive, real-time estimation of sources incoherent with the star, such as exoplanets and disks, and may therefore reduce detection uncertainty.
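    The recursive estimation idea can be illustrated with a generic linear Kalman update for a quasi-static state, a toy stand-in for the focal-plane electric-field estimators discussed above (the actual CGI estimator works on probe images and complex fields; all names here are hypothetical):

```python
import numpy as np

def kalman_update(x, P, y, H, R, Q):
    """One predict-and-update step with identity dynamics.

    x: state estimate (n,), P: covariance (n, n), y: scalar measurement,
    H: measurement row (n,), R: measurement noise variance, Q: process noise.
    """
    P = P + Q                        # predict: state assumed quasi-static
    S = H @ P @ H + R                # innovation variance
    K = P @ H / S                    # Kalman gain
    x = x + K * (y - H @ x)          # correct with the new measurement
    P = (np.eye(len(x)) - np.outer(K, H)) @ P
    return x, P
```

Because the filter is recursive, each new measurement refines the running estimate instead of requiring a full batch of estimation images, which is the exposure-time saving argued for in the abstract.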

  12. Integration of MATLAB Simulink® Models with the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Lewis, Emily K.; Vuong, Nghia D.

    2012-01-01

    This paper describes the integration of MATLAB Simulink® models into the Vertical Motion Simulator (VMS) at NASA Ames Research Center. The VMS is a high-fidelity, large motion flight simulator that is capable of simulating a variety of aerospace vehicles. Integrating MATLAB Simulink models into the VMS needed to retain the development flexibility of the MATLAB environment and allow rapid deployment of model changes. The process developed at the VMS was used successfully in a number of recent simulation experiments. This accomplishment demonstrated that the model integrity was preserved, while working within the hard real-time run environment of the VMS architecture, and maintaining the unique flexibility of the VMS to meet diverse research requirements.

  13. On the Assessment of Acoustic Scattering and Shielding by Time Domain Boundary Integral Equation Solutions

    NASA Technical Reports Server (NTRS)

    Hu, Fang Q.; Pizzo, Michelle E.; Nark, Douglas M.

    2016-01-01

    Based on the time domain boundary integral equation formulation of the linear convective wave equation, a computational tool dubbed Time Domain Fast Acoustic Scattering Toolkit (TD-FAST) has recently been under development. The time domain approach has a distinct advantage that the solutions at all frequencies are obtained in a single computation. In this paper, the formulation of the integral equation, as well as its stabilization by the Burton-Miller type reformulation, is extended to cases of a constant mean flow in an arbitrary direction. In addition, a "Source Surface" is also introduced in the formulation that can be employed to encapsulate regions of noise sources and to facilitate coupling with CFD simulations. This is particularly useful for applications where the noise sources are not easily described by analytical source terms. Numerical examples are presented to assess the accuracy of the formulation, including a computation of noise shielding by a thin barrier motivated by recent Historical Baseline F31A31 open rotor noise shielding experiments. Furthermore, spatial resolution requirements of the time domain boundary element method are also assessed using point per wavelength metrics. It is found that, using only constant basis functions and high-order quadrature for surface integration, relative errors of less than 2% may be obtained when the surface spatial resolution is 5 points-per-wavelength (PPW) or 25 points-per-wavelength squared (PPW2).
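    The points-per-wavelength criterion above translates directly into a bound on boundary-element size: at 5 PPW, surface elements must be no larger than one-fifth of the acoustic wavelength at the highest frequency of interest. A trivial helper (hypothetical, assuming frequency in Hz and sound speed in m/s):

```python
def max_element_size(freq_hz, sound_speed=343.0, ppw=5.0):
    """Largest boundary-element edge length that meets a points-per-wavelength
    target; the paper reports <2% error at 5 PPW (25 points per wavelength
    squared on a surface mesh)."""
    wavelength = sound_speed / freq_hz
    return wavelength / ppw
```

At 1 kHz in air this gives roughly a 6.9 cm element budget, which is why high-frequency scattering drives mesh sizes (and hence cost) up quadratically on surfaces.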

  14. Addressing BI Transactional Flows in the Real-Time Enterprise Using GoldenGate TDM

    NASA Astrophysics Data System (ADS)

    Pareek, Alok

    It's time to visit low latency and reliable real-time (RT) infrastructures to support next generation BI applications instead of continually debating the need and notion of real-time. The last few years have illuminated some key paradigms affecting data management. The arguments put forth to move away from traditional DBMS architectures have proven persuasive - and specialized architectural data stores are being adopted in the industry [1]. The change from traditional database pull methods towards intelligent routing/push models is underway, causing applications to be redesigned, redeployed, and re-architected. One direct result of this is that despite original warnings about replication [2] - enterprises continue to deploy multiple replicas to support both performance, and high availability of RT applications, with an added complexity around manageability of heterogeneous computing systems. The enterprise is overflowing with data streams that require instantaneous processing and integration, to deliver faster visibility and invoke conjoined actions for RT decision making, resulting in deployment of advanced BI applications as can be seen by stream processing over RT feeds from operational systems for CEP [3]. Given these various paradigms, a multitude of new challenges and requirements have emerged, thereby necessitating different approaches to management of RT applications for BI. The purpose of this paper is to offer a viewpoint on how RT affects critical operational applications, evolves the weight of non-critical applications, and pressurizes availability/data-movement requirements in the underlying infrastructure. I will discuss how the GoldenGate TDM platform is being deployed within the RTE to manage some of these challenges particularly around RT dissemination of transactional data to reduce latency in data integration flows, to enable real-time reporting/DW, and to increase availability of underlying operational systems. Real world case studies will be used to support the various discussion points. The paper is an argument to augment traditional DI flows with a real-time technology (referred to as transactional data management) to support operational BI requirements.

  15. An X-Band SOS Resistive Gate-Insulator-Semiconductor /RIS/ switch

    NASA Astrophysics Data System (ADS)

    Kwok, S. P.

    1980-02-01

    The new X-Band Resistive Gate-Insulator-Semiconductor (RIS) switch has been fabricated on silicon-on-sapphire, and its equivalent circuit model characterized. An RIS SPST switch with 20-dB on/off isolation, 1.2-dB insertion loss, and power handling capacity in excess of 20-W peak has been achieved at X band. The device switching time is on the order of 600 ns, and it requires negligible control holding current in both on and off states. The device is compatible with monolithic integrated-circuit technology and thus is suitable for integration into low-cost monolithic phase shifters or other microwave integrated circuits.

  16. Integrating functional and anatomical information to facilitate cardiac resynchronization therapy.

    PubMed

    Tournoux, Francois B; Manzke, Robert; Chan, Raymond C; Solis, Jorge; Chen-Tournoux, Annabel A; Gérard, Olivier; Nandigam, Veena; Allain, Pascal; Reddy, Vivek; Ruskin, Jeremy N; Weyman, Arthur E; Picard, Michael H; Singh, Jagmeet P

    2007-08-01

    Multiple imaging modalities are required in patients receiving cardiac resynchronization therapy. We have developed a strategy to integrate echocardiographic and angiographic information to facilitate left ventricle (LV) lead position. Full three-dimensional LV-volumes (3DLVV) and dyssynchrony maps were acquired before and after resynchronization. At the time of device implantation, 3D-rotational coronary venous angiography was performed. 3D-models of the veins were then integrated with the pre- and post-3DLVV. In the case displayed, prior to implantation, the lateral wall was delayed compared to the septum. The LV lead was positioned into the vein over the most delayed region, resulting in improved LV synchrony.

  17. Advanced order management in ERM systems: the tic-tac-toe algorithm

    NASA Astrophysics Data System (ADS)

    Badell, Mariana; Fernandez, Elena; Puigjaner, Luis

    2000-10-01

    The concept behind improved enterprise resource planning (ERP) systems is the overall integration of the whole enterprise functionality into the management systems through financial links. Converting current software into real management decision tools requires crucial changes in the current approach to ERP systems. This evolution must be able to incorporate technological achievements both properly and in time. The exploitation phase of plants needs an open web-based environment for collaborative business-engineering with on-line schedulers. Today's short lifecycles of products and processes require sharp and finely tuned management actions that must be guided by scheduling tools. Additionally, such actions must be able to keep track of money movements related to supply chain events. Thus, the necessary outputs require financial-production integration at the scheduling level as proposed in the new approach of enterprise resource management (ERM) systems. Within this framework, the economical analysis of the due date policy and its optimization become essential to manage dynamically realistic and optimal delivery dates with price-time trade-off during the marketing activities. In this work we propose a scheduling tool with a web-based interface conducted by autonomous agents when precise economic information relative to plant and business actions and their effects is provided. It aims to attain a better arrangement of the marketing and production events in order to face the bid/bargain process during e-commerce. Additionally, management systems require real-time execution and an efficient transaction-oriented approach capable of dynamically adopting realistic and optimal actions to support marketing management. To this end the TicTacToe algorithm provides sequence optimization with acceptable tolerances in realistic time.

  18. Design and implementation of an identification system in construction site safety for proactive accident prevention.

    PubMed

    Yang, Huanjia; Chew, David A S; Wu, Weiwei; Zhou, Zhipeng; Li, Qiming

    2012-09-01

    Identifying accident precursors using real-time identity information has great potential to improve safety performance in the construction industry, which still suffers from daily records of accident fatalities and injuries. Based on a requirements analysis for identifying precursors and a discussion of enabling technology solutions for acquiring and sharing real-time automatic identification information on construction sites, this paper proposes an identification system design for proactive accident prevention to improve construction site safety. Firstly, a case study is conducted to analyze the automatic identification requirements for identifying accident precursors on construction sites. Results show that these requirements mainly consist of three aspects, namely access control, training and inspection information, and operation authority. The system is then designed to fulfill these requirements based on a ZigBee-enabled wireless sensor network (WSN), radio frequency identification (RFID) technology, and an integrated ZigBee-RFID sensor network structure. At the same time, an information database is also designed and implemented, which includes 15 tables, 54 queries and several reports and forms. In the end, a demonstration system based on the proposed system design is developed as a proof-of-concept prototype. The contributions of this study include the requirements analysis and technical design of a real-time identity information tracking solution for proactive accident prevention on construction sites. The technical solution proposed in this paper is of significant importance in improving safety performance on construction sites. Moreover, this study can serve as a reference design for future system integrations where more functions, such as environment monitoring and location tracking, can be added. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Time to audit.

    PubMed

    Smyth, L G; Martin, Z; Hall, B; Collins, D; Mealy, K

    2012-09-01

    Public and political pressures are increasing on doctors, and in particular surgeons, to demonstrate competence assurance. While surgical audit is an integral part of surgical practice, its implementation and delivery at a national level in Ireland is poorly developed. Limits to successful audit systems relate to lack of funding and administrative support. In Wexford General Hospital, we have a comprehensive audit system which is based on the Lothian Surgical Audit system. We wished to analyse the amount of time required by the Consultant, NCHDs and clerical staff on one surgical team to run a successful audit system. Data were collected over a calendar month. This included time spent coding and typing endoscopy procedures, coding and typing operative procedures, and typing and signing discharge letters. The total amount of time spent running the audit system for one Consultant surgeon for one calendar month was 5,168 min, or 86.1 h. Greater than 50% of this time related to work performed by administrative staff. Only the intern and administrative staff spent more than 5% of their working week attending to work related to the audit. An integrated comprehensive audit system requires very little time input from Consultant surgeons. Greater than 90% of the workload in running the audit was performed by the junior house doctors and administrative staff. The main financial implications for national audit implementation would relate to software and administrative staff recruitment. Implementation of the European Working Time Directive in Ireland may limit the time available for NCHDs to participate in clinical audit.

  20. Integrated Life-Cycle Framework for Maintenance, Monitoring and Reliability of Naval Ship Structures

    DTIC Science & Technology

    2012-08-15

    number of times, a fast and accurate method for analyzing the ship hull is required. In order to obtain this required computational speed and accuracy...Naval Engineers Fleet Maintenance & Modernization Symposium (FMMS 2011) [8] and the Eleventh International Conference on Fast Sea Transportation (FAST) ...probabilistic strength of the ship hull. First, a novel deterministic method for the fast and accurate calculation of the strength of the ship hull is

  1. Preserving the Dinosaurs or At Least Their Knowledge!

    DTIC Science & Technology

    2015-09-01

    used. And, it was taking away from my practice time on the soccer field. Penn’s freshman soccer team was one of the best in the country, requiring...character-building experiences). Another difference is that much of the expertise we gained in the 1980s, such as mine, left government service and...Department. …The 21st century requires us to integrate leadership development practices with emerging opportunities to rethink how we develop

  2. Defective sensorimotor integration in preparation for reaction time tasks in patients with multiple sclerosis.

    PubMed

    Cabib, Christopher; Llufriu, Sara; Casanova-Molla, Jordi; Saiz, Albert; Valls-Solé, Josep

    2015-03-01

    Slowness of voluntary movements in patients with multiple sclerosis (MS) may be due to various factors, including attentional and cognitive deficits, delays in motor conduction time, and impairment of specific central nervous system circuits. In 13 healthy volunteers and 20 mildly disabled, relapsing-remitting MS patients, we examined simple reaction time (SRT) tasks requiring sensorimotor integration in circuits involving the corpus callosum and the brain stem. A somatosensory stimulus was used as the imperative signal (IS), and subjects were requested to react with either the ipsilateral or the contralateral hand (uncrossed vs. crossed SRT). In 33% of trials, a startling auditory stimulus was presented together with the IS, and the percentage reaction time change with respect to baseline SRT trials was measured (StartReact effect). The difference between crossed and uncrossed SRT, which requires interhemispheric conduction, was significantly larger in patients than in healthy subjects (P = 0.021). The StartReact effect, which involves activation of brain stem motor pathways, was reduced significantly in patients with respect to healthy subjects (uncrossed trials: P = 0.015; crossed trials: P = 0.005). In patients, a barely significant correlation was found between SRT delay and conduction abnormalities in motor and sensory pathways (P = 0.02 and P = 0.04, respectively). The abnormalities found specifically in trials reflecting interhemispheric transfer of information, as well as the evidence for reduced subcortical motor preparation, indicate that a delay in reaction time execution in MS patients cannot be explained solely by conduction slowing in motor and sensory pathways but suggest, instead, defective sensorimotor integration mechanisms in at least the two circuits examined. Copyright © 2015 The American Physiological Society.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
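
    The binning step described above, evenly spaced retention-time bins whose mass spectra are summed and then arranged into a samples-by-bins input matrix for PMF, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; all function names are hypothetical.

```python
import numpy as np

def bin_chromatogram(retention_times, spectra, n_bins):
    """Sum mass spectra within evenly spaced retention-time bins.

    retention_times : (n_scans,) elution time of each scan
    spectra         : (n_scans, n_mz) mass spectrum of each scan
    Returns a (n_bins, n_mz) array of binned spectra.
    """
    edges = np.linspace(retention_times.min(), retention_times.max(), n_bins + 1)
    # np.digitize assigns each scan to a bin; clip so the last edge is inclusive
    idx = np.clip(np.digitize(retention_times, edges) - 1, 0, n_bins - 1)
    binned = np.zeros((n_bins, spectra.shape[1]))
    np.add.at(binned, idx, spectra)   # unbuffered accumulation per bin
    return binned

def build_pmf_matrix(samples, n_bins):
    """Stack binned chromatograms so rows are samples and columns are the
    mass-spectra-resolved time bins (flattened), as in the PMF input matrix."""
    return np.vstack([bin_chromatogram(t, s, n_bins).ravel()
                      for t, s in samples])
```

    Because the binning only sums scans, the total signal in each sample is preserved, which is what lets two-dimensional PMF act on the three-dimensional (sample x time x m/z) data.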

  4. A Simple Tool for the Design and Analysis of Multiple-Reflector Antennas in a Multi-Disciplinary Environment

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea

    2000-01-01

    The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.

  5. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    PubMed Central

    Honeine, Jean-Louis; Schieppati, Marco

    2014-01-01

    Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, on the order of 1–2 s, across the different postural conditions and modalities; they are slightly longer for haptic than for visual shifts, and slightly shorter on withdrawal than on addition of a stabilizing input and for deliberate than for unexpected shifts. The delays are shortest (for haptic shift) in blind subjects.
Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices. PMID:25339872

  6. Time-varying spatial data integration and visualization: 4 Dimensions Environmental Observations Platform (4-DEOS)

    NASA Astrophysics Data System (ADS)

    Paciello, Rossana; Coviello, Irina; Filizzola, Carolina; Genzano, Nicola; Lisi, Mariano; Mazzeo, Giuseppe; Pergola, Nicola; Sileo, Giancanio; Tramutoli, Valerio

    2014-05-01

    In environmental studies the integration of heterogeneous, time-varying data is a very common requirement for investigating, and possibly visualizing, correlations among the physical parameters underlying the dynamics of complex phenomena. The datasets used in such applications often have different spatial and temporal resolutions, and in some cases superimposition of asynchronous layers is required. Traditionally, the platforms used for spatio-temporal visual data analysis allow spatial data to be overlaid but manage time using a 'snapshot' data model, with each stack of layers labeled with a different time. This kind of architecture incorporates neither temporal indexing nor the third spatial dimension, which is usually given as an independent additional layer. Conversely, the full representation of a generic environmental parameter P(x,y,z,t) in the 4D space-time domain allows asynchronous datasets to be handled, as well as less traditional data products (e.g., vertical sections, punctual time series, etc.). In this paper we present the 4 Dimensions Environmental Observation Platform (4-DEOS), a system based on a Client-Broker-Server web services architecture. This platform is a new open-source solution for both timely access to, and easy integration and visualization of, heterogeneous (maps, vertical profiles or sections, punctual time series, etc.) asynchronous geospatial products. The innovative aspect of the 4-DEOS system is that users can analyze data/products individually while moving through time, with the possibility of pausing the display of some data/products to focus on other parameters and better study their temporal evolution. The platform offers two distinct display modes: by time interval or by single instant. Users can choose to visualize data/products in two ways: i) showing each parameter in a dedicated window, or ii) visualizing all parameters overlapped in a single window.
    A sliding time bar allows the user to follow the temporal evolution of the selected data/product. With this software, users can identify events that are partially correlated with each other, not only in the spatial dimension but also in the time domain, even at different time lags.

  7. A precision analogue integrator system for heavy current measurement in MFDC resistance spot welding

    NASA Astrophysics Data System (ADS)

    Xia, Yu-Jun; Zhang, Zhong-Dian; Xia, Zhen-Xin; Zhu, Shi-Liang; Zhang, Rui

    2016-02-01

    In order to control and monitor the quality of middle frequency direct current (MFDC) resistance spot welding (RSW), precision measurement of welding currents up to 100 kA is required, for which Rogowski coils are at present the only viable current transducers. Thus, a highly accurate analogue integrator is the key to restoring the converted signals collected from the Rogowski coils. Previous studies emphasised that integration drift is a major factor influencing the performance of analogue integrators, but capacitive leakage error also has a significant impact on the result, especially in long-duration pulse integration. In this article, new methods of measuring and compensating capacitive leakage error are proposed to fabricate a precision analogue integrator system for MFDC RSW. A voltage-holding test is carried out to measure the integration error caused by capacitive leakage, and an original integrator with a feedback adder is designed to compensate the capacitive leakage error in real time. The experimental results and statistical analysis show that the new analogue integrator system constrains both drift and capacitive leakage error, and its effect is robust to different voltage levels of the output signals. The total integration error is limited to within ±0.09 mV/s (0.005% of full scale per second) at a 95% confidence level, which makes it possible to achieve precision measurement of the welding current of MFDC RSW with Rogowski coils to the 0.1% accuracy class.
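
    The basic restoration step, integrating the Rogowski coil voltage to recover the current while removing a constant offset estimated from a pre-trigger baseline, can be sketched digitally. This is only an illustration of the principle (the article's compensation is analogue hardware); the function name and parameters are hypothetical.

```python
import numpy as np

def integrate_rogowski(v, dt, mutual_inductance, pretrigger):
    """Recover current from a Rogowski coil voltage record v (volts),
    sampled at interval dt, by numerical integration:
        i(t) = (1/M) * integral of v dt.
    A constant drift/offset is estimated from the pre-trigger baseline
    and subtracted before integrating (a digital stand-in for the
    analogue drift compensation discussed in the article)."""
    offset = v[:pretrigger].mean()        # baseline estimate of the offset
    flux = np.cumsum(v - offset) * dt     # rectangle-rule integration, V*s
    return flux / mutual_inductance       # amperes
```

    For a 1 ms, 2 mV coil voltage pulse and a mutual inductance of 1 uV*s/A, the recovered current plateaus at 2 A; without the baseline subtraction, the offset would accumulate linearly as drift.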

  8. Software requirements flow-down and preliminary software design for the G-CLEF spectrograph

    NASA Astrophysics Data System (ADS)

    Evans, Ian N.; Budynkiewicz, Jamie A.; DePonte Evans, Janet; Miller, Joseph B.; Onyuksel, Cem; Paxson, Charles; Plummer, David A.

    2016-08-01

    The Giant Magellan Telescope (GMT)-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, precision radial velocity (PRV) optical echelle spectrograph that will be the first light instrument on the GMT. The G-CLEF instrument device control subsystem (IDCS) provides software control of the instrument hardware, including the active feedback loops that are required to meet the G-CLEF PRV stability requirements. The IDCS is also tasked with providing operational support packages that include data reduction pipelines and proposal preparation tools. A formal, but ultimately pragmatic approach is being used to establish a complete and correct set of requirements for both the G-CLEF device control and operational support packages. The device control packages must integrate tightly with the state-machine driven software and controls reference architecture designed by the GMT Organization. A model-based systems engineering methodology is being used to develop a preliminary design that meets these requirements. Through this process we have identified some lessons that have general applicability to the development of software for ground-based instrumentation. For example, tasking an individual with overall responsibility for science/software/hardware integration is a key step to ensuring effective integration between these elements. An operational concept document that includes detailed routine and non-routine operational sequences should be prepared in parallel with the hardware design process to tie together these elements and identify any gaps. Appropriate time-phasing of the hardware and software design phases is important, but revisions to driving requirements that impact software requirements and preliminary design are inevitable. Such revisions must be carefully managed to ensure efficient use of resources.

  9. Improved algorithms and methods for room sound-field prediction by acoustical radiosity in arbitrary polyhedral rooms.

    PubMed

    Nosal, Eva-Marie; Hodgson, Murray; Ashdown, Ian

    2004-08-01

    This paper explores acoustical (or time-dependent) radiosity--a geometrical-acoustics sound-field prediction method that assumes diffuse surface reflection. The literature of acoustical radiosity is briefly reviewed and the advantages and disadvantages of the method are discussed. A discrete form of the integral equation that results from meshing the enclosure boundaries into patches is presented and used in a discrete-time algorithm. Furthermore, an averaging technique is used to reduce computational requirements. To generalize to nonrectangular rooms, a spherical-triangle method is proposed as a means of evaluating the integrals over solid angles that appear in the discrete form of the integral equation. The evaluation of form factors, which also appear in the numerical solution, is discussed for rectangular and nonrectangular rooms. This algorithm and associated methods are validated by comparison of the steady-state predictions for a spherical enclosure to analytical solutions.

  10. Improved algorithms and methods for room sound-field prediction by acoustical radiosity in arbitrary polyhedral rooms

    NASA Astrophysics Data System (ADS)

    Nosal, Eva-Marie; Hodgson, Murray; Ashdown, Ian

    2004-08-01

    This paper explores acoustical (or time-dependent) radiosity-a geometrical-acoustics sound-field prediction method that assumes diffuse surface reflection. The literature of acoustical radiosity is briefly reviewed and the advantages and disadvantages of the method are discussed. A discrete form of the integral equation that results from meshing the enclosure boundaries into patches is presented and used in a discrete-time algorithm. Furthermore, an averaging technique is used to reduce computational requirements. To generalize to nonrectangular rooms, a spherical-triangle method is proposed as a means of evaluating the integrals over solid angles that appear in the discrete form of the integral equation. The evaluation of form factors, which also appear in the numerical solution, is discussed for rectangular and nonrectangular rooms. This algorithm and associated methods are validated by comparison of the steady-state predictions for a spherical enclosure to analytical solutions.
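
    The steady-state validation mentioned in both records reduces to a linear system in the patch radiosities once the enclosure has been meshed and the form factors evaluated. A minimal sketch of that steady-state solve (not the authors' implementation; names and conventions are hypothetical):

```python
import numpy as np

def steady_state_radiosity(emission, reflectance, form_factors):
    """Solve B = E + diag(rho) @ F @ B for the patch radiosities B.

    emission     : (n,) source power per patch
    reflectance  : (n,) diffuse reflection coefficients (1 - absorption)
    form_factors : (n, n) F[i, j] = fraction of energy leaving patch j
                   that arrives at patch i
    """
    n = len(emission)
    A = np.eye(n) - np.diag(reflectance) @ form_factors
    return np.linalg.solve(A, emission)
```

    The time-dependent algorithm described in the abstract replaces this single solve with a discrete-time recursion in which each patch's exitance at one step feeds the others at later steps, delayed by propagation time.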

  11. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths

    PubMed Central

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N.; Korneev, Alexander; Pernice, Wolfram H. P.

    2015-01-01

    Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows for the SNSPDs to be operated at bias currents far below the critical current where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10−19 W Hz−1/2 range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms. PMID:26061283

  12. Multirate Particle-in-Cell Time Integration Techniques of Vlasov-Maxwell Equations for Collisionless Kinetic Plasma Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Guangye; Chacon, Luis; Knoll, Dana Alan

    2015-07-31

    A multi-rate PIC formulation was developed that employs large timesteps for slow field evolution, and small (adaptive) timesteps for particle orbit integrations. Implementation is based on a JFNK solver with nonlinear elimination and moment preconditioning. The approach is free of numerical instabilities (ω_pe Δt >> 1, and Δx >> λ_D), and requires many fewer dofs (vs. explicit PIC) for comparable accuracy in challenging problems. Significant gains (vs. conventional explicit PIC) may be possible for large scale simulations. The paper is organized as follows: Vlasov-Maxwell Particle-in-cell (PIC) methods for plasmas; Explicit, semi-implicit, and implicit time integrations; Implicit PIC formulation (Jacobian-Free Newton-Krylov (JFNK) with nonlinear elimination allows different treatments of disparate scales, discrete conservation properties (energy, charge, canonical momentum, etc.)); Some numerical examples; and Summary.
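
    The orbit-subcycling idea, taking many small particle substeps within one large field timestep, can be illustrated with a toy one-dimensional push. This is only a sketch of the multirate concept; the actual method wraps the orbit integration inside a JFNK-based implicit solve with nonlinear elimination, and the names below are hypothetical.

```python
def subcycled_push(x, v, accel, dt_field, n_sub):
    """Advance one particle across a single large field timestep using
    n_sub small orbit substeps (symplectic Euler): the field is frozen
    over the big step while the orbit is subcycled, as in multirate PIC."""
    dt = dt_field / n_sub
    for _ in range(n_sub):
        v += dt * accel(x)   # kick from the (frozen) field
        x += dt * v          # drift with the updated velocity
    return x, v
```

    With a harmonic restoring field the subcycled orbit tracks the analytic solution, while the field itself would only be updated once per large step by the implicit solver.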

  13. An integral equation formulation for rigid bodies in Stokes flow in three dimensions

    NASA Astrophysics Data System (ADS)

    Corona, Eduardo; Greengard, Leslie; Rachh, Manas; Veerapaneni, Shravan

    2017-03-01

    We present a new derivation of a boundary integral equation (BIE) for simulating the three-dimensional dynamics of arbitrarily-shaped rigid particles of genus zero immersed in a Stokes fluid, on which are prescribed forces and torques. Our method is based on a single-layer representation and leads to a simple second-kind integral equation. It avoids the use of auxiliary sources within each particle that play a role in some classical formulations. We use a spectrally accurate quadrature scheme to evaluate the corresponding layer potentials, so that only a small number of spatial discretization points per particle are required. The resulting discrete sums are computed in O(n) time, where n denotes the number of particles, using the fast multipole method (FMM). The particle positions and orientations are updated by a high-order time-stepping scheme. We illustrate the accuracy, conditioning and scaling of our solvers with several numerical examples.

  14. The recall of information from working memory. Insights from behavioural and chronometric perspectives.

    PubMed

    Towse, John N; Cowan, Nelson; Hitch, Graham J; Horton, Neil J

    2008-01-01

    We describe and evaluate a recall reconstruction hypothesis for working memory (WM), according to which items can be recovered from multiple memory representations. Across four experiments, participants recalled memoranda that were either integrated with or independent of the sentence content. We found consistently longer pauses accompanying the correct recall of integrated compared with independent words, supporting the argument that sentence memory could scaffold the access of target items. Integrated words were also more likely to be recalled correctly, dependent on the details of the task. Experiment 1 investigated the chronometry of spoken recall for word span and reading span, with participants completing an unfinished sentence in the latter case. Experiments 2 and 3 confirm recall time differences without using word generation requirements, while Experiment 4 used an item and order response choice paradigm with nonspoken responses. Data emphasise the value of recall timing in constraining theories of WM functioning.

  15. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths.

    PubMed

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N; Korneev, Alexander; Pernice, Wolfram H P

    2015-06-10

    Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows for the SNSPDs to be operated at bias currents far below the critical current where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10(-19) W Hz(-1/2) range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms.
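
    Noise-equivalent power for a single-photon detector is commonly estimated as NEP = (h*nu/eta) * sqrt(2*R_dc), where eta is the detection efficiency and R_dc the dark count rate. The sketch below encodes this standard relation as background for the figures quoted in these two records; it is an assumption on my part, not code or a definition taken from the paper.

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def nep(wavelength, efficiency, dark_count_rate):
    """Noise-equivalent power NEP = (h*nu/eta) * sqrt(2 * R_dc).

    wavelength in metres, efficiency dimensionless, dark count rate in Hz;
    returns W/Hz^(1/2)."""
    photon_energy = H * C / wavelength  # energy of one photon at this wavelength
    return photon_energy / efficiency * math.sqrt(2.0 * dark_count_rate)
```

    The relation makes the trade-off in the abstract explicit: lowering the bias current suppresses the dark count rate, and NEP falls with the square root of that rate as long as the efficiency stays high.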

  16. Fast Time-Dependent Density Functional Theory Calculations of the X-ray Absorption Spectroscopy of Large Systems.

    PubMed

    Besley, Nicholas A

    2016-10-11

    The computational cost of calculations of K-edge X-ray absorption spectra using time-dependent density functional theory (TDDFT) within the Tamm-Dancoff approximation is significantly reduced through the introduction of a severe integral screening procedure that includes only integrals that involve the core s basis function of the absorbing atom(s), coupled with a reduced-quality numerical quadrature for integrals associated with the exchange and correlation functionals. The memory required for the calculations is reduced through construction of the TDDFT matrix within the absorbing core orbitals excitation space and exploiting further truncation of the virtual orbital space. The resulting method, denoted fTDDFTs, leads to much faster calculations and makes the study of large systems tractable. The capability of the method is demonstrated through calculations of the X-ray absorption spectra at the carbon K-edge of chlorophyll a, C60 and C70.

  17. A new interpolation method for gridded extensive variables with application in Lagrangian transport and dispersion models

    NASA Astrophysics Data System (ADS)

    Hittmeir, Sabine; Philipp, Anne; Seibert, Petra

    2017-04-01

    In discretised form, an extensive variable usually represents an integral over a 3-dimensional (x,y,z) grid cell. In the case of vertical fluxes, gridded values represent integrals over a horizontal (x,y) grid face. In meteorological models, fluxes (precipitation, turbulent fluxes, etc.) are usually written out as temporally integrated values, thus effectively forming 3D (x,y,t) integrals. Lagrangian transport models require interpolation of all relevant variables towards the location in 4D space of each of the computational particles. Trivial interpolation algorithms usually implicitly assume the integral value to be a point value valid at the grid centre. If the integral value were reconstructed from the interpolated point values, it would in general not be correct. If nonlinear interpolation methods are used, non-negativity cannot easily be ensured. This problem became obvious with respect to the interpolation of precipitation for the calculation of wet deposition in FLEXPART (http://flexpart.eu), which uses ECMWF model output or other gridded input data. The presently implemented method consists of a special preprocessing in the input preparation software and subsequent linear interpolation in the model. The interpolated values are positive, but the criterion of cell-wise conservation of the integral property is violated; it is also not very accurate, as it smoothes the field. A new interpolation algorithm was developed which introduces additional supporting grid points in each time interval, between which linear interpolation is later applied in FLEXPART. It preserves the integral precipitation in each time interval, guarantees the continuity of the time series, and maintains non-negativity. The function values of the remapping algorithm at these subgrid points constitute the degrees of freedom, which can be prescribed in various ways. Combining the advantages of different approaches leads to a final algorithm respecting all the required conditions.
To improve the monotonicity behaviour we additionally derived a filter to restrict over- or undershooting. At the current stage, the algorithm is meant primarily for the temporal dimension. It can also be applied with operator-splitting to include the two horizontal dimensions. An extension to 2D appears feasible, while a fully 3D version would most likely not justify the effort compared to the operator-splitting approach.
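
    A minimal sketch of such a conservative remapping (our own illustration, not the FLEXPART implementation): one midpoint node is added per accumulation interval, shared endpoint values are taken as the minimum of the two adjacent interval means, and each midpoint is then fixed so that the interval integral is conserved exactly. Non-negativity follows because the endpoints never exceed the interval means.

```python
import numpy as np

def remap_precip(P, dt):
    """Piecewise-linear subgrid reconstruction of interval-accumulated
    precipitation P (one value per interval of length dt).

    Illustrative sketch only: shared endpoint values are the minimum of
    the two adjacent interval means (continuous and non-negative), and
    the midpoint of each interval is chosen so the interval integral
    is conserved exactly."""
    P = np.asarray(P, dtype=float)
    means = P / dt                        # interval-mean precipitation rates
    ends = np.zeros(len(P) + 1)           # node values at interval boundaries
    ends[1:-1] = np.minimum(means[:-1], means[1:])
    # trapezoid rule over [t_i, t_mid, t_{i+1}] with values (f0, fm, f1):
    #   dt/4 * (f0 + 2*fm + f1) == P_i   =>   fm = 2*mean_i - (f0 + f1)/2
    mids = 2.0 * means - (ends[:-1] + ends[1:]) / 2.0
    return ends, mids
```

    Since the two endpoint values of an interval are each bounded by that interval's mean, the midpoint value 2*mean - (f0 + f1)/2 is always non-negative, and continuity holds because adjacent intervals share the endpoint nodes.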

  18. Consolidation of Vocabulary Is Associated with Sleep in Children

    ERIC Educational Resources Information Center

    Henderson, Lisa M.; Weighall, Anna R.; Brown, Helen; Gaskell, M. Gareth

    2012-01-01

    Although the acquisition of a novel word is apparently rapid, adult research suggests that integration of novel and existing knowledge (measured by engagement in lexical competition) requires sleep-associated consolidation. We present the first investigation of whether a similar time-course dissociation characterizes word learning across…

  19. Parental Effects on Children's Emotional Development over Time and across Generations

    ERIC Educational Resources Information Center

    Stack, Dale M.; Serbin, Lisa A.; Enns, Leah N.; Ruttle, Paula L.; Barrieau, Lindsey

    2010-01-01

    Principal tasks of the early childhood years, including attaining self-efficacy, self-control, social integration, and preparedness for education, require the development of adaptive and competent emotional development. Results from longitudinal and intergenerational studies examining the effect of parenting behaviors on children's emotional…

  20. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches, requiring less computation per step and fewer steps overall, enables fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
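
    The benefit of symplectic integration can be seen in the textbook velocity-Verlet (leapfrog) scheme sketched below; this is a generic illustration only, and does not reproduce the SISM's analytic treatment of high-frequency vibrations.

```python
def leapfrog(q, p, force, dt, n_steps, m=1.0):
    """Velocity-Verlet (leapfrog) integration, a basic symplectic scheme:
    it conserves a nearby 'shadow' energy, so the energy error stays
    bounded over long runs instead of drifting."""
    p = p + 0.5 * dt * force(q)        # initial half kick
    for _ in range(n_steps - 1):
        q = q + dt * p / m             # drift
        p = p + dt * force(q)          # full kick
    q = q + dt * p / m                 # final drift
    p = p + 0.5 * dt * force(q)        # closing half kick
    return q, p

# harmonic oscillator with unit frequency: total energy stays near 0.5
q, p = leapfrog(1.0, 0.0, lambda x: -x, dt=0.1, n_steps=1000)
```

    For the harmonic oscillator the energy 0.5*(q^2 + p^2) oscillates within a narrow band around its initial value for arbitrarily many steps, which is what makes long time steps viable.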

  1. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    USGS Publications Warehouse

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCPs from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
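
    The scaling of the sampling error with exposure time can be illustrated with the standard formula for the variance of a time average of a correlated stationary series; this is a generic turbulence-statistics result, used here as a hedged stand-in for the paper's discharge model.

```python
def time_average_variance(var_u, t_integral, t_sample):
    """Variance of a time average over t_sample of a stationary series
    with point variance var_u and integral time scale t_integral
    (valid for t_sample >> t_integral). Generic result, not the
    paper's exact formulation."""
    return 2.0 * t_integral * var_u / t_sample

# quadrupling the exposure time cuts the estimator variance fourfold
v_short = time_average_variance(1.0, 2.0, 256.0)
v_long = time_average_variance(1.0, 2.0, 1024.0)
```

    The inverse dependence on t_sample is what makes exposure time the natural control knob when choosing a sampling strategy.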

  2. In-situ Testing of the EHT High Gain and Frequency Ultra-Stable Integrators

    NASA Astrophysics Data System (ADS)

    Miller, Kenneth; Ziemba, Timothy; Prager, James; Slobodov, Ilia; Lotz, Dan

    2014-10-01

    Eagle Harbor Technologies (EHT) has developed a long-pulse integrator that exceeds the ITER specification for integration error and pulse duration. During the Phase I program, EHT improved the RPPL short-pulse integrators, added a fast digital reset, and demonstrated that the new integrators exceed the ITER integration error and pulse duration requirements. In Phase II, EHT developed Field Programmable Gate Array (FPGA) software that allows for integrator control and real-time signal digitization and processing. In the second year of Phase II, the EHT integrator will be tested at a validation platform experiment (HIT-SI) and tokamak (DIII-D). In the Phase IIB program, EHT will continue development of the EHT integrator to reduce overall cost per channel. EHT will test lower-cost components, move to surface-mount components, and add an onboard FPGA and data acquisition to produce a stand-alone system with lower cost per channel and increased channel density. EHT will test the Phase IIB integrator at a validation platform experiment (HIT-SI) and tokamak (DIII-D). Work supported by the DOE under Contract Number DE-SC0006281.

  3. Temporal windows in visual processing: "prestimulus brain state" and "poststimulus phase reset" segregate visual transients on different temporal scales.

    PubMed

    Wutz, Andreas; Weisz, Nathan; Braun, Christoph; Melcher, David

    2014-01-22

    Dynamic vision requires both stability of the current perceptual representation and sensitivity to the accumulation of sensory evidence over time. Here we study the electrophysiological signatures of this intricate balance between temporal segregation and integration in vision. Within a forward masking paradigm with short and long stimulus onset asynchronies (SOA), we manipulated the temporal overlap of the visual persistence of two successive transients. Human observers enumerated the items presented in the second target display as a measure of the informational capacity read-out from this partly temporally integrated visual percept. We observed higher β-power immediately before mask display onset in incorrect trials, in which enumeration failed due to stronger integration of mask and target visual information. This effect was timescale specific, distinguishing between segregation and integration of visual transients that were distant in time (long SOA). Conversely, for short SOA trials, mask onset evoked a stronger visual response when mask and targets were correctly segregated in time. Examination of the target-related response profile revealed the importance of an evoked α-phase reset for the segregation of those rapid visual transients. Investigating this precise mapping of the temporal relationships of visual signals onto electrophysiological responses highlights how the stream of visual information is carved up into discrete temporal windows that mediate between segregated and integrated percepts. Fragmenting the stream of visual information provides a means to stabilize perceptual events within one instant in time.

  4. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing them with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is time-consuming and tedious. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
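
    The conversion step described above can be sketched as a table-driven expansion from simplified to complex resources; all resource names, conversion factors, and quantities below are hypothetical, invented purely for illustration.

```python
# Hypothetical mapping: each simplified URC resource expands to one or
# more CPS constraints, scaled by a conversion factor.
URC_TO_CPS = {
    "power_watts": [("EPS_CHANNEL_A_LOAD", 1.0), ("EPS_THERMAL_MARGIN", 0.1)],
    "crew_time_min": [("CREW_SCHED_BLOCK", 1.0)],
}

def expand_requirements(urc_reqs):
    """Convert simplified PD requirements into CPS-style constraints."""
    cps = {}
    for resource, amount in urc_reqs.items():
        for cps_name, factor in URC_TO_CPS[resource]:
            cps[cps_name] = cps.get(cps_name, 0.0) + amount * factor
    return cps

cps_reqs = expand_requirements({"power_watts": 50.0, "crew_time_min": 30.0})
```

    Once the planners have populated the mapping table, each export becomes a mechanical expansion rather than a manual translation.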

  5. The organizational and clinical impact of integrating bedside equipment to an information system: a systematic literature review of patient data management systems (PDMS).

    PubMed

    Cheung, Amy; van Velden, Floris H P; Lagerburg, Vera; Minderman, Niels

    2015-03-01

    The introduction of an information system integrated with bedside equipment requires significant financial and resource investment; therefore, understanding the potential impact is beneficial for decision-makers. However, no systematic literature reviews (SLRs) focus on this topic. This SLR aims to gather evidence on the impact of the aforementioned system, also known as a patient data management system (PDMS), on both organizational and clinical outcomes. A literature search was performed using the databases Medline/PubMed and CINAHL for English articles published between January 2000 and December 2012. A quality assessment was performed on articles deemed relevant for the SLR. Eighteen articles were included in the SLR. Sixteen articles investigated the impact of a PDMS on organizational outcomes, comprising descriptive, quantitative, and qualitative studies. A PDMS was found to reduce charting time, increase the time spent on direct patient care, and reduce the occurrence of errors. Only two articles investigated the clinical impact of a PDMS. Both reported an improvement in clinical outcomes when a PDMS was integrated with a clinical decision support system (CDSS). A PDMS has been shown to offer many advantages in both the efficiency and the quality of care delivered to the patient. In addition, a PDMS integrated with a CDSS may improve clinical outcomes, although further studies are required for validation.

  6. Medical Systems Engineering to Support Mars Mission Crew Autonomy

    NASA Technical Reports Server (NTRS)

    Antonsen, Erik; Mindock, Jennifer

    2017-01-01

    Human spaceflight missions to Mars face exceptionally challenging resource limitations that far exceed those faced before. Increasing transit times, decreasing opportunity for resupply, communications challenges, and extended time to evacuate a crew to definitive medical care dictate a level of crew autonomy in medical care that is beyond the current medical model. To approach this challenge, a medical systems engineering approach is proposed that relies on a clearly articulated Concept of Operations and risk analysis tools that are in development at NASA. This paper proposes an operational clinical model with key terminology and concepts translated to a controls theory paradigm to frame a common language between clinical and engineering teams. This common language will be used for design and validation of an exploration medical system that is fully integrated into a Mars transit vehicle. This approach merges medical simulation, human factors evaluation techniques, and human-in-the-loop testing in ground based analogs to tie medical hardware and software subsystem performance and overall medical system functionality to metrics of operational medical autonomy. Merging increases in operational clinical autonomy with a more restricted vehicle system resource scenario in interplanetary spaceflight will require an unprecedented level of medical and engineering integration. Full integration of medical capabilities into a Mars vehicle system may require a new approach to integrating medical system design and operations into the vehicle Program structure. Prior to the standing-up of a Mars Mission Program, proof of concept is proposed through the Human Research Program.

  7. 1.25 GHz sine wave gating InGaAs/InP single-photon detector with a monolithically integrated readout circuit

    NASA Astrophysics Data System (ADS)

    Jiang, Wen-Hao; Liu, Jian-Hong; Liu, Yin; Jin, Ge; Zhang, Jun; Pan, Jian-Wei

    2017-12-01

    InGaAs/InP single-photon detectors (SPDs) are the key devices for applications requiring near-infrared single-photon detection. Gating mode is an effective approach to synchronous single-photon detection. Increasing the gating frequency and reducing the module size are important challenges in the design of such detector systems. Here we present for the first time an InGaAs/InP SPD with 1.25 GHz sine wave gating using a monolithically integrated readout circuit (MIRC). The MIRC has a size of 15 mm × 15 mm and implements the miniaturization of avalanche extraction for high-frequency sine wave gating. In the MIRC, low-pass filters and a low-noise radio frequency amplifier are integrated based on the technique of low-temperature co-fired ceramic, which can effectively reduce the parasitic capacitance and extract weak avalanche signals. We then characterize the InGaAs/InP SPD to verify the functionality and reliability of the MIRC, and the SPD exhibits excellent performance with 27.5% photon detection efficiency, 1.2 kcps dark count rate, and 9.1% afterpulse probability at 223 K and 100 ns hold-off time. With this MIRC, one can further design miniaturized high-frequency SPD modules that are in high demand for practical applications.

  8. Intelligent switches of integrated lightwave circuits with core telecommunication functions

    NASA Astrophysics Data System (ADS)

    Izhaky, Nahum; Duer, Reuven; Berns, Neil; Tal, Eran; Vinikman, Shirly; Schoenwald, Jeffrey S.; Shani, Yosi

    2001-05-01

    We present a brief overview of a promising switching technology based on silica-on-silicon thermo-optic integrated circuits. This is basically a 2D solid-state optical device capable of non-blocking switching operation. Apart from its excellent performance (insertion loss < 5 dB, switching time < 2 ms, ...), the switch enables additional important built-in functionalities. It enables single-to-single channel switching and single-to-multiple channel multicasting/broadcasting. In addition, it has the capability of channel weighting and variable output power control (attenuation), for instance, to equalize signal levels and compensate for unbalanced optical input powers, or to equalize an unbalanced EDFA gain curve. We examine the market segments appropriate for the switch size and technology, followed by a discussion of the basic features of the technology. The discussion focuses on important requirements for the switch and the technology (e.g., insertion loss, power consumption, channel isolation, extinction ratio, switching time, and heat dissipation). The mechanical design is also considered. It must take into account the integration of optical fiber, the optical planar wafer, analog electronics and digital microprocessor controls, embedded software, and heating power dissipation. The Lynx Photon.8x8 switch is compared to competing technologies in terms of typical market performance requirements.

  9. Vertical integration in medical school: effect on the transition to postgraduate training.

    PubMed

    Wijnen-Meijer, Marjo; ten Cate, Olle Th J; van der Schaaf, Marieke; Borleffs, Jan C C

    2010-03-01

    Recently, many medical schools' curricula have been revised so that they represent vertically integrated (VI) curricula. Important changes include: the provision of earlier clinical experience; longer clerkships, and the fostering of increasing levels of responsibility. One of the aims of vertical integration is to facilitate the transition to postgraduate training. The purpose of the present study is to determine whether a VI curriculum at medical school affects the transition to postgraduate training in a positive way. We carried out a questionnaire study among graduates of six medical schools in the Netherlands, who had followed either a VI or a non-VI curriculum. Items in the questionnaire focused on preparedness for work and postgraduate training, the time and number of applications required to be admitted to residency, and the process of making career choices. In comparison with those who have followed non-VI programmes, graduates of VI curricula appear to make definitive career choices earlier, need less time and fewer applications to obtain residency positions and feel more prepared for work and postgraduate training. The curriculum at medical school affects the transition to postgraduate training. Additional research is required to determine which components of the curriculum cause this effect and to specify under which conditions this effect occurs.

  10. Pneumatic oscillator circuits for timing and control of integrated microfluidics.

    PubMed

    Duncan, Philip N; Nguyen, Transon V; Hui, Elliot E

    2013-11-05

    Frequency references are fundamental to most digital systems, providing the basis for process synchronization, timing of outputs, and waveform synthesis. Recently, there has been growing interest in digital logic systems that are constructed out of microfluidics rather than electronics, as a possible means toward fully integrated laboratory-on-a-chip systems that do not require any external control apparatus. However, the full realization of this goal has not been possible due to the lack of on-chip frequency references, thus requiring timing signals to be provided from off-chip. Although microfluidic oscillators have been demonstrated, there have been no reported efforts to characterize, model, or optimize timing accuracy, which is the fundamental metric of a clock. Here, we report pneumatic ring oscillator circuits built from microfluidic valves and channels. Further, we present a compressible-flow analysis that differs fundamentally from conventional circuit theory, and we show the utility of this physically based model for the optimization of oscillator stability. Finally, we leverage microfluidic clocks to demonstrate circuits for the generation of phase-shifted waveforms, self-driving peristaltic pumps, and frequency division. Thus, pneumatic oscillators can serve as on-chip frequency references for microfluidic digital logic circuits. On-chip clocks and pumps both constitute critical building blocks on the path toward achieving autonomous laboratory-on-a-chip devices.
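
    The timing principle of a ring oscillator can be sketched with an idealized digital model (our own illustration; real pneumatic valves have continuous pressure dynamics): a signal fed through an odd number of inverting stages returns inverted after one loop delay, so each node toggles with a period of twice the loop delay.

```python
def ring_oscillator(n_stages, n_steps):
    """Model an odd-stage inverting ring as a delay line: the value at a
    node is the inverse of its own value n_stages time units earlier,
    giving a square wave of period 2 * n_stages."""
    assert n_stages % 2 == 1, "an even ring would latch, not oscillate"
    x = [0] * n_stages                 # initial loop contents
    for t in range(n_steps):
        x.append(1 - x[t])             # net inversion after n_stages delays
    return x[n_stages:]                # samples after the initial fill

trace = ring_oscillator(3, 12)         # period 6: 1, 1, 1, 0, 0, 0, ...
```

    The stage count and per-stage delay thus set the clock frequency, which is why optimizing valve response time is central to timing accuracy.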

  11. Scheduling revisited workstations in integrated-circuit fabrication

    NASA Technical Reports Server (NTRS)

    Kline, Paul J.

    1992-01-01

    The cost of building new semiconductor wafer fabrication factories has grown rapidly, and a state-of-the-art fab may cost 250 million dollars or more. Obtaining an acceptable return on this investment requires high productivity from the fabrication facilities. This paper describes the Photo Dispatcher system, which was developed to make machine-loading recommendations for a set of key fab machines. Dispatching policies that generally perform well in job shops (e.g., Shortest Remaining Processing Time) perform poorly for workstations such as photolithography, which are visited several times by the same lot of silicon wafers. The Photo Dispatcher evaluates the history of workloads throughout the fab and identifies bottleneck areas. The scheduler then assigns priorities to lots depending on where they are headed after photolithography. These priorities are designed to avoid starving bottleneck workstations and to give preference to lots that are headed to areas where they can be processed with minimal waiting. Other factors considered by the scheduler to establish priorities are the nearness of a lot to the end of its process flow and the time that the lot has already been waiting in queue. Simulations that model the equipment and products in one of Texas Instruments' wafer fabs show the Photo Dispatcher can produce a 10 percent improvement in the time required to fabricate integrated circuits.
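
    A dispatching rule of the kind described can be sketched as a weighted score over those factors; the weights, field names, and numbers below are hypothetical, not taken from the Photo Dispatcher.

```python
# Hypothetical priority rule combining the factors the abstract lists:
# risk of starving the downstream bottleneck, nearness to the end of
# the process flow, and time already spent waiting in queue.
def lot_priority(lot, bottleneck_load):
    starve_risk = 1.0 / (1.0 + bottleneck_load[lot["next_area"]])
    near_done = 1.0 / (1.0 + lot["steps_remaining"])
    return 3.0 * starve_risk + 2.0 * near_done + 0.5 * lot["queue_wait_hours"]

lots = [
    {"id": "A", "next_area": "etch", "steps_remaining": 4, "queue_wait_hours": 2.0},
    {"id": "B", "next_area": "implant", "steps_remaining": 1, "queue_wait_hours": 1.0},
]
load = {"etch": 9.0, "implant": 0.5}   # the implant area is nearly starved
best = max(lots, key=lambda lot: lot_priority(lot, load))
```

    Here lot B wins because its downstream area is close to starving and it is near the end of its flow, even though lot A has waited longer.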

  12. Multigranular integrated services optical network

    NASA Astrophysics Data System (ADS)

    Yu, Oliver; Yin, Leping; Xu, Huan; Liao, Ming

    2006-12-01

    Based on all-optical switches without requiring fiber delay lines and optical-electrical-optical interfaces, the multigranular optical switching (MGOS) network integrates three transport services via unified core control to efficiently support bursty and stream traffic of subwavelength to multiwavelength bandwidth. Adaptive robust optical burst switching (AR-OBS) aggregates subwavelength burst traffic into asynchronous light-rate bursts, transported via slotted-time light paths established by fast two-way reservation with robust blocking recovery control. Multiwavelength optical switching (MW-OS) decomposes multiwavelength stream traffic into a group of timing-related light-rate streams, transported via a light-path group to meet end-to-end delay-variation requirements. Optical circuit switching (OCS) simply converts wavelength stream traffic from an electrical-rate into a light-rate stream. The MGOS network employs decoupled routing, wavelength, and time-slot assignment (RWTA) and novel group routing and wavelength assignment (GRWA) to select slotted-time light paths and light-path groups, respectively. The selected resources are reserved by the unified multigranular robust fast optical reservation protocol (MG-RFORP). Simulation results show that elastic traffic is efficiently supported via AR-OBS in terms of loss rate and wavelength utilization, while connection-oriented wavelength traffic is efficiently supported via wavelength-routed OCS in terms of connection blocking and wavelength utilization. The GRWA-tuning result for MW-OS is also shown.

  13. Littoral transport in the surf zone elucidated by an Eulerian sediment tracer.

    USGS Publications Warehouse

    Duane, D.B.; James, W.R.

    1980-01-01

    An Eulerian, or time integration, sand tracer experiment was designed and carried out in the surf zone near Pt. Mugu, California on April 19, 1972. Data indicate that conditions of stationarity and finite boundaries required for proper application of Eulerian tracer theory exist for short time periods in the surf zone. Grain counts suggest the time required for tracer sand to attain equilibrium concentration is on the order of 30-60 minutes. Grain counts also indicate transport (discharge) was strongly dependent upon grain size, with the maximum rate occurring in the 2.5-2.75 phi size range, decreasing toward both finer and coarser sizes. The measured instantaneous transport was at the annual rate of 2.4 × 10^6 m^3/yr. - Authors

  14. Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions

    NASA Technical Reports Server (NTRS)

    Pilon, Anthony R.; Lyrintzis, Anastasios S.

    1997-01-01

    The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two-dimensional and three-dimensional jet flows.
Additionally, the enhancements are generalized so that they may be used in any aeroacoustics problem.
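
    A back-of-the-envelope sketch of the volume-to-surface savings (our own illustration, not from the paper): for M observer points, a volume quadrature over an N×N×N source grid costs on the order of M·N³ kernel evaluations, while a Kirchhoff quadrature over the six faces of the bounding surface costs about M·6N².

```python
def volume_cost(n, m_observers):
    """Kernel evaluations for a volume quadrature over an n^3 grid."""
    return m_observers * n ** 3

def surface_cost(n, m_observers):
    """Kernel evaluations over the six n^2 faces of a bounding box."""
    return m_observers * 6 * n ** 2

# on a 200^3 grid the surface formulation is ~33x cheaper per observer
ratio = volume_cost(200, 100) / surface_cost(200, 100)
```

    The ratio n/6 grows linearly with grid resolution, which is why surface formulations remain practical as source regions are refined.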

  15. Supportability Issues and Approaches for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Watson, J. K.; Ivins, M. S.; Cunningham, R. A.

    2006-01-01

    Maintaining and repairing spacecraft systems hardware to achieve required levels of operational availability during long-duration exploration missions will be challenged by limited resupply opportunities, constraints on the mass and volume available for spares and other maintenance-related provisions, and extended communications times. These factors will force the adoption of new approaches to the integrated logistics support of spacecraft systems hardware. For missions beyond the Moon, all spares, equipment, and supplies must either be prepositioned prior to departure from Earth of human crews or carried with the crews. The mass and volume of spares must be minimized by enabling repair at the lowest hardware levels, imposing commonality and standardization across all mission elements at all hardware levels, and providing the capability to fabricate structural and mechanical spares as required. Long round-trip communications times will require increasing levels of autonomy by the crews for most operations including spacecraft maintenance. Effective implementation of these approaches will only be possible when their need is recognized at the earliest stages of the program, when they are incorporated in operational concepts and programmatic requirements, and when diligence is applied in enforcing these requirements throughout system design in an integrated way across all contractors and suppliers. These approaches will be essential for the success of missions to Mars. Although limited duration lunar missions may be successfully accomplished with more traditional approaches to supportability, those missions will offer an opportunity to refine these concepts, associated technologies, and programmatic implementation methodologies so that they can be most effectively applied to later missions.

  16. Integration and Testing Challenges of Small Satellite Missions: Experiences from the Space Technology 5 Project

    NASA Technical Reports Server (NTRS)

    Sauerwein, Timothy A.; Gostomski, Tom

    2007-01-01

    The Space Technology 5 (ST5) payload was successfully carried into orbit on an OSC Pegasus XL launch vehicle, which was carried aloft and dropped from the OSC Lockheed L-1011 from Vandenberg Air Force Base March 22, 2006, at 9:03 am Eastern time, 6:03 am Pacific time. In order to reach the completion of development and the successful launch of ST5, the systems integration and test (I&T) team determined that a different approach was required to meet the project requirements than the standard I&T approach used for single, room-sized satellites. The ST5 payload, part of NASA's New Millennium Program headquartered at JPL, consisted of three micro-satellites (approximately 30 kg each) and the Pegasus Support Structure (PSS), the system that connected the spacecraft to the launch vehicle and deployed them into orbit from the Pegasus XL launch vehicle. ST5 was a technology demonstration payload, intended to test six (6) new technologies for potential use on future space flights along with demonstrating the ability of small satellites to perform quality science. The main technology was a science-grade magnetometer designed to take measurements of the earth's magnetic field. The three spacecraft were designed, integrated, and tested at NASA Goddard Space Flight Center, with integration and environmental testing occurring in Bldg. 7-10-15-29. The three spacecraft were integrated and tested by the same I&T team. The I&T Manager determined that there was insufficient time in the schedule to perform the three spacecraft I&T activities in series using standard approaches. The solution was for spacecraft #1 to undergo integration and test first, followed by spacecraft #2 and #3 simultaneously. This simultaneous integration was successful for several reasons. Each spacecraft had a Lead Test Conductor who planned and coordinated that spacecraft through its integration and test activities. One team of engineers and technicians executed the integration of all three spacecraft, gaining knowledge and efficiency as spacecraft #1 integration and testing progressed. They became acutely familiar with the hardware, operations, and processes for I&T, so each team member had the experience and knowledge to safely execute I&T for spacecraft #2 and #3 together. The integration team was very versatile, and each member could perform many different activities or work on any spacecraft when needed. Daily meetings between the three Lead Test Conductors and the technician team allowed the team to plan and implement activities efficiently. The three (3) spacecraft and the PSS were successfully integrated and tested, shipped to the launch site, and made ready for launch per the I&T schedule that had been planned three years previously.

  17. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet there is a lack of structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the other while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations.
Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher-fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for composite part production cost. The replacement is needed because TCM estimates production costs from part weights, reflecting subtractive manufacturing of metallic origin such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST), used similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional methods of estimating time by analogy or from response surface equations fitted to historical process data. The MOST concept provides a tailored study of an individual process, as typically required for a new, innovative design. Nevertheless, MOST has some challenges, one of which is its requirement to build a new process model from the ground up. The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST.
To relieve this constraint, the study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide parameter selection when building a process using MOST. Also included is a demonstration of how HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
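
    The hybrid estimating idea above can be illustrated with a minimal sketch: a weight-based baseline cost (a power-law cost estimating relationship, standing in for the TCM equations) is overridden by a process-based estimate wherever higher-fidelity process knowledge exists, as ACCEM does for composite parts. All coefficients, labor rates, and part data below are hypothetical, chosen only for demonstration.

    ```python
    # Hybrid cost sketch: weight-based baseline, process-based override.
    # Coefficients and part data are illustrative, not TCM/ACCEM values.

    def weight_based_cost(weight_lb, a=2500.0, b=0.7):
        """Power-law cost estimating relationship (CER): cost = a * W^b."""
        return a * weight_lb ** b

    def process_based_cost(layup_hours, cure_hours, labor_rate=95.0):
        """Process-based estimate: sum of activity times times labor rate."""
        return (layup_hours + cure_hours) * labor_rate

    def hybrid_cost(parts):
        """Use the process-based estimate where process data exists,
        otherwise fall back to the weight-based CER."""
        total = 0.0
        for p in parts:
            if "layup_hours" in p:  # composite part with process knowledge
                total += process_based_cost(p["layup_hours"], p["cure_hours"])
            else:                   # metallic part, weight-based baseline
                total += weight_based_cost(p["weight_lb"])
        return total

    parts = [
        {"weight_lb": 120.0},                                         # fitting
        {"weight_lb": 80.0, "layup_hours": 40.0, "cure_hours": 8.0},  # skin panel
    ]
    print(round(hybrid_cost(parts), 2))
    ```

    As more manufacturing knowledge accumulates over design iterations, more parts carry process data and the estimate shifts from the weight-based baseline toward the higher-fidelity result.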

  18. Maintaining Balance: The Increasing Role of Energy Storage for Renewable Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stenclik, Derek; Denholm, Paul; Chalamala, Babu

    For nearly a century, global power systems have focused on three key functions: generating, transmitting, and distributing electricity as a real-time commodity. Physics requires that electricity generation always be in real-time balance with load, despite variability in load on time scales ranging from subsecond disturbances to multiyear trends. With the increasing role of variable generation from wind and solar, the retirement of fossil-fuel-based generation, and a changing consumer demand profile, grid operators are using new methods to maintain this balance.

  19. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  20. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  1. WARP3D-Release 10.8: Dynamic Nonlinear Analysis of Solids using a Preconditioned Conjugate Gradient Software Architecture

    NASA Technical Reports Server (NTRS)

    Koppenhoefer, Kyle C.; Gullerud, Arne S.; Ruggieri, Claudio; Dodds, Robert H., Jr.; Healy, Brian E.

    1998-01-01

    This report describes theoretical background material and commands necessary to use the WARP3D finite element code. WARP3D is under continuing development as a research code for the solution of very large-scale, 3-D solid models subjected to static and dynamic loads. Specific features in the code oriented toward the investigation of ductile fracture in metals include a robust finite strain formulation, a general J-integral computation facility (with inertia, face loading), an element extinction facility to model crack growth, nonlinear material models including viscoplastic effects, and the Gurson-Tvergaard dilatant plasticity model for void growth. The nonlinear, dynamic equilibrium equations are solved using an incremental-iterative, implicit formulation with full Newton iterations to eliminate residual nodal forces. The history integration of the nonlinear equations of motion is accomplished with Newmark's Beta method. A central feature of WARP3D involves the use of a linear-preconditioned conjugate gradient (LPCG) solver implemented in an element-by-element format to replace a conventional direct linear equation solver. This software architecture dramatically reduces both the memory requirements and CPU time for very large, nonlinear solid models since formation of the assembled (dynamic) stiffness matrix is avoided. Analyses thus exhibit the numerical stability for large time (load) steps provided by the implicit formulation coupled with the low memory requirements characteristic of an explicit code. In addition to the much lower memory requirements of the LPCG solver, the CPU time required for solution of the linear equations during each Newton iteration is generally one-half or less of the CPU time required for a traditional direct solver. All other computational aspects of the code (element stiffnesses, element strains, stress updating, element internal forces) are implemented in the element-by-element, blocked architecture.
This greatly improves vectorization of the code on uni-processor hardware and enables straightforward parallel-vector processing of element blocks on multi-processor hardware.
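
    The solver kernel described above can be sketched with a preconditioned conjugate gradient iteration. The sketch below uses a simple Jacobi (diagonal) preconditioner on a small dense matrix for clarity; the actual LPCG implementation never assembles the stiffness matrix, instead forming matrix-vector products element by element. The matrix and load vector are illustrative, not from WARP3D.

    ```python
    # Minimal Jacobi-preconditioned conjugate gradient sketch.
    # In LPCG, K @ p would be computed as a sum of element-level products
    # rather than from an assembled matrix.
    import numpy as np

    def pcg(K, f, tol=1e-10, max_iter=200):
        M_inv = 1.0 / np.diag(K)          # Jacobi preconditioner
        u = np.zeros_like(f)
        r = f - K @ u                     # initial residual
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Kp = K @ p
            alpha = rz / (p @ Kp)
            u += alpha * p
            r -= alpha * Kp
            if np.linalg.norm(r) < tol:
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return u

    K = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD "stiffness" matrix
    f = np.array([1.0, 2.0])
    u = pcg(K, f)
    print(np.allclose(K @ u, f))
    ```

    Because only matrix-vector products are needed, memory scales with the element blocks rather than with an assembled global matrix, which is the source of the storage savings claimed above.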

  2. Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.

    2001-01-01

    The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off the shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.

  3. Real-Time Monitoring of Scada Based Control System for Filling Process

    NASA Astrophysics Data System (ADS)

    Soe, Aung Kyaw; Myint, Aung Naing; Latt, Maung Maung; Theingi

    2008-10-01

    This paper presents a design for real-time monitoring of a filling system using Supervisory Control and Data Acquisition (SCADA). The monitoring of the production process is described in real time using Visual Basic .NET programming under Visual Studio 2005, without dedicated SCADA software. The software integrators are programmed to get the required information for the configuration screens. Simulation of components is shown on the computer screen using a parallel port between the computer and the filling devices. Programs for real-time simulation of the filling process, drawn from the pure drinking water industry, are provided.

  4. Photothermal damage is correlated to the delivery rate of time-integrated temperature

    NASA Astrophysics Data System (ADS)

    Denton, Michael L.; Noojin, Gary D.; Gamboa, B. Giovanna; Ahmed, Elharith M.; Rockwell, Benjamin A.

    2016-03-01

    Photothermal damage rate processes in biological tissues are usually characterized by a kinetics approach. This stems from experimental data that show how the transformation of a specified biological property of cells or biomolecule (plating efficiency for viability, change in birefringence, tensile strength, etc.) is dependent upon both time and temperature. However, kinetic methods require determination of kinetic rate constants and knowledge of substrate or product concentrations during the reaction. To better understand photothermal damage processes we have identified temperature histories of cultured retinal cells receiving minimum lethal thermal doses for a variety of laser and culture parameters. These "threshold" temperature histories are of interest because they inherently contain information regarding the fundamental thermal dose requirements for damage in individual cells. We introduce the notion of time-integrated temperature (Tint) as an accumulated thermal dose (ATD) with units of °C s. Damaging photothermal exposure raises the rate of ATD accumulation from that of the ambient (e.g. 37 °C) to one that correlates with cell death (e.g. 52 °C). The degree of rapid increase in ATD (ΔATD) during photothermal exposure depends strongly on the laser exposure duration and the ambient temperature.
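
    The accumulated-thermal-dose idea above reduces to a simple time integral of the temperature history. The sketch below computes Tint by the trapezoid rule and the excess dose ΔATD accrued during a laser exposure over the 37 °C ambient baseline; the Gaussian temperature pulse is hypothetical, used only to illustrate the bookkeeping.

    ```python
    # Accumulated thermal dose (ATD) sketch: Tint = integral of T(t) dt,
    # in units of degC*s. The temperature history below is illustrative.
    import numpy as np

    def atd(t, T):
        """Time-integrated temperature via the trapezoid rule (degC*s)."""
        return float(np.sum(0.5 * (T[1:] + T[:-1]) * np.diff(t)))

    t = np.linspace(0.0, 2.0, 201)             # 2 s history, seconds
    T_ambient = np.full_like(t, 37.0)          # ambient baseline, degC
    T_exposed = 37.0 + 15.0 * np.exp(-((t - 1.0) / 0.3) ** 2)  # laser pulse

    # Excess dose accrued above the ambient rate during the exposure
    delta_atd = atd(t, T_exposed) - atd(t, T_ambient)
    print(round(delta_atd, 2))
    ```

    The same ΔATD can be reached by a short, hot exposure or a long, mild one, which is why the delivery rate of the dose, not just its total, matters in the correlation reported above.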

  5. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next-generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality, and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single-chip camera solutions in the market, the Motorola MPC 823 and the LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between captures of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed, and the real-time operating system. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data-flow software architecture, testing, and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  6. Classification of DNA nucleotides with transverse tunneling currents

    NASA Astrophysics Data System (ADS)

    Nyvold Pedersen, Jonas; Boynton, Paul; Di Ventra, Massimiliano; Jauho, Antti-Pekka; Flyvbjerg, Henrik

    2017-01-01

    It has been theoretically suggested and experimentally demonstrated that fast and low-cost sequencing of DNA, RNA, and peptide molecules might be achieved by passing such molecules between electrodes embedded in a nanochannel. The experimental realization of this scheme faces major challenges, however. In realistic liquid environments, typical currents in tunneling devices are of the order of picoamps. This corresponds to only six electrons per microsecond, and this number affects the integration time required to do current measurements in real experiments. This limits the speed of sequencing, though current fluctuations due to Brownian motion of the molecule average out during the required integration time. Moreover, data acquisition equipment introduces noise, and electronic filters create correlations in time-series data. We discuss how these effects must be included in the analysis of, e.g., the assignment of specific nucleobases to current signals. As the signals from different molecules overlap, unambiguous classification is impossible with a single measurement. We argue that the assignment of molecules to a signal is a standard pattern classification problem and calculation of the error rates is straightforward. The ideas presented here can be extended to other sequencing approaches of current interest.
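
    The pattern-classification framing above admits a simple worked example. If two nucleobases produce time-averaged tunneling currents that are Gaussian with overlapping distributions, the minimum (Bayes) error for equal priors follows from the separation of the means in units of the standard deviation, and averaging N independent samples shrinks that standard deviation by sqrt(N). The mean currents and spread below are hypothetical.

    ```python
    # Bayes error for two overlapping Gaussian current signals (equal priors,
    # common sigma). Longer integration (larger N) reduces sigma by sqrt(N).
    # Current values are illustrative, not measured data.
    import math

    def bayes_error(mu0, mu1, sigma):
        """Misclassification probability at the optimal threshold."""
        d = abs(mu1 - mu0) / sigma
        return 0.5 * math.erfc(d / (2.0 * math.sqrt(2.0)))

    sigma_single = 2.0          # pA, single-measurement spread (hypothetical)
    for N in (1, 16, 64):
        err = bayes_error(10.0, 12.0, sigma_single / math.sqrt(N))
        print(N, round(err, 5))
    ```

    This quantifies the trade-off stated in the abstract: unambiguous assignment from a single measurement is impossible when the distributions overlap, but the error rate falls rapidly with integration time.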

  7. A comparison of artificial compressibility and fractional step methods for incompressible flow computations

    NASA Technical Reports Server (NTRS)

    Chan, Daniel C.; Darian, Armen; Sindir, Munir

    1992-01-01

    We have applied and compared two commonly used numerical methods for the solution of the Navier-Stokes equations, evaluating their efficiency and accuracy. The artificial compressibility method augments the continuity equation with a transient pressure term and allows one to solve the modified equations as a coupled system. Due to its implicit nature, one can take a large temporal integration step at the expense of higher memory requirements and larger operation counts per step. Meanwhile, the fractional step method splits the Navier-Stokes equations into a sequence of differential operators and integrates them in multiple steps. The memory requirement and operation count per time step are low; however, the restriction on the size of the time-marching step is more severe. To explore the strengths and weaknesses of these two methods, we used them to compute a two-dimensional driven cavity flow at Reynolds numbers of 100 and 1000. Three grid sizes, 41 x 41, 81 x 81, and 161 x 161, were used. The computations were considered converged after the L2-norm of the change in the dependent variables between two consecutive time steps had fallen below 10(exp -5).
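
    The stopping criterion used above can be sketched generically: march a solution in pseudo-time and stop once the L2-norm of the change between consecutive steps drops below the tolerance. The "solver" below is a simple relaxation toward a known steady state, standing in for either the coupled or the fractional-step update; the model problem and step size are illustrative.

    ```python
    # Steady-state convergence monitor: stop when the L2-norm of the change
    # between consecutive pseudo-time steps falls below tol (here 1e-5).
    import numpy as np

    def march_to_steady_state(u0, target, dt=0.1, tol=1e-5, max_steps=10000):
        u = u0.copy()
        for step in range(1, max_steps + 1):
            u_new = u + dt * (target - u)        # stand-in update rule
            if np.linalg.norm(u_new - u) < tol:  # L2-norm of the change
                return u_new, step
            u = u_new
        return u, max_steps

    u0 = np.zeros(5)
    target = np.linspace(1.0, 2.0, 5)
    u, steps = march_to_steady_state(u0, target)
    print(steps, np.allclose(u, target, atol=1e-3))
    ```

    Note that a small change between steps only implies a small residual when the step size is not pathologically small, which is one reason the two methods' differing stable step sizes complicate a direct cost comparison.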

  8. Innovative hyperchaotic encryption algorithm for compressed video

    NASA Astrophysics Data System (ADS)

    Yuan, Chun; Zhong, Yuzhuo; Yang, Shiqiang

    2002-12-01

    It is accepted that a stream cryptosystem, which implements encryption by selecting a few parts of the block data and header information of the compressed video stream, can achieve good real-time performance and flexibility. A chaotic random number generator, for example the logistic map, is a comparatively promising substitute, but it is easily attacked by nonlinear dynamic forecasting and geometric information extraction. In this paper, we present a hyperchaotic cryptography scheme to encrypt compressed video, which integrates the logistic map with a Z(2^32 - 1) field linear congruential algorithm to strengthen the security of mono-chaotic cryptography while maintaining the real-time performance and flexibility of chaotic sequence cryptography. It also integrates dissymmetrical public-key cryptography and implements encryption and identity authentication on control parameters at the initialization phase. In accordance with the importance of the data in the compressed video stream, encryption is performed in a layered scheme. In this hyperchaotic cryptography, the value and the updating frequency of the control parameters can be changed online to satisfy the requirements of network quality, processor capability, and security. The hyperchaotic cryptography proves robust security by cryptanalysis and shows good real-time performance and flexible implementation capability through arithmetic evaluation and testing.
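
    The combination idea, strengthening a mono-chaotic stream by mixing it with a linear congruential sequence, can be sketched as follows. The logistic-map parameter, LCG constants, and seeds below are hypothetical demonstration values, not the scheme's actual constants, and this toy XOR cipher is for illustration only, not for real use.

    ```python
    # Keystream sketch: logistic-map chaotic bytes XORed with LCG bytes,
    # then XORed with the plaintext. All constants are illustrative.

    def keystream(x0, seed, n):
        x, s = x0, seed
        out = []
        for _ in range(n):
            x = 3.99 * x * (1.0 - x)                # logistic map, chaotic regime
            s = (1103515245 * s + 12345) % (2**31)  # linear congruential step
            chaotic_byte = int(x * 256) & 0xFF
            out.append(chaotic_byte ^ (s & 0xFF))   # combine the two streams
        return bytes(out)

    def xor_cipher(data, x0=0.37, seed=42):
        """Encrypt/decrypt: XOR with a stream cipher is its own inverse."""
        ks = keystream(x0, seed, len(data))
        return bytes(d ^ k for d, k in zip(data, ks))

    msg = b"compressed video frame header"
    ct = xor_cipher(msg)
    print(xor_cipher(ct) == msg)
    ```

    Mixing in the congruential stream whitens the chaotic sequence, which is what blunts the nonlinear-forecasting attacks mentioned above against a purely logistic-map keystream.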

  9. Multipurpose silicon photonics signal processor core.

    PubMed

    Pérez, Daniel; Gasulla, Ivana; Crudgington, Lee; Thomson, David J; Khokhar, Ali Z; Li, Ke; Cao, Wei; Mashanovich, Goran Z; Capmany, José

    2017-09-21

    Integrated photonics changes the scaling laws of information and communication systems offering architectural choices that combine photonics with electronics to optimize performance, power, footprint, and cost. Application-specific photonic integrated circuits, where particular circuits/chips are designed to optimally perform particular functionalities, require a considerable number of design and fabrication iterations leading to long development times. A different approach inspired by electronic Field Programmable Gate Arrays is the programmable photonic processor, where a common hardware implemented by a two-dimensional photonic waveguide mesh realizes different functionalities through programming. Here, we report the demonstration of such reconfigurable waveguide mesh in silicon. We demonstrate over 20 different functionalities with a simple seven hexagonal cell structure, which can be applied to different fields including communications, chemical and biomedical sensing, signal processing, multiprocessor networks, and quantum information systems. Our work is an important step toward this paradigm. Integrated optical circuits today are typically designed for a few special functionalities and require complex design and development procedures. Here, the authors demonstrate a reconfigurable but simple silicon waveguide mesh with different functionalities.

  10. The importance of values in evidence-based medicine.

    PubMed

    Kelly, Michael P; Heath, Iona; Howick, Jeremy; Greenhalgh, Trisha

    2015-10-12

    Evidence-based medicine (EBM) has always required integration of patient values with 'best' clinical evidence. It is widely recognized that scientific practices and discoveries, including those of EBM, are value-laden. But to date, the science of EBM has focused primarily on methods for reducing bias in the evidence, while the role of values in the different aspects of the EBM process has been almost completely ignored. In this paper, we address this gap by demonstrating how a consideration of values can enhance every aspect of EBM, including: prioritizing which tests and treatments to investigate, selecting research designs and methods, assessing effectiveness and efficiency, supporting patient choice and taking account of the limited time and resources available to busy clinicians. Since values are integral to the practice of EBM, it follows that the highest standards of EBM require values to be made explicit, systematically explored, and integrated into decision making. Through 'values based' approaches, EBM's connection to the humanitarian principles upon which it was founded will be strengthened.

  11. Military applications of a cockpit integrated electronic flight bag

    NASA Astrophysics Data System (ADS)

    Herman, Robert P.; Seinfeld, Robert D.

    2004-09-01

    Converting the pilot's flight bag information from paper to electronic media is being performed routinely by commercial airlines for use with an on-board PC. This concept is now being further advanced with a new class of electronic flight bags (EFBs), recently put into commercial operation, which interfaces directly with major on-board avionics systems and has its own dedicated panel-mounted display. This display combines flight bag information with real-time aircraft performance and maintenance data. This concept of an integrated EFB, which is now being used by the commercial airlines as a level 1 certified system, needs to be explored for military applications. This paper describes a system which contains all the attributes of an electronic flight bag with the addition of interfaces linked to military aircraft missions, such as those for tankers, cargo haulers, search and rescue, and maritime aircraft, as well as GATM requirements. The adaptation of the integrated EFB to meet these military requirements is then discussed.

  12. The optical design of 3D ICs for smartphone and optro-electronics sensing module

    NASA Astrophysics Data System (ADS)

    Huang, Jiun-Woei

    2018-03-01

    Smartphones require image systems that fit within a limited space; the lenses currently used in smartphones are refractive, so the effective focal length is limited by the physical thickness of the phone. Other components, such as optro-electronic sensing chips, proximity optical sensors, and UV-index chips, must also be integrated into the smartphone's limited space. Because multiple lenses, proximity optical sensors, UV indexers, and other optro-electronic sensing chips must share the limited space of the CPU board in future smartphones, optro-electronic 3D ICs integrated with optical lenses or components may be a key technology for 3C products. A reflective lens design is fitted to CMOS imagers, proximity optical sensors, UV indexers, and other optro-electronic sensing chips based on 3D ICs. The reflective lens can achieve three times the effective focal length and is able to resolve small objects. The system can then be assembled and integrated into one 3D IC more easily.

  13. Development of Integrated Modular Avionics Application Based on Simulink and XtratuM

    NASA Astrophysics Data System (ADS)

    Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons

    2013-08-01

    This paper presents an integral approach to designing avionics applications that meets the software development and execution requirements of this application domain. Software design follows the model-based design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems like application partitioning, automatic code generation, real-time tasking, and interfacing, among others. This process is illustrated with an autopilot design test using a flight simulator.

  14. From generic pathways to ICT-supported horizontally integrated care: the SmartCare approach and convergence with future Internet assembly.

    PubMed

    Urošević, Vladimir; Mitić, Marko

    2014-01-01

    Successful service integration in policy and practice requires both technology innovation and service process innovation being pursued and implemented at the same time. The SmartCare project (partially EC-funded under the CIP ICT PSP Program) aims to achieve this through development, piloting and evaluation of ICT-based services, horizontally integrating health and social care in ten pilot regions, including the Kraljevo region in Serbia. The project has identified and adopted two generic highest-level common thematic pathways in the joint consolidation phase: integrated support for long-term care and integrated support after hospital discharge. A common set of standard functional specifications for an open ICT platform enabling the delivery of integrated care is being defined, around the challenges of data sharing, coordination and communication in these two formalized pathways. Implementation and system integration on the technology and architecture level are to be based on open standards, multivendor interoperability, and leveraging the evolving open-specification technology foundations developed in relevant projects across the European Research Area.

  15. Preservation and distribution of fungal cultures

    Treesearch

    Karen K. Nakasone; Stephen W. Peterson; Shung-Chang Jong

    2004-01-01

    Maintaining and preserving fungal cultures are essential elements of systematics and biodiversity studies. Because fungi are such a diverse group, several methods of cultivation and preservation are required to ensure the viability and morphological, physiological, and genetic integrity of the cultures over time. The cost and convenience of each method, however, also...

  16. Manhattan Country School: An Urban School in the Catskills

    ERIC Educational Resources Information Center

    Southern, Jane; Plummer, James

    1978-01-01

    This school integrates an outdoor, farm experience with an urban school curriculum. Elementary students spend increasing lengths of time working on a country farm as a mandatory requirement. Activities include farm chores, nature hikes, household chores, and practical crafts. Students come from a wide range of backgrounds and incomes. (MA)

  17. DEMONSTRATION OF A MULTI-SCALE INTEGRATED MONITORING AND ASSESSMENT IN NY/NJ HARBOR

    EPA Science Inventory

    The Clean Water Act (CWA) requires states and tribes to assess the overall quality of their waters (Sec 305(b)), determine whether that quality is changing over time, identify problem areas and management actions necessary to resolve those problems, and evaluate the effectiveness...

  18. 40 CFR 63.7330 - What are my monitoring requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... relative change in particulate matter loadings using a bag leak detection system according to the... integrity of the baghouse through quarterly visual inspections of the baghouse interior for air leaks; and... must at all times monitor the pressure drop and water flow rate using a CPMS according to the...

  19. Thinking in Three Dimensions: Leadership for Capacity Building, Sustainability, and Succession

    ERIC Educational Resources Information Center

    Byrne-Jimenez, Monica; Orr, Margaret Terry

    2012-01-01

    Urban schools often experience rapid turnover among teachers and leaders. Yet, research and practice highlight the importance of sustained leadership over time as an integral component of school improvement. Successful leadership requires principals who operate in multiple dimensions at once, moving from individual capacity to group empowerment,…

  20. Profiles of Inconsistent Knowledge in Children's Pathways of Conceptual Change

    ERIC Educational Resources Information Center

    Schneider, Michael; Hardy, Ilonca

    2013-01-01

    Conceptual change requires learners to restructure parts of their conceptual knowledge base. Prior research has identified the fragmentation and the integration of knowledge as 2 important component processes of knowledge restructuring but remains unclear as to their relative importance and the time of their occurrence during development. Previous…

  1. Applying Adverse Outcome Pathways (AOPs) to support Integrated Approaches to Testing and Assessment (IATA workshop report)

    EPA Science Inventory

    Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or alread...

  2. Development of near-infrared spectroscopy calibrations to measure quality characteristics in intact Brassicaceae germplasm

    USDA-ARS?s Scientific Manuscript database

    Determining seed quality parameters is an integral part of cultivar improvement and germplasm screening. However, quality tests are often time-consuming, seed-destructive, and can require large seed samples. This study describes the development of near-infrared spectroscopy (NIRS) calibrations to mea...

  3. Assimilating Digital Immigrants into High-Access Learning Environments

    ERIC Educational Resources Information Center

    Roseberry, Jason

    2016-01-01

    As schools have placed an increased emphasis on instructional technology, the amount of money spent on hardware and student devices in classrooms has increased significantly (Nagel, 2014). Because administrators are underestimating the instructional shift required for effective integration of these devices, they are not allocating enough time and…

  4. Choice in Quail Neonates: The Origins of Generalized Matching

    ERIC Educational Resources Information Center

    Schneider, Susan M.; Lickliter, Robert

    2010-01-01

    Although newborns have surprised scientists with their learning skills, proficiency on concurrent schedules of reinforcement requires (in effect) the ability to integrate and compare behavior-consequence relations over time. Can very young animals obey the quantitative relation that applies to such repeated choices, the generalized matching law?…

  5. Reliability program requirements for aeronautical and space system contractors

    NASA Technical Reports Server (NTRS)

    1987-01-01

    General reliability program requirements for NASA contracts involving the design, development, fabrication, test, and/or use of aeronautical and space systems including critical ground support equipment are prescribed. The reliability program requirements require (1) thorough planning and effective management of the reliability effort; (2) definition of the major reliability tasks and their place as an integral part of the design and development process; (3) planning and evaluating the reliability of the system and its elements (including effects of software interfaces) through a program of analysis, review, and test; and (4) timely status indication by formal documentation and other reporting to facilitate control of the reliability program.

  6. Towards efficient backward-in-time adjoint computations using data compression techniques

    DOE PAGES

    Cyr, E. C.; Shadid, J. N.; Wildey, T.

    2014-12-16

    In the context of a posteriori error estimation for nonlinear time-dependent partial differential equations, the state-of-the-practice is to use adjoint approaches which require the solution of a backward-in-time problem defined by a linearization of the forward problem. One of the major obstacles in the practical application of these approaches, we found, is the need to store, or recompute, the forward solution to define the adjoint problem and to evaluate the error representation. Our study considers the use of data compression techniques to approximate forward solutions employed in the backward-in-time integration. The development derives an error representation that accounts for the difference between the standard approach and the compressed approximation of the forward solution. This representation is algorithmically similar to the standard representation and only requires the computation of the quantity of interest for the forward solution and the data-compressed reconstructed solution (i.e. scalar quantities that can be evaluated as the forward problem is integrated). This approach is then compared with existing techniques, such as checkpointing and time-averaged adjoints. Lastly, we provide numerical results indicating the potential efficiency of our approach on a transient diffusion–reaction equation and on the Navier–Stokes equations. These results demonstrate memory compression ratios up to 450× while maintaining reasonable accuracy in the error estimates.
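    A minimal sketch of the general idea, assuming a truncated-SVD compressor and synthetic low-rank data (the paper's actual compression scheme and variable names are not specified here):

```python
import numpy as np

# Illustrative sketch (not the authors' code): compress a forward-solution
# snapshot matrix with a truncated SVD, then reconstruct snapshots during a
# backward-in-time adjoint sweep instead of storing them all in full.
rng = np.random.default_rng(0)
n_dof, n_steps, rank = 200, 100, 5

# Forward solution with low-rank structure (synthetic stand-in)
modes = rng.standard_normal((n_dof, rank))
coeffs = rng.standard_normal((rank, n_steps))
U_forward = modes @ coeffs          # columns = snapshots u(t_k)

# Compression: keep only `rank` singular triplets
U, s, Vt = np.linalg.svd(U_forward, full_matrices=False)
Uc, sc, Vc = U[:, :rank], s[:rank], Vt[:rank, :]

def snapshot(k):
    """Reconstruct the forward snapshot at step k from the compressed factors."""
    return Uc @ (sc * Vc[:, k])

# A backward sweep would call snapshot(k) as needed; here we check fidelity
err = max(np.linalg.norm(snapshot(k) - U_forward[:, k]) for k in range(n_steps))
ratio = U_forward.size / (Uc.size + sc.size + Vc.size)
```

    For truly low-rank data the reconstruction is exact to machine precision; in practice the compression error feeds into the extra term of the error representation the paper derives.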

  7. The DaveMLTranslator: An Interface for DAVE-ML Aerodynamic Models

    NASA Technical Reports Server (NTRS)

    Hill, Melissa A.; Jackson, E. Bruce

    2007-01-01

    It can take weeks or months to incorporate a new aerodynamic model into a vehicle simulation and validate the performance of the model. The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) has been proposed as a means to reduce the time required to accomplish this task by defining a standard format for typical components of a flight dynamic model. The purpose of this paper is to describe an object-oriented C++ implementation of a class that interfaces a vehicle subsystem model specified in DAVE-ML and a vehicle simulation. Using the DaveMLTranslator class, aerodynamic or other subsystem models can be automatically imported and verified at run-time, significantly reducing the elapsed time between receipt of a DAVE-ML model and its integration into a simulation environment. The translator performs variable initializations, data table lookups, and mathematical calculations for the aerodynamic build-up, and executes any embedded static check-cases for verification. The implementation is efficient, enabling real-time execution. Simple interface code for the model inputs and outputs is the only requirement to integrate the DaveMLTranslator as a vehicle aerodynamic model. The translator makes use of existing table-lookup utilities from the Langley Standard Real-Time Simulation in C++ (LaSRS++). The design and operation of the translator class is described and comparisons with existing, conventional, C++ aerodynamic models of the same vehicle are given.

  8. Advanced optical network architecture for integrated digital avionics

    NASA Astrophysics Data System (ADS)

    Morgan, D. Reed

    1996-12-01

    For the first time in the history of avionics, the network designer now has a choice in selecting the media that interconnects the sources and sinks of digital data on aircraft. Electrical designs are already giving way to photonics in application areas where the data rate times distance product is large or where special design requirements such as low weight or EMI considerations are critical. Future digital avionic architectures will increasingly favor the use of photonic interconnects as network data rates of one gigabit/second and higher are needed to support real-time operation of high-speed integrated digital processing. As the cost of optical network building blocks is reduced and as temperature-rugged laser sources mature, metal interconnects will be forced to retreat to applications spanning shorter and shorter distances. Although the trend is already underway, the widespread use of digital optics will first occur at the system level, where gigabit/second, real-time interconnects between sensors, processors, mass memories and displays separated by at least a few meters will be required. Photonic interconnects for inter-printed-wiring-board signalling across the backplane will eventually find application at gigabit/second rates, since signal degradation over copper traces occurs before one gigabit/second and 0.5 meters are reached. For the foreseeable future, however, metal interconnects will continue to be used to interconnect devices on printed wiring boards, since 5 gigabit/second signals can be sent over metal up to around 15 centimeters. Current-day applications of optical interconnects at the system level are described, and a projection of how advanced optical interconnect technology will be driven by the use of high-speed integrated digital processing on future aircraft is presented.
The recommended advanced network for application in the 2010 time frame is a fiber-based system with a signalling speed of around 2-3 gigabits per second. This switch-based unified network will interconnect sensors, displays, mass memory and controls and displays to computer modules within the processing complex. The characteristics of required building blocks needed for the future are described. These building blocks include the fiber, an optical switch, a laser-based transceiver, blind-mate connectors and an optical backplane.

  9. The fast and the slow of skilled bimanual rhythm production: parallel versus integrated timing.

    PubMed

    Krampe, R T; Kliegl, R; Mayr, U; Engbert, R; Vorberg, D

    2000-02-01

    Professional pianists performed 2 bimanual rhythms at a wide range of different tempos. The polyrhythmic task required the combination of 2 isochronous sequences (3 against 4) between the hands; in the syncopated rhythm task, successive keystrokes formed intervals of identical (isochronous) durations. At slower tempos, pianists relied on integrated timing control, merging successive intervals between the hands into a common reference frame. A timer-motor model is proposed, based on the concepts of rate fluctuation and the distinction between target specification and timekeeper execution processes, as a quantitative account of performance at slow tempos. At rapid rates, expert pianists used hand-independent, parallel timing control. As an alternative to a model based on a single central clock, the findings support a model of flexible control structures with multiple timekeepers that can work in parallel to accommodate specific task constraints.

  10. The reablement team's voice: a qualitative study of how an integrated multidisciplinary team experiences participation in reablement.

    PubMed

    Hjelle, Kari Margrete; Skutle, Olbjørg; Førland, Oddvar; Alvsvåg, Herdis

    2016-01-01

    Reablement is an early and time-limited home-based rehabilitation intervention that emphasizes intensive, goal-oriented, and multidisciplinary assistance for people experiencing functional decline. Few empirical studies to date have examined the experiences of the integrated multidisciplinary teams involved in reablement. Accordingly, the aim of this study was to explore and describe how an integrated multidisciplinary team in Norway experienced participation in reablement. An integrated multidisciplinary team consisting of health care professionals with a bachelor's degree (including a physiotherapist, a social educator, occupational therapists, and nurses) and home-based care personnel without a bachelor's degree (auxiliary nurses and nursing assistants) participated in focus group discussions. Qualitative content analysis was used to analyze the resulting data. Three main themes emerged from the participants' experiences with participating in reablement, including "the older adult's goals are crucial", "a different way of thinking and acting - a shift in work culture", and "a better framework for cooperation and application of professional expertise and judgment". The integrated multidisciplinary team and the older adults collaborated and worked in the same direction to achieve the person's valued goals. The team supported the older adults in performing activities themselves rather than completing tasks for them. To facilitate cooperation and application of professional expertise and judgment, common meeting times and meeting places for communication and supervision were necessary. Structural factors that promote integrated multidisciplinary professional decisions include providing common meeting times and meeting places as well as sufficient time to apply professional knowledge when supervising and supporting older persons in everyday activities. These findings have implications for practice and suggest future directions for improving health care services. 
The shift in work culture from static to dynamic service is time-consuming and requires politicians, community leaders, and health care systems to allocate the necessary time to support this approach to thinking and working.

  11. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  12. Does Temporal Integration Occur for Unrecognizable Words in Visual Crowding?

    PubMed Central

    Zhou, Jifan; Lee, Chia-Lin; Li, Kuei-An; Tien, Yung-Hsuan; Yeh, Su-Ling

    2016-01-01

    Visual crowding—the inability to see an object when it is surrounded by flankers in the periphery—does not block semantic activation: unrecognizable words due to visual crowding still generated robust semantic priming in subsequent lexical decision tasks. Based on the previous finding, the current study further explored whether unrecognizable crowded words can be temporally integrated into a phrase. By showing one word at a time, we presented Chinese four-word idioms with either a congruent or incongruent ending word in order to examine whether the three preceding crowded words can be temporally integrated to form a semantic context so as to affect the processing of the ending word. Results from both behavioral (Experiment 1) and Event-Related Potential (Experiment 2 and 3) measures showed congruency effect in only the non-crowded condition, which does not support the existence of unconscious multi-word integration. Aside from four-word idioms, we also found that two-word (modifier + adjective combination) integration—the simplest kind of temporal semantic integration—did not occur in visual crowding (Experiment 4). Our findings suggest that integration of temporally separated words might require conscious awareness, at least under the timing conditions tested in the current study. PMID:26890366

  13. Design of a Nanoscale, CMOS-Integrable, Thermal-Guiding Structure for Boolean-Logic and Neuromorphic Computation.

    PubMed

    Loke, Desmond; Skelton, Jonathan M; Chong, Tow-Chong; Elliott, Stephen R

    2016-12-21

    One of the requirements for achieving faster CMOS electronics is to mitigate the unacceptably large chip areas required to steer heat away from or, more recently, toward the critical nodes of state-of-the-art devices. Thermal-guiding (TG) structures can efficiently direct heat by "meta-materials" engineering; however, some key aspects of the behavior of these systems are not fully understood. Here, we demonstrate control of the thermal-diffusion properties of TG structures by using nanometer-scale, CMOS-integrable, graphene-on-silica stacked materials through finite-element-methods simulations. It has been shown that it is possible to implement novel, controllable, thermally based Boolean-logic and spike-timing-dependent plasticity operations for advanced (neuromorphic) computing applications using such thermal-guide architectures.

  14. Cytoscape: the network visualization tool for GenomeSpace workflows.

    PubMed

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013.

  15. Cytoscape: the network visualization tool for GenomeSpace workflows

    PubMed Central

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P.

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013. PMID:25165537

  16. Doing One Thing Well: Leveraging Microservices for NASA Earth Science Discovery and Access Across Heterogenous Data Sources

    NASA Astrophysics Data System (ADS)

    Baynes, K.; Gilman, J.; Pilone, D.; Mitchell, A. E.

    2015-12-01

    The NASA EOSDIS (Earth Observing System Data and Information System) Common Metadata Repository (CMR) is a continuously evolving metadata system that merges all existing capabilities and metadata from the EOS ClearingHOuse (ECHO) and the Global Change Master Directory (GCMD) systems. This flagship catalog has been developed with several key requirements: fast search and ingest performance; the ability to integrate heterogeneous external inputs and outputs; high availability and resiliency; scalability; and evolvability and expandability. This talk will focus on the advantages and potential challenges of tackling these requirements using a microservices architecture, which decomposes system functionality into smaller, loosely-coupled, individually-scalable elements that communicate via well-defined APIs. In addition, time will be spent examining specific elements of the CMR architecture and identifying opportunities for future integrations.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, David E.

    The process by which super-thermal ions slow down against background Coulomb potentials arises in many fields of study. In particular, this is one of the main mechanisms by which the mass and energy from the reaction products of fusion reactions is deposited back into the background. Many of these fields are characterized by length and time scales that are the same magnitude as the range and duration of the trajectory of these particles before they thermalize into the background. This requires numerical simulation of the slowing-down process through numerically integrating the velocities and energies of these particles. This paper first presents a simple introduction to the required plasma physics, followed by a description of the numerical integration used to integrate a beam of particles. This algorithm is unique in that it combines in an integrated manner both a second-order integration of the slowing down and the particle beam dispersion. These two processes are typically computed in isolation from each other. A simple test problem of a beam of alpha particles slowing down against an inert background of deuterium and tritium with varying properties of both the beam and the background illustrates the utility of the algorithm. This is followed by conclusions and appendices. The appendices define the notation, units, and several useful identities.
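    As a hedged illustration of the kind of integration involved (an assumed toy friction law, not the paper's algorithm), a second-order midpoint step for a particle obeying dv/dt = -ν·v can be written as:

```python
import math

# Illustrative sketch: second-order (midpoint) integration of a test particle
# slowing down against a background, dv/dt = -nu * v, whose exact solution is
# v(t) = v0 * exp(-nu * t). The constant slowing-down rate nu is a toy
# stand-in for the velocity-dependent Coulomb drag of a real plasma.
def slow_down(v0, nu, dt, n_steps):
    v = v0
    for _ in range(n_steps):
        v_half = v - 0.5 * dt * nu * v        # midpoint predictor
        v = v - dt * nu * v_half              # corrector
    return v

v0, nu = 1.0, 2.0
v_num = slow_down(v0, nu, dt=0.01, n_steps=100)   # integrate to t = 1
v_exact = v0 * math.exp(-nu * 1.0)
```

    The global error of the midpoint scheme shrinks quadratically with the step size, which is the "second-order" property the abstract refers to.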

  18. How much is not enough? Human resources requirements for primary health care: a case study from South Africa.

    PubMed

    Daviaud, Emmanuelle; Chopra, Mickey

    2008-01-01

    To quantify staff requirements in primary health care facilities in South Africa through an adaptation of the WHO workload indicator of staff needs tool. We use a model to estimate staffing requirements at primary health care facilities. The model integrates several empirically-based assumptions including time and type of health worker required for each type of consultation, amount of management time required, amount of clinical support required and minimum staff requirements per type of facility. We also calculate the number of HIV-related consultations per district. The model incorporates type of facility, monthly travelling time for mobile clinics, opening hours per week, yearly activity and current staffing and calculates the expected staffing per category of staff per facility and compares it to the actual staffing. Across all the districts there is either an absence of doctors visiting clinics or too few doctors to cover the opening times of community health centres. Overall the number of doctors is only 7% of the required amount. There is 94% of the required number of professional nurses but with wide variations between districts, with a few districts having excesses while most have shortages. The number of enrolled nurses is 60% of what it should be. There are 17% too few enrolled nurse assistants. Across all districts there is wide variation in staffing levels between facilities leading to inefficient use of professional staff. The application of an adapted WHO workload tool identified important human resource planning issues.
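    The core arithmetic of a workload-based staffing estimate can be sketched as follows (all numbers are hypothetical, not taken from the study):

```python
# Illustrative WISN-style calculation (assumed numbers): required staff =
# annual workload in minutes / available working minutes per worker per year.
def required_staff(consultations_per_year, minutes_per_consultation,
                   available_hours_per_year):
    workload_min = consultations_per_year * minutes_per_consultation
    return workload_min / (available_hours_per_year * 60.0)

# Hypothetical clinic: 30,000 nurse consultations/year at 15 min each,
# 1,500 available working hours per nurse per year.
needed = required_staff(30_000, 15, 1_500)
shortfall_pct = (1 - 3 / needed) * 100     # if only 3 nurses are in post
```

    The model in the study layers management time, clinical support, facility type, and opening hours on top of this basic ratio, but the comparison of expected versus actual staffing rests on the same division.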

  19. L-3 Com AVISYS civil aviation self-protection system

    NASA Astrophysics Data System (ADS)

    Carey, Jim

    2006-05-01

    In early 2004, L-3 Com AVISYS Corporation (hereinafter referred to as L-3 AVISYS or AVISYS) completed a contract for the integration and deployment of an advanced Infrared Countermeasures self-protection suite for a Head of State Airbus A340 aircraft. This initial L-3 AVISYS IRCM Suite was named WIPPS (Widebody Integrated Platform Protection System). The A340 WIPPS installation provisions were FAA certified with the initial deployment of the modified aircraft. WIPPS is unique in that it utilizes a dual integrated missile warning subsystem to produce a robust, multi-spectral, ultra-low false alarm rate threat warning capability. WIPPS utilizes the Thales MWS-20 Pulsed Doppler Radar Active MWS and the EADS AN/AAR-60 Ultraviolet Passive MWS. These MWS subsystems are integrated through an L-3 AVISYS Electronic Warfare Control Set (EWCS). The EWCS also integrates the WIPPS MWS threat warning information with the A340 flight computer data to optimize ALE-47 Countermeasure Dispensing System IR decoy dispensing commands, program selection and timing. WIPPS utilizes standard and advanced IR Decoys produced by ARMTEC Defense and Alloy Surfaces. WIPPS demonstrated that when IR decoy dispensing is controlled by threat range and time-to-go information provided by an Active MWS, unsurpassed self protection levels are achievable. Recognizing the need for high volume civil aviation protection, L-3 AVISYS configured a variant of WIPPS optimized for commercial airline reliability requirements, safety requirements, supportability and most importantly, affordability. L-3 AVISYS refers to this IRCM suite as CAPS (Commercial Airliner Protection System). CAPS has been configured for applications to all civil aircraft ranging from the small Regional Jets to the largest Wide-bodies. This presentation and paper will provide an overview of the initial WIPPS IRCM Suite and the important factors that were considered in defining the CAPS configuration.

  20. Ab initio molecular dynamics with nuclear quantum effects at classical cost: Ring polymer contraction for density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsalek, Ondrej; Markland, Thomas E., E-mail: tmarkland@stanford.edu

    Path integral molecular dynamics simulations, combined with an ab initio evaluation of interactions using electronic structure theory, incorporate the quantum mechanical nature of both the electrons and nuclei, which is essential to accurately describe systems containing light nuclei. However, path integral simulations have traditionally required a computational cost around two orders of magnitude greater than treating the nuclei classically, making them prohibitively costly for most applications. Here we show that the cost of path integral simulations can be dramatically reduced by extending our ring polymer contraction approach to ab initio molecular dynamics simulations. By using density functional tight binding as a reference system, we show that our ring polymer contraction scheme gives rapid and systematic convergence to the full path integral density functional theory result. We demonstrate the efficiency of this approach in ab initio simulations of liquid water and the reactive protonated and deprotonated water dimer systems. We find that the vast majority of the nuclear quantum effects are accurately captured using contraction to just the ring polymer centroid, which requires the same number of density functional theory calculations as a classical simulation. Combined with a multiple time step scheme using the same reference system, which allows the time step to be increased, this approach is as fast as a typical classical ab initio molecular dynamics simulation and 35× faster than a full path integral calculation, while still exactly including the quantum sampling of nuclei. This development thus offers a route to routinely include nuclear quantum effects in ab initio molecular dynamics simulations at negligible computational cost.
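    A schematic sketch of the centroid-contraction idea (with an illustrative stand-in potential, not the authors' implementation):

```python
import numpy as np

# Illustrative sketch (assumptions, not the authors' code): contract a ring
# polymer to its centroid. The expensive potential is then evaluated only at
# the centroid (one call, as in a classical simulation) instead of at every
# bead, while a cheap reference potential would handle bead-level differences.
rng = np.random.default_rng(1)
n_beads, n_atoms = 32, 8
beads = rng.standard_normal((n_beads, n_atoms, 3))   # bead positions

centroid = beads.mean(axis=0)                        # contraction to 1 "bead"

def expensive_force(x):
    """Stand-in for a costly DFT force call: a harmonic well (hypothetical)."""
    return -x

# One expensive call at the centroid, broadcast back to all 32 beads
f_contracted = np.broadcast_to(expensive_force(centroid), beads.shape)
```

    The payoff is visible in the call count: one expensive evaluation per step instead of one per bead, which is why centroid contraction costs the same number of electronic-structure calls as a classical run.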

  1. A Qualitative Readiness-Requirements Assessment Model for Enterprise Big-Data Infrastructure Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; McNair, Wade; Sukumar, Sreenivas R

    2014-01-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.

  2. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    NASA Astrophysics Data System (ADS)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.

  3. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools, and calibration capacity.

  4. Numerical solution methods for viscoelastic orthotropic materials

    NASA Technical Reports Server (NTRS)

    Gramoll, K. C.; Dillard, D. A.; Brinson, H. F.

    1988-01-01

    Numerical solution methods for viscoelastic orthotropic materials, specifically fiber-reinforced composite materials, are examined. The methods include classical lamination theory using time increments, direct solution of the Volterra integral, Zienkiewicz's linear Prony series method, and a new method called the Nonlinear Differential Equation Method (NDEM), which uses a nonlinear Prony series. The criteria used for comparison of the various methods include the stability of the solution technique, time step size stability, computer solution time length, and computer memory storage. The Volterra integral allowed the implementation of higher order solution techniques but had difficulties solving singular and weakly singular compliance functions. The Zienkiewicz solution technique, which requires the viscoelastic response to be modeled by a Prony series, works well for linear viscoelastic isotropic materials and small time steps. The new method, NDEM, uses a modified Prony series which allows nonlinear stress effects to be included and can be used with orthotropic nonlinear viscoelastic materials. The NDEM technique is shown to be accurate and stable for both linear and nonlinear conditions with minimal computer time.
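    A common recursive internal-variable update for a Prony-series material, sketched here under assumed moduli and relaxation times (this is the standard linear form, not the NDEM itself):

```python
import math

# Illustrative sketch: relaxation modulus E(t) = E_inf + sum_i E_i*exp(-t/tau_i).
# Each Prony term carries an internal stress h_i updated once per time step,
# so the full strain history never needs to be stored (the key advantage over
# direct evaluation of the Volterra integral).
def stress_history(strain, dt, E_inf, terms):
    """terms: list of (E_i, tau_i). Returns the stress at each step (linear case)."""
    h = [0.0] * len(terms)
    eps_prev, out = 0.0, []
    for eps in strain:
        d_eps = eps - eps_prev
        for i, (E_i, tau_i) in enumerate(terms):
            a = math.exp(-dt / tau_i)
            # decay the stored term, add midpoint-weighted strain increment
            h[i] = a * h[i] + E_i * math.exp(-dt / (2 * tau_i)) * d_eps
        out.append(E_inf * eps + sum(h))
        eps_prev = eps
    return out

# Step-strain relaxation test: stress should decay toward E_inf * eps0
eps0, dt = 0.01, 0.1
strain = [eps0] * 200                 # strain applied at step 0, then held
sigma = stress_history(strain, dt, E_inf=100.0, terms=[(50.0, 1.0)])
```

    Holding the strain constant reproduces a relaxation test: the stress starts near (E_inf + E_1)·eps0 and decays to E_inf·eps0 as the single Prony term relaxes.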

  5. Increasing Efficacy of Thrombectomy by Using Digital Subtraction Angiography to Confirm Stent Retriever Clot Integration

    PubMed Central

    Simon, Scott; Cooke, Jonathon

    2016-01-01

    Physicians performing thrombectomy for acute stroke have had increasing success as thrombectomy-specific devices have continued to evolve. As the devices evolve, so too must the techniques. The current generation of stent retriever thrombectomy devices requires five minutes of dwell time, regardless of the particularities of the case. We have noticed the presence of flow through the stent immediately prior to removal portends a lower chance of successful thrombus retrieval than when no flow is seen, regardless of dwell time. We hypothesize that interventionalists can use the presence or absence of flow to predict adequacy of seating time and decrease the number of deployments per case. This could significantly decrease time to recanalization by avoiding time-consuming, unsuccessful pulls. This is a technical report of a few cases of stent retriever thrombectomy. We propose using post-deployment digital subtraction angiography to confirm thrombus-device integration and increase the chance of thrombus removal. PMID:27182473

  6. Real-time optimizations for integrated smart network camera

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois

    2005-02-01

    We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC-based electronic card for image processing, and some network capabilities. The application detects events of interest in visual scenes, highlights alarms, and computes statistics. The system also produces meta-data information that can be shared with other cameras in a network. We describe the requirements of such a system and then show how its design is optimized to process and compress video in real time. Indeed, typical video-surveillance algorithms such as background differencing, tracking, and event detection must be highly optimized and simplified to run on this hardware. To achieve a good match between hardware and software in this light embedded system, the software management is written on top of the Java-based middleware specification established by the OSGi alliance. We can easily integrate software and hardware in complex environments thanks to the Java Real-Time specification for the virtual machine and some network- and service-oriented Java specifications (such as RMI and Jini). Finally, we report some outcomes and typical case studies for such a camera, such as counter-flow detection.
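    The background-differencing step mentioned above can be sketched minimally as an exponential running-average model with a fixed threshold; the pixel values, learning rate, and threshold below are illustrative, not the camera's actual parameters.

```python
def update_background(bg, frame, alpha=0.05):
    """Exponential running average: bg <- (1 - alpha) * bg + alpha * frame."""
    return [(1.0 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def motion_mask(bg, frame, threshold=20.0):
    """Flag pixels whose difference from the background exceeds the threshold."""
    return [abs(f - b) > threshold for b, f in zip(bg, frame)]
```

    A small `alpha` makes the background adapt slowly, which is why the abstract stresses simplification: this per-pixel update is cheap enough for an embedded card, at the cost of ghosting for slowly moving objects.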

  7. WGM Temperature Tracker

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.

    2012-01-01

    This software implements digital control of a WGM (whispering-gallery-mode) resonator temperature based on the dual-mode approach. It comprises one acquisition (dual-channel) module and three control modules. The interaction of the proportional-integral loops is designed in an original way that prevents the loops from fighting each other. The data processing is organized in parallel with the acquisition, which allows the computational overhead time to be suppressed or often completely avoided. WGM resonators potentially provide excellent optical references for metrology, clocks, spectroscopy, and other applications; however, extremely accurate (sub-microkelvin) temperature stabilization is required. This software allows one particularly advantageous method of such stabilization to be implemented, a method immune to a variety of effects that mask the temperature variation. WGM Temperature Tracker 2.3 (see figure) is a LabVIEW code developed for dual-mode temperature stabilization of WGM resonators. It has achieved temperature stabilization at the level of 200 nK with a one-second integration time, and 6 nK with a 10,000-second integration time, with an above-room-temperature set point. This software, in conjunction with the appropriate hardware, can be used as a noncryogenic temperature sensor/controller with sub-microkelvin sensitivity, which at the time of this reporting considerably outperforms the state of the art.
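    The two ingredients described, a dual-mode temperature readout and a proportional-integral loop, can be sketched as below; the thermal coefficients, gains, and plant model are invented for illustration and do not reflect the actual LabVIEW implementation.

```python
class PIController:
    """Proportional-integral loop; gains and time step are illustrative."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0
    def step(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def dual_mode_dT(df_1, df_2, k_1, k_2):
    """Temperature excursion inferred from the differential frequency shift of
    two modes with distinct (assumed linear) thermal coefficients k_1, k_2.
    Common-mode shifts that affect both modes equally cancel out."""
    return (df_1 - df_2) / (k_1 - k_2)
```

    The differential readout is what makes the dual-mode approach immune to effects that shift both modes together; only the difference of the two thermal coefficients enters the temperature estimate.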

  8. Integral Equations in Computational Electromagnetics: Formulations, Properties and Isogeometric Analysis

    NASA Astrophysics Data System (ADS)

    Lovell, Amy Elizabeth

    Computational electromagnetics (CEM) provides numerical methods to simulate electromagnetic waves interacting with their environment. Boundary integral equation (BIE) based methods, which solve Maxwell's equations in homogeneous or piecewise-homogeneous media, are both efficient and accurate, especially for scattering and radiation problems. The development and analysis of electromagnetic BIEs have been very active topics in CEM research. Indeed, there are still many open problems that need to be addressed or further studied. A short and important list includes (1) closed-form or quasi-analytical solutions to time-domain integral equations, (2) catastrophic cancellations at low frequencies, (3) ill-conditioning due to high mesh density, multi-scale discretization, and growing electrical size, and (4) lack of flexibility due to re-meshing when an increasing number of forward numerical simulations is involved in the electromagnetic design process. This dissertation addresses several of these aspects of boundary integral equations in computational electromagnetics. The first contribution of the dissertation is to construct quasi-analytical solutions to time-dependent boundary integral equations using a direct approach. Direct inverse Fourier transform of the time-harmonic solutions is not stable due to the non-existence of the inverse Fourier transform of spherical Hankel functions. Using new addition theorems for the time-domain Green's function and dyadic Green's functions, time-domain integral equations governing transient scattering problems of spherical objects are solved directly and stably for the first time. Additionally, the direct time-dependent solutions, together with the newly proposed time-domain dyadic Green's functions, can enrich the time-domain spherical multipole theory. The second contribution is to create a novel method of moments (MoM) framework to solve electromagnetic boundary integral equations on subdivision surfaces. The aim is to avoid the meshing and re-meshing stages and thereby accelerate the design process when the geometry needs to be updated. Two schemes to construct basis functions on the subdivision surface have been explored: one uses div-conforming basis functions, and the other creates a rigorous isogeometric approach based on subdivision basis functions with better smoothness properties. This new framework provides better accuracy, more stability, and higher flexibility. The third contribution is a new stable integral equation formulation that avoids catastrophic cancellations due to low-frequency breakdown or dense-mesh breakdown. Many of the conventional integral equations and their associated post-processing operations suffer from numerical catastrophic cancellations, which can lead to ill-conditioning of the linear systems or serious accuracy problems; examples include low-frequency breakdown and dense-mesh breakdown. Another instability may come from nontrivial null spaces of the integral operators involved, which may be related to spurious resonance or topology breakdown. This dissertation presents several sets of new boundary integral equations and studies their analytical properties. The first proposed formulation leads to scalar boundary integral equations in which only scalar unknowns are involved. Besides the requirements of gaining more stability and better conditioning in the resulting linear systems, multi-physics simulation is another driving force for new formulations; scalar- and vector-potential (rather than electromagnetic-field) based formulations have been studied for this purpose. These contributions address different stages of boundary integral equations in an almost independent manner: e.g., the isogeometric analysis framework can be used to solve different boundary integral equations, and time-dependent solutions to integral equations from different formulations can be achieved through the same proposed methodology.
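    For concreteness, one conventional formulation whose low-frequency behavior motivates this line of work is the mixed-potential electric-field integral equation (EFIE) for a perfectly conducting surface S (standard textbook form with an e^{jωt} convention, not a formulation introduced by the dissertation):

```latex
\left.\mathbf{E}^{\mathrm{inc}}(\mathbf{r})\right|_{\tan}
= \left.\Big[\, j\omega\mu \int_{S} G(\mathbf{r},\mathbf{r}')\,\mathbf{J}(\mathbf{r}')\,dS'
\;+\; \frac{j}{\omega\varepsilon}\,\nabla\!\int_{S} G(\mathbf{r},\mathbf{r}')\,\nabla'\!\cdot\mathbf{J}(\mathbf{r}')\,dS' \Big]\right|_{\tan},
\qquad G(\mathbf{r},\mathbf{r}') = \frac{e^{-jkR}}{4\pi R},\quad R = |\mathbf{r}-\mathbf{r}'|.
```

    The vector-potential term scales with ω while the scalar-potential term scales with 1/ω, so at low frequencies the two contributions differ by orders of magnitude and their sum is dominated by cancellation; this is the catastrophic low-frequency breakdown the abstract refers to.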

  9. Compute Element and Interface Box for the Hazard Detection System

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Khanoyan, Garen; Stern, Ryan A.; Some, Raphael R.; Bailey, Erik S.; Carson, John M.; Vaughan, Geoffrey M.; Werner, Robert A.; Salomon, Phil M.; Martin, Keith E.; hide

    2013-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is building a sensor that enables a spacecraft to evaluate autonomously a potential landing area to generate a list of hazardous and safe landing sites. It will also provide navigation inputs relative to those safe sites. The Hazard Detection System Compute Element (HDS-CE) box combines a field-programmable gate array (FPGA) board for sensor integration and timing, with a multicore computer board for processing. The FPGA does system-level timing and data aggregation, and acts as a go-between, removing the real-time requirements from the processor and labeling events with a high resolution time. The processor manages the behavior of the system, controls the instruments connected to the HDS-CE, and services the "heavy lifting" computational requirements for analyzing the potential landing spots.

  10. IR-detection modules from SWIR to VLWIR: performance and applications

    NASA Astrophysics Data System (ADS)

    Breiter, R.; Wendler, J.; Lutz, H.; Rutzinger, S.; Hofmann, K.; Ziegler, J.

    2009-05-01

    The predominant spectral bands for IR applications are the 3-5 μm MWIR and the 8-10 μm LWIR. AIM has covered these bands for many years with a mature MCT technology. For weight, size, power consumption and, not least, cost reduction, detection modules for these applications are moving to a pitch of 15 μm. In both bands this is still a good match with respect to optical blur spot size and detector performance. Due to their compact design, the modules are equally well suited to new programs and to retrofits of 1st GEN systems. Typical configurations at AIM are a 640x512 MWIR module achieving an NETD < 25 mK @ F/4.6 and 5 ms integration time (equivalent to half-well-fill conditions) and an LWIR version with NETD < 30 mK @ F/2 and 110 μs integration time. The modules are available either with an integral rotary cooler for portable applications that require minimum cooling power, or with a split linear cooler whose flexure-bearing compressor provides long lifetimes with an MTTF > 20,000 h, as required e.g. for warning sensors in 24/7 operation. A new field of applications supplied by AIM is the short-wave infrared (SWIR). The major advantage of MCT, its tunable bandgap (i.e., cut-off wavelength), allows various requirements to be matched. Driven so far mainly by spaceborne programs, a 1024x256 SWIR focal plane array (FPA) integrated detector cooler assembly (IDCA) with flexure-bearing cooler and pulse-tube cold finger was developed. The same technology, including a charge transimpedance amplifier for the low flux in the SWIR, is available in a half-TV 384x288 configuration. The read-out integrated circuit (ROIC) provides, among other features, 8 outputs for high frame rates up to 450 Hz. For spaceborne commercial as well as military applications, such as sensors in ballistic missile defense systems, AIM is developing MCT-based very-long-wave (VLWIR) detectors with a cut-off wavelength > 15 μm. The current status of and trends in AIM's IR detection modules, sensitive in spectral ranges from short-wave IR (SWIR) to very-long-wave IR (VLWIR), are summarized together with the requirements of the demanding applications.

  11. Delivering Integrated Care to the Frail Elderly: The Impact on Professionals’ Objective Burden and Job Satisfaction

    PubMed Central

    Huijsman, Robbert; de Kuyper, Ruben Dennis Maurice; Fabbricotti, Isabelle Natalina

    2016-01-01

    Background: The impact of integrated working on professionals’ objective burden and job satisfaction was examined. An evidence-based intervention targeting frail elderly patients was implemented in the Walcheren region of the Netherlands in 2010. The intervention involved the primary care practice as a single entry point, and included proactive frailty screening, a comprehensive assessment of patient needs, case management, multidisciplinary teams, care plans and protocols, task delegation and task specialisation, a shared information system, a geriatric care network and integrated funding. Methods: A quasi-experimental design with a control group was used. Data regarding objective burden involved the professionals’ time investments over a 12-month period, collected from patient medical records (n = 377), time registrations, transcripts of meetings and patient questionnaires. Data regarding job satisfaction were collected using questionnaires that were distributed to primary care and home-care professionals (n = 180) after the intervention’s implementation. Within- and between-groups comparisons and regression analyses were performed. Results: Non-patient-related time was significantly higher in the experimental group than in the control group, whereas patient-related time did not differ. Job satisfaction remained unaffected by the intervention. Conclusion and Discussion: Integrated working is likely to increase objective burden, as it requires professionals to perform additional activities that are largely unrelated to actual patient care. Implications for research and practice are discussed. [Current Controlled Trials ISRCTN05748494]. PMID:28413364

  12. Integrated CMOS photodetectors and signal processing for very low-level chemical sensing with the bioluminescent bioreporter integrated circuit

    NASA Technical Reports Server (NTRS)

    Bolton, Eric K.; Sayler, Gary S.; Nivens, David E.; Rochelle, James M.; Ripp, Steven; Simpson, Michael L.

    2002-01-01

    We report an integrated CMOS microluminometer optimized for the detection of low-level bioluminescence as part of the bioluminescent bioreporter integrated circuit (BBIC). This microluminometer improves on previous devices through careful management of the sub-femtoampere currents, both signal and leakage, that flow in the front-end processing circuitry. In particular, the photodiode is operated with a reverse bias of only a few mV, requiring special attention to the reset circuitry of the current-to-frequency converter (CFC) that forms the front-end circuit. We report a sub-femtoampere leakage current and a minimum detectable signal (MDS) of 0.15 fA (1510 s integration time) using a room-temperature 1.47 mm2 CMOS photodiode. This microluminometer can detect luminescence from as few as 5000 fully induced Pseudomonas fluorescens 5RL bacterial cells. © 2002 Elsevier Science B.V. All rights reserved.
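    To put the quoted figures in perspective, the minimum detectable signal of 0.15 fA integrated for 1510 s corresponds to roughly 1.4 million electrons of accumulated charge. The short sketch below performs that conversion; the only constant used is the standard SI elementary charge.

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def electrons_collected(current_amps, integration_seconds):
    """Number of electrons corresponding to a DC current integrated over time:
    N = I * t / e."""
    return current_amps * integration_seconds / E_CHARGE
```

    This illustrates the general trade-off the abstract relies on: for a current-integrating front end, halving the detectable current requires roughly doubling the integration time, provided leakage stays below the signal.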

  13. Ver-i-Fus: an integrated access control and information monitoring and management system

    NASA Astrophysics Data System (ADS)

    Thomopoulos, Stelios C.; Reisman, James G.; Papelis, Yiannis E.

    1997-01-01

    This paper describes the Ver-i-Fus Integrated Access Control and Information Monitoring and Management (IAC-I2M) system that INTELNET Inc. has developed. The Ver-i-Fus IAC-I2M system has been designed to meet the most stringent security and information monitoring requirements while allowing two-way communication between the user and the system. The system offers a flexible interface that permits the integration of practically any sensing device, or combination of sensing devices, including a live-scan fingerprint reader, thus providing biometric verification for enhanced security. Different configurations of the system provide solutions to different sets of access control problems. The re-configurable hardware interface, together with biometric verification and a flexible interface that allows Ver-i-Fus to be integrated with an MIS, provides an integrated solution to security, time and attendance, labor monitoring, production monitoring, and payroll applications.

  14. A fast low-power optical memory based on coupled micro-ring lasers

    NASA Astrophysics Data System (ADS)

    Hill, Martin T.; Dorren, Harmen J. S.; de Vries, Tjibbe; Leijtens, Xaveer J. M.; den Besten, Jan Hendrik; Smalbrugge, Barry; Oei, Yok-Siang; Binsma, Hans; Khoe, Giok-Djan; Smit, Meint K.

    2004-11-01

    The increasing speed of fibre-optic-based telecommunications has focused attention on high-speed optical processing of digital information. Complex optical processing requires a high-density, high-speed, low-power optical memory that can be integrated with planar semiconductor technology for buffering of decisions and telecommunication data. Recently, ring lasers with extremely small size and low operating power have been made, and we demonstrate here a memory element constructed by interconnecting these microscopic lasers. Our device occupies an area of 18 × 40 µm² on an InP/InGaAsP photonic integrated circuit, and switches within 20 ps with 5.5 fJ optical switching energy. Simulations show that the element has the potential for much smaller dimensions and switching times. Large numbers of such memory elements can be densely integrated and interconnected on a photonic integrated circuit: fast digital optical information processing systems employing large-scale integration should now be viable.

  15. Observations, Ideas, and Opinions: Systems Engineering and Integration for Return to Flight

    NASA Technical Reports Server (NTRS)

    Gafka, George K.

    2006-01-01

    This presentation addresses project management and systems engineering and integration challenges for return to flight, focusing on the Thermal Protection System Tile Repair Project (TRP). The program documentation philosophy, communication with the program, requirements flow and philosophy, and planned deliverables and documentation are outlined. The development of TRP 'use-as-is' analytical tools is also highlighted, and emphasis is placed on the use of flight history to assess pre-flight and real-time risk. Additionally, an overview is provided of the repair procedure, including an outline of the logistics deployment chart.

  16. Real Time Maintenance Approval and Required IMMT Coordination

    NASA Technical Reports Server (NTRS)

    Burchell, S.

    2016-01-01

    Payloads are assessed for nominal operations. Payload Developers have the option of performing a maintenance hazard assessment (MHA) for potential maintenance activities. When POIC (Payload Operations and Integration Center) Safety reviews an OCR calling for a maintenance procedure, we cannot approve it without a MHA. If no MHA exists, we contact MER (Mission Evaluation Room) Safety. Depending on the nature of the problem, MER Safety has the option to: Analyze and grant approval themselves; Direct the payload back to the ISRP (Integrated Safety Review Panel); Direct the payload to the IMMT (Increment Mission Management Team).

  17. Operational Overview for UAS Integration in the NAS Project Flight Test Series 3

    NASA Technical Reports Server (NTRS)

    Valkov, Steffi B.; Sternberg, Daniel; Marston, Michael

    2018-01-01

    The National Aeronautics and Space Administration Unmanned Aircraft Systems Integration in the National Airspace System Project has conducted a series of flight tests intended to support the reduction of barriers that prevent unmanned aircraft from flying without the required waivers from the Federal Aviation Administration. The 2015 Flight Test Series 3 supported two separate test configurations. The first configuration investigated the timing of Detect and Avoid alerting thresholds using a radar-equipped unmanned vehicle and multiple live intruders flown at varying encounter geometries.

  18. A transition from using multi‐step procedures to a fully integrated system for performing extracorporeal photopheresis: A comparison of costs and efficiencies

    PubMed Central

    Leblond, Veronique; Ouzegdouh, Maya; Button, Paul

    2017-01-01

    Abstract Introduction The Pitié Salpêtrière Hospital Hemobiotherapy Department, Paris, France, has been providing extracorporeal photopheresis (ECP) since November 2011, and started using the Therakos® CELLEX® fully integrated system in 2012. This report summarizes our single‐center experience of transitioning from multi‐step ECP procedures to the fully integrated ECP system, considering the capacity and cost implications. Materials and Methods The total number of ECP procedures performed in 2011–2015 was derived from department records. The time taken to complete a single ECP treatment using a multi‐step technique and using the fully integrated system at our department was assessed. Resource costs (2014 €) were obtained for materials and calculated for the personnel time required. Time‐driven activity‐based costing methods were applied to provide a cost comparison. Results The number of ECP treatments per year increased from 225 (2012) to 727 (2015). A single multi‐step procedure took 270 min, compared to 120 min for the fully integrated system. The total calculated per‐session cost of performing ECP using the multi‐step procedure was greater than with the CELLEX® system (€1,429.37 and €1,264.70 per treatment, respectively). Conclusions For hospitals considering a transition from multi‐step procedures to fully integrated methods for ECP where cost may be a barrier, time‐driven activity‐based costing should be utilized to gain a more comprehensive understanding of the full benefit that such a transition offers. The example from our department confirmed not just cost and time savings: the time efficiencies gained with CELLEX® allow for more patient treatments per year. PMID:28419561
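    Time-driven activity-based costing, as applied here, reduces each session to staff time multiplied by a per-minute capacity cost rate, plus materials. The sketch below mirrors the 270- versus 120-minute comparison, but the rate and materials figures are invented for illustration, not the department's actual 2014 costs.

```python
def tdabc_cost(staff_minutes, rate_per_minute, materials_cost):
    """Time-driven ABC: session cost = staff time x capacity cost rate + materials."""
    return staff_minutes * rate_per_minute + materials_cost

# Hypothetical per-session comparison (illustrative euro figures only)
multi_step = tdabc_cost(270, 1.5, 800.0)   # longer staff time, cheaper consumables
integrated = tdabc_cost(120, 1.5, 900.0)   # shorter staff time, costlier kit
```

    As in the study, the integrated procedure can come out cheaper overall even with a more expensive kit, because 150 minutes of freed staff capacity is costed explicitly rather than ignored.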

  19. Exact event-driven implementation for recurrent networks of stochastic perfect integrate-and-fire neurons.

    PubMed

    Taillefumier, Thibaud; Touboul, Jonathan; Magnasco, Marcelo

    2012-12-01

    In vivo cortical recordings reveal that indirectly driven neural assemblies can produce reliable and temporally precise spiking patterns in response to stereotyped stimulation. This suggests that, despite being fundamentally noisy, the collective activity of neurons conveys information through temporal coding. Stochastic integrate-and-fire models delineate a natural theoretical framework for studying the interplay of intrinsic neural noise and spike-timing precision. However, there are inherent difficulties in simulating their networks' dynamics in silico with standard numerical discretization schemes. Indeed, the well-posedness of the evolution of such networks requires temporally ordering every neuronal interaction, whereas the order of interactions is highly sensitive to the random variability of spiking times. Here, we address these issues for perfect stochastic integrate-and-fire neurons by designing an exact event-driven algorithm for the simulation of recurrent networks with delayed Dirac-like interactions. In addition to being exact from the mathematical standpoint, our proposed method is highly efficient numerically. We envision that our algorithm is especially suited to studying the emergence of polychronized motifs in networks evolving under spike-timing-dependent plasticity with intrinsic noise.
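    The core idea is that between delta-pulse interactions a perfect integrate-and-fire trajectory is known in closed form, so spike times can be computed exactly and processed through an ordered event queue instead of a fixed time grid. The sketch below is an illustrative reimplementation with deterministic constant drift, not the authors' algorithm; the version-stamp scheme for discarding stale spike predictions is one common way to keep the event ordering exact.

```python
import heapq

def simulate_pif(n, drift, threshold, weights, delay, t_max):
    """Event-driven simulation of perfect (non-leaky) integrate-and-fire
    neurons: dV/dt = drift between events, V -> V + w on a delayed pulse,
    reset to 0 on reaching threshold.  Predicted spike times are invalidated
    via version stamps whenever a pulse alters a trajectory."""
    v = [0.0] * n          # membrane potentials, valid at times t_upd[i]
    t_upd = [0.0] * n      # last update time per neuron
    version = [0] * n      # stamps used to discard stale predictions
    heap, spikes = [], []  # event heap entries: (time, kind, neuron, payload)

    def predict(i, now):   # closed-form next threshold crossing
        t_spike = now + max(0.0, threshold - v[i]) / drift
        heapq.heappush(heap, (t_spike, 0, i, version[i]))   # kind 0: spike

    def advance(i, now):   # integrate the linear trajectory up to `now`
        v[i] += drift * (now - t_upd[i])
        t_upd[i] = now

    for i in range(n):
        predict(i, 0.0)
    while heap:
        t, kind, i, payload = heapq.heappop(heap)
        if t > t_max:
            break
        if kind == 0:                      # predicted threshold crossing
            if payload != version[i]:
                continue                   # stale: a pulse intervened
            spikes.append((t, i))
            advance(i, t)
            v[i] = 0.0                     # reset
            version[i] += 1
            predict(i, t)
            for j, w in enumerate(weights[i]):
                if w:                      # schedule delayed Dirac pulses
                    heapq.heappush(heap, (t + delay, 1, j, w))
        else:                              # kind 1: pulse of weight `payload`
            advance(i, t)
            v[i] += payload
            version[i] += 1                # old prediction is now invalid
            predict(i, t)
    return spikes
```

    Because every state change happens at an exactly computed event time, there is no discretization error: a delayed excitatory pulse advances the recipient's next spike to precisely the new closed-form crossing time.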

  20. A facility birth can be the time to start family planning: postpartum intrauterine device experiences from six countries.

    PubMed

    Pfitzer, Anne; Mackenzie, Devon; Blanchard, Holly; Hyjazi, Yolande; Kumar, Somesh; Lisanework Kassa, Serawit; Marinduque, Bernabe; Mateo, Marie Grace; Mukarugwiro, Beata; Ngabo, Fidele; Zaeem, Shabana; Zafar, Zonobia; Smith, Jeffrey Michael

    2015-06-01

    Initiation of family planning at the time of birth is opportune, since few women in low-resource settings who give birth in a facility return for further care. Postpartum family planning (PPFP) and postpartum intrauterine device (PPIUD) services were integrated into maternal care in six low- and middle-income countries, applying an insertion technique developed in Paraguay. Facilities with high delivery volume were selected to integrate PPFP/PPIUD services into routine care. Effective PPFP/PPIUD integration requires training and mentoring those providers assisting women at the time of birth. Ongoing monitoring generated data for advocacy. The percentages of PPIUD acceptors ranged from 2.3% of women counseled in Pakistan to 5.8% in the Philippines. Rates of complications among women returning for follow-up were low. Expulsion rates were 3.7% in Pakistan, 3.6% in Ethiopia, and 1.7% in Guinea and the Philippines. Infection rates did not exceed 1.3%, and three countries recorded no cases. Offering PPFP/PPIUD at birth improves access to contraception. Copyright © 2015. Published by Elsevier Ireland Ltd.

  1. CASPER Version 2.0

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Rabideau, Gregg; Tran, Daniel; Knight, Russell; Chouinard, Caroline; Estlin, Tara; Gaines, Daniel; Clement, Bradley; Barrett, Anthony

    2007-01-01

    CASPER is designed to perform automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. In contradistinction to the traditional concept of batch planning followed by execution, CASPER implements a concept of continuous planning and replanning in response to unanticipated changes (including failures), integrated with execution. Improvements over other, similar software that have been incorporated into CASPER version 2.0 include an enhanced executable interface to facilitate integration with a wide range of execution software systems and supporting software libraries; features to support execution while reasoning about urgency, importance, and impending deadlines; features that enable accommodation to a wide range of computing environments that include various central processing units and random- access-memory capacities; and improved generic time-server and time-control features.

  2. Packetized video on MAGNET

    NASA Astrophysics Data System (ADS)

    Lazar, Aurel A.; White, John S.

    1986-11-01

    Theoretical analysis of an ILAN model of MAGNET, an integrated network testbed developed at Columbia University, shows that the bandwidth freed up by video and voice calls during periods of little movement in the images and silence periods in the speech signals could be utilized efficiently for graphics and data transmission. Based on these investigations, an architecture supporting adaptive protocols that are dynamically controlled by the requirements of a fluctuating load and a changing user environment has been advanced. To further analyze the behavior of the network, a real-time packetized video system has been implemented. This system is embedded in the real-time multimedia workstation EDDY, which integrates video, voice, and data traffic flows. Protocols supporting variable-bandwidth, constant-quality packetized video transport are described in detail.

  3. Observation of emission process in hydrogen-like nitrogen Z-pinch discharge with time integrated soft X-ray spectrum pinhole image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakai, Y.; Kumai, H.; Nakanishi, Y.

    2013-02-15

    The emission spectra of the hydrogen-like nitrogen Balmer line at a wavelength of 13.4 nm in capillary Z-pinch discharge plasma are experimentally examined. Ionization to fully stripped nitrogen at the pinch maximum, and subsequent rapid expansion cooling, are required to establish the population inversion between the principal quantum numbers n = 2 and n = 3. The ionization and recombination processes, with estimated plasma parameters, are evaluated by utilizing a time-integrated spectrum pinhole image containing radial spatial information. A cylindrical capillary plasma is pinched by a triangular pulsed current with a peak amplitude of 50 kA and a pulse width of 50 ns.

  4. Autonomous Preference-Aware Information Services Integration for High Response in Integrated Faded Information Field Systems

    NASA Astrophysics Data System (ADS)

    Lu, Xiaodong; Mori, Kinji

    Market and users' requirements have been rapidly changing and diversifying. Under these heterogeneous and dynamic conditions, not only the system structure itself but also the accessible information services change constantly. To cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed, an agent-based distributed information service system architecture. In the case of a mono-service request, the system is designed to improve users' access time and preserve load balancing through the information structure. However, as interdependent multi-service requests increase, adaptability and timeliness have to be assured by the system. In this paper, the relationship that exists among the correlated services and the users' preferences for separate and integrated services is clarified. Based on these factors, an autonomous preference-aware information service integration technology that provides one-stop service for users' multi-service requests is proposed. Compared to the conventional system, we show that the proposed technology is able to reduce the total access time.

  5. Movement Integration and the One-Target Advantage.

    PubMed

    Hoffmann, Errol R

    2017-01-01

    The 1-target advantage (OTA) has been found to occur in many circumstances and the current best explanation for this phenomenon is that of the movement integration hypothesis. The author's purpose is twofold: (a) to model the conditions under which there is integration of the movement components in a 2-component movement and (b) to study the factors that determine the magnitude of the OTA for both the first and second component of a 2-component movement. Results indicate that integration of movement components, where times for one component are affected by the geometry of the other component, occurs when 1 of the movement components is made ballistically. Movement components that require ongoing visual control show only weak interaction with the second component, whereas components made ballistically always show movement time dependence on first and second component amplitude, independent of location within the sequence. The OTA is present on both the first and second components of the movement, with a magnitude that is dependent on whether the components are performed ballistically or with ongoing visual control and also on the amplitudes and indexes of difficulty of the component movements.

  6. Automation and integration of components for generalized semantic markup of electronic medical texts.

    PubMed Central

    Dugan, J. M.; Berrios, D. C.; Liu, X.; Kim, D. K.; Kaizer, H.; Fagan, L. M.

    1999-01-01

    Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models. PMID:10566457

  7. Present capabilities and future requirements for computer-aided geometric modeling in the design and manufacture of gas turbine

    NASA Technical Reports Server (NTRS)

    Caille, E.; Propen, M.; Hoffman, A.

    1984-01-01

    Gas turbine engine design requires the ability to rapidly develop complex structures which are subject to severe thermal and mechanical operating loads. As in all facets of the aerospace industry, engine designs are constantly driving towards increased performance, higher temperatures, higher speeds, and lower weight. The ability to address such requirements in a relatively short time frame has resulted in a major thrust towards integrated design/analysis/manufacturing systems. These computer driven graphics systems represent a unique challenge, with major payback opportunities if properly conceived, implemented, and applied.

  8. Exchange of Veterans Affairs medical data using national and local networks.

    PubMed

    Dayhoff, R E; Maloney, D L

    1992-12-17

    Remote data exchange is extremely useful to a number of medical applications. It requires an infrastructure including systems, network and software tools. With such an infrastructure, existing local applications can be extended to serve national needs. There are many approaches to providing remote data exchange. Selection of an approach for an application requires balancing of various factors, including the need for rapid interactive access to data and ad hoc queries, the adequacy of access to predefined data sets, the need for an integrated view of the data, the ability to provide adequate security protection, the amount of data required, and the time frame in which data is required. The applications described here demonstrate new ways that the VA is reaping benefits from its infrastructure and its compatible integrated hospital information systems located at its facilities. The needs that have been met are also needs of private hospitals. However, in many cases the infrastructure to allow data exchange is not present. The VA's experiences may serve to establish the benefits that can be obtained by all hospitals.

  9. James Webb Space Telescope (JWST) Integrated Science Instruments Module (ISIM) Electronics Compartment (IEC) Conformal Shields Composite Bond Structure Qualification Test Method

    NASA Technical Reports Server (NTRS)

    Yew, Calinda; Stephens, Matt

    2015-01-01

    The JWST IEC conformal shields are mounted onto a composite frame structure that must undergo qualification testing to satisfy mission assurance requirements. The composite frame segments are bonded together at the joints using epoxy, EA 9394. The development of a test method to verify the integrity of the bonded structure at its operating environment introduces challenges in terms of requirements definition and the attainment of success criteria. Even though protoflight thermal requirements were not achieved, the first attempt at exposing the structure to cryogenic operating conditions in a thermal vacuum environment resulted in a bonded-joint failure during mechanical pull tests performed at 1.25 times the flight loads. Failure analysis concluded that the failure mode was adhesive cracks that formed and propagated along stress-concentrated fillets as a result of poor bond squeeze-out control during fabrication. Bond repairs were made and the structure was successfully re-tested with an improved LN2 immersion test method to achieve protoflight thermal requirements.

  10. NASA Integrated Vehicle Health Management (NIVHM) A New Simulation Architecture. Part I; An Investigation

    NASA Technical Reports Server (NTRS)

    Sheppard, Gene

    2005-01-01

    The overall objective of this research is to explore the development of a new architecture for simulating a vehicle health monitoring system in support of NASA's on-going Integrated Vehicle Health Monitoring (IVHM) initiative. As discussed in NASA MSFC's IVHM workshop on June 29-July 1, 2004, a large number of sensors will be required for a robust IVHM system. The current simulation architecture is incapable of simulating the large number of sensors required for IVHM. Processing the data from the sensors into a format that a human operator can understand and assimilate in a timely manner will require a paradigm shift. Data from a single sensor is, at best, suspect, and in order to overcome this deficiency, redundancy will be required for tomorrow's sensors. The sensor technology of tomorrow will allow for the placement of thousands of sensors per square inch. The major obstacle to overcome will then be how we can mitigate the torrent of data from raw sensor data to useful information to computer-assisted decision-making.

  11. Bringing the Pieces Together – Placing Core Facilities at the Core of Universities and Institutions: Lessons from Mergers, Acquisitions and Consolidations

    PubMed Central

    Mundoma, Claudius

    2013-01-01

    As organizations expand and grow, core facilities have become more dispersed and disconnected. This is happening at a time when collaboration within the organization is a driver of increased productivity. Stakeholders are looking at the best way to bring the pieces together. It is inevitable that core facilities at universities and research institutes have to be integrated in order to streamline services and facilitate ease of collaboration. The path to integration often goes through consolidation, merging, and shedding of redundant services. Managing this process requires a delicate coordination of two critical factors: the human (lab managers) factor and the physical assets factor. Traditionally, more emphasis has been placed on reorganizing the physical assets without paying enough attention to the professionals who have been managing the assets for years, if not decades. The presentation focuses on how a systems approach can be used to effect a smooth core facility integration process. Managing the human element requires strengthening existing channels of communication and, if necessary, creating new ones throughout the organization to break cultural and structural barriers. Managing the physical assets requires a complete asset audit, and this requires direct input from the administration as well as the facility managers. Organizations can harness the power of IT to create asset visibility. Successfully managing the physical assets and the human assets increases productivity and efficiency within the organization.

  12. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  13. Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.

    2016-12-01

    To provide stakeholders timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can be data intensive as a result of the proliferation of digital instrumentation and smart-sensing technologies. They are also resource intensive, often requiring multidisciplinary teams performing different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify a leakage incident? How can resources be optimally allocated for task-oriented monitoring if reservoir integrity is in question? These are issues that warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and models derived from a GCS site.

  14. Integrating International Engineering Organizations For Successful ISS Operations

    NASA Technical Reports Server (NTRS)

    Blome, Elizabeth; Duggan, Matt; Patten, L.; Pieterek, Hhtrud

    2006-01-01

    The International Space Station (ISS) is a multinational orbiting space laboratory that is built in cooperation with 16 nations. The design and sustaining engineering expertise is spread worldwide. As the number of Partners with orbiting elements on the ISS grows, the challenge NASA is facing as the ISS integrator is to ensure that engineering expertise and data are accessible in a timely fashion to ensure ongoing operations and mission success. Integrating international engineering teams requires definition and agreement on common processes and responsibilities, joint training and the emergence of a unique engineering team culture. ISS engineers face daunting logistical and political challenges regarding data sharing requirements. To assure systematic information sharing and anomaly resolution of integrated anomalies, the ISS Partners are developing multi-lateral engineering interface procedures. Data sharing and individual responsibility are key aspects of this plan. This paper describes several examples of successful multilateral anomaly resolution. These successes were used to form the framework of the Partner to Partner engineering interface procedures, and this paper describes those currently documented multilateral engineering processes. Furthermore, it addresses the challenges experienced to date, and the forward work expected in establishing a successful working relationship with Partners as their hardware is launched.

  15. Impact of Dean Vortices on the Integrity Testing of a Continuous Viral Inactivation Reactor.

    PubMed

    Amarikwa, Linus; Orozco, Raquel; Brown, Matthew; Coffman, Jon

    2018-05-26

    We propose a standard protocol for integrity testing the residence-time distribution (RTD) in a "Jig in a Box" design (JIB)-a previously described tortuous-path, tubular, low-pH, continuous viral inactivation reactor-ensuring that biopharmaceutical products will be incubated for the required minimum residence time, t_min. t_min is the time by which just 0.001% of the total product containing virus has exited the incubation chamber (i.e., t_0.00001). This t_0.00001 is selected to ensure a >4-log reduction in viral load. As current tracers and in-line analytical technologies may not be able to detect tracers at the 0.001% level, an alternative approach is required. The authors describe a method for deriving t_min from t_0.005 (i.e., the time at which 0.5% of the product has emerged from the reactor outlet) and an experimentally confirmed offset value, t_offset = t_0.005 - t_0.00001. The authors also evaluate tracer candidates-including 100-nm-diameter gold nanoparticles, dextrose, monoclonal antibody, and riboflavin-for pre-process acceptability and the effects of viscosity, molecular diffusion coefficient, and particle size. The authors show that a JIB will yield t_min and RTDs that are nearly identical for multiple tracers due to Dean vortex induced mixing. Results indicate that almost any small-molecule tracer that is generally recognized as safe can be used in pre-use integrity testing of a continuous viral inactivation reactor at Dean numbers (De) of 119-595. © 2018 Boehringer Ingelheim Fremont Inc. Biotechnology Journal published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
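
The offset method described above can be sketched numerically. The following is an illustrative sketch only (the function names and the breakthrough curve are invented for this example, not taken from the paper): given a sampled cumulative outlet curve F(t), find the 0.5% breakthrough time t_0.005 by interpolation, then subtract an experimentally determined t_offset to estimate t_min.

```python
# Sketch: deriving t_min from a measured residence-time distribution (RTD).
# The 0.001% level is below tracer detection limits, so t_min is estimated
# as t_0.005 (0.5% breakthrough) minus an offline-validated offset,
# t_offset = t_0.005 - t_0.00001. All data below are illustrative.

def crossing_time(times, F, level):
    """Linearly interpolate the time at which F(t) first reaches `level`."""
    for i in range(1, len(times)):
        if F[i] >= level:
            frac = (level - F[i - 1]) / (F[i] - F[i - 1])
            return times[i - 1] + frac * (times[i] - times[i - 1])
    raise ValueError("level never reached")

def estimate_t_min(times, F, t_offset):
    """t_min ~= t_0.005 - t_offset, with t_offset known from validation."""
    return crossing_time(times, F, 0.005) - t_offset

# Illustrative breakthrough curve: nothing exits before t = 60 min,
# then the outlet fraction ramps linearly to 1.0 over 20 min.
times = [float(t) for t in range(0, 101)]                       # minutes
F = [0.0 if t < 60 else min(1.0, (t - 60) / 20) for t in times]
t_min = estimate_t_min(times, F, t_offset=2.0)                  # ~58.1 min
```

In practice t_offset would come from the experimentally confirmed relationship reported for the reactor, not from the toy curve used here.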

  16. Multiresource allocation and scheduling for periodic soft real-time applications

    NASA Astrophysics Data System (ADS)

    Gopalan, Kartik; Chiueh, Tzi-cker

    2001-12-01

    Real-time applications that utilize multiple system resources, such as CPU, disks, and network links, require coordinated scheduling of these resources in order to meet their end-to-end performance requirements. Most state-of-the-art operating systems support independent resource allocation and deadline-driven scheduling but lack coordination among multiple heterogeneous resources. This paper describes the design and implementation of an Integrated Real-time Resource Scheduler (IRS) that performs coordinated allocation and scheduling of multiple heterogeneous resources on the same machine for periodic soft real-time applications. The principal feature of IRS is a heuristic multi-resource allocation algorithm that reserves multiple resources for real-time applications in a manner that can maximize the number of applications admitted into the system in the long run. At run-time, a global scheduler dispatches the tasks of the soft real-time application to individual resource schedulers according to the precedence constraints between tasks. The individual resource schedulers, which could be any deadline based schedulers, can make scheduling decisions locally and yet collectively satisfy a real-time application's performance requirements. The tightness of overall timing guarantees is ultimately determined by the properties of individual resource schedulers. However, IRS maximizes overall system resource utilization efficiency by coordinating deadline assignment across multiple tasks in a soft real-time application.
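
The coordination problem above can be illustrated with a much simpler admission test than the IRS heuristic itself. The sketch below is a stand-in, not the paper's algorithm: each periodic application declares a (demand, period) pair per resource, and it is admitted only if every resource it touches stays under the EDF utilization bound of 1.0. Resource names and task values are invented for the example.

```python
# Sketch of coordinated admission control across multiple resources
# (a simplified stand-in for the IRS multi-resource heuristic).
# An application is admitted only if ALL of its resources remain
# schedulable; the reservation is committed atomically.

def admit(app, reservations):
    """app: resource -> (demand, period); reservations: resource -> utilization."""
    new_util = {}
    for res, (demand, period) in app.items():
        u = reservations.get(res, 0.0) + demand / period
        if u > 1.0:
            return False           # this resource would be overloaded: reject
        new_util[res] = u
    reservations.update(new_util)  # commit only after every resource checks out
    return True

reservations = {}
video = {"cpu": (20, 100), "disk": (30, 100), "net": (50, 100)}
backup = {"cpu": (10, 100), "disk": (80, 100)}
assert admit(video, reservations)       # fits on all three resources
assert not admit(backup, reservations)  # disk would reach 1.1 utilization
```

Rejecting the whole application when any one resource overflows, rather than admitting it partially, is what distinguishes coordinated from independent per-resource allocation.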

  17. Shuttle's 160 hour ground turnaround - A design driver

    NASA Technical Reports Server (NTRS)

    Widick, F.

    1977-01-01

    Turnaround analysis added a new dimension to the Space Program with the advent of the Space Shuttle. The requirement to turn the flight hardware around in 160 working hours from landing to launch was a significant design driver and a useful tool in forcing the integration of flight and ground systems design to permit an efficient ground operation. Although there was concern that time constraints might increase program costs, the result of the analysis was to minimize facility requirements and simplify operations with resultant cost savings.

  18. A Standard for Command, Control, Communications and Computers (C4) Test Data Representation to Integrate with High-Performance Data Reduction

    DTIC Science & Technology

    2015-06-01

    events was ad hoc and problematic due to time constraints and changing requirements. Determining errors in context and heuristics required expertise... Data reduction for analysis of Command, Control, Communications, and Computer (C4) network tests...

  19. Behavioral Health and Performance (BHP) Work-Rest Cycles

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.; Whitmire, Alexandra

    2011-01-01

    BHP Program Element Goal: Identify, characterize, and prevent or reduce behavioral health and performance risks associated with space travel, exploration and return to terrestrial life. BHP Requirements: a) Characterize and assess risks (e.g., likelihood and consequences). b) Develop tools and technologies to prevent, monitor, and treat adverse outcomes. c) Inform standards. d) Develop technologies to: 1) reduce risks and human systems resource requirements (e.g., crew time, mass, volume, power) and 2) ensure effective human-system integration across exploration mission.

  20. Artificial intelligence techniques for scheduling Space Shuttle missions

    NASA Technical Reports Server (NTRS)

    Henke, Andrea L.; Stottler, Richard H.

    1994-01-01

    Planning and scheduling of NASA Space Shuttle missions is a complex, labor-intensive process requiring the expertise of experienced mission planners. We have developed a planning and scheduling system using combinations of artificial intelligence knowledge representations and planning techniques to capture mission planning knowledge and automate the multi-mission planning process. Our integrated object oriented and rule-based approach reduces planning time by orders of magnitude and provides planners with the flexibility to easily modify planning knowledge and constraints without requiring programming expertise.

  1. Enhancing the Human Factors Engineering Role in an Austere Fiscal Environment

    NASA Technical Reports Server (NTRS)

    Stokes, Jack W.

    2003-01-01

    An austere fiscal environment in the aerospace community creates pressures to reduce program costs, often minimizing or even deleting the human interface requirements from the design process. On the assumption that the flight crew can recover in real time from a poorly human-factored space vehicle design, the classical crew interface requirements have either not been included in the design or not been properly funded, though carried as requirements. Cost cuts have also affected the quality of retained human factors engineering personnel. In response to this concern, planning is ongoing to correct these issues. Herein are techniques for ensuring that human interface requirements are integrated into a flight design, from proposal through verification and launch activation. These include human factors requirements refinement and consolidation across flight programs; keyword phrases in proposals; closer ties with systems engineering and other classical disciplines; early planning for crew-interface verification; and an Agency-integrated human factors verification program, under the One NASA theme. Importance is given to communication within the aerospace human factors discipline, and to utilizing the strengths of all government, industry, and academic human factors organizations in a unified research and engineering approach. A list of recommendations and concerns is provided in closing.

  2. Exploration of the Trade Space Between Unmanned Aircraft Systems Descent Maneuver Performance and Sense-and-Avoid System Performance Requirements

    NASA Technical Reports Server (NTRS)

    Jack, Devin P.; Hoffler, Keith D.; Johnson, Sally C.

    2014-01-01

    A need exists to safely integrate Unmanned Aircraft Systems (UAS) into the United States' National Airspace System. Replacing manned aircraft's see-and-avoid capability in the absence of an onboard pilot is one of the key challenges associated with safe integration. Sense-and-avoid (SAA) systems will have to achieve yet-to-be-determined required separation distances for a wide range of encounters. They will also need to account for the maneuver performance of the UAS they are paired with. The work described in this paper is aimed at developing an understanding of the trade space between UAS maneuver performance and SAA system performance requirements, focusing on a descent avoidance maneuver. An assessment of current manned and unmanned aircraft performance was used to establish potential UAS performance test matrix bounds. Then, near-term UAS integration work was used to narrow down the scope. A simulator was developed with sufficient fidelity to assess SAA system performance requirements. The simulator generates closest-point-of-approach (CPA) data from the wide range of UAS performance models maneuvering against a single intruder with various encounter geometries. Initial attempts to model the results made it clear that developing maneuver performance groups is required. Discussion of the performance groups developed and how to know in which group an aircraft belongs for a given flight condition and encounter is included. The groups are airplane, flight condition, and encounter specific, rather than airplane-only specific. Results and methodology for developing UAS maneuver performance requirements are presented for a descent avoidance maneuver. Results for the descent maneuver indicate that a minimum specific excess power magnitude can assure a minimum CPA for a given time-to-go prediction. However, smaller amounts of specific excess power may achieve or exceed the same CPA if the UAS has sufficient speed to trade for altitude. 
The results of this study will support UAS maneuver performance requirements development for integrating UAS in the NAS. The methods described are being used to help RTCA Special Committee 228 develop requirements.
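
The closest-point-of-approach metric that drives these results is easy to state for the straight-line encounter geometries such studies start from. The sketch below is illustrative only (the constant-velocity model and the numbers are this example's assumptions, not the simulator described above): for two aircraft with constant velocities, minimize the relative separation over time in closed form.

```python
import math

# Sketch: closest point of approach (CPA) for two constant-velocity
# aircraft in 2D. Positions in ft, velocities in ft/s; values illustrative.

def cpa(p1, v1, p2, v2):
    """Return (t_cpa, d_cpa): time of CPA (clamped to >= 0) and miss distance."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    wx, wy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    w2 = wx * wx + wy * wy
    # Separation is minimized where d/dt |r + w t|^2 = 0  =>  t = -(r.w)/|w|^2
    t = 0.0 if w2 == 0 else max(0.0, -(rx * wx + ry * wy) / w2)
    dx, dy = rx + wx * t, ry + wy * t
    return t, math.hypot(dx, dy)

# Head-on encounter: 60760 ft apart, 1000 ft lateral offset,
# closing at a combined 400 ft/s.
t, d = cpa((0.0, 0.0), (200.0, 0.0), (60760.0, 1000.0), (-200.0, 0.0))
```

In a maneuver study, an avoidance maneuver changes the ownship velocity over time, so the CPA is found by simulation rather than this closed form; the closed form still serves as the unmitigated baseline.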

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argo, P.E.; DeLapp, D.; Sutherland, C.D.

    TRACKER is an extension of a three-dimensional Hamiltonian raytrace code developed some thirty years ago by R. Michael Jones. Subsequent modifications to this code, which is commonly called the "Jones Code," were documented by Jones and Stephensen (1975). TRACKER incorporates an interactive user's interface, modern differential equation integrators, graphical outputs, homing algorithms, and the Ionospheric Conductivity and Electron Density (ICED) ionosphere. TRACKER predicts the three-dimensional paths of radio waves through model ionospheres by numerically integrating Hamilton's equations, which are a differential expression of Fermat's principle of least time. By using continuous models, the Hamiltonian method avoids false caustics and discontinuous raypath properties often encountered in other raytracing methods. In addition to computing the raypath, TRACKER also calculates the group path (or pulse travel time), the phase path, the geometrical (or "real") pathlength, and the Doppler shift (if the time variation of the ionosphere is explicitly included). Computational speed can be traded for accuracy by specifying the maximum allowable integration error per step in the integration. Only geometrical optics are included in the main raytrace code; no partial reflections or diffraction effects are taken into account. In addition, TRACKER does not lend itself to statistical descriptions of propagation -- it requires a deterministic model of the ionosphere.
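
The core idea, integrating Hamilton's equations for a ray, can be sketched in two dimensions. This is a minimal sketch under stated assumptions, not TRACKER's formulation: it uses the reduced Hamiltonian H = (|k|^2 - n(r)^2)/2 for a scalar refractive index n, a fixed-step RK4 in place of TRACKER's adaptive error-controlled integrators, and a trivially simple index model instead of the ICED ionosphere.

```python
# Sketch of Hamiltonian raytracing: integrate dr/ds = dH/dk, dk/ds = -dH/dr
# with H = (|k|^2 - n(r)^2)/2, so dr/ds = k and dk/ds = n * grad(n).
# Fixed-step RK4; state is (x, z, kx, kz). Illustrative only.

def trace(r, k, n_and_grad, ds, steps):
    def rhs(state):
        x, z, kx, kz = state
        n, (dndx, dndz) = n_and_grad(x, z)
        return (kx, kz, n * dndx, n * dndz)

    state = (*r, *k)
    for _ in range(steps):
        k1 = rhs(state)
        k2 = rhs(tuple(s + 0.5 * ds * d for s, d in zip(state, k1)))
        k3 = rhs(tuple(s + 0.5 * ds * d for s, d in zip(state, k2)))
        k4 = rhs(tuple(s + ds * d for s, d in zip(state, k3)))
        state = tuple(s + ds / 6.0 * (a + 2 * b + 2 * c + d)
                      for s, a, b, c, d in zip(state, k1, k2, k3, k4))
    return state

# Sanity check: in a homogeneous medium (n = 1, zero gradient) the
# wavevector is constant and the ray is a straight line.
uniform = lambda x, z: (1.0, (0.0, 0.0))
x, z, kx, kz = trace((0.0, 0.0), (1.0, 0.0), uniform, ds=0.1, steps=100)
```

With an index gradient the second pair of equations bends the ray toward higher n, which is the mechanism behind ionospheric refraction that the full code models.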

  4. FPGA-based real-time embedded system for RISS/GPS integrated navigation.

    PubMed

    Abdelfatah, Walid Farid; Georgy, Jacques; Iqbal, Umar; Noureldin, Aboelmagd

    2012-01-01

    Navigation algorithms integrating measurements from multi-sensor systems overcome the problems that arise from using GPS navigation systems in standalone mode. Algorithms that integrate data from a 2D low-cost reduced inertial sensor system (RISS), consisting of a gyroscope and an odometer or wheel encoders, along with a GPS receiver via a Kalman filter have proved to provide a more consistent and reliable navigation solution than standalone GPS receivers. They have also been shown to be beneficial, especially in GPS-denied environments such as urban canyons and tunnels. The main objective of this paper is to narrow the idea-to-implementation gap that follows algorithm development by realizing a low-cost real-time embedded navigation system capable of computing the data-fused positioning solution. The role of the developed system is to synchronize the measurements from the three sensors, relative to the pulse-per-second signal generated by the GPS, after which the navigation algorithm is applied to the synchronized measurements to compute the navigation solution in real time. Employing a customizable soft-core processor on an FPGA in the kernel of the navigation system provided the flexibility for communicating with the various sensors and the computation capability required by the Kalman filter integration algorithm.
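
The predict/update cycle at the heart of such a Kalman-filter fusion can be shown in miniature. The scalar filter below is an illustrative sketch, not the paper's 2D RISS model: dead-reckoned odometer velocity drives the prediction step, and each GPS position fix corrects it; all noise values are invented for the example.

```python
# Minimal 1D sketch of Kalman-filter fusion of odometer dead reckoning
# with GPS position fixes. x: position estimate, P: its variance,
# Q: process noise (dead-reckoning drift), R: GPS measurement noise.

def kf_step(x, P, v_odo, dt, z_gps, Q=0.5, R=4.0):
    # Predict: integrate the odometer velocity; uncertainty grows by Q.
    x_pred = x + v_odo * dt
    P_pred = P + Q
    # Update: blend in the GPS fix with Kalman gain K in [0, 1].
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z_gps - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
for z in (1.1, 2.0, 2.9):              # noisy GPS fixes, one per second
    x, P = kf_step(x, P, v_odo=1.0, dt=1.0, z_gps=z)
```

In a GPS outage the update step is simply skipped, so the estimate coasts on the odometer while P grows, which is exactly why RISS bridging helps in urban canyons and tunnels.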

  5. Visual Servoing for an Autonomous Hexarotor Using a Neural Network Based PID Controller.

    PubMed

    Lopez-Franco, Carlos; Gomez-Avila, Javier; Alanis, Alma Y; Arana-Daniel, Nancy; Villaseñor, Carlos

    2017-08-12

    In recent years, unmanned aerial vehicles (UAVs) have gained significant attention. However, we face two major drawbacks when working with UAVs: high nonlinearities and unknown position in 3D space, since they are not equipped with on-board sensors that can measure their position with respect to a global coordinate system. In this paper, we present a real-time implementation of a servo control, integrating vision sensors with a neural proportional integral derivative (PID) controller, in order to develop a hexarotor image-based visual servo control (IBVS) that knows the position of the robot by using a velocity vector as a reference to control the hexarotor position. This integration requires tight coordination between control algorithms, models of the system to be controlled, sensors, hardware and software platforms, and well-defined interfaces to allow the real-time implementation, as well as the design of different processing stages with their respective communication architecture. All of these issues and others make real-time implementation a difficult task. To show the effectiveness of the sensor integration and control algorithm in addressing these issues on a highly nonlinear system with noisy sensors such as cameras, experiments were performed on the Asctec Firefly on-board computer, including both simulation and experimental results.

  6. FPGA-Based Real-Time Embedded System for RISS/GPS Integrated Navigation

    PubMed Central

    Abdelfatah, Walid Farid; Georgy, Jacques; Iqbal, Umar; Noureldin, Aboelmagd

    2012-01-01

    Navigation algorithms integrating measurements from multi-sensor systems overcome the problems that arise from using GPS navigation systems in standalone mode. Algorithms that integrate data from a 2D low-cost reduced inertial sensor system (RISS), consisting of a gyroscope and an odometer or wheel encoders, along with a GPS receiver via a Kalman filter have proved to provide a more consistent and reliable navigation solution than standalone GPS receivers. They have also been shown to be beneficial, especially in GPS-denied environments such as urban canyons and tunnels. The main objective of this paper is to narrow the idea-to-implementation gap that follows algorithm development by realizing a low-cost real-time embedded navigation system capable of computing the data-fused positioning solution. The role of the developed system is to synchronize the measurements from the three sensors, relative to the pulse-per-second signal generated by the GPS, after which the navigation algorithm is applied to the synchronized measurements to compute the navigation solution in real time. Employing a customizable soft-core processor on an FPGA in the kernel of the navigation system provided the flexibility for communicating with the various sensors and the computation capability required by the Kalman filter integration algorithm. PMID:22368460

  7. Solving modal equations of motion with initial conditions using MSC/NASTRAN DMAP. Part 1: Implementing exact mode superposition

    NASA Technical Reports Server (NTRS)

    Abdallah, Ayman A.; Barnett, Alan R.; Ibrahim, Omar M.; Manella, Richard T.

    1993-01-01

    Within the MSC/NASTRAN DMAP (Direct Matrix Abstraction Program) module TRD1, solving physical (coupled) or modal (uncoupled) transient equations of motion is performed using the Newmark-Beta or mode superposition algorithms, respectively. For equations of motion with initial conditions, only the Newmark-Beta integration routine has been available in MSC/NASTRAN solution sequences for solving physical systems and in custom DMAP sequences or alters for solving modal systems. In some cases, one difficulty with using the Newmark-Beta method is that the process of selecting suitable integration time steps for obtaining acceptable results is lengthy. In addition, when very small step sizes are required, a large amount of time can be spent integrating the equations of motion. For certain aerospace applications, a significant time savings can be realized when the equations of motion are solved using an exact integration routine instead of the Newmark-Beta numerical algorithm. In order to solve modal equations of motion with initial conditions and take advantage of efficiencies gained when using uncoupled solution algorithms (like that within TRD1), an exact mode superposition method using MSC/NASTRAN DMAP has been developed and successfully implemented as an enhancement to an existing coupled loads methodology at the NASA Lewis Research Center.
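
The Newmark-Beta algorithm mentioned above can be sketched for a single uncoupled modal equation. This is a generic average-acceleration Newmark sketch (beta = 1/4, gamma = 1/2) for q'' + w^2 q = 0 with initial conditions, not the TRD1 implementation; the 1 Hz mode and step size are illustrative, chosen so the numerical result can be checked against the exact solution q(t) = q0 cos(w t).

```python
import math

# Sketch: Newmark-Beta (average acceleration) integration of one undamped
# modal equation q'' + w^2 q = 0 with initial conditions q0, qdot0.

def newmark(w, q0, qdot0, dt, steps, beta=0.25, gamma=0.5):
    q, qd = q0, qdot0
    qdd = -w * w * q                      # initial acceleration from the ODE
    for _ in range(steps):
        # Newmark predictors (terms known at the start of the step)
        q_pred = q + dt * qd + dt * dt * (0.5 - beta) * qdd
        qd_pred = qd + dt * (1.0 - gamma) * qdd
        # Enforce the ODE at the end of the step: qdd_new = -w^2 q_new,
        # with q_new = q_pred + beta*dt^2*qdd_new, solved for qdd_new.
        qdd = -w * w * q_pred / (1.0 + beta * dt * dt * w * w)
        q = q_pred + beta * dt * dt * qdd
        qd = qd_pred + gamma * dt * qdd
    return q

w = 2.0 * math.pi                                             # 1 Hz mode
q_num = newmark(w, q0=1.0, qdot0=0.0, dt=0.001, steps=1000)   # one period
q_exact = math.cos(w * 1.0)
```

The small per-step period error of this scheme is the step-size sensitivity the abstract refers to; an exact mode-superposition solution has no such error, which is the motivation for the DMAP enhancement described.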

  8. Visual Servoing for an Autonomous Hexarotor Using a Neural Network Based PID Controller

    PubMed Central

    Lopez-Franco, Carlos; Alanis, Alma Y.; Arana-Daniel, Nancy; Villaseñor, Carlos

    2017-01-01

    In recent years, unmanned aerial vehicles (UAVs) have gained significant attention. However, we face two major drawbacks when working with UAVs: high nonlinearities and unknown position in 3D space, since they are not equipped with on-board sensors that can measure their position with respect to a global coordinate system. In this paper, we present a real-time implementation of a servo control, integrating vision sensors with a neural proportional integral derivative (PID) controller, in order to develop a hexarotor image-based visual servo control (IBVS) that knows the position of the robot by using a velocity vector as a reference to control the hexarotor position. This integration requires tight coordination between control algorithms, models of the system to be controlled, sensors, hardware and software platforms, and well-defined interfaces to allow the real-time implementation, as well as the design of different processing stages with their respective communication architecture. All of these issues and others make real-time implementation a difficult task. To show the effectiveness of the sensor integration and control algorithm in addressing these issues on a highly nonlinear system with noisy sensors such as cameras, experiments were performed on the Asctec Firefly on-board computer, including both simulation and experimental results. PMID:28805689

  9. SMZ/SNZ and gibberellin signaling are required for nitrate-elicited delay of flowering time in Arabidopsis thaliana.

    PubMed

    Gras, Diana E; Vidal, Elena A; Undurraga, Soledad F; Riveras, Eleodoro; Moreno, Sebastián; Dominguez-Figueroa, José; Alabadi, David; Blázquez, Miguel A; Medina, Joaquín; Gutiérrez, Rodrigo A

    2018-01-23

    The reproductive success of plants largely depends on the correct programming of developmental phase transitions, particularly the shift from vegetative to reproductive growth. The timing of this transition is finely regulated by the integration of an array of environmental and endogenous factors. Nitrogen is the mineral macronutrient that plants require in the largest amount, and as such its availability greatly impacts on many aspects of plant growth and development, including flowering time. We found that nitrate signaling interacts with the age-related and gibberellic acid pathways to control flowering time in Arabidopsis thaliana. We revealed that repressors of flowering time belonging to the AP2-type transcription factor family including SCHLAFMUTZE (SMZ) and SCHNARCHZAPFEN (SNZ) are important regulators of flowering time in response to nitrate. Our results support a model whereby nitrate activates SMZ and SNZ via the gibberellin pathway to repress flowering time in Arabidopsis thaliana. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  10. Expanding Lorentz and spectrum corrections to large volumes of reciprocal space for single-crystal time-of-flight neutron diffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michels-Clark, Tara M.; Savici, Andrei T.; Lynch, Vickie E.

    Evidence is mounting that potentially exploitable properties of technologically and chemically interesting crystalline materials are often attributable to local structure effects, which can be observed as modulated diffuse scattering (mDS) next to Bragg diffraction (BD). BD forms a regular sparse grid of intense discrete points in reciprocal space. Traditionally, the intensity of each Bragg peak is extracted by integration of each individual reflection first, followed by application of the required corrections. In contrast, mDS is weak and covers expansive volumes of reciprocal space close to, or between, Bragg reflections. For a representative measurement of the diffuse scattering, multiple sample orientations are generally required, where many points in reciprocal space are measured multiple times and the resulting data are combined. The common post-integration data reduction method is not optimal with regard to counting statistics. A general and inclusive data processing method is needed. In this contribution, a comprehensive data analysis approach is introduced to correct and merge the full volume of scattering data in a single step, while correctly accounting for the statistical weight of the individual measurements. Lastly, development of this new approach required the exploration of a data treatment and correction protocol that includes the entire collected reciprocal space volume, using neutron time-of-flight or wavelength-resolved data collected at TOPAZ at the Spallation Neutron Source at Oak Ridge National Laboratory.
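    The statistical-weighting point above can be illustrated with a minimal merge under Poisson counting statistics: summing raw counts and normalization factors before dividing preserves the correct weight of each measurement, unlike averaging pre-normalized intensities. This is a generic sketch of the idea, not the TOPAZ reduction software; all names and numbers are illustrative:

```python
import numpy as np

def merge_weighted(counts, norms):
    """Merge repeated measurements of the same reciprocal-space voxel.

    `counts` are raw detector counts, `norms` the per-measurement
    normalization (monitor/correction) factors. Summing both before
    dividing gives each measurement its correct statistical weight.
    A generic illustration, not the authors' code.
    """
    counts = np.asarray(counts, dtype=float)
    norms = np.asarray(norms, dtype=float)
    intensity = counts.sum() / norms.sum()
    sigma = np.sqrt(counts.sum()) / norms.sum()  # Poisson error on summed counts
    return intensity, sigma
```

    For example, merging measurements of 100 counts at unit normalization and 400 counts at 4x normalization yields the common intensity 100 with the error of the pooled 500 counts.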

  11. Expanding Lorentz and spectrum corrections to large volumes of reciprocal space for single-crystal time-of-flight neutron diffraction

    DOE PAGES

    Michels-Clark, Tara M.; Savici, Andrei T.; Lynch, Vickie E.; ...

    2016-03-01

    Evidence is mounting that potentially exploitable properties of technologically and chemically interesting crystalline materials are often attributable to local structure effects, which can be observed as modulated diffuse scattering (mDS) next to Bragg diffraction (BD). BD forms a regular sparse grid of intense discrete points in reciprocal space. Traditionally, the intensity of each Bragg peak is extracted by integration of each individual reflection first, followed by application of the required corrections. In contrast, mDS is weak and covers expansive volumes of reciprocal space close to, or between, Bragg reflections. For a representative measurement of the diffuse scattering, multiple sample orientations are generally required, where many points in reciprocal space are measured multiple times and the resulting data are combined. The common post-integration data reduction method is not optimal with regard to counting statistics. A general and inclusive data processing method is needed. In this contribution, a comprehensive data analysis approach is introduced to correct and merge the full volume of scattering data in a single step, while correctly accounting for the statistical weight of the individual measurements. Lastly, development of this new approach required the exploration of a data treatment and correction protocol that includes the entire collected reciprocal space volume, using neutron time-of-flight or wavelength-resolved data collected at TOPAZ at the Spallation Neutron Source at Oak Ridge National Laboratory.

  12. Space Debris Detection on the HPDP, a Coarse-Grained Reconfigurable Array Architecture for Space

    NASA Astrophysics Data System (ADS)

    Suarez, Diego Andres; Bretz, Daniel; Helfers, Tim; Weidendorfer, Josef; Utzmann, Jens

    2016-08-01

    Stream processing, widely used in communications and digital signal processing applications, requires high-throughput data processing that is achieved in most cases using Application-Specific Integrated Circuit (ASIC) designs. Lack of programmability is an issue especially in space applications, which use on-board components with long life-cycles requiring application updates. To this end, the High Performance Data Processor (HPDP) architecture integrates an array of coarse-grained reconfigurable elements to provide both flexible and efficient computational power suitable for stream-based data processing applications in space. In this work the capabilities of the HPDP architecture are demonstrated with the implementation of a real-time image processing algorithm for space debris detection in a space-based space surveillance system. The implementation challenges and alternatives are described, making trade-offs to improve performance at the expense of negligible degradation of detection accuracy. The proposed implementation uses over 99% of the available computational resources. Performance estimations based on simulations show that the HPDP can amply match the application requirements.

  13. Unmanned aircraft system sense and avoid integrity and continuity

    NASA Astrophysics Data System (ADS)

    Jamoom, Michael B.

    This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of "well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories.
The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
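    The idea of turning a hazard-state estimate and its error covariance into an integrity risk can be illustrated in a deliberately simplified one-dimensional form. The thesis works with full estimate error covariances and well-clear geometry; this scalar Gaussian version is only a sketch, and every variable name is hypothetical:

```python
import math

def integrity_risk(d_est, sigma, d_wc):
    """P(true separation < well-clear threshold d_wc) when the separation
    estimate d_est has Gaussian error with standard deviation sigma.

    A 1-D simplification for illustration: the missed-detection (integrity)
    risk is the Gaussian tail of the estimate below the threshold.
    """
    z = (d_wc - d_est) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    With the estimate sitting exactly on the threshold the risk is 0.5, and it falls off as the estimated separation grows relative to the sensor uncertainty, which is how such a formulation lets safety requirements be mapped onto sensor error budgets.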

  14. Development of an Integrated Waste Plan for Chalk River Laboratories - 13376

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, L.

    2013-07-01

    To further its Strategic Planning, the Atomic Energy of Canada Limited (AECL) required an effective approach to developing a fully integrated waste plan for its Chalk River Laboratories (CRL) site. Production of the first Integrated Waste Plan (IWP) for Chalk River was a substantial task involving representatives from each of the major internal stakeholders. Since then, a second revision has been produced and a third is underway. The IWP remains an Interim IWP until all gaps have been resolved and all pathways are at an acceptable level of detail. Full completion will involve a number of iterations, typically annually for up to six years. The end result of completing this process is a comprehensive document and supporting information that includes: - An Integrated Waste Plan document summarizing the entire waste management picture in one place; - Details of all the wastes required to be managed, including volume and timings by waste stream; - Detailed waste stream pathway maps for the whole life-cycle for each waste stream to be managed from pre-generation planning through to final disposition; and - Critical decision points, i.e. decisions that need to be made and timings by when they need to be made. A waste inventory has been constructed that serves as the master reference inventory of all waste that has been or is committed to be managed at CRL. In the past, only the waste that is in storage has been effectively captured, and future predictions of wastes requiring to be managed were not available in one place. The IWP has also provided a detailed baseline plan at the current level of refinement. Waste flow maps for all identified waste streams, for the full waste life cycle complete to disposition, have been constructed. The maps identify areas requiring further development, and show the complexities and inter-relationships between waste streams. Knowledge of these inter-dependencies is necessary in order to perform effective options studies for enabling facilities that may be necessary for multiple related waste streams. The next step is to engage external stakeholders in the optioneering work required to provide enhanced confidence that the path forward identified within future iterations of the IWP will be acceptable to all. (authors)

  15. Using 2H and 18O in assessing evaporation and water residence time of lakes in EPA’s National Lakes Assessment.

    EPA Science Inventory

    Stable isotopes of water and organic material can be very useful in monitoring programs because stable isotopes integrate information about ecological processes and record this information. Most ecological processes of interest for water quality (e.g. denitrification) require si...

  16. Distinguishing the Forest from the Trees: Synthesizing IHRMP Research

    Treesearch

    Gregory B. Greenwood

    1991-01-01

    A conceptual model of hardwood rangelands as multi-output resource system is developed and used to achieve a synthesis of Integrated Hardwood Range Management Program (IHRMP) research. The model requires the definition of state variables which characterize the system at any time, processes that move the system to different states, outputs...

  17. Function modeling improves the efficiency of spatial modeling using big data from remote sensing

    Treesearch

    John Hogland; Nathaniel Anderson

    2017-01-01

    Spatial modeling is an integral component of most geographic information systems (GISs). However, conventional GIS modeling techniques can require substantial processing time and storage space and have limited statistical and machine learning functionality. To address these limitations, many have parallelized spatial models using multiple coding libraries and have...

  18. 76 FR 81015 - Notice of Public Webinar on Implementation of Distribution Integrity Management Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-27

    ... discussion of analyses of the initial submissions of data concerning mechanical fitting failures in... information. The DIMP rule also required distribution pipeline operators to report failures of mechanical... mechanical fitting failure reporting will be preliminary at this time. They will be based on a limited set of...

  19. University ERP Implementation in Germany: Qualitative Exploratory Case Study of Administrative Staff Experiences

    ERIC Educational Resources Information Center

    Thelen, Anja

    2015-01-01

    Enterprise Resource Planning (ERP) implementations are expensive, time-consuming, and often do not lead to the expected outcome of integrated IT systems. Many German universities are implementing ERP systems as Campus Management Systems (CMS) and a solution to any problem, need, or requirement the organization has. This exploratory case study…

  20. 49 CFR 571.106 - Standard No. 106; Brake hoses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... Brake hose end fitting means a coupler, other than a clamp, designed for attachment to the end of a... a sacrificial sleeve or ferrule that requires replacement each time a hose assembly is rebuilt..., as an integral part of the vehicle's original design, with a means of attaching the support to the...

  1. 49 CFR 571.106 - Standard No. 106; Brake hoses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... Brake hose end fitting means a coupler, other than a clamp, designed for attachment to the end of a... a sacrificial sleeve or ferrule that requires replacement each time a hose assembly is rebuilt..., as an integral part of the vehicle's original design, with a means of attaching the support to the...

  2. 49 CFR 571.106 - Standard No. 106; Brake hoses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... Brake hose end fitting means a coupler, other than a clamp, designed for attachment to the end of a... a sacrificial sleeve or ferrule that requires replacement each time a hose assembly is rebuilt..., as an integral part of the vehicle's original design, with a means of attaching the support to the...

  3. 49 CFR 571.106 - Standard No. 106; Brake hoses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... Brake hose end fitting means a coupler, other than a clamp, designed for attachment to the end of a... a sacrificial sleeve or ferrule that requires replacement each time a hose assembly is rebuilt..., as an integral part of the vehicle's original design, with a means of attaching the support to the...

  4. Schools and the Community: A Necessary Partnership: A Guide to Interagency Collaboration.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Education Response Centre.

    The problems facing students and families in Alberta, Canada, have been recognized as community problems that require community solutions. Interagency collaboration has become a necessity indicative of the changing times and the global focus on integration rather than isolation. Interagency collaboration is an arrangement in which agencies work…

  5. 46 CFR 164.008-7 - Procedure for approval.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...

  6. 46 CFR 164.008-7 - Procedure for approval.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...

  7. 46 CFR 164.008-7 - Procedure for approval.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...

  8. 46 CFR 164.008-7 - Procedure for approval.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...

  9. 46 CFR 164.008-7 - Procedure for approval.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... conducted at some laboratory other than the National Bureau of Standards, this information shall be supplied... fire resistance and integrity test will be given at this time together with the estimated cost of the... manufacturer shall submit the samples required by paragraph (c)(1) of this section to the Fire Research Section...

  10. 16 CFR 1512.6 - Requirements for steering system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... stem insertion mark. Quill-type handlebar stems shall contain a permanent ring or mark which clearly indicates the minimum insertion depth of the handlebar stem into the fork assembly. The insertion mark shall not affect the structural integrity of the stem and shall not be less than 2½ times the stem...

  11. 16 CFR 1512.6 - Requirements for steering system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... stem insertion mark. Quill-type handlebar stems shall contain a permanent ring or mark which clearly indicates the minimum insertion depth of the handlebar stem into the fork assembly. The insertion mark shall not affect the structural integrity of the stem and shall not be less than 2½ times the stem...

  12. 16 CFR 1512.6 - Requirements for steering system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... stem insertion mark. The handlebar stem shall contain a permanent ring or mark which clearly indicates the minimum insertion depth of the handlebar stem into the fork assembly. The insertion mark shall not affect the structural integrity of the stem and shall not be less than 2½ times the stem diameter from...

  13. 16 CFR 1512.6 - Requirements for steering system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... stem insertion mark. The handlebar stem shall contain a permanent ring or mark which clearly indicates the minimum insertion depth of the handlebar stem into the fork assembly. The insertion mark shall not affect the structural integrity of the stem and shall not be less than 2½ times the stem diameter from...

  14. 16 CFR § 1512.6 - Requirements for steering system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... stem insertion mark. Quill-type handlebar stems shall contain a permanent ring or mark which clearly indicates the minimum insertion depth of the handlebar stem into the fork assembly. The insertion mark shall not affect the structural integrity of the stem and shall not be less than 2½ times the stem...

  15. Using DCOM to support interoperability in forest ecosystem management decision support systems

    Treesearch

    W.D. Potter; S. Liu; X. Deng; H.M. Rauscher

    2000-01-01

    Forest ecosystems exhibit complex dynamics over time and space. Management of forest ecosystems involves the need to forecast future states of complex systems that are often undergoing structural changes. This in turn requires integration of quantitative science and engineering components with sociopolitical, regulatory, and economic considerations. The amount of data...

  16. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  17. Publication of the maps of Tenke and Manono (Zaire) from LANDSAT data

    NASA Technical Reports Server (NTRS)

    Yampania, M.

    1981-01-01

    The collection of cartographic data on Zaire has until now been based on aerial reconnaissance. This approach is very expensive when repetitive coverage of such a large country is required. Integrating the LANDSAT program among the data collection systems substantially improves the mapping effort.

  18. Medical System Concept of Operations for Mars Exploration Missions

    NASA Technical Reports Server (NTRS)

    Urbina, Michelle; Rubin, D.; Hailey, M.; Reyes, D.; Antonsen, Eric

    2017-01-01

    Future exploration missions will be the first time humanity travels beyond Low Earth Orbit (LEO) since the Apollo program, taking us to cis-lunar space, interplanetary space, and Mars. These long-duration missions will cover vast distances, severely constraining opportunities for emergency evacuation to Earth and cargo resupply opportunities. Communication delays and blackouts between the crew and Mission Control will eliminate reliable, real-time telemedicine consultations. As a result, compared to current LEO operations onboard the International Space Station, exploration mission medical care requires an integrated medical system that provides additional in-situ capabilities and a significant increase in crew autonomy. The Medical System Concept of Operations for Mars Exploration Missions illustrates how a future NASA Mars program could ensure appropriate medical care for the crew of this highly autonomous mission. This Concept of Operations document, when complete, will document all mission phases through a series of mission use case scenarios that illustrate required medical capabilities, enabling the NASA Human Research Program (HRP) Exploration Medical Capability (ExMC) Element to plan, design, and prototype an integrated medical system to support human exploration to Mars.

  19. Dynfarm: A Dynamic Site Extension

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; De Girolamo, D.

    2017-10-01

    Requests for computing resources from LHC experiments are constantly mounting, and so is their peak usage. Since dimensioning a site to handle the peak usage times is impractical due to constraints on resources that many publicly-owned computing centres have, opportunistic usage of resources from external, even commercial, cloud providers is becoming more and more interesting, and is even the subject of an upcoming initiative from the EU commission, named HelixNebula. While extra resources are always a good thing, to fully take advantage of them they must be integrated into the site’s own infrastructure and made available to users as if they were local resources. At the CNAF INFN Tier-1 we have developed a framework, called dynfarm, capable of taking external resources and, with minimal and easily satisfied requirements placed upon them, fully integrating them into a pre-existing infrastructure and treating them as if they were local, fully-owned resources. In this article we give, for the first time, a complete description of the framework’s architecture and all of its capabilities, describing exactly what is possible with it and what its requirements are.

  20. Power Hardware-in-the-Loop Evaluation of PV Inverter Grid Support on Hawaiian Electric Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Austin A; Prabakar, Kumaraguru; Nagarajan, Adarsh

    As more grid-connected photovoltaic (PV) inverters become compliant with evolving interconnections requirements, there is increased interest from utilities in understanding how to best deploy advanced grid-support functions (GSF) in the field. One efficient and cost-effective method to examine such deployment options is to leverage power hardware-in-the-loop (PHIL) testing methods, which combine the fidelity of hardware tests with the flexibility of computer simulation. This paper summarizes a study wherein two Hawaiian Electric feeder models were converted to real-time models using an OPAL-RT real-time digital testing platform, and integrated with models of GSF capable PV inverters based on characterization test data. The integrated model was subsequently used in PHIL testing to evaluate the effects of different fixed power factor and volt-watt control settings on voltage regulation of the selected feeders using physical inverters. Selected results are presented in this paper, and complete results of this study were provided as inputs for field deployment and technical interconnection requirements for grid-connected PV inverters on the Hawaiian Islands.

  1. Experimentation and evaluation of advanced integrated system concepts

    NASA Astrophysics Data System (ADS)

    Ross, M.; Garrigus, K.; Gottschalck, J.; Rinearson, L.; Longee, E.

    1980-09-01

    This final report examines the implementation of a time-phased test bed for experimentation and evaluation of advanced system concepts relative to the future Defense Switched Network (DSN). After identifying issues pertinent to the DSN, a set of experiments which address these issues are developed. Experiments are ordered based on their immediacy and relative importance to DSN development. The set of experiments thus defined allows requirements for a time phased implementation of a test bed to be identified, and several generic test bed architectures which meet these requirements are examined. Specific architecture implementations are costed and cost/schedule profiles are generated as a function of experimental capability. The final recommended system consists of two separate test beds: a circuit switch test bed, configured around an off-the-shelf commercial switch, and directed toward the examination of nearer term and transitional issues raised by the evolving DSN; and a packet/hybrid test bed, featuring a discrete buildup of new hardware and software modules, and directed toward examination of the more advanced integrated voice and data telecommunications issues and concepts.

  2. Bio-inspired feedback-circuit implementation of discrete, free energy optimizing, winner-take-all computations.

    PubMed

    Genewein, Tim; Braun, Daniel A

    2016-06-01

    Bayesian inference and bounded rational decision-making require the accumulation of evidence or utility, respectively, to transform a prior belief or strategy into a posterior probability distribution over hypotheses or actions. Crucially, this process cannot be simply realized by independent integrators, since the different hypotheses and actions also compete with each other. In continuous time, this competitive integration process can be described by a special case of the replicator equation. Here we investigate simple analog electric circuits that implement the underlying differential equation under the constraint that we only permit a limited set of building blocks that we regard as biologically interpretable, such as capacitors, resistors, voltage-dependent conductances and voltage- or current-controlled current and voltage sources. The appeal of these circuits is that they intrinsically perform normalization without requiring an explicit divisive normalization. However, even in idealized simulations, we find that these circuits are very sensitive to internal noise as they accumulate error over time. We discuss to what extent neural circuits could implement these operations, which might provide a generic competitive principle underlying both perception and action.
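    The competitive integration dynamic can be sketched directly from the replicator equation. This is a plain Euler discretization with made-up evidence rates, not the authors' circuit simulation; the mean-fitness coupling term is what makes the integrators compete while keeping the distribution normalized:

```python
import numpy as np

def replicator_step(x, fitness, dt):
    """One Euler step of the replicator equation dx_i/dt = x_i (f_i - <f>),
    where <f> = sum_j x_j f_j. The mean-fitness term couples the integrators:
    probability mass flows toward the option with the highest accumulated
    evidence, and the total stays normalized without explicit division."""
    avg = np.dot(x, fitness)
    return x + dt * x * (fitness - avg)

# Four competing hypotheses with hypothetical, constant evidence rates.
x = np.full(4, 0.25)
f = np.array([1.0, 2.0, 3.0, 4.0])
for _ in range(2000):
    x = replicator_step(x, f, dt=0.01)
# x now concentrates on the highest-rate hypothesis (winner-take-all).
```

    Note that the update preserves the sum of `x` exactly in the continuous limit (the increments sum to zero by construction), which is the intrinsic normalization property the circuits exploit.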

  3. Method to manage integration error in the Green-Kubo method.

    PubMed

    Oliveira, Laura de Sousa; Greaney, P Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.
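    The truncation-versus-noise trade-off described above can be illustrated on synthetic data. Here an exponentially correlated surrogate stands in for the heat flux, so the exact answer (the correlation time) is known; all parameters are illustrative and this is not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "heat flux": an AR(1)/Ornstein-Uhlenbeck-like series with unit
# variance and known correlation time tau, standing in for MD flux data.
n, dt, tau = 200_000, 1.0, 10.0
a = np.exp(-dt / tau)
J = np.empty(n)
J[0] = rng.standard_normal()
for i in range(1, n):
    J[i] = a * J[i - 1] + np.sqrt(1.0 - a * a) * rng.standard_normal()

# Autocorrelation via FFT (zero-padded to avoid wrap-around), then the
# running Green-Kubo integral kappa(t) up to physical prefactors.
m = 200                                     # max lag considered
F = np.fft.rfft(J, 2 * n)
acf = np.fft.irfft(F * np.conj(F))[:m] / (n - np.arange(m))
running = np.cumsum(acf) * dt

# Truncating at a few correlation times trades systematic truncation error
# against the random-walk noise accumulated in the tail of the integral.
kappa = running[int(5 * tau)]               # expected value ~ tau = 10
```

    For this surrogate the exact integral is the correlation time (10 here), so the running integral at five correlation times recovers it to within the remnant noise, while integrating far into the tail would only add a random walk of accumulated error.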

  4. Method to manage integration error in the Green-Kubo method

    NASA Astrophysics Data System (ADS)

    Oliveira, Laura de Sousa; Greaney, P. Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.

  5. Integration of virtual and real scenes within an integral 3D imaging environment

    NASA Astrophysics Data System (ADS)

    Ren, Jinsong; Aggoun, Amar; McCormick, Malcolm

    2002-11-01

    The Imaging Technologies group at De Montfort University has developed an integral 3D imaging system, which is seen as the most likely vehicle for 3D television because it avoids adverse psychological effects. To create truly compelling three-dimensional television programs, a virtual studio is required that generates, edits and integrates 3D content involving virtual and real scenes. The paper presents, for the first time, the procedures, factors and methods of integrating computer-generated virtual scenes with real objects captured using the 3D integral imaging camera system. The method of computer generation of 3D integral images, in which the lens array is modelled instead of the physical camera, is described. In the model, each micro-lens that captures different elemental images of the virtual scene is treated as an extended pinhole camera. An integration process named integrated rendering is illustrated. Detailed discussion and investigation focus on depth extraction from captured integral 3D images. The depth calculation method from disparity, and the multiple-baseline method used to improve the precision of depth estimation, are also presented. The concept of colour SSD and its further improvement in precision are proposed and verified.
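    The SSD-based disparity estimation mentioned above can be sketched in its simplest grayscale form. The paper's colour SSD and multiple-baseline refinements build on this basic block-matching idea; the window size, search range, and image sizes here are illustrative:

```python
import numpy as np

def ssd_disparity(ref, target, x, y, win, max_disp):
    """Sum-of-squared-differences block matching along one image row:
    returns the horizontal shift d that best aligns a (win x win) window
    of `ref` centred at (x, y) with `target`. A minimal grayscale sketch
    of SSD matching for disparity/depth estimation."""
    h = win // 2
    patch = ref[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    best_ssd, best_d = np.inf, 0
    for d in range(max_disp + 1):
        cand = target[y - h:y + h + 1, x - d - h:x - d + h + 1].astype(float)
        ssd = float(np.sum((patch - cand) ** 2))
        if ssd < best_ssd:
            best_ssd, best_d = ssd, d
    return best_d
```

    The disparity found at each pixel converts to depth through the camera geometry; averaging SSD scores over multiple baselines, as in the paper, sharpens the minimum and improves precision.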

  6. Temporal Integration of Auditory Information Is Invariant to Temporal Grouping Cues

    PubMed

    Liu, Andrew S K; Tsunada, Joji; Gold, Joshua I; Cohen, Yale E

    2015-01-01

    Auditory perception depends on the temporal structure of incoming acoustic stimuli. Here, we examined whether a temporal manipulation that affects the perceptual grouping also affects the time dependence of decisions regarding those stimuli. We designed a novel discrimination task that required human listeners to decide whether a sequence of tone bursts was increasing or decreasing in frequency. We manipulated temporal perceptual-grouping cues by changing the time interval between the tone bursts, which led to listeners hearing the sequences as a single sound for short intervals or discrete sounds for longer intervals. Despite these strong perceptual differences, this manipulation did not affect the efficiency of how auditory information was integrated over time to form a decision. Instead, the grouping manipulation affected subjects' speed-accuracy trade-offs. These results indicate that the temporal dynamics of evidence accumulation for auditory perceptual decisions can be invariant to manipulations that affect the perceptual grouping of the evidence.
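
    The evidence-accumulation account referenced here is commonly formalized as a drift-diffusion model, in which noisy evidence is integrated to a decision bound; moving the bound trades speed against accuracy. A minimal sketch (the parameters are arbitrary, and this is a generic textbook model rather than the authors' analysis):

```python
import numpy as np

def ddm_trial(drift, bound, dt=0.005, noise=1.0, rng=None):
    """One drift-diffusion trial: integrate noisy evidence until a bound is
    crossed.  Returns (choice, reaction_time); choice 1 is the correct
    response when drift > 0.  Raising the bound slows responses but makes
    them more accurate -- the speed-accuracy trade-off."""
    rng = rng if rng is not None else np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 0), t
```

    Simulating many trials with a low versus a high bound reproduces the trade-off: the high-bound condition is slower but more accurate, while the drift rate (integration efficiency) is unchanged, mirroring the dissociation reported in the abstract.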

  7. Efficient Trajectory Propagation for Orbit Determination Problems

    NASA Technical Reports Server (NTRS)

    Roa, Javier; Pelaez, Jesus

    2015-01-01

    Regularized formulations of orbital motion apply a series of techniques to improve the numerical integration of the orbit. Despite their advantages and potential applications, little attention has been paid to the propagation of the partial derivatives of the corresponding set of elements or coordinates, which are required in many orbit-determination scenarios and optimization problems. This paper fills this gap by presenting the general procedure for integrating the state-transition matrix of the system together with the nominal trajectory using regularized formulations and different sets of elements. The main difficulty comes from introducing an independent variable other than time, because the solution then needs to be synchronized. The correction of the time delay is treated from a generic perspective, not focused on any particular formulation. Synchronization using time elements is also discussed. Numerical examples include strongly perturbed orbits in the Pluto system, motivated by the recent flyby of the New Horizons spacecraft, together with a geocentric flyby of the NEAR spacecraft.
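
    The core bookkeeping, integrating the state-transition matrix Phi alongside the nominal trajectory via the variational equations dPhi/dt = A(x)Phi, can be sketched for the plain unregularized case where time is the independent variable (the paper's contribution is extending this to regularized formulations, where the synchronization issue arises). Normalized units and a two-body force model are simplifying assumptions here:

```python
import numpy as np

mu = 1.0  # gravitational parameter (normalized units)

def f(x):
    """Planar two-body dynamics: x = [rx, ry, vx, vy]."""
    r = x[:2]
    rn = np.linalg.norm(r)
    return np.concatenate([x[2:], -mu * r / rn**3])

def jacobian(x):
    """df/dx: the gravity-gradient block drives the variational equations."""
    r = x[:2]
    rn = np.linalg.norm(r)
    G = mu * (3.0 * np.outer(r, r) / rn**5 - np.eye(2) / rn**3)
    A = np.zeros((4, 4))
    A[:2, 2:] = np.eye(2)
    A[2:, :2] = G
    return A

def rk4_step(x, Phi, h):
    """One RK4 step of the augmented system (state + state-transition matrix)."""
    def g(x, Phi):
        return f(x), jacobian(x) @ Phi
    k1x, k1P = g(x, Phi)
    k2x, k2P = g(x + 0.5 * h * k1x, Phi + 0.5 * h * k1P)
    k3x, k3P = g(x + 0.5 * h * k2x, Phi + 0.5 * h * k2P)
    k4x, k4P = g(x + h * k3x, Phi + h * k3P)
    return (x + h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            Phi + h / 6 * (k1P + 2 * k2P + 2 * k3P + k4P))

def propagate(x0, t_end, h=1e-3):
    """Propagate the state and its STM from the identity over [0, t_end]."""
    x, Phi = x0.copy(), np.eye(4)
    for _ in range(int(t_end / h)):
        x, Phi = rk4_step(x, Phi, h)
    return x, Phi
```

    A standard consistency check in orbit-determination software is that a finite-difference derivative of the numerical flow agrees with Phi to within the linearization error.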

  8. Time-reversal symmetry breaking with acoustic pumping of nanophotonic circuits

    NASA Astrophysics Data System (ADS)

    Sohn, Donggyu B.; Kim, Seunghwi; Bahl, Gaurav

    2018-02-01

    Achieving non-reciprocal light propagation via stimuli that break time-reversal symmetry, without magneto-optics, remains a major challenge for integrated nanophotonic devices. Recently, optomechanical microsystems in which light and vibrational modes are coupled through ponderomotive forces have demonstrated strong non-reciprocal effects through a variety of techniques, but always using optical pumping. None of these approaches has demonstrated bandwidth exceeding that of the mechanical system, and all of them require optical power; these are both fundamental and practical issues. Here, we resolve both challenges by breaking time-reversal symmetry using a two-dimensional acoustic pump that simultaneously provides a non-zero overlap integral for light-sound interaction and also satisfies the necessary phase-matching. We use this technique to produce a non-reciprocal modulator (a frequency shifting isolator) by means of indirect interband scattering. We demonstrate mode conversion asymmetry up to 15 dB and efficiency as high as 17% over a bandwidth exceeding 1 GHz.

  9. SURGNET: An Integrated Surgical Data Transmission System for Telesurgery.

    PubMed

    Natarajan, Sriram; Ganz, Aura

    2009-01-01

    Remote surgery requires quick and reliable transmission of information between the surgeon and the patient site. However, the networks that interconnect the surgeon and patient sites are usually time-varying and lossy, which can cause packet loss and delay jitter. In this paper we propose SURGNET, a telesurgery system for which we developed the architecture and algorithms and implemented it on a testbed. The algorithms include adaptive packet prediction and buffer time adjustment techniques, which reduce the negative effects caused by lossy, time-varying networks. To evaluate the proposed SURGNET system, at the therapist site we implemented a therapist panel which controls the force-feedback device movements and provides image analysis functionality. At the patient site we controlled a virtual reality applet built in Matlab. The varying network conditions were emulated using the NISTNet emulator. Our results show that even under severe packet loss and variable delay jitter, the proposed integrated synchronization techniques significantly improve SURGNET performance.
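
    Buffer time adjustment of this kind is typically driven by running estimates of network delay and jitter. The sketch below uses RTP-style exponential moving averages and is a generic illustration, not the SURGNET algorithm itself; the smoothing coefficients and safety multiplier are assumptions.

```python
class AdaptiveJitterBuffer:
    """Adaptive playout-delay estimator: track the mean network delay and
    its variation with exponential moving averages, then size the playout
    buffer to absorb most of the observed jitter."""

    def __init__(self, alpha=0.125, beta=0.25, k=4.0):
        self.alpha = alpha        # smoothing for the mean delay
        self.beta = beta          # smoothing for the jitter estimate
        self.k = k                # safety multiplier on the jitter
        self.mean_delay = None
        self.jitter = 0.0

    def on_packet(self, send_ts, recv_ts):
        """Update the estimates from one packet; return the target buffer time."""
        delay = recv_ts - send_ts
        if self.mean_delay is None:
            self.mean_delay = delay
        else:
            self.jitter = ((1 - self.beta) * self.jitter
                           + self.beta * abs(delay - self.mean_delay))
            self.mean_delay = ((1 - self.alpha) * self.mean_delay
                               + self.alpha * delay)
        return self.mean_delay + self.k * self.jitter
```

    On a stable network the target buffer time converges to the mean delay; when jitter appears, the buffer grows to cover roughly k standard deviations of it, trading latency for smooth playout.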

  10. Variational Algorithms for Test Particle Trajectories

    NASA Astrophysics Data System (ADS)

    Ellison, C. Leland; Finn, John M.; Qin, Hong; Tang, William M.

    2015-11-01

    The theory of variational integration provides a novel framework for constructing conservative numerical methods for magnetized test particle dynamics. The retention of conservation laws in the numerical time advance captures the correct qualitative behavior of the long time dynamics. For modeling the Lorentz force system, new variational integrators have been developed that are both symplectic and electromagnetically gauge invariant. For guiding center test particle dynamics, discretization of the phase-space action principle yields multistep variational algorithms, in general. Obtaining the desired long-term numerical fidelity requires mitigation of the multistep method's parasitic modes or applying a discretization scheme that possesses a discrete degeneracy to yield a one-step method. Dissipative effects may be modeled using Lagrange-D'Alembert variational principles. Numerical results will be presented using a new numerical platform that interfaces with popular equilibrium codes and utilizes parallel hardware to achieve reduced times to solution. This work was supported by DOE Contract DE-AC02-09CH11466.
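
    The simplest example of a variational integrator is Stormer-Verlet, obtained by discretizing the action integral; because the resulting update is symplectic, the energy error stays bounded over very long runs instead of drifting, which is the qualitative long-time behaviour described above. A pendulum sketch follows (not the gauge-invariant Lorentz-force or guiding-center integrators of the paper):

```python
import numpy as np

def verlet(q0, p0, force, h, n_steps):
    """Stormer-Verlet (leapfrog): a variational, symplectic one-step method,
    derivable by discretizing the phase-space action with a midpoint rule."""
    q, p = q0, p0
    traj = [(q, p)]
    for _ in range(n_steps):
        p_half = p + 0.5 * h * force(q)    # half kick
        q = q + h * p_half                 # drift
        p = p_half + 0.5 * h * force(q)    # half kick
        traj.append((q, p))
    return np.array(traj)

# Pendulum, H = p**2/2 - cos(q): the energy error remains bounded
# over 10^4 time units rather than accumulating secularly.
traj = verlet(1.0, 0.0, lambda q: -np.sin(q), h=0.1, n_steps=100_000)
energy = 0.5 * traj[:, 1] ** 2 - np.cos(traj[:, 0])
print("max |energy error|:", np.abs(energy - energy[0]).max())
```

    A non-symplectic method of the same order (e.g. explicit midpoint used naively) would show a slow secular energy drift on this problem; the bounded oscillation of the Verlet energy error is the discrete analogue of the conservation laws retained by variational discretization.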

  11. Viscous-inviscid interaction method including wake effects for three-dimensional wing-body configurations

    NASA Technical Reports Server (NTRS)

    Streett, C. L.

    1981-01-01

    A viscous-inviscid interaction method has been developed by using a three-dimensional integral boundary-layer method which produces results in good agreement with a finite-difference method in a fraction of the computer time. The integral method is stable and robust and incorporates a model for computation in a small region of streamwise separation. A locally two-dimensional wake model, accounting for thickness and curvature effects, is also included in the interaction procedure. Computation time spent in converging an interacted result is often only slightly greater than that required to converge an inviscid calculation. Results are shown from the interaction method, run at the experimental angle of attack, Reynolds number, and Mach number, on a wing-body test case for which viscous effects are large. Agreement with experiment is good; in particular, the present wake model improves prediction of the spanwise lift distribution and lower-surface cove pressure.

  12. Optical biosensor technologies for molecular diagnostics at the point-of-care

    NASA Astrophysics Data System (ADS)

    Schotter, Joerg; Schrittwieser, Stefan; Muellner, Paul; Melnik, Eva; Hainberger, Rainer; Koppitsch, Guenther; Schrank, Franz; Soulantika, Katerina; Lentijo-Mozo, Sergio; Pelaz, Beatriz; Parak, Wolfgang; Ludwig, Frank; Dieckhoff, Jan

    2015-05-01

    Label-free optical schemes for molecular biosensing hold a strong promise for point-of-care applications in medical research and diagnostics. Apart from diagnostic requirements in terms of sensitivity, specificity, and multiplexing capability, other aspects such as ease of use and manufacturability must also be considered in order to pave the way to a practical implementation. We present integrated optical waveguide as well as magnetic nanoparticle based molecular biosensor concepts that address these aspects. The integrated optical waveguide devices are based on low-loss photonic wires made of silicon nitride deposited by a CMOS-compatible plasma-enhanced chemical vapor deposition (PECVD) process that allows for backend integration of waveguides on optoelectronic CMOS chips. The molecular detection principle relies on evanescent wave sensing in the 0.85 μm wavelength regime by means of Mach-Zehnder interferometers, which enables on-chip integration of silicon photodiodes and, thus, the realization of system-on-chip solutions. Our nanoparticle-based approach relies on optical observation of the dynamic response of functionalized magnetic-core/noble-metal-shell nanorods ('nanoprobes') to an externally applied time-varying magnetic field. As target molecules specifically bind to the surface of the nanoprobes, the observed dynamics of the nanoprobes changes, and the concentration of target molecules in the sample solution can be quantified. This approach is suitable for dynamic real-time measurements and requires only minimal sample preparation, thus presenting a highly promising point-of-care diagnostic system. In this paper, we present a prototype of a diagnostic device suitable for highly automated sample analysis by our nanoparticle-based approach.

  13. High throughput wafer defect monitor for integrated metrology applications in photolithography

    NASA Astrophysics Data System (ADS)

    Rao, Nagaraja; Kinney, Patrick; Gupta, Anand

    2008-03-01

    The traditional approach to semiconductor wafer inspection is based on the use of stand-alone metrology tools, which while highly sensitive, are large, expensive and slow, requiring inspection to be performed off-line and on a lot sampling basis. Due to the long cycle times and sparse sampling, the current wafer inspection approach is not suited to rapid detection of process excursions that affect yield. The semiconductor industry is gradually moving towards deploying integrated metrology tools for real-time "monitoring" of product wafers during the manufacturing process. Integrated metrology aims to provide end-users with rapid feedback of problems during the manufacturing process, and the benefit of increased yield, and reduced rework and scrap. The approach of monitoring 100% of the wafers being processed requires some trade-off in sensitivity compared to traditional standalone metrology tools, but not by much. This paper describes a compact, low-cost wafer defect monitor suitable for integrated metrology applications and capable of detecting submicron defects on semiconductor wafers at an inspection rate of about 10 seconds per wafer (or 360 wafers per hour). The wafer monitor uses a whole wafer imaging approach to detect defects on both un-patterned and patterned wafers. Laboratory tests with a prototype system have demonstrated sensitivity down to 0.3 µm on un-patterned wafers and down to 1 µm on patterned wafers, at inspection rates of 10 seconds per wafer. An ideal application for this technology is preventing photolithography defects such as "hot spots" by implementing a wafer backside monitoring step prior to exposing wafers in the lithography step.

  14. Electronic information and clinical decision support for prescribing: state of play in Australian general practice

    PubMed Central

    Robertson, Jane; Moxey, Annette J; Newby, David A; Gillies, Malcolm B; Williamson, Margaret; Pearson, Sallie-Anne

    2011-01-01

    Background. Investments in eHealth worldwide have been mirrored in Australia, with >90% of general practices computerized. Recent eHealth incentives promote the use of up to date electronic information sources relevant to general practice with flexibility in mode of access. Objective. To determine GPs’ access to and use of electronic information sources and computerized clinical decision support systems (CDSSs) for prescribing. Methods. Semi-structured interviews were conducted with 18 experienced GPs and nine GP trainees in New South Wales, Australia in 2008. A thematic analysis of interview transcripts was undertaken. Results. Information needs varied with clinical experience, and people resources (specialists, GP peers and supervisors for trainees) were often preferred over written formats. Experienced GPs used a small number of electronic resources and accessed them infrequently. Familiarity from training and early clinical practice and easy access were dominant influences on resource use. Practice time constraints meant relevant information needed to be readily accessible during consultations, requiring integration or direct access from prescribing software. Quality of electronic resource content was assumed and cost a barrier for some GPs. Conclusions. The current Australian practice incentives do not prescribe which information resources GPs should use. Without integration into practice computing systems, uptake and routine use seem unlikely. CDSS developments must recognize the time pressures of practice, preference for integration and cost concerns. Minimum standards are required to ensure that high-quality information resources are integrated and regularly updated. Without standards, the anticipated benefits of computerization on patient safety and health outcomes will be uncertain. PMID:21109619

  15. Impacts of EHR Certification and Meaningful Use Implementation on an Integrated Delivery Network.

    PubMed

    Bowes, Watson A

    2014-01-01

    Three years ago Intermountain Healthcare made the decision to participate in the Medicare and Medicaid Electronic Health Record (EHR) Incentive Program, which required that hospitals and providers use a certified EHR in a meaningful way. At that time, the barriers to enhancing our home-grown system and changing clinician workflows were numerous and large. This paper describes the time and effort required to enhance our legacy systems in order to pass certification, including filling 47 gaps in EHR functionality. We also describe the processes and resources that resulted in successful changes to the many clinical workflows required for clinicians to meet meaningful use requirements. In 2011 we set meaningful use targets of 75% of employed physicians and 75% of our hospitals meeting Stage 1 of meaningful use by 2013. By the end of 2013, 87% of 696 employed eligible professionals and 100% of 22 Intermountain hospitals had successfully attested for Stage 1. This paper describes documented and perceived costs to Intermountain, including time, effort, resources, and postponement of other projects, as well as documented and perceived benefits of attainment of meaningful use.

  16. Development of a light-weight, wind-turbine-rotor-based data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, D.E.; Rumsey, M.; Robertson, P.

    1997-12-01

    Wind-energy researchers at Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) are developing a new, light-weight, modular system capable of acquiring long-term, continuous time-series data from current-generation small or large, dynamic wind-turbine rotors. Meetings with wind-turbine research personnel at NREL and SNL resulted in a list of the major requirements that the system must meet. Initial attempts to locate a commercial system that could meet all of these requirements were not successful, but some commercially available data acquisition and radio/modem subsystems that met many of the requirements were identified. A time synchronization subsystem and a programmable logic device subsystem to integrate the functions of the data acquisition, radio/modem, and time synchronization subsystems and to communicate with the user have been developed at SNL. This paper presents the data system requirements, describes the four major subsystems comprising the system, summarizes the current status of the system, and presents the current plans for near-term development of hardware and software.

  17. On-patient see-through augmented reality based on visual SLAM.

    PubMed

    Mahmoud, Nader; Grasa, Óscar G; Nicolau, Stéphane A; Doignon, Christophe; Soler, Luc; Marescaux, Jacques; Montiel, J M M

    2017-01-01

    An augmented reality system to visualize a 3D preoperative anatomical model on the intra-operative patient is proposed. The only hardware requirement is a commercial tablet-PC equipped with a camera; no external tracking device or artificial landmarks on the patient are required. We resort to visual SLAM to provide markerless real-time tablet-PC camera localization with respect to the patient. The preoperative model is registered with respect to the patient through 4-6 anchor points. The anchors correspond to anatomical references selected on the tablet-PC screen at the beginning of the procedure. Accurate and real-time preoperative model alignment (approximately 5-mm mean FRE and TRE) was achieved, even when anchors were not visible in the current field of view. The system has been experimentally validated on human volunteers, in vivo pigs and a phantom. The proposed system can be smoothly integrated into the surgical workflow because it: (1) operates in real time, (2) requires minimal additional hardware (only a tablet-PC with a camera), (3) is robust to occlusion, and (4) requires minimal interaction from the medical staff.
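
    The 4-6 anchor-point alignment is, in essence, a point-based rigid registration problem. A standard least-squares solution (the Kabsch/Umeyama algorithm, without scale) together with the fiducial registration error (FRE) reported above can be sketched as follows; this is a generic formulation, not the paper's implementation.

```python
import numpy as np

def rigid_register(model_pts, patient_pts):
    """Least-squares rigid registration of preoperative anchor points
    (N x 3) to their intra-operative counterparts: returns R, t such that
    patient ~ R @ model + t.  Kabsch algorithm via SVD of the covariance."""
    mc = model_pts.mean(axis=0)
    pc = patient_pts.mean(axis=0)
    H = (model_pts - mc).T @ (patient_pts - pc)
    U, _, Vt = np.linalg.svd(H)
    # reflection guard: force det(R) = +1 so R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = pc - R @ mc
    return R, t

def fre(model_pts, patient_pts, R, t):
    """Fiducial registration error: RMS residual at the anchor points."""
    residuals = patient_pts - (model_pts @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

    In a full system the FRE at the anchors is only a proxy; the clinically relevant figure is the target registration error (TRE) at points away from the anchors, which the abstract reports separately.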

  18. A Three Month Comparative Evaluation of the Effect of Different Surface Treatment Agents on the Surface Integrity and Softness of Acrylic based Soft Liner: An In vivo Study

    PubMed Central

    Mahajan, Neerja; Naveen, Y. G.; Sethuraman, Rajesh

    2017-01-01

    Introduction Acrylic based soft liners are cost-effective, yet are inferior in durability as compared to silicone based liners. Hence, this study was conducted to evaluate if the softness and surface integrity of acrylic based soft liner can be maintained by using different surface treatment agents. Aim To comparatively evaluate the effects of Varnish, Monopoly and Kregard surface treatment agents on the surface integrity and softness of acrylic based soft liner at baseline, at one month and after three months. Materials and Methods A total of 37 participants who required conventional maxillary dentures were selected according to the determined inclusion and exclusion criteria of the study. In the maxillary denture, on the denture bearing surface, eight palatal recesses (5 mm x 3 mm) were made and filled with acrylic based soft liner (Permasoft). The soft liners in these recesses were given surface treatment and divided as control (uncoated), Varnish, Monopoly and Kregard groups. The hardness and surface integrity were evaluated with a Shore A Durometer and Scanning Electron Microscope (SEM) respectively at baseline, one month and three month intervals. Surface integrity between groups was compared using the Kruskal-Wallis test. Intergroup comparison for hardness was done using ANOVA and Tukey’s HSD post-hoc tests. Results Amongst all the groups tested, surface integrity was maintained in the Kregard group, as compared to the control, Varnish and Monopoly groups, for all three time intervals (p<0.001). Kregard treated samples also demonstrated significantly higher softness at all the time intervals (p<0.001). Conclusion Surface treatment with Kregard demonstrated better surface integrity and softness at all the time intervals. PMID:29207842

  19. Gender differences in limited duty time for lower limb injury.

    PubMed

    Holsteen, K K; Choi, Y S; Bedno, S A; Nelson, D A; Kurina, L M

    2018-02-16

    Among active-duty military personnel, lower limb musculoskeletal injuries and related conditions (injuries) frequently arise as unintended consequences of physical training. These injuries are particularly common among women. The practical impact of such injuries on temporary military occupational disability has not been estimated with precision on a large scale. Our aims were to determine the proportion of service time compromised by limited duty days attributable to lower limb injuries, to characterize the time affected by these limitations in terms of specific lower limb regions, and to compare limited duty time between male and female soldiers. Administrative data and individual limited duty assignments (profiles) were obtained for active-duty US Army personnel who served in 2014. Lower limb injury-related profiles were used to calculate the percent of person-time requiring duty limitations by gender and body region. The study group was 568 753 soldiers, of whom 14% were women. Nearly 13% of service days for active-duty US Army soldiers required limited duty for lower limb injuries during 2014. Knee injuries were responsible for 45% of those days. Within integrated military occupations, female soldiers experienced 27-57% more time on limited duty for lower limb injuries compared with men. The substantial amount of limited duty for lower limb musculoskeletal injuries among soldiers highlights the need for improvement in training-related injury screening, prevention and timely treatment, with particular attention to knee injuries. The disproportionate impact of lower limb injuries on female soldiers' occupational functions should be a surveillance priority in the current environment of expanding gender-integrated training. Published by Oxford University Press on behalf of The Society of Occupational Medicine 2017.

  20. Best Merge Region Growing Segmentation with Integrated Non-Adjacent Region Object Aggregation

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Tarabalka, Yuliya; Montesano, Paul M.; Gofman, Emanuel

    2012-01-01

    Best merge region growing normally produces segmentations with closed connected region objects. Recognizing that spectrally similar objects often appear in spatially separate locations, we present an approach for tightly integrating best merge region growing with non-adjacent region object aggregation, which we call Hierarchical Segmentation or HSeg. However, the original implementation of non-adjacent region object aggregation in HSeg required excessive computing time, even for moderately sized images, because of the required intercomparison of each region with all other regions. This problem was previously addressed by a recursive approximation of HSeg, called RHSeg. In this paper we introduce a refined implementation of non-adjacent region object aggregation in HSeg that reduces the computational requirements of HSeg without resorting to the recursive approximation. In this refinement, HSeg's region intercomparisons among non-adjacent regions are limited to regions of a dynamically determined minimum size. We show that this refined version of HSeg can process moderately sized images in about the same amount of time as RHSeg incorporating the original HSeg. Nonetheless, RHSeg is still required for processing very large images due to its lower computer memory requirements and amenability to parallel processing. We then note a limitation of RHSeg with the original HSeg for high-spatial-resolution images, and show how incorporating the refined HSeg into RHSeg overcomes this limitation. The quality of the image segmentations produced by the refined HSeg is then compared with that of other available best merge segmentation approaches. Finally, we comment on the unique nature of the hierarchical segmentations produced by HSeg.
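
    The baseline operation on which HSeg builds, plain best-merge region growing, can be sketched as follows: every pixel starts as its own region and the most spectrally similar adjacent pair is merged repeatedly. HSeg additionally aggregates similar non-adjacent regions, with the refinement described above limiting those extra comparisons to regions over a minimum size. The dissimilarity measure (absolute difference of region means) and exhaustive pair search here are illustrative simplifications, not the HSeg implementation.

```python
import numpy as np

def best_merge_segment(image, n_regions):
    """Best-merge region growing on a 2-D grey-level image: one region per
    pixel initially; repeatedly merge the pair of *adjacent* regions whose
    mean values are closest, until `n_regions` remain."""
    h, w = image.shape
    labels = np.arange(h * w).reshape(h, w)
    sums = {i: float(v) for i, v in enumerate(image.ravel())}
    counts = {i: 1 for i in sums}
    adj = {i: set() for i in sums}           # region adjacency graph
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                if y + dy < h and x + dx < w:
                    a, b = labels[y, x], labels[y + dy, x + dx]
                    adj[a].add(b)
                    adj[b].add(a)

    def mean(i):
        return sums[i] / counts[i]

    while len(sums) > n_regions:
        # most similar adjacent pair (exhaustive search: a sketch, not optimized)
        a, b = min(((a, b) for a in adj for b in adj[a] if a < b),
                   key=lambda ab: abs(mean(ab[0]) - mean(ab[1])))
        sums[a] += sums.pop(b)
        counts[a] += counts.pop(b)
        neighbours = adj.pop(b)
        adj[a].discard(b)
        for c in neighbours:
            adj[c].discard(b)
            if c != a:
                adj[c].add(a)
                adj[a].add(c)
        labels[labels == b] = a
    return labels
```

    Extending the merge candidates beyond the adjacency graph to all sufficiently large region pairs gives the non-adjacent aggregation of HSeg, and is exactly where the quadratic intercomparison cost that motivates the paper's refinement comes from.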
