Sample records for observing time required

  1. Space station needs, attributes, and architectural options study. Volume 1: Missions and requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Science and applications, NOAA environmental observation, commercial resource observations, commercial space processing, commercial communications, national security, technology development, and GEO servicing are addressed. Approach to time phasing of mission requirements, system sizing summary, time-phased user mission payload support, space station facility requirements, and integrated time-phased system requirements are also addressed.

  2. 5 CFR 550.1002 - Compensatory time off for religious observances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Compensatory time off for religious... personal religious beliefs require the abstention from work during certain periods of time may elect to... religious observances when the employee's personal religious beliefs require that the employee abstain from...

  3. 5 CFR 550.1002 - Compensatory time off for religious observances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Compensatory time off for religious... personal religious beliefs require the abstention from work during certain periods of time may elect to... religious observances when the employee's personal religious beliefs require that the employee abstain from...

  4. 5 CFR 550.1002 - Compensatory time off for religious observances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Compensatory time off for religious... personal religious beliefs require the abstention from work during certain periods of time may elect to... religious observances when the employee's personal religious beliefs require that the employee abstain from...

  5. 5 CFR 550.1002 - Compensatory time off for religious observances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Compensatory time off for religious... personal religious beliefs require the abstention from work during certain periods of time may elect to... religious observances when the employee's personal religious beliefs require that the employee abstain from...

  6. 5 CFR 550.1002 - Compensatory time off for religious observances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Compensatory time off for religious... personal religious beliefs require the abstention from work during certain periods of time may elect to... religious observances when the employee's personal religious beliefs require that the employee abstain from...

  7. Rational reduction of periodic propagators for off-period observations.

    PubMed

    Blanton, Wyndham B; Logan, John W; Pines, Alexander

    2004-02-01

    Many common solid-state nuclear magnetic resonance problems take advantage of the periodicity of the underlying Hamiltonian to simplify the computation of an observation. Most of the time-domain methods used, however, require the time step between observations to be some integer or reciprocal-integer multiple of the period, thereby restricting the observation bandwidth. Calculations of off-period observations are usually reduced to brute force direct methods resulting in many demanding matrix multiplications. For large spin systems, the matrix multiplication becomes the limiting step. A simple method that can dramatically reduce the number of matrix multiplications required to calculate the time evolution when the observation time step is some rational fraction of the period of the Hamiltonian is presented. The algorithm implements two different optimization routines. One uses pattern matching and additional memory storage, while the other recursively generates the propagators via time shifting. The net result is a significant speed improvement for some types of time-domain calculations.
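
    A minimal sketch of the caching idea described above (Python, with random unitaries standing in for the actual sub-period propagators of a spin Hamiltonian; this is not the authors' implementation): if the observation step is p/q of the period T, every propagator to an observation time factors into a power of the full-period propagator times one of only q distinct partial products, each of which needs to be formed just once.

      import numpy as np

      def subperiod_propagators(dim, q, period, seed=0):
          # Stand-in for the q propagators U(t_k, t_k + T/q) that would normally be
          # obtained by integrating the periodic Hamiltonian over each sub-interval.
          rng = np.random.default_rng(seed)
          us = []
          for _ in range(q):
              a = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
              h = 0.5 * (a + a.conj().T)                     # Hermitian generator
              w, v = np.linalg.eigh(h)
              us.append(v @ np.diag(np.exp(-1j * w * period / q)) @ v.conj().T)
          return us

      def propagator_to_observation(n, p, q, us, cache):
          # Propagator from t = 0 to t_n = n * (p/q) * T, reusing cached products so
          # the full-period propagator and each partial product are built only once.
          dim = us[0].shape[0]
          m, k = divmod(n * p, q)                            # m full periods + k sub-steps
          if "U_T" not in cache:
              u = np.eye(dim, dtype=complex)
              for uk in us:
                  u = uk @ u                                 # later sub-steps act on the left
              cache["U_T"] = u
          if k not in cache:
              u = np.eye(dim, dtype=complex)
              for uk in us[:k]:
                  u = uk @ u
              cache[k] = u
          return cache[k] @ np.linalg.matrix_power(cache["U_T"], m)

      us = subperiod_propagators(dim=4, q=5, period=1.0)
      cache = {}
      u3 = propagator_to_observation(n=3, p=2, q=5, us=us, cache=cache)   # t_3 = 6T/5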

  8. Differences in care burden of patients undergoing dialysis in different centres in the Netherlands.

    PubMed

    de Kleijn, Ria; Uyl-de Groot, Carin; Hagen, Chris; Diepenbroek, Adry; Pasker-de Jong, Pieternel; Ter Wee, Piet

    2017-06-01

    A classification model was developed to simplify planning of personnel at dialysis centres. This model predicted the care burden based on dialysis characteristics. However, patient characteristics and different dialysis centre categories might also influence the amount of care time required. To determine if there is a difference in care burden between different categories of dialysis centres and if specific patient characteristics predict nursing time needed for patient treatment. An observational study. Two hundred and forty-two patients from 12 dialysis centres. In 12 dialysis centres, nurses filled out the classification list per patient and completed a form with patient characteristics. Nephrologists filled out the Charlson Comorbidity Index. Independent observers clocked the time nurses spent on separate steps of the dialysis for each patient. Dialysis centres were categorised into four types. Data were analysed using regression models. In contrast to other dialysis centres, academic centres needed 14 minutes more care time per patient per dialysis treatment than predicted in the classification model. No patient characteristics were found that influenced this difference. The only patient characteristic that predicted the time required was gender, with more time required to treat women. Gender did not affect the difference between measured and predicted care time. Differences in care burden were observed between academic and other centres, with more time required for treatment in academic centres. Contribution of patient characteristics to the time difference was minimal. The only patient characteristics that predicted care time were previous transplantation, which reduced the time required, and gender, with women requiring more care time. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  9. Lower Limits on Aperture Size for an ExoEarth Detecting Coronagraphic Mission

    NASA Technical Reports Server (NTRS)

    Stark, Christopher C.; Roberge, Aki; Mandell, Avi; Clampin, Mark; Domagal-Goldman, Shawn D.; McElwain, Michael W.; Stapelfeldt, Karl R.

    2015-01-01

    The yield of Earth-like planets will likely be a primary science metric for future space-based missions that will drive telescope aperture size. Maximizing the exoEarth candidate yield is therefore critical to minimizing the required aperture. Here we describe a method for exoEarth candidate yield maximization that simultaneously optimizes, for the first time, the targets chosen for observation, the number of visits to each target, the delay time between visits, and the exposure time of every observation. This code calculates both the detection time and multiwavelength spectral characterization time required for planets. We also refine the astrophysical assumptions used as inputs to these calculations, relying on published estimates of planetary occurrence rates as well as theoretical and observational constraints on terrestrial planet sizes and classical habitable zones. Given these astrophysical assumptions, optimistic telescope and instrument assumptions, and our new completeness code that produces the highest yields to date, we suggest lower limits on the aperture size required to detect and characterize a statistically motivated sample of exoEarths.

  10. SNR-based queue observations at CFHT

    NASA Astrophysics Data System (ADS)

    Devost, Daniel; Moutou, Claire; Manset, Nadine; Mahoney, Billy; Burdullis, Todd; Cuillandre, Jean-Charles; Racine, René

    2016-07-01

    In an effort to optimize the use of night time and the exquisite weather on Maunakea, CFHT has equipped its dome with vents and is now moving its Queued Scheduled Observing (QSO) based operations toward Signal-to-Noise Ratio (SNR) observing. In this new mode, individual exposure times for a science program are estimated using a model that takes measurements of the weather conditions as input, and the science program is considered complete when the depth required by the scientific requirements is reached. These changes allow CFHT to make better use of the excellent seeing conditions provided by Maunakea, completing programs in a shorter time than was allocated to them.
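
    A rough sketch of the kind of scaling behind such an SNR-based mode (the functional form and coefficients are illustrative assumptions, not CFHT's operational exposure-time model): for a background-limited source the SNR grows roughly as the square root of exposure time and degrades with seeing and extinction, so the time needed to reach a required depth scales with the square of the SNR ratio.

      def required_exposure(t_ref, snr_ref, snr_target, seeing_ref=0.7, seeing=0.7,
                            extinction_mag=0.0):
          # Background-limited point source: SNR grows as sqrt(t), and degrades roughly
          # linearly with seeing (flux spread over a larger aperture) and with
          # atmospheric extinction.  Coefficients are illustrative only.
          snr_per_ref_time = snr_ref * (seeing_ref / seeing) * 10.0 ** (-0.4 * extinction_mag)
          return t_ref * (snr_target / snr_per_ref_time) ** 2

      # e.g. a 60 s reference exposure that would give SNR = 40 in 0.7" seeing,
      # re-scaled for 1.0" seeing and 0.2 mag of extinction to reach SNR = 100:
      t_needed = required_exposure(60.0, 40.0, 100.0, seeing=1.0, extinction_mag=0.2)
      print(round(t_needed, 1))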

  11. 50 CFR 216.275 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., what type of surface vessel, i.e., FFG, DDG, or CG) (G) Length of time observers maintained visual... exercise) (I) Narrative description of sensors and platforms utilized for marine mammal detection and... calves were observed (E) Initial detection sensor (F) Length of time observers maintained visual contact...

  12. Critical Review of NOAA's Observation Requirements Process

    NASA Astrophysics Data System (ADS)

    LaJoie, M.; Yapur, M.; Vo, T.; Templeton, A.; Bludis, D.

    2017-12-01

    NOAA's Observing Systems Council (NOSC) maintains a comprehensive database of user observation requirements. The requirements collection process engages NOAA subject matter experts to document and effectively communicate the specific environmental observation measurements (parameters and attributes) needed to produce operational products and pursue research objectives. Documenting user observation requirements in a structured and standardized manner and framework enables NOAA to assess its needs across organizational lines in an impartial, objective, and transparent way. This structure provides the foundation for: selecting, designing, developing, and acquiring observing technologies, systems, and architectures; budget and contract formulation and decision-making; and assessing, in a repeatable fashion, the productivity, efficiency, and optimization of NOAA's observing system enterprise. User observation requirements are captured independently from observing technologies; therefore, they can be addressed by a variety of current or expected observing capabilities and retain the flexibility to be remapped to new and evolving technologies. NOAA's current inventory of user observation requirements was collected over a ten-year period, and there have been many changes in policies, mission priorities, and funding levels during this time. In light of these changes, the NOSC initiated a critical, in-depth review during 2017 to examine all aspects of user observation requirements and the associated processes. This presentation provides background on the NOAA requirements process, major milestones and outcomes of the critical review, and plans for evolving and connecting observing requirements processes in the next year.

  13. The Earth Phenomena Observing System: Intelligent Autonomy for Satellite Operations

    NASA Technical Reports Server (NTRS)

    Ricard, Michael; Abramson, Mark; Carter, David; Kolitz, Stephan

    2003-01-01

    Earth monitoring systems of the future may include large numbers of inexpensive small satellites, tasked in a coordinated fashion to observe both long-term and transient targets. For best performance, a tool that helps operators optimally assign targets to satellites will be required. We present the design of algorithms developed for real-time optimized autonomous planning of large numbers of small single-sensor Earth observation satellites. The algorithms will reduce requirements on the human operators of such a system of satellites, ensure good utilization of system resources, and provide the capability to respond dynamically to temporal terrestrial phenomena. Our initial real-time system model consists of approximately 100 satellites and a large number of points of interest on Earth (e.g., hurricanes, volcanoes, and forest fires), with the objective of maximizing the total science value of observations over time. Several options for calculating the science value of observations include the following: 1) total observation time, 2) the number of observations, and 3) the quality (a function of, e.g., sensor type, range, and slant angle) of the observations. An integrated approach using integer programming, optimization, and astrodynamics is used to calculate optimized observation and sensor tasking plans.
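
    A toy greedy allocation in Python, standing in for the integer-programming planner described above (the Target fields, value function, and window structure are illustrative assumptions): each observation window goes to the visible target with the best science value per unit observing time.

      from dataclasses import dataclass

      @dataclass
      class Target:
          name: str
          value: float        # science value of one observation
          obs_time: float     # time required for one observation (s)

      def greedy_schedule(targets, windows):
          # Assign each observation window to the visible target with the best
          # value-to-time ratio.  A toy heuristic, not the integer-programming
          # formulation used in the paper.
          plan = []
          for duration, visible in windows:
              candidates = [t for t in targets if t.name in visible and t.obs_time <= duration]
              if candidates:
                  best = max(candidates, key=lambda t: t.value / t.obs_time)
                  plan.append((best.name, duration))
          return plan

      targets = [Target("hurricane", 10.0, 120.0), Target("volcano", 6.0, 60.0)]
      windows = [(300.0, {"hurricane", "volcano"}), (90.0, {"volcano"})]
      print(greedy_schedule(targets, windows))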

  14. The NANOGrav Observing Program: Automation and Reproducibility

    NASA Astrophysics Data System (ADS)

    Brazier, Adam; Cordes, James; Demorest, Paul; Dolch, Timothy; Ferdman, Robert; Garver-Daniels, Nathaniel; Hawkins, Steven; Lam, Michael Timothy; Lazio, T. Joseph W.

    2018-01-01

    The NANOGrav Observing Program is a decades-long search for gravitational waves using pulsar timing which relies, for its sensitivity, on large data sets from observations of many pulsars. These are constructed through an intensive, long-term observing campaign. The nature of the program requires automation in the transfer and archiving of the large volume of raw telescope data, the calibration of those data, and making these resulting data products—required for diagnostic and data exploration purposes—available to NANOGrav members. Reproducibility of results is a key goal in this project, and essential to its success; it requires treating the software itself as a data product of the research, while ensuring easy access by, and collaboration between, members of NANOGrav, the International Pulsar Timing Array consortium (of which NANOGrav is a key member), as well as the wider astronomy community and the public.

  15. The solar exposure time required for vitamin D3 synthesis in the human body estimated by numerical simulation and observation in Japan.

    PubMed

    Miyauchi, Masaatsu; Hirai, Chizuko; Nakajima, Hideaki

    2013-01-01

    Although the importance of solar radiation for vitamin D3 synthesis in the human body is well known, the solar exposure time required to prevent vitamin D deficiency has not been determined in Japan. This study attempted to identify the solar exposure time required for vitamin D3 synthesis in the body by season, time of day, and geographic location (Sapporo, Tsukuba, and Naha) using both numerical simulations and observations. According to the numerical simulation for Tsukuba at noon in July under a cloudless sky, 3.5 min of solar exposure are required to produce 5.5 μg of vitamin D3 per 600 cm2 of skin, corresponding to the area of a face and the backs of a pair of hands, without intake from foods. In contrast, it took 76.4 min to produce the same quantity of vitamin D3 at Sapporo in December, at noon under a cloudless sky. The necessary exposure time varied considerably with the time of day. For Tsukuba at noon in December, 22.4 min were required, but 106.0 min were required at 09:00 and 271.3 min at 15:00 under the same meteorological conditions. Naha receives high levels of ultraviolet radiation, allowing vitamin D3 synthesis almost throughout the year.

  16. Observing Planetary Rings and Small Satellites with the James Webb Space Telescope: Science Justification and Observation Requirements

    NASA Technical Reports Server (NTRS)

    Tiscareno, Matthew S.; Showalter, Mark R.; French, Richard G.; Burns, Joseph A.; Cuzzi, Jeffrey N.; de Pater, Imke; Hamilton, Douglas P.; Hedman, Matthew M.; Nicholson, Philip D.; Tamayo, Daniel

    2016-01-01

    The James Webb Space Telescope (JWST) will provide unprecedented opportunities to observe the rings and small satellites in our Solar System, accomplishing three primary objectives: (1) discovering new rings and moons, (2) unprecedented spectroscopy, and (3) time-domain observations. We give details on these science objectives and describe requirements that JWST must fulfill in order to accomplish the science objectives.

  17. Time Requirements for the Different Item Types Proposed for Use in the Revised SAT®. Research Report No. 2007-3. ETS RR-07-35

    ERIC Educational Resources Information Center

    Bridgeman, Brent; Laitusis, Cara Cahalan; Cline, Frederick

    2007-01-01

    The current study used three data sources to estimate time requirements for different item types on the now current SAT Reasoning Test™. First, we estimated times from a computer-adaptive version of the SAT® (SAT CAT) that automatically recorded item times. Second, we observed students as they answered SAT questions under strict time limits and…

  18. Complete ISOPHOT (C200) Maps of a Nearby Prototypical GMC: W3 (Spring) or NGC7538 (Fall)

    NASA Technical Reports Server (NTRS)

    Sanders, David B.

    2001-01-01

    We were originally awarded Priority 3 time (approximately 60,000 sec) with the Infrared Space Observatory (ISO) to obtain a complete ISOPHOT (PHT32-C200) map of a nearby prototypical giant molecular cloud (GMC). Following the fall launch and revised estimates of the sensitivity of the ISOPHOT detectors, our program was modified to fit within the time constraints while still carrying out the main science requirements. The revised program requested long strip maps of our fall target (NGC7538) using sequences of PHT37/38/39 observations, with LWS observations of the brightest regions. The large number of AOTs needed to cover each GMC required that our observations be spread over four separate proposals (PROP-01, PROP-02, PROP-03, PROP-04), which together comprise a single observing program. Our program was executed in early 1997; nearly 50,000 sec of data were obtained, including all of our requested ISOPHOT C200 observations. None of the LWS data were taken.

  19. 50 CFR 660.116 - Trawl fishery-observer requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-sea processing vessels. The observer sampling station must include a table at least 0.6 m deep, 1.2 m... at sea must carry one NMFS-certified observer, from the time the vessel leaves port on a trip in which the catch is sorted at sea to the time that all catch from that trip has been offloaded. (2...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Znojil, Miloslav

    For many quantum models an apparent non-Hermiticity of observables just corresponds to their hidden Hermiticity in another, physical Hilbert space. For these models we show that the existence of observables which are manifestly time-dependent may require the use of a manifestly time-dependent representation of the physical Hilbert space of states.

  1. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse, in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.
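
    A minimal sketch of the standard nudging that the time-delay method generalizes (a Lorenz-63 stand-in with only the first component observed; the delay-coordinate extension of Rey et al. is not reproduced here): the model is relaxed toward the observations through a coupling term on the observed variable.

      import numpy as np

      def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = s
          return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

      def nudge(x_obs, dt, k=20.0):
          # Relax a model trajectory toward observations of the first component only:
          # ds/dt = f(s) + k * (x_obs - x_model), applied to dx/dt.
          s = np.array([1.0, 1.0, 1.0])               # deliberately wrong initial state
          out = np.empty((len(x_obs), 3))
          for n, xo in enumerate(x_obs):
              drift = lorenz63(s)
              drift[0] += k * (xo - s[0])
              s = s + dt * drift                      # forward Euler for brevity
              out[n] = s
          return out

      # synthetic "truth" run providing observations of x only
      dt, n_steps = 0.005, 4000
      truth = np.empty((n_steps, 3))
      truth[0] = [-5.0, -5.0, 20.0]
      for i in range(1, n_steps):
          truth[i] = truth[i - 1] + dt * lorenz63(truth[i - 1])
      estimate = nudge(truth[:, 0], dt)
      print(np.abs(estimate[-1] - truth[-1]))         # error should be small once synchronized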

  2. 28 CFR 552.12 - Close observation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... urine sample is required prior to releasing the inmate from close observation. (2) The light will be kept on at all times. (3) No inmate under close observation status may be allowed to come into contact...

  3. 28 CFR 552.12 - Close observation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... urine sample is required prior to releasing the inmate from close observation. (2) The light will be kept on at all times. (3) No inmate under close observation status may be allowed to come into contact...

  4. A Study on the Effectiveness of Lockup-Free Caches for a Reduced Instruction Set Computer (RISC) Processor

    DTIC Science & Technology

    1992-09-01

    to acquire or develop effective simulation tools to observe the behavior of a RISC implementation as it executes different types of programs . We choose...Performance Computer performance is measured by the amount of the time required to execute a program . Performance encompasses two types of time, elapsed time...and CPU time. Elapsed time is the time required to execute a program from start to finish. It includes latency of input/output activities such as

  5. QUIKVIS- CELESTIAL TARGET AVAILABILITY INFORMATION

    NASA Technical Reports Server (NTRS)

    Petruzzo, C.

    1994-01-01

    QUIKVIS computes the times during an Earth orbit when geometric requirements are satisfied for observing celestial objects. The observed objects may be fixed (stars, etc.) or moving (sun, moon, planets). QUIKVIS is useful for preflight analysis by those needing information on the availability of celestial objects to be observed. Two types of analyses are performed by QUIKVIS. One is used when specific objects are known, the other when targets are unknown and potentially useful regions of the sky must be identified. The results are useful in selecting candidate targets, examining the effects of observation requirements, and doing gross assessments of the effects of the orbit's right ascension of the ascending node (RAAN). The results are not appropriate when high accuracy is needed (e.g. for scheduling actual mission operations). The observation duration is calculated as a function of date, orbit node, and geometric requirements. The orbit right ascension of the ascending node can be varied to account for the effects of an uncertain launch time of day. The orbit semimajor axis and inclination are constant throughout the run. A circular orbit is assumed, but a simple program modification will allow eccentric orbits. The geometric requirements that can be processed are: 1) minimum separation angle between the line of sight to the object and the earth's horizon; 2) minimum separation angle between the line of sight to the object and the spacecraft velocity vector; 3) maximum separation angle between the line of sight to the object and the zenith direction; and 4) presence of the spacecraft in the earth's shadow. The user must supply a date or date range, the spacecraft orbit and inclination, up to 700 observation targets, and any geometric requirements to be met. The primary output is the time per orbit that conditions are satisfied, with options for sky survey maps, time since a user-specified orbit event, and bar graphs illustrating overlapping requirements. The output is printed in visually convenient lineprinter form but is also available on data files for use by postprocessors such as external XY plotters. QUIKVIS is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX 11/780 operating under VMS with a central memory requirement of approximately 500K of 8 bit bytes. QUIKVIS was developed in 1986 and revised in 1987.
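
    The geometric requirements listed above reduce to angle tests between unit vectors; below is a sketch of those tests in Python (not the QUIKVIS FORTRAN itself; the angle limits are illustrative defaults and the horizon test uses a simplified same-vertical-plane approximation).

      import numpy as np

      def angle_deg(u, v):
          u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
          return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

      def target_visible(r_sc, v_sc, los, min_horizon=10.0, min_velocity=20.0,
                         max_zenith=90.0, earth_radius=6378.137):
          # r_sc: spacecraft position (km, inertial), v_sc: velocity vector,
          # los: line of sight to the target.  Angle limits are illustrative, not QUIKVIS inputs.
          zenith = r_sc / np.linalg.norm(r_sc)
          # dip of the Earth limb below the local horizontal plane
          limb_dip = np.degrees(np.arccos(earth_radius / np.linalg.norm(r_sc)))
          elevation = 90.0 - angle_deg(zenith, los)       # LOS elevation above local horizontal
          above_horizon = (elevation + limb_dip) >= min_horizon
          away_from_velocity = angle_deg(v_sc, los) >= min_velocity
          near_enough_zenith = angle_deg(zenith, los) <= max_zenith
          return above_horizon and away_from_velocity and near_enough_zenith

      # 700 km circular orbit; target in the local horizontal plane, 30 deg off the velocity vector
      r = np.array([7078.137, 0.0, 0.0])
      v = np.array([0.0, 7.5, 0.0])
      los = np.array([0.0, np.cos(np.radians(30.0)), np.sin(np.radians(30.0))])
      print(target_visible(r, v, los))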

  6. Climate Observing Systems: Where are we and where do we need to be in the future

    NASA Astrophysics Data System (ADS)

    Baker, B.; Diamond, H. J.

    2017-12-01

    Climate research and monitoring requires an observational strategy that blends long-term, carefully calibrated measurements with short-term, focused process studies. The operation and implementation of operational climate observing networks and the provision of related climate services both have a significant role to play in assisting the development of national climate adaptation policies and in facilitating national economic development. Climate observing systems will require a strong research element for a long time to come. This requires improved observations of the state variables and the ability to set them in a coherent physical (as well as chemical and biological) framework with models. Climate research and monitoring requires an integrated strategy of land/ocean/atmosphere observations, including both in situ and remote sensing platforms, together with modeling and analysis. It is clear that we still need more research and analysis on climate processes, sampling strategies, and processing algorithms.

  7. 50 CFR 216.175 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...., FFG, DDG, or CG). (G) Length of time observers maintained visual contact with marine mammal. (H) Wave... height in feet (high, low and average during exercise). (I) Narrative description of sensors and... sensor. (F) Length of time observers maintained visual contact with marine mammal. (G) Wave height. (H...

  8. Applying Squeaky-Wheel Optimization to Schedule Airborne Astronomy Observations

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Kuerklue, Elif

    2004-01-01

    We apply the Squeaky Wheel Optimization (SWO) algorithm to the problem of scheduling astronomy observations for the Stratospheric Observatory for Infrared Astronomy, an airborne observatory. The problem contains complex constraints relating the feasibility of an astronomical observation to the position and time at which the observation begins, telescope elevation limits, special use airspace, and available fuel. Solving the problem requires making discrete choices (e.g. selection and sequencing of observations) and continuous ones (e.g. takeoff time and setting up observations by repositioning the aircraft). The problem also includes optimization criteria such as maximizing observing time while simultaneously minimizing total flight time. Previous approaches to the problem fail to scale when accounting for all constraints. We describe how to customize SWO to solve this problem, and show that it finds better flight plans, often with less computation time, than previous approaches.
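
    The construct/analyze/prioritize loop at the heart of Squeaky Wheel Optimization, as a generic sketch (the flight-planning constraints, blame function, and greedy scheduler internals are left as user-supplied assumptions rather than the authors' customized SWO):

      def squeaky_wheel(tasks, build_schedule, blame, iterations=50):
          # Generic SWO loop: greedily construct a solution from a priority order,
          # assign blame to poorly served tasks, and promote them in the next order.
          # build_schedule(order) -> (schedule, score); blame(schedule, task) -> penalty.
          order = list(tasks)
          best_schedule, best_score = build_schedule(order)
          for _ in range(iterations):
              schedule, score = build_schedule(order)
              if score > best_score:
                  best_schedule, best_score = schedule, score
              # simple prioritizer: the most-blamed tasks move to the front next time
              order.sort(key=lambda t: blame(schedule, t), reverse=True)
          return best_schedule, best_score

      # toy demo: pack tasks into a fixed-capacity plan, blaming anything left out
      def build(order, capacity=3):
          chosen = order[:capacity]
          return chosen, sum(len(t) for t in chosen)

      print(squeaky_wheel(["a", "bb", "ccc", "dddd"], build,
                          lambda sched, t: 0 if t in sched else len(t)))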

  9. 32 CFR 700.705 - Observance of international law.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Observance of international law. 700.705 Section... Other Commanders Titles and Duties of Commanders § 700.705 Observance of international law. At all times, commanders shall observe, and require their commands to observe, the principles of international law. Where...

  10. 32 CFR 700.705 - Observance of international law.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Observance of international law. 700.705 Section... Other Commanders Titles and Duties of Commanders § 700.705 Observance of international law. At all times, commanders shall observe, and require their commands to observe, the principles of international law. Where...

  11. Observing strategies for future solar facilities: the ATST test case

    NASA Astrophysics Data System (ADS)

    Uitenbroek, H.; Tritschler, A.

    2012-12-01

    Traditionally, solar observations have been scheduled and performed very differently from night-time efforts, in particular because we have been observing the Sun for a long time, requiring new combinations of observables to make progress, and because solar physics observations are often event-driven on time scales of hours to days. With the proposal pressure that is expected for new large-aperture facilities, we can no longer afford the time spent on custom setups, and will have to rethink our scheduling and operations. We will discuss our efforts at Sac Peak in preparing for this new era, and outline the planned scheduling and operations planning for the ATST in particular.

  12. Further comments on sensitivities, parameter estimation, and sampling design in one-dimensional analysis of solute transport in porous media

    USGS Publications Warehouse

    Knopman, Debra S.; Voss, Clifford I.

    1988-01-01

    Sensitivities of solute concentration to parameters associated with first-order chemical decay, boundary conditions, initial conditions, and multilayer transport are examined in one-dimensional analytical models of transient solute transport in porous media. A sensitivity is a change in solute concentration resulting from a change in a model parameter. Sensitivity analysis is important because the minimum information required for the estimation of model parameters by regression on chemical data is expressed in terms of sensitivities. Nonlinear regression models of solute transport were tested on sets of noiseless observations from known models that exceeded the minimum sensitivity information requirements. Results demonstrate that the regression models consistently converged to the correct parameters even when the initial sets of parameter values substantially deviated from the correct parameters. On the basis of the sensitivity analysis, several statements may be made about the design of sampling for parameter estimation for the models examined: (1) estimation of parameters associated with solute transport in the individual layers of a multilayer system is possible even when solute concentrations in the individual layers are mixed in an observation well; (2) when estimating parameters in a decaying upstream boundary condition, observations are best made late in the passage of the front, near a time chosen by adding the inverse of a hypothesized value of the source decay parameter to the estimated mean travel time at a given downstream location; (3) estimation of a first-order chemical decay parameter requires observations to be made late in the passage of the front, preferably near a location corresponding to a travel time of √2 times the half-life of the solute; and (4) estimation of a parameter relating to spatial variability in an initial condition requires observations to be made early in time relative to the passage of the solute front.
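
    A sensitivity in the sense used above is simply the derivative of concentration with respect to a model parameter; here is a generic illustration using central differences on the classical Ogata-Banks solution for one-dimensional advection-dispersion (not the authors' multilayer or decaying-boundary models; parameter values are arbitrary).

      import numpy as np
      from scipy.special import erfc

      def concentration(x, t, v, d, c0=1.0):
          # Ogata-Banks solution for 1-D advection-dispersion with a constant
          # upstream concentration c0 and no decay.
          a = (x - v * t) / (2.0 * np.sqrt(d * t))
          b = (x + v * t) / (2.0 * np.sqrt(d * t))
          return 0.5 * c0 * (erfc(a) + np.exp(v * x / d) * erfc(b))

      def sensitivity(x, t, v, d, param, rel_step=1e-4):
          # Central-difference sensitivity dC/d(param), param in {"v", "d"}.
          p = {"v": v, "d": d}
          h = rel_step * p[param]
          hi, lo = dict(p), dict(p)
          hi[param] += h
          lo[param] -= h
          return (concentration(x, t, **hi) - concentration(x, t, **lo)) / (2.0 * h)

      # sensitivity of concentration at x = 10 m, t = 5 d to velocity (v = 2 m/d, D = 1 m^2/d)
      print(sensitivity(10.0, 5.0, 2.0, 1.0, "v"))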

  13. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE PAGES

    An, Zhe; Rey, Daniel; Ye, Jingxin; ...

    2017-01-16

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse, in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  14. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Zhe; Rey, Daniel; Ye, Jingxin

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse, in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  15. Guidance, Navigation, and Control Performance for the GOES-R Spacecraft

    NASA Technical Reports Server (NTRS)

    Chapel, Jim D.; Stancliffe, Devin; Bevacqua, Tim; Winkler, Stephen; Clapp, Brian; Rood, Tim; Gaylor, David; Freesland, Douglas C.; Krimchansky, Alexander

    2014-01-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) is the first of the next generation geostationary weather satellites, scheduled for delivery in late 2015 and launch in early 2016. Relative to the current generation of GOES satellites, GOES-R represents a dramatic increase in Earth and solar weather observation capabilities, with 4 times the resolution, 5 times the observation rate, and 3 times the number of spectral bands for Earth observations. GOES-R will also provide unprecedented availability, with less than 120 minutes per year of lost observation time. The Guidance Navigation & Control (GN&C) design requirements to achieve these expanded capabilities are extremely demanding. This paper first presents the pointing control, pointing stability, attitude knowledge, and orbit knowledge requirements necessary to realize the ambitious Image Navigation and Registration (INR) objectives of GOES-R. Because the GOES-R suite of instruments is sensitive to disturbances over a broad spectral range, a high fidelity simulation of the vehicle has been created with modal content over 500 Hz to assess the pointing stability requirements. Simulation results are presented showing acceleration, shock response spectrum (SRS), and line of sight responses for various disturbances from 0 Hz to 512 Hz. These disturbances include gimbal motion, reaction wheel disturbances, thruster firings for station keeping and momentum management, and internal instrument disturbances. Simulation results demonstrate excellent performance relative to the pointing and pointing stability requirements, with line of sight jitter of the isolated instrument platform of approximately 1 micro-rad. Low frequency motion of the isolated instrument platform is internally compensated within the primary instrument. Attitude knowledge and rate are provided directly to the instrument with an accuracy defined by the Integrated Rate Error (IRE) requirements. The allowable IRE ranges from 1 to 18.5 micro-rad, depending upon the time window of interest. The final piece of the INR performance is orbit knowledge. Extremely accurate orbital position is achieved by GPS navigation at Geosynchronous Earth Orbit (GEO). Performance results are shown demonstrating compliance with the 50 to 75 m orbit position accuracy requirements of GOES-R, including during station-keeping and momentum management maneuvers. As shown in this paper, the GN&C performance for the GOES-R series of spacecraft supports the challenging mission objectives of the next generation GEO Earth-observation satellites.

  16. astroplan: An Open Source Observation Planning Package in Python

    NASA Astrophysics Data System (ADS)

    Morris, Brett M.; Tollerud, Erik; Sipőcz, Brigitta; Deil, Christoph; Douglas, Stephanie T.; Berlanga Medina, Jazmin; Vyhmeister, Karl; Smith, Toby R.; Littlefair, Stuart; Price-Whelan, Adrian M.; Gee, Wilfred T.; Jeschke, Eric

    2018-03-01

    We present astroplan—an open source, open development, Astropy affiliated package for ground-based observation planning and scheduling in Python. astroplan is designed to provide efficient access to common observational quantities such as celestial rise, set, and meridian transit times and simple transformations from sky coordinates to altitude-azimuth coordinates without requiring a detailed understanding of astropy’s implementation of coordinate systems. astroplan provides convenience functions to generate common observational plots such as airmass and parallactic angle as a function of time, along with basic sky (finder) charts. Users can determine whether or not a target is observable given a variety of observing constraints, such as airmass limits, time ranges, Moon illumination/separation ranges, and more. A selection of observation schedulers are included that divide observing time among a list of targets, given observing constraints on those targets. Contributions to the source code from the community are welcome.
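
    A short usage example along the lines described above, using class and function names from astroplan's public documentation (the site, targets, constraint values, and dates are arbitrary choices; resolving target names requires network access):

      import astropy.units as u
      from astropy.time import Time
      from astroplan import (Observer, FixedTarget, AltitudeConstraint,
                             AirmassConstraint, is_observable)

      observer = Observer.at_site("subaru")                       # a Maunakea site
      targets = [FixedTarget.from_name("Vega"), FixedTarget.from_name("M31")]
      constraints = [AltitudeConstraint(20 * u.deg, 85 * u.deg),
                     AirmassConstraint(max=2.0)]
      time_range = Time(["2018-03-01 06:00", "2018-03-01 16:00"])

      # one boolean per target: observable at some time in the range under all constraints
      print(is_observable(constraints, observer, targets, time_range=time_range))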

  17. The dynamics of learning about a climate threshold

    NASA Astrophysics Data System (ADS)

    Keller, Klaus; McInerney, David

    2008-02-01

    Anthropogenic greenhouse gas emissions may trigger threshold responses of the climate system. One relevant example of such a potential threshold response is a shutdown of the North Atlantic meridional overturning circulation (MOC). Numerous studies have analyzed the problem of early MOC change detection (i.e., detection before the forcing has committed the system to a threshold response). Here we analyze the early MOC prediction problem. To this end, we virtually deploy an MOC observation system into a simple model that mimics potential future MOC responses and analyze the timing of confident detection and prediction. Our analysis suggests that a confident prediction of a potential threshold response can require century time scales, considerably longer than the time required for confident detection. The signal enabling early prediction of an approaching MOC threshold in our model study is associated with the rate at which the MOC intensity decreases for a given forcing. A faster MOC weakening implies a higher MOC sensitivity to forcing. An MOC sensitivity exceeding a critical level results in a threshold response. Determining whether an observed MOC trend in our model differs in a statistically significant way from an unforced scenario (the detection problem) imposes lower requirements on an observation system than determining whether the MOC will shut down in the future (the prediction problem). As a result, the virtual observation systems designed in our model for early detection of MOC changes might well fail at the task of early and confident prediction. Transferring this conclusion to the real world requires a considerably refined MOC model, as well as a more complete consideration of relevant observational constraints.

  18. Fast Molecular Cloud Destruction Requires Fast Cloud Formation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mac Low, Mordecai-Mark; Burkert, Andreas; Ibáñez-Mejía, Juan C.

    A large fraction of the gas in the Galaxy is cold, dense, and molecular. If all this gas collapsed under the influence of gravity and formed stars in a local free-fall time, the star formation rate in the Galaxy would exceed that observed by more than an order of magnitude. Other star-forming galaxies behave similarly. Yet, observations and simulations both suggest that the molecular gas is indeed gravitationally collapsing, albeit hierarchically. Prompt stellar feedback offers a potential solution to the low observed star formation rate if it quickly disrupts star-forming clouds during gravitational collapse. However, this requires that molecular clouds must be short-lived objects, raising the question of how so much gas can be observed in the molecular phase. This can occur only if molecular clouds form as quickly as they are destroyed, maintaining a global equilibrium fraction of dense gas. We therefore examine cloud formation timescales. We first demonstrate that supernova and superbubble sweeping cannot produce dense gas at the rate required to match the cloud destruction rate. On the other hand, Toomre gravitational instability can reach the required production rate. We thus argue that, although dense, star-forming gas may last only around a single global free-fall time, the dense gas in star-forming galaxies can globally exist in a state of dynamic equilibrium between formation by gravitational instability and disruption by stellar feedback. At redshift z ≳ 2, the Toomre instability timescale decreases, resulting in a prediction of higher molecular gas fractions at early times, in agreement with the observations.

  19. Guidance, Navigation, and Control Performance for the GOES-R Spacecraft

    NASA Technical Reports Server (NTRS)

    Chapel, Jim; Stancliffe, Devin; Bevacqua, Tim; Winkler, Stephen; Clapp, Brian; Rood, Tim; Gaylor, David; Freesland, Doug; Krimchansky, Alexander

    2014-01-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) is the first of the next generation geostationary weather satellites. The series represents a dramatic increase in Earth observation capabilities, with 4 times the resolution, 5 times the observation rate, and 3 times the number of spectral bands. GOES-R also provides unprecedented availability, with less than 120 minutes per year of lost observation time. This paper presents the Guidance Navigation & Control (GN&C) requirements necessary to realize the ambitious pointing, knowledge, and Image Navigation and Registration (INR) objectives of GOES-R. Because the suite of instruments is sensitive to disturbances over a broad spectral range, a high fidelity simulation of the vehicle has been created with modal content over 500 Hz to assess the pointing stability requirements. Simulation results are presented showing acceleration, shock response spectra (SRS), and line of sight (LOS) responses for various disturbances from 0 Hz to 512 Hz. Simulation results demonstrate excellent performance relative to the pointing and pointing stability requirements, with LOS jitter for the isolated instrument platform of approximately 1 micro-rad. Attitude and attitude rate knowledge are provided directly to the instrument with an accuracy defined by the Integrated Rate Error (IRE) requirements. The data are used internally for motion compensation. The final piece of the INR performance is orbit knowledge, which GOES-R achieves with GPS navigation. Performance results are shown demonstrating compliance with the 50 to 75 m orbit position accuracy requirements. As presented in this paper, the GN&C performance supports the challenging mission objectives of GOES-R.

  20. Real-time stylistic prediction for whole-body human motions.

    PubMed

    Matsubara, Takamitsu; Hyon, Sang-Ho; Morimoto, Jun

    2012-01-01

    The ability to predict human motion is crucial in several contexts such as human tracking by computer vision and the synthesis of human-like computer graphics. Previous work has focused on off-line processes with well-segmented data; however, many applications such as robotics require real-time control with efficient computation. In this paper, we propose a novel approach called real-time stylistic prediction for whole-body human motions to satisfy these requirements. This approach uses a novel generative model to represent a whole-body human motion including rhythmic motion (e.g., walking) and discrete motion (e.g., jumping). The generative model is composed of a low-dimensional state (phase) dynamics and a two-factor observation model, allowing it to capture the diversity of motion styles in humans. A real-time adaptation algorithm was derived to estimate both state variables and style parameter of the model from non-stationary unlabeled sequential observations. Moreover, with a simple modification, the algorithm allows real-time adaptation even from incomplete (partial) observations. Based on the estimated state and style, a future motion sequence can be accurately predicted. In our implementation, it takes less than 15 ms for both adaptation and prediction at each observation. Our real-time stylistic prediction was evaluated for human walking, running, and jumping behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. The solar UV exposure time required for vitamin D3 synthesis in the human body estimated by numerical simulation and observation in Japan

    NASA Astrophysics Data System (ADS)

    Nakajima, Hideaki; Miyauchi, Masaatsu; Hirai, Chizuko

    2013-04-01

    After the discovery of the Antarctic ozone hole, the negative effect of exposing the human body to harmful solar ultraviolet (UV) radiation is widely known. However, exposure to UV radiation also has a positive effect, namely vitamin D synthesis. Although the importance of solar UV radiation for vitamin D3 synthesis in the human body is well known, the solar exposure time required to prevent vitamin D deficiency has not been well determined. This study attempted to identify the solar exposure time required for vitamin D3 synthesis in the body by season, time of day, and geographic location (Sapporo, Tsukuba, and Naha, in Japan) using both numerical simulations and observations. According to the numerical simulation for Tsukuba at noon in July under a cloudless sky, 2.3 min of solar exposure are required to produce 5.5 μg of vitamin D3 per 600 cm2 of skin. This quantity of vitamin D corresponds to the intake recommended for an adult by the Ministry of Health, Labour and Welfare in the 2010 Japanese Dietary Reference Intakes (DRIs). In contrast, it took 49.5 min to produce the same amount of vitamin D3 at Sapporo, in the northern part of Japan, in December at noon under a cloudless sky. The necessary exposure time varied considerably with the time of day. For Tsukuba at noon in December, 14.5 min were required, but 68.7 min were required at 09:00 and 175.8 min at 15:00 under the same meteorological conditions. Naha receives high levels of UV radiation, allowing vitamin D3 synthesis almost throughout the year. Building on these results, we are developing an index to quantify the UV exposure time necessary to produce the required amount of vitamin D3 from UV radiation data.

  2. Factors influencing nursing time spent on administration of medication in an Australian residential aged care home.

    PubMed

    Qian, Siyu; Yu, Ping; Hailey, David M; Wang, Ning

    2016-04-01

    To examine nursing time spent on administration of medications in a residential aged care (RAC) home, and to determine factors that influence the time to medicate a resident. Information on nursing time spent on medication administration is useful for planning and implementation of nursing resources. Nurses were observed over 12 morning medication rounds using a time-motion observational method and field notes, at two high-care units in an Australian RAC home. Nurses spent between 2.5 and 4.5 hours in a medication round. Administration of medication averaged 200 seconds per resident. Four factors had significant impact on medication time: number of types of medication, number of tablets taken by a resident, methods used by a nurse to prepare tablets and methods to provide tablets. Administration of medication consumed a substantial, though variable amount of time in the RAC home. Nursing managers need to consider the factors that influenced the nursing time required for the administration of medication in their estimation of nursing workload and required resources. To ensure safe medication administration for older people, managers should regularly assess the changes in the factors influencing nursing time on the administration of medication when estimating nursing workload and required resources. © 2015 John Wiley & Sons Ltd.

  3. TIME after TIMED - A perspective on Thermosphere-Ionosphere Mesosphere science and future observational needs after the TIMED mission epoch

    NASA Astrophysics Data System (ADS)

    Mlynczak, M. G.; Russell, J. M., III; Hunt, L. A.; Christensen, A. B.; Paxton, L. J.; Woods, T. N.; Niciejewski, R.; Yee, J. H.

    2016-12-01

    The past 40 years have been a true golden age for space-based observations of the Earth's middle atmosphere (stratosphere to thermosphere). Numerous instruments and missions have been developed and flown to explore the thermal structure, chemical composition, and energy budget of the middle atmosphere. A primary motivation for these observations was the need to understand the photochemistry of stratospheric ozone and its potential depletion by anthropogenic means. As technology evolved, observations were extended higher and higher, into regions previously unobserved from space by optical remote sensing techniques. In the 1990s, NASA initiated the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) mission to explore one of the last frontiers of the atmosphere - the region between 60 and 180 km - then referred to as "the ignorosphere." Today, we have 15 years of detailed observations from this remarkable satellite and its 4 instruments, and are recognizing rapid climate change that is occurring above 60 km. The upcoming ICON and GOLD missions will afford new opportunities for scientific discovery by combining data from all three missions. However, it has become clear that continued observations beyond TIMED are required to understand the upper atmosphere as a system that is fully coupled from the edge of space to the surface of the Earth. In this talk we will review the current status of knowledge of the basic state properties of the thermosphere-ionosphere-mesosphere (TIME) system and will discuss future observations that are required to obtain a comprehensive understanding of the entire TIME system, especially the effects of long-term change that are already underway.

  4. Observational Requirements for Underway Observations from Research Vessels

    NASA Astrophysics Data System (ADS)

    Smith, S. R.; Van Waes, M.

    2016-02-01

    Identifying observational requirements to build and sustain a global ocean observing system requires input from the user community. Research vessels are an essential and versatile component of the observing system. The authors will present results from a survey of the marine climate and oceanographic community that solicited observational requirements for research vessels. The goal of the survey is to determine priorities for underway instrumentation to be run on NOAA vessels operated by the Office of Marine and Aviation Operations (OMAO) to support secondary users of the NOAA fleet. Secondary users are defined as persons who do not routinely participate in cruises on NOAA vessels but have a research or operational need for underway observations from these vessels. Secondary applications of underway data from NOAA vessels include, but are not limited to, evaluation of analyses/forecasts from ocean and atmospheric models, development of satellite retrieval algorithms, and validation of observations from remote sensing systems (e.g., satellites, aircraft). For this survey, underway observations are defined as digital data generated by environmental sensor systems permanently installed on the vessel and routinely maintained by the operator. The survey also assessed the need for access to these observations in real time versus delayed mode. The authors will discuss how these survey results can be used to inform NOAA management of the requirements for underway observations during future NOAA vessel deployments. Although the survey was originally designed to assess requirements for NOAA vessels, the international response makes the results applicable to research vessel operations around the world.

  5. Using large spectroscopic surveys to test the double degenerate model for Type Ia supernovae

    NASA Astrophysics Data System (ADS)

    Breedt, E.; Steeghs, D.; Marsh, T. R.; Gentile Fusillo, N. P.; Tremblay, P.-E.; Green, M.; De Pasquale, S.; Hermes, J. J.; Gänsicke, B. T.; Parsons, S. G.; Bours, M. C. P.; Longa-Peña, P.; Rebassa-Mansergas, A.

    2017-07-01

    An observational constraint on the contribution of double degenerates to Type Ia supernovae requires multiple radial velocity measurements of ideally thousands of white dwarfs. This is because only a small fraction of the double degenerate population is massive enough, with orbital periods short enough, to be considered viable Type Ia progenitors. We show how the radial velocity information available from public surveys such as the Sloan Digital Sky Survey can be used to pre-select targets for variability, leading to a 10-fold reduction in observing time required compared to an unranked or random survey. We carry out Monte Carlo simulations to quantify the detection probability of various types of binaries in the survey and show that this method, even in the most pessimistic case, doubles the survey size of the largest survey to date (the SPY Survey) in less than 15 per cent of the required observing time. Our initial follow-up observations corroborate the method, yielding 15 binaries so far (eight known and seven new), as well as orbital periods for four of the new binaries.
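
    The pre-selection described above amounts to ranking objects by how inconsistent their repeated (sub-exposure) radial velocities are with a constant velocity; here is a sketch of such a ranking statistic (the example values, thresholds, and any SDSS-specific bookkeeping are assumptions, not the authors' pipeline).

      import numpy as np
      from scipy.stats import chi2

      def variability_logp(rv, rv_err):
          # Log of the chi-square probability that repeated radial velocities of one
          # object are consistent with a constant velocity; more negative = more variable.
          rv, rv_err = np.asarray(rv, float), np.asarray(rv_err, float)
          w = 1.0 / rv_err ** 2
          mean = np.sum(w * rv) / np.sum(w)           # inverse-variance weighted mean
          chisq = np.sum(((rv - mean) / rv_err) ** 2)
          return chi2.logsf(chisq, df=rv.size - 1)

      # rank candidates, most significantly variable first (velocities in km/s)
      candidates = {"WD 1": ([10.0, 95.0, -40.0], [15.0, 15.0, 15.0]),
                    "WD 2": ([12.0, 18.0, 9.0], [15.0, 15.0, 15.0])}
      ranked = sorted(candidates, key=lambda name: variability_logp(*candidates[name]))
      print(ranked)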

  6. LANCE in ECHO - Merging Science and Near Real-Time Data Search and Order

    NASA Astrophysics Data System (ADS)

    Kreisler, S.; Murphy, K. J.; Vollmer, B.; Lighty, L.; Mitchell, A. E.; Devine, N.

    2012-12-01

    NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) Land Atmosphere Near real-time Capability for EOS (LANCE) project provides expedited data products from the Terra, Aqua, and Aura satellites within three hours of observation. In order to satisfy latency requirements, LANCE data are produced with relaxed ancillary data requirements, resulting in products that may have minor differences from their science-quality counterparts. LANCE products are used by a number of different groups to support research and applications that require near real-time Earth observations, such as disaster relief, hazard and air quality monitoring, and weather forecasting. LANCE elements process raw rate-buffered and/or session-based production datasets into higher-level products, which are freely available to registered users via LANCE FTP sites. The LANCE project also generates near real-time full-resolution browse imagery from these products, which can be accessed through the Global Imagery Browse Services (GIBS). In an effort to support applications and services that require timely access to these near real-time products, the project is currently implementing the publication of LANCE product metadata to the EOS ClearingHouse (ECHO), a centralized EOSDIS registry of EOS data. Metadata within ECHO is made available through an Application Program Interface (API), and applications can utilize the API to allow users to efficiently search and order LANCE data. Publishing near real-time data to ECHO will permit applications to access near real-time product metadata prior to the release of the science-quality counterpart and to associate imagery from GIBS with its underlying data product.

  7. 50 CFR 218.125 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... observers maintained visual contact with marine mammal(s); (H) Wave height (ft); (I) Visibility; (J) Sonar..., low, and average during exercise); and (I) Narrative description of sensors and platforms utilized for...) Calves observed (y/n); (E) Initial detection sensor; (F) Length of time observers maintained visual...

  8. Scheduling and calibration strategy for continuous radio monitoring of 1700 sources every three days

    NASA Astrophysics Data System (ADS)

    Max-Moerbeck, Walter

    2014-08-01

    The Owens Valley Radio Observatory 40 meter telescope is currently monitoring a sample of about 1700 blazars every three days at 15 GHz, with the main scientific goal of determining the relation between the variability of blazars at radio and gamma-rays as observed with the Fermi Gamma-ray Space Telescope. The time domain relation between radio and gamma-ray emission, in particular its correlation and time lag, can help us determine the location of the high-energy emission site in blazars, a current open question in blazar research. To achieve this goal, continuous observation of a large sample of blazars on a time scale of less than a week is indispensable. Since we only look at bright targets, the time available for target observations is mostly limited by source observability, calibration requirements and slewing of the telescope. Here I describe the implementation of a practical solution to this scheduling, calibration, and slewing time minimization problem. This solution combines ideas from optimization, in particular the traveling salesman problem, with astronomical and instrumental constraints. A heuristic solution, using well-established optimization techniques and astronomical insights particular to this situation, allows us to observe all the sources at the required three-day cadence while obtaining reliable calibration of the radio flux densities. Problems of this nature will only become more common in the future, and the ideas presented here can be relevant for other observing programs.
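
    As a toy illustration of the slew-minimization idea described above (not the OVRO 40 m scheduler itself), the following Python sketch orders a random set of sources with a greedy nearest-neighbour heuristic, a common starting point for traveling-salesman-style scheduling; the source list, coordinates, and the use of angular separation as a proxy for slew time are all assumptions made for the example.

    import numpy as np

    # Greedy nearest-neighbour tour over source positions, using angular
    # separation as a stand-in for slew time. Illustrative only.
    rng = np.random.default_rng(0)
    n_src = 50
    ra = rng.uniform(0.0, 360.0, n_src)        # degrees
    dec = rng.uniform(-30.0, 90.0, n_src)      # degrees

    def angular_sep(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees (spherical law of cosines)."""
        r1, d1, r2, d2 = map(np.radians, (ra1, dec1, ra2, dec2))
        cos_sep = np.sin(d1) * np.sin(d2) + np.cos(d1) * np.cos(d2) * np.cos(r1 - r2)
        return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

    def greedy_tour(start=0):
        """Visit every source, always slewing to the nearest unobserved one."""
        unvisited = set(range(n_src)) - {start}
        order, total_slew = [start], 0.0
        while unvisited:
            cur = order[-1]
            nxt = min(unvisited, key=lambda j: angular_sep(ra[cur], dec[cur], ra[j], dec[j]))
            total_slew += angular_sep(ra[cur], dec[cur], ra[nxt], dec[nxt])
            order.append(nxt)
            unvisited.remove(nxt)
        return order, total_slew

    order, slew_deg = greedy_tour()
    print(f"greedy tour slews through {slew_deg:.0f} deg for {n_src} sources")

    A production scheduler would, as the abstract notes, add calibration visits and source-observability windows as further constraints on top of such a heuristic.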

  9. TIME-TAG mode of STIS observations using the MAMA detectors

    NASA Astrophysics Data System (ADS)

    Sahu, Kailash; Danks, Anthony; Baum, Stefi; Balzano, Vicki; Kraemer, Steve; Kutina, Ray; Sears, William

    1995-04-01

    We summarize the time-tag mode of STIS observations using the MAMA detectors, in both imaging and spectroscopic modes. After a brief outline of the MAMA detector characteristics and the astronomical applications of the time-tag mode, the general philosophy and the details of the data management strategy are described. The GO specifications and the consequent modes of the data transfer strategy are outlined. Restrictions on maximum data rates, integration times, and BUFFER-TIME requirements are explained. A few cases where the subarray option would be useful are outlined.

  10. Statistical considerations in creating water vapor data records from combinations of satellite and other observation types, including in situ and ground-based remote sensing

    NASA Astrophysics Data System (ADS)

    Dykema, J. A.; Anderson, J. G.

    2014-12-01

    Measuring water vapor at the highest spatial and temporal resolution at all vertical levels and at arbitrary times requires strategic utilization of disparate observations from satellites, ground-based remote sensing, and in situ measurements. These different measurement types have different response times and very different spatial averaging properties, both horizontally and vertically. Accounting for these different measurement properties and explicit propagation of associated uncertainties is necessary to test particular scientific hypotheses, especially in cases of detection of weak signals in the presence of natural fluctuations, and for process studies with small ensembles. This is also true where ancillary data from meteorological analyses are required, which have their own sampling limitations and uncertainties. This study will review two investigations pertaining to measurements of water vapor in the mid-troposphere and lower stratosphere that mix satellite observations with observations from other sources. The focus of the mid-troposphere analysis is to obtain improved estimates of water vapor at the instant of a sounding satellite overpass. The lower stratosphere work examines the uncertainty inherent in a small ensemble of anomalously elevated lower stratospheric water vapor observations when meteorological analysis products and aircraft in situ observations are required for interpretation.

  11. 49 CFR 71.2 - Annual advancement of standard time.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., requires that the standard time of each State observing Daylight Saving Time shall be advanced 1 hour... 49 Transportation 1 2012-10-01 2012-10-01 false Annual advancement of standard time. 71.2 Section 71.2 Transportation Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.2...

  12. 49 CFR 71.2 - Annual advancement of standard time.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., requires that the standard time of each State observing Daylight Saving Time shall be advanced 1 hour... 49 Transportation 1 2014-10-01 2014-10-01 false Annual advancement of standard time. 71.2 Section 71.2 Transportation Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.2...

  13. 49 CFR 71.2 - Annual advancement of standard time.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., requires that the standard time of each State observing Daylight Saving Time shall be advanced 1 hour... 49 Transportation 1 2013-10-01 2013-10-01 false Annual advancement of standard time. 71.2 Section 71.2 Transportation Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.2...

  14. 49 CFR 71.2 - Annual advancement of standard time.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., requires that the standard time of each State observing Daylight Saving Time shall be advanced 1 hour... 49 Transportation 1 2011-10-01 2011-10-01 false Annual advancement of standard time. 71.2 Section 71.2 Transportation Office of the Secretary of Transportation STANDARD TIME ZONE BOUNDARIES § 71.2...

  15. In-Flight Guidance, Navigation, and Control Performance Results for the GOES-16 Spacecraft

    NASA Technical Reports Server (NTRS)

    Chapel, Jim; Stancliffe, Devin; Bevacqua, Tim; Winkler, Stephen; Clapp, Brian; Rood, Tim; Freesland, Doug; Reth, Alan; Early, Derrick; Walsh, Tim

    2017-01-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R), which launched in November 2016, is the first of the next-generation geostationary weather satellites. GOES-R provides 4 times the resolution, 5 times the observation rate, and 3 times the number of spectral bands for Earth observations compared with its predecessor spacecraft. Additionally, Earth-relative and Sun-relative pointing and pointing stability requirements are maintained throughout reaction wheel desaturation events and station keeping activities, allowing GOES-R to provide continuous Earth and Sun observations. This paper reviews the pointing control, pointing stability, attitude knowledge, and orbit knowledge requirements necessary to realize the ambitious Image Navigation and Registration (INR) objectives of GOES-R. This paper presents a comparison between low-frequency on-orbit pointing results and simulation predictions for both the Earth Pointed Platform (EPP) and Sun Pointed Platform (SPP). Results indicate excellent agreement between simulation predictions and observed on-orbit performance, and compliance with pointing performance requirements. The EPP instrument suite includes 6 seismic accelerometers sampled at 2 kHz, allowing in-flight verification of jitter responses and comparison back to simulation predictions. This paper presents flight results of acceleration, shock response spectrum (SRS), and instrument line-of-sight responses for various operational scenarios and instrument observation modes. The results demonstrate the effectiveness of the dual-isolation approach employed on GOES-R. The spacecraft provides attitude and rate data to the primary Earth-observing instrument at 100 Hz, which are used to adjust instrument scanning. The data must meet the accuracy and latency levels defined by the Integrated Rate Error (IRE) requirements. This paper discusses the on-orbit IRE results, showing compliance with these requirements with margin. During the spacecraft checkout period, IRE disturbances were observed and subsequently attributed to thermal control of the Inertial Measurement Unit (IMU) mounting interface. Adjustments of IMU thermal control and the resulting improvements in IRE are presented. Orbit knowledge represents the final element of INR performance. Extremely accurate orbital position is achieved by GPS navigation at Geosynchronous Earth Orbit (GEO). On-orbit performance results are shown demonstrating compliance with the stringent orbit position accuracy requirements of GOES-R, including during station keeping activities and momentum desaturation events. As we show in this paper, the on-orbit performance of the GNC design provides the necessary capabilities to achieve GOES-R mission objectives.

  16. 23 CFR 511.309 - Provisions for traffic and travel conditions reporting.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... requirements for traffic and travel conditions made available by real-time information programs are: (1... or less from the time the hazardous conditions, blockage, or closure is observed. (4) Travel time information. The timeliness for the availability of travel time information along limited access roadway...

  17. Tracking of time-varying genomic regulatory networks with a LASSO-Kalman smoother

    PubMed Central

    2014-01-01

    It is widely accepted that cellular requirements and environmental conditions dictate the architecture of genetic regulatory networks. Nonetheless, the status quo in regulatory network modeling and analysis assumes an invariant network topology over time. In this paper, we refocus on a dynamic perspective of genetic networks, one that can uncover substantial topological changes in network structure during biological processes such as developmental growth. We propose a novel outlook on the inference of time-varying genetic networks, from a limited number of noisy observations, by formulating the network estimation as a target tracking problem. We overcome the limited number of observations (small n large p problem) by performing tracking in a compressed domain. Assuming linear dynamics, we derive the LASSO-Kalman smoother, which recursively computes the minimum mean-square sparse estimate of the network connectivity at each time point. The LASSO operator, motivated by the sparsity of the genetic regulatory networks, allows simultaneous signal recovery and compression, thereby reducing the amount of required observations. The smoothing improves the estimation by incorporating all observations. We track the time-varying networks during the life cycle of the Drosophila melanogaster. The recovered networks show that few genes are permanent, whereas most are transient, acting only during specific developmental phases of the organism. PMID:24517200
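
    For readers unfamiliar with the idea, the sketch below combines a standard Kalman predict/update step with an L1 soft-thresholding step to keep the state (network connectivity) estimate sparse. It is a loose illustration of the concept, not the authors' LASSO-Kalman smoother: the random-walk dynamics, noise covariances, and shrinkage threshold are invented for the example.

    import numpy as np

    def soft_threshold(x, lam):
        """Elementwise LASSO shrinkage operator."""
        return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

    def sparse_kalman_step(x_prev, P_prev, y, H, Q, R, lam):
        """One predict/update cycle with random-walk dynamics, then L1 shrinkage."""
        # Predict (x_k = x_{k-1} + w_k, w_k ~ N(0, Q))
        x_pred = x_prev
        P_pred = P_prev + Q
        # Update with observation y = H x + v, v ~ N(0, R)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_upd = x_pred + K @ (y - H @ x_pred)
        P_upd = (np.eye(len(x_prev)) - K @ H) @ P_pred
        # Sparsify the connectivity estimate
        return soft_threshold(x_upd, lam), P_upd

    # Toy example: 20 edge weights, 5 noisy linear measurements at one time point
    rng = np.random.default_rng(1)
    p, m = 20, 5
    x_true = np.zeros(p); x_true[:3] = [1.0, -0.8, 0.5]       # sparse "network"
    H = rng.standard_normal((m, p))
    y = H @ x_true + 0.05 * rng.standard_normal(m)

    x, P = np.zeros(p), np.eye(p)
    x, P = sparse_kalman_step(x, P, y, H, 0.01 * np.eye(p), 0.05**2 * np.eye(m), lam=0.05)
    print("nonzero edges in the estimate:", np.flatnonzero(np.abs(x) > 1e-6))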

  18. Earth Sciences Requirements for the Information Sciences Experiment System

    NASA Technical Reports Server (NTRS)

    Bowker, David E. (Editor); Katzberg, Steve J. (Editor); Wilson, R. Gale (Editor)

    1990-01-01

    The purpose of the workshop was to further explore and define the earth sciences requirements for the Information Sciences Experiment System (ISES), a proposed onboard data processor with real-time communications capability intended to support the Earth Observing System (Eos). A review of representative Eos instrument types is given and a preliminary set of real-time data needs has been established. An executive summary is included.

  19. Report of the panel on earth rotation and reference frames, section 7

    NASA Technical Reports Server (NTRS)

    Dickey, Jean O.; Dickman, Steven R.; Eubanks, Marshall T.; Feissel, Martine; Herring, Thomas A.; Mueller, Ivan I.; Rosen, Richard D.; Schutz, Robert E.; Wahr, John M.; Wilson, Charles R.

    1991-01-01

    Objectives and requirements for Earth rotation and reference frame studies in the 1990s are discussed. The objectives are to observe and understand interactions of air and water with the rotational dynamics of the Earth, the effects of the Earth's crust and mantle on the dynamics and excitation of Earth rotation variations over time scales of hours to centuries, and the effects of the Earth's core on the rotational dynamics and the excitation of Earth rotation variations over time scales of a year or longer. Another objective is to establish, refine and maintain terrestrial and celestial reference frames. Requirements include improvements in observations and analysis, improvements in celestial and terrestrial reference frames and reference frame connections, and improved observations of crustal motion and mass redistribution on the Earth.

  20. 50 CFR 679.28 - Equipment and operational requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... estimates, vessel monitoring system hardware, catch monitoring and control plan, and catcher vessel... container to store salmon must be located adjacent to the observer sampling station; (ii) All salmon stored in the container must remain in view of the observer at the observer sampling station at all times...

  1. Solar system exploration from the Moon: Synoptic and comparative study of bodies in our Planetary system

    NASA Technical Reports Server (NTRS)

    Bruston, P.; Mumma, M. J.

    1994-01-01

    An observational approach to Planetary Sciences and exploration from Earth applies to quite a limited number of targets, but most of these are spatially complex, and exhibit variability and evolution on a number of temporal scales which lie within the scope of possible observations. Advancing our understanding of the underlying physics requires the study of interactions between the various elements of such systems, and also requires study of the comparative response of both a given object to various conditions and of comparable objects to similar conditions. These studies are best conducted in 'campaigns', i.e. comprehensive programs combining simultaneous coherent observations of every interacting piece of the puzzle. The requirements include both imaging and spectroscopy over a wide spectral range, from UV to IR. While temporal simultaneity of operation in various modes is a key feature, these observations are also conducted over extended periods of time. The Moon is a prime site offering long unbroken observation times and high positional stability, observations at small angular separation from the Sun, comparative studies of planet Earth, and valuable technical advantages. A lunar observatory should become a central piece of any coherent set of planetary missions, supplying in-situ explorations with the synoptic and comparative data necessary for proper advance planning, correlative observations during the active exploratory phase, and follow-up studies of the target body or of related objects.

  2. [Expenditure on the Care of COPD Patients Under Everyday Conditions in Pneumological Practices Differentiated According to Patients in Chronic Care and New Patients and Severity of the Illness].

    PubMed

    Hellmann, A; Hering, T; Andres, J

    2018-06-01

    New patients in secondary respiratory care require more time for the first consultation and place a higher diagnostic and therapeutic demand compared with patients already in chronic care. More diagnostic procedures and patient education by the team are required. No such additional burden is observed across different degrees of severity of respiratory diseases, e.g. COPD. The overall demands add up to twice the demands of patients already in care. Thus the time required for the treatment of 50 new patients would allow consultations for 100 patients already known to the practice. As the additional time and effort for new patients is not adequately represented in the German physicians' fee schedule (EBM), a trend towards risk selection and a preference for control patients is observed. In contrast, incentives to foster the treatment of new patients could be an effective measure to dramatically reduce waiting times for visits with pulmonologists. This should be achieved by changes in the EBM. © Georg Thieme Verlag KG Stuttgart · New York.

  3. Investigation of transient earth resources phenomena: Continuation study

    NASA Technical Reports Server (NTRS)

    Goldman, G. C.

    1974-01-01

    Calculated sensitivity requirements for an earth resource satellite in a geostationary orbit are reported. Radiance levels at the satellite sensor were computed for twenty top-priority Synchronous Earth Observatory Satellite (SEOS) applications. The observation requirements were reviewed and re-evaluated in terms of spectral band definition, spectral signatures of targets and backgrounds, observation time, and site location. With these data and an atmospheric attenuation and scattering model, the total radiances observed by the SEOS sensor were calculated as were the individual components contributed by the target, target variations, and the atmosphere.

  4. Efficient strategies for leave-one-out cross validation for genomic best linear unbiased prediction.

    PubMed

    Cheng, Hao; Garrick, Dorian J; Fernando, Rohan L

    2017-01-01

    A random multiple-regression model that simultaneously fits all allele substitution effects for additive markers or haplotypes as uncorrelated random effects was proposed for Best Linear Unbiased Prediction using whole-genome data. Leave-one-out cross validation can be used to quantify the predictive ability of a statistical model. Naive application of leave-one-out cross validation is computationally intensive because the training and validation analyses need to be repeated n times, once for each observation. Efficient leave-one-out cross validation strategies are presented here, requiring little more effort than a single analysis. The efficient strategy is 786 times faster than the naive application for a simulated dataset with 1,000 observations and 10,000 markers, and 99 times faster with 1,000 observations and 100 markers. These efficiencies relative to the naive approach using the same model will increase with the number of observations.
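
    The speed-up rests on a standard shortcut for linear smoothers such as ridge regression (closely related to GBLUP): the leave-one-out residual equals the ordinary residual divided by one minus the corresponding diagonal element of the hat matrix, so no refitting is needed. The sketch below demonstrates this identity on simulated data; the marker matrix, sample size, and penalty are arbitrary and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(2)
    n, p, lam = 200, 500, 10.0
    X = rng.standard_normal((n, p))                      # toy marker genotypes
    beta = np.zeros(p); beta[:10] = rng.standard_normal(10)
    y = X @ beta + rng.standard_normal(n)

    # Hat matrix of the ridge smoother: H = X (X'X + lam*I)^{-1} X'
    A = X.T @ X + lam * np.eye(p)
    H = X @ np.linalg.solve(A, X.T)
    resid = y - H @ y

    # Exact LOO residuals without refitting n times: e_i / (1 - h_ii)
    loo_resid = resid / (1.0 - np.diag(H))
    print("LOO mean squared error:", np.mean(loo_resid**2))

    # Brute-force check on a few observations
    for i in range(3):
        mask = np.ones(n, dtype=bool); mask[i] = False
        Ai = X[mask].T @ X[mask] + lam * np.eye(p)
        bi = np.linalg.solve(Ai, X[mask].T @ y[mask])
        assert np.isclose(y[i] - X[i] @ bi, loo_resid[i])
    print("shortcut matches brute-force refits")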

  5. The effectiveness of snow cube throwing learning model based on exploration

    NASA Astrophysics Data System (ADS)

    Sari, Nenden Mutiara

    2017-08-01

    This study aimed to determine the effectiveness of the Snow Cube Throwing (SCT) and cooperative models in exploration-based mathematics learning, in terms of the time required to complete the teaching materials and student engagement. This quasi-experimental study was conducted at SMPN 5 Cimahi, Indonesia. All 382 students in grade VIII of SMPN 5 Cimahi constituted the population. The sample consisted of two classes chosen through purposive sampling: the first experimental class consisted of 38 students and the second experimental class consisted of 38 students. An observation sheet was used to record the time required to complete the teaching materials and the number of students involved in each meeting. The data obtained were analyzed with an independent-samples t test and presented in charts. The results show that the exploration-based SCT learning model is more effective than the exploration-based cooperative learning model in terms of the time required to complete the teaching materials and student engagement.

  6. On modeling animal movements using Brownian motion with measurement error.

    PubMed

    Pozdnyakov, Vladimir; Meyer, Thomas; Wang, Yu-Bo; Yan, Jun

    2014-02-01

    Modeling animal movements with Brownian motion (or more generally by a Gaussian process) has a long tradition in ecological studies. The recent Brownian bridge movement model (BBMM), which incorporates measurement errors, has been quickly adopted by ecologists because of its simplicity and tractability. We discuss some nontrivial properties of the discrete-time stochastic process that results from observing a Brownian motion with added normal noise at discrete times. In particular, we demonstrate that the observed sequence of random variables is not Markov. Consequently the expected occupation time between two successively observed locations does not depend on just those two observations; the whole path must be taken into account. Nonetheless, the exact likelihood function of the observed time series remains tractable; it requires only sparse matrix computations. The likelihood-based estimation procedure is described in detail and compared to the BBMM estimation.
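
    To make the observation model concrete: the observed positions are Brownian motion plus independent measurement noise, so their joint distribution is Gaussian with covariance sigma^2 * min(t_i, t_j) + tau^2 * I. The sketch below evaluates that exact log-likelihood for a one-dimensional track using the dense covariance matrix for clarity (the paper instead exploits sparse-matrix structure); all parameter values are illustrative assumptions.

    import numpy as np

    def bm_noise_loglik(y, t, sigma2, tau2):
        """log N(y | 0, sigma2*min(t_i,t_j) + tau2*I), with B(0)=0 and all t>0."""
        t = np.asarray(t, dtype=float)
        cov = sigma2 * np.minimum.outer(t, t) + tau2 * np.eye(len(t))
        sign, logdet = np.linalg.slogdet(cov)
        quad = y @ np.linalg.solve(cov, y)
        return -0.5 * (len(t) * np.log(2 * np.pi) + logdet + quad)

    # Simulate one track and evaluate the likelihood on a small parameter grid
    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0.1, 10.0, 100))
    sigma2_true, tau2_true = 1.5, 0.2
    increments = np.sqrt(np.diff(np.concatenate(([0.0], t))) * sigma2_true)
    bm = np.cumsum(increments * rng.standard_normal(len(t)))
    y = bm + np.sqrt(tau2_true) * rng.standard_normal(len(t))

    for s2 in (0.5, 1.5, 3.0):
        print("sigma2 =", s2, " loglik =", round(bm_noise_loglik(y, t, s2, tau2_true), 2))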

  7. An Examination of the Reliability of a New Observation Measure for Autism Spectrum Disorders: The Autism Spectrum Disorder Observation for Children

    ERIC Educational Resources Information Center

    Neal, Daniene; Matson, Johnny L.; Belva, Brian C.

    2013-01-01

    The "autism spectrum disorder observation for children" ("ASD-OC") is a newly created 54-item observation measure for autism spectrum disorders (ASD). Due to the fact that many of the ASD observation measures currently available do not have established psychometric properties and require extensive time and training to administer, the "ASD-OC"…

  8. Variable Generation Power Forecasting as a Big Data Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haupt, Sue Ellen; Kosovic, Branko

    To blend growing amounts of power from renewable resources into utility operations requires accurate forecasts. For both day ahead planning and real-time operations, the power from the wind and solar resources must be predicted based on real-time observations and a series of models that span the temporal and spatial scales of the problem, using the physical and dynamical knowledge as well as computational intelligence. Accurate prediction is a Big Data problem that requires disparate data, multiple models that are each applicable for a specific time frame, and application of computational intelligence techniques to successfully blend all of the model and observational information in real-time and deliver it to the decision makers at utilities and grid operators. This paper describes an example system that has been used for utility applications and how it has been configured to meet utility needs while addressing the Big Data issues.

  9. Variable Generation Power Forecasting as a Big Data Problem

    DOE PAGES

    Haupt, Sue Ellen; Kosovic, Branko

    2016-10-10

    To blend growing amounts of power from renewable resources into utility operations requires accurate forecasts. For both day ahead planning and real-time operations, the power from the wind and solar resources must be predicted based on real-time observations and a series of models that span the temporal and spatial scales of the problem, using the physical and dynamical knowledge as well as computational intelligence. Accurate prediction is a Big Data problem that requires disparate data, multiple models that are each applicable for a specific time frame, and application of computational intelligence techniques to successfully blend all of the model and observational information in real-time and deliver it to the decision makers at utilities and grid operators. This paper describes an example system that has been used for utility applications and how it has been configured to meet utility needs while addressing the Big Data issues.

  10. Efficient quantum algorithm for computing n-time correlation functions.

    PubMed

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.
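
    One straightforward way to see the ancilla idea (a Hadamard-test-style construction, sketched here in numpy rather than the authors' exact protocol) is to estimate a two-time correlation of a single qubit and compare it with direct matrix algebra; the Hamiltonian, state, and operators below are arbitrary choices for the demonstration.

    import numpy as np
    from scipy.linalg import expm

    I2 = np.eye(2); sx = np.array([[0, 1], [1, 0]], complex)
    sy = np.array([[0, -1j], [1j, 0]]); sz = np.diag([1.0, -1.0]).astype(complex)
    P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])   # ancilla projectors

    H = sx                                   # system Hamiltonian (assumption)
    psi = np.array([1.0, 1.0], complex) / np.sqrt(2)
    A = B = sz                               # operators in the correlator

    def controlled(U):
        """Ancilla-controlled version of a system unitary U."""
        return np.kron(P0, I2) + np.kron(P1, U)

    def correlator(t):
        U = expm(-1j * H * t)
        # Exact value for comparison: <psi| U^dag A U B |psi>
        exact = psi.conj() @ (U.conj().T @ A @ U @ B @ psi)
        # Circuit: ancilla |+>, c-B, U, c-A, U^dag, then <X>, <Y> on the ancilla
        state = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), psi).astype(complex)
        for gate in (controlled(B), np.kron(I2, U), controlled(A), np.kron(I2, U.conj().T)):
            state = gate @ state
        re = np.real(state.conj() @ np.kron(sx, I2) @ state)
        im = np.real(state.conj() @ np.kron(sy, I2) @ state)
        return exact, re + 1j * im

    for t in (0.0, 0.5, 1.0):
        exact, circuit = correlator(t)
        print(t, np.round(exact, 6), np.round(circuit, 6))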

  11. The relationship between worker, occupational and workplace characteristics and whether an injury requires time off work: a matched case-control analysis in Ontario, Canada.

    PubMed

    Smith, Peter; Chen, Cynthia; Mustard, Cameron; Hogg-Johnson, Sheilah; Tompa, Emile

    2015-04-01

    The objective of this study was to examine individual, occupational, and workplace level factors associated with time loss following a similar injury. Seven thousand three hundred and forty-eight workers' compensation claims that did not require time off work were matched with up to four claims that required time off work on the event, nature, and part of body injured as well as injury year. Conditional logistic regression models examined individual, occupational, and workplace level factors that were associated with the likelihood of not requiring time off work. Employees from firms with higher premium rates were more likely to report no time loss from work and workers in more physically demanding occupations were less likely to report no time loss from work. We observed no association between age or gender and the probability of a time loss claim submission. Our results suggest that insurance costs are an incentive for workplaces to adopt policies and practices that minimize time loss following a work injury. © 2015 Wiley Periodicals, Inc.

  12. Giving cosmic redshift drift a whirl

    NASA Astrophysics Data System (ADS)

    Kim, Alex G.; Linder, Eric V.; Edelstein, Jerry; Erskine, David

    2015-03-01

    Redshift drift provides a direct kinematic measurement of cosmic acceleration but it occurs with a characteristic time scale of a Hubble time. Thus redshift observations with a challenging precision of 10^-9 require a 10 year time span to obtain a signal-to-noise of 1. We discuss theoretical and experimental approaches to address this challenge, potentially requiring less observer time and having greater immunity to common systematics. On the theoretical side we explore allowing the universe, rather than the observer, to provide long time spans; speculative methods include radial baryon acoustic oscillations, cosmic pulsars, and strongly lensed quasars. On the experimental side, we explore beating down the redshift precision using differential interferometric techniques, including externally dispersed interferometers and spatial heterodyne spectroscopy. Low-redshift emission line galaxies are identified as having high cosmology leverage and systematics control, with an 8 h exposure on a 10-m telescope (1000 h of exposure on a 40-m telescope) potentially capable of measuring the redshift of a galaxy to a precision of 10^-8 (few ×10^-10). Low-redshift redshift drift also has very strong complementarity with cosmic microwave background measurements, with the combination achieving a dark energy figure of merit of nearly 300 (1400) for 5% (1%) precision on drift.
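
    For orientation, the expected signal follows from the Sandage-Loeb relation dz/dt = (1+z)*H0 - H(z); the back-of-the-envelope sketch below evaluates the drift accumulated over a decade for a flat LCDM cosmology with conventional assumed parameters (not values taken from the paper), recovering the familiar few-centimetres-per-second-per-decade scale.

    import numpy as np

    H0_km_s_Mpc = 70.0
    Mpc_km = 3.0857e19
    H0 = H0_km_s_Mpc / Mpc_km                 # Hubble constant in s^-1
    Om, OL = 0.3, 0.7                         # assumed flat LCDM parameters
    c_cm_s = 2.9979e10
    yr_s = 3.156e7

    def Hz(z):
        return H0 * np.sqrt(Om * (1 + z)**3 + OL)

    for z in (0.1, 1.0, 3.0):
        dzdt = (1 + z) * H0 - Hz(z)               # Sandage-Loeb drift rate, s^-1
        dz_10yr = dzdt * 10 * yr_s                # redshift drift over a decade
        dv_10yr = c_cm_s * dz_10yr / (1 + z)      # equivalent velocity drift, cm/s
        print(f"z={z}: dz(10 yr) = {dz_10yr:+.2e}, dv = {dv_10yr:+.2f} cm/s")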

  13. Semi-physical simulation test for micro CMOS star sensor

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Zhang, Guang-jun; Jiang, Jie; Fan, Qiao-yun

    2008-03-01

    A designed star sensor must be extensively tested before launch. Testing a star sensor is a complicated process requiring much time and many resources. Even observing the sky from the ground is a challenging and time-consuming job, requiring complicated and expensive equipment and a suitable time and location, and it is prone to interference from weather. Moreover, not all stars distributed across the sky can be observed by this testing method. Semi-physical simulation in the laboratory reduces the testing cost and helps to debug, analyze and evaluate the star sensor system while developing the model. The test system is composed of an optical platform, a star field simulator, a star field simulator computer, the star sensor and the central data processing computer. The test system simulates starlight with high accuracy and good parallelism, and creates static or dynamic images in the FOV (Field of View). The conditions of the test are close to observing the real sky. With this system, the test of a micro star tracker designed by Beijing University of Aeronautics and Astronautics has been performed successfully. Several indices, including full-sky autonomous star identification time, attitude update frequency and attitude precision, meet the design requirements of the star sensor. Error sources of the testing system are also analyzed. It is concluded that the testing system is cost-saving and efficient, and contributes to optimizing the embedded algorithms, shortening the development cycle and improving engineering design processes.

  14. A new high resolution permafrost map of Iceland from Earth Observation data

    NASA Astrophysics Data System (ADS)

    Barnie, Talfan; Conway, Susan; Balme, Matt; Graham, Alastair

    2017-04-01

    High resolution maps of permafrost are required for ongoing monitoring of environmental change and the resulting hazards to ecosystems, people and infrastructure. However, permafrost maps are difficult to construct - direct observations require maintaining networks of sensors and boreholes in harsh environments and are thus limited in extent in space and time, and indirect observations require models or assumptions relating the measurements (e.g. weather station air temperature, basal snow temperature) to ground temperature. Operationally produced Land Surface Temperature maps from Earth Observation data can be used to make spatially contiguous estimates of mean annual skin temperature, which has been used as a proxy for the presence of permafrost. However, these maps are subject to biases due to (i) selective sampling during the day due to limited satellite overpass times, (ii) selective sampling over the year due to seasonally varying cloud cover, (iii) selective sampling of LST only during clear-sky conditions, (iv) errors in cloud masking, (v) errors in temperature emissivity separation, and (vi) smoothing over spatial variability. In this study we attempt to compensate for some of these problems using a Bayesian modelling approach and high resolution topography-based downscaling.

  15. Comparison of the coronal mass ejection shock acceleration of three widespread SEP events during solar cycle 24

    NASA Astrophysics Data System (ADS)

    Xie, H.; Mäkelä, P.; St. Cyr, O. C.; Gopalswamy, N.

    2017-07-01

    We studied three solar energetic particle (SEP) events observed on 14 August 2010, 3 November 2011, and 5 March 2013 by the Solar Terrestrial Relations Observatory (STEREO) A, B, and near-Earth (L1) spacecraft, with a longitudinal distribution of particles >90°. Using a forward modeling method combined with extreme ultraviolet and white-light images, we determined the angular extent of the shock, the time and location (cobpoint) of the shock intersection with the magnetic field line connecting to each spacecraft, and computed the shock speed at the cobpoint of each spacecraft. We then examined whether the SEPs observed at each spacecraft were accelerated and injected by the spatially extended shocks or whether another mechanism such as cross-field transport is required as an alternative explanation. The results of our analyses indicate that the SEPs observed at the three spacecraft on 3 November, at STEREO B (STB) and L1 on 14 August, and the 5 March SEP event at STEREO A (STA) can be explained by direct shock acceleration. This is consistent with the observed significant anisotropies, short time delays between particle release times and magnetic connection times, and sharp rises in the SEP time profiles. Cross-field diffusion is the likely cause for the 14 August SEP event observed by STA and the 5 March SEPs observed by the STB and L1 spacecraft, as the particle observations featured weak electron anisotropies and slowly rising intensity profiles. Otherwise, the wide longitudinal spread of these SEP increases would require the existence of a circumsolar shock, which may not be a correct assumption in the corona and heliosphere.

  16. Comparison of the Coronal Mass Ejection Shock Acceleration of Three Widespread SEP Events During Solar Cycle 24

    NASA Technical Reports Server (NTRS)

    Xie, H.; Makela, P.; St. Cyr, O. C.; Gopalswamy, N.

    2017-01-01

    We studied three solar energetic particle (SEP) events observed on 14 August 2010, 3 November 2011, and 5 March 2013 by the Solar Terrestrial Relations Observatory (STEREO) A, B, and near-Earth (L1) spacecraft, with a longitudinal distribution of particles greater than 90 degrees. Using a forward modeling method combined with extreme ultraviolet and white-light images, we determined the angular extent of the shock, the time and location (cobpoint) of the shock intersection with the magnetic field line connecting to each spacecraft, and computed the shock speed at the cobpoint of each spacecraft. We then examined whether the SEPs observed at each spacecraft were accelerated and injected by the spatially extended shocks or whether another mechanism such as cross-field transport is required as an alternative explanation. The results of our analyses indicate that the SEPs observed at the three spacecraft on 3 November, at STEREO B (STB) and L1 on 14 August, and the 5 March SEP event at STEREO A (STA) can be explained by direct shock acceleration. This is consistent with the observed significant anisotropies, short time delays between particle release times and magnetic connection times, and sharp rises in the SEP time profiles. Cross-field diffusion is the likely cause for the 14 August SEP event observed by STA and the 5 March SEPs observed by the STB and L1 spacecraft, as the particle observations featured weak electron anisotropies and slowly rising intensity profiles. Otherwise, the wide longitudinal spread of these SEP increases would require the existence of a circumsolar shock, which may not be a correct assumption in the corona and heliosphere.

  17. Stakeholder Alignment for Requirements in GEOValue

    NASA Astrophysics Data System (ADS)

    Cutcher-Gershenfeld, J.; King, J. L.

    2016-12-01

    Observation systems that collect information on environmental parameters relevant to biological and physical earth resources provide value. This has been demonstrated so many times in so many ways that it is not worth deliberating. Earlier projects for research (whether it is possible to do this or that), or for "dual use," typically involving defense, have been successful. Wealthy parties have built their own systems. Less wealthy parties seek to sustain the systems they have built. The history of systems suggests that "requirements" will be the next step. The objective is to maximize the "return" on the substantial investment required for construction, deployment, maintenance and renewal of observation systems. Stakeholders and their interests are assessed to construct the requirements from which specifications are built. Specifications drive procurement, and procurement produces built systems. Complicated (e.g. space-based) systems have long times between requirements analysis and deployment. It all depends on getting the requirements right, which depends on understanding stakeholders and requirements. And this is where things get complicated. Stakeholders and interests change, sometimes rapidly, as what is possible is altered. It becomes increasingly difficult to achieve stakeholder alignment required for effective management of constituent politics at the heart of any expensive endeavor. This paper presents results from a major study of stakeholder alignment in the Earth Sciences, focused especially on EarthCube.

  18. Effects of observation heights and atmospheric wave evolution in sunspot seismology: a study using HMI and AIA (1600 A and 1700 A) data

    NASA Astrophysics Data System (ADS)

    Rajaguru, S. P.; Couvidat, S.

    2011-10-01

    In achieving the high cadence and whole-Sun coverage required of them, Doppler imagers such as HMI/SDO and MDI/SOHO necessarily forgo certain intricacies associated with magnetic and velocity field interactions, which require high (spectral) resolution spectropolarimetry for their accurate measurement and a straightforward derivation of physical quantities (or observables). Magnetic-field-modified wave evolution in inclined field regions, due to much reduced acoustic cut-off frequencies, is one such situation. We first show, using high-cadence imaging spectropolarimetric observations made with the IBIS instrument at NSO/Sac Peak, that significant contributions to seismically measured travel times arise from the line formation layers. We then present a comparative study of time-distance helioseismic measurements made over three sunspot regions using HMI and AIA (1600 A and 1700 A) data, which provide oscillation signals from three different heights. We bring out clear signals of height-dependent wave phases and hence height-dependent travel times. We further show that such signatures, from their differing contributions to one-way travel times (in- or out-going wave travel times), could explain a significant part of the discrepancies between time-distance and other local helioseismic measurements and inferences.

  19. Time required to initiate outbreak and pandemic observational research.

    PubMed

    Rishu, Asgar H; Marinoff, Nicole; Julien, Lisa; Dumitrascu, Mariana; Marten, Nicole; Eggertson, Shauna; Willems, Su; Ruddell, Stacy; Lane, Dan; Light, Bruce; Stelfox, Henry T; Jouvet, Philippe; Hall, Richard; Reynolds, Steven; Daneman, Nick; Fowler, Robert A

    2017-08-01

    Observational research focused upon emerging infectious diseases such as Ebola virus, Middle East respiratory syndrome, and Zika virus has been challenging to quickly initiate. We aimed to determine the duration of start-up procedures and barriers encountered for an observational study focused upon such infectious outbreaks. At 1 pediatric and 5 adult intensive care units, we measured durations from protocol receipt to a variety of outbreak research milestones, including research ethics board (REB) approval, data sharing agreement (DSA) execution, and patient study screening initiation. The median (interquartile range) time from site receipt of the protocol to REB submission was 73 (30-126) days; to REB approval, 158 (42-188) days; to DSA completion, 276 (186-312) days; and to study screening initiation, 293 (269-391) days. The median time from REB submission to REB approval was 43 (13-85) days. The median time for all start-up procedures was 335 (188-335) days. There is a lengthy start-up period required for outbreak-focused research. Completing DSAs was the most time-consuming step. A reactive approach to newly emerging threats such as Ebola virus, Middle East respiratory syndrome, and Zika virus will likely not allow sufficient time to initiate research before most outbreaks are advanced. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Localizing New Pulsars with Intensity Mapping

    NASA Astrophysics Data System (ADS)

    Swiggum, Joe; Gentile, Peter

    2018-01-01

    Although low-frequency, single dish pulsar surveys provide an efficient means of searching large regions of sky quickly, the localization of new discoveries is poor. For example, discoveries from 350 MHz surveys using the Green Bank Telescope (GBT) have position uncertainties up to the FWHM of the telescope's "beam" on the sky, over half a degree! Before finding a coherent timing solution (which requires 8-12 months of dedicated timing observations), a "gridding" method is usually employed to improve localization of new pulsars, whereby a grid of higher frequency beam positions is used to tile the initial error region. This method often requires over an hour of observing time to achieve arcminute-precision localization (provided the pulsar is detectable at higher frequencies). Here, we describe another method that uses the same observing frequency as the discovery observation and scans over the Right Ascension and Declination directions around the nominal position. A Gaussian beam model is fit to folded pulse profile intensities as a function of time/position to provide improved localization. Using five test cases, we show that intensity mapping localization at 350 MHz with the GBT yields pulsar positions to 1 arcminute precision, facilitating high-frequency follow-up and higher significance detections for future pulsar timing. This method is also well suited to be directly implemented in future low-frequency drift scan pulsar surveys (e.g. with the Five hundred meter Aperture Spherical Telescope; FAST).
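
    A minimal sketch of the fitting step described above: model the folded-profile intensity along a scan as a Gaussian telescope beam and fit for the offset of the peak. The beam width, scan layout, and noise level below are rough assumptions for illustration, not survey values.

    import numpy as np
    from scipy.optimize import curve_fit

    fwhm_deg = 0.6                                # roughly a 350 MHz GBT beam (assumed)
    sigma_beam = fwhm_deg / (2 * np.sqrt(2 * np.log(2)))

    def beam(offset, amp, center, baseline):
        """Detected intensity versus scan offset for a Gaussian beam response."""
        return amp * np.exp(-0.5 * ((offset - center) / sigma_beam)**2) + baseline

    rng = np.random.default_rng(4)
    scan = np.linspace(-0.5, 0.5, 41)             # degrees along one scan direction
    true_offset = 0.13
    data = beam(scan, 10.0, true_offset, 1.0) + rng.normal(0, 0.5, scan.size)

    popt, pcov = curve_fit(beam, scan, data, p0=[5.0, 0.0, 0.0])
    err = np.sqrt(pcov[1, 1])
    print(f"fitted offset = {popt[1]*60:.1f} +/- {err*60:.1f} arcmin (true {true_offset*60:.1f})")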

  1. Structure, Dynamics, and Deuterium Fractionation of Massive Pre-stellar Cores

    NASA Astrophysics Data System (ADS)

    Goodson, Matthew D.; Kong, Shuo; Tan, Jonathan C.; Heitsch, Fabian; Caselli, Paola

    2016-12-01

    High levels of deuterium fraction in N2H+ are observed in some pre-stellar cores. Single-zone chemical models find that the timescale required to reach observed values (D_frac(N2H+) ≡ N2D+/N2H+ ≳ 0.1) is longer than the free-fall time, possibly 10 times longer. Here, we explore the deuteration of turbulent, magnetized cores with 3D magnetohydrodynamic simulations. We use an approximate chemical model to follow the growth in abundances of N2H+ and N2D+. We then examine the dynamics of the core using each tracer for comparison to observations. We find that the velocity dispersion of the core as traced by N2D+ appears slightly sub-virial compared to predictions of the Turbulent Core Model of McKee & Tan, except at late times just before the onset of protostar formation. By varying the initial mass surface density, the magnetic energy, the chemical age, and the ortho-to-para ratio of H2, we also determine the physical and temporal properties required for high deuteration. We find that low initial ortho-to-para ratios (≲ 0.01) and/or multiple free-fall times (≳ 3) of prior chemical evolution are necessary to reach the observed values of deuterium fraction in pre-stellar cores.

  2. Automated observation scheduling for the VLT

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    It is becoming increasingly evident that, in order to optimize the observing efficiency of large telescopes, some changes will be required in the way observations are planned and executed. Not all observing programs require the presence of the astronomer at the telescope: for those programs which permit service observing it is possible to better match planned observations to conditions at the telescope. This concept of flexible scheduling has been proposed for the VLT: based on current and predicted environmental and instrumental conditions, the telescope would execute those observations which make the most efficient possible use of valuable time. A similar kind of observation scheduling is already necessary for some space observatories, such as the Hubble Space Telescope (HST). The Space Telescope Science Institute is presently developing scheduling tools for HST, based on the use of artificial intelligence software development techniques. These tools could be readily adapted for ground-based telescope scheduling since they address many of the same issues. The concepts on which the HST tools are based are described, along with their implementation and what would be required to adapt them for use with the VLT and other ground-based observatories.

  3. Observability of nonlinear dynamics: normalized results and a time-series approach.

    PubMed

    Aguirre, Luis A; Bastos, Saulo B; Alves, Marcela A; Letellier, Christophe

    2008-03-01

    This paper investigates the observability of nonlinear dynamical systems. Two difficulties associated with previous studies are dealt with. First, a normalized degree of observability is defined. This permits the comparison of different systems, which was not generally possible before. Second, a time-series approach is proposed, based on omnidirectional nonlinear correlation functions, to rank a set of time series of a system in terms of their potential use to reconstruct the original dynamics without requiring knowledge of the system equations. The two approaches proposed in this paper and a former method were applied to five benchmark systems and an overall agreement of over 92% was found.

  4. Simplified Interval Observer Scheme: A New Approach for Fault Diagnosis in Instruments

    PubMed Central

    Martínez-Sibaja, Albino; Astorga-Zaragoza, Carlos M.; Alvarado-Lassman, Alejandro; Posada-Gómez, Rubén; Aguila-Rodríguez, Gerardo; Rodríguez-Jarquin, José P.; Adam-Medina, Manuel

    2011-01-01

    There are different schemes based on observers to detect and isolate faults in dynamic processes. In the case of fault diagnosis in instruments (FDI) there are different diagnosis schemes based on the number of observers: the Simplified Observer Scheme (SOS) only requires one observer, uses all the inputs and only one output, and detects faults in one sensor; the Dedicated Observer Scheme (DOS) again uses all the inputs and just one output per observer, but this time there is a bank of observers capable of locating multiple faults in sensors; and the Generalized Observer Scheme (GOS) involves a reduced bank of observers, where each observer uses all the inputs and m-1 outputs, and allows the localization of unique faults. This work proposes a new scheme named the Simplified Interval Observer Scheme (SIOS-FDI), which does not require the measurement of any input and, with just one output, allows the detection of unique faults in sensors. Because it does not require any input, it simplifies in an important way the diagnosis of faults in processes in which it is difficult to measure all the inputs, as in the case of biological reactors. PMID:22346593
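
    As a generic illustration of observer-based sensor-fault detection (in the spirit of the single-observer SOS idea, not the paper's interval observer), the sketch below runs a Luenberger observer on a toy discrete-time plant and flags the residual when an additive sensor fault is injected; the plant, observer gain, and threshold are arbitrary assumptions.

    import numpy as np

    A = np.array([[0.9, 0.1], [0.0, 0.8]])        # discrete-time plant dynamics
    B = np.array([[0.0], [0.1]])
    C = np.array([[1.0, 0.0]])
    L = np.array([[0.5], [0.2]])                  # observer gain (A - L*C stable here)

    rng = np.random.default_rng(5)
    x = np.zeros((2, 1)); xhat = np.zeros((2, 1))
    threshold = 0.3

    for k in range(60):
        u = np.array([[np.sin(0.1 * k)]])
        x = A @ x + B @ u
        y = C @ x + 0.01 * rng.standard_normal((1, 1))
        if k >= 40:                                # additive sensor fault from k = 40
            y = y + 0.5
        residual = (y - C @ xhat).item()
        xhat = A @ xhat + B @ u + L * residual     # Luenberger observer update
        if abs(residual) > threshold:
            print(f"k={k}: |residual|={abs(residual):.2f} -> sensor fault flagged")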

  5. The effect of modeled absolute timing variability and relative timing variability on observational learning.

    PubMed

    Grierson, Lawrence E M; Roberts, James W; Welsher, Arthur M

    2017-05-01

    There is much evidence to suggest that skill learning is enhanced by skill observation. Recent research on this phenomenon indicates a benefit of observing variable/erred demonstrations. In this study, we explore whether it is variability within the relative organization or the absolute parameterization of a movement that facilitates skill learning through observation. To do so, participants were randomly allocated into groups that observed a model with no variability, absolute timing variability, relative timing variability, or variability in both absolute and relative timing. All participants performed a four-segment movement pattern with specific absolute and relative timing goals prior to and following the observational intervention, as well as in a 24 h retention test and transfer tests that featured new relative and absolute timing goals. Absolute timing error indicated that all groups initially acquired the absolute timing, maintained their performance at 24 h retention, and exhibited performance deterioration in both transfer tests. Relative timing error revealed that the observation of no variability and of relative timing variability produced greater performance at the post-test, 24 h retention, and relative timing transfer tests, but performance for the no-variability group deteriorated in the absolute timing transfer test. The results suggest that the learning of absolute timing following observation unfolds irrespective of model variability. However, the learning of relative timing benefits from holding the absolute features constant, while the observation of no variability partially fails in transfer. We suggest that learning by observing no-variability and variable/erred models unfolds via similar neural mechanisms, although the latter benefits from the additional coding of information pertaining to movements that require a correction. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Impact of a Checklist on Principal-Teacher Feedback Conferences Following Classroom Observations. REL 2018-285

    ERIC Educational Resources Information Center

    Mihaly, Kata; Schwartz, Heather L.; Opper, Isaac M.; Grimm, Geoffrey; Rodriguez, Luis; Mariano, Louis T.

    2018-01-01

    Most states' teacher evaluation systems have changed substantially in the past decade. New evaluation systems typically require school leaders to observe teachers' classrooms two to three times a school year instead of once (Doherty & Jacobs, 2015). The feedback that school leaders provide to teachers after these observations is a key but…

  7. Time-Accurate Solutions of Incompressible Navier-Stokes Equations for Potential Turbopump Applications

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan

    2001-01-01

    Two numerical procedures, one based on the artificial compressibility method and the other on the pressure projection method, are outlined for obtaining time-accurate solutions of the incompressible Navier-Stokes equations. The performance of the two methods is compared by obtaining unsteady solutions for the evolution of twin vortices behind a flat plate. Calculated results are compared with experimental and other numerical results. For an unsteady flow which requires a small physical time step, the pressure projection method was found to be computationally efficient since it does not require any subiteration procedure. It was observed that the artificial compressibility method requires a fast convergence scheme at each physical time step in order to satisfy the incompressibility condition. This was obtained by using a GMRES-ILU(0) solver in our computations. When a line-relaxation scheme was used, the time accuracy was degraded and time-accurate computations became very expensive.

  8. The University of Colorado OSO-8 spectrometer experiment. IV - Mission operations

    NASA Technical Reports Server (NTRS)

    Hansen, E. R.; Bruner, E. C., Jr.

    1979-01-01

    The remote operation of two high-resolution ultraviolet spectrometers on the OSO-8 satellite is discussed. Mission operations enabled scientific observers to plan observations based on current solar data, interact with the observing program using real- or near real-time data and commands, evaluate quick-look instrument data, and analyze the observations for publication. During routine operations, experiments were planned a day prior to their execution, and the data from these experiments received a day later. When a shorter turnaround was required, a real-time mode was available. Here, the real-time data and command links into the remote control center were used to evaluate experiment operation and make satellite pointing or instrument configuration changes with a 1-90 minute turnaround.

  9. Two-Time Scale Virtual Sensor Design for Vibration Observation of a Translational Flexible-Link Manipulator Based on Singular Perturbation and Differential Games

    PubMed Central

    Ju, Jinyong; Li, Wei; Wang, Yuqiao; Fan, Mengbao; Yang, Xuefeng

    2016-01-01

    Effective feedback control requires all state variable information of the system. However, in the translational flexible-link manipulator (TFM) system, it is unrealistic to measure the vibration signals and their time derivatives at every point of the TFM with an infinite number of sensors. With the rigid-flexible coupling between the global motion of the rigid base and the elastic vibration of the flexible-link manipulator considered, a two-time scale virtual sensor, which includes a speed observer and a vibration observer, is designed to estimate the vibration signals and their time derivatives for the TFM; the speed observer and the vibration observer are designed separately for the slow and fast subsystems, which are decomposed from the dynamic model of the TFM by singular perturbation. Additionally, based on linear-quadratic differential games, the observer gains of the two-time scale virtual sensor are optimized, which aims to minimize the estimation error while keeping the observer stable. Finally, numerical calculation and experiment verify the efficiency of the designed two-time scale virtual sensor. PMID:27801840

  10. Correlations between time required for radiological diagnoses, readers' performance, display environments, and difficulty of cases

    NASA Astrophysics Data System (ADS)

    Gur, David; Rockette, Howard E.; Sumkin, Jules H.; Hoy, Ronald J.; Feist, John H.; Thaete, F. Leland; King, Jill L.; Slasky, B. S.; Miketic, Linda M.; Straub, William H.

    1991-07-01

    In a series of large ROC studies, the authors analyzed the time radiologists took to diagnose PA chest images as a function of observer performance indices (Az), display environments, and difficulty of cases. Board-certified radiologists interpreted at least 600 images each for the presence or absence of one or more of the following abnormalities: interstitial disease, nodule, and pneumothorax. Results indicated that there exists a large inter- reader variability in the time required to diagnose PA chest images. There is no correlation between a reader's specific median reading time and his/her performance. Time generally increases as the number of abnormalities on a single image increases and for cases with subtle abnormalities. Results also indicated that, in general, the longer the time for interpretation of a specific case (within reader), the further the observer's confidence ratings were from the truth. These findings were found to hold true regardless of the display mode. These results may have implications with regards to the appropriate methodology that should be used for imaging systems evaluations and for measurements of productivity for radiologists.

  11. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    PubMed

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload expressed in time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematic functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally. © 2016 John Wiley & Sons Australia, Ltd.
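
    Conceptually, the tool combines per-activity average times with per-trial activity counts; a toy sketch of that calculation is shown below. The activity names, minutes, and counts are invented for illustration and are not the study's measured values.

    # Hypothetical per-activity average times in minutes (illustrative only)
    AVERAGE_MINUTES = {
        "informed_consent": 45,
        "randomization": 15,
        "study_drug_administration": 30,
        "adverse_event_reporting": 25,
        "data_entry_per_visit": 20,
    }

    def estimated_nursing_time(activity_counts, averages=AVERAGE_MINUTES):
        """Total estimated minutes = sum over activities of count * average time."""
        return sum(averages[name] * n for name, n in activity_counts.items())

    # Example trial: 12 patients, each with consent, randomization, 6 visits, etc.
    trial = {"informed_consent": 12, "randomization": 12,
             "study_drug_administration": 72, "adverse_event_reporting": 8,
             "data_entry_per_visit": 72}
    total = estimated_nursing_time(trial)
    print(f"estimated nursing workload: {total} minutes (~{total/60:.0f} hours)")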

  12. Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.

    2014-12-01

    Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
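
    As a toy illustration only (not the authors' information model or any of its schemas), the snippet below shows the kind of metadata the abstract argues must accompany each observation for it to be discoverable and interpretable: time, location, the observed variable and sampled feature, method, units, and provenance.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Observation:
        value: float
        units: str                      # e.g. "degC"
        variable: str                   # e.g. "soil temperature"
        feature_of_interest: str        # e.g. a soil pit and depth interval
        latitude: float
        longitude: float
        phenomenon_time: datetime       # when the observed condition occurred
        method: str                     # sensor or sample-analysis method
        provenance: dict = field(default_factory=dict)   # processing history, etc.

    obs = Observation(
        value=14.2, units="degC", variable="soil temperature",
        feature_of_interest="soil pit A, 10-20 cm depth interval",
        latitude=41.36, longitude=-72.12,
        phenomenon_time=datetime(2014, 7, 1, 12, 30),
        method="thermistor, 15-minute average",
        provenance={"source": "in-situ sensor", "qc_level": "raw"},
    )
    print(obs.variable, obs.value, obs.units)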

  13. In situ accurate determination of the zero time delay between two independent ultrashort laser pulses by observing the oscillation of an atomic excited wave packet.

    PubMed

    Zhang, Qun; Hepburn, John W

    2008-08-15

    We propose a novel method that uses the oscillation of an atomic excited wave packet observed through a pump-probe technique to accurately determine the zero time delay between a pair of ultrashort laser pulses. This physically based approach provides an easy fix for the intractable problem of synchronizing two different femtosecond laser pulses in a practical experimental environment, especially where an in situ time zero measurement with high accuracy is required.

  14. Earth radiation balance and climate: Why the Moon is the wrong place to observe the Earth

    NASA Astrophysics Data System (ADS)

    Kandel, Robert S.

    1994-06-01

    Increasing 'greenhouse' gases in the Earth's atmosphere will perturb the Earth's radiation balance, forcing climate change over coming decades. Climate sensitivity depends critically on cloud-radiation feedback: its evaluation requires continual observation of changing patterns of Earth radiation balance and cloud cover. The Moon is the wrong place for such observations, with many disadvantages compared to an observation system combining platforms in low polar, intermediate-inclination and geostationary orbits. From the Moon, active observations are infeasible; thermal infrared observations require very large instruments to reach spatial resolutions obtained at much lower cost from geostationary or lower orbits. The Earth's polar zones are never well observed from the Moon; other zones are invisible more than half the time. The monthly illumination cycle leads to further bias in radiation budget determinations. The Earth will be a pretty sight from the Earth-side of the Moon, but serious Earth observations will be made elsewhere.

  15. The Mars Observer differential one-way range demonstration

    NASA Technical Reports Server (NTRS)

    Kroger, P. M.; Border, J. S.; Nandi, S.

    1994-01-01

    Current methods of angular spacecraft positioning using station differenced range data require an additional observation of an extragalactic radio source (quasar) to estimate the timing offset between the reference clocks at the two Deep Space Stations. The quasar observation is also used to reduce the effects of instrumental and media delays on the radio metric observable by forming a difference with the spacecraft observation (delta differential one-way range, delta DOR). An experiment has been completed using data from the Global Positioning System satellites to estimate the station clock offset, eliminating the need for the quasar observation. The requirements for direct measurement of the instrumental delays that must be made in the absence of a quasar observation are assessed. Finally, the results of the 'quasar-free' differential one-way range, or DOR, measurements of the Mars Observer spacecraft are compared with those of simultaneous conventional delta DOR measurements.

  16. As a Matter of Force—Systematic Biases in Idealized Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Grete, Philipp; O’Shea, Brian W.; Beckwith, Kris

    2018-05-01

    Many astrophysical systems encompass very large dynamical ranges in space and time, which are not accessible by direct numerical simulations. Thus, idealized subvolumes are often used to study small-scale effects, including the dynamics of turbulence. These turbulent boxes require artificial driving in order to mimic energy injection from large-scale processes. In this Letter, we show and quantify how the autocorrelation time of the driving and its normalization systematically change the properties of an isothermal compressible magnetohydrodynamic flow in the sub- and supersonic regime and affect astrophysical observations such as Faraday rotation. For example, we find that δ-in-time forcing with a constant energy injection leads to a steeper slope in the kinetic energy spectrum and less-efficient small-scale dynamo action. In general, we show that shorter autocorrelation times require more power in the acceleration field, which results in more power in compressive modes that weaken the anticorrelation between density and magnetic field strength. Thus, derived observables, such as the line-of-sight (LOS) magnetic field from rotation measures, are systematically biased by the driving mechanism. We argue that δ-in-time forcing is unrealistic and numerically unresolved, and conclude that special care needs to be taken in interpreting observational results based on the use of idealized simulations.
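    For orientation, a common way to give the driving a finite autocorrelation time τ (a plausible counterpart to the δ-in-time case discussed above; the Letter's exact prescription is not reproduced in this abstract) is an Ornstein-Uhlenbeck process for each driven mode,

      d\hat{a}(\mathbf{k},t) = -\frac{\hat{a}(\mathbf{k},t)}{\tau}\,dt + F_0(\mathbf{k})\,dW_t,
      \qquad \langle \hat{a}(t)\,\hat{a}(t+\Delta t)\rangle \propto e^{-|\Delta t|/\tau},

    so that τ → 0 recovers δ-correlated forcing while the normalization F_0 controls the energy injection rate.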

  17. Reliability of Two Smartphone Applications for Radiographic Measurements of Hallux Valgus Angles.

    PubMed

    Mattos E Dinato, Mauro Cesar; Freitas, Marcio de Faria; Milano, Cristiano; Valloto, Elcio; Ninomiya, André Felipe; Pagnano, Rodrigo Gonçalves

    The objective of the present study was to assess the reliability of 2 smartphone applications compared with the traditional goniometer technique for measurement of radiographic angles in hallux valgus and the time required for analysis with the different methods. The radiographs of 31 patients (52 feet) with a diagnosis of hallux valgus were analyzed. Four observers, 2 with >10 years' experience in foot and ankle surgery and 2 in-training surgeons, measured the hallux valgus angle and intermetatarsal angle using a manual goniometer technique and 2 smartphone applications (Hallux Angles and iPinPoint). The interobserver and intermethod reliability were estimated using intraclass correlation coefficients (ICCs), and the time required for measurement of the angles among the 3 methods was compared using the Friedman test. A very good or good interobserver reliability was found among the 4 observers measuring the hallux valgus angle and intermetatarsal angle using the goniometer (ICC 0.913 and 0.821, respectively) and iPinPoint (ICC 0.866 and 0.638, respectively). Using the Hallux Angles application, a very good interobserver reliability was found for measurements of the hallux valgus angle (ICC 0.962) and intermetatarsal angle (ICC 0.935) only among the more experienced observers. The time required for measurement was significantly shorter with both smartphone applications than with the goniometer method. One smartphone application (iPinPoint) was reliable for measurements of the hallux valgus angles by either experienced or nonexperienced observers. The use of these tools might save time in the evaluation of radiographic angles in hallux valgus. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
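    The comparison of measurement times lends itself to a small worked example. The snippet below runs a Friedman test over three repeated-measures timing conditions, mirroring the analysis named above; the timing values are fabricated for demonstration and are not the study's data.

      from scipy.stats import friedmanchisquare

      # Hypothetical seconds per radiograph for the same five raters/films.
      goniometer        = [62, 58, 71, 65, 60]
      hallux_angles_app = [40, 37, 45, 42, 39]
      ipinpoint_app     = [35, 33, 41, 38, 34]

      stat, p_value = friedmanchisquare(goniometer, hallux_angles_app, ipinpoint_app)
      print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")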

  18. Measuring and managing radiologist workload: a method for quantifying radiologist activities and calculating the full-time equivalents required to operate a service.

    PubMed

    MacDonald, Sharyn L S; Cowan, Ian A; Floyd, Richard A; Graham, Rob

    2013-10-01

    Accurate and transparent measurement and monitoring of radiologist workload is highly desirable for management of daily workflow in a radiology department, and for informing decisions on department staffing needs. It offers the potential for benchmarking between departments and assessing future national workforce and training requirements. We describe a technique for quantifying, with minimum subjectivity, all the work carried out by radiologists in a tertiary department. Six broad categories of clinical activities contributing to radiologist workload were identified: reporting, procedures, trainee supervision, clinical conferences and teaching, informal case discussions, and administration related to referral forms. Time required for reporting was measured using data from the radiology information system. Other activities were measured by observation and timing by observers, and based on these results and extensive consultation, the time requirements and frequency of each activity were agreed on. An activity list was created to record this information and to calculate the total clinical hours required to meet the demand for radiologist services. Diagnostic reporting accounted for approximately 35% of radiologist clinical time; procedures, 23%; trainee supervision, 15%; conferences and tutorials, 14%; informal case discussions, 10%; and referral-related administration, 3%. The derived data have been proven reliable for workload planning over the past 3 years. A transparent and robust method of measuring radiologists' workload has been developed, with subjective assessments kept to a minimum. The technique has value for daily workload and longer term planning. It could be adapted for widespread use. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
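    As a minimal illustration with made-up figures (not the study's data), the activity-list idea reduces to multiplying a time per event by its weekly frequency, summing to clinical hours, and converting to full-time equivalents under an assumed number of clinical hours per FTE.

      # Illustrative figures only; the activity names, times, frequencies and the
      # assumed clinical hours per FTE are placeholders, not the study's values.
      ACTIVITIES = [
          # (name, minutes per event, events per week)
          ("reporting",            6.0, 1400),
          ("procedures",          45.0,   60),
          ("trainee_supervision", 30.0,   60),
          ("conferences",         60.0,   25),
          ("case_discussions",    10.0,  120),
          ("referral_admin",       2.0,  200),
      ]

      CLINICAL_HOURS_PER_FTE_WEEK = 32.0   # assumed; varies by department

      total_hours = sum(minutes * freq for _, minutes, freq in ACTIVITIES) / 60.0
      print(f"Clinical hours per week: {total_hours:.0f}")
      print(f"Radiologist FTE required: {total_hours / CLINICAL_HOURS_PER_FTE_WEEK:.1f}")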

  19. Exploring the time course of face matching: temporal constraints impair unfamiliar face identification under temporally unconstrained viewing.

    PubMed

    Ozbek, Müge; Bindemann, Markus

    2011-10-01

    The identification of unfamiliar faces has been studied extensively with matching tasks, in which observers decide if pairs of photographs depict the same person (identity matches) or different people (mismatches). In experimental studies in this field, performance is usually self-paced under the assumption that this will encourage best-possible accuracy. Here, we examined the temporal characteristics of this task by limiting display times and tracking observers' eye movements. Observers were required to make match/mismatch decisions to pairs of faces shown for 200, 500, 1000, or 2000 ms, or for an unlimited duration. Peak accuracy was reached within 2000 ms and two fixations to each face. However, intermixing exposure conditions produced a context effect that generally reduced accuracy on identity mismatch trials, even when unlimited viewing of faces was possible. These findings indicate that less than 2 s is required for face matching when exposure times are variable, but temporal constraints should be avoided altogether if accuracy is truly paramount. The implications of these findings are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. 50 CFR 218.105 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...., FFG, DDG, or CG); (G) Length of time observers maintained visual contact with marine mammal(s); (H...., participating in exercise; (H) Wave height in feet (high, low and average during exercise); and (I) Narrative... observers maintained visual contact with marine mammal; (G) Wave height; (H) Visibility; (I) Whether...

  1. 50 CFR 218.105 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...., FFG, DDG, or CG); (G) Length of time observers maintained visual contact with marine mammal(s); (H...., participating in exercise; (H) Wave height in feet (high, low and average during exercise); and (I) Narrative... observers maintained visual contact with marine mammal; (G) Wave height; (H) Visibility; (I) Whether...

  2. Metrology and ionospheric observation standards

    NASA Astrophysics Data System (ADS)

    Panshin, Evgeniy; Minligareev, Vladimir; Pronin, Anton

    The accuracy and validity of ionospheric observations are pressing concerns. The WMO, URSI and national metrology and standardisation services set out requirements for, and descriptions of, ionospheric observation instruments. Research on the metrology and standardisation of these observations has moved to a new level in the Russian Federation. The Fedorov Institute of Applied Geophysics (IAG) is in charge of ionospheric observation in the Russian Federation, and the National Technical Committee TC-101, responsible for standardisation in this field, was set up on the basis of IAG. TC-101 could serve as the platform for initiating a core international committee within the ISO framework. A new type of ionosonde, "Parus-A", has been engineered to meet the national requirements. "Parus-A" calibration and testing were conducted by a National Metrology Institute (NMI), the D.I. Mendeleyev Institute for Metrology (VNIIM), which signed the CIPM MRA in 1991. VNIIM is a leading NMI in the field of space weather (including ionospheric observations); its founder was the celebrated chemist and metrologist Dmitriy I. Mendeleyev. Tests and calibration were carried out for the first time in the 50-year history of ionosonde operation in Russia. The following metrological characteristics were tested: measurement range of radio-frequency time delay, 0.5-10 ms; time measurement inaccuracy of the radio-frequency pulse, ±12 μs; frequency range of the radio pulse, 1-20 MHz; measurement inaccuracy of the radio pulse carrier frequency, ±5 kHz. For example, the sounding pulse simulator built into the ionosonde was used to test the measurement range of the radio-frequency time delay. A number of standards at different levels have been developed, including "Ionospheric observation guidance" and "The Earth ionosphere. Terms and definitions".

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bishop, James K.B.

    Prediction of the substantial biologically mediated carbon flows in a rapidly changing and acidifying ocean requires model simulations informed by observations of key carbon cycle processes on the appropriate space and time scales. From 2000 to 2004, the National Oceanographic Partnership Program (NOPP) supported the development of the first low-cost fully-autonomous ocean profiling Carbon Explorers that demonstrated that year-round real-time observations of particulate organic carbon (POC) concentration and sedimentation could be achieved in the world's ocean. NOPP also initiated the development of a sensor for particulate inorganic carbon (PIC) suitable for operational deployment across all oceanographic platforms. As a result, PIC profile characterization that once required shipboard sample collection and shipboard or shore based laboratory analysis, is now possible to full ocean depth in real time using a 0.2 W sensor operating at 24 Hz. NOPP developments further spawned US DOE support to develop the Carbon Flux Explorer, a free-vehicle capable of following hourly variations of particulate inorganic and organic carbon sedimentation from near surface to kilometer depths for seasons to years and capable of relaying contemporaneous observations via satellite. We have demonstrated the feasibility of real time - low cost carbon observations which are of fundamental value to carbon prediction and when further developed, will lead to a fully enhanced global carbon observatory capable of real time assessment of the ocean carbon sink, a needed constraint for assessment of carbon management policies on a global scale.

  4. Observing Strategies for the Detection of Jupiter Analogs

    NASA Astrophysics Data System (ADS)

    Wittenmyer, Robert A.; Tinney, C. G.; Horner, J.; Butler, R. P.; Jones, H. R. A.; O'Toole, S. J.; Bailey, J.; Carter, B. D.; Salter, G. S.; Wright, D.

    2013-04-01

    To understand the frequency, and thus the formation and evolution, of planetary systems like our own solar system, it is critical to detect Jupiter-like planets in Jupiter-like orbits. For long-term radial-velocity monitoring, it is useful to estimate the observational effort required to reliably detect such objects, particularly in light of severe competition for limited telescope time. We perform detailed simulations of observational campaigns, maximizing the realism of the sampling of a set of simulated observations. We then compute the detection limits for each campaign to quantify the effect of increasing the number of observational epochs and varying their time coverage. We show that once there is sufficient time baseline to detect a given orbital period, it becomes less effective to add further time coverage—rather, the detectability of a planet scales roughly as the square root of the number of observations, independently of the number of orbital cycles included in the data string. We also show that no noise floor is reached, with a continuing improvement in detectability at the maximum number of observations N = 500 tested here.
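    The square-root scaling reported above can be summarized compactly. The expression below is a paraphrase of that statement for orientation, with σ an assumed per-epoch velocity uncertainty and K_min the smallest reliably detectable radial-velocity semi-amplitude; it is not a formula quoted from the paper:

      K_{\min}(N) \;\propto\; \frac{\sigma}{\sqrt{N}} \qquad \text{(once the time baseline exceeds the orbital period)}.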

  5. [Registration of observational studies: it is time to comply with the Declaration of Helsinki requirement].

    PubMed

    Dal-Ré, Rafael; Delgado, Miguel; Bolumar, Francisco

    2015-01-01

    Publication bias is a serious deficiency in the current system of disseminating the results of human research studies. Clinical investigators know that, from an ethical standpoint, they should prospectively register clinical trials in a public registry before starting them. In addition, it is believed that this approach will help to reduce publication bias. However, most studies conducted in humans are observational rather than experimental. It is estimated that less than 2% out of 2 million concluded or ongoing observational studies have been registered. The 2013 revision of the Declaration of Helsinki requires registration of any type of research study involving humans or identifiable samples or data. It is proposed that funding agencies, such as the Fondo de Investigaciones Sanitarias, as well as private companies, require preregistration of observational studies before providing funding. It is also proposed that Research Ethics Committees which, following Spanish regulation, have been using the Declaration as the framework for assessing the ethics of clinical trials with medicines since 1990, should follow the same provisions for the assessment of health-related observational studies: therefore, they should require prospective registration of studies before granting their final approval. This would allow observational study investigators to be educated in complying with an ethical requirement recently introduced in the most important ethical code for research involving humans. Copyright © 2014 SESPAS. Published by Elsevier Espana. All rights reserved.

  6. Sports and Recreational Injuries in Relation to Lost Duty Time Among Deployed U.S. Marine Corps Personnel

    DTIC Science & Technology

    2011-06-07

    Lost Duty Time 6 standardized residuals of cells were examined to determine which cells had observed counts sizably different from expected counts...exploratory analysis) was used as the criterion to indicate that a cell had more (positive residual) or less (negative residual) observed events than...and supplies. These activities require lower body strength, stamina, and core strength that would be impaired by injuries to the lower extremities

  7. Just-in-Time Training for High-Risk Low-Volume Therapies: An Approach to Ensure Patient Safety.

    PubMed

    Helman, Stephanie; Lisanti, Amy Jo; Adams, Ann; Field, Cynthia; Davis, Katherine Finn

    2016-01-01

    High-risk low-volume therapies are those therapies that are practiced infrequently and yet carry an increased risk to patients because of their complexity. Staff nurses are required to competently manage these therapies to treat patients' unique needs and optimize outcomes; however, maintaining competence is challenging. This article describes implementation of Just-in-Time Training, which requires validation of minimum competency of bedside nurses managing high-risk low-volume therapies through direct observation of a return-demonstration competency checklist.

  8. Cross-correlation least-squares reverse time migration in the pseudo-time domain

    NASA Astrophysics Data System (ADS)

    Li, Qingyang; Huang, Jianping; Li, Zhenchun

    2017-08-01

    The least-squares reverse time migration (LSRTM) method with higher image resolution and amplitude is becoming increasingly popular. However, the LSRTM is not widely used in field land data processing because of its sensitivity to the initial migration velocity model, large computational cost and mismatch of amplitudes between the synthetic and observed data. To overcome the shortcomings of the conventional LSRTM, we propose a cross-correlation least-squares reverse time migration algorithm in pseudo-time domain (PTCLSRTM). Our algorithm not only reduces the depth/velocity ambiguities, but also reduces the effect of velocity error on the imaging results. It relieves the accuracy requirements on the migration velocity model of least-squares migration (LSM). The pseudo-time domain algorithm eliminates the irregular wavelength sampling in the vertical direction, thus it can reduce the vertical grid points and memory requirements used during computation, which makes our method more computationally efficient than the standard implementation. Besides, for field data applications, matching the recorded amplitudes is a very difficult task because of the viscoelastic nature of the Earth and inaccuracies in the estimation of the source wavelet. To relax the requirement for strong amplitude matching of LSM, we extend the normalized cross-correlation objective function to the pseudo-time domain. Our method is only sensitive to the similarity between the predicted and the observed data. Numerical tests on synthetic and land field data confirm the effectiveness of our method and its adaptability for complex models.
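    For orientation, the zero-lag normalized cross-correlation objective commonly used in this class of methods can be written as below; the paper's exact pseudo-time-domain formulation may differ in detail:

      J = -\sum_{s,r} \frac{\langle d_{\mathrm{syn}}(s,r,\cdot),\, d_{\mathrm{obs}}(s,r,\cdot)\rangle}{\lVert d_{\mathrm{syn}}(s,r,\cdot)\rVert\,\lVert d_{\mathrm{obs}}(s,r,\cdot)\rVert},

    where the sum runs over sources s and receivers r. Because each trace is normalized by its own energy, the misfit is sensitive only to waveform similarity, not to absolute amplitude, which is what relaxes the amplitude-matching requirement mentioned above.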

  9. Recent Timing Results for PSR B1259 - 63

    NASA Astrophysics Data System (ADS)

    Wex, N.; Johnston, S.

    The binary pulsar PSR B1259 - 63 is in a highly eccentric 3.4 yr orbit around the Be star SS 2883. Timing observations of this pulsar, made over a 7 yr period using the Parkes 64 m radio-telescope, cover two periastron passages, in 1990 August and 1994 January. The timing observations of PSR B1259 - 63 clearly show evidence for timing noise which is dominated by a cubic term. Unfortunately, the large-amplitude timing noise and data over only two complete orbits make it difficult to produce a unique timing solution for this pulsar. However, if the long-term behavior of the timing noise is completely modeled by a cubic term, both ω̇ and ẋ terms are required in the timing model, which could be the result of a precessing orbit caused by the quadrupole moment of the tilted companion star. In this paper we summarise the timing observations for the PSR B1259 - 63 system; full details are given in Wex et al. (1997).

  10. M ≥ 7.0 earthquake recurrence on the San Andreas fault from a stress renewal model

    USGS Publications Warehouse

    Parsons, Thomas E.

    2006-01-01

     Forecasting M ≥ 7.0 San Andreas fault earthquakes requires an assessment of their expected frequency. I used a three-dimensional finite element model of California to calculate volumetric static stress drops from scenario M ≥ 7.0 earthquakes on three San Andreas fault sections. The ratio of stress drop to tectonic stressing rate derived from geodetic displacements yielded recovery times at points throughout the model volume. Under a renewal model, stress recovery times on ruptured fault planes can be a proxy for earthquake recurrence. I show curves of magnitude versus stress recovery time for three San Andreas fault sections. When stress recovery times were converted to expected M ≥ 7.0 earthquake frequencies, they fit Gutenberg-Richter relationships well matched to observed regional rates of M ≤ 6.0 earthquakes. Thus a stress-balanced model permits large earthquake Gutenberg-Richter behavior on an individual fault segment, though it does not require it. Modeled slip magnitudes and their expected frequencies were consistent with those observed at the Wrightwood paleoseismic site if strict time predictability does not apply to the San Andreas fault.
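    Stated as a formula (a restatement of the renewal idea above, not an equation copied from the paper), the recovery time at a point on the ruptured fault is approximately

      T_r \approx \frac{\Delta\sigma}{\dot{\sigma}_{\mathrm{tect}}},

    the volumetric static stress drop of the scenario earthquake divided by the tectonic stressing rate derived from geodetic displacements, so larger stress drops or slower loading imply longer expected recurrence intervals.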

  11. Image Stability Requirements For a Geostationary Imaging Fourier Transform Spectrometer (GIFTS)

    NASA Technical Reports Server (NTRS)

    Bingham, G. E.; Cantwell, G.; Robinson, R. C.; Revercomb, H. E.; Smith, W. L.

    2001-01-01

    A Geostationary Imaging Fourier Transform Spectrometer (GIFTS) has been selected for the NASA New Millennium Program (NMP) Earth Observing-3 (EO-3) mission. Our paper will discuss one of the key GIFTS measurement requirements, Field of View (FOV) stability, and its impact on required system performance. The GIFTS NMP mission is designed to demonstrate new and emerging sensor and data processing technologies with the goal of making revolutionary improvements in meteorological observational capability and forecasting accuracy. The GIFTS payload is a versatile imaging FTS with programmable spectral resolution and spatial scene selection that allows radiometric accuracy and atmospheric sounding precision to be traded in near real time for area coverage. The GIFTS sensor combines high sensitivity with a massively parallel spatial data collection scheme to allow high spatial resolution measurement of the Earth's atmosphere and rapid broad area coverage. An objective of the GIFTS mission is to demonstrate the advantages of high spatial resolution (4 km ground sample distance - gsd) on temperature and water vapor retrieval by allowing sampling in broken cloud regions. This small gsd, combined with the relatively long scan time required (approximately 10 s) to collect high resolution spectra from geostationary (GEO) orbit, may require extremely good pointing control. This paper discusses the analysis of this requirement.

  12. Detection of individual atoms in helium buffer gas and observation of their real-time motion

    NASA Technical Reports Server (NTRS)

    Pan, C. L.; Prodan, J. V.; Fairbank, W. M., Jr.; She, C. Y.

    1980-01-01

    Single atoms are detected and their motion measured for the first time to our knowledge by the fluorescence photon-burst method in the presence of large quantities of buffer gas. A single-clipped digital correlator records the photon burst in real time and displays the atom's transit time across the laser beam. A comparison is made of the special requirements for single-atom detection in vacuum and in a buffer gas. Finally, the probability distribution of the bursts from many atoms is measured. It further proves that the bursts observed on resonance are due to single atoms and not simply to noise fluctuations.

  13. Quantum information erasure inside black holes

    DOE PAGES

    Lowe, David A.; Thorlacius, Larus

    2015-12-15

    An effective field theory for infalling observers in the vicinity of a quasi-static black hole is given in terms of a freely falling lattice discretization. The lattice model successfully reproduces the thermal spectrum of outgoing Hawking radiation, as was shown by Corley and Jacobson, but can also be used to model observations made by a typical low-energy observer who enters the black hole in free fall at a prescribed time. The explicit short distance cutoff ensures that, from the viewpoint of the infalling observer, any quantum information that entered the black hole more than a scrambling time earlier has been erased by the black hole singularity. Furthermore, this property, combined with the requirement that outside observers need at least of order the scrambling time to extract quantum information from the black hole, ensures that a typical infalling observer does not encounter drama upon crossing the black hole horizon in a theory where black hole information is preserved for asymptotic observers.

  14. Time frequency requirements for radio interferometric earth physics

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.; Fliegel, H. F.

    1973-01-01

    Two systems of VLBI (Very Long Baseline Interferometry) are now applicable to earth physics: an intercontinental baseline system using antennas of the NASA Deep Space Network, now observing at one-month intervals to determine UTI for spacecraft navigation; and a shorter baseline system called ARIES (Astronomical Radio Interferometric Earth Surveying), to be used to measure crustal movement in California for earthquake hazards estimation. On the basis of experience with the existing DSN system, a careful study has been made to estimate the time and frequency requirements of both the improved intercontinental system and of ARIES. Requirements for the two systems are compared and contrasted.

  15. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing volume of data available as a result of the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
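    The OAT method names are not given in this abstract, so the snippet below is only a hedged pandas sketch of the kind of pre-processing step the tool performs: loading a local CSV of sensor observations and aggregating it to a regular daily series before use as calibration observations. The file name and column names are hypothetical.

      # Hypothetical file/column names; illustrates CSV time-series pre-processing,
      # not the actual OAT API.
      import pandas as pd

      series = (pd.read_csv("well_head_levels.csv", parse_dates=["timestamp"])
                  .set_index("timestamp")["level_m"])

      daily = series.resample("D").mean()      # regularize to a daily time step
      daily = daily.interpolate(limit=3)       # fill short gaps only
      daily.to_csv("well_head_levels_daily.csv")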

  16. Climate science and famine early warning

    USGS Publications Warehouse

    Verdin, James P.; Funk, Chris; Senay, Gabriel B.; Choularton, R.

    2005-01-01

    Food security assessment in sub-Saharan Africa requires simultaneous consideration of multiple socio-economic and environmental variables. Early identification of populations at risk enables timely and appropriate action. Since large and widely dispersed populations depend on rainfed agriculture and pastoralism, climate monitoring and forecasting are important inputs to food security analysis. Satellite rainfall estimates (RFE) fill in gaps in station observations, and serve as input to drought index maps and crop water balance models. Gridded rainfall time-series give historical context, and provide a basis for quantitative interpretation of seasonal precipitation forecasts. RFE are also used to characterize flood hazards, in both simple indices and stream flow models. In the future, many African countries are likely to see negative impacts on subsistence agriculture due to the effects of global warming. Increased climate variability is forecast, with more frequent extreme events. Ethiopia requires special attention. Already facing a food security emergency, troubling persistent dryness has been observed in some areas, associated with a positive trend in Indian Ocean sea surface temperatures. Increased African capacity for rainfall observation, forecasting, data management and modelling applications is urgently needed. Managing climate change and increased climate variability require these fundamental technical capacities if creative coping strategies are to be devised.

  17. Observational Definition of Future AGN Echo-Mapping Experiments

    NASA Technical Reports Server (NTRS)

    Collier, Stefan; Peterson, Bradley M.; Horne, Keith

    2001-01-01

    We describe numerical simulations we have begun in order to determine the observational requirements for future echo-mapping experiments. We focus on two particular problems: (1) determination of the structure and kinematics of the broad-line region through emission-line reverberation mapping, and (2) detection of interband continuum lags that may be used as a probe of the continuum source, presumably a temperature-stratified accretion disk. Our preliminary results suggest the broad-line region can be reverberation-mapped to good precision with spectra of signal-to-noise ratio per pixel S/N ≈ 30, time resolution Δt ≈ 0.1 day, and duration of about 60 days (which is a factor of three larger than the longest time scale in the input models); data that meet these requirements do not yet exist. We also find that interband continuum lags of ≳ 0.5 days can be detected at ≳ 95% confidence with at least daily observations for about 6 weeks, or rather more easily and definitively with shorter programs undertaken with satellite-based observatories. The results of these simulations show that significant steps forward in multiwavelength monitoring will almost certainly require dedicated facilities.

  18. Climate science and famine early warning.

    PubMed

    Verdin, James; Funk, Chris; Senay, Gabriel; Choularton, Richard

    2005-11-29

    Food security assessment in sub-Saharan Africa requires simultaneous consideration of multiple socio-economic and environmental variables. Early identification of populations at risk enables timely and appropriate action. Since large and widely dispersed populations depend on rainfed agriculture and pastoralism, climate monitoring and forecasting are important inputs to food security analysis. Satellite rainfall estimates (RFE) fill in gaps in station observations, and serve as input to drought index maps and crop water balance models. Gridded rainfall time-series give historical context, and provide a basis for quantitative interpretation of seasonal precipitation forecasts. RFE are also used to characterize flood hazards, in both simple indices and stream flow models. In the future, many African countries are likely to see negative impacts on subsistence agriculture due to the effects of global warming. Increased climate variability is forecast, with more frequent extreme events. Ethiopia requires special attention. Already facing a food security emergency, troubling persistent dryness has been observed in some areas, associated with a positive trend in Indian Ocean sea surface temperatures. Increased African capacity for rainfall observation, forecasting, data management and modelling applications is urgently needed. Managing climate change and increased climate variability require these fundamental technical capacities if creative coping strategies are to be devised.

  19. Climate science and famine early warning

    PubMed Central

    Verdin, James; Funk, Chris; Senay, Gabriel; Choularton, Richard

    2005-01-01

    Food security assessment in sub-Saharan Africa requires simultaneous consideration of multiple socio-economic and environmental variables. Early identification of populations at risk enables timely and appropriate action. Since large and widely dispersed populations depend on rainfed agriculture and pastoralism, climate monitoring and forecasting are important inputs to food security analysis. Satellite rainfall estimates (RFE) fill in gaps in station observations, and serve as input to drought index maps and crop water balance models. Gridded rainfall time-series give historical context, and provide a basis for quantitative interpretation of seasonal precipitation forecasts. RFE are also used to characterize flood hazards, in both simple indices and stream flow models. In the future, many African countries are likely to see negative impacts on subsistence agriculture due to the effects of global warming. Increased climate variability is forecast, with more frequent extreme events. Ethiopia requires special attention. Already facing a food security emergency, troubling persistent dryness has been observed in some areas, associated with a positive trend in Indian Ocean sea surface temperatures. Increased African capacity for rainfall observation, forecasting, data management and modelling applications is urgently needed. Managing climate change and increased climate variability require these fundamental technical capacities if creative coping strategies are to be devised. PMID:16433101

  20. Methodology for the passive control of orbital inclination and mean local time to meet sun-synchronous orbit requirements

    NASA Technical Reports Server (NTRS)

    Folta, David; Kraft, Lauri

    1992-01-01

    The mean local time (MLT) of equatorial crossing of a sun-synchronous Earth-observing spacecraft orbit drifts with inclination; therefore, in order to maintain the MLT, the inclination must be controlled. Inclination may be maintained actively by costly out-of-plane maneuvers or passively by using the perturbing forces due to the sun and moon. This paper examines the passive control approach using the Earth Observing System (EOS) as a basis for the discussion. Applications to Landsat and National Oceanic and Atmospheric Administration (NOAA) spacecraft are presented for comparison. This technique is especially beneficial to spacecraft lacking propulsion systems. The results indicate that passive inclination control appears to be the preferable maintenance method when spacecraft weight restrictions, operational considerations, and scientific requirements apply.
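    For context, the inclination-MLT coupling arises because a sun-synchronous orbit must keep its J2-driven nodal precession equal to the Sun's apparent mean motion; the standard relation (not taken from the paper) is

      \dot{\Omega} = -\frac{3}{2}\, J_2 \left(\frac{R_E}{a(1-e^2)}\right)^{2} n \cos i \;\approx\; 0.9856^{\circ}\ \mathrm{day}^{-1},

    so any drift in the inclination i changes the nodal rate and lets the equator-crossing MLT wander unless lunisolar perturbations are arranged, as in the passive approach above, to counteract it.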

  1. Discrete Film Cooling in a Rocket with Curved Walls

    DTIC Science & Technology

    2009-12-01

    insight to be gained by observing the process of effusion cooling in its most basic elements. In rocket applications, the first desired condition is...ηspan. Convergence was determined by doubling the number of cells, mostly in the region near the hole, until less than a 1% change was observed in the...method was required to determine the absolute start time for the transient process. To find the time error, start again with (TS − Ti)/(Taw − Ti) = 1 − exp

  2. Planning and Scheduling for Fleets of Earth Observing Satellites

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Jonsson, Ari; Morris, Robert; Smith, David E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    We address the problem of scheduling observations for a collection of earth observing satellites. This scheduling task is a difficult optimization problem, potentially involving many satellites, hundreds of requests, constraints on when and how to service each request, and resources such as instruments, recording devices, transmitters, and ground stations. High-fidelity models are required to ensure the validity of schedules; at the same time, the size and complexity of the problem makes it unlikely that systematic optimization search methods will be able to solve them in a reasonable time. This paper presents a constraint-based approach to solving the Earth Observing Satellites (EOS) scheduling problem, and proposes a stochastic heuristic search method for solving it.
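    To convey the flavor of the stochastic heuristic search mentioned above (the paper's constraint-based model is far richer, covering instruments, recorders, transmitters and ground stations), here is a toy Python sketch that repeatedly builds greedy, priority-biased schedules of non-overlapping requests and keeps the best one; the request data are hypothetical.

      import random

      # (request_id, priority, start, end) in arbitrary time units; made-up data.
      requests = [("r1", 5, 0, 3), ("r2", 8, 2, 5), ("r3", 3, 4, 6), ("r4", 7, 5, 9)]

      def schedule_once(reqs):
          """Greedily accept non-overlapping requests, visiting them in a
          priority-biased random order."""
          order = sorted(reqs, key=lambda r: r[1] * random.random(), reverse=True)
          accepted, busy = [], []
          for rid, prio, start, end in order:
              if all(end <= s or start >= e for s, e in busy):
                  accepted.append(rid)
                  busy.append((start, end))
          return accepted

      # Random-restart loop: keep the schedule that services the most requests.
      best = max((schedule_once(requests) for _ in range(100)), key=len)
      print(best)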

  3. Observers' measurements in premetric electrodynamics: Time and radar length

    NASA Astrophysics Data System (ADS)

    Gürlebeck, Norman; Pfeifer, Christian

    2018-04-01

    The description of an observer's measurement in general relativity and the standard model of particle physics is closely related to the spacetime metric. In order to understand and interpret measurements, which test the metric structure of the spacetime, like the classical Michelson-Morley, Ives-Stilwell, Kennedy-Thorndike experiments or frequency comparison experiments in general, it is necessary to describe them in theories, which go beyond the Lorentzian metric structure. However, this requires a description of an observer's measurement without relying on a metric. We provide such a description of an observer's measurement of the fundamental quantities time and length derived from a premetric perturbation of Maxwell's electrodynamics and a discussion on how these measurements influence classical relativistic observables like time dilation and length contraction. Most importantly, we find that the modification of electrodynamics influences the measurements at two instances: the propagation of light is altered as well as the observer's proper time normalization. When interpreting the results of a specific experiment, both effects cannot be disentangled, in general, and have to be taken into account.

  4. Solar EUV irradiance for space weather applications

    NASA Astrophysics Data System (ADS)

    Viereck, R. A.

    2015-12-01

    Solar EUV irradiance is an important driver of space weather models. Large changes in EUV and x-ray irradiances create large variability in the ionosphere and thermosphere. Proxies such as the F10.7 cm radio flux, have provided reasonable estimates of the EUV flux but as the space weather models become more accurate and the demands of the customers become more stringent, proxies are no longer adequate. Furthermore, proxies are often provided only on a daily basis and shorter time scales are becoming important. Also, there is a growing need for multi-day forecasts of solar EUV irradiance to drive space weather forecast models. In this presentation we will describe the needs and requirements for solar EUV irradiance information from the space weather modeler's perspective. We will then translate these requirements into solar observational requirements such as spectral resolution and irradiance accuracy. We will also describe the activities at NOAA to provide long-term solar EUV irradiance observations and derived products that are needed for real-time space weather modeling.

  5. Deficits in Coordinative Bimanual Timing Precision in Children With Specific Language Impairment

    PubMed Central

    Goffman, Lisa; Zelaznik, Howard N.

    2017-01-01

    Purpose Our objective was to delineate components of motor performance in specific language impairment (SLI); specifically, whether deficits in timing precision in one effector (unimanual tapping) and in two effectors (bimanual clapping) are observed in young children with SLI. Method Twenty-seven 4- to 5-year-old children with SLI and 21 age-matched peers with typical language development participated. All children engaged in a unimanual tapping and a bimanual clapping timing task. Standard measures of language and motor performance were also obtained. Results No group differences in timing variability were observed in the unimanual tapping task. However, compared with typically developing peers, children with SLI were more variable in their timing precision in the bimanual clapping task. Nine of the children with SLI performed greater than 1 SD below the mean on a standardized motor assessment. The children with low motor performance showed the same profile as observed across all children with SLI, with unaffected unimanual and impaired bimanual timing precision. Conclusions Although unimanual timing is unaffected, children with SLI show a deficit in timing that requires bimanual coordination. We propose that the timing deficits observed in children with SLI are associated with the increased demands inherent in bimanual performance. PMID:28174821

  6. Dexmedetomidine and propofol sedation requirements in an autistic rat model.

    PubMed

    Elmorsy, Soha A; Soliman, Ghada F; Rashed, Laila A; Elgendy, Hamed

    2018-05-30

    Autism is a challenging neurodevelopmental disorder. Previous clinical observations suggest altered sedation requirements for autistic children. Our study aimed to test this observation experimentally with an animal model and to explore its possible mechanisms. Eight adult pregnant female Sprague Dawley rats were randomly divided into two groups. Four were injected with intraperitoneal sodium valproate on gestational day 12 and four were injected with normal saline. On post-natal day 28, the newborn male rats were subjected to an open field test to confirm autistic features. Each rat was injected intraperitoneally with a single dose of propofol (50 mg/kg) or dexmedetomidine (0.2 mg/kg). Times to Loss of Righting Reflex (LORR) and to Return of Righting Reflex (RORR) were recorded. On the next day, all rats were re-sedated and their EEGs were recorded. The rats were sacrificed and hippocampal GABAA and glutamate NMDA receptor gene expression were assessed. Autistic rats showed a significantly longer time to LORR and a shorter time to RORR compared to controls (median time to LORR: 12.0 versus 5.0 for dexmedetomidine and 22.0 versus 8.0 for propofol; p < 0.05). EEG showed a low-frequency, high-amplitude wave pattern two minutes after LORR in control rats. Autistic rats showed a high-frequency, low-amplitude awake pattern. Hippocampal GABAA receptor gene expression was significantly lower in autistic rats and NMDA gene expression was greater. This study in rats supports the clinical observations of increased anesthetic sedative requirements in autistic children and proposes a mechanism for it.

  7. The Main Pillar: Assessment of Space Weather Observational Asset Performance Supporting Nowcasting, Forecasting and Research to Operations

    NASA Technical Reports Server (NTRS)

    Posner, Arik; Hesse, Michael; SaintCyr, Chris

    2014-01-01

    Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations.

  8. 50 CFR 218.115 - Requirements for monitoring and reporting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Narrative description of sensors and platforms utilized for marine mammal detection and timeline... sensor; (vi) Length of time observers maintained visual contact with marine mammal; (vii) Wave height...

  9. Bearings Only Air-to-Air Ranging

    DTIC Science & Technology

    1988-07-25

    directly in front of the observer when first detected, more time will be needed for a good estimate. A sound strategy then is for the observer, having...altitude angle to provide an estimate of the z component. Moving targets commonly require some 60 seconds for good estimates of target location and...fixed target case, where a good strategy for the observer can be determined a priori, highly effective maneuvers for the observer in the case of a moving

  10. Estimating Morning Change in Land Surface Temperature from MODIS Day/Night Observations: Applications for Surface Energy Balance Modeling.

    PubMed

    Hain, Christopher R; Anderson, Martha C

    2017-10-16

    Observations of land surface temperature (LST) are crucial for the monitoring of surface energy fluxes from satellite. Methods that require high temporal resolution LST observations (e.g., from geostationary orbit) can be difficult to apply globally because several geostationary sensors are required to attain near-global coverage (60°N to 60°S). While these LST observations are available from polar-orbiting sensors, providing global coverage at higher spatial resolutions, the temporal sampling (twice daily observations) can pose significant limitations. For example, the Atmosphere Land Exchange Inverse (ALEXI) surface energy balance model, used for monitoring evapotranspiration and drought, requires an observation of the morning change in LST - a quantity not directly observable from polar-orbiting sensors. Therefore, we have developed and evaluated a data-mining approach to estimate the mid-morning rise in LST from a single sensor providing two LST observations per day, the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Aqua platform. In general, the data-mining approach produced estimates with low relative error (5 to 10%) and statistically significant correlations when compared against geostationary observations. This approach will facilitate global, near real-time applications of ALEXI at higher spatial and temporal coverage from a single sensor than is achievable with current geostationary datasets.
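    As a hedged illustration of the data-mining idea (the paper's actual predictors and regression method are not given in this abstract), the sketch below trains a regressor to map the two daily Aqua MODIS LST observations to a morning LST rise, using synthetic data in place of real satellite and geostationary training values.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      n = 500
      lst_night = rng.uniform(270, 300, n)            # K, ~01:30 local overpass
      lst_day = lst_night + rng.uniform(2, 25, n)     # K, ~13:30 local overpass
      diff = lst_day - lst_night
      # Synthetic "truth": pretend the morning rise is about half the day-night amplitude.
      morning_rise = 0.5 * diff + rng.normal(0.0, 1.0, n)

      X = np.column_stack([lst_night, lst_day, diff])
      model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, morning_rise)
      print("Training R^2:", round(model.score(X, morning_rise), 3))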

  11. Time Asymmetric Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Bohm, Arno R.; Gadella, Manuel; Kielanowski, Piotr

    2011-09-01

    The meaning of time asymmetry in quantum physics is discussed. On the basis of a mathematical theorem, the Stone-von Neumann theorem, the solutions of the dynamical equations, the Schrödinger equation (1) for states or the Heisenberg equation (6a) for observables, are given by a unitary group. Dirac kets require the concept of an RHS (rigged Hilbert space) of Schwartz functions; for this kind of RHS a mathematical theorem also leads to time-symmetric group evolution. Scattering theory suggests distinguishing mathematically between states (defined by a preparation apparatus) and observables (defined by a registration apparatus, i.e., a detector). If one requires that scattering resonances of width Γ and exponentially decaying states of lifetime τ = ħ/Γ should be the same physical entities (for which there is sufficient evidence), one is led to a pair of RHS's of Hardy functions and, connected with it, to a semigroup time evolution t₀ ≤ t < ∞, with the puzzling result that there is a quantum mechanical beginning of time, just like the big bang time for the universe when it was a quantum system. The decay of quasi-stable particles is used to illustrate this quantum mechanical time asymmetry. From the analysis of these processes, we show that the properties of rigged Hilbert spaces of Hardy functions are suitable for a formulation of time asymmetry in quantum mechanics.

  12. Accretion of low-metallicity gas by the Milky Way.

    PubMed

    Wakker, B P; Howk, J C; Savage, B D; van Woerden, H; Tufte, S L; Schwarz, U J; Benjamin, R; Reynolds, R J; Peletier, R F; Kalberla, P M

    1999-11-25

    Models of the chemical evolution of the Milky Way suggest that the observed abundances of elements heavier than helium ('metals') require a continuous infall of gas with metallicity (metal abundance) about 0.1 times the solar value. An infall rate integrated over the entire disk of the Milky Way of approximately 1 solar mass per year can solve the 'G-dwarf problem'--the observational fact that the metallicities of most long-lived stars near the Sun lie in a relatively narrow range. This infall dilutes the enrichment arising from the production of heavy elements in stars, and thereby prevents the metallicity of the interstellar medium from increasing steadily with time. However, in other spiral galaxies, the low-metallicity gas needed to provide this infall has been observed only in associated dwarf galaxies and in the extreme outer disk of the Milky Way. In the distant Universe, low-metallicity hydrogen clouds (known as 'damped Ly alpha absorbers') are sometimes seen near galaxies. Here we report a metallicity of 0.09 times solar for a massive cloud that is falling into the disk of the Milky Way. The mass flow associated with this cloud represents an infall per unit area of about the theoretically expected rate, and approximately 0.1-0.2 times the amount required for the whole Galaxy.

  13. Real-time capability of GEONET system and its application to crust monitoring

    NASA Astrophysics Data System (ADS)

    Yamagiwa, Atsushi; Hatanaka, Yuki; Yutsudo, Toru; Miyahara, Basara

    2006-03-01

    The GPS Earth Observation Network system (GEONET) has been playing an important role in monitoring crustal deformation in Japan. Since the start of its operation, the requirements for accuracy and timeliness have become increasingly stringent. At the same time, recent broadband communication infrastructure has made real-time crustal monitoring possible and has aided the development of location-based services. In early 2003, the Geographical Survey Institute (GSI) upgraded the GEONET system to meet these new requirements. The number of stations reached 1200 in total by March 2003. The antennas were unified to Dorne Margolin T-type choke-ring antennas, and the receivers were replaced with new ones capable of real-time observation and data transfer. The new system uses IP connections through an IP-VPN (Internet Protocol Virtual Private Network), provided by communication companies, for data transfer. The Data Processing System, which manages the observation data and analyses in GEONET, has seven units. GEONET carries out three kinds of routine analyses and an RTK-type analysis for emergencies. The new system has shown its capability for real-time crustal monitoring, for example, in the precise and rapid detection of coseismic (and post-seismic) motion caused by the 2003 Tokachi-Oki earthquake.

  14. Time assignment system and its performance aboard the Hitomi satellite

    NASA Astrophysics Data System (ADS)

    Terada, Yukikatsu; Yamaguchi, Sunao; Sugimoto, Shigenobu; Inoue, Taku; Nakaya, Souhei; Murakami, Maika; Yabe, Seiya; Oshimizu, Kenya; Ogawa, Mina; Dotani, Tadayasu; Ishisaki, Yoshitaka; Mizushima, Kazuyo; Kominato, Takashi; Mine, Hiroaki; Hihara, Hiroki; Iwase, Kaori; Kouzu, Tomomi; Tashiro, Makoto S.; Natsukari, Chikara; Ozaki, Masanobu; Kokubun, Motohide; Takahashi, Tadayuki; Kawakami, Satoko; Kasahara, Masaru; Kumagai, Susumu; Angelini, Lorella; Witthoeft, Michael

    2018-01-01

    Fast timing capability in the x-ray observation of astrophysical objects is one of the key properties of the ASTRO-H (Hitomi) mission. Absolute timing accuracies of 350 or 35 μs are required to achieve the nominal scientific goals or to study fast variability of specific sources. The satellite carries a GPS receiver to obtain accurate time information, which is distributed from the central onboard computer through the large and complex SpaceWire network. The details of the hardware and software design of the time system are described. In the distribution of the time information, propagation delays and jitter affect the timing accuracy. Six other items identified within the timing system also contribute to the absolute time error. These error items were measured and checked on the ground to ensure that the time error budgets meet the mission requirements. The overall timing performance, combining the hardware performance, the software algorithm, the orbit determination accuracy, and other factors, satisfies the mission requirement of 35 μs under nominal conditions. This work demonstrates key points in the hardware and software design and the calibration measurements needed to achieve timing accuracy on the order of microseconds for midsized satellites using the SpaceWire (IEEE 1355) network.

  15. Prosthetic Complications and Maintenance Requirements in Locator-attached Implant-Supported Overdentures: A Retrospective Study.

    PubMed

    Engelhardt, Frank; Zeman, Florian; Behr, Michael; Hahmel, Sebastian

    2016-03-01

    Retrospective data from 32 patients supplied with implant-supported, Locator-attached overdentures were screened for prosthetic complications and maintenance requirements, which were recorded and statistically analyzed. Mean observation time was 4.78 ± 1.72 years. Loss of retention was the most frequently observed event (n = 22). Damage and exchange of the insert holders (n = 4), loosening of Locator attachments (n = 2), and fracture of the insert holder (n = 2) were uncommon events; no loss of Locator attachments was observed. Loss of retention in Locator-attached overdentures is frequent; correlating patient-specific parameters with prosthetic complications is necessary to define recommendations for the use of Locator attachments.

  16. Application of a simple recording system to the analysis of free-play behavior in autistic children

    PubMed Central

    Boer, Arend P.

    1968-01-01

    An observational system, which has been developed to facilitate recording of the total behavioral repertoire of autistic children, involves time-sampling recording of behavior with the help of a common Stenograph machine. Categories which exhausted all behavior were defined. Each category corresponded with a designated key on the Stenograph machine. The observer depressed one key at each 1-sec interval. The observer was paced by audible beats from a metronome. A naive observer can be used with this method. The observer is not mechanically limited and a minimum of observer training is required to obtain reliable measures. The data sampled during a five-week observation period indicated the stability of a taxonomic instrument of behavior based upon direct, time-sampling observations and the stability of spontaneous autistic behavior. Results showed that the behavior of the subjects was largely nonrandom and unsocialized in character. PMID:16795193

  17. Application of a simple recording system to the analysis of free-play behavior in autistic children.

    PubMed

    Boer, A P

    1968-01-01

    An observational system, which has been developed to facilitate recording of the total behavioral repertoire of autistic children, involves time-sampling recording of behavior with the help of a common Stenograph machine. Categories which exhausted all behavior were defined. Each category corresponded with a designated key on the Stenograph machine. The observer depressed one key at each 1-sec interval. The observer was paced by audible beats from a metronome. A naive observer can be used with this method. The observer is not mechanically limited and a minimum of observer training is required to obtain reliable measures. The data sampled during a five-week observation period indicated the stability of a taxonomic instrument of behavior based upon direct, time-sampling observations and the stability of spontaneous autistic behavior. Results showed that the behavior of the subjects was largely nonrandom and unsocialized in character.

  18. The MST radar technique: Requirements for operational weather forecasting

    NASA Technical Reports Server (NTRS)

    Larsen, M. F.

    1983-01-01

    There is a feeling that the accuracy of mesoscale forecasts for spatial scales of less than 1000 km and time scales of less than 12 hours can be improved significantly if resources are applied to the problem in an intensive effort over the next decade. Since the most dangerous and damaging types of weather occur at these scales, there are major advantages to be gained if such a program is successful. The interest in improving short-term forecasting is evident. The technology at the present time is sufficiently developed, both in terms of new observing systems and the computing power to handle the observations, to warrant an intensive effort to improve storm-scale forecasting. The extent to which the so-called MST radar technique fulfills the requirements for an operational mesoscale observing network is reviewed, and the improvements in various types of forecasting that could be expected if such a network were put into operation are delineated.

  19. Three-Dimensional Structure and Energy Balance of a Coronal Mass Ejection

    NASA Technical Reports Server (NTRS)

    Lee, J.-Y.; Raymond, J. C.; Ko, Y.-K.; Kim, K.-S.

    2009-01-01

    UVCS observed Doppler-shifted material of a partial halo coronal mass ejection (CME) on 2001 December 13. The observed [O V]/O V line ratio is a reliable density diagnostic important for assessing the state of the plasma. Earlier UVCS observations of CMEs found evidence that the ejected plasma is heated long after the eruption. This paper investigates the heating rates, which represent a significant fraction of the CME energy budget. Parameterized heating along with radiative and adiabatic cooling has been used to evaluate the temperature evolution of the CME material with a time-dependent ionization state model. Continuous heating is required to match the UVCS observations. To match the O VI bright knots, a higher heating rate is required, such that the heating energy is greater than the kinetic energy.
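
    The temperature evolution described above balances parameterized heating against radiative and adiabatic cooling. The following is a minimal, illustrative energy-balance integration in that spirit; the radiative-loss function, expansion rate, and heating rate are toy placeholders and do not reproduce the UVCS analysis or its atomic data.

        import numpy as np

        K_B = 1.380649e-16   # Boltzmann constant [erg/K]

        def radiative_loss(T):
            # Toy radiative-loss function Lambda(T) [erg cm^3 s^-1]; a placeholder,
            # not the atomic-data-based curve used in the actual UVCS analysis.
            return 1e-22 * (T / 1e6) ** -0.5

        def evolve_temperature(T0, n0, expansion_rate, heating_volumetric,
                               t_end=3.6e3, dt=1.0):
            """Forward-Euler energy balance for an expanding parcel:
            (3/2) k dT/dt = H/n - n*Lambda(T) + (2/3) k T dln(n)/dt."""
            T, n, t = T0, n0, 0.0
            while t < t_end:
                dlnn_dt = -expansion_rate                    # density drops as the CME expands
                heating = heating_volumetric / n             # heating per particle [erg/s]
                cooling = n * radiative_loss(T)              # radiative losses per particle
                adiabatic = (2.0 / 3.0) * K_B * T * dlnn_dt  # adiabatic cooling term
                T += (heating - cooling + adiabatic) / (1.5 * K_B) * dt
                n *= np.exp(dlnn_dt * dt)
                t += dt
            return T, n

        # Illustrative numbers only: 2 MK plasma, n = 1e7 cm^-3, followed for one hour.
        print(evolve_temperature(T0=2e6, n0=1e7, expansion_rate=3e-4,
                                 heating_volumetric=1e-9))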

  20. Mechanical analysis of statolith action in roots and rhizoids

    NASA Astrophysics Data System (ADS)

    Todd, Paul

    1994-08-01

    Published observations on the response times following gravistimulation (horizontal positioning) of Chara rhizoids and developing roots of vascular plants with normal and "starchless" amyloplasts were reviewed and compared. Statolith motion was found to be consistent with gravitational sedimentation opposed by elastic deformation of an intracellular material. The time required for a statolith to sediment to equilibrium was calculated on the basis of its buoyant density and compared with observed sedimentation times. In the examples chosen, the response time following gravistimulation (from horizontal positioning to the return of downward growth) could be related to the statolith sedimentation time. Such a relationship implies that the transduction step is rapid in comparison with the perception step following gravistimulation of rhizoids and developing roots.

  1. Mechanical analysis of statolith action in roots and rhizoids.

    PubMed

    Todd, P

    1994-01-01

    Published observations on the response times following gravistimulation (horizontal positioning) of Chara rhizoids and developing roots of vascular plants with normal and "starchless" amyloplasts were reviewed and compared. Statolith motion was found to be consistent with gravitational sedimentation opposed by elastic deformation of an intracellular material. The time required for a statolith to sediment to equilibrium was calculated on the basis of its buoyant density and compared with observed sedimentation times. In the examples chosen, the response time following gravistimulation (from horizontal positioning to the return of downward growth) could be related to the statolith sedimentation time. Such a relationship implies that the transduction step is rapid in comparison with the perception step following gravistimulation of rhizoids and developing roots.
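
    The sedimentation-time calculation mentioned in both versions of this abstract can be illustrated with a simple Stokes-drag estimate for a spherical statolith settling through a viscous cytoplasm (the paper additionally considers elastic opposition, which is omitted here); all parameter values below are illustrative.

        def stokes_sedimentation_time(radius_m, delta_rho, viscosity_pa_s,
                                      distance_m, g=9.81):
            """Time for a sphere to settle a given distance at the Stokes
            terminal velocity v = 2 r^2 (rho_p - rho_f) g / (9 eta)."""
            v = 2.0 * radius_m ** 2 * delta_rho * g / (9.0 * viscosity_pa_s)
            return distance_m / v

        # Illustrative amyloplast-like values: radius 2.5 um, density excess
        # 500 kg/m^3, effective cytoplasmic viscosity 0.5 Pa.s, settling 10 um.
        t = stokes_sedimentation_time(2.5e-6, 500.0, 0.5, 10e-6)
        print(f"sedimentation time ~ {t:.0f} s (~{t / 60:.0f} min)")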

  2. Transit timing at Toruń Center for Astronomy

    NASA Astrophysics Data System (ADS)

    Bykowski, W.; Maciejewski, G.

    2011-01-01

    Transit monitoring is one of the well-known methods for discovering and observing new extrasolar planets. Among its various advantages, this way of searching for other worlds does not require complex and expensive equipment -- it can be performed with a relatively small telescope and a high-quality CCD camera. At the Center for Astronomy of Nicolaus Copernicus University in Toruń, Poland, we collect observational data using the 60-cm Cassegrain telescope in the hope of discovering new objects in already known planetary systems using the transit timing variation method. Our observations are part of a broader collaboration among observatories in many countries.

  3. Listing of 502 Times When the Ulysses Magnetic Fields Instrument Observed Waves Due to Newborn Interstellar Pickup Protons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, Bradford E.; Smith, Charles W.; Isenberg, Philip A.

    In two earlier publications we analyzed 502 intervals of magnetic waves excited by newborn interstellar pickup protons that were observed by the Ulysses spacecraft. Due to the considerable effort required in identifying these events, we provide a list of the times for the 502 wave event intervals previously identified. In the process, we provide a brief description of how the waves were found and what their properties are. We also remind the reader of the conditions that permit the waves to reach observable levels and explain why the waves are not seen more often.

  4. Adjustment of relative humidity and temperature for differences in elevation.

    Treesearch

    Owen P. Cramer

    1961-01-01

    The variation of fire-weather elements in mountainous terrain is complex at any one time, and the patterns vary considerably with time. During periods of serious fire weather, this variation becomes important. Much information is obtainable by local interpretation of available forecasts and observations. Optimum use of available information requires some understanding...

  5. Cashew Nut Positioning during Stone Tool Use by Wild Bearded Capuchin Monkeys (Sapajus libidinosus).

    PubMed

    Falótico, Tiago; Luncz, Lydia V; Svensson, Magdalena S; Haslam, Michael

    2016-01-01

    Wild capuchin monkeys (Sapajus libidinosus) at Serra da Capivara National Park, Brazil, regularly use stone tools to break open cashew nuts (Anacardium spp.). Here we examine 2 approaches used by the capuchins to position the kidney-shaped cashew nuts on an anvil before striking with a stone tool. Lateral positioning involves placing the nut on its flatter, more stable side, therefore requiring less attention from the monkey during placement. However, the less stable and never previously described arched position, in which the nut is balanced with its curved side uppermost, requires less force to crack the outer shell. We observed cashew nut cracking in a field experimental setting. Only 6 of 20 adults, of both sexes, were observed to deliberately place cashew nuts in an arched position, which may indicate that the technique requires time and experience to learn. We also found that use of the arched position with dry nuts, but not fresh ones, required initial processing in 63% of cases to remove one of the cashew nut lobes, creating a more stable base for the arch. This relatively rare behaviour appears to have a complex ontogeny, but further studies are required to establish the extent to which social learning is involved. © 2017 S. Karger AG, Basel.

  6. Rates of projected climate change dramatically exceed past rates of climatic niche evolution among vertebrate species.

    PubMed

    Quintero, Ignacio; Wiens, John J

    2013-08-01

    A key question in predicting responses to anthropogenic climate change is: how quickly can species adapt to different climatic conditions? Here, we take a phylogenetic approach to this question. We use 17 time-calibrated phylogenies representing the major tetrapod clades (amphibians, birds, crocodilians, mammals, squamates, turtles) and climatic data from distributions of > 500 extant species. We estimate rates of change based on differences in climatic variables between sister species and estimated times of their splitting. We compare these rates to predicted rates of climate change from 2000 to 2100. Our results are striking: matching projected changes for 2100 would require rates of niche evolution that are > 10,000 times faster than rates typically observed among species, for most variables and clades. Despite many caveats, our results suggest that adaptation to projected changes in the next 100 years would require rates that are largely unprecedented based on observed rates among vertebrate species. © 2013 John Wiley & Sons Ltd/CNRS.
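
    One plausible formalization of the rate estimate described above (not necessarily the authors' exact implementation) divides the climatic difference between sister species by the total evolutionary time separating them, and compares it with the projected 2000-2100 change expressed in the same units; the numbers below are illustrative.

        def niche_rate_per_myr(value_sp1, value_sp2, split_time_myr):
            """Rate of climatic-niche divergence between sister species:
            absolute difference divided by the total evolutionary time
            separating them (two branches, each of length split_time_myr)."""
            return abs(value_sp1 - value_sp2) / (2.0 * split_time_myr)

        # Illustrative numbers: sister species differing by 1.5 degC in mean
        # annual temperature of their ranges, splitting 5 Myr ago.
        past_rate = niche_rate_per_myr(18.0, 16.5, 5.0)      # degC per Myr

        # Projected warming of ~3 degC over the next century, expressed per Myr.
        projected_rate = 3.0 / (100.0 / 1e6)                  # degC per Myr

        print(f"past: {past_rate:.3f} degC/Myr, projected: {projected_rate:.0f} degC/Myr")
        print(f"ratio ~ {projected_rate / past_rate:.0f}x")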

  7. Repeatability of Bolus Kinetics Ultrasound Perfusion Imaging for the Quantification of Cerebral Blood Flow.

    PubMed

    Vinke, Elisabeth J; Eyding, Jens; de Korte, Chris L; Slump, Cornelis H; van der Hoeven, Johannes G; Hoedemaekers, Cornelia W E

    2017-12-01

    Ultrasound perfusion imaging (UPI) can be used for the quantification of cerebral perfusion. In a neuro-intensive care setting, repeated measurements are required to evaluate changes in cerebral perfusion and monitor therapy. The aim of this study was to determine the repeatability of UPI in the quantification of cerebral perfusion. UPI measurement of cerebral perfusion was performed three times in healthy patients. The coefficients of variation (CVs) of the three bolus injections were calculated for both time- and volume-derived perfusion parameters in the macro- and microcirculation. The UPI time-dependent parameters had the lowest overall CVs in both the macro- and microcirculation. The volume-related parameters had poorer repeatability, especially in the microcirculation. Both intra-observer and inter-observer variability were low. Although UPI is a promising tool for the bedside measurement of cerebral perfusion, improvement of the technique is required before implementation in routine clinical practice. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
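
    The repeatability analysis rests on the coefficient of variation across the three repeated bolus injections. A minimal sketch of that computation is shown below; the parameter values are hypothetical, not the study's data.

        import statistics

        def coefficient_of_variation(values):
            """CV (%) = sample standard deviation / mean * 100."""
            return statistics.stdev(values) / statistics.mean(values) * 100.0

        # Hypothetical time-to-peak values (s) from three bolus injections in one subject.
        time_to_peak = [14.2, 15.1, 13.8]
        # Hypothetical peak-intensity (volume-related) values from the same injections.
        peak_intensity = [820.0, 610.0, 990.0]

        print(f"time-derived CV:   {coefficient_of_variation(time_to_peak):.1f}%")
        print(f"volume-derived CV: {coefficient_of_variation(peak_intensity):.1f}%")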

  8. Problematics of Time and Timing in the Longitudinal Study of Human Development: Theoretical and Methodological Issues

    PubMed Central

    Lerner, Richard M.; Schwartz, Seth J; Phelps, Erin

    2009-01-01

    Studying human development involves describing, explaining, and optimizing intraindividual change and interindividual differences in such change and, as such, requires longitudinal research. The selection of the appropriate type of longitudinal design requires selecting the option that best addresses the theoretical questions asked about the developmental process and using appropriate statistical procedures to best exploit data derived from theory-predicated longitudinal research. This paper focuses on several interrelated problematics involving the treatment of time and the timing of observations that developmental scientists face in creating theory-design fit and in charting developmental processes across life in change-sensitive ways. We discuss ways in which these problematics may be addressed to advance theory-predicated understanding of the role of time in processes of individual development. PMID:19554215

  9. Towards real-time verification of CO2 emissions

    NASA Astrophysics Data System (ADS)

    Peters, Glen P.; Le Quéré, Corinne; Andrew, Robbie M.; Canadell, Josep G.; Friedlingstein, Pierre; Ilyina, Tatiana; Jackson, Robert B.; Joos, Fortunat; Korsbakken, Jan Ivar; McKinley, Galen A.; Sitch, Stephen; Tans, Pieter

    2017-12-01

    The Paris Agreement has increased the incentive to verify reported anthropogenic carbon dioxide emissions with independent Earth system observations. Reliable verification requires a step change in our understanding of carbon cycle variability.

  10. Aging and Attentional Control

    PubMed Central

    Tsang, Pamela S.

    2013-01-01

    The research examines the structural bottleneck account and the resource account of the substantial dual-task deficits among older adults. Procedures from two common dual-task methodologies--the psychological refractory period and the relative-priority manipulation--were used to encourage maximization of the joint performance. Performance and time-sharing strategies from subjects between the ages of 20 and 70 were examined. Age-related declines in time-sharing efficiency and in the precision of the executive control process were observed. The age-related effect was larger when two manual responses were required than when one manual and one vocal response were required but no evidence for obligatory sequential processing was found. Except for the most demanding conditions, comparable practice effects were observed between the younger and older subjects, suggesting considerable cognitive plasticity in the older subjects. Implications for the two attentional accounts were discussed. PMID:23281799

  11. Cold Atomic Hydrogen, Narrow Self-Absorption, and the Age of Molecular Clouds

    NASA Technical Reports Server (NTRS)

    Goldsmith, Paul F.

    2006-01-01

    This viewgraph presentation reviews the history of, and current work on, HI and its importance in star formation. From many observations of HI Narrow Self-Absorption (HINSA), the following conclusions are drawn and presented: local molecular clouds have HI that is well mixed with their molecular constituents; this HI is cold, quiescent, and must be well shielded from the UV radiation field; the density and fractional abundance (with respect to H2) of the cold HI are close to steady-state values; and the time required to convert these starless clouds from a purely HI initial state to their observed present composition is a few to ten million years. This timescale is a lower limit: if dense clouds are being swept up from lower-density regions by shocks, the time to accumulate enough material to reach A_v of approximately 1 and provide the required shielding may be comparable or longer.
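
    The few-to-ten-million-year conversion time quoted above can be made concrete with the standard timescale for H2 formation on dust grains, t ≈ 1/(2 R n_H) with R ≈ 3×10⁻¹⁷ cm³ s⁻¹; the presentation's own calculation may differ in detail, so the sketch below is only indicative.

        R_GRAIN = 3e-17          # canonical H2 formation rate coefficient on dust [cm^3 s^-1]
        SECONDS_PER_YEAR = 3.15e7

        def h2_formation_time_yr(n_H):
            """Characteristic time to convert atomic H to H2: t ~ 1/(2 R n_H)."""
            return 1.0 / (2.0 * R_GRAIN * n_H) / SECONDS_PER_YEAR

        # Typical densities for moderate-density cloud envelopes [cm^-3].
        for n in (50.0, 100.0, 200.0):
            print(f"n_H = {n:g} cm^-3  ->  t ~ {h2_formation_time_yr(n):.1e} yr")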

  12. Method for Direct Measurement of Cosmic Acceleration by 21-cm Absorption Systems

    NASA Astrophysics Data System (ADS)

    Yu, Hao-Ran; Zhang, Tong-Jie; Pen, Ue-Li

    2014-07-01

    So far there is only indirect evidence that the Universe is undergoing an accelerated expansion. The evidence for cosmic acceleration is based on the observation of different objects at different distances and requires invoking the Copernican cosmological principle and Einstein's equations of motion. We examine the direct observability using recession velocity drifts (Sandage-Loeb effect) of 21-cm hydrogen absorption systems in upcoming radio surveys. This measures the change in velocity of the same objects separated by a time interval and is a model-independent measure of acceleration. We forecast that for a CHIME-like survey with a decade time span, we can detect the acceleration of a ΛCDM universe with 5σ confidence. This acceleration test requires modest data analysis and storage changes from the normal processing and cannot be recovered retroactively.
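
    For a flat ΛCDM cosmology, the Sandage-Loeb signal referred to above corresponds to a spectroscopic velocity shift Δv = c·Δt·[H₀ − H(z)/(1+z)] between observations separated by Δt. The sketch below evaluates its size over a decade; the cosmological parameters are illustrative and the survey-specific noise modeling is omitted.

        import math

        C_KM_S = 299792.458

        def hubble(z, H0=67.7, omega_m=0.31):
            """H(z) for a flat LambdaCDM universe, in km/s/Mpc."""
            return H0 * math.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))

        def velocity_drift_cm_s(z, years, H0=67.7, omega_m=0.31):
            """Spectroscopic velocity shift dv = c*dt*[H0 - H(z)/(1+z)], in cm/s."""
            per_s_per_km_s_mpc = 1.0 / 3.0857e19      # 1 Mpc = 3.0857e19 km
            dt_s = years * 3.156e7
            drift_rate = (H0 - hubble(z, H0, omega_m) / (1 + z)) * per_s_per_km_s_mpc
            return C_KM_S * 1e5 * dt_s * drift_rate   # km/s -> cm/s

        for z in (0.5, 1.0, 2.0):
            print(f"z = {z}:  dv over 10 yr ~ {velocity_drift_cm_s(z, 10):+.2f} cm/s")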

  13. Effective Thermodynamics for a Marginal Observer

    NASA Astrophysics Data System (ADS)

    Polettini, Matteo; Esposito, Massimiliano

    2017-12-01

    Thermodynamics is usually formulated on the presumption that the observer has complete information about the system he or she deals with: no parasitic current, exact evaluation of the forces that drive the system. For example, the acclaimed fluctuation relation (FR), relating the probability of time-forward and time-reversed trajectories, assumes that the measurable transitions suffice to characterize the process as Markovian (in our case, a continuous-time jump process). However, most often the observer only measures a marginal current. We show that he or she will nonetheless produce an effective description that does not dispense with the fundamentals of thermodynamics, including the FR and the 2nd law. Our results stand on the mathematical construction of a hidden time reversal of the dynamics, and on the physical requirement that the observed current only accounts for a single transition in the configuration space of the system. We employ a simple abstract example to illustrate our results and to discuss the feasibility of generalizations.

  14. Restaurant Policies and Practices for Serving Raw Fish in Minnesota.

    PubMed

    Hedeen, Nicole

    2016-10-01

    The number of restaurants serving sushi within Minnesota is continuously increasing. The practices and protocols of serving raw fish are complex and require detailed planning to ensure that food served to patrons will not cause illness. Although the popularity of sushi is increasing, there is a lack of research on food safety issues pertaining to preparation of raw fish and sushi rice. To address this gap, the Minnesota Department of Health Environmental Health Specialists Network Food program collected descriptive data on restaurant practices and policies concerning the service of raw fish and sushi rice in 40 Minnesota restaurants. At each restaurant, a specialist interviewed a restaurant manager, conducted an observation of the sushi prep areas in the restaurant kitchen, and reviewed parasite destruction letters and invoices from fish supplier(s). Over half of the restaurants (59%) were missing one or more of the parasite destruction letters from their fish supplier(s) guaranteeing that fish had been properly frozen to the time and temperature requirements in the Minnesota Food Code. A total of 42 parasite destruction letters from suppliers were observed; 10% were considered "adequate" letters. The majority of the letters were missing details pertaining to the types of fish frozen, the length of time fish were frozen, or details on what temperatures fish were held frozen or a combination of all three. Most restaurants were using time as a public health control for their sushi rice. For those restaurants using time as a public health control, 26% had a written procedure on-site, and approximately 53% were keeping track of time. Bare hand contact during sushi prep was observed in 17% of restaurants, and in more than 40% of the restaurants, at least one fish was mislabeled on the menu. Findings from this study indicate that many Minnesota restaurants are not complying with the Food Code requirements pertaining to parasite destruction for the service of raw fish or the use of time as a public health control for sushi rice.

  15. A two-dimensional time domain near zone to far zone transformation

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Ryan, Deirdre; Beggs, John H.; Kunz, Karl S.

    1991-01-01

    A time domain transformation useful for extrapolating three dimensional near zone finite difference time domain (FDTD) results to the far zone was presented. Here, the corresponding two dimensional transform is outlined. While the three dimensional transformation produced a physically observable far zone time domain field, this is not convenient to do directly in two dimensions, since a convolution would be required. However, a representative two dimensional far zone time domain result can be obtained directly. This result can then be transformed to the frequency domain using a Fast Fourier Transform, corrected with a simple multiplicative factor, and used, for example, to calculate the complex wideband scattering width of a target. If an actual time domain far zone result is required, it can be obtained by inverse Fourier transform of the final frequency domain result.

  16. A two-dimensional time domain near zone to far zone transformation

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Ryan, Deirdre; Beggs, John H.; Kunz, Karl S.

    1991-01-01

    In a previous paper, a time domain transformation useful for extrapolating 3-D near zone finite difference time domain (FDTD) results to the far zone was presented. In this paper, the corresponding 2-D transform is outlined. While the 3-D transformation produced a physically observable far zone time domain field, this is not convenient to do directly in 2-D, since a convolution would be required. However, a representative 2-D far zone time domain result can be obtained directly. This result can then be transformed to the frequency domain using a Fast Fourier Transform, corrected with a simple multiplicative factor, and used, for example, to calculate the complex wideband scattering width of a target. If an actual time domain far zone result is required it can be obtained by inverse Fourier transform of the final frequency domain result.

  17. Sequential Modular Position and Momentum Measurements of a Trapped Ion Mechanical Oscillator

    NASA Astrophysics Data System (ADS)

    Flühmann, C.; Negnevitsky, V.; Marinelli, M.; Home, J. P.

    2018-04-01

    The noncommutativity of position and momentum observables is a hallmark feature of quantum physics. However, this incompatibility does not extend to observables that are periodic in these base variables. Such modular-variable observables have been suggested as tools for fault-tolerant quantum computing and enhanced quantum sensing. Here, we implement sequential measurements of modular variables in the oscillatory motion of a single trapped ion, using state-dependent displacements and a heralded nondestructive readout. We investigate the commutative nature of modular variable observables by demonstrating no-signaling in time between successive measurements, using a variety of input states. Employing a different periodicity, we observe signaling in time. This also requires wave-packet overlap, resulting in quantum interference that we enhance using squeezed input states. The sequential measurements allow us to extract two-time correlators for modular variables, which we use to violate a Leggett-Garg inequality. Signaling in time and Leggett-Garg inequalities serve as efficient quantum witnesses, which we probe here with a mechanical oscillator, a system that has a natural crossover from the quantum to the classical regime.

  18. Observing spatio-temporal dynamics of excitable media using reservoir computing

    NASA Astrophysics Data System (ADS)

    Zimmermann, Roland S.; Parlitz, Ulrich

    2018-04-01

    We present a dynamical observer for two-dimensional partial differential equation models describing excitable media, where the required cross prediction from observed time series to unmeasured state variables is provided by Echo State Networks receiving input only from local regions in space. The efficacy of this approach is demonstrated for (noisy) data from a (cubic) Barkley model and from the Bueno-Orovio-Cherry-Fenton model describing chaotic electrical wave propagation in cardiac tissue.
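
    The cross prediction described above is performed by Echo State Networks. A minimal, self-contained ESN sketch in that spirit is given below (random reservoir, leaky-tanh states, ridge-regression readout); the local spatial input scheme of the paper is simplified to a toy one-dimensional cross-prediction task, and all dimensions and hyperparameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        class EchoStateNetwork:
            def __init__(self, n_inputs, n_reservoir=300, spectral_radius=0.9,
                         leak=0.3, ridge=1e-6):
                self.W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
                W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
                # Scale recurrent weights to the desired spectral radius (echo-state property).
                W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
                self.W, self.leak, self.ridge = W, leak, ridge
                self.W_out = None

            def _run(self, U):
                x = np.zeros(self.W.shape[0])
                states = []
                for u in U:
                    pre = np.tanh(self.W_in @ u + self.W @ x)
                    x = (1 - self.leak) * x + self.leak * pre
                    states.append(x.copy())
                return np.array(states)

            def fit(self, U, Y):
                """Train the linear readout mapping reservoir states to targets Y."""
                X = self._run(U)
                A = X.T @ X + self.ridge * np.eye(X.shape[1])
                self.W_out = np.linalg.solve(A, X.T @ Y)

            def predict(self, U):
                return self._run(U) @ self.W_out

        # Toy cross prediction: infer an unmeasured variable (cos) from a measured one (sin).
        t = np.linspace(0, 60, 3000)
        measured = np.sin(t)[:, None]
        hidden = np.cos(t)[:, None]
        esn = EchoStateNetwork(n_inputs=1)
        esn.fit(measured[:2000], hidden[:2000])
        err = np.mean((esn.predict(measured[2000:]) - hidden[2000:]) ** 2)
        print(f"test MSE: {err:.2e}")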

  19. Effects of Above Real Time Training (ARTT) On Individual Skills and Contributions to Crew/Team Performance

    NASA Technical Reports Server (NTRS)

    Ali, Syed Firasat; Khan, M. Javed; Rossi, Marcia J.; Crane, Peter; Guckenberger, Dutch; Bageon, Kellye

    2001-01-01

    Above Real Time Training (ARTT) is training acquired on a real-time simulator that has been modified to present events at a faster pace than normal. Experiments on the training of pilots performed by NASA engineers and others have indicated that real-time training (RTT) reinforced with ARTT offers an effective training strategy for tasks that require significant effort in time and workload management. A study was conducted to determine how ARTT and RTT complement each other in training novice pilot-navigator teams to fly a required route. In the experiment, each of the participating pilot-navigator teams was required to conduct simulator flights on a prescribed two-legged ground track while maintaining the required airspeed and altitude. At any instant in a flight, the distance between the actual spatial position of the airplane and the required spatial position was used as a measure of deviation from the required route. A smaller deviation represented better performance. Over a segment of a flight or over the complete flight, an average value of the deviation represented consolidated performance. The deviations were computed from the information on latitude, longitude, and altitude. In the combined ARTT and RTT program, ARTT at intermediate training intervals was beneficial in improving the real-time performance of the trainees. It was observed that the team interaction between pilot and navigator helped maintain high motivation and active participation throughout the training program.
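
    The deviation measure described above is the distance between the actual and required spatial points computed from latitude, longitude, and altitude. One way to do this (not necessarily the study's exact computation) is to convert geodetic coordinates to Earth-centered Cartesian coordinates and take the Euclidean distance, as sketched below with hypothetical sample values.

        import math

        WGS84_A = 6378137.0              # WGS-84 semi-major axis [m]
        WGS84_E2 = 6.69437999014e-3      # WGS-84 first eccentricity squared

        def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
            """Convert geodetic latitude/longitude/altitude to ECEF x, y, z [m]."""
            lat, lon = math.radians(lat_deg), math.radians(lon_deg)
            N = WGS84_A / math.sqrt(1 - WGS84_E2 * math.sin(lat) ** 2)
            x = (N + alt_m) * math.cos(lat) * math.cos(lon)
            y = (N + alt_m) * math.cos(lat) * math.sin(lon)
            z = (N * (1 - WGS84_E2) + alt_m) * math.sin(lat)
            return x, y, z

        def route_deviation_m(actual, required):
            """Euclidean distance between actual and required (lat, lon, alt) points."""
            return math.dist(geodetic_to_ecef(*actual), geodetic_to_ecef(*required))

        # Hypothetical sample: aircraft 0.01 deg off track and 50 m below the required altitude.
        print(f"{route_deviation_m((32.40, -85.70, 1500.0), (32.41, -85.70, 1550.0)):.0f} m")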

  20. Starshade orbital maneuver study for WFIRST

    NASA Astrophysics Data System (ADS)

    Soto, Gabriel; Sinha, Amlan; Savransky, Dmitry; Delacroix, Christian; Garrett, Daniel

    2017-09-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission, scheduled for launch in the mid-2020s will perform exoplanet science via both direct imaging and a microlensing survey. An internal coronagraph is planned to perform starlight suppression for exoplanet imaging, but an external starshade could be used to achieve the required high contrasts with potentially higher throughput. This approach would require a separately-launched occulter spacecraft to be positioned at exact distances from the telescope along the line of sight to a target star system. We present a detailed study to quantify the Δv requirements and feasibility of deploying this additional spacecraft as a means of exoplanet imaging. The primary focus of this study is the fuel use of the occulter while repositioning between targets. Based on its design, the occulter is given an offset distance from the nominal WFIRST halo orbit. Target star systems and look vectors are generated using Exoplanet Open-Source Imaging Simulator (EXOSIMS); a boundary value problem is then solved between successive targets. On average, 50 observations are achievable with randomly selected targets given a 30-day transfer time. Individual trajectories can be optimized for transfer time as well as fuel usage to be used in mission scheduling. Minimizing transfer time reduces the total mission time by up to 4.5 times in some simulations before expending the entire fuel budget. Minimizing Δv can generate starshade missions that achieve over 100 unique observations within the designated mission lifetime of WFIRST.

  1. Global trends

    NASA Technical Reports Server (NTRS)

    Megie, G.; Chanin, M.-L.; Ehhalt, D.; Fraser, P.; Frederick, J. F.; Gille, J. C.; Mccormick, M. P.; Schoebert, M.; Bishop, L.; Bojkov, R. D.

    1990-01-01

    Measuring trends in ozone, and in most other geophysical variables, requires that a small systematic change with time be determined from signals that have large periodic and aperiodic variations. Their time scales range from day-to-day changes due to atmospheric motions, through seasonal and annual variations, to 11-year cycles resulting from changes in the Sun's UV output. Because the magnitude of these variations is not well known and is highly variable, it is necessary to measure over more than one period of the variations to remove their effects; for the solar cycle, this means a record spanning at least two or more 11-year sunspot cycles. Thus, the first requirement is a long-term data record. The second, related requirement is that the record be consistent. A third requirement is reasonable global sampling, to ensure that the detected effects are representative of the entire Earth. The various observational methods relevant to trend detection are reviewed to characterize their quality and their time and space coverage. Available data are then examined for long-term trends or recent changes in ozone total content and vertical distribution, as well as in related parameters such as stratospheric temperature, source gases, and aerosols.

  2. Time management in radiation oncology: evaluation of time, attendance of medical staff, and resources during radiotherapy for prostate cancer: the DEGRO-QUIRO trial.

    PubMed

    Keilholz, L; Willner, J; Thiel, H-J; Zamboglou, N; Sack, H; Popp, W

    2014-01-01

    In order to evaluate resource requirements, the German Society of Radiation Oncology (DEGRO) recorded the times needed for core procedures in the radio-oncological treatment of various cancer types within the scope of its QUIRO trial. The present study investigated the personnel and infrastructural resources required in radiotherapy of prostate cancer. The investigation was carried out in the setting of definitive radiotherapy of prostate cancer patients between July and October 2008 at two radiotherapy centers, both with well-trained staff and modern technical facilities at their disposal. Personnel attendance times and room occupancy times required for core procedures (modules) were each measured prospectively by two independently trained observers, with time measurements differentiated by professional group (physician, physicist, and technician) and by technique: 3D conformal (3D-cRT) and intensity-modulated radiotherapy (IMRT). For the technician group, total time requirements of 983 min for 3D-cRT and 1485 min for step-and-shoot IMRT were measured across all recorded modules and over the entire course of radiotherapy for prostate cancer (72-76 Gy). The corresponding times for the medical specialist/physician were 255 min (3D-cRT) and 271 min (IMRT), and for the physicist 181 min (3D-cRT) and 213 min (IMRT). The difference in time was significant, although variations in time spans occurred primarily as a result of various problems during patient treatment. This investigation has permitted, for the first time, a realistic estimation of the average personnel and infrastructural requirements for core procedures in quality-assured definitive radiotherapy of prostate cancer. The increased time needed for IMRT applies to the step-and-shoot procedure with verification measurements for each irradiation plan.

  3. Data-driven fault mechanics: Inferring fault hydro-mechanical properties from in situ observations of injection-induced aseismic slip

    NASA Astrophysics Data System (ADS)

    Bhattacharya, P.; Viesca, R. C.

    2017-12-01

    In the absence of in situ field-scale observations of quantities such as fault slip, shear stress, and pore pressure, observational constraints on models of fault slip have mostly been limited to laboratory and/or remote observations. Recent controlled fluid-injection experiments on well-instrumented faults fill this gap by simultaneously monitoring fault slip and pore pressure evolution in situ [Guglielmi et al., 2015]. Such experiments can reveal interesting fault behavior; e.g., Guglielmi et al. report fluid-activated aseismic slip followed only subsequently by the onset of micro-seismicity. We show that the Guglielmi et al. dataset can be used to constrain the hydro-mechanical model parameters of a fluid-activated expanding shear rupture within a Bayesian framework. We assume that (1) pore pressure diffuses radially outward (from the injection well) within a permeable pathway along the fault bounded by a narrow damage zone about the principal slip surface; (2) the pore-pressure increase activates slip on a pre-stressed planar fault due to a reduction in frictional strength (expressed as a constant friction coefficient times the effective normal stress). Owing to efficient, parallel, numerical solutions to the axisymmetric fluid-diffusion and crack problems (under the imposed history of injection), we are able to jointly fit the observed history of pore pressure and slip using an adaptive Monte Carlo technique. Our hydrological model provides an excellent fit to the pore-pressure data without requiring any statistically significant permeability enhancement due to the onset of slip. Further, for realistic elastic properties of the fault, the crack model fits both the onset of slip and its early-time evolution reasonably well. However, our model requires unrealistic fault properties to fit the marked acceleration of slip observed later in the experiment (coinciding with the triggering of microseismicity). Therefore, besides producing meaningful and internally consistent bounds on in situ fault properties such as permeability, storage coefficient, resolved stresses, friction, and the shear modulus, our results also show that fitting the complete observed time history of slip requires alternative model considerations, such as variations in fault mechanical properties or friction coefficient with slip.

  4. Representativeness of direct observations selected using a work-sampling equation.

    PubMed

    Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas

    2015-01-01

    Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
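
    Work-sampling methods select the number of momentary time samples needed for a preselected level of representativeness; a common form is n = z²·p·(1−p)/e², where p is the estimated proportion of time the behavior occupies and e the tolerable absolute error. The sketch below uses that form, which may differ from the specific equation applied in the study.

        import math

        def work_sampling_n(p, abs_error, confidence_z=1.96):
            """Number of momentary time samples: n = z^2 * p * (1 - p) / e^2."""
            return math.ceil(confidence_z ** 2 * p * (1 - p) / abs_error ** 2)

        # A behavior occupying ~10% of the time, estimated to +/-2% absolute error at
        # 95% confidence, versus a 40%-duration behavior estimated to +/-5%.
        print(work_sampling_n(p=0.10, abs_error=0.02))   # low-duration behavior -> many samples
        print(work_sampling_n(p=0.40, abs_error=0.05))   # higher-duration behavior -> fewer samples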

  5. The main pillar: Assessment of space weather observational asset performance supporting nowcasting, forecasting, and research to operations.

    PubMed

    Posner, A; Hesse, M; St Cyr, O C

    2014-04-01

    Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations. Key Points: Manuscript assesses current and near-future space weather assets. Current assets unreliable for forecasting of severe geomagnetic storms. Near-future assets will not improve the situation.

  6. The main pillar: Assessment of space weather observational asset performance supporting nowcasting, forecasting, and research to operations

    PubMed Central

    Posner, A; Hesse, M; St Cyr, O C

    2014-01-01

    Space weather forecasting critically depends upon availability of timely and reliable observational data. It is therefore particularly important to understand how existing and newly planned observational assets perform during periods of severe space weather. Extreme space weather creates challenging conditions under which instrumentation and spacecraft may be impeded or in which parameters reach values that are outside the nominal observational range. This paper analyzes existing and upcoming observational capabilities for forecasting, and discusses how the findings may impact space weather research and its transition to operations. A single limitation to the assessment is lack of information provided to us on radiation monitor performance, which caused us not to fully assess (i.e., not assess short term) radiation storm forecasting. The assessment finds that at least two widely spaced coronagraphs including L4 would provide reliability for Earth-bound CMEs. Furthermore, all magnetic field measurements assessed fully meet requirements. However, with current or even with near term new assets in place, in the worst-case scenario there could be a near-complete lack of key near-real-time solar wind plasma data of severe disturbances heading toward and impacting Earth's magnetosphere. Models that attempt to simulate the effects of these disturbances in near real time or with archival data require solar wind plasma observations as input. Moreover, the study finds that near-future observational assets will be less capable of advancing the understanding of extreme geomagnetic disturbances at Earth, which might make the resulting space weather models unsuitable for transition to operations. Key Points: Manuscript assesses current and near-future space weather assets. Current assets unreliable for forecasting of severe geomagnetic storms. Near-future assets will not improve the situation. PMID:26213516

  7. A real-world, multi-site, observational study of infusion time and treatment satisfaction with rheumatoid arthritis patients treated with intravenous golimumab or infliximab.

    PubMed

    Daniel, Shoshana R; McDermott, John D; Le, Cathy; Pierce, Christine A; Ziskind, Michael A; Ellis, Lorie A

    2018-05-25

    To assess real-world infusion times for golimumab (GLM-IV) and infliximab (IFX) for rheumatoid arthritis (RA) patients and factors associated with treatment satisfaction. An observational study assessed infusion time including: clinic visit duration, RA medication preparation and infusion time, and infusion process time. Satisfaction was assessed by a modified Treatment Satisfaction Questionnaire for Medication (patient) and study-specific questionnaires (patient and clinic personnel). Comparative statistical testing for patient data utilized analysis of variance for continuous measures, and Fisher's exact or Chi-square test for categorical measures. Multivariate analysis was performed for the primary time endpoints and patient satisfaction. One hundred and fifty patients were enrolled from six US sites (72 GLM-IV, 78 IFX). The majority of patients were female (80.0%) and Caucasian (88.7%). GLM-IV required fewer vials per infusion (3.7) compared to IFX (4.9; p = .0001). Clinic visit duration (minutes) was shorter for GLM-IV (65.1) compared to IFX (153.1; p < .0001), as was total infusion time for RA medication (32.8 GLM-IV, 119.5 IFX; p < .0001) and infusion process times (45.8 GLM-IV, 134.1 IFX; p < .0001). Patients treated with GLM-IV reported higher satisfaction ratings with infusion time (p < .0001) and total visit time (p = .0003). Clinic personnel reported higher satisfaction with GLM-IV than IFX specific to medication preparation time, ease of mixing RA medication, frequency of patients requiring pre-medication, and infusion time. Findings may not be representative of care delivery for all RA infusion practices or RA patients. Shorter overall clinic visit duration, infusion process, and RA medication infusion times were observed for GLM-IV compared to IFX. A shorter duration in infusion time was associated with higher patient and clinic personnel satisfaction ratings.

  8. Low-cost high performance distributed data storage for multi-channel observations

    NASA Astrophysics Data System (ADS)

    Liu, Ying-bo; Wang, Feng; Deng, Hui; Ji, Kai-fan; Dai, Wei; Wei, Shou-lin; Liang, Bo; Zhang, Xiao-li

    2015-10-01

    The New Vacuum Solar Telescope (NVST) is a 1-m solar telescope that aims to observe the fine structures in both the photosphere and the chromosphere of the Sun. The observational data acquired simultaneously from one channel for the chromosphere and two channels for the photosphere pose great challenges to the data storage of NVST. The multi-channel instruments of NVST, including scientific cameras and multi-band spectrometers, generate at least 3 terabytes of data per day and require high access performance while storing massive short-exposure images. It is worth studying and implementing a storage system for NVST that balances data availability, access performance, and the cost of development. In this paper, we build a distributed data storage system (DDSS) for NVST and evaluate in depth the availability of real-time data storage on a distributed computing environment. The experimental results show that two factors, i.e., the number of concurrent reads/writes and the file size, are critically important for improving the performance of data access in a distributed environment. Referring to these two factors, three strategies for storing FITS files are presented and implemented to ensure the access performance of the DDSS under conditions of simultaneous multi-host writes and reads. Real applications of the DDSS prove that the system is capable of meeting the requirements of NVST real-time, high-performance observational data storage. Our study on the DDSS is the first attempt for modern astronomical telescope systems to store real-time observational data on a low-cost distributed system. The research results and corresponding techniques of the DDSS provide a new option for designing real-time massive astronomical data storage systems and will serve as a reference for future astronomical data storage.

  9. Improved workflow for quantification of left ventricular volumes and mass using free-breathing motion corrected cine imaging.

    PubMed

    Cross, Russell; Olivieri, Laura; O'Brien, Kendall; Kellman, Peter; Xue, Hui; Hansen, Michael

    2016-02-25

    Traditional cine imaging for cardiac functional assessment requires breath-holding, which can be problematic in some situations. Free-breathing techniques have relied on multiple averages or real-time imaging, producing images that can be spatially and/or temporally blurred. To overcome this, methods have been developed to acquire real-time images over multiple cardiac cycles, which are subsequently motion corrected and reformatted to yield a single image series displaying one cardiac cycle with high temporal and spatial resolution. Application of these algorithms has required significant additional reconstruction time. The use of distributed computing was recently proposed as a way to improve clinical workflow with such algorithms. In this study, we have deployed a distributed computing version of motion corrected re-binning reconstruction for free-breathing evaluation of cardiac function. Twenty-five patients and 25 volunteers underwent cardiovascular magnetic resonance (CMR) for evaluation of left ventricular end-systolic volume (ESV), end-diastolic volume (EDV), and end-diastolic mass. Measurements using motion corrected re-binning were compared to those using breath-held SSFP and free-breathing SSFP with multiple averages, and were performed by two independent observers. Pearson correlation coefficients and Bland-Altman plots tested agreement across techniques. Concordance correlation coefficients and Bland-Altman analysis tested inter-observer variability. Total scan plus reconstruction times were tested for significant differences using paired t-tests. Measured volumes and mass obtained by motion corrected re-binning and by averaged free-breathing SSFP compared favorably to those obtained by breath-held SSFP (r = 0.9863/0.9813 for EDV, 0.9550/0.9685 for ESV, 0.9952/0.9771 for mass). Inter-observer variability was good, with concordance correlation coefficients between observers across all acquisition types suggesting substantial agreement. Both motion corrected re-binning and averaged free-breathing SSFP acquisition and reconstruction times were shorter than those of the breath-held SSFP technique (p < 0.0001). On average, motion corrected re-binning required 3 min less than breath-held SSFP imaging, a 37% reduction in acquisition and reconstruction time. The motion corrected re-binning image reconstruction technique provides robust cardiac imaging that can be used for quantification, compares favorably to breath-held SSFP as well as multiple-average free-breathing SSFP, and can be obtained in a fraction of the time when using cloud-based distributed computing reconstruction.

  10. Mission requirements for a manned earth observatory. Task 2: Reference mission definition and analysis, volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The mission requirements and conceptual design of manned earth observatory payloads for the 1980 time period are discussed. Projections of 1980 sensor technology and user data requirements were used to formulate typical basic criteria pertaining to experiments, sensor complements, and reference missions. The subjects discussed are: (1) mission selection and prioritization, (2) baseline mission analysis, (3) earth observation data handling and contingency plans, and (4) analysis of low cost mission definition and rationale.

  11. Sleep Disturbance and the Change from White to Red Lighting at Night on Old Age Psychiatry Wards: A Quality Improvement Project.

    PubMed

    Martin, David; Hurlbert, Anya; Cousins, David Andrew

    2018-06-01

    Psychiatric inpatient units often maintain a degree of lighting at night to facilitate the observation of patients, but this has the potential to disrupt sleep. Certain wavelengths of light may be less likely to disturb sleep and if such lighting permitted adequate observations, patient wellbeing may be improved. This study explored the effects of changing night-lights from broad-band white to narrow-band red on the amount of sleep observed, 'as required' medication administered and number of falls, in an old age psychiatry inpatient setting. Qualitative data was also gathered with a staff questionnaire. We hypothesised that compared to the use of white lights, red lights would be associated with a greater amount of recorded sleep, lesser use of 'as required' medication and no increase in the number of falls (reflecting comparable safety). Whilst there were no significant differences in quantitative measures recorded, there were more observations of sleep during the red light period than the white light period (14.1 versus 13.9 times per night) (U=627.5, z=-0.69, p=0.49) and fewer 'as required' medication administrations during the red light period compared to the white light period (3.3 versus 4.8 times per night) (U=640.0, z=0.56, p=0.57). Qualitatively, the staff of the organic assessment unit reported that patients were sleeping better and less agitated at night. Larger and more in-depth studies are required to examine the full effectiveness of using safe, sleep-enhancing lighting on wards at night. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Filtering observations without the initial guess

    NASA Astrophysics Data System (ADS)

    Chin, T. M.; Abbondanza, C.; Gross, R. S.; Heflin, M. B.; Parker, J. W.; Soja, B.; Wu, X.

    2017-12-01

    Noisy geophysical observations sampled irregularly over space and time are often numerically "analyzed" or "filtered" before scientific usage. The standard analysis and filtering techniques based on the Bayesian principle require an "a priori" joint distribution of all the geophysical parameters of interest. However, such prior distributions are seldom fully known in practice, and best-guess mean values (e.g., "climatology" or "background" data if available) accompanied by arbitrarily set covariance values are often used instead. It is therefore desirable to be able to exploit efficient (time-sequential) Bayesian algorithms like the Kalman filter without being forced to provide a prior distribution (i.e., an initial mean and covariance). An example of this is the estimation of the terrestrial reference frame (TRF), where the requirement for numerical precision is such that any use of a priori constraints on the observation data needs to be minimized. We will present the Information Filter algorithm, a variant of the Kalman filter that does not require an initial distribution, and apply the algorithm (and an accompanying smoothing algorithm) to the TRF estimation problem. We show that the information filter allows temporal propagation of partial information on the distribution (the marginal distribution of a transformed version of the state vector) instead of the full distribution (mean and covariance) required by the standard Kalman filter. The information filter appears to be a natural choice for the task of filtering observational data in general cases where a prior assumption on the initial estimate is not available and/or desirable. For application to data assimilation problems, reduced-order approximations of both the information filter and the square-root information filter (SRIF) have been published, and the former has previously been applied to a regional configuration of the HYCOM ocean general circulation model. Such approximation approaches are also covered briefly in the presentation.
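
    The information filter propagates the information matrix Y = P⁻¹ and information vector y = Y·x̂, so estimation can start from Y = 0 (no prior information) rather than from an initial mean and covariance. The sketch below shows the measurement update for a generic linear-Gaussian model; it is not the TRF-specific implementation.

        import numpy as np

        def information_update(Y, y, H, R, z):
            """Measurement update in information form:
            Y <- Y + H^T R^-1 H,   y <- y + H^T R^-1 z.
            Works even when Y is singular (i.e., no prior information)."""
            Rinv = np.linalg.inv(R)
            return Y + H.T @ Rinv @ H, y + H.T @ Rinv @ z

        # Two observations of a 2-D state, starting from zero prior information.
        Y = np.zeros((2, 2))
        y = np.zeros(2)
        H1, z1 = np.array([[1.0, 0.0]]), np.array([2.0])     # observe the first component
        H2, z2 = np.array([[1.0, 1.0]]), np.array([5.0])     # observe the sum of components
        R = np.array([[0.1]])                                 # measurement noise variance

        Y, y = information_update(Y, y, H1, R, z1)
        Y, y = information_update(Y, y, H2, R, z2)

        # Once Y is invertible, recover the usual state estimate and covariance.
        P = np.linalg.inv(Y)
        x_hat = P @ y
        print(x_hat)          # ~[2, 3]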

  13. Use of ebRIM-based CSW with sensor observation services for registry and discovery of remote-sensing observations

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya; Wei, Yaxing

    2009-02-01

    Recent advances in Sensor Web geospatial data capture, such as high-resolution satellite imagery and Web-ready data processing and modeling technologies, have led to the generation of large numbers of datasets from real-time or near real-time observations and measurements. Finding which sensor or dataset complies with criteria such as specific times, locations, and scales has become a bottleneck for Sensor Web-based applications, especially for remote-sensing observations. In this paper, an architecture for integrating the Sensor Observation Service (SOS) with the Open Geospatial Consortium (OGC) Catalogue Service-Web profile (CSW) is put forward. The architecture consists of a distributed geospatial sensor observation service, a geospatial catalogue service based on the ebXML Registry Information Model (ebRIM), SOS search and registry middleware, and a geospatial sensor portal. The SOS search and registry middleware finds the potential SOS, generates data granule information, and inserts the records into the CSW. The contents and sequence of the services, the available observations, and the metadata of the observations registry are described. A prototype system is designed and implemented using the service middleware technology and a standard interface and protocol. The feasibility and the response time of registry and retrieval of observations are evaluated using a realistic Earth Observing-1 (EO-1) SOS scenario. Extracting information from the SOS requires the same execution time as record generation for the CSW. The average data retrieval response time in SOS+CSW mode is 17.6% of that of the SOS-alone mode. The proposed architecture offers greater advantages in SOS search and observation data retrieval than existing Sensor Web-enabled systems.

  14. Degree Day Requirements for Kudzu Bug (Hemiptera: Plataspidae), a Pest of Soybeans.

    PubMed

    Grant, Jessica I; Lamp, William O

    2018-04-02

    Understanding the phenology of a new potential pest is fundamental for the development of a management program. Megacopta cribraria Fabricius (Hemiptera: Plataspidae), the kudzu bug, is a pest of soybeans first detected in the United States in 2009 and in Maryland in 2013. We observed the phenology of kudzu bug life stages in Maryland, created a Celsius degree-day (CDD) model for development, and characterized the difference between microhabitat and ambient temperatures for both kudzu, Pueraria montana (Lour.) Merr. (Fabales: Fabaceae), and soybeans, Glycine max (L.) Merrill (Fabales: Fabaceae). In 2014, low population numbers yielded limited resolution from field phenology observations. We observed kudzu bug populations persisting within Maryland, but between 2013 and 2016, populations were low compared to populations in the southeastern United States. Based on the degree-day model, kudzu bug eggs require 80 CDD above a minimum temperature of 14°C to hatch. Nymphs require 545 CDD above a minimum temperature of 16°C for development. The CDD model matches field observations when factoring in a biofix date of April 1 and a minimum preoviposition period of 17 d. The model suggests two full generations per year in Maryland. Standard air temperature monitors do not affect model predictions for pest management, as microhabitat temperature differences did not show a clear trend between kudzu and soybeans. Ultimately, producers can predict the timing of kudzu bug life stages with the CDD model for use in timing management plans in soybean fields.
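
    The reported thresholds (80 CDD above 14°C for egg hatch, 545 CDD above 16°C for nymphal development) can be applied with a simple average-temperature degree-day accumulation from the April 1 biofix, as sketched below; the temperature series is hypothetical, each stage's requirement is accumulated independently here for simplicity, and other degree-day formulas (e.g., single sine) would differ in detail.

        def daily_cdd(t_min, t_max, base):
            """Average-method Celsius degree-days for one day."""
            return max(0.0, (t_min + t_max) / 2.0 - base)

        def days_to_requirement(daily_min_max, base, required_cdd):
            """Days after the biofix until accumulated CDD meets the requirement."""
            total = 0.0
            for day, (t_min, t_max) in enumerate(daily_min_max, start=1):
                total += daily_cdd(t_min, t_max, base)
                if total >= required_cdd:
                    return day
            return None   # requirement not reached within the record

        # Hypothetical daily min/max temperatures (degC) starting at the April 1 biofix.
        temps = [(10 + 0.08 * d, 20 + 0.08 * d) for d in range(200)]

        print("egg hatch (80 CDD > 14C):", days_to_requirement(temps, 14.0, 80.0), "days after biofix")
        print("nymph development (545 CDD > 16C):", days_to_requirement(temps, 16.0, 545.0), "days after biofix")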

  15. Matrix tablets for sustained release of repaglinide: Preparation, pharmacokinetics and hypoglycemic activity in beagle dogs.

    PubMed

    He, Wei; Wu, Mengmeng; Huang, Shiqing; Yin, Lifang

    2015-01-15

    Repaglinide (RG) is an efficient antihyperglycemic drug; however, due to its short half-life, patients are required to take the marketed products several times a day, which compromises the therapeutic effect. The present study was conducted to develop a hydrophilic sustained-release matrix tablet for RG with the aims of prolonging its action time, reducing the required number of administrations and the side effects, and improving patient adherence. The matrix tablets were fabricated by a direct compression method, and the optimized formulation was obtained by screening the factors that affected drug release. Moreover, studies of the pharmacokinetics and of the hypoglycemic activity, as measured by glucose assay kits, were performed in dogs. A sustained drug release profile over 10 h and a reduced influence of medium pH on release were achieved with the optimized formulation; moreover, the in vivo performance of the extended-release formulation was also examined, and better absorption, a one-fold decrease in Cmax, a two-fold increase in Tmax, and a prolonged hypoglycemic effect compared to the marketed product were observed. In conclusion, sustained RG release and prolonged action were observed with the present matrix tablets, which therefore provide a promising formulation for T2D patients who require long-term treatment. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved space-times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.
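    Purely as an illustration of the finite-propagation-velocity structure described above, the sketch below simulates a one-dimensional Poisson-Kac (telegraph) process whose amplitude and switching rate depend on the current state; the specific nonlinear amplitude modulation and time-horizon function analyzed in the paper are not reproduced, and the modulation functions used here are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_modulated_poisson_kac(b, a, x0=0.0, dt=1e-3, n_steps=100_000):
          # sigma is the dichotomous (+1/-1) Poisson-Kac noise; it flips at a
          # state-dependent rate a(x), and the propagation velocity magnitude is b(x).
          x, sigma = x0, 1
          xs = np.empty(n_steps)
          for k in range(n_steps):
              if rng.random() < a(x) * dt:
                  sigma = -sigma
              x += b(x) * sigma * dt
              xs[k] = x
          return xs

      # Hypothetical modulations: amplitude decaying away from the origin,
      # switching rate growing with |x|.
      traj = simulate_modulated_poisson_kac(b=lambda x: 1.0 / (1.0 + x**2),
                                            a=lambda x: 1.0 + abs(x))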

  17. The Future of Operational Space Weather Observations

    NASA Astrophysics Data System (ADS)

    Berger, T. E.

    2015-12-01

    We review the current state of operational space weather observations, the requirements for new or evolved space weather forecasting capabilities, and the relevant sections of the new national strategy for space weather developed by the Space Weather Operations, Research, and Mitigation (SWORM) Task Force chartered by the Office of Science and Technology Policy of the White House. Based on this foundation, we discuss future space missions such as the NOAA space weather mission to the L1 Lagrangian point planned for the 2021 time frame and its synergy with an L5 mission planned for the same period; the space weather capabilities of the upcoming GOES-R mission, as well as GOES-Next possibilities; and the upcoming COSMIC-2 mission for ionospheric observations. We also discuss the need for ground-based operational networks to supply mission-critical and/or backup space weather observations, including the NSF GONG solar optical observing network, the USAF SEON solar radio observing network, the USGS real-time magnetometer network, the USCG CORS network of GPS receivers, and the possibility of operationalizing the world-wide network of neutron monitors for real-time alerts of ground-level radiation events.

  18. Sentinel-3 coverage-driven mission design: Coupling of orbit selection and instrument design

    NASA Astrophysics Data System (ADS)

    Cornara, S.; Pirondini, F.; Palmade, J. L.

    2017-11-01

    The first satellite of the Sentinel-3 series was launched in February 2016. The Sentinel-3 payload suite encompasses the Ocean and Land Colour Instrument (OLCI) with a swath of 1270 km, the Sea and Land Surface Temperature Radiometer (SLSTR) yielding a dual-view scan with swaths of 1420 km (nadir) and 750 km (oblique view), the Synthetic Aperture Radar Altimeter (SRAL) working in Ku-band and C-band, and the dual-frequency Microwave Radiometer (MWR). In the early stages of mission and system design, the main driver for the Sentinel-3 reference orbit selection was the requirement to achieve a revisit time of two days or less globally over ocean areas with two satellites (i.e. 4-day global coverage with one satellite). The orbit selection was seamlessly coupled with the OLCI instrument design in terms of field of view (FoV) definition driven by the observation zenith angle (OZA) and sunglint constraints applied to ocean observations. The criticality of the global coverage requirement for ocean monitoring derives from the sunglint phenomenon, i.e. the impact on visible channels of the solar ray reflection on the water surface. This constraint was finally overcome thanks to the concurrent optimisation of the orbit parameters, notably the Local Time at Descending Node (LTDN), and the OLCI instrument FoV definition. The orbit selection process started with the identification of orbits with short repeat cycle (2-4 days), firstly to minimise the time required to achieve global coverage with existing constraints, and then to minimise the swath required to obtain global coverage and the maximum required OZA. This step yielded the selection of a 4-day repeat cycle orbit, thus allowing 2-day coverage with two adequately spaced satellites. Suitable candidate orbits with longer repeat cycles were then identified in the proximity of the selected altitudes, and the reference orbit was ultimately chosen. The rationale was to keep the swath for global coverage as close as possible to the previous optimum value, but to tailor the repeat cycle length (i.e. the ground-track grid) to optimise the topography mission performance. The final choice converged on the sun-synchronous orbit 14 + 7/27, reference altitude ∼800 km, LTDN = 10h00. Extensive coverage analyses were carried out to characterise the mission performance and the fulfilment of the requirements, encompassing revisit time, number of acquisitions, observation viewing geometry and swath properties. This paper presents a comprehensive overview of the Sentinel-3 orbit selection, starting from coverage requirements and highlighting the close interaction with the instrument design activity.
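    A back-of-the-envelope sketch (not the actual mission analysis) of the geometry behind the selected 14 + 7/27 repeat orbit: it computes the equatorial ground-track spacing after the full 27-day cycle and the coarse swath needed for 4-day single-satellite equatorial coverage, assuming evenly interleaved tracks. A real coverage study must also account for the sub-cycle structure, the OZA limit and sunglint masking.

      EARTH_EQUATOR_KM = 40075.0

      orbits_per_day = 14 + 7 / 27            # the "14 + 7/27" repeat pattern
      repeat_days = 27
      orbits_per_cycle = round(orbits_per_day * repeat_days)   # 385 revolutions

      # Equatorial spacing of ground tracks after the complete repeat cycle.
      spacing_cycle_km = EARTH_EQUATOR_KM / orbits_per_cycle

      # Coarse swath needed for single-satellite global coverage in 4 days,
      # optimistically assuming evenly interleaved tracks over that interval.
      tracks_4d = orbits_per_day * 4
      spacing_4d_km = EARTH_EQUATOR_KM / tracks_4d

      print(f"{orbits_per_cycle} orbits per cycle, "
            f"{spacing_cycle_km:.0f} km track spacing after 27 days, "
            f"~{spacing_4d_km:.0f} km swath for 4-day equatorial coverage")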

  19. An Object-Oriented Graphical User Interface for a Reusable Rocket Engine Intelligent Control System

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Musgrave, Jeffrey L.; Guo, Ten-Huei; Paxson, Daniel E.; Wong, Edmond; Saus, Joseph R.; Merrill, Walter C.

    1994-01-01

    An intelligent control system for reusable rocket engines under development at NASA Lewis Research Center requires a graphical user interface to allow observation of the closed-loop system in operation. The simulation testbed consists of a real-time engine simulation computer, a controls computer, and several auxiliary computers for diagnostics and coordination. The system is set up so that the simulation computer could be replaced by the real engine and the change would be transparent to the control system. Because of the hard real-time requirement of the control computer, putting a graphical user interface on it was not an option. Thus, a separate computer used strictly for the graphical user interface was warranted. An object-oriented LISP-based graphical user interface has been developed on a Texas Instruments Explorer 2+ to indicate the condition of the engine to the observer through plots, animation, interactive graphics, and text.

  20. Time-response shaping using output to input saturation transformation

    NASA Astrophysics Data System (ADS)

    Chambon, E.; Burlion, L.; Apkarian, P.

    2018-03-01

    For linear systems, the control law design is often performed so that the resulting closed loop meets specific frequency-domain requirements. However, in many cases, it may be observed that the obtained controller does not enforce time-domain requirements, among which is the objective of keeping a scalar output variable in a given interval. In this article, a transformation is proposed to convert prescribed bounds on an output variable into time-varying saturations on the synthesised linear scalar control law. This transformation uses well-chosen time-varying coefficients so that the resulting time-varying saturation bounds do not overlap in the presence of disturbances. Using an anti-windup approach, it is shown that the origin of the resulting closed loop is globally asymptotically stable and that the constrained output variable satisfies the time-domain constraints in the presence of an unknown finite-energy-bounded disturbance. An application to a linear ball and beam model is presented.

  1. Satellite orbit and data sampling requirements

    NASA Technical Reports Server (NTRS)

    Rossow, William

    1993-01-01

    Climate forcings and feedbacks vary over a wide range of time and space scales. The operation of non-linear feedbacks can couple variations at widely separated time and space scales and cause climatological phenomena to be intermittent. Consequently, monitoring of global, decadal changes in climate requires global observations that cover the whole range of space-time scales and are continuous over several decades. The sampling of smaller space-time scales must have sufficient statistical accuracy to measure the small changes in the forcings and feedbacks anticipated in the next few decades, while continuity of measurements is crucial for unambiguous interpretation of climate change. Shorter records of monthly and regional (500-1000 km) measurements with similar accuracies can also provide valuable information about climate processes, when 'natural experiments' such as large volcanic eruptions or El Ninos occur. In this section existing satellite datasets and climate model simulations are used to test the satellite orbits and sampling required to achieve accurate measurements of changes in forcings and feedbacks at monthly frequency and 1000 km (regional) scale.

  2. Optimization of Advanced ACTPol Transition Edge Sensor Bolometer Operation Using R(T,I) Transition Measurements

    NASA Astrophysics Data System (ADS)

    Salatino, Maria

    2017-06-01

    In the current submm and mm cosmology experiments the focal planes are populated by kilopixel transition edge sensors (TESes). Varying incoming power load requires frequent rebiasing of the TESes through standard current-voltage (IV) acquisition. The time required to perform IVs on such large arrays and the resulting transient heating of the bath reduces the sky observation time. We explore a bias step method that significantly reduces the time required for the rebiasing process. This exploits the detectors' responses to the injection of a small square wave signal on top of the dc bias current and knowledge of the shape of the detector transition R(T,I). This method has been tested on two detector arrays of the Atacama Cosmology Telescope (ACT). In this paper, we focus on the first step of the method, the estimate of the TES %Rn.

  3. Integrated approach to estimate the ocean's time variable dynamic topography including its covariance matrix

    NASA Astrophysics Data System (ADS)

    Müller, Silvia; Brockmann, Jan Martin; Schuh, Wolf-Dieter

    2015-04-01

    The ocean's dynamic topography as the difference between the sea surface and the geoid reflects many characteristics of the general ocean circulation. Consequently, it provides valuable information for evaluating or tuning ocean circulation models. The sea surface is directly observed by satellite radar altimetry while the geoid cannot be observed directly. The satellite-based gravity field determination requires different measurement principles (satellite-to-satellite tracking (e.g. GRACE), satellite-gravity-gradiometry (GOCE)). In addition, hydrographic measurements (salinity, temperature and pressure; near-surface velocities) provide information on the dynamic topography. The observation types have different representations and spatial as well as temporal resolutions. Therefore, the determination of the dynamic topography is not straightforward. Furthermore, the integration of the dynamic topography into ocean circulation models requires not only the dynamic topography itself but also its inverse covariance matrix on the ocean model grid. We developed a rigorous combination method in which the dynamic topography is parameterized in space as well as in time. The altimetric sea surface heights are expressed as a sum of geoid heights represented in terms of spherical harmonics and the dynamic topography parameterized by a finite element method which can be directly related to the particular ocean model grid. Besides the difficult task of combining altimetry data with a gravity field model, a major aspect is the consistent combination of satellite data and in-situ observations. The particular characteristics and the signal content of the different observations must be adequately considered requiring the introduction of auxiliary parameters. Within our model the individual observation groups are combined in terms of normal equations considering their full covariance information; i.e. a rigorous variance/covariance propagation from the original measurements to the final product is accomplished. In conclusion, the developed integrated approach allows for estimating the dynamic topography and its inverse covariance matrix on arbitrary grids in space and time. The inverse covariance matrix contains the appropriate weights for model-data misfits in least-squares ocean model inversions. The focus of this study is on the North Atlantic Ocean. We will present the conceptual design and dynamic topography estimates based on time variable data from seven satellite altimeter missions (Jason-1, Jason-2, Topex/Poseidon, Envisat, ERS-2, GFO, Cryosat2) in combination with the latest GOCE gravity field model and in-situ data from the Argo floats and near-surface drifting buoys.
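    A schematic sketch of the normal-equation combination step described above, assuming each observation group contributes a design matrix A, an observation vector y, and a full covariance matrix C (all names illustrative); the combined normal matrix is itself the inverse covariance of the estimated parameters.

      import numpy as np

      def combine_groups(groups):
          # groups: list of (A, y, C) tuples, one per observation group
          # (e.g. altimetry, gravity field, in-situ data).
          n_params = groups[0][0].shape[1]
          N = np.zeros((n_params, n_params))   # combined normal matrix
          b = np.zeros(n_params)               # combined right-hand side
          for A, y, C in groups:
              W = np.linalg.inv(C)             # full weight matrix from the group covariance
              N += A.T @ W @ A
              b += A.T @ W @ y
          x_hat = np.linalg.solve(N, b)        # estimated parameters (e.g. DT coefficients)
          cov_x = np.linalg.inv(N)             # their covariance; N is the inverse covariance
          return x_hat, cov_x, N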

  4. The influence of asymmetric force requirements on a multi-frequency bimanual coordination task.

    PubMed

    Kennedy, Deanna M; Rhee, Joohyun; Jimenez, Judith; Shea, Charles H

    2017-01-01

    An experiment was designed to determine the impact of the force requirements on the production of bimanual 1:2 coordination patterns requiring the same (symmetric) or different (asymmetric) forces when Lissajous displays and goal templates are provided. The Lissajous displays have been shown to minimize the influence of attentional and perceptual constraints, allowing constraints related to neural crosstalk to be more clearly observed. Participants (N=20) were randomly assigned to a force condition in which the left or right limb was required to produce more force than the contralateral limb. In each condition participants were required to rhythmically coordinate the pattern of isometric forces in a 1:2 coordination pattern. Participants performed 13 practice trials and 1 test trial per force level. The results indicated that participants were able to effectively coordinate the 1:2 multi-frequency goal patterns under both symmetric and asymmetric force requirements. However, consistent distortions in the force and force velocity time series were observed for one limb that appeared to be associated with the production of force in the contralateral limb. Distortions in the force produced by the left limb occurred regardless of the force requirements of the task (symmetric, asymmetric) or whether the left or right limb had to produce more force than the contralateral limb. However, distinct distortions in the right limb occurred only when the left limb was required to produce 5 times more force than the right limb. These results are consistent with the notion that neural crosstalk can influence both limbs, but may manifest differently for each limb depending on the force requirements of the task. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Impact of Institutional Review Board Practice Variation on Observational Health Services Research

    PubMed Central

    Green, Lee A; Lowery, Julie C; Kowalski, Christine P; Wyszewianski, Leon

    2006-01-01

    Objective To describe, qualitatively and quantitatively, the impact of a review by multiple institutional review boards (IRBs) on the conduct of a multisite observational health services research study. Data Source and Setting Primary data collection during 2002, 2003, and 2004 at 43 United States Department of Veterans Affairs (VA) primary care clinics. Design Explanatory sequential mixed methods design incorporating qualitative and quantitative elements in sequence. Data Collection and Abstraction Methods Field notes and documents collected by research staff during a multisite observational health services research study were used in thematic analysis. Themes were quantified descriptively and merged with timeline data. Principal Findings Approximately 4,680 hours of staff time over a 19-month period were devoted solely to the IRB process. Four categories of phenomena impacting research were observed. (1) Recruitment, retention, and communication issues with local site principal investigators (PIs): local PIs had no real role but were required by IRBs. Twenty-one percent of sites experienced turnover in local PIs, and local PI issues added significant delay to most sites. (2) Wide variation in standards applied to review and approval of IRB applications: the study was designed to be qualified under U.S. government regulations for expedited review. One site exempted it from review (although it did not qualify for exemption), 10 granted expedited review, 31 required full review, and one rejected it as being too risky to be permitted. Twenty-three required inapplicable sections in the consent form and five required HIPAA (Health Insurance Portability and Accountability Act of 1996) consent from physicians although no health information was asked of them. Twelve sites requested, and two insisted upon, provisions that directly increased the risk to participants. (3) Multiple returns for revision of IRB applications, consent documents, and ancillary forms: seventy-six percent of sites required at least one resubmission, and 15 percent of sites required three or more (up to six) resubmissions. Only 12 percent of sites required any procedural or substantive revision; most resubmissions were editorial changes to the wording of the consent document. (4) Process failures (long turnaround times, lost paperwork, difficulty in obtaining necessary forms, unavailability of key personnel at IRBs): the process required from 52 to 798 (median 286) days to obtain approval at each site. Conclusions Several features of the IRB system as currently configured impose costly burdens of administrative activity and delay on observational health services research studies, and paradoxically decrease protection of human subjects. Central review with local opt-out, cooperative review, or a system of peer review could reduce costs and improve protection of human subjects. PMID:16430608

  6. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

    Because conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method based on robust Kalman filtering is developed. The ARMA model parameters are employed as the state variables. Unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy; thus, the required sample size is reduced. It can be applied to gyro random noise modeling applications in which a fast and accurate ARMA modeling method is required. PMID:26437409
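    A minimal sketch of the idea (not the authors' exact robust filter): the ARMA(p,q) coefficients are treated as the Kalman state, the regressor is built from past outputs and past innovations (approximated by the filter residuals), and the robust, time-varying observation-noise estimation of the paper is replaced here by a fixed R.

      import numpy as np

      def arma_kalman(y, p=2, q=1, R=1e-2, Q=1e-8):
          y = np.asarray(y, dtype=float)
          n_theta = p + q
          theta = np.zeros(n_theta)            # [AR coefficients..., MA coefficients...]
          P = np.eye(n_theta) * 1e2            # parameter covariance
          residuals = np.zeros(len(y))
          for k in range(max(p, q), len(y)):
              # Regressor: past outputs and past innovations (pseudo-linear form).
              phi = np.concatenate([y[k - p:k][::-1], residuals[k - q:k][::-1]])
              e = y[k] - phi @ theta            # innovation
              S = phi @ P @ phi + R             # innovation variance
              K = P @ phi / S                   # Kalman gain
              theta = theta + K * e
              P = (np.eye(n_theta) - np.outer(K, phi)) @ P + Q * np.eye(n_theta)
              residuals[k] = e
          return theta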

  7. Scan path entropy and arrow plots: capturing scanning behavior of multiple observers

    PubMed Central

    Hooge, Ignace; Camps, Guido

    2013-01-01

    Designers of visual communication material want their material to attract and retain attention. In marketing research, heat maps, dwell time, and time to AOI first hit are often used as evaluation parameters. Here we present two additional measures: (1) “scan path entropy” to quantify gaze guidance and (2) the “arrow plot” to visualize the average scan path. Both are based on string representations of scan paths. The latter also incorporates transition matrices and the time required for 50% of the observers to first hit AOIs (T50). The new measures were tested in an eye tracking study (48 observers, 39 advertisements). Scan path entropy is a sensible measure for gaze guidance, and the new visualization method reveals aspects of the average scan path and gives a better indication of the order in which global scanning takes place. PMID:24399993
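    One common way to compute a scan path entropy from the string representation of a scan path (a sequence of AOI labels) is the Shannon entropy of its first-order transition distribution, as sketched below; the paper's exact definition and normalisation may differ.

      import numpy as np

      def scan_path_entropy(scan_path):
          # scan_path: e.g. "AABCCBA", one character per fixated AOI.
          aois = sorted(set(scan_path))
          idx = {a: i for i, a in enumerate(aois)}
          T = np.zeros((len(aois), len(aois)))
          for a, b in zip(scan_path[:-1], scan_path[1:]):
              T[idx[a], idx[b]] += 1           # count AOI-to-AOI transitions
          T /= T.sum()                         # joint transition probabilities
          p = T[T > 0]
          return float(-(p * np.log2(p)).sum())   # Shannon entropy in bits

      print(scan_path_entropy("AABCCBAABC"))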

  8. Application of modern control theory to the design of optimum aircraft controllers

    NASA Technical Reports Server (NTRS)

    Power, L. J.

    1973-01-01

    The synthesis procedure presented is based on the solution of the output regulator problem of linear optimal control theory for time-invariant systems. By this technique, solution of the matrix Riccati equation leads to a constant linear feedback control law for an output regulator which will maintain a plant in a particular equilibrium condition in the presence of impulse disturbances. Two simple algorithms are presented that can be used in an automatic synthesis procedure for the design of maneuverable output regulators requiring only selected state variables for feedback. The first algorithm is for the construction of optimal feedforward control laws that can be superimposed upon a Kalman output regulator and that will drive the output of a plant to a desired constant value on command. The second algorithm is for the construction of optimal Luenberger observers that can be used to obtain feedback control laws for the output regulator requiring measurement of only part of the state vector. This algorithm constructs observers which have minimum response time under the constraint that the magnitude of the gains in the observer filter be less than some arbitrary limit.
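    A minimal sketch of the constant-gain output-regulator core for a time-invariant plant, using the continuous-time algebraic Riccati equation; the feedforward-law and reduced-order Luenberger-observer constructions described above are not reproduced here, and the double-integrator plant is only an illustrative example.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      def lqr_gain(A, B, Q, R):
          # Solve the matrix Riccati equation and form the constant feedback gain
          # K = R^-1 B^T P, so that u = -K x regulates the plant.
          P = solve_continuous_are(A, B, Q, R)
          return np.linalg.solve(R, B.T @ P)

      # Illustrative plant: a double integrator.
      A = np.array([[0.0, 1.0],
                    [0.0, 0.0]])
      B = np.array([[0.0],
                    [1.0]])
      K = lqr_gain(A, B, Q=np.eye(2), R=np.array([[1.0]]))
      print(K)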

  9. Red-light running violation prediction using observational and simulator data.

    PubMed

    Jahangiri, Arash; Rakha, Hesham; Dingus, Thomas A

    2016-11-01

    In the United States, 683 people were killed and an estimated 133,000 were injured in crashes due to running red lights in 2012. To help prevent/mitigate crashes caused by running red lights, these violations need to be identified before they occur, so both the road users (i.e., drivers, pedestrians, etc.) in potential danger and the infrastructure can be notified and actions can be taken accordingly. Two different data sets were used to assess the feasibility of developing red-light running (RLR) violation prediction models: (1) observational data and (2) driver simulator data. Both data sets included common factors, such as time to intersection (TTI), distance to intersection (DTI), and velocity at the onset of the yellow indication. However, the observational data set provided additional factors that the simulator data set did not, and vice versa. The observational data included vehicle information (e.g., speed, acceleration, etc.) for several different time frames. For each vehicle approaching an intersection in the observational data set, required data were extracted from several time frames as the vehicle drew closer to the intersection. However, since the observational data were inherently anonymous, driver factors such as age and gender were unavailable in the observational data set. Conversely, the simulator data set contained age and gender. In addition, the simulator data included a secondary (non-driving) task factor and a treatment factor (i.e., incoming/outgoing calls while driving). The simulator data only included vehicle information for certain time frames (e.g., yellow onset); the data did not provide vehicle information for several different time frames while vehicles were approaching an intersection. In this study, the random forest (RF) machine-learning technique was adopted to develop RLR violation prediction models. Factor importance was obtained for different models and different data sets to show how differently the factors influence the performance of each model. A sensitivity analysis showed that the factor importance to identify RLR violations changed when data from different time frames were used to develop the prediction models. TTI, DTI, the required deceleration parameter (RDP), and velocity at the onset of a yellow indication were among the most important factors identified by both models constructed using observational data and simulator data. Furthermore, in addition to the factors obtained from a point in time (i.e., yellow onset), valuable information suitable for RLR violation prediction was obtained from defined monitoring periods. It was found that period lengths of 2-6m contributed to the best model performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
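    A minimal sketch of the model-building step with the random forest technique named above, assuming a tabular dataset containing the factors discussed (TTI, DTI, velocity at yellow onset, RDP) and a binary violation label; the file name and column names are illustrative, not taken from the paper.

      import pandas as pd
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      df = pd.read_csv("rlr_observations.csv")        # hypothetical extracted dataset
      features = ["TTI", "DTI", "velocity_at_yellow", "RDP"]

      X_train, X_test, y_train, y_test = train_test_split(
          df[features], df["violation"], test_size=0.3,
          random_state=0, stratify=df["violation"])

      rf = RandomForestClassifier(n_estimators=500, random_state=0)
      rf.fit(X_train, y_train)

      print("test accuracy:", rf.score(X_test, y_test))
      print("factor importance:", dict(zip(features, rf.feature_importances_)))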

  10. Direction dependence of displacement time for two-fluid electroosmotic flow.

    PubMed

    Lim, Chun Yee; Lam, Yee Cheong

    2012-03-01

    Electroosmotic flow that involves one fluid displacing another fluid is commonly encountered in various microfluidic applications and experiments, for example, the current monitoring technique used to determine the zeta potential of a microchannel. There is an experimentally observed anomaly in such flow, namely, that the displacement time is flow-direction dependent, i.e., it depends on whether a high concentration fluid is displacing a low concentration fluid, or vice versa. Thus, this investigation focuses on the displacement flow of two fluids with various concentration differences. The displacement time was determined experimentally with the current monitoring method. It is concluded that the time required for a high concentration solution to displace a low concentration solution is smaller than the time required for a low concentration solution to displace a high concentration solution. The percentage displacement time difference increases with increasing concentration difference and is independent of the length or width of the channel and the voltage applied. Hitherto, no theoretical analysis or numerical simulation has been conducted to explain this phenomenon. A numerical model based on the finite element method was developed to explain the experimental observations. Simulations showed that the velocity profile and ion distribution deviate significantly from those of a single-fluid electroosmotic flow. The distortion of the ion distribution near the electrical double layer is responsible for the displacement time difference between the two flow directions. The trends obtained from simulations agree with the experimental findings.

  11. Direction dependence of displacement time for two-fluid electroosmotic flow

    PubMed Central

    Lim, Chun Yee; Lam, Yee Cheong

    2012-01-01

    Electroosmotic flow that involves one fluid displacing another fluid is commonly encountered in various microfluidic applications and experiments, for example, the current monitoring technique used to determine the zeta potential of a microchannel. There is an experimentally observed anomaly in such flow, namely, that the displacement time is flow-direction dependent, i.e., it depends on whether a high concentration fluid is displacing a low concentration fluid, or vice versa. Thus, this investigation focuses on the displacement flow of two fluids with various concentration differences. The displacement time was determined experimentally with the current monitoring method. It is concluded that the time required for a high concentration solution to displace a low concentration solution is smaller than the time required for a low concentration solution to displace a high concentration solution. The percentage displacement time difference increases with increasing concentration difference and is independent of the length or width of the channel and the voltage applied. Hitherto, no theoretical analysis or numerical simulation has been conducted to explain this phenomenon. A numerical model based on the finite element method was developed to explain the experimental observations. Simulations showed that the velocity profile and ion distribution deviate significantly from those of a single-fluid electroosmotic flow. The distortion of the ion distribution near the electrical double layer is responsible for the displacement time difference between the two flow directions. The trends obtained from simulations agree with the experimental findings. PMID:22662083

  12. Equations of motion for the gravitational two-body problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, C.K.

    1988-01-01

    This paper reinvestigates the well-known gravitational two-body problem, in light of new information concerning the electrodynamic version of the problem. The well-known Lienard-Wiechert potentials, and the fields derived from them, are suspected to be time-shifted, anticipating the true potentials and fields by the time required for signal propagation from the source to the observer. This time shift is significant because it implies field directions different to first order in v/c. In the gravitational problem, the resulting observer accelerations become correlated with retarded source positions, rather than with present, unretarded source positions as was previously believed. This means there exist previously unrecognized first-order effects in gravitational systems.

  13. Million Degree Plasmas in Extreme Ultraviolet (EUV) Astrophysics. White Paper in Response to Astro2010 Science Call

    DTIC Science & Technology

    2010-01-01

    photometry, timing measurements of suitable cadence, and advanced theory are the keys to understanding the physics of million degree plasmas in... Disentangling these components requires time- and phase-resolved spectroscopic observations of a sample that spans a range of mass accretion rates... many narrow lines, or a continuum with strong, broad absorption features. Key Objective: Obtaining time- and phase-resolved high-resolution EUV

  14. Compensation for Blur Requires Increase in Field of View and Viewing Time

    PubMed Central

    Kwon, MiYoung; Liu, Rong; Chien, Lillian

    2016-01-01

    Spatial resolution is an important factor for human pattern recognition. In particular, low resolution (blur) is a defining characteristic of low vision. Here, we examined spatial (field of view) and temporal (stimulus duration) requirements for blurry object recognition. The spatial resolution of an image, such as a letter or face, was manipulated with a low-pass filter. In experiment 1, studying the spatial requirement, observers viewed a fixed-size object through a window of varying sizes, which was repositioned until object identification (moving window paradigm). The field of view requirement, quantified as the number of "views" (window repositions) for correct recognition, was obtained for three blur levels, including no blur. In experiment 2, studying the temporal requirement, we determined threshold viewing time, the stimulus duration yielding criterion recognition accuracy, at six blur levels, including no blur. For letter and face recognition, we found blur significantly increased the number of views, suggesting a larger field of view is required to recognize blurry objects. We also found blur significantly increased threshold viewing time, suggesting longer temporal integration is necessary to recognize blurry objects. The temporal integration reflects the tradeoff between stimulus intensity and time. While humans excel at recognizing blurry objects, our findings suggest compensating for blur requires increased field of view and viewing time. The need for larger spatial and longer temporal integration for recognizing blurry objects may further challenge object recognition in low vision. Thus, interactions between blur and field of view should be considered for developing low vision rehabilitation or assistive aids. PMID:27622710

  15. Magnetic Field Diagnostics and Spatio-Temporal Variability of the Solar Transition Region

    NASA Astrophysics Data System (ADS)

    Peter, H.

    2013-12-01

    Magnetic field diagnostics of the transition region between the chromosphere and the corona confronts us with the problem that one has to apply extreme-ultraviolet (EUV) spectro-polarimetry. While techniques already exist for coronal diagnostics, in the form of infrared coronagraphy above the limb and radio observations on the disk, EUV observations still have to be developed for the transition region. So far the success of such observations has been limited, but various current projects aim to obtain spectro-polarimetric data in the extreme UV in the near future. It is therefore timely to study the polarimetric signals we can expect from these observations through realistic forward modeling. We employ a 3D magneto-hydrodynamic (MHD) forward model of the solar corona and synthesize the Stokes I and Stokes V profiles of C iv (1548 Å). A signal well above 0.001 in Stokes V can be expected even if one integrates for several minutes to reach the required signal-to-noise ratio, and despite the rapidly changing intensity in the model (just as in observations). This variability of the intensity is often used as an argument against transition region magnetic diagnostics, which require exposure times of minutes. However, the magnetic field evolves much more slowly than the intensity, and therefore the degree of (circular) polarization remains rather constant when one integrates in time. Our study shows that it is possible to measure the transition region magnetic field if a polarimetric accuracy on the order of 0.001 can be reached, which we can expect from planned instrumentation.

  16. Constant supervision of bathing in French public swimming pools: an unrealistic regulatory requirement?

    PubMed

    Vignac, Élie; Lebihain, Pascal; Soulé, Bastien

    2017-09-01

    In France, to prevent drowning accidents in public swimming pools (PSPs), bathing must be constantly supervised by qualified staff. However, fatal drowning regularly occurs in supervised aquatic facilities. A review of the literature shows that human supervision is a complex task. The aim of this research is to fully assess the periods during which supervision is not carried out, or carried out in an inadequate manner. The observations made in 108 French PSPs show that supervision is not carried out 18% of the time and that it is carried out inadequately 33% of the time. The medical literature shows that, in order to expect to survive without after-effects, an immersed victim requires intervention within a time limit of not more than three minutes; however, we noted, over a total observation time of 54 hours, 147 periods (29.8%) during which the supervision system was degraded for three minutes or more. This quantification research on the periods of degraded supervision is complemented by an identification of the causes leading to these degradations, from which we can draw interesting areas for improvement, particularly from an organizational point of view, in order to improve safety management in French PSPs.

  17. A real-time recursive filter for the attitude determination of the Spacelab instrument pointing subsystem

    NASA Technical Reports Server (NTRS)

    West, M. E.

    1992-01-01

    A real-time estimation filter which reduces sensitivity to system variations and reduces the amount of preflight computation is developed for the instrument pointing subsystem (IPS). The IPS is a three-axis stabilized platform developed to point various astronomical observation instruments aboard the shuttle. Currently, the IPS utilizes a linearized Kalman filter (LKF), with premission-defined gains, to compensate for system drifts and accumulated attitude errors. Since the a priori gains are generated for an expected system, variations result in a suboptimal estimation process. This report compares the performance of three real-time estimation filters with the current LKF implementation. An extended Kalman filter and a second-order Kalman filter are developed to account for the system nonlinearities, while a linear Kalman filter implementation assumes that the nonlinearities are negligible. The performance of each of the four estimation filters is compared with respect to accuracy, stability, settling time, robustness, and computational requirements. It is shown that, for the current IPS pointing requirements, the linear Kalman filter provides improved robustness over the LKF with lower computational requirements than the two real-time nonlinear estimation filters.

  18. [Generation of appraisal standards for functional measurements in the frail elderly and persons aged 40 and older requiring light assistance in daily living].

    PubMed

    Obuchi, Shuichi; Kojima, Motonaga; Miki, Akiko; Ito, Kazuhiko; Arai, Takeshi; Tsuji, Ichiro; Okubo, Ichiro; Ohara, Satoko; Sugiyama, Michiko; Suzuki, Takao; Sone, Toshimasa; Yasumura, Seiji

    2010-11-01

    The purpose of this study was to generate appraisal standards for functional measures in independent elderly people with physical frailty, "Tokutei", or persons aged 40 and older who require light assistance, "Youshien". A total of 3,852 subjects for whom functional measures were available, including grasp strength, one-leg standing time, timed up & go (TUG), and 5-m walking time, were analyzed from a database obtained from the Ministry of Health, Labour and Welfare. The upper and lower limits of each quintile, deduced from the functional measurements, were adopted to construct the appraisal standards. The functional measures were higher in Tokutei than in Youshien. Comparing Tokutei and Youshien, a difference of one or more levels among the five groups was observed for the one-leg standing time. There were differences of three or more levels between Tokutei and Youshien in the TUG and the 5-m walking time. The present study allowed the development of appraisal standards for elderly people with physical frailty and for persons aged 40 and older requiring light assistance in daily living.

  19. Patterns of physiological activity accompanying performance on a perceptual-motor task.

    DOT National Transportation Integrated Search

    1969-04-01

    Air traffic controllers are required to spend considerable periods of time observing radar displays. Yet, information regarding physiological measures which best reflect the attentional process in complex vigilance tasks is generally lacking. As an i...

  20. Observed and Self-Reported Pesticide Protective Behaviors of Latino Migrant and Seasonal Farmworkers

    PubMed Central

    Walton, AnnMarie Lee; LePrevost, Catherine; Wong, Bob; Linnan, Laura; Sanchez-Birkhead, Ana; Mooney, Kathi

    2016-01-01

    Agricultural pesticide exposure has potential adverse health effects for farmworkers that may be reduced by pesticide protective behaviors (PPBs). The Environmental Protection Agency’s (EPA) Worker Protection Standard (WPS) requires PPBs be taught to farmworkers prior to field work. Studies to date have not utilized observational methods to evaluate the degree to which PPBs are practiced by Latino migrant and seasonal farmworkers in the United States. The purpose of this study was to describe, compare, and contrast observed and self-reported PPBs used by Latino farmworkers; both PPBs that the WPS requires be taught and other PPBs were included. Observed and self-reported data were collected from 71 Latino farmworkers during the 2014 tobacco growing season in North Carolina. Participants were consistent in reporting and using long pants and closed shoes in the field most of the time. In addition, gloves, hats/bandanas, and water-resistant outerwear were frequently observed, although they are not required to be taught by the WPS. Farmworkers reported more long-sleeve (p = .028) and glove (p = .000) use than was observed. It was uncommon to observe washing behavior before eating or drinking, even when washing supplies were available. Washing behaviors were significantly overreported for hand (p = .000; p = .000) and face (p = .000; p = .058) washing before eating and drinking in the field. This study documents that protective clothing behaviors that the WPS requires be taught, plus a few others, are commonly practiced by Latino migrant and seasonal farmworkers, but washing behaviors in the field are not. Targeted strategies to improve washing behaviors in the field are needed. PMID:26918841

  1. Ultrafast detection in particle physics and positron emission tomography using SiPMs

    NASA Astrophysics Data System (ADS)

    Dolenec, R.; Korpar, S.; Križan, P.; Pestotnik, R.

    2017-12-01

    Silicon photomultiplier (SiPM) photodetectors perform well in many particle and medical physics applications, especially where good efficiency, insensitivity to magnetic fields and precise timing are required. In Cherenkov time-of-flight positron emission tomography the requirements for photodetector performance are especially high: on average only a couple of photons are available for detection, and the best possible timing resolution is needed. Using SiPMs as photodetectors enables good detection efficiency, but the large sensitive area devices needed have somewhat limited time resolution for single photons. We have observed an additional degradation of the timing at very low light intensities due to delayed events in the distribution of signals resulting from multiple fired microcells. In this work we present the timing properties of the AdvanSiD ASD-NUV3S-P-40 SiPM under single-photon-level picosecond laser illumination, together with a simple modification of the time-walk correction algorithm that reduced the degradation of timing resolution due to the delayed events.

  2. Calibration of hydrological models using flow-duration curves

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; Guerrero, J.-L.; Younger, P. M.; Beven, K. J.; Seibert, J.; Halldin, S.; Freer, J. E.; Xu, C.-Y.

    2011-07-01

    The degree of belief we have in predictions from hydrologic models will normally depend on how well they can reproduce observations. Calibrations with traditional performance measures, such as the Nash-Sutcliffe model efficiency, are challenged by problems including: (1) uncertain discharge data, (2) variable sensitivity of different performance measures to different flow magnitudes, (3) influence of unknown input/output errors and (4) inability to evaluate model performance when observation time periods for discharge and model input data do not overlap. This paper explores a calibration method using flow-duration curves (FDCs) to address these problems. The method focuses on reproducing the observed discharge frequency distribution rather than the exact hydrograph. It consists of applying limits of acceptability for selected evaluation points (EPs) on the observed uncertain FDC in the extended GLUE approach. Two ways of selecting the EPs were tested - based on equal intervals of discharge and of volume of water. The method was tested and compared to a calibration using the traditional model efficiency for the daily four-parameter WASMOD model in the Paso La Ceiba catchment in Honduras and for Dynamic TOPMODEL evaluated at an hourly time scale for the Brue catchment in Great Britain. The volume method of selecting EPs gave the best results in both catchments with better calibrated slow flow, recession and evaporation than the other criteria. Observed and simulated time series of uncertain discharges agreed better for this method both in calibration and prediction in both catchments. An advantage with the method is that the rejection criterion is based on an estimation of the uncertainty in discharge data and that the EPs of the FDC can be chosen to reflect the aims of the modelling application, e.g. using more/less EPs at high/low flows. While the method appears less sensitive to epistemic input/output errors than previous use of limits of acceptability applied directly to the time series of discharge, it still requires a reasonable representation of the distribution of inputs. Additional constraints might therefore be required in catchments subject to snow and where peak-flow timing at sub-daily time scales is of high importance. The results suggest that the calibration method can be useful when observation time periods for discharge and model input data do not overlap. The method could also be suitable for calibration to regional FDCs while taking uncertainties in the hydrological model and data into account.
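    A minimal sketch of the evaluation-point selection described above: build an empirical flow-duration curve from a discharge series and place EPs so that each interval between consecutive EPs carries an equal volume of water (the selection variant that performed best here). The limits of acceptability derived from discharge-data uncertainty are not modelled in this sketch.

      import numpy as np

      def flow_duration_curve(discharge):
          # Return (exceedance probability, discharge) with flows sorted high to low.
          q = np.sort(np.asarray(discharge, dtype=float))[::-1]
          exceedance = (np.arange(1, len(q) + 1) - 0.5) / len(q)
          return exceedance, q

      def volume_based_eps(discharge, n_eps=10):
          # Evaluation points splitting the FDC into intervals of equal water volume.
          exceedance, q = flow_duration_curve(discharge)
          cum_volume = np.cumsum(q) / np.sum(q)          # cumulative volume fraction
          targets = np.linspace(0.0, 1.0, n_eps + 2)[1:-1]
          idx = np.searchsorted(cum_volume, targets)
          return exceedance[idx], q[idx]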

  3. Hot Science with a "Warm" Telescope: Observations of Extrasolar Planets During the Spitzer Warm Mission

    NASA Astrophysics Data System (ADS)

    Grillmair, Carl J.; Carey, S.; Helou, G.; Hurt, R.; Rebull, L.; Soifer, T.; Squires, G. K.; Storrie-Lombardi, L.

    2007-12-01

    The Spitzer Space Telescope will exhaust its cryogen supply sometime around March of 2009. However, the observatory is expected to remain operational until early 2014 with undiminished 3.6 and 4.5 micron imaging capabilities over two 5'x5' fields-of-view. During this "warm" mission, Spitzer will operate with extremely high efficiency and provide up to 35,000 hours of science observing time. This will enable unprecedented opportunities to address key scientific questions requiring large allocations of observing time, while maintaining opportunities for broad community use with more "traditional" time allocations. Spitzer will remain a particularly valuable resource for studies of extrasolar planets, with applications including: 1) transit timing observations and precise radius measurements of Earth-sized planets transiting nearby M-dwarfs, 2) measuring thermal emission and distinguishing between broad band emission and absorption in the atmospheres of transiting hot Jupiters, 3) measuring orbital phase variations of thermal emission for both transiting and non-transiting, close-in planets, and 4) sensitive imaging searches for young planets at large angular separations from their parent stars.

  4. Bayesian Techniques for Comparing Time-dependent GRMHD Simulations to Variable Event Horizon Telescope Observations

    NASA Astrophysics Data System (ADS)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  5. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  6. A relativistic analysis of clock synchronization

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1974-01-01

    The relativistic conversion between coordinate time and atomic time is reformulated to allow simpler time calculations relating analysis in solar-system barycentric coordinates (using coordinate time) with earth-fixed observations (measuring earth-bound proper time or atomic time). After an interpretation of terms, this simplified formulation, which has a rate accuracy of about 10 to the minus 15th power, is used to explain the conventions required in the synchronization of a worldwide clock network and to analyze two synchronization techniques: portable clocks and radio interferometry. Finally, pertinent experimental tests of relativity are briefly discussed in terms of the reformulated time conversion.

  7. The NRAO Observing for University Classes Program

    NASA Astrophysics Data System (ADS)

    Cannon, John M.; Van Moorsel, Gustaaf A.

    2017-01-01

    The NRAO "Observing for University Classes" program is a tremendous resource for instructors of courses in observational astronomy. As a service to the astronomical and educational communities, the NRAO offers small amounts of observing time on the Very Large Array (VLA) and the Very Long Baseline Array to such instructors. The data can be used by students and faculty to demonstrate radio astronomy theory with modern data products. Further, the results may lead to publication; this is a unique opportunity for faculty members to integrate research into the classroom. Previous experience with NRAO facilities is required for instructors; individuals without radio astronomy experience can take advantage of other NRAO educational opportunities (e.g., the Synthesis Imaging Workshop) prior to using the program. No previous experience with radio astronomy data is required for students; this is the primary target audience of the program. To demonstrate concept, this poster describes three different VLA observing programs that have been completed using the "Observing for University Classes" resource at Macalester College; undergraduate students have published the results of all three of these programs. Other recent "Observing for University Classes" programs are also described.

  8. A Possible Origin of Linear Depolarization Observed at Vertical Incidence in Rain

    NASA Technical Reports Server (NTRS)

    Jameson, A. R.; Durden, S. L.

    1996-01-01

    Recent observations by two different nadir-pointing airborne radars with some polarization capabilities have detected surprisingly large linear depolarization ratios at times in convective tropical rain. This depolarization can be explained if the rain is considered to be a mixture of a group of apparent spheres and another group of drops that are distorted in the horizontal plane perpendicular to the direction of propagation of the incident wave. If confirmed in future observations, this suggests that at times the larger raindrops are oscillating, in part, because of collisions with smaller drops. Since many of the interpretations of radar polarization measurements in rain by ground-based radars presume that the raindrop shapes correspond to those of the well-known "equilibrium" drops, the present observations may require adjustments to some radar polarization algorithms for estimating rainfall rate, for example, if the shape perturbations observed at nadir also apply to measurements along other axes as well.

  9. Management, Analysis, and Visualization of Experimental and Observational Data – The Convergence of Data and Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Greenwald, Martin; Kleese van Dam, Kerstin

    Scientific user facilities—particle accelerators, telescopes, colliders, supercomputers, light sources, sequencing facilities, and more—operated by the U.S. Department of Energy (DOE) Office of Science (SC) generate ever increasing volumes of data at unprecedented rates from experiments, observations, and simulations. At the same time there is a growing community of experimentalists that require real-time data analysis feedback, to enable them to steer their complex experimental instruments to optimized scientific outcomes and new discoveries. Recent efforts in DOE-SC have focused on articulating the data-centric challenges and opportunities facing these science communities. Key challenges include difficulties coping with data size, rate, and complexity in the context of both real-time and post-experiment data analysis and interpretation. Solutions will require algorithmic and mathematical advances, as well as hardware and software infrastructures that adequately support data-intensive scientific workloads. This paper presents the summary findings of a workshop held by DOE-SC in September 2015, convened to identify the major challenges and the research that is needed to meet those challenges.

  10. Management, Analysis, and Visualization of Experimental and Observational Data -- The Convergence of Data and Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Greenwald, Martin; Kleese van Dam, Kersten

    Scientific user facilities---particle accelerators, telescopes, colliders, supercomputers, light sources, sequencing facilities, and more---operated by the U.S. Department of Energy (DOE) Office of Science (SC) generate ever increasing volumes of data at unprecedented rates from experiments, observations, and simulations. At the same time there is a growing community of experimentalists that require real-time data analysis feedback, to enable them to steer their complex experimental instruments to optimized scientific outcomes and new discoveries. Recent efforts in DOE-SC have focused on articulating the data-centric challenges and opportunities facing these science communities. Key challenges include difficulties coping with data size, rate, and complexity in the context of both real-time and post-experiment data analysis and interpretation. Solutions will require algorithmic and mathematical advances, as well as hardware and software infrastructures that adequately support data-intensive scientific workloads. This paper presents the summary findings of a workshop held by DOE-SC in September 2015, convened to identify the major challenges and the research that is needed to meet those challenges.

  11. A look at motion in the frequency domain

    NASA Technical Reports Server (NTRS)

    Watson, A. B.; Ahumada, A. J., Jr.

    1983-01-01

    A moving image can be specified by a contrast distribution, c(x,y,t), over the dimensions of space x,y, and time t. Alternatively, it can be specified by the distribution C(u,v,w) over spatial frequency u,v and temporal frequency w. The frequency representation of a moving image is shown to have a characteristic form. This permits two useful observations. The first is that the apparent smoothness of time-sampled moving images (apparent motion) can be explained by the filtering action of the human visual system. This leads to the following formula for the required update rate of time-sampled displays: w(c) = w(l) + r·u(l), where w(c) is the required update rate in Hz, w(l) is the limit of human temporal resolution in Hz, r is the velocity of the moving image in degrees/sec, and u(l) is the limit of human spatial resolution in cycles/deg. The second observation is that it is possible to construct a linear sensor that responds to images moving in a particular direction. The sensor is derived and its properties are discussed.
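    A numerical illustration of the update-rate formula above, using assumed (not taken from the paper) round numbers of roughly 60 Hz for the human temporal resolution limit and 60 cycles/deg for the spatial resolution limit.

      def required_update_rate(velocity_deg_per_s, w_l_hz=60.0, u_l_cpd=60.0):
          # w(c) = w(l) + r * u(l): minimum display update rate for smooth apparent motion.
          return w_l_hz + velocity_deg_per_s * u_l_cpd

      for r in (1.0, 5.0, 10.0):   # image velocities in deg/s
          print(f"{r:4.1f} deg/s -> {required_update_rate(r):.0f} Hz")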

  12. Developing Conceptual Models of Biodegradation: Lessons Learned From a Long-Term Study of a Crude-Oil Contaminant Plume

    NASA Astrophysics Data System (ADS)

    Cozzarelli, I. M.; Esaid, H. I.; Bekins, B. A.; Eganhouse, R. P.; Baedecker, M.

    2002-05-01

    Assessment of natural attenuation as a remedial option requires understanding the long-term fate of contaminant compounds. The development of correct conceptual models of biodegradation requires observations at spatial and temporal scales appropriate for the reactions being measured. For example, the availability of electron acceptors such as solid-phase iron oxides may vary at the cm scale due to aquifer heterogeneities. Characterizing the distribution of these oxides may require small-scale measurements over time scales of tens of years in order to assess their impact on the fate of contaminants. The long-term study of natural attenuation of hydrocarbons in a contaminant plume near Bemidji, MN provides insight into how natural attenuation of hydrocarbons evolves over time. The sandy glacial-outwash aquifer at this USGS Toxic Substances Hydrology research site was contaminated by crude oil in 1979. During the 16 years that data have been collected the shape and extent of the contaminant plume changed as redox reactions, most notably iron reduction, progressed over time. Investigation of the controlling microbial reactions in this system required a systematic and multi-scaled approach. Early indications of plume shrinkage were observed over a time scale of a few years, based on observation well data. These changes were associated with iron reduction near the crude-oil source. The depletion of Fe (III) oxides near the contaminant source caused the dissolved iron concentrations to increase and spread downgradient at a rate of approximately 3 m/year. The zone of maximum benzene, toluene, ethylbenzene, and xylene (BTEX) concentrations has also spread within the anoxic plume. Subsequent analyses of sediment and water, collected at small-scale cm intervals from cores in the contaminant plume, provided insight into the evolution of redox zones at smaller scales. Contaminants, such as ortho-xylene, that appeared to be contained near the oil source based on the larger-scale observation well data, were observed to be migrating in thin layers as the aquifer evolved to methanogenic conditions in narrow zones. The impact of adequately identifying the microbially mediated redox reactions was illustrated with a novel inverse modeling effort (using both the USGS solute transport and biodegradation code BIOMOC and the USGS universal inverse modeling code UCODE) to quantify field-scale hydrocarbon dissolution and biodegradation at the Bemidji site. Extensive historical data compiled at the Bemidji site were used, including 1352 concentration observations from 30 wells and 66 core sections. The simulations reproduced the general large-scale evolution of the plume, but the percent BTEX mass removed from the oil body after 18 years varied significantly, depending on which biodegradation conceptual model was used. The best fit was obtained for the iron-reduction conceptual model, which incorporated the finite availability of Fe (III) in the aquifer and reproduced the field observation that depletion of solid-phase iron resulted in increased downgradient transport of BTEX compounds. The predicted benzene plume 50 years after the spill showed significantly higher concentrations of benzene for the iron-reduction model compared to other conceptual models tested. This study demonstrates that the long-term sustainability of the electron acceptors is key to predicting the ultimate fate of the hydrocarbons. 
Assessing this evolution of redox processes and developing an adequate conceptual model required observations on multiple spatial scales over the course of many years.

  13. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA

    PubMed Central

    2012-01-01

    Background: Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Methods: Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. Results: The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. Conclusions: The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations. PMID:22394458

  14. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA.

    PubMed

    Foster, Amanda; Laurin, Nancy

    2012-03-06

    Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations.

  15. Time-resolved observation of thermally activated rupture of a capillary-condensed water nanobridge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bak, Wan; Sung, Baekman; Kim, Jongwoo

    2015-01-05

    The capillary-condensed liquid bridge is one of the most ubiquitous forms of liquid in nature and contributes significantly to adhesion and friction of biological molecules as well as microscopic objects. Despite its important role in nanoscience and technology, the rupture process of the bridge is not well understood and needs more experimental work. Here, we report real-time observation of rupture of a capillary-condensed water nanobridge under ambient conditions. During slow and stepwise stretch of the nanobridge, we measured the activation time for rupture, or the latency time required for the bridge breakup. By statistical analysis of the time-resolved distribution of activation time, we show that rupture is a thermally activated stochastic process and follows the Poisson statistics. In particular, from the Arrhenius law that the rupture rate satisfies, we estimate the position-dependent activation energies for the capillary-bridge rupture.

  16. Evaluation of Bayesian estimation of a hidden continuous-time Markov chain model with application to threshold violation in water-quality indicators

    USGS Publications Warehouse

    Deviney, Frank A.; Rice, Karen; Brown, Donald E.

    2012-01-01

    Natural resource managers require information concerning the frequency, duration, and long-term probability of occurrence of water-quality indicator (WQI) violations of defined thresholds. The timing of these threshold crossings often is hidden from the observer, who is restricted to relatively infrequent observations. Here, a model for the hidden process is linked with a model for the observations, and the parameters describing duration, return period, and long-term probability of occurrence are estimated using Bayesian methods. A simulation experiment is performed to evaluate the approach under scenarios based on the equivalent of a total monitoring period of 5-30 years and an observation frequency of 1-50 observations per year. Given a constant threshold crossing rate, accuracy and precision of parameter estimates increased with longer total monitoring period and more-frequent observations. Given a fixed monitoring period and observation frequency, accuracy and precision of parameter estimates increased with longer times between threshold crossings. For most cases where the long-term probability of being in violation is greater than 0.10, it was determined that at least 600 observations are needed to achieve precise estimates. An application of the approach is presented using 22 years of quasi-weekly observations of acid-neutralizing capacity from Deep Run, a stream in Shenandoah National Park, Virginia. The time series also was sub-sampled to simulate monthly and semi-monthly sampling protocols. Estimates of the long-term probability of violation were unbiased regardless of sampling frequency; however, the expected duration and return period were over-estimated using the sub-sampled time series with respect to the full quasi-weekly time series.
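
    To make the hidden-process/observation structure concrete, the toy simulation below (not the authors' model or code) draws a two-state continuous-time Markov chain and samples it at a few fixed observation intervals, showing how infrequent observations miss threshold crossings. All rates and durations are arbitrary placeholder values.

```python
# Toy illustration (not the authors' model): a two-state continuous-time
# Markov chain (state 1 = "in violation", state 0 = "compliant") sampled
# at a fixed observation interval. Rates and durations are arbitrary
# placeholders chosen only to show the effect of sparse sampling on the
# apparent number of threshold crossings.
import numpy as np

rng = np.random.default_rng(42)
T_END = 30.0  # total "monitoring period" in arbitrary time units

def simulate_ctmc(rate_enter=0.5, rate_exit=2.0):
    """Return the full transition history as a list of (time, new_state)."""
    t, state, path = 0.0, 0, [(0.0, 0)]
    while t < T_END:
        rate = rate_enter if state == 0 else rate_exit
        t += rng.exponential(1.0 / rate)
        state = 1 - state
        path.append((t, state))
    return path

def sample(path, dt):
    """The observer's view: the hidden state at regular sampling times."""
    obs_times = np.arange(0.0, T_END, dt)
    states = []
    for ot in obs_times:
        current = 0
        for t, s in path:
            if t <= ot:
                current = s
            else:
                break
        states.append(current)
    return np.array(states)

path = simulate_ctmc()
true_entries = sum(1 for t, s in path[1:] if s == 1 and t <= T_END)
for dt in (0.1, 1.0, 5.0):   # frequent vs. sparse sampling
    obs = sample(path, dt)
    seen = int(np.sum((obs[1:] == 1) & (obs[:-1] == 0)))
    print(f"sampling interval {dt:4.1f}: observed {seen} of "
          f"{true_entries} entries into violation")
```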

  17. Competency-based education in anesthesiology: history and challenges.

    PubMed

    Ebert, Thomas J; Fox, Chris A

    2014-01-01

    The Accreditation Council for Graduate Medical Education is transitioning to a competency-based system with milestones to measure progress and define success of residents. The confines of the time-based residency will be relaxed. Curriculum must be redesigned and assessments will need to be precise and in-depth. Core anesthesiology faculty will be identified and will be the "trained observers" of the residents' progress. There will be logistic challenges requiring creative management by program directors. There may be residents who achieve "expert" status earlier than the required 36 months of clinical anesthesia education, whereas others may struggle to achieve acceptable status and will require additional education time. Faculty must accept both extremes without judgment. Innovative new educational opportunities will need to be created for fast learners. Finally, it will be important that residents embrace this change. This will require programs to clearly define the specific aims and measurement endpoints for advancement and success.

  18. Towards Measurement of the Time-resolved Heat Release of Protein Conformation Dynamics

    NASA Technical Reports Server (NTRS)

    Puchalla, Jason; Adamek, Daniel; Austin, Robert

    2004-01-01

    We present a way to observe time-resolved heat release using a laminar flow diffusional mixer coupled with a highly sensitive infrared camera which measures the temperature change of the solvent. There are significant benefits to the use of laminar flow mixers for time-resolved calorimetry: (1) The thermal signal can be made position- and time-stationary to allow for signal integration; (2) Extremely small volumes (nl/s) of sample are required for a measurement; (3) The same mixing environment can be observed spectroscopically to obtain state occupation information; (4) The mixer allows one to do out-of-equilibrium dynamic studies. The hope is that these measurements will allow us to probe the non-equilibrium thermodynamics as a protein moves along a free energy trajectory from one state to another.

  19. Future Missions for Space Weather Specifications and Forecasts

    NASA Astrophysics Data System (ADS)

    Onsager, T. G.; Biesecker, D. A.; Anthes, R. A.; Maier, M. W.; Gallagher, F. W., III; St Germain, K.

    2017-12-01

    The progress of technology and the global integration of our economic and security infrastructures have introduced vulnerabilities to space weather that demand a more comprehensive ability to specify and to predict the dynamics of the space environment. This requires a comprehensive network of real-time space-based and ground-based observations with long-term continuity. In order to determine the most cost effective space architectures for NOAA's weather, space weather, and environmental missions, NOAA conducted the NOAA Satellite Observing System Architecture (NSOSA) study. This presentation will summarize the process used to document the future needs and the relative priorities for NOAA's operational space-based observations. This involves specifying the most important observations, defining the performance attributes at different levels of capability, and assigning priorities for achieving the higher capability levels. The highest priority observations recommended by the Space Platform Requirements Working Group (SPRWG) for improvement above a minimal capability level will be described. Finally, numerous possible satellite architectures have been explored to assess the costs and benefits of various architecture configurations.

  20. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
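
    For readers who want to reproduce the flavor of these computations with modern tools, the short sketch below computes the basic time-domain statistics and a windowed, detrended power-spectrum estimate for a noisy sinusoid. It is a generic NumPy/SciPy illustration, not the original FORTRAN IV program.

```python
# Generic illustration of SPA-style time- and frequency-domain estimates
# using NumPy/SciPy; this is not the original FORTRAN IV program.
import numpy as np
from scipy.signal import periodogram

fs = 100.0                       # sampling rate (Hz), placeholder value
t = np.arange(0.0, 20.0, 1.0 / fs)
x = np.sin(2 * np.pi * 3.0 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)

# Time-domain statistics (mean, variance, standard deviation, RMS, extrema)
print("mean      :", x.mean())
print("variance  :", x.var(ddof=1))
print("std. dev. :", x.std(ddof=1))
print("mean sq.  :", np.mean(x**2))
print("RMS       :", np.sqrt(np.mean(x**2)))
print("min / max :", x.min(), x.max())
print("N         :", x.size)

# Frequency-domain estimate: power spectrum with a Hamming data window and
# linear detrending, loosely analogous to SPA's windowed spectral estimates.
f, pxx = periodogram(x, fs=fs, window="hamming", detrend="linear")
print("dominant frequency (Hz):", f[np.argmax(pxx)])
```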

  1. Cost-efficient scheduling of FAST observations

    NASA Astrophysics Data System (ADS)

    Luo, Qi; Zhao, Laiping; Yu, Ce; Xiao, Jian; Sun, Jizhou; Zhu, Ming; Zhong, Yi

    2018-03-01

    A cost-efficient schedule for the Five-hundred-meter Aperture Spherical radio Telescope (FAST) requires maximizing the number of observable proposals and the overall scientific priority, and minimizing the overall slew-cost generated by telescope shifting, while taking into account constraints including astronomical object visibility, user-defined observable times, and avoidance of Radio Frequency Interference (RFI). In this contribution, we first solve the problem of maximizing the number of observable proposals and the scientific priority by modeling it as a Minimum Cost Maximum Flow (MCMF) problem. The optimal schedule can be found by any MCMF solution algorithm. Then, for minimizing the slew-cost of the generated schedule, we devise a maximally-matchable edges detection-based method to reduce the problem size, and propose a backtracking algorithm to find the perfect matching with minimum slew-cost. Experiments on a real dataset from the NASA/IPAC Extragalactic Database (NED) show that the proposed scheduler can increase the usage of available time with high scientific priority and reduce the slew-cost significantly in a very short time.
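
    The following toy sketch illustrates the min-cost max-flow formulation in general terms; it is not the FAST scheduler itself. The proposals and slots are invented, and the edge costs simply negate scientific priority, whereas the real problem would also encode visibility windows, user-defined observable times, RFI avoidance, and slew-cost.

```python
# Toy sketch of scheduling as minimum-cost maximum-flow (not the authors'
# FAST scheduler): proposals are matched to observable time slots, with
# edge costs set to the negated scientific priority. Proposal and slot
# data are invented for illustration; real constraints would determine
# which proposal-to-slot edges exist and what they cost.
import networkx as nx

proposals = {"P1": {"priority": 5, "slots": ["S1", "S2"]},
             "P2": {"priority": 3, "slots": ["S2"]},
             "P3": {"priority": 4, "slots": ["S1", "S3"]}}

G = nx.DiGraph()
for p, info in proposals.items():
    G.add_edge("source", p, capacity=1, weight=0)
    for s in info["slots"]:                      # edge exists only if observable
        G.add_edge(p, s, capacity=1, weight=-info["priority"])
for s in ("S1", "S2", "S3"):
    G.add_edge(s, "sink", capacity=1, weight=0)

flow = nx.max_flow_min_cost(G, "source", "sink")
schedule = {p: s for p in proposals for s, f in flow[p].items() if f == 1}
print(schedule)   # one slot per scheduled proposal at maximum total priority
```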

  2. Reformulation of the relativistic conversion between coordinate time and atomic time

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1975-01-01

    The relativistic conversion between coordinate time and atomic time is reformulated to allow simpler time calculations relating analysis in solar system barycentric coordinates (using coordinate time) with earth-fixed observations (measuring 'earth-bound' proper time or atomic time). After an interpretation in terms of relatively well-known concepts, this simplified formulation, which has a rate accuracy of about 10 to the minus 15th, is used to explain the conventions required in the synchronization of a worldwide clock network and to analyze two synchronization techniques - portable clocks and radio interferometry. Finally, pertinent experimental tests of relativity are briefly discussed in terms of the reformulated time conversion.

  3. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth

    PubMed Central

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    A state observer is an essential component in computerized control loops for greenhouse-crop systems. However, the current accomplishments of observer modeling for greenhouse-crop systems mainly focus on mass/energy balance, ignoring physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically made based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an instance, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage, and employ them to build growth state observers together with 8 environmental parameters. Support vector machine (SVM) acts as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models can give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. The current study can enable state observers to reflect crop requirements and make them feasible for applications with simplified shapes, which is significant for developing intelligent greenhouse control systems for modern greenhouse production. PMID:28848565
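
    A schematic version of the described pipeline, CCA to identify dominant parameters followed by an SVM-based growth observer, can be sketched with scikit-learn. The data below are synthetic stand-ins, not the authors' cucumber-seedling measurements; only the dimensionalities (8 environmental and 10 physiological parameters) are taken from the abstract.

```python
# Schematic of the pipeline described above: CCA to reduce the
# environmental/physiological inputs, then an SVM-based observer.
# Uses scikit-learn on synthetic stand-in data, not the authors' dataset.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
env = rng.normal(size=(n, 8))            # 8 environmental parameters (synthetic)
phys = rng.normal(size=(n, 10))          # 10 physiological parameters (synthetic)
growth = env[:, 0] + 0.5 * phys[:, 2] + 0.1 * rng.normal(size=n)  # toy target

# Canonical correlation analysis between the two parameter sets; the
# canonical scores serve as the reduced inputs for the observer.
cca = CCA(n_components=2).fit(env, phys)
env_scores, phys_scores = cca.transform(env, phys)
features = np.hstack([env_scores, phys_scores])

X_tr, X_te, y_tr, y_te = train_test_split(features, growth, random_state=0)
observer = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("R^2 of the simplified SVM observer:", round(observer.score(X_te, y_te), 3))
```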

  4. A New Strategy in Observer Modeling for Greenhouse Cucumber Seedling Growth.

    PubMed

    Qiu, Quan; Zheng, Chenfei; Wang, Wenping; Qiao, Xiaojun; Bai, He; Yu, Jingquan; Shi, Kai

    2017-01-01

    A state observer is an essential component in computerized control loops for greenhouse-crop systems. However, the current accomplishments of observer modeling for greenhouse-crop systems mainly focus on mass/energy balance, ignoring physiological responses of crops. As a result, state observers for crop physiological responses are rarely developed, and control operations are typically made based on experience rather than actual crop requirements. In addition, existing observer models require a large number of parameters, leading to heavy computational load and poor application feasibility. To address these problems, we present a new state observer modeling strategy that takes both environmental information and crop physiological responses into consideration during the observer modeling process. Using greenhouse cucumber seedlings as an instance, we sample 10 physiological parameters of cucumber seedlings at different time points during the exponential growth stage, and employ them to build growth state observers together with 8 environmental parameters. Support vector machine (SVM) acts as the mathematical tool for observer modeling. Canonical correlation analysis (CCA) is used to select the dominant environmental and physiological parameters in the modeling process. With the dominant parameters, simplified observer models are built and tested. We conduct contrast experiments with different input parameter combinations on simplified and un-simplified observers. Experimental results indicate that physiological information can improve the prediction accuracies of the growth state observers. Furthermore, the simplified observer models can give equivalent or even better performance than the un-simplified ones, which verifies the feasibility of CCA. The current study can enable state observers to reflect crop requirements and make them feasible for applications with simplified shapes, which is significant for developing intelligent greenhouse control systems for modern greenhouse production.

  5. A real-time device for converting Doppler ultrasound audio signals into fluid flow velocity

    PubMed Central

    Hogeman, Cynthia S.; Koch, Dennis W.; Krishnan, Anandi; Momen, Afsana; Leuenberger, Urs A.

    2010-01-01

    A Doppler signal converter has been developed to facilitate cardiovascular and exercise physiology research. This device directly converts audio signals from a clinical Doppler ultrasound imaging system into a real-time analog signal that accurately represents blood flow velocity and is easily recorded by any standard data acquisition system. This real-time flow velocity signal, when simultaneously recorded with other physiological signals of interest, permits the observation of transient flow response to experimental interventions in a manner not possible when using standard Doppler imaging devices. This converted flow velocity signal also permits a more robust and less subjective analysis of data in a fraction of the time required by previous analytic methods. This signal converter provides this capability inexpensively and requires no modification of either the imaging or data acquisition system. PMID:20173048
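
    The abstract does not describe the converter's internal signal path, so the sketch below only illustrates the standard Doppler relation such a device ultimately relies on; the transmit frequency, sound speed, and insonation angle are typical textbook assumptions, not values from the paper.

```python
# Illustration of the standard Doppler relation underlying audio-to-velocity
# conversion (the converter's internals are not described in the abstract):
#   v = f_d * c / (2 * f0 * cos(theta))
# where f_d is the detected audio (Doppler-shift) frequency, c the speed of
# sound in tissue, f0 the transmit frequency, and theta the insonation angle.
# All numeric values below are typical textbook assumptions.
import math

def doppler_velocity(f_shift_hz, f0_hz=2e6, c_m_s=1540.0, angle_deg=60.0):
    """Blood-flow velocity (m/s) implied by an audio Doppler shift."""
    return f_shift_hz * c_m_s / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))

print(f"1 kHz shift -> {doppler_velocity(1000.0):.2f} m/s")
```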

  6. Engaging science communication that are time-saving for scientists using new online technology

    NASA Astrophysics Data System (ADS)

    Lilja Bye, Bente

    2016-04-01

    Science communication is a time consuming and challenging task. Communicating scientific results comes on top of doing science itself and the administrative work the modern day scientists have to cope with. The competition on peoples time and attention is also fierce. In order to get peoples attention and interest, it is today often required that there is a two-way communication. The audience needs and wants to be engaged, even in real-time. The skills and times required to do that is normally not included in the university curricula. In this presentation we will look at new technologies that can help scientists overcome some of those skills and time challenges. The new online technologies that has been tested and developed in other societal areas, can be of great use for research and the important science communication. We will illustrate this through an example from biodiversity, wetlands and these fields use of Earth observations. Both the scientists themselves representing different fields of research and the general public are being engaged effectively and efficiently through specifically designed online events/seminars/workshops. The scientists are able to learn from each other while also engaging in live dialogues with the audience. A cooperation between the Group of Earth Observations and the Ramsar Convention of Wetlands will be used to illustrate the method. Within the global Earth observation community, where this example comes from, there is a great potential for efficient capacity building, targeting both experts, decision-makers and the general public. The method presented is demonstrating one way of tapping into that potential using new online technologies and it can easily be transferred to other fields of geoscience and science in general.

  7. Fast frequency acquisition via adaptive least squares algorithm

    NASA Technical Reports Server (NTRS)

    Kumar, R.

    1986-01-01

    A new least squares algorithm is proposed and investigated for fast frequency and phase acquisition of sinusoids in the presence of noise. This algorithm is a special case of more general, adaptive parameter-estimation techniques. The advantages of the algorithms are their conceptual simplicity, flexibility and applicability to general situations. For example, the frequency to be acquired can be time varying, and the noise can be nonGaussian, nonstationary and colored. As the proposed algorithm can be made recursive in the number of observations, it is not necessary to have a priori knowledge of the received signal-to-noise ratio or to specify the measurement time. This would be required for batch processing techniques, such as the fast Fourier transform (FFT). The proposed algorithm improves the frequency estimate on a recursive basis as more and more observations are obtained. When the algorithm is applied in real time, it has the extra advantage that the observations need not be stored. The algorithm also yields a real time confidence measure as to the accuracy of the estimator.
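
    As a hedged illustration of recursive, storage-free frequency estimation (not the algorithm from the paper), the sketch below applies scalar recursive least squares to the linear-prediction identity x[n] = 2 cos(omega) x[n-1] - x[n-2], so the frequency estimate is refined sample by sample as new observations arrive.

```python
# Hedged sketch, not the algorithm from the paper: recursive least squares
# applied to the linear-prediction identity
#   x[n] = 2*cos(omega) * x[n-1] - x[n-2],
# so the single parameter a = 2*cos(omega) is updated sample by sample
# without storing the whole observation record.
import numpy as np

fs, f_true = 1000.0, 37.0
n = np.arange(2000)
x = np.cos(2 * np.pi * f_true * n / fs) \
    + 0.05 * np.random.default_rng(3).normal(size=n.size)

a, P = 0.0, 1e3          # parameter estimate and its (scalar) covariance
lam = 0.999              # forgetting factor allows a slowly varying frequency
for k in range(2, x.size):
    phi = x[k - 1]                       # regressor
    y = x[k] + x[k - 2]                  # so that y = a * phi + noise
    gain = P * phi / (lam + phi * P * phi)
    a += gain * (y - a * phi)
    P = (P - gain * phi * P) / lam

omega = np.arccos(np.clip(a / 2.0, -1.0, 1.0))
print(f"estimated frequency: {omega * fs / (2 * np.pi):.2f} Hz (true {f_true} Hz)")
```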

  8. Locating multiple diffusion sources in time varying networks from sparse observations.

    PubMed

    Hu, Zhao-Long; Shen, Zhesi; Cao, Shinan; Podobnik, Boris; Yang, Huijie; Wang, Wen-Xu; Lai, Ying-Cheng

    2018-02-08

    Data-based source localization in complex networks has a broad range of applications. Despite recent progress, locating multiple diffusion sources in time varying networks remains an outstanding problem. Bridging structural observability and sparse signal reconstruction theories, we develop a general framework to locate diffusion sources in time varying networks based solely on sparse data from a small set of messenger nodes. A general finding is that large degree nodes produce more valuable information than small degree nodes, a result that contrasts with that for static networks. Choosing large degree nodes as the messengers, we find that sparse observations from a few such nodes are often sufficient for any number of diffusion sources to be located for a variety of model and empirical networks. Counterintuitively, sources in more rapidly varying networks can be identified more readily with fewer required messenger nodes.
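
    The sparse-reconstruction ingredient can be illustrated in a few lines (this is not the authors' framework): a source vector that is non-zero at only a few nodes is recovered from a handful of linear "messenger node" readings via L1-regularized regression. The propagation matrix here is random stand-in data; in the paper it would be derived from the network structure and diffusion dynamics.

```python
# Minimal illustration of sparse source recovery (not the authors'
# framework): a source vector that is non-zero at only a few nodes is
# recovered from a small number of linear "messenger" readings using
# L1-regularized regression. The propagation matrix is random stand-in
# data for demonstration purposes only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n_nodes, n_messengers, n_sources = 200, 40, 3

A = rng.normal(size=(n_messengers, n_nodes))      # stand-in propagation matrix
x_true = np.zeros(n_nodes)
x_true[rng.choice(n_nodes, size=n_sources, replace=False)] = 1.0
y = A @ x_true + 0.01 * rng.normal(size=n_messengers)   # sparse observations

x_hat = Lasso(alpha=0.05, max_iter=20000).fit(A, y).coef_
recovered = np.argsort(x_hat)[-n_sources:]
print("true sources     :", sorted(np.flatnonzero(x_true).tolist()))
print("recovered sources:", sorted(recovered.tolist()))
```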

  9. Widefield Two-Photon Excitation without Scanning: Live Cell Microscopy with High Time Resolution and Low Photo-Bleaching

    PubMed Central

    Amor, Rumelo; McDonald, Alison; Trägårdh, Johanna; Robb, Gillian; Wilson, Louise; Abdul Rahman, Nor Zaihana; Dempster, John; Amos, William Bradshaw; Bushell, Trevor J.; McConnell, Gail

    2016-01-01

    We demonstrate fluorescence imaging by two-photon excitation without scanning in biological specimens as previously described by Hwang and co-workers, but with an increased field size and with framing rates of up to 100 Hz. During recordings of synaptically-driven Ca2+ events in primary rat hippocampal neurone cultures loaded with the fluorescent Ca2+ indicator Fluo-4 AM, we have observed greatly reduced photo-bleaching in comparison with single-photon excitation. This method, which requires no costly additions to the microscope, promises to be useful for work where high time-resolution is required. PMID:26824845

  10. The French contribution to the voluntary observing ships network of sea surface salinity

    NASA Astrophysics Data System (ADS)

    Alory, G.; Delcroix, T.; Téchiné, P.; Diverrès, D.; Varillon, D.; Cravatte, S.; Gouriou, Y.; Grelet, J.; Jacquin, S.; Kestenare, E.; Maes, C.; Morrow, R.; Perrier, J.; Reverdin, G.; Roubaud, F.

    2015-11-01

    Sea Surface Salinity (SSS) is an essential climate variable that requires long term in situ observation. The French SSS Observation Service (SSS-OS) manages a network of Voluntary Observing Ships equipped with thermosalinographs (TSG). The network is global though more concentrated in the tropical Pacific and North Atlantic oceanic basins. The acquisition system is autonomous with real time transmission and is regularly serviced at harbor calls. There are distinct real time and delayed time processing chains. Real time processing includes automatic alerts to detect potential instrument problems, in case raw data are outside of climatic limits, and graphical monitoring tools. Delayed time processing relies on a dedicated software for attribution of data quality flags by visual inspection, and correction of TSG time series by comparison with daily water samples and collocated Argo data. A method for optimizing the automatic attribution of quality flags in real time, based on testing different thresholds for data deviation from climatology and retroactively comparing the resulting flags to delayed time flags, is presented. The SSS-OS real time data feed the Coriolis operational oceanography database, while the research-quality delayed time data can be extracted for selected time and geographical ranges through a graphical web interface. Delayed time data have been also combined with other SSS data sources to produce gridded files for the Pacific and Atlantic oceans. A short review of the research activities conducted with such data is given. It includes observation-based process-oriented and climate studies from regional to global scale as well as studies where in situ SSS is used for calibration/validation of models, coral proxies or satellite data.
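
    A toy version of the real-time quality-control step described above can be sketched as follows (this is not the SSS-OS processing code): observations whose deviation from a climatology exceeds a chosen threshold are flagged, and candidate thresholds are scored against delayed-time flags. All data and thresholds below are synthetic.

```python
# Toy version of the real-time quality-control step described above
# (not the SSS-OS code): flag observations whose deviation from a
# climatology exceeds a chosen threshold, and score candidate thresholds
# against delayed-time flags. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
climatology = 35.0 + 0.3 * np.sin(np.linspace(0, 6 * np.pi, n))  # expected SSS
sss = climatology + 0.2 * rng.normal(size=n)
bad = rng.random(n) < 0.02                     # simulated instrument problems
sss[bad] += rng.choice([-3.0, 3.0], size=bad.sum())
delayed_time_flags = bad                       # pretend expert flags are perfect

def realtime_flags(obs, clim, threshold):
    return np.abs(obs - clim) > threshold

for thr in (0.5, 1.0, 2.0):                    # candidate thresholds (salinity units)
    rt = realtime_flags(sss, climatology, thr)
    hits = np.sum(rt & delayed_time_flags) / max(delayed_time_flags.sum(), 1)
    false_alarms = np.sum(rt & ~delayed_time_flags) / max((~delayed_time_flags).sum(), 1)
    print(f"threshold {thr:3.1f}: hit rate {hits:.2f}, false-alarm rate {false_alarms:.3f}")
```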

  11. The French Contribution to the Voluntary Observing Ships Network of Sea Surface Salinity

    NASA Astrophysics Data System (ADS)

    Delcroix, T. C.; Alory, G.; Téchiné, P.; Diverrès, D.; Varillon, D.; Cravatte, S. E.; Gouriou, Y.; Grelet, J.; Jacquin, S.; Kestenare, E.; Maes, C.; Morrow, R.; Perrier, J.; Reverdin, G. P.; Roubaud, F.

    2016-02-01

    Sea Surface Salinity (SSS) is an essential climate variable that requires long term in situ observation. The French SSS Observation Service (SSS-OS) manages a network of Voluntary Observing Ships equipped with thermosalinographs (TSG). The network is global though more concentrated in the tropical Pacific and North Atlantic oceanic basins. The acquisition system is autonomous with real time transmission and is regularly serviced at harbor calls. There are distinct real time and delayed time processing chains. Real time processing includes automatic alerts to detect potential instrument problems, in case raw data are outside of climatic limits, and graphical monitoring tools. Delayed time processing relies on a dedicated software for attribution of data quality flags by visual inspection, and correction of TSG time series by comparison with daily water samples and collocated Argo data. A method for optimizing the automatic attribution of quality flags in real time, based on testing different thresholds for data deviation from climatology and retroactively comparing the resulting flags to delayed time flags, is presented. The SSS-OS real time data feed the Coriolis operational oceanography database, while the research-quality delayed time data can be extracted for selected time and geographical ranges through a graphical web interface. Delayed time data have been also combined with other SSS data sources to produce gridded files for the Pacific and Atlantic oceans. A short review of the research activities conducted with such data is given. It includes observation-based process-oriented and climate studies from regional to global scale as well as studies where in situ SSS is used for calibration/validation of models, coral proxies or satellite data.

  12. How to Obtain the Lorentz Space Contraction Formula for a Moving Rod from Knowledge of the Positions of its Ends at Different Times

    ERIC Educational Resources Information Center

    Guasti, M. Fernandez; Zagoya, C.

    2009-01-01

    The Lorentz length contraction for a rod in uniform motion is derived performing two measurements at arbitrary times. Provided that the velocity of the rod is known, this derivation does not require the simultaneous measurement of two events. It thus avoids uncomfortable superluminal relationships. Furthermore, since the observer's simultaneous…
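
    A compact version of the kind of derivation the abstract describes is sketched below; the notation is assumed here rather than taken from the paper.

```latex
% Sketch of the kind of derivation described above (notation assumed here,
% not taken from the paper). In the rod's rest frame its ends sit at
% x' = 0 and x' = L_0. Using x' = \gamma (x - vt), the ends' worldlines in
% the observer's frame are x_A(t) = vt and x_B(t) = vt + L_0/\gamma.
% Measuring the rear end at time t_1 and the front end at time t_2 and
% correcting for the known motion then gives the contracted length without
% requiring simultaneous measurements:
\begin{align*}
  x_B(t_2) - x_A(t_1) - v\,(t_2 - t_1)
     &= \bigl(v t_2 + L_0/\gamma\bigr) - v t_1 - v\,(t_2 - t_1)
      = \frac{L_0}{\gamma},
  \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} .
\end{align*}
```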

  13. JAMES WEBB SPACE TELESCOPE CAN DETECT KILONOVAE IN GRAVITATIONAL WAVE FOLLOW-UP SEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartos, I.; Márka, S.; Huard, T. L., E-mail: ibartos@phys.columbia.edu

    Kilonovae represent an important electromagnetic counterpart for compact binary mergers, which could become the most commonly detected gravitational-wave (GW) source. Follow-up observations of kilonovae, triggered by GW events, are nevertheless difficult due to poor localization by GW detectors and due to their faint near-infrared peak emission, which has limited observational capability. We show that the Near-Infrared Camera (NIRCam) on the James Webb Space Telescope will be able to detect kilonovae within the relevant GW-detection range of ∼200 Mpc in short (≲12-s) exposure times for a week following the merger. Despite this sensitivity, a kilonova search fully covering a fiducial localized area of 10 deg^2 will not be viable with NIRCam due to its limited field of view. However, targeted surveys may be developed to optimize the likelihood of discovering kilonovae efficiently within limited observing time. We estimate that a survey of 10 deg^2 focused on galaxies within 200 Mpc would require about 13 hr, dominated by overhead times; a survey further focused on galaxies exhibiting high star formation rates would require ∼5 hr. The characteristic time may be reduced to as little as ∼4 hr, without compromising the likelihood of detecting kilonovae, by surveying sky areas associated with 50%, rather than 90%, confidence regions of 3 GW events, rather than a single event. Upon the detection and identification of a kilonova, a limited number of NIRCam follow-up observations could constrain the properties of matter ejected by the binary and the equation of state of dense nuclear matter.

  14. [Daily living activities and oral condition among care facility residents with severe intellectual disabilities. Comparative analyses between residents receiving tooth-brushing assistance and those not receiving tooth-brushing assistance].

    PubMed

    Chiwata, Kaoru; Takeda, Fumi

    2007-06-01

    To clarify 1) differences in daily living activities and oral condition among care facility residents with severe intellectual disabilities and 2) chronological changes in daily living activities and oral condition for residents receiving tooth-brushing assistance and those never receiving tooth-brushing assistance. Subjects were 44 residents at a care facility for individuals with severe intellectual disabilities, who underwent dental screening in July 1994 and October 2003. At each time point, daily living activities, behavior during oral health guidance, behavior during dental health screening and oral condition were compared between residents receiving tooth-brushing assistance (assistance group) and those not receiving tooth-brushing assistance (independent group). Furthermore, chronological changes were analyzed for residents requiring assistance at both screenings, those requiring assistance only at the second screening, and those not requiring assistance at either screening. 1) In the assistance group, 100% and 36.4% of residents were unable to brush their teeth independently in 1994 and 2003, respectively. Significant differences between the assistance and independent groups were observed in all items of behavior during dental health screening in 1994, but not in 2003. No significant intergroup differences in oral condition were observed in 1994, but differences were seen in 2003; when compared to the assistance group, the number of lost teeth was significantly higher in the independent group, while the number of remaining teeth was lower. 2) Regarding changes over the nine-year period, a significantly greater proportion of residents not requiring assistance at either screening and those requiring assistance only at the second screening finally required assistance in bathing. As for oral condition, no significant changes in healthy teeth were observed in residents requiring assistance at both screening time points, while significant increases in dental caries and filled teeth and a significant decrease in the number of healthy teeth were observed in residents requiring assistance only at the second screening and those not requiring assistance at either screening. Over the nine-year period, the subjects of tooth-brushing assistance changed, and assistance was given to those able to brush their teeth independently in addition to those unable to brush their teeth independently. The number of healthy teeth did not change in residents receiving tooth-brushing assistance during this period, but in residents never receiving tooth-brushing assistance, decrease was noted. Therefore, even for individuals able to brush their teeth independently, some form of tooth-brushing assistance is needed to sufficiently prevent oral diseases.

  15. Multi-instrument observation on co-seismic ionospheric effects after great Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Hao, Y. Q.; Xiao, Z.; Zhang, D. H.

    2012-02-01

    In this paper, evidence of quake-excited infrasonic waves is provided first by a multi-instrument observation of Japan's Tohoku earthquake. The observations of co-seismic infrasonic waves are as follows: (1) effects of surface oscillations are observed by a local infrasonic detector, and these effects appear to be due to surface oscillation-excited infrasonic waves rather than a direct influence of seismic vibration on the detector; (2) these locally excited infrasonic waves propagate upwards and correspond to ionospheric disturbances observed by Doppler shift measurements and GPS/TEC; (3) interactions between electron density variation and currents in the ionosphere caused by infrasonic waves manifest as disturbances in the geomagnetic field observed via surface magnetograms; (4) within 4 hours after this strong earthquake, disturbances in the ionosphere related to arrivals of Rayleigh waves were observed by Doppler shift sounding three times over. Two of the arrivals were from the epicenter along the minor arc of the great circle (with the second arrival due to a Rayleigh wave propagating completely around the planet) and the other one was from the opposite direction. All of these seismo-ionospheric effects observed by HF Doppler shift appear after local arrivals of surface Rayleigh waves, with a time delay of 8-10 min. This is the time required for an infrasonic wave to propagate upwards to the ionosphere.
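
    As a rough plausibility check of the quoted 8-10 minute delay (not the authors' calculation), the sketch below integrates the vertical acoustic travel time up to typical HF-reflection heights through a coarse, assumed sound-speed profile; every number in the profile is an approximation.

```python
# Crude order-of-magnitude check of the quoted 8-10 min delay (not the
# authors' calculation): integrate the vertical travel time of an acoustic
# wave up to HF-reflection heights (~250 km assumed) through a rough,
# assumed piecewise sound-speed profile. All profile values are coarse
# approximations used only for illustration.
layers = [(15, 0.32), (50, 0.30), (90, 0.28),
          (120, 0.40), (180, 0.60), (250, 0.80)]  # (top of layer km, speed km/s)

travel_time, z_prev = 0.0, 0.0
for z_top, c in layers:
    travel_time += (z_top - z_prev) / c
    z_prev = z_top

print(f"vertical travel time to ~250 km: {travel_time / 60:.1f} minutes")
```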

  16. Detecting climate variations and change: New challenges for observing and data management systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karl, T.R.; Quayle, R.G.; Groisman, P.Ya.

    1993-08-01

    Several essential aspects of weather observing and the management of weather data related to improving knowledge of climate variations and change in the surface boundary layer and the consequences for socioeconomic and biogeophysical systems are discussed. The issues include long-term homogeneous time series of routine weather observations; time- and space-scale resolution of datasets derived from the observations; information about observing systems, data collection systems, and data reduction algorithms; and the enhancement of weather observing systems to serve as climate observing systems. Although much has been learned from existing weather networks and methods of data management, the system is far from perfect. Several vital areas have not received adequate attention. Particular improvements are needed in the interaction between network designers and climatologists; operational analyses that focus on detecting and documenting outliers and time-dependent biases within datasets; developing the means to cope with and minimize potential inhomogeneities in weather observing systems; and authoritative documentation of how various aspects of climate have or have not changed. In this last area, close attention must be given to time and space resolution of the data. In many instances the time and space resolution requirements for understanding why the climate changes are not synonymous with understanding how it has changed or varied. This is particularly true within the surface boundary layer. A standard global daily/monthly climate message should also be introduced to supplement current Global Telecommunication System's CLIMAT data. Overall, a call is made for improvements in routine weather observing, data management, and analysis systems. Routine observations have provided (and will continue to provide) most of the information regarding how the climate has changed during the last 100 years affecting where we live, work, and grow our food.

  17. Mapping atomic motions with ultrabright electrons: towards fundamental limits in space-time resolution.

    PubMed

    Manz, Stephanie; Casandruc, Albert; Zhang, Dongfang; Zhong, Yinpeng; Loch, Rolf A; Marx, Alexander; Hasegawa, Taisuke; Liu, Lai Chung; Bayesteh, Shima; Delsim-Hashemi, Hossein; Hoffmann, Matthias; Felber, Matthias; Hachmann, Max; Mayet, Frank; Hirscht, Julian; Keskin, Sercan; Hada, Masaki; Epp, Sascha W; Flöttmann, Klaus; Miller, R J Dwayne

    2015-01-01

    The long held objective of directly observing atomic motions during the defining moments of chemistry has been achieved based on ultrabright electron sources that have given rise to a new field of atomically resolved structural dynamics. This class of experiments requires not only simultaneous sub-atomic spatial resolution with temporal resolution on the 100 femtosecond time scale but also has brightness requirements approaching single shot atomic resolution conditions. The brightness condition is in recognition that chemistry leads generally to irreversible changes in structure during the experimental conditions and that the nanoscale thin samples needed for electron structural probes pose upper limits to the available sample or "film" for atomic movies. Even in the case of reversible systems, the degree of excitation and thermal effects require the brightest sources possible for a given space-time resolution to observe the structural changes above background. Further progress in the field, particularly to the study of biological systems and solution reaction chemistry, requires increased brightness and spatial coherence, as well as an ability to tune the electron scattering cross-section to meet sample constraints. The electron bunch density or intensity depends directly on the magnitude of the extraction field for photoemitted electron sources and electron energy distribution in the transverse and longitudinal planes of electron propagation. This work examines the fundamental limits to optimizing these parameters based on relativistic electron sources using re-bunching cavity concepts that are now capable of achieving 10 femtosecond time scale resolution to capture the fastest nuclear motions. This analysis is given for both diffraction and real space imaging of structural dynamics in which there are several orders of magnitude higher space-time resolution with diffraction methods. The first experimental results from the Relativistic Electron Gun for Atomic Exploration (REGAE) are given that show the significantly reduced multiple electron scattering problem in this regime, which opens up micron scale systems, notably solution phase chemistry, to atomically resolved structural dynamics.

  18. Near-Earth-Object identification over apparitions using n-body ranging

    NASA Astrophysics Data System (ADS)

    Granvik, Mikael; Muinonen, Karri

    2007-05-01

    Earth-based telescopes can observe Near-Earth objects (NEOs) continuously for a few weeks or months during each apparition. Due to the usually complicated dynamics of the Sun-Earth-NEO triplet, the time interval between consecutive apparitions typically ranges from months to several years. On these timescales single-apparition sets of observations (SASs) having reasonably small observational time-intervals lead to substantial orbital uncertainties. The linking of SASs over apparitions thus becomes a nontrivial task. Of a total of roughly 4,100 NEO observation sets, or orbits, currently known, some 500 are SASs for which the observational time interval is less than 7 days. Either these SASs have not been observed at an apparition following the discovery apparition (some 40% of the above NEO SASs have been obtained in 2005 or later), or the linkage of SASs has failed, an option which should preferably be eliminated. As a continuation to our work on the short-arc linking problem at the discovery moment (Granvik and Muinonen, 2005, Icarus 179, p. 109), we have investigated the possibility of using a similar method for the linking of SASs over apparitions. Assuming that the observational time-interval for SASs of NEOs is typically at least one day (minimum requirement set by the Minor Planet Center), the orbital-element probability density function is constrained as compared to the typical short-arc case with an observational time interval of only a few tens of minutes. Because of the smaller orbital-element uncertainty, we can use the short-arc method (comparison in ephemeris space) for longer time spans, or even do the comparison directly in orbital-element space (Keplerian, equinoctial, etc.), thus allowing us to assess the problem of linking SASs of NEOs. We will present linking results by using both simulated and real NEO SASs.

  19. The Research on Lunar Calibration of GF-4 Satellite

    NASA Astrophysics Data System (ADS)

    Qi, W.; Tan, W.

    2018-04-01

    Starting from the lunar observation requirements of the GF-4 satellite, the main indices, such as the resolution, the imaging field, the reflected radiance and the imaging integration time, are analyzed in combination with the imaging features and parameters of the camera. The analysis results show that the lunar observation of the GF-4 satellite has high resolution and a wide field that can image the whole moon, that the pupil radiance reflected by the moon is within the dynamic range of the camera, and that the lunar image quality can be better guaranteed by setting a reasonable integration time. At the same time, the radiation transmission model of the lunar radiometric calibration is traced and the radiometric accuracy is evaluated.

  20. Does the Sun Have a Full-Time Chromosphere?

    NASA Astrophysics Data System (ADS)

    Kalkofen, Wolfgang; Ulmschneider, Peter; Avrett, Eugene H.

    1999-08-01

    The successful modeling of the dynamics of H2v bright points in the nonmagnetic chromosphere by Carlsson & Stein gave as a by-product a part-time chromosphere lacking the persistent outward temperature increase of time-average empirical models, which is needed to explain observations of UV emission lines and continua. We discuss the failure of the dynamical model to account for most of the observed chromospheric emission, arguing that their model uses only about 1% of the acoustic energy supplied to the medium. Chromospheric heating requires an additional source of energy in the form of acoustic waves of short period (P<2 minutes), which form shocks and produce the persistent outward temperature increase that can account for the UV emission lines and continua.

  1. Kinetic Features in the Ion Flux Spectrum

    NASA Astrophysics Data System (ADS)

    Vafin, S.; Riazantseva, M.; Yoon, P. H.

    2017-11-01

    An interesting feature of solar wind fluctuations is the occasional presence of a well-pronounced peak near the spectral knee. These peaks are well investigated in the context of magnetic field fluctuations in the magnetosheath, and they are typically related to kinetic plasma instabilities. Recently, similar peaks were observed in the spectrum of ion flux fluctuations of the solar wind and magnetosheath. In this paper, we propose a simple analytical model to describe such peaks in the ion flux spectrum based on the linear theory of plasma fluctuations. We compare our predictions with a sample observation in the solar wind. For the given observation, the peak requires ~10 minutes to grow to the observed level, which agrees with the quasi-linear relaxation time. Moreover, our model reproduces the form of the measured peak in the ion flux spectrum well. The observed lifetime of the peak is about 50 minutes, which is relatively close to the nonlinear Landau damping time of 30-40 minutes. Overall, our model proposes a plausible scenario explaining the observation.

  2. Correlations of circulating peptide YY and ghrelin with body weight, rate of weight gain, and time required to achieve the recommended daily intake in preterm infants.

    PubMed

    Chen, XiaFang; Du, XueLiang; Zhu, JianXing; Xie, LiJuan; Zhang, YongJun; He, ZhenJuan

    2012-07-01

    The objective was to elucidate the relationships between serum concentrations of the gut hormone peptide YY (PYY) and ghrelin and growth development in infants for potential application to the clinical observation index. Serum concentrations of PYY and ghrelin were measured using radioimmunoassay from samples collected at the clinic. For each patient, gestational age, birth weight, time required to return to birth weight, rate of weight gain, time required to achieve recommended daily intake (RDI) standards, time required for full-gastric feeding, duration of hospitalization, and time of administration of total parenteral nutrition were recorded. Serum PYY and ghrelin concentrations were significantly higher in the preterm group (N = 20) than in the full-term group (N = 20; P < 0.01). Within the preterm infant group, the serum concentrations of PYY and ghrelin on postnatal day (PND) 7 (ghrelin = 1485.38 ± 409.24; PYY = 812.37 ± 153.77 ng/L) were significantly higher than on PND 1 (ghrelin = 956.85 ± 223.09; PYY = 545.27 ± 204.51 ng/L) or PND 3 (ghrelin = 1108.44 ± 351.36; PYY = 628.96 ± 235.63 ng/L; P < 0.01). Both serum PYY and ghrelin concentrations were negatively correlated with body weight, and the degree of correlation varied with age. Serum ghrelin concentration correlated negatively with birth weight and positively with the time required to achieve RDI (P < 0.05). In conclusion, serum PYY and ghrelin concentrations reflect a negative energy balance, predict postnatal growth, and enable compensation. Further studies are required to elucidate the precise concentration and roles of PYY and ghrelin in newborns and to determine the usefulness of measuring these hormones in clinical practice.

  3. Correlations of circulating peptide YY and ghrelin with body weight, rate of weight gain, and time required to achieve the recommended daily intake in preterm infants

    PubMed Central

    Chen, XiaFang; Du, Xueliang; Zhu, JianXing; Xie, LiJuan; Zhang, YongJun; He, ZhenJuan

    2012-01-01

    The objective was to elucidate the relationships between serum concentrations of the gut hormone peptide YY (PYY) and ghrelin and growth development in infants for potential application to the clinical observation index. Serum concentrations of PYY and ghrelin were measured using radioimmunoassay from samples collected at the clinic. For each patient, gestational age, birth weight, time required to return to birth weight, rate of weight gain, time required to achieve recommended daily intake (RDI) standards, time required for full-gastric feeding, duration of hospitalization, and time of administration of total parenteral nutrition were recorded. Serum PYY and ghrelin concentrations were significantly higher in the preterm group (N = 20) than in the full-term group (N = 20; P < 0.01). Within the preterm infant group, the serum concentrations of PYY and ghrelin on postnatal day (PND) 7 (ghrelin = 1485.38 ± 409.24; PYY = 812.37 ± 153.77 ng/L) were significantly higher than on PND 1 (ghrelin = 956.85 ± 223.09; PYY = 545.27 ± 204.51 ng/L) or PND 3 (ghrelin = 1108.44 ± 351.36; PYY = 628.96 ± 235.63 ng/L; P < 0.01). Both serum PYY and ghrelin concentrations were negatively correlated with body weight, and the degree of correlation varied with age. Serum ghrelin concentration correlated negatively with birth weight and positively with the time required to achieve RDI (P < 0.05). In conclusion, serum PYY and ghrelin concentrations reflect a negative energy balance, predict postnatal growth, and enable compensation. Further studies are required to elucidate the precise concentration and roles of PYY and ghrelin in newborns and to determine the usefulness of measuring these hormones in clinical practice. PMID:22527125

  4. A switched systems approach to image-based estimation

    NASA Astrophysics Data System (ADS)

    Parikh, Anup

    With the advent of technological improvements in imaging systems and computational resources, as well as the development of image-based reconstruction techniques, it is necessary to understand algorithm performance when subject to real world conditions. Specifically, this dissertation focuses on the stability and performance of a class of image-based observers in the presence of intermittent measurements, caused by e.g., occlusions, limited FOV, feature tracking losses, communication losses, or finite frame rates. Observers or filters that are exponentially stable under persistent observability may have unbounded error growth during intermittent sensing, even while providing seemingly accurate state estimates. In Chapter 3, dwell time conditions are developed to guarantee state estimation error convergence to an ultimate bound for a class of observers while undergoing measurement loss. Bounds are developed on the unstable growth of the estimation errors during the periods when the object being tracked is not visible. A Lyapunov-based analysis for the switched system is performed to develop an inequality in terms of the duration of time the observer can view the moving object and the duration of time the object is out of the field of view. In Chapter 4, a motion model is used to predict the evolution of the states of the system while the object is not visible. This reduces the growth rate of the bounding function to an exponential and enables the use of traditional switched systems Lyapunov analysis techniques. The stability analysis results in an average dwell time condition to guarantee state error convergence with a known decay rate. In comparison with the results in Chapter 3, the estimation errors converge to zero rather than a ball, with relaxed switching conditions, at the cost of requiring additional information about the motion of the feature. In some applications, a motion model of the object may not be available. Numerous adaptive techniques have been developed to compensate for unknown parameters or functions in system dynamics; however, persistent excitation (PE) conditions are typically required to ensure parameter convergence, i.e., learning. Since the motion model is needed in the predictor, model learning is desired; however, PE is difficult to insure a priori and infeasible to check online for nonlinear systems. Concurrent learning (CL) techniques have been developed to use recorded data and a relaxed excitation condition to ensure convergence. In CL, excitation is only required for a finite period of time, and the recorded data can be checked to determine if it is sufficiently rich. However, traditional CL requires knowledge of state derivatives, which are typically not measured and require extensive filter design and tuning to develop satisfactory estimates. In Chapter 5 of this dissertation, a novel formulation of CL is developed in terms of an integral (ICL), removing the need to estimate state derivatives while preserving parameter convergence properties. Using ICL, an estimator is developed in Chapter 6 for simultaneously estimating the pose of an object as well as learning a model of its motion for use in a predictor when the object is not visible. A switched systems analysis is provided to demonstrate the stability of the estimation and prediction with learning scheme. Dwell time conditions as well as excitation conditions are developed to ensure estimation errors converge to an arbitrarily small bound. 
Experimental results are provided to illustrate the performance of each of the developed estimation schemes. The dissertation concludes with a discussion of the contributions and limitations of the developed techniques, as well as avenues for future extensions.
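
    The integral form of concurrent learning described above replaces state derivatives with integrals of the measured state over short windows, giving a regression on the unknown parameters that uses only recorded data. A minimal sketch of that idea for a scalar, linearly parameterized system is given below; the example dynamics, gains, window length, and update loop are illustrative assumptions, not the dissertation's observer.

```python
# Minimal sketch of integral concurrent learning (ICL) for a linearly
# parameterized system xdot = Y(x, t) @ theta. Integrating the dynamics over
# a window gives  x(t) - x(t - Delta) = (integral of Y) @ theta,
# so the unknown parameters can be learned without estimating xdot.
# The scalar example system and all gains are hypothetical.
import numpy as np

theta_true = np.array([-1.0, 2.0])          # unknown parameters
Y = lambda x, t: np.array([x, np.sin(t)])   # known regressor

# Simulate the "measured" trajectory with simple Euler integration.
dt, T = 0.001, 20.0
ts = np.arange(0.0, T, dt)
xs = np.zeros_like(ts)
for k in range(1, len(ts)):
    xs[k] = xs[k - 1] + dt * Y(xs[k - 1], ts[k - 1]) @ theta_true

# Build the ICL history stack of (state difference, integrated regressor) pairs.
window = int(0.5 / dt)
stack_U, stack_S = [], []
for k in range(window, len(ts), window):
    Ys = np.array([Y(xs[j], ts[j]) for j in range(k - window, k)])
    stack_S.append(Ys.sum(axis=0) * dt)      # integrated regressor over the window
    stack_U.append(xs[k] - xs[k - window])   # measured state difference

stack_S, stack_U = np.array(stack_S), np.array(stack_U)

# Gradient-style ICL update over the recorded stack: no state derivative needed.
theta_hat = np.zeros(2)
gamma = 0.01
for _ in range(2000):
    theta_hat += gamma * stack_S.T @ (stack_U - stack_S @ theta_hat)

print("estimate:", theta_hat, " true:", theta_true)
```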

  5. Sunflower Seeds at Syracuse.

    ERIC Educational Resources Information Center

    Shaw, Kenneth A.

    1993-01-01

    Syracuse University (New York) has found that the focused, collaborative approach to institutional administration embodied in Total Quality Management is beneficial. It requires allocation of adequate time and resources, observation of others, communication, addressing high-need areas first, and willingness to adapt, but it may bring resistance…

  6. Critical Requirements for Army War College Faculty Instructors

    DTIC Science & Technology

    1989-04-26

    may be traced to studies of Sir Francis Galton and to developments such as time sampling studies of recreational activities, controlled observation...administrators show low correlations with objective measures such as student test scores. Secondly, predictors of teacher effectiveness such as intelligence

  7. The implementation of a mobile problem-specific electronic CEX for assessing directly observed student-patient encounters.

    PubMed

    Ferenchick, Gary S; Foreback, Jami; Towfiq, Basim; Kavanaugh, Kevin; Solomon, David; Mohmand, Asad

    2010-01-29

    Facilitating direct observation of medical students' clinical competencies is a pressing need. We developed an electronic problem-specific Clinical Evaluation Exercise (eCEX) based on a national curriculum. We assessed its feasibility in monitoring and recording students' competencies and the impact of a grading incentive on the frequency of direct observations in an internal medicine clerkship. Students (n = 56) at three clinical sites used the eCEX and comparison students (n = 56) at three other clinical sites did not. Students in the eCEX group were required to arrange 10 evaluations with faculty preceptors. Students in the second group were required to document a single, faculty observed 'Full History and Physical' encounter with a patient. Students and preceptors were surveyed at the end of each rotation. eCEX increased students' and evaluators' understanding of direct-observation objectives and had a positive impact on the evaluators' ability to provide feedback and assessments. The grading incentive increased the number of times a student reported direct observation by a resident preceptor. eCEX appears to be an effective means of enhancing student evaluation.

  8. Unobtrusive Software and System Health Management with R2U2 on a Parallel MIMD Coprocessor

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Moosbrugger, Patrick

    2017-01-01

    Dynamic monitoring of software and system health of a complex cyber-physical system requires observers that continuously monitor variables of the embedded software in order to detect anomalies and reason about root causes. There exists a variety of techniques for code instrumentation, but instrumentation might change runtime behavior and could require costly software re-certification. In this paper, we present R2U2E, a novel realization of our real-time, Realizable, Responsive, and Unobtrusive Unit (R2U2). The R2U2E observers are executed in parallel on a dedicated 16-core EPIPHANY co-processor, thereby avoiding additional computational overhead to the system under observation. A DMA-based shared memory access architecture allows R2U2E to operate without any code instrumentation or program interference.

  9. A Nonparametric Approach For Representing Interannual Dependence In Monthly Streamflow Sequences

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Oneill, R.

    The estimation of risks associated with water management plans requires generation of synthetic streamflow sequences. The mathematical algorithms used to generate these sequences at monthly time scales are found lacking in two main respects: inability in preserving dependence attributes particularly at large (seasonal to interannual) time lags; and, a poor representation of observed distributional characteristics, in particular, representation of strong asymmetry or multimodality in the probability density function. Proposed here is an alternative that naturally incorporates both observed dependence and distributional attributes in the generated sequences. Use of a nonparametric framework provides an effective means for representing the observed probability distribution, while the use of a 'variable kernel' ensures accurate modeling of streamflow data sets that contain a substantial number of zero flow values. A careful selection of prior flows imparts the appropriate short-term memory, while use of an 'aggregate' flow variable allows representation of interannual dependence. The nonparametric simulation model is applied to monthly flows from the Beaver River near Beaver, Utah, USA, and the Burrendong dam inflows, New South Wales, Australia. Results indicate that while the use of traditional simulation approaches leads to an inaccurate representation of dependence at long (annual and interannual) time scales, the proposed model can simulate both short and long-term dependence. As a result, the proposed model ensures a significantly improved representation of reservoir storage statistics, particularly for systems influenced by long droughts. It is important to note that the proposed method offers a simpler and better alternative to conventional disaggregation models as: (a) a separate annual flow series is not required, (b) stringent assumptions relating annual and monthly flows are not needed, and (c) the method does not require the specification of a "water year", instead ensuring that the sum of any sequence of flows lasting twelve months will result in the type of dependence that is observed in the historical annual flow series.
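
    The abstract describes conditioning each generated monthly flow on recent flows and on an aggregate flow variable so that both short-term and interannual dependence are carried into the simulated sequence. The sketch below illustrates that conditioning structure with a k-nearest-neighbour bootstrap rather than the authors' variable-kernel estimator; the feature set, the value of k, and the synthetic "historical" record are assumptions made only for illustration.

```python
# Minimal sketch of nonparametric conditional simulation of monthly streamflow:
# a k-nearest-neighbour bootstrap conditioned on the previous month's flow and
# on the aggregate of the previous twelve months. This is NOT the paper's
# variable-kernel method; it only illustrates the conditioning idea.
import numpy as np

rng = np.random.default_rng(0)

def knn_monthly_simulation(q_hist, n_years, k=10):
    """q_hist: historical monthly flows, shape (n_months,); returns simulated flows."""
    n = len(q_hist)
    # Conditioning features for each month t: [q(t-1), sum(q(t-12..t-1))]
    feats = np.array([[q_hist[t - 1], q_hist[t - 12:t].sum()]
                      for t in range(12, n)])
    succ = q_hist[12:]                       # flow that followed each feature vector
    scale = feats.std(axis=0)                # simple feature standardisation

    sim = list(q_hist[:12])                  # seed with one historical year
    for _ in range(12 * n_years):
        state = np.array([sim[-1], np.sum(sim[-12:])])
        d = np.linalg.norm((feats - state) / scale, axis=1)
        nbrs = np.argsort(d)[:k]
        w = 1.0 / np.arange(1, k + 1)        # weight closer neighbours more heavily
        sim.append(succ[rng.choice(nbrs, p=w / w.sum())])
    return np.array(sim[12:])

# Illustrative "historical" record: 50 years of seasonal, skewed monthly flows.
months = np.arange(50 * 12)
q_hist = rng.gamma(shape=2.0, scale=1.5 + np.sin(2 * np.pi * months / 12), size=50 * 12)
q_sim = knn_monthly_simulation(q_hist, n_years=100)
print(q_sim.mean(), q_hist.mean())
```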

  10. The role of adsorbed water on the friction of a layer of submicron particles

    USGS Publications Warehouse

    Sammis, Charles G.; Lockner, David A.; Reches, Ze’ev

    2011-01-01

    Anomalously low values of friction observed in layers of submicron particles deformed in simple shear at high slip velocities are explained as the consequence of a one nanometer thick layer of water adsorbed on the particles. The observed transition from normal friction with an apparent coefficient near μ = 0.6 at low slip speeds to a coefficient near μ = 0.3 at higher slip speeds is attributed to competition between the time required to extrude the water layer from between neighboring particles in a force chain and the average lifetime of the chain. At low slip speeds the time required for extrusion is less than the average lifetime of a chain so the particles make contact and lock. As slip speed increases, the average lifetime of a chain decreases until it is less than the extrusion time and the particles in a force chain never come into direct contact. If the adsorbed water layer enables the otherwise rough particles to rotate, the coefficient of friction will drop to μ = 0.3, appropriate for rotating spheres. At the highest slip speeds particle temperatures rise above 100°C, the water layer vaporizes, the particles contact and lock, and the coefficient of friction rises to μ = 0.6. The observed onset of weakening at slip speeds near 0.001 m/s is consistent with the measured viscosity of a 1 nm thick layer of adsorbed water, with a minimum particle radius of approximately 20 nm, and with reasonable assumptions about the distribution of force chains guided by experimental observation. The reduction of friction and the range of velocities over which it occurs decrease with increasing normal stress, as predicted by the model. Moreover, the analysis predicts that this high-speed weakening mechanism should operate only for particles with radii smaller than approximately 1 μm. For larger particles the slip speed required for weakening is so large that frictional heating will evaporate the adsorbed water and weakening will not occur.

  11. Spatial Modelling of Soil-Transmitted Helminth Infections in Kenya: A Disease Control Planning Tool

    PubMed Central

    Pullan, Rachel L.; Gething, Peter W.; Smith, Jennifer L.; Mwandawiro, Charles S.; Sturrock, Hugh J. W.; Gitonga, Caroline W.; Hay, Simon I.; Brooker, Simon

    2011-01-01

    Background: Implementation of control of parasitic diseases requires accurate, contemporary maps that provide intervention recommendations at policy-relevant spatial scales. To guide control of soil-transmitted helminths (STHs), maps are required of the combined prevalence of infection, indicating where this prevalence exceeds an intervention threshold of 20%. Here we present a new approach for mapping the observed prevalence of STHs, using the example of Kenya in 2009. Methods and Findings: Observed prevalence data for hookworm, Ascaris lumbricoides and Trichuris trichiura were assembled for 106,370 individuals from 945 cross-sectional surveys undertaken between 1974 and 2009. Ecological and climatic covariates were extracted from high-resolution satellite data and matched to survey locations. Bayesian space-time geostatistical models were developed for each species, and were used to interpolate the probability that infection prevalence exceeded the 20% threshold across the country for both 1989 and 2009. Maps for each species were integrated to estimate combined STH prevalence using the law of total probability and incorporating a correction factor to adjust for associations between species. Population census data were combined with risk models and projected to estimate the population at risk and requiring treatment in 2009. In most areas for 2009, there was high certainty that endemicity was below the 20% threshold, with areas of endemicity ≥20% located around the shores of Lake Victoria and on the coast. Comparison of the predicted distributions for 1989 and 2009 shows how observed STH prevalence has gradually decreased over time. The model estimated that a total of 2.8 million school-age children live in districts which warrant mass treatment. Conclusions: Bayesian space-time geostatistical models can be used to reliably estimate the combined observed prevalence of STH and suggest that a quarter of Kenya's school-aged children live in areas of high prevalence and warrant mass treatment. As control is successful in reducing infection levels, updated models can be used to refine decision making in helminth control. PMID:21347451
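
    The combination step described above (law of total probability plus a correction for between-species associations) reduces to a short calculation per map pixel. A minimal sketch is given below; the per-species prevalences and the single multiplicative association correction are placeholders for illustration, not the values used in the Kenya analysis.

```python
# Minimal sketch of combining single-species prevalence estimates into a
# combined "any STH" prevalence. Under independence the combined prevalence is
# 1 - (1 - p_hookworm)(1 - p_ascaris)(1 - p_trichuris); the abstract notes a
# correction for between-species association, represented here by a single
# hypothetical multiplicative factor on the joint "uninfected" probability.
import numpy as np

def combined_prevalence(p_species, assoc_correction=1.0):
    """p_species: array (..., n_species) of per-species prevalences in [0, 1].
    assoc_correction > 1 inflates the probability of being uninfected by all
    species (positive association between infections); 1.0 means independence."""
    p_none = np.prod(1.0 - np.asarray(p_species), axis=-1) * assoc_correction
    return 1.0 - np.clip(p_none, 0.0, 1.0)

# Illustrative pixel-level values (not from the Kenya surveys).
p = np.array([[0.15, 0.05, 0.08],    # hookworm, A. lumbricoides, T. trichiura
              [0.30, 0.10, 0.12]])
print(combined_prevalence(p))                       # independence assumption
print(combined_prevalence(p, assoc_correction=1.05))
```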

  12. TH-C-18A-11: Investigating the Minimum Scan Parameters Required to Generate Free-Breathing Fast-Helical CT Scans Without Motion-Artifacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, D; Neylon, J; Dou, T

    Purpose: A recently proposed 4D-CT protocol uses deformable registration of free-breathing fast-helical CT scans to generate a breathing motion model. In order to allow accurate registration, free-breathing images are required to be free of doubling-artifacts, which arise when tissue motion is greater than scan speed. This work identifies the minimum scanner parameters required to successfully generate free-breathing fast-helical scans without doubling-artifacts. Methods: 10 patients were imaged under free breathing conditions 25 times in alternating directions with a 64-slice CT scanner using a low dose fast helical protocol. A high temporal resolution (0.1s) 4D-CT was generated using a patient specific motion model and patient breathing waveforms, and used as the input for a scanner simulation. Forward projections were calculated using helical cone-beam geometry (800 projections per rotation) and a GPU accelerated reconstruction algorithm was implemented. Various CT scanner detector widths and rotation times were simulated, and verified using a motion phantom. Doubling-artifacts were quantified in patient images using structural similarity maps to determine the similarity between axial slices. Results: Increasing amounts of doubling-artifacts were observed with increasing rotation times > 0.2s for 16×1mm slice scan geometry. No significant increase in doubling artifacts was observed for 64×1mm slice scan geometry up to 1.0s rotation time although blurring artifacts were observed >0.6s. Using a 16×1mm slice scan geometry, a rotation time of less than 0.3s (53mm/s scan speed) would be required to produce images of similar quality to a 64×1mm slice scan geometry. Conclusion: The current generation of 16 slice CT scanners, which are present in most Radiation Oncology departments, are not capable of generating free-breathing sorting-artifact-free images in the majority of patients. The next generation of CT scanners should be capable of at least 53mm/s scan speed in order to use a fast-helical 4D-CT protocol to generate a motion-artifact free 4D-CT. NIH R01CA096679.
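
    The artifact metric described above compares neighbouring axial slices with structural similarity maps. The sketch below shows that kind of comparison using scikit-image's SSIM on a stand-in volume; the data, default window size, and the flagging threshold are assumptions for illustration, not the study's pipeline.

```python
# Minimal sketch of quantifying slice-to-slice similarity in a CT volume with
# structural similarity (SSIM): low similarity between neighbouring axial
# slices flags candidate doubling/motion artifacts. The random volume and the
# 0.5 threshold are placeholders; real neighbouring CT slices are highly similar.
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(1)
volume = rng.normal(size=(64, 128, 128))          # (slices, rows, cols) stand-in volume

scores = []
for z in range(volume.shape[0] - 1):
    s = structural_similarity(volume[z], volume[z + 1],
                              data_range=volume.max() - volume.min())
    scores.append(s)

scores = np.array(scores)
flagged = np.where(scores < 0.5)[0]               # hypothetical artifact threshold
print(f"mean adjacent-slice SSIM: {scores.mean():.3f}, flagged slice pairs: {flagged.size}")
```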

  13. Remediation of bromate-contaminated groundwater in an ex situ fixed-film bioreactor.

    PubMed

    Butler, R; Ehrenberg, S; Godley, A R; Lake, R; Lytton, L; Cartmell, E

    2006-07-31

    Use of a pilot-scale fixed-film bioreactor was investigated for remediation of bromate contamination within groundwater. Bromate reduction with stoichiometric production of bromide was observed, providing supporting evidence for complete reduction of bromate with no production of stable intermediates. Reduction of 87-90% bromate from an influent concentration of 1.1 mg L(-1) was observed with retention times of 40-80 h. Lower retention times led to decreases in bromate reduction capability, with 11.5% removal at a 10 h retention time. Nitrate reduction of 76-99% from a 30.7 mg L(-1) as NO(3)(-) influent was observed at retention times of 10-80 h, although an increase in nitrite production to 2.7 mg L(-1) occurred with a 10 h retention time. Backwashing was not required, with the large plastic packing media able to accommodate biomass accumulation without decreases in operational efficiency. This study has provided proof of concept and demonstrated the potential of biological bromate reduction by fixed-film processes for remediation of a bromate contaminated groundwater source.

  14. A Satellite-Based Imaging Instrumentation Concept for Hyperspectral Thermal Remote Sensing.

    PubMed

    Udelhoven, Thomas; Schlerf, Martin; Segl, Karl; Mallick, Kaniska; Bossung, Christian; Retzlaff, Rebecca; Rock, Gilles; Fischer, Peter; Müller, Andreas; Storch, Tobias; Eisele, Andreas; Weise, Dennis; Hupfer, Werner; Knigge, Thiemo

    2017-07-01

    This paper describes the concept of the hyperspectral Earth-observing thermal infrared (TIR) satellite mission HiTeSEM (High-resolution Temperature and Spectral Emissivity Mapping). The scientific goal is to measure specific key variables from the biosphere, hydrosphere, pedosphere, and geosphere related to two global problems of significant societal relevance: food security and human health. The key variables comprise land and sea surface radiation temperature and emissivity, surface moisture, thermal inertia, evapotranspiration, soil minerals and grain size components, soil organic carbon, plant physiological variables, and heat fluxes. The retrieval of this information requires a TIR imaging system with adequate spatial and spectral resolutions and with day-night following observation capability. Another challenge is the monitoring of temporally high dynamic features like energy fluxes, which require adequate revisit time. The suggested solution is a sensor pointing concept to allow high revisit times for selected target regions (1-5 days at off-nadir). At the same time, global observations in the nadir direction are guaranteed with a lower temporal repeat cycle (>1 month). To account for the demand of a high spatial resolution for complex targets, it is suggested to combine in one optic (1) a hyperspectral TIR system with ~75 bands at 7.2-12.5 µm (instrument NEDT 0.05 K-0.1 K) and a ground sampling distance (GSD) of 60 m, and (2) a panchromatic high-resolution TIR-imager with two channels (8.0-10.25 µm and 10.25-12.5 µm) and a GSD of 20 m. The identified science case requires a good correlation of the instrument orbit with Sentinel-2 (maximum delay of 1-3 days) to combine data from the visible and near infrared (VNIR), the shortwave infrared (SWIR) and TIR spectral regions and to refine parameter retrieval.

  15. Ocean OSSEs: recent developments and future challenges

    NASA Astrophysics Data System (ADS)

    Kourafalou, V. H.

    2012-12-01

    Atmospheric OSSEs have had a much longer history of applications than OSSEs (and OSEs) in oceanography. Long standing challenges include the presence of coastlines and steep bathymetric changes, which require the superposition of a wide variety of space and time scales, leading to difficulties in ocean observation and prediction. For instance, remote sensing is critical for providing a quasi-synoptic oceanographic view, but the coverage is limited at the ocean surface. Conversely, in situ measurements are capable of monitoring the entire water column, but at a single location and usually for a specific, short time. Despite these challenges, substantial progress has been made in recent years and international initiatives have provided successful OSSE/OSE examples and formed appropriate forums that helped define the future roadmap. These will be discussed, together with various challenges that require a community effort. Examples include: integrated (remote and in situ) observing system requirements for monitoring large scale and climatic changes, vs. short term variability that is particularly important on the regional and coastal spatial scales; satisfying the needs of both global and regional/coastal nature runs, from development to rigorous evaluation and under a clear definition of metrics; data assimilation in the presence of tides; estimation of real-time river discharges for Earth system modeling. An overview of oceanographic efforts that complement the standard OSSE methodology will also be given. These include ocean array design methods, such as representer-based analysis and adaptive sampling. Exciting new opportunities for both global and regional ocean OSSE/OSE studies have recently become possible with targeted periods of comprehensive data sets, such as the existing Gulf of Mexico observations from multiple sources in the aftermath of the DeepWater Horizon incident and the upcoming airborne AirSWOT, in preparation for the SWOT (Surface Water and Ocean Topography) mission.

  16. VizieR Online Data Catalog: NGC1068 interferometric mid-IR measurements (Lopez-Gonzaga+, 2017)

    NASA Astrophysics Data System (ADS)

    Lopez-Gonzaga, N.; Asmus, D.; Bauer, F. E.; Tristram, K. R. W.; Burtscher, L.; Marinucci, A.; Matt, G.; Harrison, F. A.

    2017-06-01

    Single-aperture mid-infrared images and spectra were taken with the VLT spectrometer and imager for the mid-infrared (VISIR). Interferometric measurements were obtained with the instrument MIDI at the ESO's VLTI facility. Observations with intermediate AT baselines were requested and observed during the nights of January 10, 20, and 23, 2015 using Director's discretionary time (DDT). We additionally included published and unpublished interferometric data from our previous campaigns with the requirement that they were observed (nearly) contemporaneously to the period of X-ray variation or observed a few years before. These include observations taken on the nights of September 21, 26, and 30, 2014, and November 17, 2014, using Guaranteed Time Observations (GTO). For our observations we used the low-resolution NaCl prism with spectral resolution R=λ/Δλ~30 to disperse the light of the beams. A log of the observations and instrument setup can be found in Appendix A. The published data were taken from Lopez-Gonzaga et al. (2014A&A...565A..71L, Cat. J/A+A/565/A71). (2 data files).

  17. Four minutes for a patient, twenty seconds for a relative - an observational study at a university hospital

    PubMed Central

    2010-01-01

    Background: In the modern hospital environment, increasing possibilities in medical examination techniques and increasing documentation tasks claim the physicians' energy and encroach on their time spent with patients. This study aimed to investigate how much time physicians at hospital wards spend on communication with patients and their families and how much time they spend on other specific work tasks. Methods: A non-participatory, observational study was conducted in thirty-six wards at the University Medical Center Freiburg, a 1700-bed academic hospital in Germany. All wards belonging to the clinics of internal medicine, surgery, radiology, neurology, and to the clinic for gynaecology took part in the study. Thirty-four ward doctors from fifteen different medical departments were observed during a randomly chosen complete work day. The physicians' time for communication with patients and relatives and time spent on different working tasks during one day of work were assessed. Results: 374 working hours were analysed. On average, a physician's workday on a university hospital ward added up to 658.91 minutes (10 hrs 58 min; range 490 - 848 min). Looking at single items of time consumption on the evaluation sheet, discussions with colleagues ranked first with 150 minutes on average. Documentation and administrative requirements took an average time of 148 minutes per day and ranked second. Total time for communication with patients and their relatives was 85 minutes per physician and day. Consequently, the available time for communication was 4 minutes and 17 seconds for each patient on the ward and 20 seconds for his or her relatives. Physicians estimated that they communicated twice as long with patients and seven times as long with relatives as they actually did according to this study. Conclusions: Workload and time pressure for physicians working on hospital wards are high. To offer excellent medical treatment combined with patient-centred care and to meet the needs of patients and relatives on hospital wards, physicians should be given more time to focus on core clinical tasks. Time and health care management solutions to minimize time pressure are required. Further research is needed to assess quality of communication in hospital settings. PMID:20380725

  18. Living Color Frame System: PC graphics tool for data visualization

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1993-01-01

    Living Color Frame System (LCFS) is a personal computer software tool for generating real-time graphics applications. It is highly applicable for a wide range of data visualization in virtual environment applications. Engineers often use computer graphics to enhance the interpretation of data under observation. These graphics become more complicated when 'run time' animations are required, such as found in many typical modern artificial intelligence and expert systems. Living Color Frame System solves many of these real-time graphics problems.

  19. Measurement of Ligand–Target Residence Times by 1H Relaxation Dispersion NMR Spectroscopy

    PubMed Central

    2016-01-01

    A ligand-observed 1H NMR relaxation experiment is introduced for measuring the binding kinetics of low-molecular-weight compounds to their biomolecular targets. We show that this approach, which does not require any isotope labeling, is applicable to ligand–target systems involving proteins and nucleic acids of variable molecular size. The experiment is particularly useful for the systematic investigation of low affinity molecules with residence times in the micro- to millisecond time regime. PMID:27933946

  20. Remote sensing requirements as suggested by watershed model sensitivity analyses

    NASA Technical Reports Server (NTRS)

    Salomonson, V. V.; Rango, A.; Ormsby, J. P.; Ambaruch, R.

    1975-01-01

    A continuous simulation watershed model has been used to perform sensitivity analyses that provide guidance in defining remote sensing requirements for the monitoring of watershed features and processes. The results show that out of 26 input parameters having meaningful effects on simulated runoff, six appear to be obtainable with existing remote sensing techniques. Of these six parameters, three require the measurement of the areal extent of surface features (impervious areas, water bodies, and the extent of forested area), two require the discrimination of land use that can be related to the overland flow roughness coefficient or the density of vegetation so as to estimate the magnitude of precipitation interception, and one parameter requires the measurement of distance to get the length over which overland flow typically occurs. Observational goals are also suggested for monitoring such fundamental watershed processes as precipitation, soil moisture, and evapotranspiration. A case study on the Patuxent River in Maryland shows that runoff simulation is improved if recent satellite land use observations are used as model inputs as opposed to less timely topographic map information.

  1. Apollo 15 time and motion study

    NASA Technical Reports Server (NTRS)

    Kubis, J. F.; Elrod, J. T.; Rusnak, R.; Barnes, J. E.

    1972-01-01

    A time and motion study of Apollo 15 lunar surface activity led to examination of four distinct areas of crewmen activity. These areas are: an analysis of lunar mobility, a comparative analysis of tasks performed in 1-g training and lunar EVA, an analysis of the metabolic cost of two activities that are performed in several EVAs, and a fall/near-fall analysis. An analysis of mobility showed that the crewmen used three basic mobility patterns (modified walk, hop, side step) while on the lunar surface. These mobility patterns were utilized as adaptive modes to compensate for the uneven terrain and varied soil conditions that the crewmen encountered. A comparison of the time required to perform tasks at the final 1-g lunar EVA training sessions and the time required to perform the same task on the lunar surface indicates that, in almost all cases, it took significantly more time (on the order of 40%) to perform tasks on the moon. This increased time was observed even after extraneous factors (e.g., hardware difficulties) were factored out.

  2. Data analysis of gravitational-wave signals from spinning neutron stars. III. Detection statistics and computational requirements

    NASA Astrophysics Data System (ADS)

    Jaranowski, Piotr; Królak, Andrzej

    2000-03-01

    We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of the Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work of other authors, that given a 100 Gflops computational power an all-sky search for observation time of 7 days and directed search for observation time of 120 days are possible whereas an all-sky search for 120 days of observation time is computationally prohibitive.
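
    For Gaussian noise the maximum likelihood detection statistic described above reduces to matched filtering, i.e., correlating the data against a template and normalising by the noise. A minimal sketch of that statistic for a toy sinusoidal signal is shown below; the waveform, noise level, and amplitude are illustrative choices and are unrelated to the continuous-wave search parameters discussed in the abstract.

```python
# Minimal sketch of matched filtering for a known signal in white Gaussian
# noise: the detection statistic is the data-template correlation normalised
# so that, in noise alone, it is distributed as a standard normal variable.
import numpy as np

rng = np.random.default_rng(42)
fs, T = 1024.0, 8.0                      # sampling rate [Hz], duration [s]
t = np.arange(0, T, 1.0 / fs)
template = np.sin(2 * np.pi * 50.0 * t)  # known waveform (unit amplitude)

sigma = 1.0                              # white-noise standard deviation
data = 0.2 * template + rng.normal(scale=sigma, size=t.size)

# Matched-filter statistic: correlate and normalise.
norm = sigma * np.sqrt(np.sum(template ** 2))
snr = np.dot(data, template) / norm
print(f"matched-filter SNR: {snr:.1f}")  # expected value ~ 0.2*sqrt(sum(h^2))/sigma ≈ 12.8
```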

  3. Hippocampal activation during the recall of remote spatial memories in radial maze tasks.

    PubMed

    Schlesiger, Magdalene I; Cressey, John C; Boublil, Brittney; Koenig, Julie; Melvin, Neal R; Leutgeb, Jill K; Leutgeb, Stefan

    2013-11-01

    Temporally graded retrograde amnesia is observed in human patients with medial temporal lobe lesions as well as in animal models of medial temporal lobe lesions. A time-limited role for these structures in memory recall has also been suggested by the observation that the rodent hippocampus and entorhinal cortex are activated during the retrieval of recent but not of remote memories. One notable exception is the recall of remote memories for platform locations in the water maze, which requires an intact hippocampus and results in hippocampal activation irrespective of the age of the memory. These findings raise the question whether the hippocampus is always involved in the recall of spatial memories or, alternatively, whether it might be required for procedural computations in the water maze task, such as for calculating a path to a hidden platform. We performed spatial memory testing in radial maze tasks to distinguish between these possibilities. Radial maze tasks require a choice between spatial locations on a center platform and thus have a lesser requirement for navigation than the water maze. However, we used a behavioral design in the radial maze that retained other aspects of the standard water maze task, such as the use of multiple start locations and retention testing in a single trial. Using the immediate early gene c-fos as a marker for neuronal activation, we found that all hippocampal subregions were more activated during the recall of remote compared to recent spatial memories. In areas CA3 and CA1, activation during remote memory testing was higher than in rats that were merely reexposed to the testing environment after the same time interval. Conversely, Fos levels in the dentate gyrus were increased after retention testing to the extent that was also observed in the corresponding exposure control group. This pattern of hippocampal activation was also obtained in a second version of the task that only used a single start arm instead of multiple start arms. The CA3 and CA1 activation during remote memory recall is consistent with the interpretation that an older memory might require increased pattern completion and/or relearning after longer time intervals. Irrespective of whether the hippocampus is required for remote memory recall, the hippocampus might engage in computations that either support recall of remote memories or that update remote memories. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. A model of Fe speciation and biogeochemistry at the Tropical Eastern North Atlantic Time-Series Observatory site

    NASA Astrophysics Data System (ADS)

    Ye, Y.; Völker, C.; Wolf-Gladrow, D. A.

    2009-10-01

    A one-dimensional model of Fe speciation and biogeochemistry, coupled with the General Ocean Turbulence Model (GOTM) and an NPZD-type ecosystem model, is applied to the Tropical Eastern North Atlantic Time-Series Observatory (TENATSO) site. Among diverse processes affecting Fe speciation, this study focuses on investigating the role of dust particles in removing dissolved iron (DFe) by a more complex description of particle aggregation and sinking, and explaining the abundance of organic Fe-binding ligands by modelling their origin and fate. The vertical distribution of different particle classes in the model shows high sensitivity to changing aggregation rates. Using the aggregation rates from the sensitivity study in this work, modelled particle fluxes are close to observations, with dust particles dominating near the surface and aggregates deeper in the water column. POC export at 1000 m is a little higher than regional sediment trap measurements, suggesting further improvement of modelling particle aggregation, sinking or remineralisation. Modelled strong ligands have a high abundance near the surface and decline rapidly below the deep chlorophyll maximum, showing qualitative similarity to observations. Without production of strong ligands, phytoplankton concentration falls to 0 within the first 2 years in the model integration, caused by strong Fe-limitation. A nudging of total weak ligands towards a constant value is required for reproducing the observed nutrient-like profiles, assuming a decay time of 7 years for weak ligands. This indicates that weak ligands have a longer decay time and therefore cannot be modelled adequately in a one-dimensional model. The modelled DFe profile is strongly influenced by particle concentration and vertical distribution, because the most important removal of DFe in deeper waters is colloid formation and aggregation. Redissolution of particulate iron is required to reproduce an observed DFe profile at the TENATSO site. Assuming colloidal iron is mainly composed of inorganic colloids, the modelled colloidal to soluble iron ratio is lower than observations, indicating the importance of organic colloids.

  5. Quantile Regression Models for Current Status Data

    PubMed Central

    Ou, Fang-Shu; Zeng, Donglin; Cai, Jianwen

    2016-01-01

    Current status data arise frequently in demography, epidemiology, and econometrics where the exact failure time cannot be determined but is only known to have occurred before or after a known observation time. We propose a quantile regression model to analyze current status data, because it does not require distributional assumptions and the coefficients can be interpreted as direct regression effects on the distribution of failure time in the original time scale. Our model assumes that the conditional quantile of failure time is a linear function of covariates. We assume conditional independence between the failure time and observation time. An M-estimator is developed for parameter estimation which is computed using the concave-convex procedure and its confidence intervals are constructed using a subsampling method. Asymptotic properties for the estimator are derived and proven using modern empirical process theory. The small sample performance of the proposed method is demonstrated via simulation studies. Finally, we apply the proposed method to analyze data from the Mayo Clinic Study of Aging. PMID:27994307

  6. Development of a time-resolved fluorometric method for observing hybridization in living cells using fluorescence resonance energy transfer.

    PubMed Central

    Tsuji, A; Sato, Y; Hirano, M; Suga, T; Koshimoto, H; Taguchi, T; Ohsuka, S

    2001-01-01

    We previously showed that a specific kind of mRNA (c-fos) was detected in a living cell under a microscope by introducing two fluorescently labeled oligodeoxynucleotides, each labeled with a donor or an acceptor, into the cytoplasm, making them hybridize to adjacent locations on c-fos mRNA, and taking images of fluorescence resonance energy transfer (FRET) (A. Tsuji, H. Koshimoto, Y. Sato, M. Hirano, Y. Sei-Iida, S. Kondo, and K. Ishibashi, 2000, Biophys. J. 78:3260-3274). In the formed hybrid, the donor and acceptor come into close proximity and FRET occurs. To observe small numbers of mRNA molecules in living cells using this method, the FRET fluorescence of the hybrid must be distinguished from the fluorescence of excess amounts of non-hybridizing probes and from cell autofluorescence. To meet these requirements, we developed a time-resolved method using acceptor fluorescence decays. When a combination of a donor having a longer fluorescence lifetime and an acceptor having a shorter lifetime is used, the measured fluorescence decays of acceptors under FRET become slower than the acceptor fluorescence decay with direct excitation. A combination of Bodipy493/503 and Cy5 was selected as donor and acceptor. When the formed hybrid had a configuration in which the target RNA has no single-strand part between the two fluorophores, the acceptor fluorescence of the hybrid was delayed sufficiently to detect the hybrid fluorescence in the presence of excess amounts of non-hybridizing probes. A spatial separation of 10-12 bases between the two fluorophores on the hybrid is also required. The decay is also much slower than cell autofluorescence, and smaller numbers of hybrids were detected with less interference from cell autofluorescence in the cytoplasm of living cells under a time-resolved fluorescence microscope equipped with a time-gated camera. The present method will be useful for observing induced expression of mRNAs in living cells. PMID:11423432

  7. All-season flash flood forecasting system for real-time operations

    USDA-ARS?s Scientific Manuscript database

    Flash floods can cause extensive damage to both life and property, especially because they are difficult to predict. Flash flood prediction requires high-resolution meteorologic observations and predictions, as well as calibrated hydrologic models in addition to extensive data handling. We have de...

  8. Data acquisition system for operational earth observation missions

    NASA Technical Reports Server (NTRS)

    Deerwester, J. M.; Alexander, D.; Arno, R. D.; Edsinger, L. E.; Norman, S. M.; Sinclair, K. F.; Tindle, E. L.; Wood, R. D.

    1972-01-01

    The data acquisition system capabilities expected to be available in the 1980 time period as part of operational Earth observation missions are identified. By data acquisition system is meant the sensor platform (spacecraft or aircraft), the sensors themselves and the communication system. Future capabilities and support requirements are projected for the following sensors: film camera, return beam vidicon, multispectral scanner, infrared scanner, infrared radiometer, microwave scanner, microwave radiometer, coherent side-looking radar, and scatterometer.

  9. On spurious detection of linear response and misuse of the fluctuation-dissipation theorem in finite time series

    NASA Astrophysics Data System (ADS)

    Gottwald, Georg A.; Wormell, J. P.; Wouters, Jeroen

    2016-09-01

    Using a sensitive statistical test we determine whether or not one can detect the breakdown of linear response given observations of deterministic dynamical systems. A goodness-of-fit statistic is developed for a linear statistical model of the observations, based on results for central limit theorems for deterministic dynamical systems, and used to detect linear response breakdown. We apply the method to discrete maps which do not obey linear response and show that the successful detection of breakdown depends on the length of the time series, the magnitude of the perturbation and on the choice of the observable. We find that in order to reliably reject the assumption of linear response for typical observables sufficiently large data sets are needed. Even for simple systems such as the logistic map, one needs of the order of 10^6 observations to reliably detect the breakdown with a confidence level of 95%; if fewer observations are available one may be falsely led to conclude that linear response theory is valid. The smaller the applied perturbation, the larger the amount of data required. For judiciously chosen observables the necessary amount of data can be drastically reduced, but requires detailed a priori knowledge about the invariant measure which is typically not available for complex dynamical systems. Furthermore we explore the use of the fluctuation-dissipation theorem (FDT) in cases with limited data length or coarse-graining of observations. The FDT, if applied naively to a system without linear response, is shown to be very sensitive to the details of the sampling method, resulting in erroneous predictions of the response.

  10. Operational fitness of box truss antennas in response to dynamic slewing

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Schartel, W. A.; Karanian, L. A.

    1985-01-01

    A parametric study was performed to define slewing capability of large satellites along with associated system changes or subsystem weight and complexity impacts. The satellite configuration and structural arrangement from the Earth Observation Spacecraft (EOS) study was used as the baseline spacecraft. Varying slew rates, settling times, damping, maneuver frequencies, and attitude hold times provided the data required to establish applicability to a wide range of potential missions. The key elements of the study are: (1) determine the dynamic transient response of the antenna system; (2) calculate the system errors produced by the dynamic response; (3) determine if the antenna has exceeded operational requirements at completion of the slew, and if so; (4) determine when the antenna has settled to the operational requirements. The slew event is not considered complete until the antenna is within operational limits.

  11. Collection and processing of data from a phase-coherent meteor radar

    NASA Technical Reports Server (NTRS)

    Backof, C. A., Jr.; Bowhill, S. A.

    1974-01-01

    An analysis of the measurement accuracy requirement of a high resolution meteor radar for observing short period, atmospheric waves is presented, and a system which satisfies the requirements is described. A medium scale, real time computer is programmed to perform all echo recognition and coordinate measurement functions. The measurement algorithms are exercised on noisy data generated by a program which simulates the hardware system, in order to find the effects of noise on the measurement accuracies.

  12. Optical and Radio Frequency Refractivity Fluctuations from High Resolution Point Sensors: Sea Breezes and Other Observations

    DTIC Science & Technology

    2007-03-01

    velocity and direction along with vertical velocities are derived from the measured time of flight for the ultrasonic signals (manufacturer's...data set. To prevent aliasing, a wave must be sampled at least twice per period, so the Nyquist frequency is f_N = f_s/2. 3. Sampling Requirements...an order of magnitude or more. To refine models or conduct climatological studies of Cn² requires direct measurements to identify the underlying

  13. Using All-Sky Imaging to Improve Telescope Scheduling (Abstract)

    NASA Astrophysics Data System (ADS)

    Cole, G. M.

    2017-12-01

    (Abstract only) Automated scheduling makes it possible for a small telescope to observe a large number of targets in a single night. But when used in areas which have less-than-perfect sky conditions such automation can lead to large numbers of observations of clouds and haze. This paper describes the development of a "sky-aware" telescope automation system that integrates the data flow from an SBIG AllSky340c camera with an enhanced dispatch scheduler to make optimum use of the available observing conditions for two highly instrumented backyard telescopes. Using the minute-by-minute time series image stream and a self-maintained reference database, the software maintains a file of sky brightness, transparency, stability, and forecasted visibility at several hundred grid positions. The scheduling software uses this information in real time to exclude targets obscured by clouds and select the best observing task, taking into account the requirements and limits of each instrument.
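
    The scheduling logic described above amounts to scoring each candidate target with both its scientific priority and the current sky conditions at its all-sky grid cell, and skipping targets that are obscured. The sketch below illustrates one such dispatch rule; the field names, weights, horizon limit, and transparency cut-off are hypothetical and are not the system's actual configuration.

```python
# Minimal sketch of a "sky-aware" dispatch scheduler: each candidate target is
# scored from its priority and from the transparency/stability recorded for the
# all-sky grid cell it currently occupies; cloud-obscured targets are excluded.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Target:
    name: str
    priority: float             # larger = more scientifically urgent
    altitude_deg: float         # current altitude above the horizon
    grid_cell: Tuple[int, int]  # index into the all-sky condition grid

def pick_next_target(targets, sky_grid: Dict[Tuple[int, int], dict],
                     min_transparency: float = 0.5) -> Optional[Target]:
    best, best_score = None, float("-inf")
    for t in targets:
        cond = sky_grid.get(t.grid_cell, {"transparency": 0.0, "stability": 0.0})
        if t.altitude_deg < 30.0 or cond["transparency"] < min_transparency:
            continue                      # below horizon limit or clouded out: skip
        score = (2.0 * t.priority
                 + 1.0 * cond["transparency"]
                 + 0.5 * cond["stability"]
                 + 0.01 * t.altitude_deg)  # mild preference for high targets
        if score > best_score:
            best, best_score = t, score
    return best

sky_grid = {(3, 7): {"transparency": 0.9, "stability": 0.8},
            (5, 2): {"transparency": 0.2, "stability": 0.4}}   # cloudy cell
targets = [Target("target_a", 3.0, 55.0, (3, 7)),
           Target("target_b", 5.0, 60.0, (5, 2))]              # obscured
print(pick_next_target(targets, sky_grid).name)                # -> target_a
```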

  14. Hobby-Eberly Telescope: commissioning experience and observing plans

    NASA Astrophysics Data System (ADS)

    Glaspey, John W.; Adams, M. T.; Booth, John A.; Cornell, Mark E.; Fowler, James R.; Krabbendam, Victor L.; Ramsey, Lawrence W.; Ray, Frank B.; Ricklefs, Randall L.; Spiesman, W. J.

    1998-07-01

    Experience in bringing into operation the 91-segment primary mirror alignment and control system, the focal plane tracker system, and other critical subsystems of the HET will be described. Particular attention is given to the tracker, which utilizes three linear and three rotational degrees of freedom to follow sidereal targets. Coarse time-dependent functions for each axis are downloaded to autonomous PMAC controllers that provide the precise motion drives to the two linear stages and the hexapod system. Experience gained in aligning the separate mirrors and then maintaining image quality in a variable thermal environment will also be described. Because of the fixed elevation of the primary optical axis, only a limited amount of time is available for observing objects in the 12-degree-wide observing band. With a small core HET team working with McDonald Observatory staff, efficient, reliable, uncomplicated methodologies are required in all aspects of the observing operations.

  15. Observational data needs for plasma phenomena

    NASA Technical Reports Server (NTRS)

    Niedner, M. B., Jr.

    1981-01-01

    Bright comets display a rich variety of interesting plasma phenomena which occur over an enormous range of spatial scales, and which require different observational techniques to be studied effectively. Wide-angle photography of high time resolution is probably the best method of studying the phenomenon of largest known scale: the plasma tail disconnection event (DE), which has been attributed to magnetic reconnection at interplanetary sector boundary crossings. These structures usually accelerate as they recede from the head region, and observed velocities are typically of the order of 50 km/s. They are often visible for several days following the time of disconnection, and are sometimes seen out past 0.2 AU from the cometary head. The following areas pertaining to plasma phenomena in the ionosphere are addressed: the existence, size, and heliocentric distance variations of the contact surface, and the observational signatures of magnetic reconnection at sector boundary crossings.

  16. Understanding climate: A strategy for climate modeling and predictability research, 1985-1995

    NASA Technical Reports Server (NTRS)

    Thiele, O. (Editor); Schiffer, R. A. (Editor)

    1985-01-01

    The emphasis of the NASA strategy for climate modeling and predictability research is on the utilization of space technology to understand the processes which control the Earth's climate system and its sensitivity to natural and man-induced changes and to assess the possibilities for climate prediction on time scales from about two weeks to several decades. Because the climate is a complex multi-phenomena system, which interacts on a wide range of space and time scales, the diversity of scientific problems addressed requires a hierarchy of models along with the application of modern empirical and statistical techniques which exploit the extensive current and potential future global data sets afforded by space observations. Observing system simulation experiments, exploiting these models and data, will also provide the foundation for the future climate space observing system, e.g., the Earth Observing System (EOS, 1985) and the Tropical Rainfall Measuring Mission (TRMM; North et al., NASA, 1984).

  17. A new approach to data management and its impact on frequency control requirements

    NASA Technical Reports Server (NTRS)

    Blanchard, D. L.; Fuchs, A. J.; Chi, A. R.

    1979-01-01

    A new approach to data management consisting of spacecraft and data/information autonomy and its impact on frequency control requirements is presented. An autonomous spacecraft is capable of functioning without external intervention for up to 72 hr by enabling the sensors to make observations, maintaining its health and safety, and by using logical safety modes when anomalies occur. Data/information are made autonomous by associating all relevant ancillary data such as time, position, attitude, and sensor identification with the data/information record of an event onboard the spacecraft. This record is so constructed that the record of the event can be physically identified in a complete and self-contained record that is independent of all other data. All data within a packet will be time tagged to the needed accuracy, and the time markings from packet to packet will be coherent to a UTC time scale.

  18. A measurement of time-averaged aerosol optical depth using air-showers observed in stereo by HiRes

    NASA Astrophysics Data System (ADS)

    High Resolution Fly's Eye Collaboration; Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Belov, K.; Belz, J. W.; Benzvi, S.; Bergman, D. R.; Boyer, J. H.; Cannon, C. T.; Cao, Z.; Connolly, B. M.; Fedorova, Y.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Manago, N.; Mannel, E. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Reil, K.; Roberts, M. D.; Schnetzer, S. R.; Seman, M.; Sinnis, G.; Smith, J. D.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.

    2006-03-01

    Air fluorescence measurements of cosmic ray energy must be corrected for attenuation of the atmosphere. In this paper, we show that the air-showers themselves can yield a measurement of the aerosol attenuation in terms of optical depth, time-averaged over extended periods. Although the technique lacks statistical power to make the critical hourly measurements that only specialized active instruments can achieve, we note the technique does not depend on absolute calibration of the detector hardware, and requires no additional equipment beyond the fluorescence detectors that observe the air showers. This paper describes the technique, and presents results based on analysis of 1258 air-showers observed in stereo by the High Resolution Fly’s Eye over a four year span.

  19. A Star Image Extractor for Small Satellites

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Yamauchi, Masahiro; Gouda, Naoteru; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Yano, Taihei; Suganuma, Masahiro; Nakasuka, Shinichi; Sako, Nobutada; Inamori, Takaya

    We have developed a Star Image Extractor (SIE) which works as an on-board real-time image processor. It is a logic circuit written on an FPGA (Field Programmable Gate Array) device. It detects and extracts only object data from the raw image data. SIE will be required for the Nano-JASMINE 1) satellite. Nano-JASMINE is a small astrometry satellite that observes objects in our galaxy. It will be launched in 2010 and has a two-year mission period. Nano-JASMINE observes objects in the TDI (Time Delayed Integration) observation mode, one of the operating modes of a CCD detector. Data are obtained by rotating the imaging system, including the CCD, at a rate synchronized with the vertical charge transfer of the CCD. The obtained image data are sent through the SIE to the mission controller.

  20. Prospects for detecting oxygen, water, and chlorophyll on an exo-Earth

    PubMed Central

    Brandt, Timothy D.; Spiegel, David S.

    2014-01-01

    The goal of finding and characterizing nearby Earth-like planets is driving many NASA high-contrast flagship mission concepts, the latest of which is known as the Advanced Technology Large-Aperture Space Telescope (ATLAST). In this article, we calculate the optimal spectral resolution R = λ/δλ and minimum signal-to-noise ratio per spectral bin (SNR), two central design requirements for a high-contrast space mission, to detect signatures of water, oxygen, and chlorophyll on an Earth twin. We first develop a minimally parametric model and demonstrate its ability to fit synthetic and observed Earth spectra; this allows us to measure the statistical evidence for each component’s presence. We find that water is the easiest to detect, requiring a resolution R ≳ 20, while the optimal resolution for oxygen is likely to be closer to R = 150, somewhat higher than the canonical value in the literature. At these resolutions, detecting oxygen will require approximately two times the SNR as water. Chlorophyll requires approximately six times the SNR as oxygen for an Earth twin, only falling to oxygen-like levels of detectability for a low cloud cover and/or a large vegetation covering fraction. This suggests designing a mission for sensitivity to oxygen and adopting a multitiered observing strategy, first targeting water, then oxygen on the more favorable planets, and finally chlorophyll on only the most promising worlds. PMID:25197095

  1. Prospects for detecting oxygen, water, and chlorophyll on an exo-Earth.

    PubMed

    Brandt, Timothy D; Spiegel, David S

    2014-09-16

    The goal of finding and characterizing nearby Earth-like planets is driving many NASA high-contrast flagship mission concepts, the latest of which is known as the Advanced Technology Large-Aperture Space Telescope (ATLAST). In this article, we calculate the optimal spectral resolution R = λ/δλ and minimum signal-to-noise ratio per spectral bin (SNR), two central design requirements for a high-contrast space mission, to detect signatures of water, oxygen, and chlorophyll on an Earth twin. We first develop a minimally parametric model and demonstrate its ability to fit synthetic and observed Earth spectra; this allows us to measure the statistical evidence for each component's presence. We find that water is the easiest to detect, requiring a resolution R ≳ 20, while the optimal resolution for oxygen is likely to be closer to R = 150, somewhat higher than the canonical value in the literature. At these resolutions, detecting oxygen will require approximately two times the SNR as water. Chlorophyll requires approximately six times the SNR as oxygen for an Earth twin, only falling to oxygen-like levels of detectability for a low cloud cover and/or a large vegetation covering fraction. This suggests designing a mission for sensitivity to oxygen and adopting a multitiered observing strategy, first targeting water, then oxygen on the more favorable planets, and finally chlorophyll on only the most promising worlds.

  2. Efficient Saccade Planning Requires Time and Clear Choices

    PubMed Central

    Ghahghaei, Saiedeh; Verghese, Preeti

    2015-01-01

    We use eye movements constantly to gather information. Saccades are efficient when they maximize the information required for the task, however there is controversy regarding the efficiency of eye movement planning. For example, saccades are efficient when searching for a single target (Nature, 434 (2005) 387–91), but are inefficient when searching for an unknown number of targets in noise, particularly under time pressure (Vision Research 74 (2012), 61–71). In this study, we used a multiple-target search paradigm and explored whether altering the noise level or increasing saccadic latency improved efficiency. Experiments used stimuli with two levels of discriminability such that saccades to the less discriminable stimuli provided more information. When these two noise levels corresponded to low and moderate visibility, most observers did not preferentially select informative locations, but looked at uncertain and probable target locations equally often. We then examined whether eye movements could be made more efficient by increasing the discriminability of the two stimulus levels and by delaying the first saccade so that there was more time for decision processes to influence the saccade choices. Some observers did indeed increase the proportion of their saccades to informative locations under these conditions. Others, however, made as many saccades as they could during the limited time and were unselective about the saccade goal. A clear trend that emerges across all experiments is that conditions with a greater proportion of efficient saccades are associated with a longer latency to initiate saccades, suggesting that the choice of informative locations requires deliberate planning. PMID:26037735

  3. CLARREO Cornerstone of the Earth Observing System: Measuring Decadal Change Through Accurate Emitted Infrared and Reflected Solar Spectra and Radio Occultation

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    2010-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in the large time/space scale averages that are key to understanding decadal changes.

  4. Requirements and concept design for large earth survey telescope for SEOS

    NASA Technical Reports Server (NTRS)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4 meter aperture Cassegrain telescope with 0.6 deg field of view is shown to do an excellent job in satisfying the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  5. On the energy budget in the current disruption region. [of geomagnetic tail

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Birn, Joachim

    1993-01-01

    This study investigates the energy budget in the current disruption region of the magnetotail, coincident with a pre-onset thin current sheet, around substorm onset time using published observational data and theoretical estimates. We find that the current disruption/dipolarization process typically requires energy inflow into the primary disruption region. The disruption/dipolarization process is therefore endoenergetic, i.e., it requires energy input to operate. We therefore argue that some other simultaneously operating process, possibly a large-scale magnetotail instability, is required to provide the necessary energy input into the current disruption region.

  6. Reduced order modelling in searches for continuous gravitational waves - I. Barycentring time delays

    NASA Astrophysics Data System (ADS)

    Pitkin, M.; Doolan, S.; McMenamin, L.; Wette, K.

    2018-06-01

    The frequencies and phases of emission from extra-solar sources measured by Earth-bound observers are modulated by the motions of the observer with respect to the source, and through relativistic effects. These modulations depend critically on the source's sky-location. Precise knowledge of the modulations is required to coherently track the source's phase over long observations, for example, in pulsar timing, or searches for continuous gravitational waves. The modulations can be modelled as sky-location- and time-dependent time delays that convert arrival times at the observer to the inertial frame of the source, which can often be the Solar system barycentre. We study the use of reduced order modelling for speeding up the calculation of this time delay for any sky-location. We find that the time delay model can be decomposed into just four basis vectors, and with these the delay for any sky-location can be reconstructed to sub-nanosecond accuracy. When compared to standard routines for time delay calculation in gravitational wave searches, using the reduced basis can lead to speed-ups of 30 times. We have also studied components of time delays for sources in binary systems. Assuming eccentricities <0.25, we can reconstruct the delays to within hundreds of nanoseconds, with best-case speed-ups of a factor of 10, or factors of two when interpolating the basis for different orbital periods or time stamps. In long-duration phase-coherent searches for sources with sky-position uncertainties, or binary parameter uncertainties, these speed-ups could allow enhancements in their scopes without large additional computational burdens.
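
    A minimal sketch of the reduced-basis idea described above is given below. It uses a toy circular-orbit Roemer delay rather than a full Solar-system ephemeris, and the number of retained basis vectors, the constants and the sampling are illustrative assumptions only, not values from the paper.

      # Sketch only: SVD-based reduced basis for sky-location-dependent time delays,
      # built from a toy circular-orbit Roemer-delay model (not the paper's pipeline).
      import numpy as np

      AU_LIGHT_S = 499.005                 # light-travel time for 1 au, seconds
      YEAR_S = 365.25 * 86400.0

      def roemer_delay(t, ra, dec):
          """Toy Roemer delay (s) for an observer on a circular 1 au orbit."""
          phase = 2.0 * np.pi * t / YEAR_S
          observer = AU_LIGHT_S * np.array([np.cos(phase), np.sin(phase), 0.0 * phase])
          source = np.array([np.cos(dec) * np.cos(ra),
                             np.cos(dec) * np.sin(ra),
                             np.sin(dec)])
          return source @ observer         # projection onto the source direction

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, YEAR_S, 2000)
      ras = rng.uniform(0.0, 2.0 * np.pi, 200)
      decs = np.arcsin(rng.uniform(-1.0, 1.0, 200))
      training = np.array([roemer_delay(t, ra, dec) for ra, dec in zip(ras, decs)])

      # Keep the leading right-singular vectors as the reduced basis
      _, _, vt = np.linalg.svd(training, full_matrices=False)
      basis = vt[:3]                       # three vectors suffice for this toy model

      # Reconstruct the delay for a new sky location by projecting onto the basis
      delay_new = roemer_delay(t, 1.3, -0.4)
      coeffs = basis @ delay_new
      print("max reconstruction error [s]:", np.abs(delay_new - coeffs @ basis).max())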

  7. A Time-domain Analysis of Nitrogen-rich Quasars.

    NASA Astrophysics Data System (ADS)

    Dittmann, Alexander; Liu, Xin; Shen, Yue; Jiang, Linhua

    2018-01-01

    A small population of quasars exhibit anomalously high nitrogen-to-carbon ratios (N/C) in their emission lines. These “nitrogen-rich” (N-rich) quasars have been difficult to explain. Few of the possible mechanisms are natural, since stellar populations with abnormally high metallicities are required to produce an N-rich interstellar medium. N-rich quasars are also more likely to be “radio-loud” than average quasars, which is difficult to explain by invoking higher metallicity alone. Recently, tidal disruption events (TDEs) have been proposed as a mechanism for N-rich quasars. Such a TDE would occur between a supersolar mass star and a supermassive black hole. The CNO cycle creates a surplus of N-rich and carbon-deficient material that could naturally explain the N/C observed in N-rich quasars. The TDE hypothesis explains N-rich quasars without requiring extremely exotic stellar populations. A testable difference between the TDE explanation and the exotic stellar population scenarios is that TDEs do not produce enough N-rich material to pollute the quasar environment for extended periods of time, in which case N-rich phenomena in quasars would be transient. By analyzing changes in nitrogen and carbon line widths in time-separated spectra of N-rich quasars, we have studied nitrogen abundance in quasars which had previously been identified as nitrogen rich. We have found that over time-frames of greater than one year in the quasar rest frame, nitrogen abundance tends to systematically decrease. The observed decrease is larger than our estimate of the effects of noise based on spectra separated by smaller time frames. Additionally, X-ray observations of one N-rich quasar have demonstrated that its X-ray emission is an outlier among the quasar population, but similar to confirmed TDEs.

  8. User-friendly tools on handheld devices for observer performance study

    NASA Astrophysics Data System (ADS)

    Matsumoto, Takuya; Hara, Takeshi; Shiraishi, Junji; Fukuoka, Daisuke; Abe, Hiroyuki; Matsusako, Masaki; Yamada, Akira; Zhou, Xiangrong; Fujita, Hiroshi

    2012-02-01

    ROC studies require complex procedures to select cases from many data samples and to set confidence levels for each selected case to generate ROC curves. In some observer performance studies, researchers have to develop software with a specific graphical user interface (GUI) to obtain confidence levels from readers. Because ROC studies can be designed for various clinical situations, preparing software tailored to every ROC study is a difficult task. In this work, we have developed software for recording confidence levels during observer studies on personal handheld devices such as the iPhone, iPod touch, and iPad. To confirm the functions of our software, three radiologists performed observer studies to detect lung nodules using the public database of chest radiographs published by the Japan Society of Radiological Technology. The output in text format conforms to the format of the widely used ROC kit from the University of Chicago. The time required for reading each case was recorded precisely.

  9. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid response observations from high resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate, and image with high resolution instruments phenomena such as wildfires, volcanoes, floods, and ice breakup. The software that plans, schedules, and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based artificial intelligence components.

  10. The Effect of Lactobacillus casei 32G on the Mouse Cecum Microbiota and Innate Immune Response Is Dose and Time Dependent

    PubMed Central

    Aktas, Busra; De Wolfe, Travis J.; Tandee, Kanokwan; Safdar, Nasia; Darien, Benjamin J.; Steele, James L.

    2015-01-01

    Lactobacilli have been associated with a variety of immunomodulatory effects and some of these effects have been related to changes in gastrointestinal microbiota. However, the relationship between probiotic dose, time since probiotic consumption, changes in the microbiota, and the immune system requires further investigation. The objective of this study was to determine if the effects of Lactobacillus casei 32G on the murine gastrointestinal microbiota and immune function are dose and time dependent. Mice were fed L. casei 32G at doses of 10^6, 10^7, or 10^8 CFU/day/mouse for seven days and were sacrificed 0.5 h, 3.5 h, 12 h, or 24 h after the last administration. The ileum tissue and the cecal content were collected for immune profiling by qPCR and microbiota analysis, respectively. The time required for L. casei 32G to reach the cecum was monitored by qPCR, and the 32G bolus reaches the cecum 3.5 h after the last administration. L. casei 32G altered the cecal microbiota, with the predominance of Lachnospiraceae IS and Oscillospira decreasing significantly (p < 0.05) in the mice receiving 10^8 CFU/mouse 32G relative to the control mice, while a significant (p < 0.05) increase was observed in the prevalence of lactobacilli. The lactobacilli that increased were determined to be commensal lactobacilli. Interestingly, no significant difference in the overall microbiota composition, regardless of 32G dose, was observed at the 12 h time point. A likely explanation for this observation is the level of feed-derived nutrients resulting from the 12 h light/dark cycle. 32G results in consistent increases in Clec2h expression and reductions in TLR-2, alpha-defensins, and lysozyme. Changes in expression of these components of the innate immune system are one possible explanation for the observed changes in the cecal microbiota. Additionally, 32G administration was observed to alter the expression of cytokines (IL-10rb and TNF-α) in a manner consistent with an anti-inflammatory response. PMID:26714177

  11. Commercial applications of satellite oceanography

    NASA Technical Reports Server (NTRS)

    Montgomery, D. R.

    1981-01-01

    It is shown that in the next decade the oceans' commercial users will require an operational oceanographic satellite system or systems capable of maximizing real-time coverage over all ocean areas. Seasat studies suggest that three spacecraft are required to achieve this. Here, the sensor suite would measure surface winds, wave heights (and spectral energy distribution), ice characteristics, sea-surface temperature, ocean colorimetry, height of the geoid, salinity, and subsurface thermal structure. The importance of oceanographic data being distributed to commercial users within two hours of observation time is stressed. Also emphasized is the importance of creating a responsive oceanographic satellite data archive. An estimate of the potential dollar benefits of such an operational oceanographic satellite system is given.

  12. Spatial and temporal evolutions of ozone in a nanosecond pulse corona discharge at atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Duten, X.; Redolfi, M.; Aggadi, N.; Vega, A.; Hassouni, K.

    2011-10-01

    This paper deals with the experimental determination of the spatial and temporal evolutions of the ozone concentration in an atmospheric pressure pulsed plasma working in the nanosecond regime. We observed that ozone was produced in the localized region of the streamer. The ozone transport requires a characteristic time well above one millisecond. The numerical modelling of the streamer expansion confirms that the hydrodynamic expansion of the filamentary discharge region during the streamer propagation does not lead to a significant transport of atomic oxygen and ozone. It appears therefore that only diffusional transport can take place, which requires a characteristic time of the order of 50 ms.
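
    The order of magnitude quoted above can be checked with the usual diffusion-time estimate t ~ L^2/D. The sketch below uses an assumed filament radius and a typical literature value for the ozone diffusion coefficient in air; neither number is taken from the paper.

      # Back-of-envelope estimate only: characteristic diffusion time t ~ L**2 / D
      D_O3 = 0.15   # cm^2/s, typical ozone diffusion coefficient in air at 1 atm (assumed)
      L = 0.1       # cm, assumed streamer-filament radius scale

      tau = L**2 / D_O3
      print(f"characteristic diffusion time ~ {tau * 1e3:.0f} ms")  # a few tens of ms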

  13. Behavioral Risk Assessment of the Guarded Suicidal Patient

    ERIC Educational Resources Information Center

    Simon, Robert I.

    2008-01-01

    Psychiatrists and other mental health professionals are trained to assess patients by direct observation and examination. Short inpatient length of stay, brief outpatient visits, emergency room evaluations, and other time-limited clinical settings require rapid assessment of suicide risk. Recognition of behavioral suicide risk factors can assist…

  14. An Investigation on the Contribution of GLONASS to the Precise Point Positioning for Short Time Observations

    NASA Astrophysics Data System (ADS)

    Ulug, R.; Ozludemir, M. T.

    2016-12-01

    After 2011, through the modernization of GLONASS, the number of satellites increased rapidly. This progress has made GLONASS the only fully operational alternative to GPS for point positioning. So far, many studies have been conducted to investigate the contribution of GLONASS to point positioning using different methods such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP). The latter, PPP, is a method that performs precise position determination using a single GNSS receiver. The PPP method has become very attractive since the early 2000s and provides great advantages for engineering and scientific applications. However, the PPP method needs at least 2 hours of observation time, and the required observation length may be longer depending on several factors, such as the number of satellites, the satellite configuration, etc. The more satellites, the shorter the observation time; nevertheless, the impact of the number of satellites included must be well understood. In this study, to determine the contribution of GLONASS to PPP, GLONASS satellite observations were added one by one, from 1 to 5 satellites, to 2, 4 and 6 hours of observations. For this purpose, the data collected at the IGS site ISTA were used. Data processing was done for Day of Year (DOY) 197 in 2016. The 24-hour GPS observations were processed by the Bernese 5.2 PPP module and the output was selected as the reference, while the 2-, 4- and 6-hour GPS and GPS/GLONASS observations were processed by the magicGNSS PPP module. The results clearly showed that GPS/GLONASS observations improved positional accuracy, precision, dilution of precision and convergence to the reference coordinates. In this context, the coordinate differences between the 24-hour GPS solution and the 6-hour GPS/GLONASS solution are less than 2 cm.

  15. Observing System Simulations for the NASA ASCENDS Lidar CO2 Mission Concept: Substantiating Science Measurement Requirements

    NASA Technical Reports Server (NTRS)

    Kawa, Stephan R.; Baker, David Frank; Schuh, Andrew E.; Abshire, James Brice; Browell, Edward V.; Michalak, Anna M.

    2012-01-01

    The NASA ASCENDS mission (Active Sensing of Carbon Emissions, Nights, Days, and Seasons) is envisioned as the next generation of dedicated, space-based CO2 observing systems, currently planned for launch in about the year 2022. Recommended by the US National Academy of Sciences Decadal Survey, active (lidar) sensing of CO2 from space has several potentially significant advantages, in comparison to current and planned passive CO2 instruments, that promise to advance CO2 measurement capability and carbon cycle understanding into the next decade. Assessment and testing of possible lidar instrument technologies indicates that such sensors are more than feasible; however, the measurement precision and accuracy requirements remain at unprecedented levels of stringency. It is, therefore, important to quantitatively and consistently evaluate the measurement capabilities and requirements for the prospective active system in the context of advancing our knowledge of carbon flux distributions and their dependence on underlying physical processes. This amounts to establishing minimum requirements for precision, relative accuracy, spatial/temporal coverage and resolution, vertical information content, interferences, and possibly the tradeoffs among these parameters, while at the same time framing a mission that can be implemented within a constrained budget. Here, we present results of observing system simulation studies, commissioned by the ASCENDS Science Requirements Definition Team, for a range of possible mission implementation options that are intended to substantiate science measurement requirements for a laser-based CO2 space instrument.

  16. A soft X-ray flare in the Seyfert I galaxy Markarian 335

    NASA Technical Reports Server (NTRS)

    Lee, M. G.; Balick, Bruce; Halpern, J. P.; Heckman, T. M.

    1988-01-01

    Strong, erratic, and primarily soft X-ray flux variations observed in Mrk 335 with the Einstein high-resolution imager (HRI) and monitor proportional counter (MPC) are reported. The variability time scales range from about 6000 s up to the length of the observation, 60,000 s. The variability consisted of a decrease followed by an increase at X-ray energies below 2-3 keV. The variability is most pronounced at the softest energies. The X-ray spectrum was harder before the flare than afterward, even after the flare had ended. Averaged over the time of the observations, the MPC data are well fitted by a power-law spectrum with a spectral index of 1.25 ± 0.19, with no evidence of absorption by foreground neutral hydrogen at energies above 1.2 keV. If the observed value of the Galactic H I column density is assumed, then the HRI observations require the existence of an additional soft and variable X-ray component.

  17. Computational time reduction for sequential batch solutions in GNSS precise point positioning technique

    NASA Astrophysics Data System (ADS)

    Martín Furones, Angel; Anquela Julián, Ana Belén; Dimas-Pages, Alejandro; Cos-Gayón, Fernando

    2017-08-01

    Precise point positioning (PPP) is a well-established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: either in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if they are both well implemented. However, if a sequential solution is needed (that is, not only the final coordinates but also those at previous GNSS epochs), as in convergence studies, obtaining it by batch adjustment becomes a very time-consuming task owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch for the solution of the current epoch. Filter implementations, however, need extra consideration of user dynamics and parameter state variations between observation epochs, with appropriate stochastic updates of the parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process led to a mean reduction in computational time of 45%.
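
    The equivalence that makes a sequential solution attractive can be illustrated with a toy linear model: accumulating the normal equations epoch by epoch reproduces the batch solution at every epoch without re-solving the full stacked system. This is a minimal sketch with synthetic data, not the authors' PPP adjustment, which also carries clock, troposphere and ambiguity parameters.

      # Sketch only: epoch-wise accumulation of normal equations vs. batch least squares
      import numpy as np

      rng = np.random.default_rng(1)
      n_par, n_epochs, n_obs = 4, 50, 6
      x_true = rng.normal(size=n_par)

      A_all, y_all = [], []
      N = np.zeros((n_par, n_par))     # accumulated normal matrix  A^T A
      b = np.zeros(n_par)              # accumulated right-hand side A^T y
      for k in range(n_epochs):
          A = rng.normal(size=(n_obs, n_par))             # design matrix of epoch k
          y = A @ x_true + 0.01 * rng.normal(size=n_obs)  # observations of epoch k
          A_all.append(A); y_all.append(y)

          N += A.T @ A                 # cheap n_par x n_par update per epoch
          b += A.T @ y
          x_seq = np.linalg.solve(N, b)                   # sequential solution at epoch k

          # Batch solution at epoch k: stack all observations so far and re-solve
          x_batch = np.linalg.lstsq(np.vstack(A_all), np.concatenate(y_all), rcond=None)[0]
          assert np.allclose(x_seq, x_batch)              # identical results every epoch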

  18. Gas Dynamics and Kinetics in the Cometary Coma: Theory and Observations

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.; Harris, Walter M.; Smyth, William H.

    2005-01-01

    Our ability to describe the physical state of the expanding coma affects fundamental areas of cometary study both directly and indirectly. In order to convert measured abundances of gas species in the coma to gas production rates, models for the distribution and kinematics of gas species in the coma are required. Conversely, many different types of observations, together with laboratory data and theory, are still required to determine coma model attributes and parameters. Accurate relative and absolute gas production rates and their variations with time and from comet to comet are crucial to our basic understanding of the composition and structure of cometary nuclei and their place in the solar system. We review the gas dynamics and kinetics of cometary comae from both theoretical and observational perspectives, which are important for understanding the wide variety of physical conditions that are encountered.

  19. Single-Molecule Three-Color FRET with Both Negligible Spectral Overlap and Long Observation Time

    PubMed Central

    Hohng, Sungchul

    2010-01-01

    Full understanding of complex biological interactions frequently requires multi-color detection capability in single-molecule fluorescence resonance energy transfer (FRET) experiments. Existing single-molecule three-color FRET techniques, however, suffer from severe photobleaching of Alexa 488, or its alternative dyes, and have seen only limited use in kinetics studies. In this work, we developed a single-molecule three-color FRET technique based on the Cy3-Cy5-Cy7 dye trio, thus providing enhanced observation time and improved data quality. Because the absorption spectra of the three fluorophores are well separated, real-time monitoring of three FRET efficiencies was possible by incorporating the alternating laser excitation (ALEX) technique both in confocal microscopy and in total-internal-reflection fluorescence (TIRF) microscopy. PMID:20808851

  20. Effect of temperature on embryonic development of Melanotaenia boesemani (Allen and Cross, 1982).

    PubMed

    Radael, Marcella Costa; Cardoso, Leonardo Demier; de Andrade, Dalcio Ricardo; Ferreira, André Veloso; da Cruz Mattos, Douglas; Vidal, Manuel Vazquez

    2016-04-01

    The present study aimed to provide data on the time required for Melanotaenia boesemani to complete embryonic development, and to investigate the influence of incubation at different temperatures on this species. The effects of temperature on hatching time and hatching rate are presented, as well as information on the stages of embryonic development. After fertilization, the eggs were kept in incubators at 23, 26, 29 or 32°C and observed at predetermined times until the moment of hatching. Stages of development were identified and classified according to morphological and physiological characteristics. Oil droplets were visualized inside the eggs, as well as adhesive filaments on the chorion. Embryonic development was similar to that observed in other species of the genus Melanotaenia, with faster development and earlier hatching at higher temperatures.

  1. Light-Flash Wind-Direction Indicator

    NASA Technical Reports Server (NTRS)

    Zysko, Jan A.

    1993-01-01

    Proposed wind-direction indicator read easily by distant observers. Indicator emits bright flashes of light separated by interval of time proportional to angle between true north and direction from which wind blowing. Timing of flashes indicates direction of wind. Flashes, from high-intensity stroboscopic lights, seen by viewers at distances of 5 miles or more. Also seen more easily through rain and fog. Indicator self-contained, requiring no connections to other equipment. Power demand satisfied by battery or solar power or both. Set up quickly to provide local surface-wind data for aircraft pilots during landing or hovering, for safety officers establishing hazard zones and safety corridors during handling of toxic materials, for foresters and firefighters conducting controlled burns, and for real-time wind observations during any of variety of wind-sensitive operations.

  2. Drivers of leaf-out phenology and their implications for species invasions: insights from Thoreau's Concord.

    PubMed

    Polgar, Caroline; Gallinat, Amanda; Primack, Richard B

    2014-04-01

    To elucidate climate-driven changes in leaf-out phenology and their implications for species invasions, we observed and experimentally manipulated leaf out of invasive and native woody plants in Concord, MA, USA. Using observations collected by Henry David Thoreau (1852-1860) and our own observations (2009-2013), we analyzed changes in leaf-out timing and sensitivity to temperature for 43 woody plant species. We experimentally tested winter chilling requirements of 50 species by exposing cut branches to warm indoor temperatures (22°C) during the winter and spring of 2013. Woody species are now leafing out an average of 18 d earlier than they did in the 1850s, and are advancing at a rate of 5 ± 1 days per °C. Functional groups differ significantly in the duration of chilling they require to leaf out: invasive shrubs generally have weaker chilling requirements than native shrubs and leaf out faster in the laboratory and earlier in the field; native trees have the strongest chilling requirements. Our results suggest that invasive shrub species will continue to have a competitive advantage as the climate warms, because native plants are slower to respond to warming spring temperatures and, in the future, may not meet their chilling requirements. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.

  3. Lighting Quality Affects Eyestrain of Operators at Sorting Station in Beverage Industry

    NASA Astrophysics Data System (ADS)

    Anizar; Erwin

    2017-03-01

    This study observes the performance of sorters at two beverage plants, whose job is to separate out defective products. Sorters continuously inspect bottle quality and beverage quality, which requires sustained visual focus and places a heavy load on the eyes. Sorters' eyestrain causes more defective products to pass the selection. In this study, measurements are made of illumination, operators' response time, and the defective products that pass the selection. Measurements are taken at the two beverage plants over four days, with four measurements per day: twice in the morning and twice in the afternoon. Illumination is measured with a 4-in-1 environmental meter on a 1 m x 1 m grid, while operators' response time is measured with a flicker fusion test. Illuminance is generally higher in the morning than in the evening, but still below the Indonesian standard. Overall, sorters' response time is higher in the morning than in the afternoon. A higher response time indicates that operators experience less fatigue than a lower response time. The sorting duration also affects operators' response time and the defective products that pass the selection.

  4. Priorities and developments of sensors, samplers and methods for key marine biological observations.

    NASA Astrophysics Data System (ADS)

    Simmons, Samantha; Chavez, Francisco; Pearlman, Jay

    2016-04-01

    Over the last two decades or more, physical oceanography has seen a significant growth in in-situ sensors and platforms, including fixed-point and cabled observatories, Argo floats, gliders and AUVs, to supplement satellites in creating a 3-D view of the time-varying global ocean temperature and salinity structures. There have also been important recent developments for biogeochemists in monitoring nitrate, chemical contaminants, oxygen and pH, which can now be added to these autonomous systems. Biologists are still lagging. Given the importance of biology to ocean health and the future earth, and the present reliance on humans and ships for observing species and abundance, it is paramount that new biological sensor systems be developed. Some promising sensor systems, based on but not limited to acoustic, chemical, genomic or imaging techniques, that can sense organisms from microbes to whales are on the horizon. These techniques can be applied in situ with either real-time or recorded data, and samples can be captured and returned to the laboratory using the autonomous systems. The number of samples is limiting, requiring adaptive and smart systems. Two steps are envisioned to meet the challenges. The first is to identify the priority biological variables to focus observation requirements and planning. The second is to address new sensors that can fill the gaps in current capabilities for biological observations. This abstract will review recent efforts to identify core biological variables for the US Integrated Ocean Observing System and address new sensors and innovations for observing these variables, particularly focused on the availability and maturity of sensors.

  5. Image-Enhancement Aid For The Partially Sighted

    NASA Technical Reports Server (NTRS)

    Lawton, T. A.; Gennery, D. B.

    1989-01-01

    Digital filtering enhances ability to read and to recognize objects. Possible to construct portable vision aid by combining miniature video equipment to observe scene and display images with very-large-scale integrated circuits to implement real-time digital image-data processing. Afflicted observer views scene through magnifier to shift spatial frequencies downward and thereby improve perceived image. However, the less magnification needed, the larger the scene that can be observed. Thus, one measure of effectiveness of new system is amount of magnification required with and without it. In series of tests, found 27 to 70 percent more magnification needed for afflicted observers to recognize unfiltered words than to recognize filtered words.

  6. The central pixel of the MAGIC telescope for optical observations

    NASA Astrophysics Data System (ADS)

    Lucarelli, F.; Barrio, J. A.; Antoranz, P.; Asensio, M.; Camara, M.; Contreras, J. L.; Fonseca, M. V.; Lopez, M.; Miranda, J. M.; Oya, I.; Reyes, R. De Los; Firpo, R.; Sidro, N.; Goebel, F.; Lorenz, E.; Otte, N.

    2008-05-01

    The MAGIC telescope has been designed for the observation of Cherenkov light generated in Extensive Air Showers initiated by cosmic particles. However, its 17 m diameter mirror and optical design make the telescope suitable for direct optical observations as well. In this paper, we report on the development of a system based on the use of a dedicated photo-multiplier (PMT) for optical observations. This PMT is installed in the centre of the MAGIC camera (the so-called central pixel). An electro-optical system has been developed in order to transmit the PMT output signal by an optical fibre to the counting room, where it is digitized and stored for off-line analysis. The performance of the system, using the optical pulsation of the Crab nebula as a calibration source, is presented. The time required for a 5σ detection of the Crab pulsar in the optical band is less than 20 s. The central pixel will be mainly used to perform simultaneous observations of the Crab pulsar both in the optical and γ-ray regimes. It will also allow for periodic testing of the precision of the MAGIC timing system using the Crab rotational optical pulses as a very precise timing reference.

  7. The longevity of habitable planets and the development of intelligent life

    NASA Astrophysics Data System (ADS)

    Simpson, Fergus

    2017-07-01

    Why did the emergence of our species require a timescale similar to the entire habitable period of our planet? Our late appearance has previously been interpreted by Carter (2008) as evidence that observers typically require a very long development time, implying that intelligent life is a rare occurrence. Here we present an alternative explanation, which simply asserts that many planets possess brief periods of habitability. We also propose that the rate-limiting step for the formation of observers is the enlargement of species from an initially microbial state. In this scenario, the development of intelligent life is a slow but almost inevitable process, greatly enhancing the prospects of future search for extra-terrestrial intelligence (SETI) experiments such as the Breakthrough Listen project.

  8. Measuring weather for aviation safety in the 1980's

    NASA Technical Reports Server (NTRS)

    Wedan, R. W.

    1980-01-01

    Requirements for an improved aviation weather system are defined and specifically include the need for (1) weather observations at all airports with instrument approaches, (2) more accurate and timely radar detection of weather elements hazardous to aviation, and (3) better methods of timely distribution of both pilot reports and ground weather data. The development of the discrete address beacon system data link, Doppler weather radar network, and various information processing techniques are described.

  9. Program Aids Visualization Of Data

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1995-01-01

    Living Color Frame System (LCFS) computer program developed to solve some problems that arise in connection with generation of real-time graphical displays of numerical data and of statuses of systems. Need for program like LCFS arises because computer graphics often applied for better understanding and interpretation of data under observation and these graphics become more complicated when animation required during run time. Eliminates need for custom graphical-display software for application programs. Written in Turbo C++.

  10. Multimodal imaging of vascular grafts using time-resolved fluorescence and ultrasound

    NASA Astrophysics Data System (ADS)

    Fatakdawala, Hussain; Griffiths, Leigh G.; Wong, Maelene L.; Humphrey, Sterling; Marcu, Laura

    2015-02-01

    The translation of engineered tissues into the clinic requires robust monitoring of tissue development, both in vitro and in vivo. Traditional methods for this are destructive, inefficient in time and cost, and do not allow time-lapse measurements from the same sample or animal. This study reports on the ability of time-resolved fluorescence and ultrasound measurements to non-destructively characterize explanted tissue-engineered vascular grafts. The results show that TRFS and FLIm are able to assess alterations in luminal composition, namely elastin, collagen and cellular (hyperplasia) content, via changes in fluorescence lifetime values between normal and grafted tissue. These observations are complemented by structural changes observed in UBM pertaining to graft integration and intimal thickness over the grafted region. These results encourage the future application of a catheter-based technique that combines these imaging modalities for non-destructive characterization of vascular grafts in vivo.

  11. A Catalog of Transit Timing Posterior Distributions for all Kepler Planet Candidate Events

    NASA Astrophysics Data System (ADS)

    Montet, Benjamin Tyler; Becker, Juliette C.; Johnson, John

    2015-08-01

    Kepler has ushered in a new era of planetary dynamics, enabling the detection of interactions between multiple planets in hundreds of transiting systems. These interactions, observed as transit timing variations (TTVs), have been used to find non-transiting companions to transiting systems and to measure masses, eccentricities, and inclinations of transiting planets. Often, physical parameters are inferred by comparing the observed light curve to the result of a photodynamical model, a time-intensive process that often ignores the effects of correlated noise in the light curve. Catalogs of transit timing observations have previously neglected non-Gaussian uncertainties in the times of transit, uncertainties in the transit shape, and short cadence data. Here, we present a catalog of not only times of transit centers, but also posterior distributions on the time of transit for every planet candidate transit event in the Kepler data, developed through importance sampling of each transit. This catalog allows us to marginalize over uncertainties in the transit shape and incorporate short cadence data, the effects of correlated noise, and non-Gaussian posteriors. Our catalog will enable dynamical studies that accurately reflect the precision of Kepler and its limitations without requiring the computational power to model the light curve completely with every integration.
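
    The kind of per-transit posterior the catalog provides can be illustrated with a toy importance-sampling step: candidate mid-times are drawn from a broad proposal and reweighted by the data likelihood under a simple box-shaped transit model. Everything in the sketch (the model, noise level, depth and duration) is invented for illustration and is not the catalog's photometric model.

      # Sketch only: importance sampling a posterior for one transit mid-time t0
      import numpy as np

      rng = np.random.default_rng(2)

      def box_model(t, t0, depth=5e-4, duration=0.12):
          """Toy transit: unit flux with a box-shaped dip centred on t0 (days)."""
          flux = np.ones_like(t)
          flux[np.abs(t - t0) < duration / 2.0] -= depth
          return flux

      t = np.arange(-0.5, 0.5, 0.0007)                  # ~1-minute cadence, days
      sigma = 2e-4
      flux = box_model(t, t0=0.013) + sigma * rng.normal(size=t.size)

      # Broad uniform proposal for t0; the weights are the data likelihoods
      t0_samples = rng.uniform(-0.1, 0.1, 20000)
      logw = np.array([-0.5 * np.sum((flux - box_model(t, s))**2) / sigma**2
                       for s in t0_samples])
      w = np.exp(logw - logw.max())
      w /= w.sum()                                      # normalised importance weights

      mean_t0 = np.sum(w * t0_samples)
      std_t0 = np.sqrt(np.sum(w * (t0_samples - mean_t0)**2))
      print(f"t0 = {mean_t0:.5f} +/- {std_t0:.5f} d")   # the weighted samples form the posterior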

  12. Kanaka Maoli and Kamaʻāina Seascapes - Knowing Our Ocean Through Times of Change

    NASA Astrophysics Data System (ADS)

    Puniwai, N.

    2017-12-01

    In Hawaiʻi our oceans define us; we come from the ocean. Our oceans change, and we change with them, as we always have. By learning from people who are dependent on their environment, we learn how to observe and how to adapt. Through the lens of climate change, we interviewed respected ocean observers and surfers to learn about changes they have witnessed over time and the spatial scales and ocean conditions important to them. We looked at our ancient and historical texts to see what processes they recorded and the language they used to ascribe their observations, interactions and relationships to these places. Yet, we also integrate what our mechanical data sensors have recorded over recent time. By expanding our time scales of reference, knowledge sources, and collaborators, these methods teach us how our ancestors adapted and how climate change may impact our subsistence, recreation, and interactions with the environment. Managing complex seascapes requires the integration of multiple ways of knowing; strengthening our understanding of seascapes and their resiliency in this changing environment.

  13. Accurate acceleration of kinetic Monte Carlo simulations through the modification of rate constants.

    PubMed

    Chatterjee, Abhijit; Voter, Arthur F

    2010-05-21

    We present a novel computational algorithm called the accelerated superbasin kinetic Monte Carlo (AS-KMC) method that enables a more efficient study of rare-event dynamics than the standard KMC method while maintaining control over the error. In AS-KMC, the rate constants for processes that are observed many times are lowered during the course of a simulation. As a result, rare processes are observed more frequently than in KMC and the time progresses faster. We first derive error estimates for AS-KMC when the rate constants are modified. These error estimates are next employed to develop a procedure for lowering process rates with control over the maximum error. Finally, numerical calculations are performed to demonstrate that the AS-KMC method captures the correct dynamics, while providing significant CPU savings over KMC in most cases. We show that the AS-KMC method can be employed with any KMC model, even when no time scale separation is present (although in such cases no computational speed-up is observed), without requiring the knowledge of various time scales present in the system.
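
    The core idea, damping the rates of processes that have already been observed many times so that rare events are reached in fewer steps, can be sketched as below. The threshold and damping factor are arbitrary illustration values; the paper's error-controlled procedure for choosing them is not reproduced here.

      # Schematic sketch of the AS-KMC idea (the paper's error control is omitted)
      import math, random

      rates = {"fast_hop": 1.0e6, "rare_escape": 1.0}   # toy process rates (1/s)
      counts = {name: 0 for name in rates}
      FREQ_THRESHOLD = 100   # observations before a rate is damped (assumed)
      DAMPING = 0.5          # factor applied to over-observed rates (assumed)

      random.seed(0)
      t = 0.0
      for step in range(100000):
          total = sum(rates.values())
          # Standard KMC step: pick a process with probability rate/total,
          # then advance time by an exponential increment.
          r = random.random() * total
          acc, chosen = 0.0, None
          for name, k in rates.items():
              acc += k
              if r < acc:
                  chosen = name
                  break
          if chosen is None:            # guard against floating-point round-off
              chosen = name
          t += -math.log(1.0 - random.random()) / total
          counts[chosen] += 1

          if chosen == "rare_escape":
              print(f"rare event after {step + 1} steps, t = {t:.3e} s")
              break

          # AS-KMC-style acceleration: damp a rate once its process is seen often
          if counts[chosen] >= FREQ_THRESHOLD:
              rates[chosen] *= DAMPING
              counts[chosen] = 0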

  14. Simulation trainer for practicing emergent open thoracotomy procedures.

    PubMed

    Hamilton, Allan J; Prescher, Hannes; Biffar, David E; Poston, Robert S

    2015-07-01

    An emergent open thoracotomy (OT) is a high-risk, low-frequency procedure uniquely suited for simulation training. We developed a cost-effective Cardiothoracic (CT) Surgery trainer and assessed its potential for improving technical and interprofessional skills during an emergent simulated OT. We modified a commercially available mannequin torso with artificial tissue models to create a custom CT Surgery trainer. The trainer's feasibility for simulating emergent OT was tested using a multidisciplinary CT team in three consecutive in situ simulations. Five discretely observable milestones were identified as requisite steps in carrying out an emergent OT, namely (1) diagnosis and declaration of a code situation, (2) arrival of the code cart, (3) arrival of the thoracotomy tray, (4) initiation of the thoracotomy incision, and (5) defibrillation of a simulated heart. The time required for a team to achieve each discrete step was measured by an independent observer over the course of each OT simulation trial and compared. Over the course of the three OT simulation trials conducted in the coronary care unit, there was an average reduction of 29.5% (P < 0.05) in the times required to achieve the five critical milestones. The time required to complete the whole OT procedure improved by 7 min and 31 s from the initial to the final trial, an overall improvement of 40%. In our preliminary evaluation, the CT Surgery trainer appears to be useful for improving team performance during a simulated emergent bedside OT in the coronary care unit. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Awareness-based game-theoretic space resource management

    NASA Astrophysics Data System (ADS)

    Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.

    2009-05-01

    Over recent decades, the space environment has become more complex, with a significant increase in space debris and a greater density of spacecraft, which poses great difficulties for efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations based on (a) accommodating awareness modeling and updating and (b) collaborative search and tracking of space objects. The basic approach is described as follows. First, partition the relevant region of interest into distinct cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user-specified requirements. Note that when an object responds intelligently to the sensing event, the sensor assigned to observe an intelligent object may switch from time to time between a strong, active signal mode and a passive mode to maximize the total amount of information obtained over a multi-step time horizon and to avoid risks. Fourth, if all explicitly specified requirements are satisfied and there are still sensing resources available, assign the additional sensing resources to objects without explicitly specified requirements via an information-based approach (sketched below). Finally, sensor scheduling is applied to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method with a realistic space resource management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and track with multiple space-borne observers.
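
    The information-based assignment of leftover sensing resources mentioned in the fourth step can be sketched as a greedy loop that always tasks the next look to the cell where it buys the largest expected entropy reduction. The scalar-variance gain model and all numbers below are assumptions for illustration, not the paper's formulation.

      # Illustrative sketch only: greedy, information-based tasking of spare sensor looks
      import math

      cell_variance = {"cell_A": 25.0, "cell_B": 9.0, "cell_C": 49.0}  # prior variances (assumed)
      sensor_noise = 4.0       # measurement noise variance R (assumed)
      spare_looks = 4          # looks left after explicit user requirements are met

      def info_gain(P, R):
          """Entropy reduction (nats) of a scalar Kalman update with prior variance P."""
          return 0.5 * math.log((P + R) / R)

      for _ in range(spare_looks):
          best = max(cell_variance, key=lambda c: info_gain(cell_variance[c], sensor_noise))
          P = cell_variance[best]
          cell_variance[best] = P * sensor_noise / (P + sensor_noise)  # posterior variance
          print(best, "->", round(cell_variance[best], 2))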

  16. Temporal Change of Seismic Earth's Inner Core Phases: Inner Core Differential Rotation Or Temporal Change of Inner Core Surface?

    NASA Astrophysics Data System (ADS)

    Yao, J.; Tian, D.; Sun, L.; Wen, L.

    2017-12-01

    Since Song and Richards [1996] first reported seismic evidence for temporal change of the PKIKP wave (a compressional wave refracted in the inner core) and proposed inner core differential rotation as its explanation, the result has generated enormous interest in the scientific community and the public, and has motivated many studies on the implications of inner core differential rotation. However, since Wen [2006] reported seismic evidence for temporal change of the PKiKP wave (a compressional wave reflected from the inner core boundary) that requires temporal change of the inner core surface, both interpretations for the temporal change of inner core phases have existed, i.e., inner core rotation and temporal change of the inner core surface. In this study, we discuss the interpretation of the observed temporal changes of those inner core phases and conclude that inner core differential rotation is not only not required but also in contradiction with three lines of seismic evidence from global repeating earthquakes. Firstly, inner core differential rotation provides an implausible explanation for a disappearing inner core scatterer between a doublet in the South Sandwich Islands (SSI), which is located beneath northern Brazil based on the PKIKP and PKiKP coda waves of the earlier event of the doublet. Secondly, temporal change of PKIKP and its coda waves among a cluster in the SSI is inconsistent with the interpretation of inner core differential rotation, with one set of the data requiring inner core rotation and the other requiring non-rotation. Thirdly, it is not reasonable to invoke inner core differential rotation to explain travel time change of PKiKP waves on a very small time scale (several months), which is observed for repeating earthquakes in the Middle America subduction zone. On the other hand, temporal change of the inner core surface could provide a consistent explanation for all the observed temporal changes of PKIKP and PKiKP and their coda waves. We conclude that the observed temporal changes of the inner core phases are caused by temporal changes of the inner core surface. The temporal changes of the inner core surface are found to occur in some localized regions within a short time scale (years to months), a phenomenon that should provide important clues to a potentially fundamental change of our understanding of core dynamics.

  17. 49 CFR 238.111 - Pre-revenue service acceptance testing plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the times and places of the pre-revenue service tests to permit FRA observation of such tests. For... 49 Transportation 4 2010-10-01 2010-10-01 false Pre-revenue service acceptance testing plan. 238... and General Requirements § 238.111 Pre-revenue service acceptance testing plan. (a) Passenger...

  18. 49 CFR 238.111 - Pre-revenue service acceptance testing plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the times and places of the pre-revenue service tests to permit FRA observation of such tests. For... 49 Transportation 4 2011-10-01 2011-10-01 false Pre-revenue service acceptance testing plan. 238... and General Requirements § 238.111 Pre-revenue service acceptance testing plan. (a) Passenger...

  19. 49 CFR 238.111 - Pre-revenue service acceptance testing plan.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the times and places of the pre-revenue service tests to permit FRA observation of such tests. For... 49 Transportation 4 2012-10-01 2012-10-01 false Pre-revenue service acceptance testing plan. 238... and General Requirements § 238.111 Pre-revenue service acceptance testing plan. (a) Passenger...

  20. 49 CFR 238.111 - Pre-revenue service acceptance testing plan.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the times and places of the pre-revenue service tests to permit FRA observation of such tests. For... 49 Transportation 4 2014-10-01 2014-10-01 false Pre-revenue service acceptance testing plan. 238... and General Requirements § 238.111 Pre-revenue service acceptance testing plan. (a) Passenger...

  1. 49 CFR 238.111 - Pre-revenue service acceptance testing plan.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the times and places of the pre-revenue service tests to permit FRA observation of such tests. For... 49 Transportation 4 2013-10-01 2013-10-01 false Pre-revenue service acceptance testing plan. 238... and General Requirements § 238.111 Pre-revenue service acceptance testing plan. (a) Passenger...

  2. Extended Cox regression model: The choice of time function

    NASA Astrophysics Data System (ADS)

    Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu

    2017-07-01

    The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM provides a test of the PH assumption by including a time-dependent covariate, and serves as an alternative model in the case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
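
    The practical step behind choosing the time function is to build a long-format data set in which the covariate of interest is interacted with a candidate time function g(t), refitting the model for each choice of g. The sketch below only prepares that data set; the column names and the tiny example records are assumptions for illustration, and the subsequent time-varying Cox fit is left to any standard survival package.

      # Sketch only: episode splitting for an extended Cox model with x * g(t)
      import numpy as np
      import pandas as pd

      subjects = pd.DataFrame({
          "id":    [1, 2, 3],
          "time":  [5.0, 8.0, 3.0],    # observed survival/censoring time
          "event": [1, 0, 1],          # 1 = event, 0 = censored
          "x":     [0.2, -1.1, 0.7],   # covariate suspected of non-proportional hazards
      })

      time_functions = {"identity": lambda t: t, "log": np.log, "sqrt": np.sqrt}

      def episode_split(df, g):
          """Split each subject at the observed event times and attach x * g(t)."""
          cuts = np.sort(df.loc[df.event == 1, "time"].unique())
          rows = []
          for _, s in df.iterrows():
              start = 0.0
              for c in cuts[cuts <= s.time]:
                  rows.append({"id": s.id, "start": start, "stop": c,
                               "event": int(s.event and c == s.time),
                               "x": s.x, "x_gt": s.x * g(c)})
                  start = c
              if start < s.time:       # remaining follow-up after the last cut
                  rows.append({"id": s.id, "start": start, "stop": s.time,
                               "event": int(s.event), "x": s.x, "x_gt": s.x * g(s.time)})
          return pd.DataFrame(rows)

      long_log = episode_split(subjects, time_functions["log"])
      print(long_log)   # ready for a time-varying Cox fit; repeat for each g and compare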

  3. Calibration of hydrological models using flow-duration curves

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; Guerrero, J.-L.; Younger, P. M.; Beven, K. J.; Seibert, J.; Halldin, S.; Freer, J. E.; Xu, C.-Y.

    2010-12-01

    The degree of belief we have in predictions from hydrologic models depends on how well they can reproduce observations. Calibrations with traditional performance measures such as the Nash-Sutcliffe model efficiency are challenged by problems including: (1) uncertain discharge data, (2) variable importance of the performance with flow magnitudes, (3) influence of unknown input/output errors, and (4) inability to evaluate model performance when observation time periods for discharge and model input data do not overlap. A new calibration method using flow-duration curves (FDCs) was developed which addresses these problems. The method focuses on reproducing the observed discharge frequency distribution rather than the exact hydrograph. It consists of applying limits of acceptability for selected evaluation points (EPs) of the observed uncertain FDC in the extended GLUE approach. Two ways of selecting the EPs were tested: one based on equal intervals of discharge and one on equal intervals of water volume. The method was tested and compared to a calibration using the traditional model efficiency for the daily four-parameter WASMOD model in the Paso La Ceiba catchment in Honduras and for Dynamic TOPMODEL evaluated at an hourly time scale for the Brue catchment in Great Britain. The volume method of selecting EPs gave the best results in both catchments with better calibrated slow flow, recession and evaporation than the other criteria. Observed and simulated time series of uncertain discharges agreed better for this method both in calibration and prediction in both catchments without resulting in overpredicted simulated uncertainty. An advantage of the method is that the rejection criterion is based on an estimation of the uncertainty in discharge data and that the EPs of the FDC can be chosen to reflect the aims of the modelling application, e.g. using more/fewer EPs at high/low flows. While the new method is less sensitive to epistemic input/output errors than the normal use of limits of acceptability applied directly to the time series of discharge, it still requires a reasonable representation of the distribution of inputs. Additional constraints might therefore be required in catchments subject to snow. The results suggest that the new calibration method can be useful when observation time periods for discharge and model input data do not overlap. The new method could also be suitable for calibration to regional FDCs while taking uncertainties in the hydrological model and data into account.
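
    A minimal sketch of the limits-of-acceptability check on a flow-duration curve follows. The candidate simulation, the evaluation points, and the acceptance bounds are invented numbers for illustration; in the study the bounds come from the uncertainty analysis of the observed discharge.

      # Sketch only: GLUE-style limits-of-acceptability test on a flow-duration curve
      import numpy as np

      def flow_duration_curve(q, exceedance_probs):
          """Discharges exceeded with the given probabilities (empirical FDC)."""
          return np.quantile(q, 1.0 - np.asarray(exceedance_probs))

      ep_probs = [0.05, 0.30, 0.70, 0.95]          # exceedance probabilities of the EPs
      limits = np.array([[38.0, 55.0],             # acceptable (lower, upper) discharge
                         [11.0, 16.0],             # ranges at each EP, m3/s (invented)
                         [ 3.0,  4.8],
                         [ 0.6,  1.2]])

      rng = np.random.default_rng(3)
      q_sim = rng.lognormal(mean=1.5, sigma=1.0, size=3650)   # one candidate simulation

      sim_fdc = flow_duration_curve(q_sim, ep_probs)
      within = (sim_fdc >= limits[:, 0]) & (sim_fdc <= limits[:, 1])
      behavioural = bool(within.all())             # keep the parameter set only if all EPs pass
      print(sim_fdc.round(2), "accepted:", behavioural)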

  4. Implementation of a timed, electronic, assessment-driven potassium-replacement protocol.

    PubMed

    Zielenski, Christopher; Crabtree, Adam; Le, Tien; Marlatt, Alyse; Ng, Dana; Tran, Alan

    2017-06-15

    The adherence to, effectiveness, and safety of a timed, electronic, assessment-driven potassium-replacement protocol (TARP) compared with an electronic nurse-driven replacement protocol (NRP) are reported. A retrospective observational study was conducted in a community hospital evaluating protocol adherence, effectiveness, and safety for the two potassium-replacement protocols. All adults on medical units with an order for potassium replacement per protocol during the 3-month trial periods were reviewed. All patients requiring potassium replacement per protocol were included in the analysis. Adherence to the protocol was assessed by evaluating the dose of potassium administered and the performance of reassessments. Effectiveness of the protocol was assessed by evaluating the time to achieve target potassium levels. Safety was assessed by evaluating the route of administration and the occurrence of hyperkalemia. A total of 300 patients treated using potassium-replacement protocols required potassium replacement during the study period, with 148 patients in the NRP group requiring 491 instances of potassium replacement. In the TARP group, a total of 564 instances requiring potassium replacement corresponded to 152 patients. Of the 491 instances requiring replacement in the NRP group, the correct dose was administered and reassessment performed 117 times (23.8%). Overall adherence (p < 0.05), correct dose given (p < 0.05), average time from blood draw to potassium replacement (p < 0.0001), use of oral replacement (p < 0.05), and time to achieve the target potassium level within 12 hours (p < 0.05) were significantly improved in the TARP group. The TARP improved the effectiveness and safety of potassium-replacement therapy over the traditional NRP without negatively affecting timeliness of care. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  5. A computationally efficient approach for isolating satellite phase fractional cycle biases based on Kalman filter

    NASA Astrophysics Data System (ADS)

    Xiao, Guorui; Mayer, Michael; Heck, Bernhard; Sui, Lifen; Cong, Mingri

    2017-04-01

    Integer ambiguity resolution (AR) can significantly shorten the convergence time and improve the accuracy of Precise Point Positioning (PPP). Phase fractional cycle biases (FCB) originating from satellites destroy the integer nature of carrier phase ambiguities. To isolate the satellite FCB, observations from a global reference network are required. Firstly, float ambiguities containing FCBs are obtained by PPP processing. Secondly, the least squares method (LSM) is adopted to recover FCBs from all the float ambiguities. Finally, the estimated FCB products can be applied by the user to achieve PPP-AR. During the estimation of FCB, the LSM step can be very time-consuming, considering the large number of observations from hundreds of stations and thousands of epochs. In addition, iterations are required to deal with the one-cycle inconsistency among observations. Since the integer ambiguities are derived by directly rounding float ambiguities, the one-cycle inconsistency arises whenever the fractional parts of float ambiguities exceed the rounding boundary (e.g., 0.5 and -0.5). The iterations of LSM and the large number of observations require a long time to finish the estimation. Consequently, only a sparse global network containing a limited number of stations was processed in previous research. In this paper, we propose to isolate the FCB based on a Kalman filter. The large number of observations is handled epoch-by-epoch, which significantly reduces the dimension of the involved matrix and accelerates the computation. In addition, it is also suitable for real-time applications. As for the one-cycle inconsistency, a pre-elimination method is developed to avoid the iteration of the whole process. According to the analysis of the derived satellite FCB products, we find that both wide-lane (WL) and narrow-lane (NL) FCB are very stable over time (e.g., WL FCB over several days and NL FCB over tens of minutes). The stability implies that the satellite FCB can be removed using a previous estimate. After subtraction of the satellite FCB, the receiver FCB can be determined. Theoretically, the receiver FCBs derived from different satellite observations should be the same for a single station. Thereby, the one-cycle inconsistency among satellites can be detected and eliminated by adjusting the corresponding receiver FCB. Here, stations can be handled individually to obtain "clean" FCB observations. In an experiment, 24 h observations from 200 stations are processed to estimate GPS FCB. The process finishes in one hour using a personal computer. The estimated WL FCB shows good consistency with existing WL FCB products (e.g., CNES, WHU-SGG). All differences are within ± 0.1 cycles, which indicates the correctness of the proposed approach. For NL FCB, all differences are within ± 0.2 cycles. Concerning the NL wavelength (10.7 cm), the slightly worse NL FCB may be ascribed to different PPP processing strategies. The state-based approach of the Kalman filter also allows for a more realistic modeling of stochastic parameters, which will be investigated in future research.
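
    The epoch-by-epoch character of the estimator can be illustrated with a heavily simplified scalar example: a random-walk Kalman filter tracking one satellite's wide-lane FCB from the fractional parts of float ambiguities, with the innovation wrapped to half a cycle standing in for the one-cycle pre-elimination. All noise values and the constant true bias are assumptions, and the real estimator of course handles full ambiguity networks and receiver FCBs as well.

      # Simplified sketch only: scalar random-walk Kalman filter for one satellite's WL FCB
      import numpy as np

      def wrap_cycle(x):
          """Map a value in cycles to the interval (-0.5, 0.5]."""
          return x - np.ceil(x - 0.5)

      rng = np.random.default_rng(4)
      true_fcb = 0.37                      # cycles, held constant in this toy example
      n_epochs, n_stations = 200, 30

      x, P = 0.0, 1.0                      # state estimate and variance (cycles^2)
      Q, R = 1e-6, 0.02**2                 # process and measurement noise (assumed)

      for _ in range(n_epochs):
          P += Q                           # time update: FCB modelled as a random walk
          # Fractional parts of the float ambiguities from all stations seeing the satellite
          obs = wrap_cycle(true_fcb + 0.02 * rng.normal(size=n_stations))
          for z in obs:                    # one measurement update per station
              y = wrap_cycle(z - x)        # innovation wrapped to within half a cycle
              K = P / (P + R)
              x = wrap_cycle(x + K * y)
              P *= (1.0 - K)

      print(f"estimated FCB: {x:.3f} cycles (truth {true_fcb})")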

  6. Routing of radioactive shipments in networks with time-varying costs and curfews

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowler, L.A.; Mahmassani, H.S.

    This research examines routing of radioactive shipments in highway networks with time-dependent travel times and population densities. A time-dependent least-cost path (TDLCP) algorithm that uses a label-correcting approach is adapted to include curfews and waiting at nodes. A method is developed to estimate time-dependent population densities, which are required to estimate risk associated with the use of a particular highway link at a particular time. The TDLCP algorithm is implemented for example networks and used to examine policy questions related to radioactive shipments. It is observed that when only Interstate highway facilities are used to transport these materials, a shipment must go through many cities and has difficulty avoiding all of them during their rush hour periods. Decreases in risk, increased departure time flexibility, and modest increases in travel times are observed when primary and/or secondary roads are included in the network. Based on the results of the example implementation, the suitability of the TDLCP algorithm for strategic nuclear material and general radioactive material shipments is demonstrated.
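
    The label-correcting idea can be sketched in a few lines. The following is an illustrative simplification, not the authors' code: time is discretized, link travel times and costs (e.g., population risk) depend on the entry time, curfewed links are skipped, and waiting at a node is allowed at a configurable cost. All function and parameter names are hypothetical.

        from collections import deque

        def td_least_cost_path(succ, travel_time, link_cost, curfew_ok,
                               origin, dest, t0, horizon, wait_cost=0.0):
            """Label-correcting sketch of a time-dependent least-cost path.

            succ(u)            -> iterable of successor nodes of u
            travel_time(u,v,t) -> integer travel time on link (u,v) entered at time t
            link_cost(u,v,t)   -> cost (e.g. population risk) of entering (u,v) at t
            curfew_ok(u,v,t)   -> False if a curfew forbids entering (u,v) at time t
            Waiting at a node advances time by one step at cost `wait_cost`.
            """
            INF = float("inf")
            # best[(node, time)] = least cost found so far of reaching node at that time
            best = {(origin, t0): 0.0}
            queue = deque([(origin, t0)])
            while queue:
                u, t = queue.popleft()
                c = best[(u, t)]
                # Option 1: wait one time step at the current node.
                if t + 1 <= horizon and c + wait_cost < best.get((u, t + 1), INF):
                    best[(u, t + 1)] = c + wait_cost
                    queue.append((u, t + 1))
                # Option 2: traverse an outgoing link, if no curfew applies.
                for v in succ(u):
                    if not curfew_ok(u, v, t):
                        continue
                    ta = t + travel_time(u, v, t)
                    cv = c + link_cost(u, v, t)
                    if ta <= horizon and cv < best.get((v, ta), INF):
                        best[(v, ta)] = cv
                        queue.append((v, ta))
            # Least-cost arrival at the destination over all arrival times
            # (raises ValueError if the destination is unreachable).
            return min((c, t) for (n, t), c in best.items() if n == dest)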

  7. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thereby reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.

  8. A containerless levitation setup for liquid processing in a superconducting magnet.

    PubMed

    Lu, Hui-Meng; Yin, Da-Chuan; Li, Hai-Sheng; Geng, Li-Qiang; Zhang, Chen-Yan; Lu, Qin-Qin; Guo, Yun-Zhu; Guo, Wei-Hong; Shang, Peng; Wakayama, Nobuko I

    2008-09-01

    Containerless processing of materials is considered beneficial for obtaining high-quality products due to the elimination of the detrimental effects coming from contact with container walls. Many containerless processing methods are realized by levitation techniques. This paper describes a containerless levitation setup that utilizes the magnetization force generated in a gradient magnetic field. It comprises a levitation unit, a temperature control unit, and a real-time observation unit. A known volume of a liquid diamagnetic sample can be levitated in the levitation chamber, the temperature of which is controlled using the temperature control unit. The evolution of the levitated sample is observed in real time using the observation unit. With this setup, containerless processing of liquids, such as crystal growth from solution, can be realized in a well-controlled manner. Since the levitation is achieved using a superconducting magnet, experiments requiring long durations, such as protein crystallization and simulation of the space environment for living systems, can readily be accommodated.

  9. Equatorial disc and dawn-dusk currents in the frontside magnetosphere of Jupiter - Pioneer 10 and 11

    NASA Technical Reports Server (NTRS)

    Jones, D. E.; Thomas, B. T.; Melville, J. G., II

    1981-01-01

    Observations by Pioneer 10 and 11 show that the strongest azimuthal fields are observed near the dawn meridian (Pioneer 10) while the weakest occur near the noon meridian (Pioneer 11), suggesting a strong local time dependence for the corresponding radial current system. Modeling studies of the radial component of the field observed by both spacecraft suggest that the corresponding azimuthal current system must also be a strong function of local time. Both the azimuthal and the radial field component signatures exhibit sharp dips and reversals, requiring thin radial and azimuthal current systems. There is also a suggestion that these two current systems either are interacting or are due, at least in part, to the same current. It is suggested that a plausible current model consists of the superposition of a thin, local-time-independent azimuthal current system plus the equatorial portion of a tail-like current system that extends into the dayside magnetosphere.

  10. Constraints on Inner Core Anisotropy Using Array Observations of P'P'

    NASA Astrophysics Data System (ADS)

    Frost, Daniel A.; Romanowicz, Barbara

    2017-11-01

    Recent studies of PKPdf travel times suggest strong anisotropy (4% or more) in the quasi-western inner core hemisphere. However, the availability of paths sampling at low angles to the Earth's rotation axis (the fast axis) is limited. To augment this sampling, we collected a travel time data set for the phase P'P'df (PKPPKPdf), for which at least one inner core leg is quasi-polar, at two high latitude seismic arrays. We find that the inferred anisotropy is weak (on the order of 0.5 to 1.5%), confirming previous results based on a much smaller P'P' data set. While previous models of inner core anisotropy required very strong alignment of anisotropic iron grains, our results are more easily explained by current dynamic models of inner core growth. We observe large travel time anomalies when one leg of P'P'df is along the South Sandwich to Alaska path, consistent with PKPdf observations, and warranting further investigation.

  11. LINNAEUS: BOOSTING NEAR EARTH ASTEROID CHARACTERIZATION RATES

    NASA Astrophysics Data System (ADS)

    Elvis, Martin; Beeson, C.; Galache, J.; DeMeo, F.; Evans, I.; Evans, J.; Konidaris, N.; Najita, J.; Allen, L.; Christensen, E.; Spahr, T.

    2013-10-01

    Near Earth objects (NEOs) are being discovered at a rate of about 1000 per year, and this rate is set to double by 2015. However, the physical characterization rate of NEOs is only ~100 per year for each type of follow-up observation. We have proposed the LINNAEUS program to NASA to raise the characterization rate of NEOs to the rate of their discovery. This rate matching is necessary as any given NEO is only available for a relatively short time (days to weeks), and NEOs are usually fainter on subsequent apparitions. Hence follow-up observations must be initiated rapidly, without time to cherry-pick the optimum objects. LINNAEUS concentrates on NEO composition. Optical spectra, preferably extending into the near-infrared, provide compositions that can distinguish major compositional classes of NEOs with reasonable confidence (Bus and Binzel 2002, DeMeo et al. 2009). Armed with a taxonomic type, the albedo, pV, of an NEO is better constrained, leading to more accurate sizes and masses. Time-resolved spectroscopy can give indications of period, axial ratio and surface homogeneity. A reasonable program of spectroscopy could keep pace with the NEO discovery rate. A ground-based telescope can observe faint NEOs about 210 nights a year, after accounting for time lost to weather, bright time, and equipment downtime (e.g. Gemini), for a total of ~2000 hours/year. At 1 hour per NEO spectrum, a well-run, dedicated telescope could obtain almost 2000 spectra per year, about the rate required. If near-IR spectra are required, then a 4 m or larger telescope is necessary to reach 20. However, if the Bus-Binzel taxonomy suffices, then only optical spectra are needed and a 2-meter-class telescope is sufficient. LINNAEUS would use 50% of the KPNO 2.1 m telescope with an IFU spectrometer, the SED-machine (Ben-Ami et al. 2013), to obtain time-resolved optical spectra of 1200-2000 NEOs/year, or 4200-7000 in 3.5 years of observing in an NEOO program. Robust pipeline analysis will release taxonomic types via the Minor Planet Center within 24 hours, and a full archive of spectra and products will be provided.

  12. Getting here: five steps forwards and four back

    NASA Astrophysics Data System (ADS)

    Griffin, R. Elizabeth

    The concept of libraries of stellar spectra is by no means new, though access to on-line ones is a relatively recent achievement. The road to the present state has been rocky, and we are still far short of what is needed and what can easily be attained. Spectra as by-products of individual research projects are inhomogeneous, biassed, and can be dangerously inadequate for modelling complex stellar systems. Archival products are eclectic, but unique in the time domain. Getting telescope time for the required level of homogeneity, inclusivity and completeness for new libraries requires strong scientific arguments that must be competitive. Using synthetic spectra builds misconceptions into the modelling. Attempts to set up the initial requirements (archives of observed spectra) encountered dogged resistance, much of which has never been resolved. Those struggles, and the indelible effects they have upon our science, will be reviewed, and the basics of a promotional programme outlined.

  13. Extracellular space preservation aids the connectomic analysis of neural circuits.

    PubMed

    Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L

    2015-12-09

    Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits.

  14. User data dissemination concepts for earth resources

    NASA Technical Reports Server (NTRS)

    Davies, R.; Scott, M.; Mitchell, C.; Torbett, A.

    1976-01-01

    Domestic data dissemination networks for earth-resources data in the 1985-1995 time frame were evaluated. The following topics were addressed: (1) earth-resources data sources and expected data volumes, (2) future user demand in terms of data volume and timeliness, (3) space-to-space and earth point-to-point transmission link requirements and implementation, (4) preprocessing requirements and implementation, (5) network costs, and (6) technological development to support this implementation. This study was parametric in that the data input (supply) was varied by a factor of about fifteen while the user request (demand) was varied by a factor of about nineteen. Correspondingly, the time from observation to delivery to the user was varied. This parametric evaluation was performed by a computer simulation that was based on network alternatives and resulted in preliminary transmission and preprocessing requirements. The earth-resource data sources considered were: shuttle sorties, synchronous satellites (e.g., SEOS), aircraft, and satellites in polar orbits.

  15. SpecDB: The AAVSO’s Public Repository for Spectra of Variable Stars

    NASA Astrophysics Data System (ADS)

    Kafka, Stella; Weaver, John; Silvis, George; Beck, Sara

    2018-01-01

    SpecDB is the American Association of Variable Star Observers (AAVSO) spectral database. Accessible to any astronomer with the capability to perform spectroscopy, SpecDB provides an unprecedented scientific opportunity for amateur and professional astronomers around the globe. Backed by the Variable Star Index, one of the most utilized variable star catalogs, SpecDB is expected to become one of the world's leading databases of its kind. Once verified by a team of expert spectroscopists, an observer can upload spectra of variable star targets easily and efficiently. Uploaded spectra can then be searched for, previewed, and downloaded for inclusion in publications. Close community development and involvement will ensure a user-friendly and versatile database, compatible with the needs of 21st century astrophysics. Observations of 1D spectra are submitted as FITS files. All spectra are required to be preprocessed with wavelength calibration and dark subtraction; bias and flat-field corrections are strongly recommended. First-time observers are required to submit a spectrum of a standard (non-variable) star to be checked for errors in technique or equipment. Regardless of user validation, FITS headers must include several value cards detailing the observation, as well as information regarding the observer, equipment, and observing site in accordance with existing AAVSO records. This enforces consistency and provides necessary details for follow-up analysis. Requirements are provided to users in a comprehensive guidebook and accompanying technical manual. Upon submission, FITS headers are automatically checked for errors and any anomalies are immediately fed back to the user. Successful candidates can then submit at will, including multiple simultaneous submissions. All published observations can be searched and interactively previewed. Community involvement will be enhanced by an associated forum where users can discuss observation techniques and suggest improvements to the database.
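
    A sketch of the kind of automatic header check described above, using astropy; the required keyword list here is hypothetical, since the actual SpecDB card requirements are defined in the AAVSO guidebook and technical manual.

        from astropy.io import fits

        # Hypothetical set of required cards; the real list is defined by the AAVSO.
        REQUIRED_KEYS = ["OBJECT", "DATE-OBS", "OBSERVER", "INSTRUME", "SITELAT", "SITELONG"]

        def check_spectrum_header(path):
            """Return a list of problems found in the primary header of a 1D spectrum."""
            problems = []
            with fits.open(path) as hdul:
                header = hdul[0].header
                for key in REQUIRED_KEYS:
                    if key not in header or header[key] in ("", None):
                        problems.append(f"missing or empty required card: {key}")
                # A 1D spectrum should carry a wavelength solution; here a linear
                # WCS (CRVAL1/CDELT1) is assumed for illustration.
                if "CRVAL1" not in header or "CDELT1" not in header:
                    problems.append("no linear wavelength calibration (CRVAL1/CDELT1)")
            return problems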

  16. Agile Science Operations: A New Approach for Primitive Exploration Bodies

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Thompson, David R.; Castillo-Rogez, Julie C.; Doyle, Richard; Estlin, Tara; Mclaren, David

    2012-01-01

    Primitive body exploration missions such as potential Comet Surface Sample Return or Trojan Tour and Rendezvous would challenge traditional operations practices. Earth-based observations would provide only basic understanding before arrival and many science goals would be defined during the initial rendezvous. It could be necessary to revise trajectories and observation plans to quickly characterize the target for safe, effective observations. Detection of outgassing activity and monitoring of comet surface activity are even more time-constrained, with events occurring faster than round-trip light time. "Agile science operations" address these challenges with contingency plans that recognize the intrinsic uncertainty in the operating environment and science objectives. Planning for multiple alternatives can significantly reduce the time required to repair and validate spacecraft command sequences. When appropriate, time-critical decisions can be automated and shifted to the spacecraft for immediate access to instrument data. Mirrored planning systems on both sides of the light-time gap permit transfer of authority back and forth as needed. We survey relevant science objectives, identifying time bottlenecks and the techniques that could be used to speed missions' reaction to new science data. Finally, we discuss the results of a trade study simulating agile observations during flyby and comet rendezvous scenarios. These experiments quantify instrument coverage of key surface features as a function of planning turnaround time. Careful application of agile operations techniques can play a significant role in realizing the Decadal Survey plan for primitive body exploration.

  17. An Extended Multi-Zone Model for the MCG-6-30-15 Warm Absorber

    NASA Technical Reports Server (NTRS)

    Morales, R.; Fabian, A. C.; Reynolds, C. S.

    2000-01-01

    The variable warm absorber seen with ASCA in the X-ray spectrum of MCG 6-30-15 shows complex time behaviour in which the optical depth of O VIII anticorrelates with the flux whereas that of O VII is unchanging. The explanation in terms of a two zone absorber has since been challenged by BeppoSAX observations. These present a more complicated behaviour for the O VII edge. The explanation we offer for both ASCA and BeppoSAX observations requires a very simple photoionization model together with the presence of a third, intermediate, zone and a period of very low luminosity. In practice warm absorbers are likely to be extended, multi-zone regions of which only part causes directly observable absorption edges at any given time depending on the value of the luminosity.

  18. A digital video tracking system

    NASA Astrophysics Data System (ADS)

    Giles, M. K.

    1980-01-01

    The Real-Time Videotheodolite (RTV) was developed in connection with the requirement to replace film as a recording medium to obtain the real-time location of an object in the field-of-view (FOV) of a long focal length theodolite. Design philosophy called for a system capable of discriminatory judgment in identifying the object to be tracked with 60 independent observations per second, capable of locating the center of mass of the object projection on the image plane within about 2% of the FOV in rapidly changing background/foreground situations, and able to generate a predicted observation angle for the next observation. A description is given of a number of subsystems of the RTV, taking into account the processor configuration, the video processor, the projection processor, the tracker processor, the control processor, and the optics interface and imaging subsystem.
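
    A toy sketch of the per-frame tracking step implied above (threshold, centroid, predict), written in Python rather than the RTV's original hardware processors; the thresholding, coasting behaviour and constant-velocity prediction are illustrative assumptions.

        import numpy as np

        def track_frame(frame, threshold, prev_center, prev_velocity, dt=1.0 / 60.0):
            """One 60 Hz tracking step: intensity-weighted centroid of the thresholded
            object plus a simple linear prediction of where to point next.
            `prev_center` and `prev_velocity` are 2-element numpy arrays (x, y)."""
            mask = frame > threshold              # crude object/background separation
            ys, xs = np.nonzero(mask)
            if xs.size == 0:                      # object lost: coast on the prediction
                center = prev_center + prev_velocity * dt
            else:
                weights = frame[ys, xs].astype(float)
                center = np.array([np.average(xs, weights=weights),
                                   np.average(ys, weights=weights)])
            velocity = (center - prev_center) / dt
            predicted_next = center + velocity * dt
            return center, velocity, predicted_next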

  19. A reevaluation of the proposed spin-down of the white dwarf pulsar in AR Scorpii.

    NASA Astrophysics Data System (ADS)

    Potter, Stephen B.; Buckley, David A. H.

    2018-05-01

    We present high-speed optical photometric observations, spanning ~2 years, of the recently discovered white dwarf pulsar AR Scorpii. The amplitudes of the orbital, spin and beat modulations appear to be remarkably stable and repeatable over the time span of our observations. It has been suggested that the polarized and non-polarized emission from AR Scorpii is powered by the spin-down of the white dwarf. However, we find that our new data are inconsistent with the published spin-down ephemeris. Whilst our data are consistent with a constant spin period, further observations over an extended time base are required in order to ascertain the true spin evolution of the white dwarf. This may have implications for the various models put forward to explain the energetics and evolution of AR Scorpii.

  20. Structure of xanthan gum and cell ultrastructure at different times of alkali stress

    PubMed Central

    de Mello Luvielmo, Márcia; Borges, Caroline Dellinghausen; de Oliveira Toyama, Daniela; Vendruscolo, Claire Tondo; Scamparini, Adilma Regina Pippa

    2016-01-01

    The effect of alkali stress on the yield, viscosity, gum structure, and cell ultrastructure of xanthan gum was evaluated at the end of the fermentation process of xanthan production by Xanthomonas campestris pv. manihotis 280-95. Although greater xanthan production was observed after a 24-h alkali stress process, a lower viscosity was observed when compared to the alkali stress-free gum, regardless of the alkali stress time. However, this outcome is not conclusive, as further studies on gum purification are required to remove excess sodium, verify the efficiency loss, and assess the consequent increase in the polymer viscosity. Alkali stress altered the structure of xanthan gum from a polygon-like shape to a star-like form. At the end of the fermentation, early structural changes in the bacterium were observed. After alkali stress, marked structural differences were observed in the cells. A more vacuolated cytoplasm and discontinuities in the cell membranes evidenced cell lysis. Xanthan was observed in the form of concentric circles instead of agglomerates as observed prior to the alkali stress. PMID:26887232

  1. Visualizing and understanding vortex and tendex lines of colliding black holes

    NASA Astrophysics Data System (ADS)

    Khan, Haroon; Lovelace, Geoffery; Rodriguez, Samuel

    2017-01-01

    Gravitational waves (GWs) are ripples of spacetime. In order to detect and physically study the GWs emitted by merging black holes with ground-based detectors such as aLIGO, we must accurately predict how the waves look and behave. This requires numerical simulations of black hole (BH) mergers on supercomputers, because all analytical approximations fail near the time of merger. These simulations also reveal how BHs warp space and time. My project focuses on using these simulations to visualize the strongly curved spacetime in simulations of merging BHs. I have visualized the vortex and tendex lines for a binary BH system, using the Spectral Einstein Code. Vortex lines describe how an observer would be twisted by the curvature, and the tendex lines describe how an observer would be stretched and squeezed by it. These lines are analogous to how electric and magnetic field lines describe the electromagnetic forces on an observer. Visualizing these will provide a more intuitive understanding of the nonlinear dynamics of the spacetime of merging BHs. I am exploring how these lines change with time during a simulation, to see whether they vary smoothly in time and how they depend on where they are seeded.

  2. Western scrub-jays allocate longer observation time to more valuable information.

    PubMed

    Watanabe, Arii; Grodzinski, Uri; Clayton, Nicola S

    2014-07-01

    When humans mentally reconstruct past events and imagine future scenarios, their subjective experience of mentally time travelling is accompanied by the awareness of doing so. Despite recent popularity of studying episodic memory in animals, such phenomenological consciousness has been extremely difficult to demonstrate without agreed behavioural markers of consciousness in non-linguistic subjects. We presented western scrub-jays (Aphelocoma californica) with a task requiring them to allocate observing time between two peepholes to see food being hidden in either of two compartments, one where observing the hiding location was necessary to later relocate the food, and another where food could easily be found without watching. Jays first separately experienced these consequences of possessing information in each compartment and subsequently, once given a choice, made more looks and spent more time looking into the compartment where information was necessary than into the compartment where it was unnecessary. Thus, the jays can collect information to solve a future problem. Moreover, they can differentiate sources of information according to their potential value and modify behaviour to efficiently collect important, usable information. This is the first evidence of metacognition in a species that passes the behavioural criteria for both retrospective and prospective mental time travel.

  3. A Pilot Study of Peritoneal Perfusion with a Novel Hemoglobin Based Oxygen Carrier in Swine (Sus scrofa)

    DTIC Science & Technology

    2016-10-12

    has a high rate of complications, obliges systemic anticoagulation, and requires a significant level of logistics support as well as expertise. In...the endotracheal tube was clamped, ceasing gas exchange in the lung. Arterial blood gases and time to death were then recorded. No differences were...observed between treatment and control animals in terms of CO2, O2 and time to death. Peritoneal gas exchange did not improve oxygenation, ventilation or time to death in this severe model of lung injury.

  4. System-level view of geospace dynamics: Challenges for high-latitude ground-based observations

    NASA Astrophysics Data System (ADS)

    Donovan, E.

    2014-12-01

    Increasingly, research programs including GEM, CEDAR, GEMSIS, GO Canada, and others are focusing on how geospace works as a system. Coupling sits at the heart of system-level dynamics. In all cases, coupling is accomplished via fundamental processes such as reconnection and plasma waves, and can be between regions, energy ranges, species, scales, and energy reservoirs. Three views of geospace are required to attack system-level questions. First, we must observe the fundamental processes that accomplish the coupling. This "observatory view" requires in situ measurements by satellite-borne instruments or remote sensing from powerful well-instrumented ground-based observatories organized around, for example, Incoherent Scatter Radars. Second, we need to see how this coupling is controlled and what it accomplishes. This demands quantitative observations of the system elements that are being coupled. This "multi-scale view" is accomplished by networks of ground-based instruments, and by global imaging from space. Third, if we take geospace as a whole, the system is too complicated, so at the top level we need time series of simple quantities such as indices that capture important aspects of the system-level dynamics. This requires a "key parameter view" that is typically provided through indices such as AE and Dst. With the launch of MMS, and ongoing missions such as THEMIS, Cluster, Swarm, RBSP, and ePOP, we are entering a once-in-a-lifetime epoch with a remarkable fleet of satellites probing processes at key regions throughout geospace, so the observatory view is secure. With a few exceptions, our key parameter view provides what we need. The multi-scale view, however, is compromised by space/time scales that are important but under-sampled, combined extent of coverage and resolution that falls short of what we need, and inadequate conjugate observations. In this talk, I present an overview of what we need for taking system-level research to its next level, and how high-latitude ground-based observations can address these challenges.

  5. DARE Mission Design: Low RFI Observations from a Low-Altitude Frozen Lunar Orbit

    NASA Technical Reports Server (NTRS)

    Plice, Laura; Galal, Ken; Burns, Jack O.

    2017-01-01

    The Dark Ages Radio Explorer (DARE) seeks to study the cosmic Dark Ages approximately 80 to 420 million years after the Big Bang. Observations require truly quiet radio conditions, shielded from Sun and Earth electromagnetic (EM) emissions, on the far side of the Moon. DARE's science orbit is a frozen orbit with respect to lunar gravitational perturbations. The altitude and orientation of the orbit remain nearly fixed indefinitely, maximizing science time without the need for maintenance. DARE's observation targets avoid the galactic center and enable investigation of the universe's first stars and galaxies.

  6. Determining neutrino mass from the cosmic microwave background alone.

    PubMed

    Kaplinghat, Manoj; Knox, Lloyd; Song, Yong-Seon

    2003-12-12

    Distortions of cosmic microwave background temperature and polarization maps caused by gravitational lensing, observable with high angular resolution and high sensitivity, can be used to measure the neutrino mass. Assuming two massless species and one with mass m_ν, we forecast σ(m_ν) = 0.15 eV from the Planck satellite and σ(m_ν) = 0.04 eV from observations with twice the angular resolution and approximately 20 times the sensitivity. A detection is likely at this higher sensitivity since the observation of atmospheric neutrino oscillations requires Δm²_ν ≳ (0.04 eV)².

  7. From blackbirds to black holes: Investigating capture-recapture methods for time domain astronomy

    NASA Astrophysics Data System (ADS)

    Laycock, Silas G. T.

    2017-07-01

    In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biassed view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence, must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (neutron star or black hole accreting from a stellar companion) under a range of observing strategies. We first generate realistic light-curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods, Lincoln-Peterson, Schnabel estimators, related techniques, and newer methods implemented in the Rcapture package are compared. A general exponential model based on the radioactive decay law is introduced which is demonstrated to recover (at 95% confidence) the underlying population abundance and duty cycle, in a fraction of the observing visits (10-50%) required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostats community can be readily called from within python.
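
    As a concrete illustration of the simplest estimator named above, the Chapman-corrected Lincoln-Petersen estimate from two monitoring epochs can be written in a few lines; the numbers in the example are invented.

        def lincoln_petersen(n1, n2, m2):
            """Chapman's bias-corrected Lincoln-Petersen estimate of population size.

            n1 : sources detected in the first epoch ("marked")
            n2 : sources detected in the second epoch
            m2 : sources detected in both epochs ("recaptures")
            """
            return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

        # Toy example: 40 transients seen in the first monitoring block, 35 in the
        # second, 20 seen in both -> roughly 70 sources in the underlying population.
        print(lincoln_petersen(40, 35, 20))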

  8. Optimizing the night time with dome vents and SNR-QSO at CFHT

    NASA Astrophysics Data System (ADS)

    Devost, Daniel; Mahoney, Billy; Moutou, Claire; CFHT QSO Team, CFHT software Group

    2017-06-01

    Night time is a precious and costly commodity, and it is important to get everything we can out of every second of every night of observing. In 2012 the Canada-France-Hawaii Telescope started operating 12 new vent doors installed on the dome over the course of the previous two years. The project was highly successful and seeing measurements show that venting the dome greatly enhances image quality at the focal plane. In order to capitalize on the gains brought by the new vents, the observatory started exploring a new mode of observation called SNR-QSO. This mode consists of a new implementation inside our Queued Service Observation (QSO) system. Exposure times are adjusted for each frame depending on the weather conditions in order to reach a specific depth, i.e., a signal-to-noise ratio (SNR) at a certain magnitude. The goal of this new mode is to capitalize on the exquisite seeing provided by Maunakea, complemented by the minimized dome turbulence, to use the least amount of time to reach the depth required by the science programs. Specific implementations were successfully tested on two different instruments, our wide-field camera MegaCam and our high-resolution spectrograph ESPaDOnS. I will present the methods used for each instrument to achieve SNR observing and the gains produced by these new observing modes in order to reach the scientific goals of accepted programs in a shorter amount of time.
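
    A minimal sketch of the exposure-time adjustment idea, under the simplifying assumption that the signal-to-noise ratio grows as the square root of the integration time; the actual SNR-QSO implementation also folds in measured seeing, sky brightness and extinction, and differs between instruments.

        def rescale_exposure(t_ref, snr_measured, snr_target):
            """Naive photon-noise scaling: SNR grows roughly as sqrt(exposure time),
            so the exposure needed to reach the target SNR scales as the square of
            the SNR ratio."""
            return t_ref * (snr_target / snr_measured) ** 2

        # A 300 s test frame reaching SNR 40 when SNR 100 is required at the
        # reference magnitude -> roughly 1875 s of total integration.
        print(rescale_exposure(300.0, 40.0, 100.0))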

  9. Satellite scheduling considering maximum observation coverage time and minimum orbital transfer fuel cost

    NASA Astrophysics Data System (ADS)

    Zhu, Kai-Jian; Li, Jun-Feng; Baoyin, He-Xi

    2010-01-01

    In an emergency like the Wenchuan earthquake, it is impossible to observe a given target on Earth by immediately launching new satellites. There is an urgent need for efficient satellite scheduling within a limited time period, so we must find a way to reasonably utilize the existing satellites to rapidly image the affected area during a short time period. Generally, the main consideration in orbit design is satellite coverage, with the subsatellite nadir point as a standard of reference. Two factors must be taken into consideration simultaneously in orbit design, i.e., the maximum observation coverage time and the minimum orbital transfer fuel cost. The local time of visiting the given observation sites must satisfy the solar radiation requirement. Because this paper considers an operational orbit, reached with impulsive maneuvers, for observing the disaster area, the operational orbit elements are taken as the parameters to be optimized, and the objective function is minimized by comparing transfers derived from primer vector theory with Hohmann transfers. Primer vector theory is utilized to optimize three-impulse transfer trajectories, and the Hohmann transfer is utilized for coplanar cases and non-coplanar cases with small inclination differences. Finally, we applied this method in a simulation of the rescue mission at Wenchuan city. The results of optimizing the orbit design with a hybrid PSO and DE algorithm show that primer vector and Hohmann transfer theory proved to be effective methods for multi-objective orbit optimization.
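
    For the coplanar case, the Hohmann transfer cost used as a comparison baseline can be computed directly from the vis-viva equation; the following sketch (with an invented LEO example) is standard textbook material rather than the authors' optimization code.

        import math

        MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

        def hohmann_dv(r1, r2, mu=MU_EARTH):
            """Total delta-v (km/s) of a coplanar two-impulse Hohmann transfer
            between circular orbits of radius r1 and r2 (km)."""
            a_t = 0.5 * (r1 + r2)                            # transfer-ellipse semi-major axis
            v1 = math.sqrt(mu / r1)                          # circular speed at r1
            v2 = math.sqrt(mu / r2)                          # circular speed at r2
            v_dep = math.sqrt(mu * (2.0 / r1 - 1.0 / a_t))   # transfer speed at departure
            v_arr = math.sqrt(mu * (2.0 / r2 - 1.0 / a_t))   # transfer speed at arrival
            return abs(v_dep - v1) + abs(v2 - v_arr)

        # Example: raising a 500 km LEO to a 900 km orbit (radii in km).
        print(hohmann_dv(6378.137 + 500.0, 6378.137 + 900.0))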

  10. Self-calibration of Cosmic Microwave Background Polarization Experiments

    NASA Astrophysics Data System (ADS)

    Keating, Brian G.; Shimon, Meir; Yadav, Amit P. S.

    2013-01-01

    Precision measurements of the polarization of the cosmic microwave background (CMB) radiation, especially experiments seeking to detect the odd-parity "B-modes," have far-reaching implications for cosmology. To detect the B-modes generated during inflation, the flux response and polarization angle of these experiments must be calibrated to exquisite precision. While suitable flux calibration sources abound, polarization angle calibrators are deficient in many respects. Man-made polarized sources are often not located in the antenna's far-field, have spectral properties that are radically different from the CMB's, are cumbersome to implement, and may be inherently unstable over the (long) duration these searches require to detect the faint signature of the inflationary epoch. Astrophysical sources suffer from time, frequency, and spatial variability, are not visible from all CMB observatories, and none are understood with sufficient accuracy to calibrate future CMB polarimeters seeking to probe inflationary energy scales of 10¹⁵ GeV. Both man-made and astrophysical sources require dedicated observations which detract from the amount of integration time usable for detection of the inflationary B-modes. CMB TB and EB modes, expected to identically vanish in the standard cosmological model, can be used to calibrate CMB polarimeters. By enforcing the observed EB and TB power spectra to be consistent with zero, CMB polarimeters can be calibrated to levels not possible with man-made or astrophysical sources. All of this can be accomplished for any polarimeter without any loss of observing time using a calibration source which is spectrally identical to the CMB B-modes.
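
    A rough sketch of the self-calibration idea: if the instrument's polarization angle is misreported by ψ, the observed EB cross-spectrum acquires a term proportional to sin(4ψ)(C_l^EE − C_l^BB) (sign and convention dependent), so enforcing EB = 0 yields an estimate of ψ. The per-multipole estimator and unweighted averaging below are illustrative simplifications, assuming the intrinsic EB spectrum vanishes.

        import numpy as np

        def polarization_angle_from_eb(cl_ee, cl_bb, cl_eb_obs):
            """Per-multipole estimate of a global polarization-angle miscalibration psi,
            assuming the true EB spectrum vanishes, so that (up to sign conventions)
            C_l^{EB,obs} = 0.5 * (C_l^{EE} - C_l^{BB}) * sin(4*psi).
            Inputs are numpy arrays over the multipoles used; EE and BB must differ."""
            ratio = 2.0 * cl_eb_obs / (cl_ee - cl_bb)
            psi_l = 0.25 * np.arcsin(np.clip(ratio, -1.0, 1.0))
            # Combine multipoles with a simple unweighted average; a real analysis
            # would use inverse-variance weights from the measured spectra.
            return np.mean(psi_l)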

  11. NSF Lower Atmospheric Observing Facilities (LAOF) in support of science and education

    NASA Astrophysics Data System (ADS)

    Baeuerle, B.; Rockwell, A.

    2012-12-01

    Researchers, students and teachers who want to understand and describe the Earth System require high-quality observations of the atmosphere, ocean, and biosphere. Making these observations requires state-of-the-art instruments and systems, often carried on highly capable research platforms. To support this need of the geosciences community, the National Science Foundation's (NSF) Division of Atmospheric and Geospace Sciences (AGS) provides multi-user national facilities through its Lower Atmospheric Observing Facilities (LAOF) Program at no cost to the investigator. These facilities, which include research aircraft, radars, lidars, and surface and sounding systems, receive NSF financial support and are eligible for deployment funding. The facilities are managed and operated by five LAOF partner organizations: the National Center for Atmospheric Research (NCAR); Colorado State University (CSU); the University of Wyoming (UWY); the Center for Severe Weather Research (CSWR); and the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS). These observational facilities are available on a competitive basis to all qualified researchers from US universities who require the platforms and associated services to carry out their research objectives. The deployment of all facilities is driven by scientific merit, capabilities of a specific facility to carry out the proposed observations, and scheduling for the requested time. The process for considering requests and setting priorities is determined on the basis of the complexity of a field campaign. The poster will describe available observing facilities and associated services, and explain the request process researchers have to follow to secure access to these platforms for scientific as well as educational deployments. (Figure: the NSF/NCAR GV aircraft.)

  12. On recovering distributed IP information from inductive source time domain electromagnetic data

    NASA Astrophysics Data System (ADS)

    Kang, Seogi; Oldenburg, Douglas W.

    2016-10-01

    We develop a procedure to invert time domain induced polarization (IP) data for inductive sources. Our approach is based upon the inversion methodology in conventional electrical IP (EIP), which uses a sensitivity function that is independent of time. However, significant modifications are required for inductive source IP (ISIP) because electric fields in the ground do not achieve a steady state. The time-history for these fields needs to be evaluated and then used to define approximate IP currents. The resultant data, either a magnetic field or its derivative, are evaluated through the Biot-Savart law. This forms the desired linear relationship between data and pseudo-chargeability. Our inversion procedure has three steps: (1) Obtain a 3-D background conductivity model. We advocate, where possible, that this be obtained by inverting early-time data that do not suffer significantly from IP effects. (2) Decouple IP responses embedded in the observations by forward modelling the TEM data due to a background conductivity and subtracting these from the observations. (3) Use the linearized sensitivity function to invert data at each time channel and recover pseudo-chargeability. Post-interpretation of the recovered pseudo-chargeabilities at multiple times allows recovery of intrinsic Cole-Cole parameters such as time constant and chargeability. The procedure is applicable to all inductive source survey geometries but we focus upon airborne time domain EM (ATEM) data with a coincident-loop configuration because of the distinctive negative IP signal that is observed over a chargeable body. Several assumptions are adopted to generate our linearized modelling but we systematically test the capability and accuracy of the linearization for ISIP responses arising from different conductivity structures. On test examples we show: (1) our decoupling procedure enhances the ability to extract information about existence and location of chargeable targets directly from the data maps; (2) the horizontal location of a target body can be well recovered through inversion; (3) the overall geometry of a target body might be recovered but for ATEM data a depth weighting is required in the inversion; (4) we can recover estimates of intrinsic τ and η that may be useful for distinguishing between two chargeable targets.

  13. Dynamical Constants and Time Universals: A First Step toward a Metrical Definition of Ordered and Abnormal Cognition.

    PubMed

    Elliott, Mark A; du Bois, Naomi

    2017-01-01

    From the point of view of the cognitive dynamicist the organization of brain circuitry into assemblies defined by their synchrony at particular (and precise) oscillation frequencies is important for the correct correlation of all independent cortical responses to the different aspects of a given complex thought or object. From the point of view of anyone operating complex mechanical systems, i.e., those comprising independent components that are required to interact precisely in time, it follows that the precise timing of such a system is essential - not only essential but measurable, and scalable. It must also be reliable over observations to bring about consistent behavior, whatever that behavior is. The catastrophic consequence of an absence of such precision, for instance that required to govern the interference engine in many automobiles, is indicative of how important timing is for the function of dynamical systems at all levels of operation. The dynamics and temporal considerations combined indicate that it is necessary to consider the operating characteristic of any dynamical, cognitive brain system in terms, superficially at least, of oscillation frequencies. These may, themselves, be forensic of an underlying time-related taxonomy. Currently there are only two sets of relevant and necessarily systematic observations in this field: one of these reports the precise dynamical structure of the perceptual systems engaged in dynamical binding across form and time; the second, derived both empirically from perceptual performance data, as well as obtained from theoretical models, demonstrates a timing taxonomy related to a fundamental operator referred to as the time quantum. In this contribution both sets of theory and observations are reviewed and compared for their predictive consistency. Conclusions about direct comparability are discussed for both theories of cognitive dynamics and time quantum models. Finally, a brief review of some experimental data measuring sensitivity to visual information presented to the visual blind field (blindsight), as well as from studies of temporal processing in autism and schizophrenia, indicates that an understanding of a precise and metrical dynamic structure may be very important for an operational understanding of perception as well as more general cognitive function in psychopathology.

  14. Can generic paediatric mortality scores calculated 4 hours after admission be used as inclusion criteria for clinical trials?

    PubMed Central

    Leteurtre, Stéphane; Leclerc, Francis; Wirth, Jessica; Noizet, Odile; Magnenant, Eric; Sadik, Ahmed; Fourier, Catherine; Cremer, Robin

    2004-01-01

    Introduction: Two generic paediatric mortality scoring systems have been validated in the paediatric intensive care unit (PICU). Paediatric RISk of Mortality (PRISM) requires an observation period of 24 hours, and PRISM III measures severity at two time points (at 12 hours and 24 hours) after admission, which represents a limitation for clinical trials that require earlier inclusion. The Paediatric Index of Mortality (PIM) is calculated 1 hour after admission but does not take into account the stabilization period following admission. To avoid these limitations, we chose to conduct assessments 4 hours after PICU admission. The aim of the present study was to validate PRISM, PRISM III and PIM at the time points for which they were developed, and to compare their accuracy in predicting mortality at those times with their accuracy at 4 hours. Methods: All children admitted from June 1998 to May 2000 in one tertiary PICU were prospectively included. Data were collected to generate scores and predictions using PRISM, PRISM III and PIM. Results: There were 802 consecutive admissions with 80 deaths. For the time points for which the scores were developed, observed and predicted mortality rates were significantly different for the three scores (P < 0.01) whereas all exhibited good discrimination (area under the receiver operating characteristic curve ≥0.83). At 4 hours after admission only the PIM had good calibration (P = 0.44), but all three scores exhibited good discrimination (area under the receiver operating characteristic curve ≥0.82). Conclusions: Among the three scores calculated at 4 hours after admission, all had good discriminatory capacity but only the PIM score was well calibrated. Further studies are required before the PIM score at 4 hours can be used as an inclusion criterion in clinical trials. PMID:15312217

  15. Contingency Management Requirements Document: Preliminary Version. Revision F

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is the High Altitude, Long Endurance (HALE) Remotely Operated Aircraft (ROA) Contingency Management (CM) Functional Requirements document. This document applies to HALE ROA operating within the National Airspace System (NAS) limited at this time to enroute operations above 43,000 feet (defined as Step 1 of the Access 5 project, sponsored by the National Aeronautics and Space Administration). A contingency is an unforeseen event requiring a response. The unforeseen event may be an emergency, an incident, a deviation, or an observation. Contingency Management (CM) is the process of evaluating the event, deciding on the proper course of action (a plan), and successfully executing the plan.

  16. On the probability of violations of Fourier's law for heat flow in small systems observed for short times

    NASA Astrophysics Data System (ADS)

    Evans, Denis J.; Searles, Debra J.; Williams, Stephen R.

    2010-01-01

    We study the statistical mechanics of thermal conduction in a classical many-body system that is in contact with two thermal reservoirs maintained at different temperatures. The ratio of the probabilities that, when observed for a finite time, the time-averaged heat flux flows in or against the direction required by Fourier's law for heat flow is derived from first principles. This result is obtained using the transient fluctuation theorem. We show that the argument of that theorem, namely, the dissipation function is, close to equilibrium, equal to a microscopic expression for the entropy production. We also prove that if transient time correlation functions of smooth zero-mean variables decay to zero at long times, the system will relax to a unique nonequilibrium steady state, and for this state, the thermal conductivity must be positive. Our expressions are tested using nonequilibrium molecular dynamics simulations of heat flow between thermostated walls.
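
    For reference, the transient fluctuation theorem invoked above is commonly written as follows, with the time-averaged dissipation function over the observation time t denoted by the barred quantity; this is the standard textbook statement, and the precise definition of the dissipation function depends on the system and dynamics considered:

        \frac{P\left(\bar{\Omega}_t = A\right)}{P\left(\bar{\Omega}_t = -A\right)} = e^{A t}

    Close to equilibrium the dissipation function reduces to the entropy production, so sustained heat flow against the direction required by Fourier's law becomes exponentially improbable as the observation time and system size grow.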

  17. A PCR primer bank for quantitative gene expression analysis.

    PubMed

    Wang, Xiaowei; Seed, Brian

    2003-12-15

    Although gene expression profiling by microarray analysis is a useful tool for assessing global levels of transcriptional activity, variability associated with the data sets usually requires that observed differences be validated by some other method, such as real-time quantitative polymerase chain reaction (real-time PCR). However, non-specific amplification of non-target genes is frequently observed in the latter, confounding the analysis in approximately 40% of real-time PCR attempts when primer-specific labels are not used. Here we present an experimentally validated algorithm for the identification of transcript-specific PCR primers on a genomic scale that can be applied to real-time PCR with sequence-independent detection methods. An online database, PrimerBank, has been created for researchers to retrieve primer information for their genes of interest. PrimerBank currently contains 147 404 primers encompassing most known human and mouse genes. The primer design algorithm has been tested by conventional and real-time PCR for a subset of 112 primer pairs with a success rate of 98.2%.

  18. Capabilities of GRO/OSSE for observing solar flares

    NASA Technical Reports Server (NTRS)

    Kurfess, J. D.; Johnson, W. N.; Share, G. H.; Hulburt, E. O.; Matz, S. M.; Murphy, R. J.

    1989-01-01

    The launch of the Gamma Ray Observatory (GRO) near solar maximum makes solar flare studies early in the mission particularly advantageous. The Oriented Scintillation Spectrometer Experiment (OSSE) on GRO, covering the energy range 0.05 to 150 MeV, has some significant advantages over the previous generation of satellite-borne gamma-ray detectors for solar observations. The OSSE detectors will have about 10 times the effective area of the Gamma-Ray Spectrometer (GRS) on the Solar Maximum Mission (SMM) for both photons and high-energy neutrons. The OSSE also has the added capability of distinguishing between high-energy neutrons and photons directly. The OSSE spectral accumulation time (approx. 4 s) is four times faster than that of the SMM/GRS; much better time resolution is available in selected energy ranges. These characteristics will allow the investigation of particle acceleration in flares based on the evolution of the continuum and nuclear line components of flare spectra, nuclear emission in small flares, the anisotropy of continuum emission in small flares, and the relative intensities of different nuclear lines. The OSSE observational program will be devoted primarily to non-solar sources. Therefore, solar observations require planning and special configurations. The instrumental and operational characteristics of OSSE are discussed in the context of undertaking solar observations. Opportunities for guest investigators to participate in solar flare studies with OSSE are also presented.

  19. Proceedings of the Atmospheric Neutral Density Specialist Conference, Held in Colorado Springs, Colorado on March 22-23, 1988

    DTIC Science & Technology

    1988-03-23

    observations more often. Using this updated satellite orbital element set, a more accurate space surveillance product is generated by ensuring the time span...position were more accurate, observations could be required less frequently by the spacetrack network, the satellite orbital element set would not need to...of the orbit, one that includes the best model of atmospheric drag, will give the best, or most accurate, element set for a satellite. By maintaining

  20. Time delay measurement in the frequency domain

    DOE PAGES

    Durbin, Stephen M.; Liu, Shih -Chieh; Dufresne, Eric M.; ...

    2015-08-06

    Pump–probe studies at synchrotrons using X-ray and laser pulses require accurate determination of the time delay between pulses. This becomes especially important when observing ultrafast responses with lifetimes approaching or even less than the X-ray pulse duration (~100 ps). The standard approach of inspecting the time response of a detector sensitive to both types of pulses can have limitations due to dissimilar pulse profiles and other experimental factors. Here, a simple alternative is presented, where the frequency response of the detector is monitored versus time delay. Measurements readily demonstrate a time resolution of ~1 ps. Improved precision is possible by simply extending the data acquisition time.
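
    The frequency-domain idea can be illustrated with a toy numerical sketch (not the authors' analysis code): for a delayed copy of a signal, the phase of the cross-spectrum is linear in frequency with slope −2πτ, so the delay follows from a straight-line fit to the unwrapped phase. The sampling rate, test signal and fitting band below are invented.

        import numpy as np

        def delay_from_phase_slope(x, y, fs):
            """Estimate the delay of y relative to x (in seconds) from the slope
            of the cross-spectrum phase, phi(f) = -2*pi*f*tau."""
            X = np.fft.rfft(x)
            Y = np.fft.rfft(y)
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            phase = np.unwrap(np.angle(Y * np.conj(X)))
            slope = np.polyfit(freqs[1:], phase[1:], 1)[0]   # skip the DC bin
            return -slope / (2.0 * np.pi)

        # Toy check: broadband noise circularly shifted by 3 samples at 1 kHz
        # sampling -> the recovered delay is ~0.003 s.
        rng = np.random.default_rng(0)
        fs = 1000.0
        x = rng.standard_normal(1024)
        y = np.roll(x, 3)
        print(delay_from_phase_slope(x, y, fs))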

  1. Primary care nursing role and care coordination: an observational study of nursing work in a community health center.

    PubMed

    Anderson, Daren R; St Hilaire, Daniel; Flinter, Margaret

    2012-05-31

    Care coordination is a core element of the Patient-Centered Medical Home and requires an effective, well-educated nursing staff. A greater understanding of roles and tasks currently being carried out by nurses in primary care is needed to help practices determine how best to implement care coordination and transform into PCMHs. We conducted an observational study of primary care nursing in a Community Health Center by creating a classification schema for nursing responsibilities, directly observing and tracking nurses' work, and categorizing their activities. Ten nurses in eight different practice sites were observed for a total of 61 hours. The vast majority of nursing time was spent in vaccine and medication administration; telephone work; and charting and paperwork, while only 15% of their time was spent in activity that was classified broadly as care coordination. Care coordination work appeared to be subsumed by other daily tasks, many of which could have been accomplished by other, lesser-trained members of the health care team. Practices looking to implement care coordination need a detailed look at work flow, task assignments, and a critical assessment of staffing, adhering to the principle of each team member working to the highest level of his or her education and license. Care coordination represents a distinct responsibility that requires dedicated nursing time, separate from the day-to-day tasks in a busy practice. To fully support these new functions, reimbursement models are needed that support such non-visit-based work and provide incentives to coordinate and manage complex cases, achieve improved clinical outcomes and enhance efficiency of the health system. This article describes our study methods, data collection and analysis, results, and a discussion of reorganizing nursing roles to promote care coordination.

  2. Field observations using an AOTF polarimetric imaging spectrometer

    NASA Technical Reports Server (NTRS)

    Cheng, Li-Jen; Hamilton, Mike; Mahoney, Colin; Reyes, George

    1993-01-01

    This paper reports preliminary results of recent field observations using a prototype acousto-optic tunable filter (AOTF) polarimetric imaging spectrometer. The data illustrate application potentials for geoscience. The operation principle of this instrument is different from that of current airborne multispectral imaging instruments, such as AVIRIS. The AOTF instrument takes two orthogonally polarized images at a desired wavelength at one time, whereas AVIRIS takes a spectrum over a predetermined wavelength range at one pixel at a time and the image is constructed later. AVIRIS does not have any polarization measuring capability. The AOTF instrument could be a complementary tool to AVIRIS. Polarization measurement is a desired capability for many applications in remote sensing. It is well known that natural light is often polarized due to various scattering phenomena in the atmosphere. Also, scattered light from canopies is reported to have a polarized component. To characterize objects of interest correctly requires a remote sensing imaging spectrometer capable of measuring object signal and background radiation in both intensity and polarization so that the characteristics of the object can be determined. The AOTF instrument has the capability to do so. The AOTF instrument has other unique properties. For example, it can provide spectral images immediately after the observation. The instrument can also allow observations to be tailored in real time to perform the desired experiments and to collect only required data. Consequently, the performance in each mission can be increased with minimal resources. The prototype instrument was completed at the beginning of this year. A number of outdoor field experiments were performed with the objective of evaluating the capability of this new technology for remote sensing applications and determining issues for further improvement.

  3. Diurnal changes in ocean color in coastal waters

    NASA Astrophysics Data System (ADS)

    Arnone, Robert; Vandermeulen, Ryan; Ladner, Sherwin; Ondrusek, Michael; Kovach, Charles; Yang, Haoping; Salisbury, Joseph

    2016-05-01

    Coastal processes can change on hourly time scales in response to tides, winds, and biological activity, which can influence the color of surface waters. These temporal and spatial ocean color changes require satellite validation for applications that use bio-optical products to delineate diurnal processes. The diurnal color change and the capability of satellites to capture it were determined with in situ and satellite observations. Hourly variations in satellite ocean color depend on several properties, including (a) sensor characterization, (b) advection of water masses, and (c) the diurnal response of biological and optical water properties. The in situ diurnal changes in ocean color in a dynamic turbid coastal region in the northern Gulf of Mexico were characterized using above-water spectral radiometry from an AErosol RObotic NETwork (AERONET-WavCIS CSI-06) site that provides up to 8-10 observations per day (in 15-30 minute increments). These in situ diurnal changes were used to validate and quantify natural bio-optical fluctuations in satellite ocean color measurements. Satellite capability to detect changes in ocean color was characterized using overlapping afternoon orbits of the VIIRS-NPP ocean color sensor within 100 minutes. Results show the capability of multiple satellite observations to monitor hourly color changes in dynamic coastal regions that are impacted by tides, re-suspension, and river plume dispersion. Hourly changes in satellite ocean color were validated with in situ observations on multiple occasions during different times of the afternoon. Also, the spatial variability of VIIRS diurnal changes shows the occurrence and displacement of phytoplankton bloom development and decay during the afternoon period. Results suggest that determining the temporal and spatial changes in ocean color and phytoplankton blooms from the morning to the afternoon will require additional satellite coverage periods in the coastal zone.

  4. Remote Observing and Automatic FTP on Kitt Peak

    NASA Astrophysics Data System (ADS)

    Seaman, Rob; Bohannan, Bruce

    As part of KPNO's Internet-based observing services we experimented with the publicly available audio, video, and whiteboard MBONE clients (vat, nv, wb, and others) in both point-to-point and multicast modes. While bandwidth is always a constraint on the Internet, it is less of a constraint to operations than many might think. These experiments were part of two new Internet-based observing services offered to KPNO observers beginning with the Fall 1995 semester: a remote observing station and an automatic FTP data queue. The remote observing station seeks to duplicate the KPNO IRAF/ICE observing environment on a workstation at the observer's home institution. The automatic FTP queue is intended to support observing programs that require quick transport of data back to the home institution, for instance for near-real-time reductions to aid in observing tactics. We also discuss the early operational results of these services.

  5. The Time-Dependent Chemistry of Cometary Debris in the Solar Corona

    NASA Technical Reports Server (NTRS)

    Pesnell, W. D.; Bryans, P.

    2015-01-01

    Recent improvements in solar observations have greatly progressed the study of sungrazing comets. They can now be imaged along the entirety of their perihelion passage through the solar atmosphere, revealing details of their composition and structure not measurable through previous observations in the less volatile region of the orbit further from the solar surface. Such comets are also unique probes of the solar atmosphere. The debris deposited by sungrazers is rapidly ionized and subsequently influenced by the ambient magnetic field. Measuring the spectral signature of the deposited material highlights the topology of the magnetic field and can reveal plasma parameters such as the electron temperature and density. Recovering these variables from the observable data requires a model of the interaction of the cometary species with the atmosphere through which they pass. The present paper offers such a model by considering the time-dependent chemistry of sublimated cometary species as they interact with the solar radiation field and coronal plasma. We expand on a previous simplified model by considering the fully time-dependent solutions of the emitting species' densities. To compare with observations, we consider a spherically symmetric expansion of the sublimated material into the corona and convert the time-dependent ion densities to radial profiles. Using emissivities from the CHIANTI database and plasma parameters derived from a magnetohydrodynamic simulation leads to a spatially dependent emission spectrum that can be directly compared with observations. We find our simulated spectra to be consistent with observation.

  6. Pharmacodynamics and effectiveness of topical nitroglycerin at lowering blood pressure during autonomic dysreflexia.

    PubMed

    Solinsky, R; Bunnell, A E; Linsenmeyer, T A; Svircev, J N; Engle, A; Burns, S P

    2017-10-01

    Secondary analysis of prospectively collected observational data assessing the safety of an autonomic dysreflexia (AD) management protocol. To estimate the time to onset of action, time to full clinical effect (sustained systolic blood pressure (SBP) <160 mm Hg) and effectiveness of nitroglycerin ointment at lowering blood pressure for patients with spinal cord injuries experiencing AD. US Veterans Affairs inpatient spinal cord injury (SCI) unit. Episodes of AD recalcitrant to nonpharmacologic interventions that were given one to two inches of 2% topical nitroglycerin ointment were recorded. Pharmacodynamics as above and predictive characteristics (through a mixed multivariate logistic regression model) were calculated. A total of 260 episodes of pharmacologically managed AD were recorded in 56 individuals. Time to onset of action for nitroglycerin ointment was 9-11 min. Time to full clinical effect was 14-20 min. Topical nitroglycerin controlled SBP <160 mm Hg in 77.3% of pharmacologically treated AD episodes with the remainder requiring additional antihypertensive medications. A multivariate logistic regression model was unable to identify statistically significant factors to predict which patients would respond to nitroglycerin ointment (odds ratios 95% confidence intervals 0.29-4.93). The adverse event rate, entirely attributed to hypotension, was 3.6% with seven of the eight events resolving with close observation alone and one episode requiring normal saline. Nitroglycerin ointment has a rapid onset of action and time to full clinical effect with high efficacy and relatively low adverse event rate for patients with SCI experiencing AD.

  7. Temporal characteristics of imagined and actual walking in frail older adults.

    PubMed

    Nakano, Hideki; Murata, Shin; Shiraiwa, Kayoko; Iwase, Hiroaki; Kodama, Takayuki

    2018-05-09

    Mental chronometry, commonly used to evaluate motor imagery ability, measures the imagined time required for movements. Previous studies investigating mental chronometry of walking have focused on healthy older adults. However, mental chronometry in frail older adults has not yet been clarified. To investigate temporal characteristics of imagined and actual walking in frail older adults, we investigated the time required for imagined and actual walking along three walkways of different widths (widths: 50, 25, and 15 cm; length: 5 m) in 29 frail older adults and 20 young adults. Imagined walking was measured with mental chronometry. We observed significantly longer imagined and actual walking times along walkways of 50, 25, and 15 cm width in frail older adults compared with young adults. Moreover, temporal differences (absolute error) between imagined and actual walking were significantly greater in frail older adults than in young adults along walkways with a width of 25 and 15 cm. Furthermore, we observed significant differences in temporal differences (constant error) between frail older adults and young adults for walkways with a width of 25 and 15 cm. Frail older adults tended to underestimate actual walking time in imagined walking trials. Our results suggest that walkways of different widths may be a useful tool to evaluate age-related changes in imagined and actual walking in frail older adults.

  8. Rapid visual grouping and figure-ground processing using temporally structured displays.

    PubMed

    Cheadle, Samuel; Usher, Marius; Müller, Hermann J

    2010-08-23

    We examine the time course of visual grouping and figure-ground processing. Figure (contour) and ground (random-texture) elements were flickered with different phases (i.e., contour and background are alternated), requiring the observer to group information within a pre-specified time window. It was found that this grouping has high temporal resolution: less than 20 ms for smooth contours, and less than 50 ms for line conjunctions with sharp angles. Furthermore, the grouping process takes place without explicit knowledge of the phase of the elements, and it requires a cumulative build-up of information. The results are discussed in relation to the neural mechanisms for visual grouping and figure-ground segregation. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Water tribology on graphene.

    PubMed

    N'guessan, Hartmann E; Leh, Aisha; Cox, Paris; Bahadur, Prashant; Tadmor, Rafael; Patra, Prabir; Vajtai, Robert; Ajayan, Pulickel M; Wasnik, Priyanka

    2012-01-01

    Classical experiments show that the force required to slide liquid drops on surfaces increases with the resting time of the drop, t(rest), and reaches a plateau typically after several minutes. Here we use the centrifugal adhesion balance to show that the lateral force required to slide a water drop on a graphene surface is practically invariant with t(rest). In addition, the drop's three-phase contact line adopts a peculiar micrometric serrated form. These observations agree well with current theories that relate the time effect to deformation and molecular re-orientation of the substrate surface. Such molecular re-orientation is non-existent on graphene, which is chemically homogeneous. Hence, graphene appears to provide a unique tribological surface test bed for a variety of liquid drop-surface interactions.

  10. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    NASA Astrophysics Data System (ADS)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    Like other asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations. The private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform distributed computing to calculate RSA's private key. The proposed method uses multiple volunteered mobile devices to contribute during the calculation process. Our objective is to demonstrate that volunteer computing on mobile devices may be a feasible option for reducing the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
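
    To make the underlying arithmetic concrete, the following Python sketch (an illustration under our own assumptions, not the paper's implementation; all names are ours) shows how the private exponent follows from the public key once the modulus has been factored. The factoring search itself is the expensive, embarrassingly parallel step that the volunteered mobile devices would share.

      def trial_division(n):
          """Brute-force search for the smallest prime factor of n (toy-sized n only)."""
          if n % 2 == 0:
              return 2
          f = 3
          while f * f <= n:
              if n % f == 0:
                  return f
              f += 2
          return n  # n is prime

      def private_key_from_factors(n, e):
          """Recover the RSA private exponent d once n = p * q is factored."""
          p = trial_division(n)
          q = n // p
          phi = (p - 1) * (q - 1)
          return pow(e, -1, phi)  # modular inverse of e (Python 3.8+)

      if __name__ == "__main__":
          # Toy 32-bit modulus; real keys are 2048+ bits, far beyond trial division.
          p, q, e = 65521, 65537, 17
          n = p * q
          d = private_key_from_factors(n, e)
          message = 42
          assert pow(pow(message, e, n), d, n) == message  # decryption round-trips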

  11. Detector Development for the abBA Experiment.

    PubMed

    Seo, P-N; Bowman, J D; Mitchell, G S; Penttila, S I; Wilburn, W S

    2005-01-01

    We have developed a new type of field-expansion spectrometer to measure the neutron beta decay correlations (a, b, B, and A). A precision measurement of these correlations places stringent requirements on charged particle detectors. The design employs large area segmented silicon detectors to detect both protons and electrons in coincidence. Other requirements include good energy resolution (< 5 keV), a thin dead layer to allow observation of 30-keV protons, fast timing resolution (~1 ns) to reconstruct electron-backscattering events, and nearly unity efficiency. We report results of testing commercially available surface-barrier silicon detectors for energy resolution and timing performance, and measurement of the dead-layer thickness of ion-implanted silicon detectors with a 3.2 MeV alpha source.

  12. Rapid Monte Carlo Simulation of Gravitational Wave Galaxies

    NASA Astrophysics Data System (ADS)

    Breivik, Katelyn; Larson, Shane L.

    2015-01-01

    With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs from the gravitational wave source population. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters while requiring less time and fewer computational resources. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.
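
    As a rough illustration of why Monte Carlo realizations are cheap, the Python sketch below (our own toy, not the authors' code) draws a population of compact binaries from placeholder distributions and converts orbital periods to gravitational-wave frequencies using f_GW = 2/P for circular orbits.

      import numpy as np

      rng = np.random.default_rng(0)
      n_binaries = 100_000

      # Placeholder distributions for double-white-dwarf component masses
      # (solar masses) and orbital periods (hours); these are assumptions.
      m1 = rng.uniform(0.2, 1.0, n_binaries)
      m2 = rng.uniform(0.2, 1.0, n_binaries)
      log_period_hr = rng.uniform(np.log10(0.1), np.log10(100.0), n_binaries)
      period_s = 10.0 ** log_period_hr * 3600.0

      # For a circular binary the gravitational-wave frequency is twice the
      # orbital frequency.
      f_gw = 2.0 / period_s

      print(f"binaries radiating above 1 mHz: {(f_gw > 1e-3).sum()}")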

  13. Implementation of a standardized handoff protocol for post-operative admissions to the surgical intensive care unit.

    PubMed

    Mukhopadhyay, Dhriti; Wiggins-Dohlvik, Katie C; MrDutt, Mary M; Hamaker, Jeffrey S; Machen, Graham L; Davis, Matthew L; Regner, Justin L; Smith, Randall W; Ciceri, David P; Shake, Jay G

    2018-01-01

    The transfer of critically ill patients from the operating room (OR) to the surgical intensive care unit (SICU) involves handoffs between multiple providers. Incomplete handoffs lead to poor communication, a major contributor to sentinel events. Our aim was to determine whether handoff standardization led to improvements in caregiver involvement and communication. A prospective intervention study was designed to observe thirty-one patient handoffs from OR to SICU for 49 critical parameters, including caregiver presence, peri-operative details, and time required to complete key steps. Following a six-month implementation period, thirty-one handoffs were observed to determine improvement. A significant improvement in the presence of physician providers, including intensivists and surgeons, was observed (p = 0.0004 and p < 0.0001, respectively). Critical details were communicated more consistently, including procedure performed (p = 0.0048), complications (p < 0.0001), difficult airways (p < 0.0001), ventilator settings (p < 0.0001), and pressor requirements (p = 0.0134). Conversely, handoff duration did not increase significantly (p = 0.22). Implementation of a standardized protocol for handoffs between OR and SICU significantly improved caregiver involvement and reduced information omission without affecting provider time commitment. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Observation of warm, higher energy electrons transiting a double layer in a helicon plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yung-Ta, E-mail: ysung2@wisc.edu; Li, Yan; Scharer, John E.

    2015-03-15

    Measurements of an inductive RF helicon argon plasma double layer with a two-temperature electron distribution, including a fast (>80 eV) tail, are presented at 0.17 mTorr Ar pressure. The fast, untrapped electrons observed downstream of the double layer have a higher temperature (13 eV) than the trapped (Te = 4 eV) electrons. The reduction of plasma potential and density observed in the double layer region would require an upstream temperature ten times the measured 4 eV if occurring via Boltzmann ambipolar expansion. The experimental observations in the Madison helicon experiment indicate that fast electrons with substantial density fractions can be created at low helicon operating pressures.

  15. Efficient estimation of ideal-observer performance in classification tasks involving high-dimensional complex backgrounds

    PubMed Central

    Park, Subok; Clarkson, Eric

    2010-01-01

    The Bayesian ideal observer is optimal among all observers and sets an absolute upper bound for the performance of any observer in classification tasks [Van Trees, Detection, Estimation, and Modulation Theory, Part I (Academic, 1968)]. Therefore, the ideal observer should be used for objective image quality assessment whenever possible. However, computation of ideal-observer performance is difficult in practice because this observer requires a full description of the unknown statistical properties of the high-dimensional, complex data arising in real-life problems. Previously, Markov-chain Monte Carlo (MCMC) methods were developed by Kupinski et al. [J. Opt. Soc. Am. A 20, 430 (2003)] and by Park et al. [J. Opt. Soc. Am. A 24, B136 (2007) and IEEE Trans. Med. Imaging 28, 657 (2009)] to estimate the performance of the ideal observer and the channelized ideal observer (CIO), respectively, in classification tasks involving non-Gaussian random backgrounds. However, both algorithms had the disadvantage of long computation times. We propose a fast MCMC for real-time estimation of the likelihood ratio for the CIO. Our simulation results show that our method has the potential to speed up the estimation of ideal-observer performance in tasks involving complex data when efficient channels are used for the CIO. PMID:19884916

  16. Timing the Geminga Pulsar with High-Energy Gamma-Rays

    NASA Technical Reports Server (NTRS)

    Halpern, Jules P.

    1997-01-01

    This is a continuing program to extend and refine the ephemeris of the Geminga pulsar with annual observations for the remaining lifetime of EGRET. The data show that every revolution of Geminga is accounted for during the EGRET epoch, and that a coherent timing solution linking the phase between EGRET, COS-B, and SAS-2 observations has now been achieved. The accuracy of the gamma-ray timing is such that the proper motion of the pulsar can now be detected, consistent with the optical determination. The measured braking index over the 24.2 yr baseline is 17 +/- 1. Further observation is required to ascertain whether this very large braking index truly represents the energy loss mechanism, perhaps related to the theory in which Geminga is near its gamma-ray death line, or whether it is a manifestation of timing noise. Statistically significant timing residuals are detected in the EGRET data; they depart from the cubic ephemeris at a level of 23 milliperiods. The residuals appear to have a sinusoidal modulation with a period of about 5.1 yr. This could simply be a manifestation of timing noise, or it could be consistent with a planet of mass 1.7/sin i solar mass orbiting Geminga at a radius of 3.3/sin i AU.

  17. Observation planning tools for the ESO VLT interferometer

    NASA Astrophysics Data System (ADS)

    McKay, Derek J.; Ballester, Pascal; Vinther, Jakob

    2004-09-01

    Now that the Very Large Telescope Interferometer (VLTI) is producing regular scientific observations, the field of optical interferometry has moved from being a specialist niche area into mainstream astronomy. Making such instruments available to the general community involves difficult challenges in modelling, presentation and automation. The planning of each interferometric observation requires calibrator source selection, visibility prediction, signal-to-noise estimation and exposure time calculation. These planning tools require detailed physical models simulating the complete telescope system - including the observed source, atmosphere, array configuration, optics, detector and data processing. Only then can these software utilities provide accurate predictions about instrument performance, robust noise estimation and reliable metrics indicating the anticipated success of an observation. The information must be presented in a clear, intelligible manner, sufficiently abstract to hide the details of telescope technicalities, but still giving the user a degree of control over the system. The Data Flow System group has addressed the needs of the VLTI and, in doing so, has gained some new insights into the planning of observations, and the modelling and simulation of interferometer performance. This paper reports these new techniques, as well as the successes of the Data Flow System group in this area and a summary of what is now offered as standard to VLTI observers.

  18. Supporting Greenhouse Gas Management Strategies with Observations and Analysis - Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Butler, J. H.; Tarasova, O. A.

    2014-12-01

    Climate-change challenges facing society in the 21st century require an improved understanding of the global carbon-cycle and of the impacts and feedbacks of past, present, and future emissions of carbon-cycle gases. Global society faces a major challenge of reducing greenhouse gas emissions to virtually zero, most notably those of CO2, while at the same time facing variable and potentially overwhelming Earth System feedbacks. How it goes about this will depend upon the nature of impending international agreements, national laws, regional strategies, and social and economic forces. The challenge to those making observations to support, inform, or verify these reduction efforts, or to address potential Earth System feedbacks, lies in harmonizing a diverse array of observations and observing systems. Doing so is not trivial. Providing coherent, regional-scale information from these observations also requires improved modelling and ensemble reanalysis, but in the end such information must be relevant and reasonably certain. The challenge to us is to ensure a globally coherent observing and analysis system to supply the information that society will need to succeed. Policy-makers, scientists, government agencies, and businesses will need the best information available for decision-making and any observing and analysis system ultimately must be able to provide a coherent story over decades.

  19. LVGEMS Time-of-Flight Mass Spectrometry on Satellites

    NASA Technical Reports Server (NTRS)

    Herrero, Federico

    2013-01-01

    NASA's investigations of the upper atmosphere and ionosphere require measurements of the composition of the neutral air and ions. NASA is able to undertake these observations, but the instruments currently in use have their limitations. NASA has extended the scope of its research in the atmosphere and now requires more measurements covering more of the atmosphere. Out of this need, NASA developed multipoint measurements using miniaturized satellites, also called nanosatellites (e.g., CubeSats), that require a new generation of spectrometers that can fit into a 4 x 4 in. (10 x 10 cm) cross-section in the upgraded satellites. Overall, the new mass spectrometer required for the new depth of atmospheric research must fulfill a new level of low-voltage/low-power requirements, smaller size, and less risk of magnetic contamination. The Low-Voltage Gated Electrostatic Mass Spectrometer (LVGEMS) was developed to fulfill these requirements. The LVGEMS offers a new spectrometer that eliminates magnetic field issues associated with magnetic sector mass spectrometers, reduces power, and is about 1/10 the size of previous instruments. LVGEMS employs the time-of-flight (TOF) technique in the GEMS mass spectrometer previously developed. However, like any TOF mass spectrometer, GEMS requires a rectangular waveform of large voltage amplitude, exceeding 100 V -- that means that the voltage applied to one of the GEMS electrodes has to change from 0 to 100 V in a time of only a few nanoseconds. Such electronic speed requires more power than can be provided in a CubeSat. In the LVGEMS, the amplitude of the rectangular waveform is reduced to about 1 V, compatible with digital electronics supplies and requiring little power.

  20. A millisecond pulsar in an extremely wide binary system

    NASA Astrophysics Data System (ADS)

    Bassa, C. G.; Janssen, G. H.; Stappers, B. W.; Tauris, T. M.; Wevers, T.; Jonker, P. G.; Lentati, L.; Verbiest, J. P. W.; Desvignes, G.; Graikou, E.; Guillemot, L.; Freire, P. C. C.; Lazarus, P.; Caballero, R. N.; Champion, D. J.; Cognard, I.; Jessner, A.; Jordan, C.; Karuppusamy, R.; Kramer, M.; Lazaridis, K.; Lee, K. J.; Liu, K.; Lyne, A. G.; McKee, J.; Osłowski, S.; Perrodin, D.; Sanidas, S.; Shaifullah, G.; Smits, R.; Theureau, G.; Tiburzi, C.; Zhu, W. W.

    2016-08-01

    We report on 22 yr of radio timing observations of the millisecond pulsar J1024-0719 by the telescopes participating in the European Pulsar Timing Array (EPTA). These observations reveal a significant second derivative of the pulsar spin frequency and confirm the discrepancy between the parallax and Shklovskii distances that has been reported earlier. We also present optical astrometry, photometry and spectroscopy of 2MASS J10243869-0719190. We find that it is a low-metallicity main-sequence star (K7V spectral type, [M/H] = -1.0, Teff = 4050 ± 50 K) and that its position, proper motion and distance are consistent with those of PSR J1024-0719. We conclude that PSR J1024-0719 and 2MASS J10243869-0719190 form a common proper motion pair and are gravitationally bound. The gravitational interaction between the main-sequence star and the pulsar accounts for the spin frequency derivatives, which in turn resolves the distance discrepancy. Our observations suggest that the pulsar and main-sequence star are in an extremely wide (Pb > 200 yr) orbit. Combining the radial velocity of the companion and proper motion of the pulsar, we find that the binary system has a high spatial velocity of 384 ± 45 km s-1 with respect to the local standard of rest and has a Galactic orbit consistent with halo objects. Since the observed main-sequence companion star cannot have recycled the pulsar to millisecond spin periods, an exotic formation scenario is required. We demonstrate that this extremely wide-orbit binary could have evolved from a triple system that underwent an asymmetric supernova explosion, though we find that significant fine-tuning during the explosion is required. Finally, we discuss the implications of the long-period orbit on the timing stability of PSR J1024-0719 in light of its inclusion in pulsar timing arrays.

  1. Real-time characterization of partially observed epidemics using surrogate models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Ray, Jaideep; Lefantzi, Sophia

    We present a statistical method, predicated on the use of surrogate models, for the 'real-time' characterization of partially observed epidemics. Observations consist of counts of symptomatic patients, diagnosed with the disease, that may be available in the early epoch of an ongoing outbreak. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as to provide gross information on the dynamics of the etiologic agent in the affected population, e.g., the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and epidemiological parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. In some cases, the inverse problem can be computationally expensive, primarily due to the epidemic simulator used inside the inversion algorithm. We present a method, based on replacing the epidemiological model with computationally inexpensive surrogates, that can reduce the computational time to minutes, without a significant loss of accuracy. The surrogates are created by projecting the output of an epidemiological model on a set of polynomial chaos bases; thereafter, computations involving the surrogate model reduce to evaluations of a polynomial. We find that the epidemic characterizations obtained with the surrogate models are very close to those obtained with the original model. We also find that the number of projections required to construct a surrogate model is O(10)-O(10^2) less than the number of samples required by the MCMC to construct a stationary posterior distribution; thus, depending upon the epidemiological models in question, it may be possible to omit the offline creation and caching of surrogate models prior to their use in an inverse problem. The technique is demonstrated on synthetic data as well as observations from the 1918 influenza pandemic collected at Camp Custer, Michigan.
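
    The computational trick is sketched below in Python under simplifying assumptions (a one-dimensional parameter and a stand-in "simulator"; the paper's surrogates are multivariate polynomial chaos expansions): fit the polynomial once offline, so that every evaluation inside the MCMC loop becomes a cheap polynomial evaluation instead of a simulator run.

      import numpy as np
      from numpy.polynomial import legendre

      def expensive_model(theta):
          """Stand-in for the epidemic simulator; theta is a parameter scaled to [-1, 1]."""
          return np.exp(-2.0 * theta) * np.sin(3.0 * theta)

      # Offline: sample the simulator at a handful of nodes and fit a Legendre series.
      nodes = np.linspace(-1.0, 1.0, 15)
      coeffs = legendre.legfit(nodes, [expensive_model(t) for t in nodes], deg=10)

      def surrogate(theta):
          # Online (e.g., inside a Metropolis step): a polynomial evaluation only.
          return legendre.legval(theta, coeffs)

      test = np.linspace(-1.0, 1.0, 101)
      err = np.max(np.abs(surrogate(test) - expensive_model(test)))
      print(f"max surrogate error on [-1, 1]: {err:.2e}")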

  2. Purely temporal figure-ground segregation.

    PubMed

    Kandil, F I; Fahle, M

    2001-05-01

    Visual figure-ground segregation is achieved by exploiting differences in features such as luminance, colour, motion or presentation time between a figure and its surround. Here we determine the shortest delay times required for figure-ground segregation based on purely temporal features. Previous studies usually employed stimulus onset asynchronies between figure and ground, containing possible artefacts based on apparent motion cues or on luminance differences. Our stimuli systematically avoid these artefacts by constantly showing 20 x 20 'colons' that flip by 90 degrees around their midpoints at constant time intervals. Colons constituting the background flip in phase whereas those constituting the target flip with a phase delay. We tested the impact of frequency modulation and phase reduction on target detection. Younger subjects performed well above chance even at temporal delays as short as 13 ms, whilst older subjects required up to three times longer delays in some conditions. Figure-ground segregation can rely on purely temporal delays down to around 10 ms even in the absence of luminance and motion artefacts, indicating a temporal precision of cortical information processing almost an order of magnitude lower than the one required for some models of feature binding in the visual cortex [e.g. Singer, W. (1999), Curr. Opin. Neurobiol., 9, 189-194]. Hence, in our experiment, observers are unable to use temporal stimulus features with the precision required for these models.

  3. Development of sustainable precision farming systems for swine: estimating real-time individual amino acid requirements in growing-finishing pigs.

    PubMed

    Hauschild, L; Lovatto, P A; Pomar, J; Pomar, C

    2012-07-01

    The objective of this study was to develop and evaluate a mathematical model used to estimate the daily amino acid requirements of individual growing-finishing pigs. The model includes empirical and mechanistic components. The empirical component estimates daily feed intake (DFI), BW, and daily gain (DG) based on individual pig information collected in real time. Based on the DFI, BW, and DG estimates, the mechanistic component uses classic factorial equations to estimate the optimal concentration of amino acids that must be offered to each pig to meet its requirements. The model was evaluated with data from a study that investigated the effect of feeding pigs with a 3-phase or daily multiphase system. The DFI and BW values measured in this study were compared with those estimated by the empirical component of the model. The coherence of the values estimated by the mechanistic component was evaluated by analyzing whether they followed a normal pattern of requirements. Lastly, the proposed model was evaluated by comparing its estimates with those generated by an existing growth model (InraPorc). The precision of the proposed model and InraPorc in estimating DFI and BW was evaluated through the mean absolute error. The empirical component results indicated that the DFI and BW trajectories of individual pigs fed ad libitum could be predicted 1 d (DFI) or 7 d (BW) ahead with average mean absolute errors of 12.45 and 1.85%, respectively. The average mean absolute error obtained with InraPorc for the average individual of the population was 14.72% for DFI and 5.38% for BW. Major differences were observed when estimates from InraPorc were compared with individual observations. The proposed model, however, was effective in tracking the changes in DFI and BW for each individual pig. The mechanistic component estimated the optimal standardized ileal digestible Lys to NE ratio with reasonable between-animal (average CV = 7%) and over-time (average CV = 14%) variation. Thus, the amino acid requirements estimated by the model are animal- and time-dependent and follow, in real time, the individual DFI and BW growth patterns. The proposed model can follow the average feed intake and body weight trajectory of each individual pig in real time with good accuracy. Based on these trajectories and using classical factorial equations, the model makes it possible to dynamically estimate the AA requirements of each animal, taking into account the intake and growth changes of the animal.

  4. Timing and documentation of key events in neonatal resuscitation.

    PubMed

    Heathcote, Adam Charles; Jones, Jacqueline; Clarke, Paul

    2018-04-30

    Only a minority of babies require extended resuscitation at birth. Resuscitations concerning babies who die or who survive with adverse outcomes are increasingly subject to medicolegal scrutiny. Our aim was to describe real-life timings of key resuscitation events observed in a historical series of newborns who required full resuscitation at birth. Twenty-seven babies born in our centre over a 10-year period had an Apgar score of 0 at 1 min and required full resuscitation. The median (95% confidence interval) postnatal ages at achieving key events were: commencing cardiac compressions, 2.0 (1.5-4.0) min; endotracheal intubation, 3.8 (2.0-6.0) min; umbilical venous catheterisation, 9.0 (7.5-12.0) min; and administration of the first adrenaline dose, 10.0 (8.0-14.0) min. The wide range of timings presented from real-life cases may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training. What is Known: • Only a minority of babies require extended resuscitation at birth; these cases are often subject to medicolegal interrogation • Timings of key resuscitation events are poorly described and documentation of resuscitation events is often lacking yet is open to medicolegal scrutiny What is New: • We present a wide range of real-life timings of key resuscitation events during the era of routine newborn life support training • These timings may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training.

  5. Novel Estimation of Pilot Performance Characteristics

    NASA Technical Reports Server (NTRS)

    Bachelder, Edward N.; Aponso, Bimal

    2017-01-01

    Two mechanisms internal to the pilot affect performance during a tracking task: 1) pilot equalization (i.e., lead/lag); and 2) pilot gain (i.e., sensitivity to the error signal). For some applications McRuer's Crossover Model can be used to anticipate what equalization will be employed to control a vehicle's dynamics. McRuer also established approximate time delays associated with different types of equalization - the more cognitive processing that is required due to equalization difficulty, the larger the time delay. However, the Crossover Model does not predict what the pilot gain will be. A nonlinear pilot control technique, observed and coined by the authors as 'amplitude clipping', is shown to improve stability and performance and reduce workload when employed with vehicle dynamics that require high lead compensation by the pilot. Combining linear and nonlinear methods, a novel approach is used to measure the pilot control parameters when amplitude clipping is present, allowing precise measurement in real time of key pilot control parameters. Based on the results of an experiment designed to probe the primary drivers of workload, a method is developed that estimates pilot spare capacity from readily observable measures and is tested for generality using multi-axis flight data. This paper documents the initial steps toward developing a novel, simple objective metric for assessing pilot workload and its variation over time across a wide variety of tasks. Additionally, it offers a tangible, easily implementable methodology for anticipating a pilot's operating parameters and workload, and an effective design tool. The model shows promise in being able to precisely predict the actual pilot settings and workload, as well as the observed tolerance of pilot parameter variation over the course of operation. Finally, an approach is proposed for generating Cooper-Harper ratings based on the workload and parameter estimation methodology.
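
    For reference, McRuer's Crossover Model states that near the crossover frequency the combined pilot-vehicle open loop behaves like a delayed integrator, Yp*Yc ~ wc*exp(-j*w*tau)/(j*w). The short Python sketch below evaluates that frequency response; the crossover gain and effective delay are illustrative values of our own, not measurements from this study.

      import numpy as np

      def crossover_open_loop(omega, omega_c=2.0, tau=0.3):
          """Open-loop frequency response of the crossover model (omega in rad/s)."""
          return omega_c * np.exp(-1j * omega * tau) / (1j * omega)

      omega = np.logspace(-1, 1, 200)              # rad/s
      H = crossover_open_loop(omega)
      gain_db = 20.0 * np.log10(np.abs(H))
      phase_deg = np.degrees(np.angle(H))

      # Crossover occurs where |H| = 1; the phase margin shrinks as tau grows.
      i = np.argmin(np.abs(gain_db))
      print(f"crossover near {omega[i]:.2f} rad/s, phase {phase_deg[i]:.1f} deg")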

  6. Holistic component of image perception in mammogram interpretation: gaze-tracking study.

    PubMed

    Kundel, Harold L; Nodine, Calvin F; Conant, Emily F; Weinstein, Susan P

    2007-02-01

    To test the hypothesis that rapid and accurate performance of the proficient observer in mammogram interpretation involves a shift in the mechanism of image perception from a relatively slow search-to-find mode to a relatively fast holistic mode. This HIPAA-compliant study had institutional review board approval, and participant informed consent was obtained; patient informed consent was not required. The eye positions of three full-time mammographers, one attending radiologist, two mammography fellows, and three radiology residents were recorded during the interpretation of 20 normal and 20 subtly abnormal mammograms. The search time required to first locate a cancer, as well as the initial eye scan path, was determined and compared with diagnostic performance as measured with receiver operating characteristic (ROC) analysis. The median time for all observers to fixate a cancer, regardless of the decision outcome, was 1.13 seconds, with a range of 0.68 second to 3.06 seconds. Even though most of the lesions were fixated, recognition of them as cancerous ranged from 85% (17 of 20) to 10% (two of 20), with corresponding areas under the ROC curve of 0.87-0.40. The ROC index of detectability, d(a), was linearly related to the time to first fixate a cancer with a correlation (r(2)) of 0.81. The rapid initial fixation of a true abnormality is evidence for a global perceptual process capable of analyzing the visual input of the entire retinal image and pinpointing the spatial location of an abnormality. It appears to be more highly developed in the most proficient observers, replacing the less efficient initial search-to-find strategies. (c) RSNA, 2007.

  7. Reciprocity, Responsiveness, and Timing in Interactions between Mothers and Deaf and Hearing Children.

    ERIC Educational Resources Information Center

    Waxman, Robyn P.; Spencer, Patricia E.; Poisson, Susan S.

    1996-01-01

    The Greenspan-Lieberman Observational System Revised was used to evaluate characteristics of dyadic interactions between 10 hearing mothers and hearing toddlers (HH), 10 deaf mothers and deaf toddlers (DD), and 10 hearing mothers and deaf toddlers (HD). Findings suggest that assessment instruments require some modifications and results must be…

  8. The New Economy, Technology, and Learning Outcomes Assessment

    ERIC Educational Resources Information Center

    Moore, Anne H.

    2007-01-01

    Many observers describe the 21st century as a complex age with new demands for education and new requirements for accountability in teaching and learning to meet society's needs in a new, global economy. At the same time, innovations in teaching and learning and proposals for measuring them often seem disconnected from public and political…

  9. Navy Maintenance: The P-3 Aircraft Overhaul Program Can Be Improved.

    DTIC Science & Technology

    1987-06-01

    Air Systems Command's Naval Aviation Logistics Center, we obtained data on aircraft turnaround times, mobilization requirements, and aircraft over... [tabular residue omitted: P-3 workload as a percent of the total] FINDING: Applicability of Procedural Changes to Other Aircraft. The GAO observed that the ...

  10. Automation of Coordinated Planning Between Observatories: The Visual Observation Layout Tool (VOLT)

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Koratkar, Anuradha; Kerbel, Uri; Pell, Vince

    2002-01-01

    Fulfilling the promise of the era of great observatories, NASA now has more than three space-based astronomical telescopes operating in different wavebands. This situation provides astronomers with the unique opportunity of simultaneously observing a target in multiple wavebands with these observatories. Currently scheduling multiple observatories simultaneously, for coordinated observations, is highly inefficient. Coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Because they are time-consuming and expensive to schedule, observatories often limit the number of coordinated observations that can be conducted. In order to exploit new paradigms for observatory operation, the Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center has developed a tool called the Visual Observation Layout Tool (VOLT). The main objective of VOLT is to provide a visual tool to automate the planning of coordinated observations by multiple astronomical observatories. Four of NASA's space-based astronomical observatories - the Hubble Space Telescope (HST), Far Ultraviolet Spectroscopic Explorer (FUSE), Rossi X-ray Timing Explorer (RXTE) and Chandra - are enthusiastically pursuing the use of VOLT. This paper will focus on the purpose for developing VOLT, as well as the lessons learned during the infusion of VOLT into the planning and scheduling operations of these observatories.

  11. E-VLBI-activities at the FS Wettzell

    NASA Astrophysics Data System (ADS)

    Kronschnabl, Gerhard; Dassing, Reiner

    The FS Wettzell carries out the daily INTENSIVE observations that are required for the rapid determination of DUT1. The data volume is roughly 40 GB. So far the data have been shipped via courier services to the correlator, which requires 2-3 days of transportation time. The INTENSIVE time series is a real candidate for E-VLBI, which will strongly reduce the delay due to data transport. Considering the remote location of Wettzell, far from fast Internet links, and the current high cost of a fast connection, the installation of a 34 Gbps Internet connection will become realistic in the near future. It will strongly support the data transmission and shorten the delay time to only a few hours. This report gives an overview of the activities on the realisation of such a fast link. First attempts, made from the nearest nodal point at the University of Regensburg using a 155 Mbps connection, are reported.

  12. Identification of bearing faults using time domain zero-crossings

    NASA Astrophysics Data System (ADS)

    William, P. E.; Hoffman, M. W.

    2011-11-01

    In this paper, zero-crossing characteristic features are employed for early detection and identification of single-point bearing defects in rotating machinery. As a result of bearing defects, characteristic defect frequencies appear in the machine vibration signal, normally requiring spectral analysis or envelope analysis to identify the defect type. Zero-crossing features are extracted directly from the time-domain vibration signal using only the durations between successive zero crossings and do not require estimation of the rotational frequency. The features are a time-domain representation of the composite vibration signature in the spectral domain. Features are normalized by the length of the observation window, and classification is performed using a multilayer feedforward neural network. The model was evaluated on vibration data recorded using an accelerometer mounted on an induction motor housing subjected to a number of single-point defects with different severity levels.
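
    A minimal Python sketch of the general idea follows (a simplified stand-in for the paper's feature set, with a synthetic signal and an illustrative sampling rate): measure the intervals between successive zero crossings of the raw vibration signal and summarize them, normalized by the observation-window length, so no shaft-speed estimate is needed.

      import numpy as np

      def zero_crossing_features(signal, fs, n_bins=16):
          """Histogram of zero-crossing interval durations, normalized by window length."""
          sign = np.signbit(signal).astype(np.int8)
          crossings = np.flatnonzero(np.diff(sign))   # sample indices where the sign flips
          intervals = np.diff(crossings) / fs         # seconds between successive crossings
          window_len = len(signal) / fs
          hist, _ = np.histogram(intervals, bins=n_bins, range=(0.0, intervals.max()))
          return hist / window_len

      if __name__ == "__main__":
          fs = 12_000                                 # Hz, an assumed accelerometer rate
          t = np.arange(0, 1.0, 1.0 / fs)
          # Synthetic vibration: a shaft tone, a weak defect tone, and noise.
          x = (np.sin(2 * np.pi * 29.5 * t)
               + 0.3 * np.sin(2 * np.pi * 162.0 * t)
               + 0.05 * np.random.default_rng(1).standard_normal(t.size))
          print(zero_crossing_features(x, fs))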

  13. Time dependent features in tremor spectra

    NASA Astrophysics Data System (ADS)

    Powell, T. W.; Neuberg, J.

    2003-11-01

    Harmonic spectral peaks are observed in the tremor spectra of many different volcanoes, and in some cases these spectral lines have been seen to change with time. This has also been observed for the tremor at the Soufrière Hills volcano on Montserrat, West Indies, where the spectral lines are sometimes seen to glide apart before an explosion. We propose a model of repeated triggering of low-frequency earthquakes to explain these gliding lines using the relationship δt = 1/δν, where δt and δν are the time and frequency spacing, respectively, and investigate factors which can affect the observation of these spectral peaks. Noise and amplitude variation are shown to have little effect on the spectral peaks; however, the time gap between events must be nearly constant over several events. An error with a standard deviation of 2% or less is required for the spectral lines to be observed in the frequency range 0.5-10 Hz. We can reproduce the gliding spectral lines from a specific tremor episode preceding an explosion by changing δt from 1 to 0.31 s over a time period of 12 min. Using this relationship and an Automated Event Classification Analysis Program (AECAP), we can monitor δt over a long time period. The AECAP also extracts other seismic parameters such as energy, duration and spectral characteristics. An initial comparison between low-frequency seismic energy and cyclic tilt shows a correlation between the two, but this does not hold for later cycles.
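
    The δt = 1/δν relation can be checked numerically. The Python sketch below (a toy calculation, not the AECAP software; the damped wavelet standing in for a low-frequency event is our own choice) repeats an identical event every δt seconds and shows that the resulting spectral lines are spaced 1/δt apart, so shrinking δt from 1 s to 0.31 s spreads the lines from 1 Hz to roughly 3.2 Hz spacing.

      import numpy as np

      fs = 100.0                                   # samples per second
      t = np.arange(0, 60.0, 1.0 / fs)

      def event(t0, f0=2.0, decay=1.5):
          """A single damped low-frequency wavelet starting at time t0 (seconds)."""
          s = np.zeros_like(t)
          m = t >= t0
          s[m] = np.exp(-decay * (t[m] - t0)) * np.sin(2 * np.pi * f0 * (t[m] - t0))
          return s

      for dt in (1.0, 0.31):                       # inter-event times in seconds
          tremor = sum(event(t0) for t0 in np.arange(0.0, 50.0, dt))
          # Remove the mean so the DC term does not dominate the spectrum.
          spec = np.abs(np.fft.rfft(tremor - tremor.mean()))
          freqs = np.fft.rfftfreq(tremor.size, 1.0 / fs)
          # The spectrum is a comb of lines spaced 1/dt Hz apart, shaped by the
          # spectrum of the individual event.
          print(f"dt = {dt:4.2f} s -> line spacing 1/dt = {1.0 / dt:.2f} Hz, "
                f"strongest line at {freqs[np.argmax(spec)]:.2f} Hz")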

  14. The use of satellite observations of the ocean surface in commercial fishing operations

    NASA Technical Reports Server (NTRS)

    Montgomery, D. R.

    1983-01-01

    Commercial fishermen are interested in the safety of their crews, boats, and gear, and in making the best catch for their time and money. Rising fuel costs, increased competition from foreign fisheries, improved knowledge about fish habits and the new 200 mile economic zone have all had an impact on the U.S. fishing industry. As a consequence, the modern fisherman, more than ever, requires reliable and timely information about the marine environment. This paper describes an experimental program to utilize satellite observations of the ocean surface, in conjunction with conventional observations and products, to prepare special fisheries aids charts for daily radio facsimile broadcasts to commercial fishermen. These special fisheries products aggregate a broad set of ocean observations, including ocean color structure, to depict oceanographic conditions of importance to commercial fishing tactics. Results to date have shown that improved safety at sea and decreased fuel costs can be achieved through the applied use of these special fisheries charts.

  15. Time Varying Compensator Design for Reconfigurable Structures Using Non-Collocated Feedback

    NASA Technical Reports Server (NTRS)

    Scott, Michael A.

    1996-01-01

    Analysis and synthesis tools are developed to improve the dynamic performance of reconfigurable, nonminimum-phase, nonstrictly positive real, time-variant systems. A novel Spline Varying Optimal (SVO) controller is developed for the kinematically nonlinear system. There are several advantages to using the SVO controller, in which the spline function approximates the system model, observer, and controller gain. They are: the spline function approximation is simply connected, thus the SVO controller is more continuous than traditional gain-scheduled controllers when implemented on a time-varying plant; it is easier for real-time implementation in storage and computational effort; where system identification is required, the spline function requires fewer experiments, namely four experiments; and initial startup estimator transients are eliminated. The SVO compensator was evaluated on a high-fidelity simulation of the Shuttle Remote Manipulator System. The SVO controller demonstrated significant improvement over the present arm performance: (1) the damping level was improved by a factor of 3; and (2) peak joint torque was reduced by a factor of 2 following Shuttle thruster firings.

  16. Modelling chaotic vibrations using NASTRAN

    NASA Technical Reports Server (NTRS)

    Sheerer, T. J.

    1993-01-01

    Due to the unavailability and, later, prohibitive cost of the computational power required, many phenomena in nonlinear dynamic systems have in the past been addressed in terms of linear systems. Linear systems respond to periodic inputs with periodic outputs, and may be characterized in the time domain or in the frequency domain as convenient. Reduction to the frequency domain is frequently desirable to reduce the amount of computation required for solution. Nonlinear systems are only soluble in the time domain, and may exhibit a time history which is extremely sensitive to initial conditions. Such systems are termed chaotic. Dynamic buckling, aeroelasticity, fatigue analysis, control systems and electromechanical actuators are among the areas where chaotic vibrations have been observed. Direct transient analysis over a long time period presents a ready means of simulating the behavior of self-excited or externally excited nonlinear systems for a range of experimental parameters, either to characterize chaotic behavior for development of load spectra, or to define its envelope and preclude its occurrence.

  17. Trade-offs between driving nodes and time-to-control in complex networks

    PubMed Central

    Pequito, Sérgio; Preciado, Victor M.; Barabási, Albert-László; Pappas, George J.

    2017-01-01

    Recent advances in control theory provide us with efficient tools to determine the minimum number of driving (or driven) nodes to steer a complex network towards a desired state. Furthermore, we often need to do it within a given time window, so it is of practical importance to understand the trade-offs between the minimum number of driving/driven nodes and the minimum time required to reach a desired state. Therefore, we introduce the notion of actuation spectrum to capture such trade-offs, which we used to find that in many complex networks only a small fraction of driving (or driven) nodes is required to steer the network to a desired state within a relatively small time window. Furthermore, our empirical studies reveal that, even though synthetic network models are designed to present structural properties similar to those observed in real networks, their actuation spectra can be dramatically different. Thus, it supports the need to develop new synthetic network models able to replicate controllability properties of real-world networks. PMID:28054597
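
    As background for the "minimum number of driving nodes" invoked above, the Python sketch below computes it for a directed network from a maximum matching, following the structural-controllability result of Liu, Slotine and Barabási (2011); this is standard machinery (here via networkx), not the authors' actuation-spectrum computation.

      import networkx as nx
      from networkx.algorithms import bipartite

      def minimum_driver_nodes(g: nx.DiGraph) -> int:
          """N_D = max(N - |maximum matching|, 1) on the bipartite representation of g."""
          left = {u: ("out", u) for u in g}
          right = {v: ("in", v) for v in g}
          b = nx.Graph()
          b.add_nodes_from(left.values(), bipartite=0)
          b.add_nodes_from(right.values(), bipartite=1)
          b.add_edges_from((left[u], right[v]) for u, v in g.edges())
          matching = bipartite.hopcroft_karp_matching(b, top_nodes=set(left.values()))
          matched_edges = len(matching) // 2          # the dict stores both directions
          return max(g.number_of_nodes() - matched_edges, 1)

      if __name__ == "__main__":
          # A directed path is controllable from a single driver node.
          path = nx.path_graph(5, create_using=nx.DiGraph)
          print(minimum_driver_nodes(path))           # -> 1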

  18. Trade-offs between driving nodes and time-to-control in complex networks

    NASA Astrophysics Data System (ADS)

    Pequito, Sérgio; Preciado, Victor M.; Barabási, Albert-László; Pappas, George J.

    2017-01-01

    Recent advances in control theory provide us with efficient tools to determine the minimum number of driving (or driven) nodes to steer a complex network towards a desired state. Furthermore, we often need to do it within a given time window, so it is of practical importance to understand the trade-offs between the minimum number of driving/driven nodes and the minimum time required to reach a desired state. Therefore, we introduce the notion of actuation spectrum to capture such trade-offs, which we used to find that in many complex networks only a small fraction of driving (or driven) nodes is required to steer the network to a desired state within a relatively small time window. Furthermore, our empirical studies reveal that, even though synthetic network models are designed to present structural properties similar to those observed in real networks, their actuation spectra can be dramatically different. Thus, it supports the need to develop new synthetic network models able to replicate controllability properties of real-world networks.

  19. A time motion study in the immunization clinic of a tertiary care hospital of Kolkata, West Bengal.

    PubMed

    Chattopadhyay, Amitabha; Ghosh, Ritu; Maji, Sucharita; Ray, Tapobroto Guha; Lahiri, Saibendu Kumar

    2012-01-01

    A time and motion study is used to determine the amount of time required for a specific activity, work function, or mechanical process. Few such studies have been reported in the outpatient departments of institutions, and studies based exclusively on the immunization clinic of an institute are a rarity. This was an observational cross-sectional study done in the immunization clinic of R.G. Kar Medical College, Kolkata, over a period of 1 month (September 2010). The study population included mothers/caregivers attending the immunization clinic with their children. The total sample was 482. Pre-synchronized stopwatches were used to record service delivery time at the different activity points. The median time was the same for both the initial registration table and the nutrition and health education table (120 seconds), but the vaccination and post-vaccination advice table took the highest percentage of overall time (46.3%). The maximum time spent at the vaccination and post-vaccination advice table occurred on Monday (538.1 s), and the nutritional assessment and health assessment table took the maximum time on Friday (217.1 s). Time taken in the first half of the immunization session was greater at most of the tables. The goal of achieving universal immunization against vaccine-preventable diseases requires a multifaceted, collated response from many stakeholders. Efficient functioning of immunization clinics is therefore required to achieve the prescribed goals. This study initiates an effort to examine the utilization of time at a health care unit and invites much more in-depth analysis in the future.

  20. Benchmarking of vertically-integrated CO2 flow simulations at the Sleipner Field, North Sea

    NASA Astrophysics Data System (ADS)

    Cowton, L. R.; Neufeld, J. A.; White, N. J.; Bickle, M. J.; Williams, G. A.; White, J. C.; Chadwick, R. A.

    2018-06-01

    Numerical modeling plays an essential role in both identifying and assessing sub-surface reservoirs that might be suitable for future carbon capture and storage projects. Accuracy of flow simulations is tested by benchmarking against historic observations from on-going CO2 injection sites. At the Sleipner project located in the North Sea, a suite of time-lapse seismic reflection surveys enables the three-dimensional distribution of CO2 at the top of the reservoir to be determined as a function of time. Previous attempts have used Darcy flow simulators to model CO2 migration throughout this layer, given the volume of injection with time and the location of the injection point. Due primarily to computational limitations preventing adequate exploration of model parameter space, these simulations usually fail to match the observed distribution of CO2 as a function of space and time. To circumvent these limitations, we develop a vertically-integrated fluid flow simulator that is based upon the theory of topographically controlled, porous gravity currents. This computationally efficient scheme can be used to invert for the spatial distribution of reservoir permeability required to minimize differences between the observed and calculated CO2 distributions. When a uniform reservoir permeability is assumed, inverse modeling is unable to adequately match the migration of CO2 at the top of the reservoir. If, however, the width and permeability of a mapped channel deposit are allowed to independently vary, a satisfactory match between the observed and calculated CO2 distributions is obtained. Finally, the ability of this algorithm to forecast the flow of CO2 at the top of the reservoir is assessed. By dividing the complete set of seismic reflection surveys into training and validation subsets, we find that the spatial pattern of permeability required to match the training subset can successfully predict CO2 migration for the validation subset. This ability suggests that it might be feasible to forecast migration patterns into the future with a degree of confidence. Nevertheless, our analysis highlights the difficulty in estimating reservoir parameters away from the region swept by CO2 without additional observational constraints.
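
    To illustrate the class of reduced model being benchmarked, the Python sketch below time-steps a one-dimensional, sharp-interface, vertically-integrated porous gravity current spreading beneath a sloping caprock. The grid, rock and fluid properties, and the injection treatment are illustrative values of our own choosing, not the Sleipner parameters or the authors' simulator.

      import numpy as np

      ncell, length = 200, 2000.0            # cells, domain length (m)
      dx = length / ncell
      x = (np.arange(ncell) + 0.5) * dx
      b = 2.0e-3 * x                         # caprock elevation (m): gentle up-dip slope
      phi, k, mu, drho, g = 0.35, 2e-12, 6e-5, 300.0, 9.81
      mobility = k * drho * g / mu           # buoyancy velocity scale (m/s)

      h = np.zeros(ncell)                    # current thickness (m)
      q = np.zeros(ncell)
      q[ncell // 4] = 1.0e-6                 # injection as a local thickness source (m/s)
      dt = 2.0e4                             # s, small enough for explicit stability here

      for _ in range(5000):                  # roughly three years of injection
          # Buoyant CO2 flows up-dip and from thick to thin: F = mobility * h * d(b - h)/dx.
          h_face = 0.5 * (h[1:] + h[:-1])
          flux = mobility * h_face * np.diff(b - h) / dx
          div = np.zeros(ncell)
          div[1:] += flux                    # inflow through each cell's left face
          div[:-1] -= flux                   # outflow through its right face
          h += dt * (div / dx + q) / phi
          h = np.maximum(h, 0.0)

      wet = x[h > 0.01]
      print(f"current extent: {wet.min():.0f}-{wet.max():.0f} m, max thickness {h.max():.2f} m")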

  1. An automated skin segmentation of Breasts in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    PubMed

    Lee, Chia-Yen; Chang, Tzu-Fang; Chang, Nai-Yun; Chang, Yeun-Chung

    2018-04-18

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used to diagnose breast disease. Obtaining anatomical information from DCE-MRI requires the skin to be manually removed so that blood vessels and tumors can be clearly observed by physicians and radiologists; this requires considerable manpower and time. We develop an automated skin segmentation algorithm in which the surface skin is removed rapidly and correctly. The rough skin area is segmented by the active contour model and then analyzed in segments according to the continuity of the skin thickness for accuracy. Blood vessels and mammary glands are retained, which remedies the active contour model's tendency to remove some blood vessels. After three-dimensional imaging, the DCE-MRIs without the skin can be used to view internal anatomical information for clinical applications. The Dice coefficients of the 3D reconstructed images obtained with the proposed algorithm and with the active contour model for skin removal are 93.2% and 61.4%, respectively. Automated skin segmentation is about 165 times faster than manual segmentation. Texture information at the tumor position with and without the skin was compared using paired t-tests, all of which yielded p < 0.05, suggesting that the proposed algorithm may enhance the observability of tumors at the 0.05 significance level.
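
    The Dice coefficient quoted above measures the volumetric overlap between an automated mask and a reference mask. A minimal sketch of how it is typically computed from binary 3D masks (not the authors' implementation):

```python
# Minimal sketch: Dice similarity coefficient between two binary 3D masks,
# as typically used to compare an automated segmentation with a reference.
import numpy as np

def dice_coefficient(mask_a, mask_b):
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Toy example: two slightly offset spheres in a small volume.
zz, yy, xx = np.mgrid[0:32, 0:32, 0:32]
auto = (xx - 16) ** 2 + (yy - 16) ** 2 + (zz - 16) ** 2 < 8 ** 2
manual = (xx - 17) ** 2 + (yy - 16) ** 2 + (zz - 16) ** 2 < 8 ** 2
print(f"Dice: {dice_coefficient(auto, manual):.3f}")
```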

  2. Green Thunderstorms Observed.

    NASA Astrophysics Data System (ADS)

    Gallagher, Frank W., III; Beasley, William H.; Bohren, Craig F.

    1996-12-01

    Green thunderstorms have been observed from time to time in association with deep convection or severe weather events. Often the green coloration has been attributed to hail or to reflections of light from green foliage on the ground. Some skeptics who have not personally observed a green thunderstorm do not believe that green thunderstorms exist. They suggest that the green storms may be fabrications by excited observers. The authors have demonstrated the existence of green thunderstorms objectively using a spectrophotometer. During the spring and summer of 1995 the authors observed numerous storms and recorded hundreds of spectra of the light emanating from these storms. It was found that the subjective judgment of colors can vary somewhat between observers, but the variation is usually in the shade of green. The authors recorded spectra of green and nongreen thunderstorms and recorded spectral measurements as a storm changed its appearance from dark blue to a bluish green. The change in color is gradual when observed from a stationary position. Also, as the light from a storm becomes greener, the luminance decreases. The authors also observed and recorded the spectrum of a thunderstorm during a period of several hours as they flew in an aircraft close to a supercell that appeared somewhat green. The authors' observations refute the ground reflection hypothesis and raise questions about explanations that require the presence of hail.

  3. Optical Comb from a Whispering Gallery Mode Resonator for Spectroscopy and Astronomy Instruments Calibration

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.; Yu, Nam; Thompson, Robert J.

    2012-01-01

    The most accurate astronomical data is available from space-based observations that are not impeded by the Earth's atmosphere. Such measurements may require spectral samples taken as long as decades apart, with the 1 cm/s velocity precision integrated over a broad wavelength range. This raises the requirements specifically for instruments used in astrophysics research missions -- their stringent wavelength resolution and accuracy must be maintained over years and possibly decades. Therefore, a stable and broadband optical calibration technique compatible with spaceflights becomes essential. The space-based spectroscopic instruments need to be calibrated in situ, which puts forth specific requirements to the calibration sources, mainly concerned with their mass, power consumption, and reliability. A high-precision, high-resolution reference wavelength comb source for astronomical and astrophysics spectroscopic observations has been developed that is deployable in space. The optical comb will be used for wavelength calibrations of spectrographs and will enable Doppler measurements to better than 10 cm/s precision, one hundred times better than the current state-of-the-art.

  4. Time Series Analysis of Remote Sensing Observations for Citrus Crop Growth Stage and Evapotranspiration Estimation

    NASA Astrophysics Data System (ADS)

    Sawant, S. A.; Chakraborty, M.; Suradhaniwar, S.; Adinarayana, J.; Durbha, S. S.

    2016-06-01

    Satellite-based earth observation (EO) platforms have proven capable of spatio-temporally monitoring changes on the earth's surface. Long-term satellite missions have provided a huge repository of optical remote sensing datasets, and the United States Geological Survey (USGS) Landsat program is one of the oldest sources of optical EO datasets. This historical and near-real-time EO archive is a rich source of information for understanding seasonal changes in horticultural crops. Citrus (Mandarin / Nagpur Orange) is one of the major horticultural crops cultivated in central India. Erratic rainfall behaviour and dependency on groundwater for irrigation have a wide impact on citrus crop yield. Wide variations in temperature and relative humidity are also reported, causing early fruit onset and an increase in crop water requirement. Therefore, there is a need to study crop growth stages and crop evapotranspiration at spatio-temporal scales for managing scarce resources. In this study, an attempt has been made to understand citrus crop growth stages using Normalized Difference Vegetation Index (NDVI) time series data obtained from the Landsat archives (http://earthexplorer.usgs.gov/). A total of 388 Landsat 4, 5, 7 and 8 scenes (from 1990 to Aug. 2015) for Worldwide Reference System (WRS) 2, path 145 and row 45, were selected to understand seasonal variations in citrus crop growth. Considering Landsat's 30 m spatial resolution, orchards with crop cover larger than 2 hectares were selected to obtain homogeneous pixels. To account for changes in band wavelength ranges across the Landsat sensors (4, 5, 7 and 8), NDVI was used to obtain a continuous, sensor-independent time series. The derived crop growth stage information has been used to estimate the citrus basal crop coefficient (Kcb). Satellite-based Kcb estimates were combined with relevant weather parameters observed by a proximal agrometeorological sensing system for crop ET estimation. The results show that time-series-EO-based crop growth stage estimates provide better information about geographically separated citrus orchards. Attempts are being made to estimate regional variations in citrus crop water requirement for effective irrigation planning. In future, high-resolution Sentinel-2 observations from the European Space Agency (ESA) will be used to fill the time gaps and to gain a better understanding of citrus crop canopy parameters.
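
    NDVI, the index used to build the sensor-independent time series above, is the normalized ratio of near-infrared and red reflectance. A minimal sketch with toy reflectance values (not data from the study):

```python
# Minimal sketch: NDVI = (NIR - Red) / (NIR + Red), computed per pixel from
# surface-reflectance arrays. Band-to-array mapping differs between Landsat
# sensors (e.g. red/NIR are bands 3/4 on Landsat 5/7 but 4/5 on Landsat 8),
# which is one reason a normalized ratio index is convenient for building a
# sensor-independent time series.
import numpy as np

def ndvi(red, nir, eps=1e-6):
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy reflectance arrays (values in 0..1).
red = np.array([[0.08, 0.10], [0.30, 0.05]])
nir = np.array([[0.45, 0.40], [0.32, 0.50]])
print(ndvi(red, nir))
```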

  5. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution.

    PubMed

    Cho, Sanghee; Grazioso, Ron; Zhang, Nan; Aykac, Mehmet; Schmand, Matthias

    2011-12-07

    The main focus of our study is to investigate how the performance of digital timing methods is affected by the sampling rate and by anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: what is the minimum sampling frequency? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, the aliasing effect produces artifacts in the timing resolution estimates: the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally above the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher-order) filter is required to separate the baseband signal from its replicates and avoid aliasing, but in return the computational load is higher. We demonstrate the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computational requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool to check that a given timing pick-off method maintains constant timing resolution regardless of source location. Lastly, a performance comparison of several digital timing methods is also shown.
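
    As an illustration of the interpolation step discussed above, the sketch below applies Fourier-based (band-limited) interpolation to a coarsely sampled, scintillator-like pulse at a 1.3 GHz sampling rate. The pulse shape and time constants are illustrative and not taken from the study.

```python
# Minimal sketch (not the authors' processing chain): Fourier-based
# interpolation of a coarsely sampled, scintillator-like pulse. If the
# sampling rate is above the Nyquist rate of the (band-limited) pulse, the
# interpolated waveform closely follows the underlying signal.
import numpy as np
from scipy.signal import resample

fs = 1.3e9          # 1.3 GHz sampling rate, as in the study
duration = 80e-9    # 80 ns record length (illustrative)

def pulse(t, t0=20e-9, tau_r=1e-9, tau_d=40e-9):
    """Simple bi-exponential pulse, roughly scintillator-like in shape."""
    x = np.where(t > t0,
                 np.exp(-(t - t0) / tau_d) - np.exp(-(t - t0) / tau_r),
                 0.0)
    return x / x.max()

n = round(duration * fs)
t_coarse = np.arange(n) / fs
coarse = pulse(t_coarse)

# Interpolate the coarse samples onto a 10x finer grid.
fine, t_fine = resample(coarse, 10 * n, t=t_coarse)
print("max interpolation error:", float(np.max(np.abs(fine - pulse(t_fine)))))
```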

  6. Sound Clocks and Sonic Relativity

    NASA Astrophysics Data System (ADS)

    Todd, Scott L.; Menicucci, Nicolas C.

    2017-10-01

    Sound propagation within certain non-relativistic condensed matter models obeys a relativistic wave equation despite such systems admitting entirely non-relativistic descriptions. A natural question that arises upon consideration of this is, "do devices exist that will experience the relativity in these systems?" We describe a thought experiment in which `acoustic observers' possess devices called sound clocks that can be connected to form chains. Careful investigation shows that appropriately constructed chains of stationary and moving sound clocks are perceived by observers on the other chain as undergoing the relativistic phenomena of length contraction and time dilation by the Lorentz factor, γ, with c the speed of sound. Sound clocks within moving chains actually tick less frequently than stationary ones and must be separated by a shorter distance than when stationary to satisfy simultaneity conditions. Stationary sound clocks appear to be length contracted and time dilated to moving observers due to their misunderstanding of their own state of motion with respect to the laboratory. Observers restricted to using sound clocks describe a universe kinematically consistent with the theory of special relativity, despite the existence of a preferred frame (the laboratory frame) in their universe. Such devices show promise in further probing analogue relativity models, for example in investigating phenomena that require careful consideration of the proper time elapsed for observers.
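
    The Lorentz factor mentioned above takes its usual form with the speed of sound in place of the speed of light. A small worked sketch with illustrative values (the medium and speeds below are not from the paper):

```python
# Minimal sketch: "sonic" Lorentz factor gamma = 1 / sqrt(1 - v^2 / c_s^2),
# with c_s the speed of sound in the medium. Values are illustrative,
# not taken from the paper.
import math

c_s = 343.0  # speed of sound in air at ~20 C, m/s (illustrative medium)

def sonic_gamma(v):
    return 1.0 / math.sqrt(1.0 - (v / c_s) ** 2)

for v in (0.1 * c_s, 0.5 * c_s, 0.9 * c_s):
    print(f"v = {v:6.1f} m/s  ->  gamma = {sonic_gamma(v):.3f}")
```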

  7. Laser-combined scanning tunnelling microscopy for probing ultrafast transient dynamics.

    PubMed

    Terada, Yasuhiko; Yoshida, Shoji; Takeuchi, Osamu; Shigekawa, Hidemi

    2010-07-07

    The development of time-resolved scanning tunnelling microscopy (STM), in particular, attempts to combine STM with ultrafast laser technology, is reviewed with emphasis on observed physical quantities and spatiotemporal resolution. Ultrashort optical pulse technology has allowed us to observe transient phenomena in the femtosecond range, which, however, has the drawback of a relatively low spatial resolution due to the electromagnetic wavelength used. In contrast, STM and its related techniques, although the time resolution is limited by the circuit bandwidth (∼100 kHz), enable us to observe structures at the atomic level in real space. Our purpose has been to combine these two techniques to achieve a new technology that satisfies the requirements for exploring the ultrafast transient dynamics of the local quantum functions in organized small structures, which will advance the pursuit of future nanoscale scientific research in terms of the ultimate temporal and spatial resolutions.

  8. Time Projection Chamber Polarimeters for X-ray Astrophysics

    NASA Astrophysics Data System (ADS)

    Hill, Joanne; Black, Kevin; Jahoda, Keith

    2015-04-01

    Time Projection Chamber (TPC) based X-ray polarimeters achieve the sensitivity required for practical and scientifically significant astronomical observations, both galactic and extragalactic, with a combination of high analyzing power and good quantum efficiency. TPC polarimeters at the focus of an X-ray telescope have low background and large collecting areas, providing the ability to measure the polarization properties of faint persistent sources. TPCs based on drifting negative ions rather than electrons permit large detector collecting areas with minimal readout electronics, enabling wide field-of-view polarimeters for observing unpredictable, bright transient sources such as gamma-ray bursts. We describe here the design and expected performance of two different TPC polarimeters proposed for small explorer missions: the PRAXyS (Polarimetry of Relativistic X-ray Sources) X-ray Polarimeter Instrument, optimized for observations of faint persistent sources, and the POET (Polarimetry of Energetic Transients) Low Energy Polarimeter, designed to detect and measure bright transients.

  9. Discovery of powerful gamma-ray flares from the Crab Nebula.

    PubMed

    Tavani, M; Bulgarelli, A; Vittorini, V; Pellizzoni, A; Striani, E; Caraveo, P; Weisskopf, M C; Tennant, A; Pucella, G; Trois, A; Costa, E; Evangelista, Y; Pittori, C; Verrecchia, F; Del Monte, E; Campana, R; Pilia, M; De Luca, A; Donnarumma, I; Horns, D; Ferrigno, C; Heinke, C O; Trifoglio, M; Gianotti, F; Vercellone, S; Argan, A; Barbiellini, G; Cattaneo, P W; Chen, A W; Contessi, T; D'Ammando, F; DePris, G; Di Cocco, G; Di Persio, G; Feroci, M; Ferrari, A; Galli, M; Giuliani, A; Giusti, M; Labanti, C; Lapshov, I; Lazzarotto, F; Lipari, P; Longo, F; Fuschino, F; Marisaldi, M; Mereghetti, S; Morelli, E; Moretti, E; Morselli, A; Pacciani, L; Perotti, F; Piano, G; Picozza, P; Prest, M; Rapisarda, M; Rappoldi, A; Rubini, A; Sabatini, S; Soffitta, P; Vallazza, E; Zambra, A; Zanello, D; Lucarelli, F; Santolamazza, P; Giommi, P; Salotti, L; Bignami, G F

    2011-02-11

    The well-known Crab Nebula is at the center of the SN1054 supernova remnant. It consists of a rotationally powered pulsar interacting with a surrounding nebula through a relativistic particle wind. The emissions originating from the pulsar and nebula have been considered to be essentially stable. Here, we report the detection of strong gamma-ray (100 mega-electron volts to 10 giga-electron volts) flares observed by the AGILE satellite in September 2010 and October 2007. In both cases, the total gamma-ray flux increased by a factor of three compared with the non-flaring flux. The flare luminosity and short time scale favor an origin near the pulsar, and we discuss Chandra Observatory x-ray and Hubble Space Telescope optical follow-up observations of the nebula. Our observations challenge standard models of nebular emission and require power-law acceleration by shock-driven plasma wave turbulence within an approximately 1-day time scale.

  10. COST Action ES1206: Advanced GNSS Tropospheric Products forMonitoring Severe Weather Events and Climate (GNSS4SWEC)

    NASA Astrophysics Data System (ADS)

    Jones, J.; Guerova, G.; Dousa, J.; Dick, G.; Haan, de, S.; Pottiaux, E.; Bock, O.; Pacione, R.

    2016-12-01

    GNSS is now an established atmospheric observing system which can accurately sense water vapour, the most abundant greenhouse gas, accounting for 60-70% of atmospheric warming. Water vapour observations are currently under-sampled, and obtaining and exploiting additional high-quality humidity observations is essential to severe weather forecasting and climate monitoring. COST Action ES1206 addresses new and improved capabilities from developments in both the GNSS and meteorological communities to address these requirements. For the first time, the synergy of multi-GNSS (GPS, GLONASS and Galileo) will be used to develop new, advanced tropospheric products, exploiting the full potential of multi-GNSS water vapour estimates on a wide range of temporal and spatial scales, from real-time monitoring and forecasting of severe weather to climate research. In addition, the Action will promote the use of meteorological data in GNSS positioning, navigation, and timing services and stimulate knowledge and data transfer throughout Europe.

  11. The Euclid AOCS science mode design

    NASA Astrophysics Data System (ADS)

    Bacchetta, A.; Saponara, M.; Torasso, A.; Saavedra Criado, G.; Girouart, B.

    2015-06-01

    Euclid is a Medium-Class mission of the ESA Cosmic Vision 2015-2025 plan. Thales Alenia Space Italy has been selected as prime contractor for the Euclid design and implementation. The spacecraft will be launched in 2020 on a Soyuz launch vehicle from Kourou, to a large-amplitude orbit around the Sun-Earth libration point L2. The objective of Euclid is to understand the origin of the Universe's accelerating expansion, by mapping large-scale structure over a cosmic time covering the last 10 billion years. The mission requires the ability to survey a large fraction of the extragalactic sky (i.e. the portion of sky with latitude higher than 30 deg with respect to the galactic plane) over its lifetime, with very high system stability (telescope, focal plane, spacecraft pointing) to minimize systematic effects. The AOCS is a key element in meeting the scientific requirements. The AOCS design drivers are pointing performance and image quality (Relative Pointing Error over 700 s of less than 25 mas at the 68% confidence level) and minimization of slew time between observation fields to meet the goal of completing the Wide Extragalactic Survey in 6 years. The first driver demands a Fine Guidance Sensor in the telescope focal plane for accurate attitude measurement and actuators with low noise and fine command resolution. The second driver requires high-torque actuators and an extended attitude control bandwidth. In the design, reaction wheels (RWL) and cold-gas micro-propulsion (MPS) are used in a synergetic and complementary way during different operational phases of the science mode. The RWL are used for performing the field slews, whereas during scientific observation they are stopped so as not to perturb the pointing with additional mechanical noise. The MPS is used for maintaining the reference attitude with high pointing accuracy during scientific observation. This unconventional concept achieves the pointing performance with the shortest maneuver times and significant mass savings with respect to an MPS-only solution.

  12. A Satellite-Based Imaging Instrumentation Concept for Hyperspectral Thermal Remote Sensing

    PubMed Central

    Udelhoven, Thomas; Schlerf, Martin; Segl, Karl; Mallick, Kaniska; Bossung, Christian; Rock, Gilles; Fischer, Peter; Müller, Andreas; Storch, Tobias; Eisele, Andreas; Weise, Dennis; Hupfer, Werner; Knigge, Thiemo

    2017-01-01

    This paper describes the concept of the hyperspectral Earth-observing thermal infrared (TIR) satellite mission HiTeSEM (High-resolution Temperature and Spectral Emissivity Mapping). The scientific goal is to measure specific key variables from the biosphere, hydrosphere, pedosphere, and geosphere related to two global problems of significant societal relevance: food security and human health. The key variables comprise land and sea surface radiation temperature and emissivity, surface moisture, thermal inertia, evapotranspiration, soil minerals and grain size components, soil organic carbon, plant physiological variables, and heat fluxes. The retrieval of this information requires a TIR imaging system with adequate spatial and spectral resolutions and with day and night observation capability. Another challenge is the monitoring of temporally highly dynamic features like energy fluxes, which require an adequate revisit time. The suggested solution is a sensor pointing concept to allow high revisit times for selected target regions (1–5 days at off-nadir). At the same time, global observations in the nadir direction are guaranteed with a lower temporal repeat cycle (>1 month). To account for the demand for a high spatial resolution for complex targets, it is suggested to combine in one optic (1) a hyperspectral TIR system with ~75 bands at 7.2–12.5 µm (instrument NEDT 0.05 K–0.1 K) and a ground sampling distance (GSD) of 60 m, and (2) a panchromatic high-resolution TIR imager with two channels (8.0–10.25 µm and 10.25–12.5 µm) and a GSD of 20 m. The identified science case requires a good correlation of the instrument orbit with Sentinel-2 (maximum delay of 1–3 days) to combine data from the visible and near infrared (VNIR), shortwave infrared (SWIR) and TIR spectral regions and to refine parameter retrieval.

  13. TD/GC-MS analysis of volatile markers emitted from mono- and co-cultures of Enterobacter cloacae and Pseudomonas aeruginosa in artificial sputum.

    PubMed

    Lawal, Oluwasola; Knobel, Hugo; Weda, Hans; Nijsen, Tamara M E; Goodacre, Royston; Fowler, Stephen J

    2018-01-01

    Infections such as ventilator-associated pneumonia (VAP) can be caused by one or more pathogens. Current methods for identifying these pathogenic microbes often require invasive sampling and can be time consuming, owing to the need for prolonged cultural enrichment along with selective and differential plating steps. This results in delays in diagnosis, which in such critically ill patients can have potentially life-threatening consequences. Therefore, a non-invasive and timely diagnostic method is required. Detection of microbial volatile organic compounds (VOCs) in exhaled breath is proposed as an alternative method for identifying these pathogens and may distinguish between mono- and poly-microbial infections. The aim was to investigate volatile metabolites that discriminate between bacterial mono- and co-cultures. The VAP-associated pathogens Enterobacter cloacae and Pseudomonas aeruginosa were cultured individually and together in artificial sputum medium for 24 h, and their headspace was analysed for potential discriminatory VOCs by thermal desorption gas chromatography-mass spectrometry. Of the 70 VOCs putatively identified, 23 were found to increase significantly during bacterial culture (i.e. likely to be released during metabolism) and 13 decreased (i.e. likely consumed during metabolism). The other VOCs showed no transformation (concentrations similar to those observed in the medium). Bacteria-specific VOCs, including 2-methyl-1-propanol, 2-phenylethanol, and 3-methyl-1-butanol, were observed in the headspace of axenic cultures of E. cloacae, and methyl 2-ethylhexanoate, novel to this investigation, was observed in the headspace of P. aeruginosa cultures. The previously reported VOCs 1-undecene and pyrrole were also detected. The metabolites 2-methylbutyl acetate and methyl 2-methylbutyrate, which are reported to exhibit antimicrobial activity, were elevated in co-culture only. The observed VOCs were able to differentiate axenic and co-cultures. Validation of these markers in exhaled breath specimens could prove useful for timely pathogen identification and infection type diagnosis.

  14. Local short-duration precipitation extremes in Sweden: observations, forecasts and projections

    NASA Astrophysics Data System (ADS)

    Olsson, Jonas; Berg, Peter; Simonsson, Lennart

    2015-04-01

    Local short-duration precipitation extremes (LSPEs) are a key driver of hydrological hazards, notably in steep catchments with thin soils and in urban environments. The floods, landslides, etc. that they trigger have large consequences for society in terms of both economy and health. Accurate estimation of LSPEs on climatological time-scales (past, present, future) as well as in real time is thus of great importance for improved hydrological predictions and for the design of constructions and infrastructure affected by hydrological fluxes. Analysis of LSPEs is, however, associated with various limitations and uncertainties. These are to a large degree related to the small-scale nature of the meteorological processes behind LSPEs and the associated requirements on observation sensors as well as model descriptions. Some examples of the causes of these limitations are given in the following.
    - Observations: High-resolution data sets available for LSPE analyses are often limited to either relatively long series from one or a few stations or relatively short series from larger station networks. Radar data have excellent resolution in both time and space, but the estimated local precipitation intensity is still highly uncertain. New and promising techniques (e.g. microwave links) are still in their infancy.
    - Weather forecasts (short-range): Although forecasts with the spatial resolution required for the potential generation of LSPEs (around 2-4 km) are becoming operationally available, the actual forecast precision for LSPEs is largely unknown. Forecasted LSPEs may be displaced in time or, more critically, in space, which strongly affects the possibility to assess hydrological risk.
    - Climate projections: The spatial resolution of the current RCM generation (around 25 km) is not sufficient for a proper description of LSPEs. Statistical post-processing (i.e. downscaling) is required, which adds substantial uncertainty to the final result. Ensemble generation of sufficiently high-resolution RCM projections is not yet computationally feasible.
    In this presentation, examples of recent research in Sweden related to these aspects will be given, with some main findings shown and discussed. Finally, some ongoing and future research directions will be outlined (the former hopefully accompanied by some brand-new results).

  15. The observation of sporadic meteors and meteor showers by means of radio technology measuring equipment

    NASA Astrophysics Data System (ADS)

    Schippke, W.

    1981-08-01

    Tracking meteors with radio techniques offers the advantage of continuous observations, independent of meteorological conditions and of the time of day or night. Two methods exist for registering meteor trails: a passive and an active one. The appropriate frequency range for both methods is the lower VHF range. Passive observations require a very sensitive measurement receiver along with recording equipment and a suitable antenna system. In Europe there are many television transmitters which are eminently suited for detecting meteor trails. The active method for tracking meteors is more difficult and requires more expensive equipment than the passive method. It is based on the use of a VHF metric-wave radar; these devices also normally operate at frequencies of approximately 50 or 60 MHz. Attention is given to the theory of meteoric scattering, the various types of ionized trails, the geometry of meteor traces, results obtained at an observational station in Munich, and observations in the 144-MHz band.

  16. Event Horizon Telescope observations as probes for quantum structure of astrophysical black holes

    NASA Astrophysics Data System (ADS)

    Giddings, Steven B.; Psaltis, Dimitrios

    2018-04-01

    The need for a consistent quantum evolution for black holes has led to proposals that their semiclassical description is modified not just near the singularity, but at horizon or larger scales. If such modifications extend beyond the horizon, they influence regions accessible to distant observation. Natural candidates for these modifications behave like metric fluctuations, with characteristic length scales and timescales set by the horizon radius. We investigate the possibility of using the Event Horizon Telescope to observe these effects, if they have a strength sufficient to make quantum evolution consistent with unitarity, without introducing new scales. We find that such quantum fluctuations can introduce a strong time dependence for the shape and size of the shadow that a black hole casts on its surrounding emission. For the black hole in the center of the Milky Way, detecting the rapid time variability of its shadow will require nonimaging timing techniques. However, for the much larger black hole in the center of the M87 galaxy, a variable black-hole shadow, if present with these parameters, would be readily observable in the individual snapshots that will be obtained by the Event Horizon Telescope.

  17. Determination of eruption temperature of Io's lavas using lava tube skylights

    NASA Astrophysics Data System (ADS)

    Davies, Ashley Gerard; Keszthelyi, Laszlo P.; McEwen, Alfred S.

    2016-11-01

    Determining the eruption temperature of Io's dominant silicate lavas would constrain Io's present interior state and composition. We have examined how eruption temperature can be estimated at lava tube skylights through synthesis of thermal emission from the incandescent lava flowing within the lava tube. Lava tube skylights should be present along Io's long-lived lava flow fields, and are attractive targets because of their temporal stability and the narrow range of near-eruption temperatures revealed through them. We conclude that these skylights are suitable and desirable targets (perhaps the very best targets) for the purposes of constraining eruption temperature, with a 0.9:0.7-μm radiant flux ratio ≤6.3 being diagnostic of ultramafic lava temperatures. Because the target skylights may be small - perhaps only a few meters or tens of meters across - such observations will require a future Io-dedicated mission that will obtain high spatial resolution (< 100 m/pixel), unsaturated observations of Io's surface at multiple wavelengths in the visible and near-infrared, ideally at night. In contrast to observations of lava fountains or roiling lava lakes, where accurate determination of surface temperature distribution requires simultaneous or near-simultaneous (< 0.1 s) observations at different wavelengths, skylight thermal emission data are superior for the purposes of temperature derivation, as emission is stable on much longer time scales (minutes, or longer), so long as viewing geometry does not greatly change during that time.
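
    The diagnostic value of the 0.9:0.7-μm flux ratio follows from the Planck function: hotter surfaces emit relatively more at the shorter wavelength, so the ratio falls with temperature. A minimal blackbody sketch (emissivity and instrument response ignored; not the authors' synthesis model):

```python
# Minimal sketch: blackbody spectral radiance ratio B(0.9 um, T) / B(0.7 um, T)
# as a function of temperature. Hotter surfaces emit relatively more at the
# shorter wavelength, so the ratio decreases with temperature. Emissivity and
# instrumental response are ignored here (this is not the authors' model).
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m, temperature_k):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temperature_k)
    return a / np.expm1(b)

for t in (1200.0, 1400.0, 1600.0, 1800.0):
    ratio = planck(0.9e-6, t) / planck(0.7e-6, t)
    print(f"T = {t:6.0f} K  ->  0.9:0.7 um radiance ratio = {ratio:5.2f}")
```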

  18. Future projects in asteroseismology: the unique role of Antarctica

    NASA Astrophysics Data System (ADS)

    Mosser, B.; Siamois Team

    Asteroseismology requires observables registered in stringent conditions: very high sensitivity, uninterrupted time series, long duration. These specifications then make it possible to study the details of the stellar interior structure. Space-borne and ground-based asteroseismic projects are presented and compared. With CoRoT as a precursor, then Kepler and maybe Plato, the roadmap in space appears to be precisely designed. In parallel, ground-based projects are necessary to provide different and unique information on bright stars with Doppler measurements. Dome C appears to be the ideal place for ground-based asteroseismic observations. The unequalled weather conditions yield a duty cycle comparable to space. Long time series (up to 3 months) will be possible, thanks to the long duration of the polar night.

  19. Resistance-training exercises with different stability requirements: time course of task specificity.

    PubMed

    Saeterbakken, Atle Hole; Andersen, Vidar; Behm, David G; Krohn-Hansen, Espen Krogseth; Smaamo, Mats; Fimland, Marius Steiro

    2016-12-01

    The aim of the study was to assess the task-specificity (greater improvements in trained compared to non-trained tasks), transferability and time-course adaptations of resistance-training programs with varying instability requirements. Thirty-six resistance-trained men were randomized to train chest press 2 days per week for 10 weeks (6 repetitions × 4 series) using a Swiss ball, a Smith machine or dumbbells. A six-repetition maximum-strength test with each of the aforementioned exercises and the traditional barbell chest press was performed by all participants at the first, 7th, 14th and final training sessions, and electromyographic (EMG) activity of the prime movers was measured during isometric bench press. The groups training with the unstable Swiss ball and dumbbells, but not the stable Smith machine, demonstrated task-specificity, which became apparent in the early phase and remained throughout the study. The improvements in the trained exercise tended to increase more with instability (dumbbells vs. Smith machine, p = 0.061). The group training with the Smith machine had similar improvements in the non-trained exercises. Greater improvements were observed in the early phase of the strength-training program (first to 7th session) for all groups in all three exercises, but most notably for the unstable exercises. No differences were observed between the groups or testing times for EMG activity. These findings suggest that among resistance-trained individuals, the concept of task-specificity could be most relevant in resistance training with greater stability requirements, particularly due to the rapid strength improvements for unstable resistance exercises.

  20. The Aloha Telescope for K-12 STEM Education

    NASA Astrophysics Data System (ADS)

    Sowell, James R.

    2015-01-01

    How does one bring night-time astronomical observations into the classroom? How does a teacher - during the school day - show students the craters on the Moon, the rings of Saturn, or the four Galilean moons of Jupiter? One of the greatest drawbacks to teaching Astronomy is the lack of real-time telescopic observations during the school day, and yet this is a very exciting time for astronomical discoveries. The solution is to access a telescope in a substantially different time zone where it is still night. This facility - the Aloha Telescope - on Maui has already been established by a partnership between Georgia Tech and the Air Force Research Lab. This robotic telescope's sole purpose is for K-12 education, as it is equipped with a video-camera and is operated remotely via high-speed internet connections. This facility and its outreach program allow east-coast teachers and, in turn, students to have local daytime access to - and direct control of - the telescope. When observing the Moon, teachers and students will move the telescope wherever they wish across the highly-magnified lunar surface (~ 5 arcminute FOV). This telescope will enable night-time astronomical observations to come alive as day-time activities and will be an important tool for STEM education and activities. The use of the Aloha Telescope requires minimal training and is free after registering for a date and time. Dr. Sowell has written specific telescopic exercises and surface feature tours appropriate for K-12 and college-level users. These exercises, and other aspects of the Aloha Telescope and program, are posted on the website at http://aloha.gatech.edu

  1. The suppression of charged-particle-induced noise in infrared detectors

    NASA Technical Reports Server (NTRS)

    Houck, J. R.; Briotta, D. A., Jr.

    1982-01-01

    A d.c.-coupled transimpedance amplifier/pulse suppression circuit designed to remove charged-particle-induced noise from infrared detectors is described. Noise spikes produced by single particle events are large and have short rise times, and can degrade the performance of an infrared detector in moderate radiation environments. The use of the suppression circuit improves the signal-to-noise ratio by a factor of 1.6:1, which corresponds to a reduction in required observing time by a factor of about 2.6.
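
    The quoted factor of about 2.6 follows from the usual scaling of required integration time with the inverse square of the signal-to-noise ratio, assuming noise-limited observations:

```python
# Minimal check: for noise-limited observations the required integration time
# scales as 1/SNR^2, so a 1.6x SNR improvement reduces the observing time by
# roughly 1.6^2 = 2.56, i.e. the "factor of about 2.6" quoted above.
snr_gain = 1.6
print(f"observing-time reduction factor ~ {snr_gain**2:.2f}")
```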

  2. Theoretical Methods in the Non-Equilibrium Quantum Mechanics of Many Bodies

    DTIC Science & Technology

    2011-01-01

    …signature of this effect would be similar to what has been found in time-of-flight experiments [43]. When µ and J are tuned to a point within the enhanced… junctions [95] and thin films [69]. At the same time, other nonequilibrium stimulation methods were developed [22] with more recent reports of… but our desired effect will be easier to observe experimentally with some other constraints. For instance, requiring that gFB < 0 will raise the BCS…

  3. Effective precipitation duration for runoff peaks based on catchment modelling

    NASA Astrophysics Data System (ADS)

    Sikorska, A. E.; Viviroli, D.; Seibert, J.

    2018-01-01

    Although precipitation intensities may vary greatly during a single flood event, detailed information about these intensities may not be required to accurately simulate floods with a hydrological model, which reacts rather to cumulative precipitation sums. This raises two questions: to what extent is it important to preserve sub-daily precipitation intensities, and how long does it effectively rain from a hydrological point of view? Both questions might seem straightforward to answer with a direct analysis of past precipitation events, but such an analysis requires some arbitrary choices regarding the length of a precipitation event. To avoid these arbitrary decisions, we present here an alternative approach, based on runoff simulations of large floods, to characterize the effective length of a precipitation event. More precisely, we quantify the fraction of a day over which the daily precipitation has to be distributed to faithfully reproduce the large annual and seasonal floods that were generated by the hourly precipitation time series. New precipitation time series were generated by first aggregating the hourly observed data into daily totals and then evenly distributing them over sub-daily periods (n hours). These simulated time series were used as input to a hydrological bucket-type model, and the resulting flood peaks were compared to those obtained when using the original precipitation time series. We then define the effective daily precipitation duration as the number of hours n for which the largest peaks are simulated best. For nine mesoscale Swiss catchments this effective daily precipitation duration was about half a day, which indicates that detailed information on precipitation intensities is not necessarily required to accurately estimate the peaks of the largest annual and seasonal floods. These findings support the use of simple disaggregation approaches to make use of past daily precipitation observations or daily precipitation simulations (e.g. from climate models) for hydrological modeling at an hourly time step.
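
    A minimal sketch (not the authors' code) of the precipitation pre-processing described above: hourly values are aggregated to daily totals and each total is redistributed evenly over n hours. Where within the day those n hours are placed is not specified in the abstract; the sketch simply uses the start of each day.

```python
# Minimal sketch (not the authors' code): aggregate an hourly precipitation
# series to daily totals, then redistribute each daily total evenly over n
# hours, placed at the start of each day for illustration.
import numpy as np

def redistribute_daily(hourly, n_hours):
    """hourly: 1-D array (mm/h) with a length that is a multiple of 24."""
    days = hourly.reshape(-1, 24)
    daily_totals = days.sum(axis=1)
    out = np.zeros_like(days, dtype=float)
    out[:, :n_hours] = (daily_totals / n_hours)[:, None]
    return out.ravel()

rng = np.random.default_rng(0)
hourly = rng.gamma(shape=0.3, scale=2.0, size=3 * 24)   # toy hourly rainfall
for n in (1, 6, 12, 24):
    series = redistribute_daily(hourly, n)
    assert np.isclose(series.sum(), hourly.sum())       # totals preserved
    print(f"n = {n:2d} h  ->  max intensity = {series.max():.2f} mm/h")
```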

  4. Identification of the ideal clutter metric to predict time dependence of human visual search

    NASA Astrophysics Data System (ADS)

    Cartier, Joan F.; Hsu, David H.

    1995-05-01

    The Army Night Vision and Electronic Sensors Directorate (NVESD) has recently performed a human perception experiment in which eye tracker measurements were made on trained military observers searching for targets in infrared images. These data offered an important opportunity to evaluate a new technique for search modeling. Following the approach taken by Jeff Nicoll, this model treats search as a random walk in which the observers are in one of two states until they quit: they are either examining a point of interest or wandering around looking for one. When wandering they skip rapidly from point to point; when examining they move more slowly, reflecting the fact that target discrimination requires additional thought processes. In this paper we simulate the random walk, using a clutter metric to assign relative attractiveness to points of interest within the image which are competing for the observer's attention. The NVESD data indicate that a number of standard clutter metrics are good estimators of the apportionment of the observer's time between wandering and examining. Conversely, the apportionment of observer time spent wandering and examining could be used to reverse engineer the ideal clutter metric which would most perfectly describe the behavior of the group of observers. It may be possible to use this technique to design the optimal clutter metric to predict performance of visual search.
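
    A minimal sketch of the two-state search model described above: the observer alternates between fast "wandering" jumps and slower "examining" dwells, with a clutter-metric score used as the relative attractiveness weight when choosing the next point of interest. All numbers are illustrative, not values from the NVESD experiment.

```python
# Minimal sketch of a two-state search random walk: an observer alternates
# between "wandering" (fast jumps between points of interest) and "examining"
# (slower dwell), with a clutter-metric score used as the relative
# attractiveness weight for choosing the next point. All numbers are toy
# values, not data from the NVESD experiment.
import numpy as np

rng = np.random.default_rng(1)

clutter_score = np.array([0.2, 0.5, 1.0, 3.0, 0.8])  # attractiveness per POI
target_index = 3                                      # POI that is the target
p_examine = 0.4        # probability a visited POI is examined, not just passed
t_wander, t_examine = 0.3, 2.0                        # seconds per state (toy)

def simulate_search(max_time=60.0):
    """Return the time of first examination of the target, or max_time."""
    weights = clutter_score / clutter_score.sum()
    t = 0.0
    while t < max_time:
        poi = rng.choice(len(weights), p=weights)
        if rng.random() < p_examine:
            t += t_examine
            if poi == target_index:
                return t
        else:
            t += t_wander
    return max_time

times = [simulate_search() for _ in range(1000)]
print(f"mean time to find target: {np.mean(times):.1f} s")
```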

  5. Tethered Satellites as Enabling Platforms for an Operational Space Weather Monitoring System

    NASA Technical Reports Server (NTRS)

    Krause, L. Habash; Gilchrist, B. E.; Bilen, S.; Owens, J.; Voronka, N.; Furhop, K.

    2013-01-01

    Space weather nowcasting and forecasting models require assimilation of near-real time (NRT) space environment data to improve the precision and accuracy of operational products. Typically, these models begin with a climatological model to provide "most probable distributions" of environmental parameters as a function of time and space. The process of NRT data assimilation gently pulls the climate model closer toward the observed state (e.g. via Kalman smoothing) for nowcasting, and forecasting is achieved through a set of iterative physics-based forward-prediction calculations. The issue of required space weather observatories to meet the spatial and temporal requirements of these models is a complex one, and we do not address that with this poster. Instead, we present some examples of how tethered satellites can be used to address the shortfalls in our ability to measure critical environmental parameters necessary to drive these space weather models. Examples include very long baseline electric field measurements, magnetized ionospheric conductivity measurements, and the ability to separate temporal from spatial irregularities in environmental parameters. Tethered satellite functional requirements will be presented for each space weather parameter considered in this study.

  6. The temporal spectrum of adult mosquito population fluctuations: conceptual and modeling implications.

    PubMed

    Jian, Yun; Silvestri, Sonia; Brown, Jeff; Hickman, Rick; Marani, Marco

    2014-01-01

    An improved understanding of mosquito population dynamics under natural environmental forcing requires adequate field observations spanning the full range of temporal scales over which mosquito abundance fluctuates in natural conditions. Here we analyze a 9-year daily time series of uninterrupted observations of adult mosquito abundance for multiple mosquito species in North Carolina to identify characteristic scales of temporal variability, the processes generating them, and the representativeness of observations at different sampling resolutions. We focus in particular on Aedes vexans and Culiseta melanura and, using a combination of spectral analysis and modeling, we find significant population fluctuations with characteristic periodicity between 2 days and several years. Population dynamical modelling suggests that the observed fast fluctuation scales (2 days to weeks) are strongly affected by varying mosquito activity in response to rapid changes in meteorological conditions, a process neglected in most representations of mosquito population dynamics. We further suggest that the range of time scales over which adult mosquito population variability takes place can be divided into three main parts. At small time scales (indicatively 2 days to 1 month) observed population fluctuations are mainly driven by behavioral responses to rapid changes in weather conditions. At intermediate scales (1 to several months) environmentally-forced fluctuations in generation times, mortality rates, and density dependence determine the population's characteristic response times. At longer scales (annual to multi-annual) mosquito populations follow seasonal and inter-annual environmental changes. We conclude that observations of adult mosquito populations should be based on a sub-weekly sampling frequency and that predictive models of mosquito abundance must include behavioral dynamics to separate the effects of varying mosquito activity from actual changes in the abundance of the underlying population.
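
    The spectral-analysis step mentioned above can be illustrated with a periodogram of a daily abundance series. The synthetic series below (an annual cycle plus a fast, weather-like fluctuation plus noise) is purely illustrative.

```python
# Minimal sketch of the spectral-analysis step: a periodogram of a daily
# abundance series to identify characteristic time scales. The synthetic
# series (annual cycle + ~5-day weather-like cycle + noise) is illustrative.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(2)
days = np.arange(9 * 365)                       # ~9 years of daily samples
annual = 50 * (1 + np.sin(2 * np.pi * days / 365.25))
fast = 10 * np.sin(2 * np.pi * days / 5.0)      # fast, weather-like cycle
abundance = annual + fast + rng.normal(0, 5, size=days.size)

freqs, power = periodogram(abundance, fs=1.0)   # fs = 1 sample per day
top = np.argsort(power)[::-1][:3]               # three strongest peaks
for f, p in zip(freqs[top], power[top]):
    period = np.inf if f == 0 else 1.0 / f
    print(f"period ~ {period:7.1f} days  (power {p:.1f})")
```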

  7. Addressing Thermal Model Run Time Concerns of the Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA)

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Guerrero, Sergio; Hawk, John; Rodriguez, Juan; McDonald, Carson; Jackson, Cliff

    2016-01-01

    The Wide Field Infrared Survey Telescope using Astrophysics Focused Telescope Assets (WFIRST-AFTA) utilizes an existing 2.4 m diameter Hubble sized telescope donated from elsewhere in the federal government for near-infrared sky surveys and Exoplanet searches to answer crucial questions about the universe and dark energy. The WFIRST design continues to increase in maturity, detail, and complexity with each design cycle leading to a Mission Concept Review and entrance to the Mission Formulation Phase. Each cycle has required a Structural-Thermal-Optical-Performance (STOP) analysis to ensure the design can meet the stringent pointing and stability requirements. As such, the models have also grown in size and complexity leading to increased model run time. This paper addresses efforts to reduce the run time while still maintaining sufficient accuracy for STOP analyses. A technique was developed to identify slews between observing orientations that were sufficiently different to warrant recalculation of the environmental fluxes to reduce the total number of radiation calculation points. The inclusion of a cryocooler fluid loop in the model also forced smaller time-steps than desired, which greatly increases the overall run time. The analysis of this fluid model required mitigation to drive the run time down by solving portions of the model at different time scales. Lastly, investigations were made into the impact of the removal of small radiation couplings on run time and accuracy. Use of these techniques allowed the models to produce meaningful results within reasonable run times to meet project schedule deadlines.

  8. Three-dimensional lung tumor segmentation from x-ray computed tomography using sparse field active models.

    PubMed

    Awad, Joseph; Owrangi, Amir; Villemaire, Lauren; O'Riordan, Elaine; Parraga, Grace; Fenster, Aaron

    2012-02-01

    Manual segmentation of lung tumors is observer dependent and time-consuming but an important component of radiology and radiation oncology workflow. The objective of this study was to generate an automated lung tumor measurement tool for segmentation of pulmonary metastatic tumors from x-ray computed tomography (CT) images to improve reproducibility and decrease the time required to segment tumor boundaries. The authors developed an automated lung tumor segmentation algorithm for volumetric image analysis of chest CT images using shape constrained Otsu multithresholding (SCOMT) and sparse field active surface (SFAS) algorithms. The observer was required to select the tumor center and the SCOMT algorithm subsequently created an initial surface that was deformed using level set SFAS to minimize the total energy consisting of mean separation, edge, partial volume, rolling, distribution, background, shape, volume, smoothness, and curvature energies. The proposed segmentation algorithm was compared to manual segmentation whereby 21 tumors were evaluated using one-dimensional (1D) response evaluation criteria in solid tumors (RECIST), two-dimensional (2D) World Health Organization (WHO), and 3D volume measurements. Linear regression goodness-of-fit measures (r² = 0.63, p < 0.0001; r² = 0.87, p < 0.0001; and r² = 0.96, p < 0.0001), and Pearson correlation coefficients (r = 0.79, p < 0.0001; r = 0.93, p < 0.0001; and r = 0.98, p < 0.0001) for 1D, 2D, and 3D measurements, respectively, showed significant correlations between manual and algorithm results. Intra-observer intraclass correlation coefficients (ICC) demonstrated high reproducibility for algorithm (0.989-0.995, 0.996-0.997, and 0.999-0.999) and manual measurements (0.975-0.993, 0.985-0.993, and 0.980-0.992) for 1D, 2D, and 3D measurements, respectively. The intra-observer coefficient of variation (CV%) was low for algorithm (3.09%-4.67%, 4.85%-5.84%, and 5.65%-5.88%) and manual observers (4.20%-6.61%, 8.14%-9.57%, and 14.57%-21.61%) for 1D, 2D, and 3D measurements, respectively. The authors developed an automated segmentation algorithm requiring only that the operator select the tumor to measure pulmonary metastatic tumors in 1D, 2D, and 3D. Algorithm and manual measurements were significantly correlated. Since the algorithm segmentation involves selection of a single seed point, it resulted in reduced intra-observer variability and decreased time for making the measurements.
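
    The intra-observer coefficient of variation quoted above is the standard deviation of repeated measurements divided by their mean, expressed as a percentage. A minimal sketch with toy repeated volume measurements (not data from the study):

```python
# Minimal sketch: intra-observer coefficient of variation,
# CV% = 100 * SD / mean, computed across repeated measurements of the same
# tumor. The repeated volumes below are toy numbers, not data from the study.
import numpy as np

repeats_cm3 = np.array([12.1, 12.6, 11.9, 12.4])   # repeated 3D volumes (toy)
cv_percent = 100.0 * repeats_cm3.std(ddof=1) / repeats_cm3.mean()
print(f"CV% = {cv_percent:.2f}")
```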

  9. Time-dependent analysis of dosage delivery information for patient-controlled analgesia services.

    PubMed

    Kuo, I-Ting; Chang, Kuang-Yi; Juan, De-Fong; Hsu, Steen J; Chan, Chia-Tai; Tsou, Mei-Yung

    2018-01-01

    Pain relief is an essential part of perioperative care and plays an important role in medical quality improvement. Patient-controlled analgesia (PCA) is a method that allows a patient to self-administer small boluses of analgesic to relieve subjective pain. PCA logs from the infusion pump consist of many text messages that record all events during therapy. Dosage information can be extracted from PCA logs to provide easily understood features. Analyzing this dosage information over time is of great help in characterizing changes in a patient's pain relief condition. To explore the trend of pain relief requirements, we developed a PCA dosage information generator (PCA DIG) to extract meaningful messages from PCA logs during the first 48 hours of therapy. PCA dosage information, including consumption, delivery, infusion rate, and the ratio between demand and delivery, is presented with corresponding values in 4 successive time frames. Time-dependent statistical analysis demonstrated that analgesia requirements decreased gradually over time. These findings are compatible with clinical observations and further provide valuable information about strategies to customize postoperative pain management.

  10. Space Shuttle booster thrust imbalance analysis

    NASA Technical Reports Server (NTRS)

    Bailey, W. R.; Blackwell, D. L.

    1985-01-01

    An analysis of the Shuttle SRM thrust imbalance during the steady-state and tailoff portions of the boost phase of flight is presented. Results from flights STS-1 through STS-13 are included. A statistical analysis of the observed thrust imbalance data is presented. A 3 sigma thrust imbalance history versus time was generated from the observed data and is compared to the vehicle design requirements. The effect on Shuttle thrust imbalance from the use of replacement SRM segments is predicted. Comparisons of observed thrust imbalances with respect to predicted imbalances are presented for the two Space Shuttle flights which used replacement aft segments (STS-9 and STS-13).

  11. Suprathermal protons in the interplanetary solar wind

    NASA Technical Reports Server (NTRS)

    Goodrich, C. C.; Lazarus, A. J.

    1976-01-01

    Using the Mariner 5 solar wind plasma and magnetic field data, we present observations of field-aligned suprathermal proton velocity distributions having pronounced high-energy shoulders. These observations, similar to the interpenetrating stream observations of Feldman et al. (1974), are clear evidence that such proton distributions are interplanetary phenomena rather than ones associated with the bow shock. A large Alfven speed is found to be a requirement for the occurrence of suprathermal proton distributions; further, we find the proportion of particles in the shoulder to be limited by the magnitude of the Alfven speed. It is suggested that this last result could indicate that the proton thermal anisotropy is at times limited by wave-particle interactions.

  12. The mass of (1) Ceres from perturbations on (348) May

    NASA Technical Reports Server (NTRS)

    Williams, Gareth V.

    1992-01-01

    The most promising ground-based technique for determining the mass of a minor planet is the observation of the perturbations it induces in the motion of another minor planet. This method requires careful observation of both minor planets over extended periods of time. The mass of (1) Ceres has been determined from the perturbations on (348) May, which made three close approaches to Ceres at intervals of 46 years between 1891 and 1984. The motion of May is clearly influenced by Ceres, and by using different test masses for Ceres, a search was made to determine the mass of Ceres that minimizes the residuals in the observations of May.
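
    A minimal sketch (not the original computation) of the test-mass search described above: trial Ceres masses are scanned and the mass that minimizes the RMS of the observed-minus-computed residuals of (348) May is kept. The placeholder predict_positions stands in for a full numerical ephemeris integration, and all numbers are toy values.

```python
# Minimal sketch (not the original computation): scan a grid of trial masses
# for Ceres, compute observed-minus-computed residuals for each, and keep the
# mass that minimizes the RMS residual. `predict_positions` is a hypothetical
# stand-in for a full numerical ephemeris integration of (348) May.
import numpy as np

def predict_positions(trial_mass, times):
    """Placeholder ephemeris: perturbation signal grows with the assumed mass."""
    reference_mass = 4.7e-10            # toy reference value, solar masses
    return np.sin(times) * trial_mass / reference_mass

times = np.linspace(0.0, 10.0, 200)
observed = (predict_positions(4.7e-10, times)
            + np.random.default_rng(3).normal(0, 0.02, times.size))

trial_masses = np.linspace(3.0e-10, 6.0e-10, 61)
rms = [np.sqrt(np.mean((observed - predict_positions(m, times)) ** 2))
       for m in trial_masses]
best = trial_masses[int(np.argmin(rms))]
print(f"best-fit mass: {best:.2e} solar masses")
```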

  13. Observations of CO above Venus cloud top near 4.53 μm

    NASA Astrophysics Data System (ADS)

    Marcq, E.; Encrenaz, T.; Widemann, T.; Bertaux, J. L.

    2013-09-01

    Venus' cloud top region exhibits a higher level of variability both in space and time than previously thought. The interplay between photochemistry, dynamics and cloud microphysics requires more observational constraints in order to be fully grasped. Recent observations of sulfur dioxide (SO2) variability [2, 8, 7, 9] have evidenced short-term, long-term and latitudinal variability whose origin remains mysterious (volcanogenic emissions? dynamical variability?). A better knowledge of the variability of other minor species would be highly welcome in this context. Carbon monoxide (CO), whose pattern of sinks and sources is opposite to that of SO2, is a prime candidate.

  14. Observations of dusty plasmas with magnetized dust grains

    NASA Astrophysics Data System (ADS)

    Luo, Q.-Z.; D'Angelo, N.

    2000-11-01

    We report a newly observed phenomenon in a dusty plasma device of the Q-machine type. At low plasma densities the time required by the plasma to return to its no-dust conditions, after the dust dispenser is turned off, can be as long as many tens of seconds or longer. A tentative interpretation of this observation in terms of magnetized dust grains is advanced. It appears that an important loss mechanism of fine dust grains is by ion drag along the magnetic field lines. The effect of ion drag is somewhat counteracted by the -µ∇B force present when the magnetic field has a mirror geometry.

  15. Earth Science System of the Future: Observing, Processing, and Delivering Data Products Directly to Users

    NASA Technical Reports Server (NTRS)

    Crisp, David; Komar, George (Technical Monitor)

    2001-01-01

    Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.

  16. Fast Simulation of the Impact Parameter Calculation of Electrons through Pair Production

    NASA Astrophysics Data System (ADS)

    Bang, Hyesun; Kweon, MinJung; Huh, Kyoung Bum; Pachmayer, Yvonne

    2018-05-01

    A fast simulation method is introduced that tremendously reduces the time required for the impact parameter calculation, a key observable in physics analyses of high energy physics experiments and in detector optimisation studies. The impact parameter of electrons produced through pair production was calculated, considering the key related processes, using the Bethe-Heitler formula, the Tsai formula and a simple geometric model. The calculations were performed under various conditions and the results were compared with those from full GEANT4 simulations. The computation time using this fast simulation method is 10⁴ times shorter than that of the full GEANT4 simulation.
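
    The geometric part of such a calculation can be sketched as follows, under the simplifying assumptions of straight-line tracks and a primary vertex at the origin; this is not the paper's actual model, which combines the Bethe-Heitler and Tsai formulas with its own geometric treatment.

```python
# Minimal geometric sketch (not the paper's model): for a straight-line
# electron track starting at the photon-conversion point `r_conv` with
# momentum direction `p_dir`, the impact parameter with respect to a primary
# vertex at the origin is the distance of closest approach of that line,
# |r_conv - (r_conv . p_hat) p_hat|. Magnetic-field curvature is neglected.
import numpy as np

def impact_parameter(r_conv, p_dir):
    r_conv = np.asarray(r_conv, dtype=float)
    p_hat = np.asarray(p_dir, dtype=float)
    p_hat = p_hat / np.linalg.norm(p_hat)
    closest = r_conv - np.dot(r_conv, p_hat) * p_hat
    return np.linalg.norm(closest)

# Toy example: conversion 5 cm from the beamline, track slightly non-radial.
print(f"{impact_parameter([5.0, 0.0, 0.0], [0.98, 0.2, 0.0]):.3f} cm")
```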

  17. Time and Frequency Synchronization on the Virac Radio Telescope RT-32

    NASA Astrophysics Data System (ADS)

    Bezrukovs, V.

    2016-04-01

    One of the main research directions of the Ventspils International Radio Astronomy Centre (VIRAC) is radio astronomy and astrophysics. The instrumental base of the centre comprises two fully steerable parabolic antennas, RT-16 and RT-32 (with mirror diameters of 16 m and 32 m, respectively). After a long reconstruction, radio telescope RT-32 is currently equipped with receiving and data acquisition systems that allow observing in a wide frequency range from 327 MHz to 9 GHz. A new Antenna Control Unit (ACU) allows stable, fast and precise pointing of the antenna. The time and frequency distribution service provides 5, 10 and 100 MHz reference frequencies, 1PPS signals and precise time stamps via the NTP protocol and in the IRIG-B format over coaxial cable. For radio astronomical observations, and especially for Very Long Baseline Interferometry (VLBI), the main requirement for the observatory is precise synchronization of the received and sampled data and their linkage to exact time stamps. During October 2015, the performance of radio telescope RT-32 was tested in several successful VLBI experiments. The obtained results confirm the efficiency of the chosen methods of synchronization and the ability to reproduce them on similar antennas.

  18. Deepwater Horizon - Estimating surface oil volume distribution in real time

    NASA Astrophysics Data System (ADS)

    Lehr, B.; Simecek-Beatty, D.; Leifer, I.

    2011-12-01

    Spill responders to the Deepwater Horizon (DWH) oil spill required both the relative spatial distribution and the total volume of the surface oil. The former was needed on a daily basis to plan and direct local surface recovery and treatment operations. The latter was needed less frequently to provide information for strategic response planning. Unfortunately, the standard spill observation methods were inadequate for an oil spill of this size, and new, experimental methods were not ready to meet the operational demands of near real-time results. Traditional surface oil estimation tools for large spills include satellite-based sensors to define the spatial extent (but not thickness) of the oil, complemented with trained observers in small aircraft, sometimes supplemented by active or passive remote sensing equipment, to determine the surface percent coverage of the 'thick' part of the slick, where the vast majority of the surface oil exists. These tools were also applied to DWH in the early days of the spill, but the sheer size of the spill prevented synoptic coverage of the surface slick by small aircraft. Also, satellite images of the spill, while large in number, varied considerably in image quality, requiring skilled interpretation to identify oil and eliminate false positives. Qualified staff to perform this task were soon in short supply. However, large spills are often events that overcome organizational inertia to the use of new technology. Two prime examples in DWH were the application of hyper-spectral scans from a high-altitude aircraft and of multi-spectral scans from more traditional fixed-wing aircraft, processed by a neural network, to determine absolute and relative oil thickness, respectively. But with new technology come new challenges. The hyper-spectral instrument required special viewing conditions that were not present on a daily basis, as well as analysis infrastructure to process the data that was not available at the command post. Very few days provided sufficient observation quality and spatial coverage. Future application of this method will require solving both the observational and analysis challenges demonstrated at DWH. Similarly, the multi-spectral scanner results could only be interpreted by a handful of individuals, causing some logistical problems in incorporating the observational results into incident command decisions. This roadblock may go away as the spill response community becomes more familiar with the technology.

  19. Timing Noise in PSR 1821-24 : a Micro-Glitch Observed in a Recycled Millisecond Pulsar

    NASA Astrophysics Data System (ADS)

    Cognard, I.; Backer, D. C.

    2005-07-01

    We report the observation of a very small glitch observed for the first time in a millisecond pulsar, PSR B1821-24, located in the globular cluster M28. Timing observations were mainly conducted with the Nançay radiotelescope, and confirmation comes from the 140-ft Green Bank telescope data. The event is characterized by a rotation frequency step of 3 nHz, or 10^-11 in fractional frequency. Timing residuals of B1821-24 at Nançay and Green Bank, obtained with a set of pulsar parameters adjusted up to February 2001, reveal the event; after March 2001, TOAs from the original set of parameters are compared with TOAs obtained with a ΔP/P change of 10^-11, together with the evolution of the PSR B1821-24 rotational frequency. This glitch follows the main characteristics of those in the slow-period pulsars, but is two orders of magnitude smaller than the smallest ever recorded. Such an event must be very rare in millisecond pulsars, since no other glitches have been detected even though the cumulative number of years of millisecond pulsar timing observations up to 2001 is around 500 for all these objects. We should, however, keep in mind that PSR B1821-24 is one of the youngest among the old recycled pulsars. While this event happens on a much smaller scale, the required adjustment of the star to a new equilibrium figure as it spins down is a likely common cause for all glitches.
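    For orientation, the quoted numbers are mutually consistent; a minimal sketch of the arithmetic, assuming the catalogued spin frequency of PSR B1821-24 (roughly 327 Hz, a value taken from pulsar catalogues rather than from this abstract), is given below.

```python
# Back-of-the-envelope check (not the paper's analysis code): a 3 nHz step on
# a ~327 Hz millisecond pulsar corresponds to a ~1e-11 fractional frequency
# change. The 327 Hz spin frequency is an assumed catalogue value.
spin_freq_hz = 327.4          # assumed spin frequency of PSR B1821-24
glitch_step_hz = 3e-9         # reported frequency step

fractional_change = glitch_step_hz / spin_freq_hz
print(f"Fractional frequency change: {fractional_change:.1e}")  # ~1e-11
```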

  20. Operating a wide-area remote observing system for the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    Wirth, Gregory D.; Kibrick, Robert I.; Goodrich, Robert W.; Lyke, James E.

    2008-07-01

    For over a decade, the W. M. Keck Observatory's two 10-meter telescopes have been operated remotely from its Waimea headquarters. Over the last 6 years, WMKO remote observing has expanded to allow teams at dedicated sites in California to observe either in collaboration with colleagues in Waimea or entirely from the U.S. mainland. Once an experimental effort, the Observatory's mainland observing capability is now fully operational, supported on all science instruments (except the interferometer) and regularly used by astronomers at eight mainland sites. Establishing a convenient and secure observing capability from those sites required careful planning to ensure that they are properly equipped and configured. It also entailed a significant investment in hardware and software, including both custom scripts to simplify launching the instrument interface at remote sites and automated routers employing ISDN backup lines to ensure continuation of observing during Internet outages. Observers often wait until shortly before their runs to request use of the mainland facilities. Scheduling these requests and ensuring proper system operation prior to observing requires close coordination between personnel at WMKO and the mainland sites. An established protocol for approving requests and carrying out pre-run checkout has proven useful in ensuring success. The Observatory anticipates enhancing and expanding its remote observing system. Future plans include deploying dedicated summit computers for running VNC server software, implementing a web-based tracking system for mainland-based observing requests, expanding the system to additional mainland sites, and converting to full-time VNC operation for all instruments.

  1. Modelling nanoflares in active regions and implications for coronal heating mechanisms

    PubMed Central

    Cargill, P. J.; Warren, H. P.; Bradshaw, S. J.

    2015-01-01

    Recent observations from the Hinode and Solar Dynamics Observatory spacecraft have provided major advances in understanding the heating of solar active regions (ARs). For ARs comprising many magnetic strands or sub-loops heated by small, impulsive events (nanoflares), it is suggested that (i) the time between individual nanoflares in a magnetic strand is 500–2000 s, (ii) a weak ‘hot’ component (more than 10^6.6 K) is present, and (iii) nanoflare energies may be as low as a few 10^23 erg. These imply small heating events in a stressed coronal magnetic field, where the time between individual nanoflares on a strand is of order the cooling time. Modelling suggests that the observed properties are incompatible with nanoflare models that require long energy build-up (over tens of thousands of seconds) and with steady heating. PMID:25897093

  2. Proper motion and secular variations of Keplerian orbital elements

    NASA Astrophysics Data System (ADS)

    Butkevich, Alexey G.

    2018-05-01

    High-precision observations require accurate modelling of secular changes in the orbital elements in order to extrapolate measurements over long time intervals, and to detect deviation from pure Keplerian motion caused, for example, by other bodies or relativistic effects. We consider the evolution of the Keplerian elements resulting from the gradual change of the apparent orbit orientation due to proper motion. We present rigorous formulae for the transformation of the orbit inclination, longitude of the ascending node and argument of the pericenter from one epoch to another, assuming uniform stellar motion and taking radial velocity into account. An approximate treatment, accurate to the second-order terms in time, is also given. The proper motion effects may be significant for long-period transiting planets. These theoretical results are applicable to the modelling of planetary transits and precise Doppler measurements as well as analysis of pulsar and eclipsing binary timing observations.

  3. Near real time observational data collection for SPRUCE experiment- PakBus protocol for slow satellite connections

    NASA Astrophysics Data System (ADS)

    Krassovski, Misha; Hanson, Paul; Riggs, Jeff

    2017-04-01

    Climate change studies are among the most important areas of modern science, and the related experiments are becoming larger and more complex. One such experiment is the Spruce and Peatland Responses Under Climatic and Environmental Change experiment (SPRUCE, http://mnspruce.ornl.gov), conducted in northern Minnesota, 40 km north of Grand Rapids, in the USDA Forest Service Marcell Experimental Forest (MEF). The SPRUCE experimental mission is to assess ecosystem-level biological responses of vulnerable, high-carbon terrestrial ecosystems to a range of climate warming manipulations and an elevated CO2 atmosphere. This manipulation experiment generates a large volume of observational data and requires a reliable onsite data collection system, dependable methods to transfer data to a robust scientific facility, and real-time monitoring capabilities. This publication shares our experience of establishing a near real-time data collection and monitoring system via a satellite link using the PakBus protocol.

  4. Tools and Data Services from the NASA Earth Satellite Observations for Remote Sensing Commercial Applications

    NASA Technical Reports Server (NTRS)

    Vicente, Gilberto

    2005-01-01

    Several commercial applications of remote sensing data, such as water resources management, environmental monitoring, climate prediction, agriculture, forestry, and preparation for and mitigation of extreme weather events, require access to vast amounts of archived high-quality data, software tools, and services for data manipulation and information extraction. These, in turn, require a detailed understanding of the data's internal structure and the physical implementation of data reduction, combination, and data product production. This time-consuming task must be undertaken before the core investigation can begin and is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets of different formats, structures, and resolutions.

  5. Evaluation of the Horizontal and Vertical Accuracy of GNSS Survey Observations from a Real-Time Network

    NASA Astrophysics Data System (ADS)

    Allahyari, M.; Olsen, M. J.; Gillins, D. T.; Dennis, M. L.

    2016-12-01

    Many current surveying standards in the United States require several long-duration, static Global Navigation Satellite System (GNSS) observations to derive high-accuracy geodetic coordinates. However, over the past decade, many entities have established real-time GNSS networks (RTNs), which could reduce the field time for establishing geodetic control from hours to minutes. To evaluate the accuracy of RTN GNSS observations, data collected from two National Geodetic Survey (NGS) surveys in South Carolina and Oregon were studied. The objectives were to: 1) determine the accuracy of a real-time observation as a function of duration; 2) examine the influence of including GLONASS (Russia's version of GPS); 3) compare results using a single base to the full RTN network solution; and 4) assess the effect of baseline length on accuracy. In South Carolina, 360 observations ranging from 5 to 600 seconds were collected on 20 passive marks using RTN and single-base solutions, both with GPS+GLONASS and GPS-only. In Oregon, 18 passive marks were observed from 5 to 900 seconds using GPS-only with the RTN, and with GPS+GLONASS and GPS-only from a single-base. To develop "truth" coordinates, at least 30 hours of static GPS data were also collected on all marks. Each static survey session was post-processed in OPUS-Projects, and the resulting vectors were used to build survey networks that were least-squares adjusted using the NGS software ADJUST. The resulting coordinates provided the basis for evaluating the accuracy of the real-time observations. Results from this study indicate great potential in the use of RTNs for accurate derivation of geodetic coordinates. Both case studies showed an optimal observation duration of 180 seconds. RTN data tended to be more accurate and consistent than single-base data, and GLONASS slightly improved accuracy. A key benefit of GLONASS was the ability to obtain more fixed solutions at longer baseline lengths than single-base solutions.
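    A minimal sketch of the kind of accuracy assessment described (not the NGS workflow itself) is shown below; the east/north/up offsets from the adjusted "truth" coordinates are hypothetical.

```python
# Illustrative sketch: compare short-duration RTN coordinates against
# long-session "truth" coordinates to obtain horizontal and vertical errors.
# The offsets below are hypothetical local east/north/up differences in metres.
import numpy as np

d_east  = np.array([0.012, -0.008, 0.021, 0.005])
d_north = np.array([-0.006, 0.014, -0.010, 0.009])
d_up    = np.array([0.031, -0.025, 0.018, -0.040])

horizontal_err = np.hypot(d_east, d_north)          # 2D error per observation
rms_horizontal = np.sqrt(np.mean(horizontal_err**2))
rms_vertical   = np.sqrt(np.mean(d_up**2))

print(f"RMS horizontal error: {rms_horizontal*100:.1f} cm")
print(f"RMS vertical error:   {rms_vertical*100:.1f} cm")
```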

  6. Energetic particle penetrations into the inner magnetosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ejiri, M.; Hoffman, R.A.; Smith, P.H.

    Data from Explorer 45 (S³-A) instruments have revealed characteristics of magnetospheric storm or substorm time energetic particle enhancements in the inner magnetosphere (L ≲ 5). The properties of the ion 'nose' structure in the dusk hemisphere are examined in detail. A statistical study of the local time dependence of noses places the highest probability of occurrence around 2000 MLT, but they can be observed even near the noon meridian. It also appears that most noses are not isolated events but will appear on successive passes. A geoelectric field enhancement corresponding to a minimum AE value of about 205 γ seems to be required to convect the particles within the apogee of Explorer 45. The dynamical behavior of the nose characteristics observed along successive orbits is then explained quantitatively by time-dependent convection theory in a Volland-Stern type geoelectric field (γ = 2). These calculations of adiabatic charged particle motions are also applied to explain the energy spectra and dispersion in penetration distances for both electrons and ions observed in the postmidnight to morning hours. Finally, useful descriptions are given of the dispersion properties of particles penetrating the inner magnetosphere at all local times as a function of time after a sudden enhancement of the geoelectric field.

  7. Can CCTV identify people in public transit stations who are at risk of attempting suicide? An analysis of CCTV video recordings of attempters and a comparative investigation.

    PubMed

    Mishara, Brian L; Bardon, Cécile; Dupont, Serge

    2016-12-15

    Suicides occur in all public transit systems that do not completely impede access to tracks. We conducted two studies to determine whether people at risk of suicide can be reliably identified in stations in order to intervene in a timely manner. The first study analysed all CCTV recordings of suicide attempters in Montreal underground stations over 2 years to identify behaviours indicating suicide risk. The second study verified the potential of using those behaviours to discriminate attempters from other passengers in real time. First study: trained observers watched CCTV video recordings of 60 attempters, with 2-3 independent observers coding seven easily observable behaviours and five behaviours requiring interpretation (e.g. "strange behaviour," "anxious behaviour"). Second study: we randomly mixed 63 five-minute CCTV recordings taken before an attempt with 56 recordings from the same cameras at the same time of day and day of week, but when no suicide attempt was to occur. After only 10 min of instruction, thirty-three undergraduate students watched the recordings and indicated whether they observed each of 13 behaviours identified in the first study. First study: fifty (83%) of attempters showed easily observable behaviours potentially indicative of an impending attempt, and 37 (61%) showed two or more of these behaviours. Forty-five (75%) showed at least one behaviour requiring interpretation. Twenty-two witnesses attempted to intervene to stop the attempt, and 75% of attempters showed behaviours indicating possible ambivalence (e.g. waiting for several trains to pass; trying to get out of the path of the train). Second study: two behaviours, leaving an object on the platform and pacing back and forth from the yellow line (just before the edge of the platform), could identify 24% of attempters with no false positives. The other target behaviours were also present in non-attempters; however, having two or more of these behaviours indicated a likelihood of being at risk of attempting suicide. We conclude that real-time observation of CCTV monitors, automated computer monitoring of CCTV signals, and/or training of drivers and transit personnel on behavioural indications of suicide risk may identify attempters with few false positives and potentially save lives.

  8. First Space VLBI Observations and Images Using the VLBA and VSOP

    NASA Astrophysics Data System (ADS)

    Romney, J. D.; Benson, J. M.; Claussen, M. J.; Desai, K. M.; Flatters, C.; Mioduszewski, A. J.; Ulvestad, J. S.

    1997-12-01

    The National Radio Astronomy Observatory (NRAO) is a participant in the VSOP Space VLBI mission, an international collaboration led by Japan's Institute of Space and Astronautical Science. NRAO has committed up to 30% of scheduled observing time on the Very Long Baseline Array (VLBA), and corresponding correlation resources, to Space VLBI observations. The NRAO Space VLBI Project, funded by NASA, has been working for several years to complete the necessary enhancements to the VLBA correlator and the AIPS image processing system. These developments were completed by the time of the successful launch of the VSOP mission's Halca spacecraft on 1997 February 12. As part of the in-orbit checkout phase, the first Space VLBI fringes from a VLBA observation were detected on 1997 June 12, and the VSOP mission's first images, in both the 1.6- and 5-GHz bands, were obtained shortly thereafter. In-orbit test observations continued through early September, with the first General Observing Time (GOT) scientific observations beginning in July. Through mid-October, a total of 20 Space VLBI observations, comprising 190 hours, had been completed at the VLBA correlator. This paper reviews the unique features of correlation and imaging of Space VLBI observations. These include, for correlation, the ephemeris for an orbiting VLBI "station" which is not fixed on the surface of the earth, and the requirement to close the loop on the phase-transfer process from a frequency standard on the ground to the spacecraft. Images from a number of early tests and scientific observations are presented. NRAO's user-support program, providing expert assistance in data analysis to Space VLBI observers, is also described.

  9. SELF-CALIBRATION OF COSMIC MICROWAVE BACKGROUND POLARIZATION EXPERIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Brian G.; Yadav, Amit P. S.; Shimon, Meir

    2013-01-10

    Precision measurements of the polarization of the cosmic microwave background (CMB) radiation, especially experiments seeking to detect the odd-parity 'B-modes', have far-reaching implications for cosmology. To detect the B-modes generated during inflation, the flux response and polarization angle of these experiments must be calibrated to exquisite precision. While suitable flux calibration sources abound, polarization angle calibrators are deficient in many respects. Man-made polarized sources are often not located in the antenna's far-field, have spectral properties that are radically different from the CMB's, are cumbersome to implement, and may be inherently unstable over the (long) duration these searches require to detect the faint signature of the inflationary epoch. Astrophysical sources suffer from time, frequency, and spatial variability, are not visible from all CMB observatories, and none are understood with sufficient accuracy to calibrate future CMB polarimeters seeking to probe inflationary energy scales of 10^15 GeV. Both man-made and astrophysical sources require dedicated observations which detract from the amount of integration time usable for detection of the inflationary B-modes. CMB TB and EB modes, expected to identically vanish in the standard cosmological model, can be used to calibrate CMB polarimeters. By enforcing the observed EB and TB power spectra to be consistent with zero, CMB polarimeters can be calibrated to levels not possible with man-made or astrophysical sources. All of this can be accomplished for any polarimeter without any loss of observing time using a calibration source which is spectrally identical to the CMB B-modes.
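    A hedged sketch of the standard EB self-calibration relation (the general idea, not the authors' pipeline) is given below: a uniform miscalibration of the polarization angle by ψ produces a spurious EB cross-spectrum proportional to sin(4ψ)(C_ℓ^EE − C_ℓ^BB), so ψ can be estimated by requiring the observed EB spectrum to vanish. The band-power values in the example are arbitrary.

```python
# Hedged sketch of EB self-calibration (illustrative, not the authors' code):
# under a polarization-angle offset psi, the observed cross-spectrum is
#   C_ell^{EB,obs} = 0.5 * sin(4*psi) * (C_ell^{EE} - C_ell^{BB}),
# so psi can be solved for per multipole (or band power).
import numpy as np

def estimate_miscalibration_angle(c_eb_obs, c_ee, c_bb):
    """Estimate the miscalibration angle psi (radians) from band powers."""
    ratio = 2.0 * c_eb_obs / (c_ee - c_bb)
    return 0.25 * np.arcsin(np.clip(ratio, -1.0, 1.0))

# hypothetical band-power values (arbitrary units)
c_ee, c_bb, c_eb_obs = 40.0, 0.5, 1.2
psi = estimate_miscalibration_angle(c_eb_obs, c_ee, c_bb)
print(f"Estimated polarization-angle offset: {np.degrees(psi):.2f} deg")
```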

  10. Administrative complexities for a European observational study despite directives harmonising requirements.

    PubMed

    Gülmez, Sinem Ezgi; Lignot-Maleyran, Séverine; de Vries, Corinne S; Sturkenboom, Miriam; Micon, Sophie; Hamoud, Fatima; Blin, Patrick; Moore, Nicholas

    2012-08-01

    For pharmacoepidemiological studies in Europe, accessing data should require only authorisation by the relevant data protection committees, as expected from the 1995 Data Protection Directive (95/46/EC). Our experience from a multinational observational study across seven European countries shows that this is certainly not the case. The study was a multicentre, multinational, case-population study in European liver transplant centres in seven countries, retrospectively evaluating a 3-year period. Before data collection started, the procedures to obtain the necessary authorisations for the participating countries were defined. In France, a single opinion from a single data protection committee was enough to start the study. In Italy, Portugal, Greece and the UK, there was a national authority, but the hospitals requested approval by their local committees/bodies irrespective of whether the authorisation of the national committee came after or before that of the local ones. In Ireland, only one hospital participated, and the opinion of its ethics committee was sufficient. In the Netherlands, the opinion of the institutional review board of the local coordinating centre was necessary to obtain the opinions from the institutional review boards of the other hospitals. The information requested by the different committees and the time to obtain the approvals varied, even within the same country. This degree of complexity and disharmony, and the resulting cost, was observed in a simple retrospective study. Regulators will need to be aware that these time-consuming, expensive and useless complexities must be factored in when estimating the time and cost of a study. Copyright © 2012 John Wiley & Sons, Ltd.

  11. Unsolved problems in observational astronomy. II. Focus on rapid response - mining the sky with "thinking" telescopes

    NASA Astrophysics Data System (ADS)

    Vestrand, W. T.; Theiler, J.; Wozniak, P. R.

    2004-10-01

    The existence of rapidly slewing robotic telescopes and fast alert distribution via the Internet is revolutionizing our capability to study the physics of fast astrophysical transients. But the salient challenge that optical time domain surveys must conquer is mining the torrent of data to recognize important transients in a scene full of normal variations. Humans simply do not have the attention span, memory, or reaction time required to recognize fast transients and rapidly respond. Autonomous robotic instrumentation with the ability to extract pertinent information from the data stream in real time will therefore be essential for recognizing transients and commanding rapid follow-up observations while the ephemeral behavior is still present. Here we discuss how the development and integration of three technologies: (1) robotic telescope networks; (2) machine learning; and (3) advanced database technology, can enable the construction of smart robotic telescopes, which we loosely call "thinking" telescopes, capable of mining the sky in real time.

  12. Merging climate and multi-sensor time-series data in real-time drought monitoring across the U.S.A.

    USGS Publications Warehouse

    Brown, Jesslyn F.; Miura, T.; Wardlow, B.; Gu, Yingxin

    2011-01-01

    Droughts occur repeatedly in the United States, resulting in billions of dollars of damage. Monitoring and reporting on drought conditions is a necessary function of government agencies at multiple levels. A team of Federal and university partners developed a drought decision-support tool with higher spatial resolution relative to traditional climate-based drought maps. The Vegetation Drought Response Index (VegDRI) indicates general canopy vegetation condition through assimilation of climate, satellite, and biophysical data via geospatial modeling. In VegDRI, complementary drought-related data are merged to provide a comprehensive, detailed representation of drought stress on vegetation. Time-series data from daily polar-orbiting earth observing systems [Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS)] providing global measurements of land surface conditions are ingested into VegDRI. Inter-sensor compatibility is required to extend multi-sensor data records; thus, translations were developed using overlapping observations to create consistent, long-term data time series.
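    As an illustration of the inter-sensor translation idea (not the published VegDRI procedure), the sketch below fits a simple linear translation between two sensors' vegetation-index records over an overlap period and applies it to a single-sensor portion of the record; the NDVI values are hypothetical.

```python
# Illustrative sketch: derive a linear translation between two sensors'
# vegetation-index records from an overlap period, then apply it to extend
# the time series. All values are hypothetical NDVI samples.
import numpy as np

# hypothetical NDVI observed by both sensors over the same sites/dates
ndvi_avhrr = np.array([0.21, 0.35, 0.48, 0.55, 0.62, 0.70])
ndvi_modis = np.array([0.19, 0.33, 0.47, 0.56, 0.64, 0.73])

# least-squares linear translation: MODIS-like value = a * AVHRR + b
a, b = np.polyfit(ndvi_avhrr, ndvi_modis, deg=1)
print(f"translation: modis ~ {a:.3f} * avhrr + {b:.3f}")

# apply the translation to an AVHRR-only portion of the record
ndvi_avhrr_only = np.array([0.30, 0.44, 0.58])
ndvi_translated = a * ndvi_avhrr_only + b
print("translated values:", np.round(ndvi_translated, 3))
```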

  13. The Serendip II design. [narrowband astronautical radio signal search for extraterrestrial intelligence

    NASA Technical Reports Server (NTRS)

    Werthimer, D.; Tarter, J.; Bowyer, S.

    1985-01-01

    Serendip II is an automated system designed to perform a real-time search for narrow-band radio signals in the spectra of sources in a regularly scheduled, non-SETI, astronomical observing program. Because Serendip II is expected to run continuously without requiring dedicated observing time, it is hoped that a large portion of the sky will be surveyed at high sensitivity and low cost. Serendip II will compute the power spectrum using a 65,536-channel fast Fourier transform processor with a real-time bandwidth of 128 kHz and 2 Hz per channel resolution. After searching for peaks in a 100 kHz portion of the radio telescope's IF band, Serendip II will move to the next 100 kHz portion using a programmable frequency synthesizer; when the whole IF band has been scanned, the process will start again. Unidentified peaks in the power spectra are candidates for further study, and their celestial coordinates will be recorded along with the time and power, IF and RF frequency, and bandwidth of the peak.
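    The sketch below illustrates, purely as a software analogue, the kind of narrow-band peak search such a spectrometer performs (it is not the Serendip II design itself): a block of baseband samples is Fourier transformed into roughly 2 Hz channels and channels well above the noise floor are flagged.

```python
# Illustrative narrow-band peak search (software analogue, not the Serendip II
# hardware): FFT a block of samples and flag channels far above the noise floor.
import numpy as np

rng = np.random.default_rng(0)
fs = 128_000                       # assumed 128 kHz sampled band
n = 65_536                         # 65,536-point transform -> ~2 Hz/channel
t = np.arange(n) / fs

# hypothetical data: noise plus a weak narrow-band tone near 31.4 kHz
x = rng.normal(size=n) + 0.2 * np.sin(2 * np.pi * 31_415 * t)

spectrum = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(n, d=1 / fs)

threshold = 25 * np.median(spectrum)        # crude detection threshold
hits = np.flatnonzero(spectrum > threshold)
for k in hits:
    print(f"candidate at {freqs[k]:.1f} Hz, power {spectrum[k]:.1f}")
```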

  14. Instrumental requirements for the detection of electron beam-induced object excitations at the single atom level in high-resolution transmission electron microscopy.

    PubMed

    Kisielowski, C; Specht, P; Gygax, S M; Barton, B; Calderon, H A; Kang, J H; Cieslinski, R

    2015-01-01

    This contribution touches on the essential requirements for instrument stability and resolution that allow operating advanced electron microscopes at the edge of technological capability. They enable the detection of single atoms and their dynamic behavior on a length scale of picometers in real time. It is understood that the observed atom dynamics are intimately linked to the relaxation and thermalization of electron beam-induced sample excitations. The resulting contrast fluctuations are beam-current dependent and contribute largely to a contrast mismatch between experiments and theory if not considered. If explored, they open the possibility to study the functional behavior of nanocrystals and single molecules at the atomic level in real time. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Inferring phase equations from multivariate time series.

    PubMed

    Tokuda, Isao T; Jain, Swati; Kiss, István Z; Hudson, John L

    2007-08-10

    An approach is presented for extracting phase equations from multivariate time series data recorded from a network of weakly coupled limit cycle oscillators. Our aim is to estimate important properties of the phase equations including natural frequencies and interaction functions between the oscillators. Our approach requires the measurement of an experimental observable of the oscillators; in contrast with previous methods it does not require measurements in isolated single or two-oscillator setups. This noninvasive technique can be advantageous in biological systems, where extraction of few oscillators may be a difficult task. The method is most efficient when data are taken from the nonsynchronized regime. Applicability to experimental systems is demonstrated by using a network of electrochemical oscillators; the obtained phase model is utilized to predict the synchronization diagram of the system.
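    A hedged sketch of the first step of such an approach (phase extraction only, not the authors' full estimator for interaction functions) is shown below, using the Hilbert transform to obtain instantaneous phases and estimating natural frequencies from the unwrapped phase slopes; the two-oscillator data are synthetic.

```python
# Hedged sketch of phase extraction from multivariate time series (not the
# authors' exact estimator): Hilbert-transform each observable, unwrap the
# phases, and estimate natural frequencies from the mean phase velocity.
import numpy as np
from scipy.signal import hilbert

def phases_and_frequencies(signals, dt):
    """signals: array of shape (n_oscillators, n_samples)."""
    analytic = hilbert(signals, axis=1)
    phases = np.unwrap(np.angle(analytic), axis=1)
    total_time = dt * (signals.shape[1] - 1)
    freqs = (phases[:, -1] - phases[:, 0]) / (2 * np.pi * total_time)  # Hz
    return phases, freqs

# hypothetical data: two noisy, weakly detuned oscillators
dt = 0.01
t = np.arange(0, 200, dt)
rng = np.random.default_rng(1)
x = np.vstack([np.sin(2 * np.pi * 1.00 * t), np.sin(2 * np.pi * 1.05 * t)])
x += 0.05 * rng.normal(size=x.shape)

phases, freqs = phases_and_frequencies(x, dt)
print("estimated natural frequencies (Hz):", np.round(freqs, 3))
```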

  16. Rescuing complementarity with little drama

    NASA Astrophysics Data System (ADS)

    Bao, Ning; Bouland, Adam; Chatwin-Davies, Aidan; Pollack, Jason; Yuen, Henry

    2016-12-01

    The AMPS paradox challenges black hole complementarity by apparently constructing a way for an observer to bring information from the outside of the black hole into its interior if there is no drama at its horizon, making manifest a violation of monogamy of entanglement. We propose a new resolution to the paradox: this violation cannot be explicitly checked by an infalling observer in the finite proper time they have to live after crossing the horizon. Our resolution depends on a weak relaxation of the no-drama condition (we call it "little-drama") which is the "complementarity dual" of scrambling of information on the stretched horizon. When translated to the description of the black hole interior, this implies that the fine-grained quantum information of infalling matter is rapidly diffused across the entire interior while classical observables and coarse-grained geometry remain unaffected. Under the assumption that information has diffused throughout the interior, we consider the difficulty of the information-theoretic task that an observer must perform after crossing the event horizon of a Schwarzschild black hole in order to verify a violation of monogamy of entanglement. We find that the time required to complete a necessary subroutine of this task, namely the decoding of Bell pairs from the interior and the late radiation, takes longer than the maximum amount of time that an observer can spend inside the black hole before hitting the singularity. Therefore, an infalling observer cannot observe monogamy violation before encountering the singularity.

  17. Rescuing complementarity with little drama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bao, Ning; Bouland, Adam; Chatwin-Davies, Aidan

    The AMPS paradox challenges black hole complementarity by apparently constructing a way for an observer to bring information from the outside of the black hole into its interior if there is no drama at its horizon, making manifest a violation of monogamy of entanglement. We propose a new resolution to the paradox: this violation cannot be explicitly checked by an infalling observer in the finite proper time they have to live after crossing the horizon. Our resolution depends on a weak relaxation of the no-drama condition (we call it “little-drama”) which is the “complementarity dual” of scrambling of information on the stretched horizon. When translated to the description of the black hole interior, this implies that the fine-grained quantum information of infalling matter is rapidly diffused across the entire interior while classical observables and coarse-grained geometry remain unaffected. Under the assumption that information has diffused throughout the interior, we consider the difficulty of the information-theoretic task that an observer must perform after crossing the event horizon of a Schwarzschild black hole in order to verify a violation of monogamy of entanglement. We find that the time required to complete a necessary subroutine of this task, namely the decoding of Bell pairs from the interior and the late radiation, takes longer than the maximum amount of time that an observer can spend inside the black hole before hitting the singularity. Furthermore, an infalling observer cannot observe monogamy violation before encountering the singularity.

  18. Rescuing complementarity with little drama

    DOE PAGES

    Bao, Ning; Bouland, Adam; Chatwin-Davies, Aidan; ...

    2016-12-07

    The AMPS paradox challenges black hole complementarity by apparently constructing a way for an observer to bring information from the outside of the black hole into its interior if there is no drama at its horizon, making manifest a violation of monogamy of entanglement. We propose a new resolution to the paradox: this violation cannot be explicitly checked by an infalling observer in the finite proper time they have to live after crossing the horizon. Our resolution depends on a weak relaxation of the no-drama condition (we call it “little-drama”) which is the “complementarity dual” of scrambling of information on the stretched horizon. When translated to the description of the black hole interior, this implies that the fine-grained quantum information of infalling matter is rapidly diffused across the entire interior while classical observables and coarse-grained geometry remain unaffected. Under the assumption that information has diffused throughout the interior, we consider the difficulty of the information-theoretic task that an observer must perform after crossing the event horizon of a Schwarzschild black hole in order to verify a violation of monogamy of entanglement. We find that the time required to complete a necessary subroutine of this task, namely the decoding of Bell pairs from the interior and the late radiation, takes longer than the maximum amount of time that an observer can spend inside the black hole before hitting the singularity. Furthermore, an infalling observer cannot observe monogamy violation before encountering the singularity.

  19. Short- and Long-Term Propagation of Spacecraft Orbits

    NASA Technical Reports Server (NTRS)

    Smith, John C., Jr.; Sweetser, Theodore; Chung, Min-Kun; Yen, Chen-Wan L.; Roncoli, Ralph B.; Kwok, Johnny H.; Vincent, Mark A.

    2008-01-01

    The Planetary Observer Planning Software (POPS) comprises four computer programs for use in designing orbits of spacecraft about planets. These programs are the Planetary Observer High Precision Orbit Propagator (POHOP), the Planetary Observer Long-Term Orbit Predictor (POLOP), the Planetary Observer Post Processor (POPP), and the Planetary Observer Plotting (POPLOT) program. POHOP and POLOP integrate the equations of motion to propagate an initial set of classical orbit elements to a future epoch. POHOP models short-term (one revolution) orbital motion; POLOP averages out the short-term behavior but requires far less processing time than do older programs that perform long-term orbit propagations. POPP postprocesses the spacecraft ephemeris created by POHOP or POLOP (or optionally can use a less accurate internal ephemeris) to search for trajectory-related geometric events including, for example, rising or setting of a spacecraft as observed from a ground site. For each such event, POPP puts out such user-specified data as the time, elevation, and azimuth. POPLOT is a graphics program that plots data generated by POPP. POPLOT can plot orbit ground tracks on a world map and can produce a variety of summaries and generic ordinate-vs.-abscissa plots of any POPP data.
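    For illustration, a minimal two-body propagation sketch in the spirit of a short-term propagator is given below; it is not the POHOP/POLOP code, and the initial low-Earth-orbit state is hypothetical.

```python
# Illustrative two-body propagation sketch (not POHOP/POLOP): integrate the
# Keplerian equations of motion to advance a spacecraft state over a short arc.
# Units are km and s; MU_EARTH is Earth's gravitational parameter.
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 398_600.4418  # km^3/s^2

def two_body(t, state, mu=MU_EARTH):
    r, v = state[:3], state[3:]
    a = -mu * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])

# hypothetical circular low-Earth orbit: position (km) and velocity (km/s)
state0 = np.array([7000.0, 0.0, 0.0, 0.0, 7.546, 0.0])

sol = solve_ivp(two_body, (0.0, 5400.0), state0, rtol=1e-9, atol=1e-9)
r_final = sol.y[:3, -1]
print("position after 90 min (km):", np.round(r_final, 1))
```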

  20. Observations of and Influences on Low-Latitude Vertical Plasma Drifts

    NASA Astrophysics Data System (ADS)

    Miller, E. S.; Chartier, A.; Paxton, L. J.

    2016-12-01

    Many workers have suggested that the morphology (position and relative intensities) of the crests of the equatorial ionization anomalies is related to the time history of the equatorial vertical drift. In this work, we compare observations of the vertical drift using HF radio signals of opportunity in the Central Pacific with UV 135.6-nm observations of the equatorial anomalies from the DMSP/SSUSI and TIMED/GUVI instruments. Furthermore, we explore the role of E region density in modulating the vertical plasma drift using a passive HF sounding experiment in the Caribbean. Coupling between nighttime medium-scale traveling ionospheric disturbances (MSTIDs) and sporadic-E layers has been suggested as a growth-rate-increasing process. While we observe sporadic-E in the local hemisphere coincident with increases in the F-region altitude, we also observe uplifts without sporadic-E in the local hemisphere. Apart from the trivial explanation that sporadic-E is occurring in the conjugate hemisphere, another possible explanation is that the E region may enhance the vertical drift but is not required to produce enhanced vertical drifts. These studies represent fruitful areas of future intersection between ground-based observations and ICON and GOLD science.

  1. Field test comparison of two dermal tolerance assessment methods of hand hygiene products.

    PubMed

    Girard, R; Carré, E; Pires-Cronenberger, S; Bertin-Mandy, M; Favier-Bulit, M C; Coyault, C; Coudrais, S; Billard, M; Regard, A; Kerhoas, A; Valdeyron, M L; Cracco, B; Misslin, P

    2008-06-01

    This study aimed to compare the sensitivity and workload requirement of two dermal tolerance assessment methods for hand hygiene products, in order to select a suitable pilot testing method for field tests. An observer-rating method and a self-assessment method were compared in 12 voluntary hospital departments (autumn/winter of 2005-2006). Three three-week test periods were separated by two-week intervals during which the routine products were reintroduced. The observer-rating method scored dryness and irritation on four-point scales. In the self-assessment method, the user rated appearance, intactness, moisture content, and sensation on a visual analogue scale which was converted into a 10-point numerical scale. Eleven products (soaps) were tested (223/250 complete reports for observer rating, 131/251 for self-assessment). Two products were significantly less well tolerated than the routine product according to the observers, and four products according to the self-assessments. There was no significant difference between the two methods when products were classified according to tolerance (Fisher's test: P=0.491). For the symptom common to both assessment methods (dryness), there was a good correlation between the two methods (Spearman's rho: P=0.032). The workload was higher for the observer-rating method (288 h of observer time plus 122 h of prevention team and pharmacist time, compared with 15 h of prevention team and pharmacist time for self-assessment). In conclusion, the self-assessment method was considered more suitable for pilot testing, although additional time should be allocated for educational measures as the return rate of complete self-assessment forms was poor.

  2. What Knowledge and Conceptions Do Irish Primary Schoolteachers Hold on Attention Deficit Hyperactivity Disorder?

    ERIC Educational Resources Information Center

    Ward, Victoria Ann

    2014-01-01

    Attention deficit hyperactivity disorder (ADHD) diagnosis rates have increased significantly in recent times. A teacher's role is crucial in determining if a child will be referred for an ADHD assessment. Teachers' opinions and observations are also required for and play a huge role in the actual assessment process. For this reason, their…

  3. How to make a wind-movement recorder from any spare drum-type recorder

    Treesearch

    Irvin C. Reigner

    1964-01-01

    The automatic recording of wind movement is sometimes essential to experiments in forestry and watershed-management research. Wind-movement data are often obtained by periodic reading of the anemometer dial, but occasionally data are required for variable intervals and at times when observations cannot be made easily. Instruments designed to record wind movement are...

  4. MaRIE: A facility for time-dependent materials science at the mesoscale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Cris William; Kippen, Karen Elizabeth

    To meet new and emerging national security issues the Laboratory is stepping up to meet another grand challenge—transitioning from observing to controlling a material’s performance. This challenge requires the best of experiment, modeling, simulation, and computational tools. MaRIE is the Laboratory’s proposed flagship experimental facility intended to meet the challenge.

  5. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  6. College Student Perceptions and Ideals of Advising: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Christian, Tiffany Y.; Sprinkle, Julie E.

    2013-01-01

    Student advising has been a staple of the college experience for decades. However, the importance of advising differs greatly through the lens of the observer. Students may feel that advising is a "waste of time" or that they already know what they need to take to meet degree requirements. Conversely, other students may want the added…

  7. Genetic and Behavioral Influences on Received Aggression during Observed Play among Unfamiliar Preschool-Aged Peers

    ERIC Educational Resources Information Center

    DiLalla, Lisabeth Fisher; John, Sufna Gheyara

    2014-01-01

    Peer victimization appears heritable, but it is unclear whether the traits that confer genetic risk require time and familiarity with a perpetrator to manifest or whether novel and brief interactions can lead to received aggression that demonstrates similar genetic risk. We examined 20-minute, peer-play interactions between 5-year-olds, pairing…

  8. Optimized green fluorescent protein fused to FoF1-ATP synthase for single-molecule FRET using a fast anti-Brownian electrokinetic trap

    NASA Astrophysics Data System (ADS)

    Dienerowitz, Maria; Ilchenko, Mykhailo; Su, Bertram; Deckers-Hebestreit, Gabriele; Mayer, Günter; Henkel, Thomas; Heitkamp, Thomas; Börsch, Michael

    2016-02-01

    Observation times of freely diffusing single molecules in solution are limited by the photophysics of the attached fluorescence markers and by a small observation volume in the femtolitre range that is required for a sufficient signal-to-background ratio. To extend diffusion-limited observation times through a confocal detection volume, A. E. Cohen and W. E. Moerner have invented and built the ABELtrap -- a microfluidic device to actively counteract Brownian motion of single nanoparticles with an electrokinetic trap. Here we present a version of an ABELtrap with a laser focus pattern generated by electro-optical beam deflectors and controlled by a programmable FPGA chip. This ABELtrap holds single fluorescent nanoparticles for more than 100 seconds, increasing the observation time of fluorescent nanoparticles compared to free diffusion by a factor of 10000. To monitor conformational changes of individual membrane proteins in real time, we record sequential distance changes between two specifically attached dyes using Förster resonance energy transfer (smFRET). Fusing the a-subunit of the FoF1-ATP synthase with mNeonGreen results in an improved signal-to-background ratio at lower laser excitation powers. This increases our measured trap duration of proteoliposomes beyond 2 s. Additionally, we observe different smFRET levels attributed to varying distances between the FRET donor (mNeonGreen) and acceptor (Alexa568) fluorophore attached at the a- and c-subunit of the FoF1-ATP synthase respectively.

  9. Observations concerning the generation and propagation of Type III solar bursts

    NASA Technical Reports Server (NTRS)

    Kellogg, P. J.

    1986-01-01

    A number of Type III bursts were observed during the Helios missions in which the burst exciter passed over the spacecraft, as evidenced by strong electric field fluctuations near the plasma frequency. Six of these were suitable for detailed study. Of the six events, one was ambiguous, one showed what is interpreted as a switchover from harmonic to fundamental, and the rest all generated fundamental at onset. This would be expected if both fundamental and harmonic are generated, as, at a fixed frequency, the fundamental will be generated earlier. For the event which seems to show both fundamental and harmonic emission, the frequency ratio is not exactly 2. This is explained in terms of a time delay of the fundamental, due to scattering and diffusion in the source region. A time delay of the order of 600 seconds at 1 AU and 20 kHz, and inversely proportional to frequency, is required to explain the observations. Crude estimates show that delay times at least this long may be attributed to trapping and scattering.

  10. Long-term stability of radiotherapy dosimeters calibrated at the Polish Secondary Standard Dosimetry Laboratory.

    PubMed

    Ulkowski, Piotr; Bulski, Wojciech; Chełmiński, Krzysztof

    2015-10-01

    Unidos 10001, Unidos E (10008/10009) and Dose 1 electrometers from 14 radiotherapy centres were calibrated 3-4 times over a long period, together with Farmer-type (PTW 30001, 30013, Nuclear Enterprises 2571 and Scanditronix-Wellhofer FC65G) cylindrical ionization chambers and plane-parallel chambers (PTW Markus 23343 and Scanditronix-Wellhofer PPC05). On the basis of the calibration coefficients repeatedly established for the same electrometers and ionization chambers over this long period, the accuracy of the electrometers and the long-term stability of the ionization chambers were examined. All measurements were carried out at the same laboratory, by the same staff, according to the same IAEA recommendations. Good accuracy and long-term stability of the dosimeters used in Polish radiotherapy centres were observed: the variations were within 0.1% for electrometers and 0.2% for the chambers with electrometers, and were not observed to change over time. These observations confirm the opinion that the requirement to calibrate the dosimeters more often than every 2 years is not justified. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Time required for institutional review board review at one Veterans Affairs medical center.

    PubMed

    Hall, Daniel E; Hanusa, Barbara H; Stone, Roslyn A; Ling, Bruce S; Arnold, Robert M

    2015-02-01

    Despite growing concern that institutional review boards (IRBs) impose burdensome delays on research, little is known about the time required for IRB review across different types of research. To measure the overall and incremental process times for IRB review as a process of quality improvement. After developing a detailed process flowchart of the IRB review process, 2 analysts abstracted temporal data from the records pertaining to all 103 protocols newly submitted to the IRB at a large urban Veterans Affairs medical center from June 1, 2009, through May 31, 2011. Disagreements were reviewed with the principal investigator to reach consensus. We then compared the review times across review types using analysis of variance and post hoc Scheffé tests after achieving normally distributed data through logarithmic transformation. Calendar days from initial submission to final approval of research protocols. Initial IRB review took 2 to 4 months, with expedited and exempt reviews requiring less time (median [range], 85 [23-631] and 82 [16-437] days, respectively) than full board reviews (median [range], 131 [64-296] days; P = .008). The median time required for credentialing of investigators was 1 day (range, 0-74 days), and review by the research and development committee took a median of 15 days (range, 0-184 days). There were no significant differences in credentialing or research and development times across review types (exempt, expedited, or full board). Of the extreme delays in IRB review, 80.0% were due to investigators' slow responses to requested changes. There were no systematic delays attributable to the information security officer, privacy officer, or IRB chair. Measuring and analyzing review times is a critical first step in establishing a culture and process of continuous quality improvement among IRBs that govern research programs. The review times observed at this IRB are substantially longer than the 60-day target recommended by expert panels. The method described here could be applied to other IRBs to begin identifying and improving inefficiencies.
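    A minimal sketch of the kind of comparison described (log-transforming durations and running a one-way ANOVA across review types) is shown below; it is not the authors' analysis code, and the duration samples are hypothetical, loosely patterned on the ranges quoted above.

```python
# Hedged sketch of the comparison described: log-transform review durations to
# approximate normality, then run a one-way ANOVA across review types.
# Durations below are hypothetical.
import numpy as np
from scipy import stats

# hypothetical days from submission to approval, by review type
exempt    = np.array([16, 45, 82, 120, 210, 437])
expedited = np.array([23, 60, 85, 140, 300, 631])
full      = np.array([64, 90, 131, 160, 220, 296])

groups = [np.log(g) for g in (exempt, expedited, full)]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA on log(days): F = {f_stat:.2f}, p = {p_value:.3f}")
```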

  12. The CACAO Method for Smoothing, Gap Filling, and Characterizing Seasonal Anomalies in Satellite Time Series

    NASA Technical Reports Server (NTRS)

    Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.

    2013-01-01

    Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology-fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters adjusted for each season allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared with the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Performance was then analyzed over actual satellite LAI products derived from the AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods, the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT, revealed that CACAO achieved better performance for smoothing AVHRR time series characterized by a high level of noise and frequent missing observations. The resulting smoothed time series capture the vegetation dynamics well and show no gaps, in contrast to the 50-60% of data still missing after AG or SG reconstruction. Results of the simulation experiments, as well as the confrontation with actual AVHRR time series, indicate that the proposed CACAO method is more robust to noise and missing data than the AG and SG methods for phenology extraction.
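    A minimal sketch of the climatology-fitting idea (scaling and shifting a seasonal climatology to match noisy, gappy observations) is given below; it is not the published CACAO implementation, and the climatology shape and observations are synthetic.

```python
# Minimal sketch of the climatology-fitting idea (not the published CACAO
# code): find the scale and temporal shift that best adjust a seasonal
# climatology to a noisy, gappy observation series.
import numpy as np
from scipy.optimize import curve_fit
from scipy.interpolate import interp1d

doy = np.arange(0, 365, 10.0)                                   # composite dates
climatology = 1.0 + 2.0 * np.exp(-((doy - 180) / 60.0) ** 2)    # mean seasonal LAI
clim_interp = interp1d(doy, climatology, bounds_error=False, fill_value=1.0)

def shifted_scaled(t, scale, shift):
    """CACAO-style adjustment: scale * climatology(t - shift)."""
    return scale * clim_interp(t - shift)

# hypothetical actual observations: stronger season, peaking ~15 days later
rng = np.random.default_rng(2)
obs = 1.2 * clim_interp(doy - 15) + 0.1 * rng.normal(size=doy.size)
obs[::4] = np.nan                                   # simulate missing composites

valid = ~np.isnan(obs)
(scale, shift), _ = curve_fit(shifted_scaled, doy[valid], obs[valid], p0=(1.0, 0.0))
print(f"fitted scale = {scale:.2f}, fitted shift = {shift:.1f} days")
```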

  13. Automating the Processing of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Pang, Wan-Lin; Nemani, Ramakrishna; Votava, Petr

    2003-01-01

    NASA's vision for Earth science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we are developing a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products.

  14. Impact of space-based instruments on magnetic star research: past and future

    NASA Astrophysics Data System (ADS)

    Weiss, WW.; Neiner, C.; Wade, G. A.

    2018-01-01

    Magnetic stars are observed at a large variety of spectral ranges, frequently with photometric and spectroscopic techniques and on time scales ranging from a 'snap shot' to years, sometimes using data sets which are continuous over many months. The outcome of such observations has been discussed during this conference and many examples have been presented, demonstrating the high scientific significance and gains in our knowledge that result from these observations. A key question that should be addressed is, what are the advantages and requirements of space based research of magnetic stars, particularly in relation to ground based observations? And what are the drawbacks? What are the hopes for the future? In the following, we intend to present an overview that addresses these questions.

  15. Stokes Profile Compression Applied to VSM Data

    NASA Astrophysics Data System (ADS)

    Toussaint, W. A.; Henney, C. J.; Harvey, J. W.

    2012-02-01

    The practical details of applying the Expansion in Hermite Functions (EHF) method to compression of full-disk full-Stokes solar spectroscopic data from the SOLIS/VSM instrument are discussed in this paper. The algorithm developed and discussed here preserves the 630.15 and 630.25 nm Fe I lines, along with the local continuum and telluric lines. This compression greatly reduces the amount of space required to store these data sets while maintaining the quality of the data, allowing these observations to be archived and made publicly available with limited bandwidth. Applying EHF to the full-Stokes profiles and saving the coefficient files with Rice compression reduces the disk space required to store these observations by a factor of 20, while maintaining the quality of the data and with a total compression time only 35% slower than the standard gzip (GNU zip) compression.
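    The sketch below illustrates the general idea of compressing a line profile with a Gauss-Hermite basis, keeping a handful of coefficients in place of the full sampled profile; it is not the actual VSM/EHF pipeline, and the profile is synthetic.

```python
# Illustrative Gauss-Hermite compression of a line profile (the general idea
# behind an expansion in Hermite functions; not the VSM/EHF pipeline itself).
import math
import numpy as np
from numpy.polynomial.hermite import hermval

def hermite_function_basis(x, n_terms):
    """Orthonormal Hermite functions psi_n(x) = H_n(x) exp(-x^2/2) / norm."""
    basis = []
    for n in range(n_terms):
        coef = np.zeros(n + 1)
        coef[n] = 1.0
        norm = math.sqrt(math.sqrt(math.pi) * (2.0 ** n) * math.factorial(n))
        basis.append(hermval(x, coef) * np.exp(-x**2 / 2) / norm)
    return np.array(basis)          # shape (n_terms, len(x))

# synthetic continuum-subtracted line profile (a Gaussian absorption dip)
x = np.linspace(-5, 5, 200)
line_depth = 0.6 * np.exp(-x**2 / (2 * 0.8**2))

n_terms = 8
B = hermite_function_basis(x, n_terms)
coeffs, *_ = np.linalg.lstsq(B.T, line_depth, rcond=None)   # 8 numbers vs 200 samples
reconstruction = B.T @ coeffs

rms = np.sqrt(np.mean((reconstruction - line_depth) ** 2))
print(f"kept {n_terms} coefficients for {x.size} samples; RMS error = {rms:.2e}")
```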

  16. VizieR Online Data Catalog: ALLSMOG final data release. A new APEX CO survey (Cicone+, 2017)

    NASA Astrophysics Data System (ADS)

    Cicone, C.; Bothwell, M.; Wagg, J.; Moller, P.; De Breuck, C.; Zhang, Z.; Martin, S.; Maiolino, R.; Severgnini, P.; Aravena, M.; Belfiore, F.; Espada, D.; Flutsch, A.; Impellizzeri, V.; Peng, Y.; Raj, M. A.; Ramirez-Olivencia, N.; Riechers, D.; Schawinski, K.

    2017-10-01

    ALLSMOG is an ESO Large Programme for the Atacama Pathfinder EXperiment (APEX, project no.: E-192.A-0359, principal investigator (PI): J. Wagg) targeting the CO(2-1) emission line (rest frequency νCO(2-1) = 230.538 GHz) in 88 local, low-M* star-forming galaxies. The project was initially allocated 300 h of ESO observing time over the course of four semesters, corresponding to 75 h per semester throughout periods P92-P95 (October 2013 - September 2015). However, during P94 and P95 there was a slowdown in ALLSMOG observations, mainly due to the installation of the visiting instrument Supercam in combination with better-than-average weather conditions, which caused other programmes requiring more stringent precipitable water vapour (PWV) constraints to be prioritised. Because of the resulting ~50% time loss for ALLSMOG during two semesters, the ESO observing programmes committee (OPC) granted a one-semester extension of the project, allowing us to complete the survey in P96 (March 2016). The final total APEX observing time dedicated to ALLSMOG amounts to 327 h, including the overheads due to setup and calibration but not accounting for possible additional time lost because of technical issues. In 2014 a northern component of the ALLSMOG survey was approved at the IRAM 30m telescope (project code: 188-14, PI: S. Martin), aimed at observing the CO(1-0) (rest frequency νCO(1-0) = 115.271 GHz) and CO(2-1) emission lines in a sample of nine additional galaxies characterised by stellar masses M* < 10^9 M⊙. A total of 22 h of observations were obtained with the IRAM 30m during two observing runs in November 2014 and May 2015. (5 data files).

  17. LSST Operations Simulator

    NASA Astrophysics Data System (ADS)

    Cook, K. H.; Delgado, F.; Miller, M.; Saha, A.; Allsman, R.; Pinto, P.; Gee, P. A.

    2005-12-01

    We have developed an operations simulator for LSST and used it to explore design and operations parameter space for this large etendue telescope and its ten year survey mission. The design is modular, with separate science programs coded in separate modules. There is a sophisticated telescope module with all motions parametrized for ease of testing different telescope capabilities, e.g. effect of acceleration capabilities of various motors on science output. Sky brightness is calculated as a function of moon phase and separation. A sophisticated exposure time calculator has been developed for LSST which is being incorporated into the simulator to allow specification of S/N requirements. All important parameters for the telescope, the site and the science programs are easily accessible in configuration files. Seeing and cloud data from the three candidate LSST sites are used for our simulations. The simulator has two broad categories of science proposals: sky coverage and transient events. Sky coverage proposals base their observing priorities on a required number of observations for each field in a particular filter with specified conditions (maximum seeing, sky brightness, etc) and one is used for a weak lensing investigation. Transient proposals are highly configurable. A transient proposal can require sequential, multiple exposures in various filters with a specified sequence of filters, and require a particular cadence for multiple revisits to complete an observation sequence. Each science proposal ranks potential observations based upon the internal logic of that proposal. We present the results of a variety of mixed science program observing simulations, showing how varied programs can be carried out simultaneously, with many observations serving multiple science goals. The simulator has shown that LSST can carry out its multiple missions under a variety of conditions. KHC's work was performed under the auspices of the US DOE, NNSA by the Univ. of California, LLNL under contract No. W-7405-Eng-48.

  18. Towards a regional coastal ocean observing system: An initial design for the Southeast Coastal Ocean Observing Regional Association

    NASA Astrophysics Data System (ADS)

    Seim, H. E.; Fletcher, M.; Mooers, C. N. K.; Nelson, J. R.; Weisberg, R. H.

    2009-05-01

    A conceptual design for a southeast United States regional coastal ocean observing system (RCOOS) is built upon a partnership between institutions of the region and among elements of the academic, government and private sectors. This design envisions support of a broad range of applications (e.g., marine operations, natural hazards, and ecosystem-based management) through the routine operation of predictive models that utilize the system observations to ensure their validity. A distributed information management system enables information flow, and a centralized information hub serves to aggregate information regionally and distribute it as needed. A variety of observing assets are needed to satisfy model requirements. An initial distribution of assets is proposed that recognizes the physical structure and forcing in the southeast U.S. coastal ocean. In-situ data collection includes moorings, profilers and gliders to provide 3D, time-dependent sampling, HF radar and surface drifters for synoptic sampling of surface currents, and satellite remote sensing of surface ocean properties. Nested model systems are required to properly represent ocean conditions from the outer edge of the EEZ to the watersheds. An effective RCOOS will depend upon a vital "National Backbone" (federally supported) system of in situ and satellite observations, model products, and data management. This dependence highlights the needs for a clear definition of the National Backbone components and a Concept of Operations (CONOPS) that defines the roles, functions and interactions of regional and federal components of the integrated system. A preliminary CONOPS is offered for the Southeast (SE) RCOOS. Thorough system testing is advocated using a combination of application-specific and process-oriented experiments. Estimates of costs and personnel required as initial components of the SE RCOOS are included. Initial thoughts on the Research and Development program required to support the RCOOS are also outlined.

  19. Force-Induced Rupture of a DNA Duplex: From Fundamentals to Force Sensors.

    PubMed

    Mosayebi, Majid; Louis, Ard A; Doye, Jonathan P K; Ouldridge, Thomas E

    2015-12-22

    The rupture of double-stranded DNA under stress is a key process in biophysics and nanotechnology. In this article, we consider the shear-induced rupture of short DNA duplexes, a system that has been given new importance by recently designed force sensors and nanotechnological devices. We argue that rupture must be understood as an activated process, where the duplex state is metastable and the strands will separate in a finite time that depends on the duplex length and the force applied. Thus, the critical shearing force required to rupture a duplex depends strongly on the time scale of observation. We use simple models of DNA to show that this approach naturally captures the observed dependence of the force required to rupture a duplex within a given time on duplex length. In particular, this critical force is zero for the shortest duplexes, before rising sharply and then plateauing in the long length limit. The prevailing approach, based on identifying when the presence of each additional base pair within the duplex is thermodynamically unfavorable rather than allowing for metastability, does not predict a time-scale-dependent critical force and does not naturally incorporate a critical force of zero for the shortest duplexes. We demonstrate that our findings have important consequences for the behavior of a new force-sensing nanodevice, which operates in a mixed mode that interpolates between shearing and unzipping. At a fixed time scale and duplex length, the critical force exhibits a sigmoidal dependence on the fraction of the duplex that is subject to shearing.
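
    The activated-rupture picture above can be made concrete with a generic Bell/Arrhenius rate model. The sketch below is an illustration under stated assumptions, not the coarse-grained DNA model used in the paper: the per-base-pair barrier, transition-state distance, and attempt time are round placeholder numbers, and this toy treatment reproduces the time-scale dependence and the zero critical force for very short duplexes, but not the long-length plateau, which requires modelling how the shear force is distributed along the duplex.

```python
import numpy as np

def critical_shear_force(t_obs, n_bp, dG_bp_kT=1.5, x_nm=0.3, tau0=1e-6):
    """Critical shearing force (newtons) for rupture within observation time
    t_obs (seconds), in a generic Bell-type picture (illustrative only).

    The duplex is metastable behind a barrier that grows with length,
    dG = n_bp * dG_bp_kT * kT, and an applied force F tilts the barrier by
    F * x, giving a mean rupture time tau(F) = tau0 * exp((dG - F * x) / kT).
    Setting tau(F_c) = t_obs and solving for F_c gives the expression below;
    negative values are clipped to zero, so very short duplexes separate
    within t_obs even with no applied force.
    """
    kT = 4.1e-21                    # thermal energy at room temperature, J
    x = x_nm * 1e-9                 # transition-state distance, m
    dG = n_bp * dG_bp_kT * kT       # barrier height, J
    f_c = (dG - kT * np.log(t_obs / tau0)) / x
    return max(f_c, 0.0)

# Critical force (pN) versus duplex length for a 1-second observation window
for n in (4, 8, 16, 32, 64):
    print(n, round(critical_shear_force(t_obs=1.0, n_bp=n) * 1e12, 1))
```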

  20. Observation of a Pharmacist-Conducted Group A Streptococcal Pharyngitis Point-of-Care Test: A Time and Motion Study.

    PubMed

    Corn, Carolyn E; Klepser, Donald G; Dering-Anderson, Allison M; Brown, Terrence G; Klepser, Michael E; Smith, Jaclyn K

    2018-06-01

    Acute pharyngitis is among the most common infectious diseases encountered in the United States, resulting in 13 million patient visits annually, with group A streptococcus (GAS) being a common causative pathogen. It is estimated that expenditures for the treatment of adult pharyngitis will exceed US$1.2 billion annually. This substantial projection reinforces the need to evaluate diagnosis and treatment of adult pharyngitis in nontraditional settings. The objective of this research is to quantify the amount of pharmacist time required to complete a point-of-care (POC) test for a patient presenting with pharyngitis symptoms. A standardized patient with pharyngitis symptoms visited 11 pharmacies for POC testing services for a total of 33 patient encounters. An observer was present at each encounter and recorded the total encounter time, divided into 9 categories. Pharmacists conducted POC testing in 1 of 2 ways: in sequence 1, pharmacists performed all service-related tasks; in sequence 2, both pharmacists and pharmacist interns performed service-related tasks. The average time for completion of a POC test for GAS pharyngitis was 25.3 ± 4.8 minutes. The average pharmacist participation time per encounter was 12.7 ± 3.0 minutes (sequence 1), which decreased to 2.6 ± 1.1 minutes when pharmacist interns were involved in the testing (sequence 2). Although additional studies are required to further assess service feasibility, this study indicates that a GAS POC testing service could be implemented in a community pharmacy with limited disruption or change to workflow and staff.

  1. Integration of multiple research disciplines on the International Space Station

    NASA Technical Reports Server (NTRS)

    Penley, N. J.; Uri, J.; Sivils, T.; Bartoe, J. D.

    2000-01-01

    The International Space Station will provide an extremely high-quality, long-duration microgravity environment for the conduct of research. In addition, the ISS offers a platform for performing observations of Earth and Space from a high-inclination orbit, outside of the Earth's atmosphere. This unique environment and observational capability offers the opportunity for advancement in a diverse set of research fields. Many of these disciplines do not relate to one another, and present widely differing approaches to study, as well as different resource and operational requirements. Significant challenges exist to ensure the highest quality research return for each investigation. Requirements from different investigations must be identified, clarified, integrated and communicated to ISS personnel in a consistent manner. Resources such as power, crew time, etc. must be apportioned to allow the conduct of each investigation. Decisions affecting research must be made at the strategic level as well as at a very detailed execution level. The timing of the decisions can range from years before an investigation to real-time operations. The international nature of the Space Station program adds to the complexity. Each participating country must be assured that their interests are represented during the entire planning and operations process. A process for making decisions regarding research planning, operations, and real-time replanning is discussed. This process ensures adequate representation of all research investigators. It provides a means for timely decisions, and it includes a means to ensure that all ISS International Partners have their programmatic interests represented. © 2000 Published by Elsevier Science Ltd. All rights reserved.

  2. Targeted social mobilization in a global manhunt.

    PubMed

    Rutherford, Alex; Cebrian, Manuel; Rahwan, Iyad; Dsouza, Sohan; McInerney, James; Naroditskiy, Victor; Venanzi, Matteo; Jennings, Nicholas R; deLara, J R; Wahlstedt, Eero; Miller, Steven U

    2013-01-01

    Social mobilization, the ability to mobilize large numbers of people via social networks to achieve highly distributed tasks, has received significant attention in recent times. This growing capability, facilitated by modern communication technology, is highly relevant to endeavors which require the search for individuals that possess rare information or skills, such as finding medical doctors during disasters, or searching for missing people. An open question remains as to whether, in time-critical situations, people are able to recruit in a targeted manner, or whether they resort to so-called blind search, recruiting as many acquaintances as possible via broadcast communication. To explore this question, we examine data from our recent success in the U.S. State Department's Tag Challenge, which required locating and photographing 5 target persons in 5 different cities in the United States and Europe, in under 12 hours, based only on a single mug shot. We find that people are able to consistently route information in a targeted fashion even under increasing time pressure. We derive an analytical model for social-media-fueled global mobilization and use it to quantify the extent to which people were targeting their peers during recruitment. Our model estimates that approximately 1 in 3 messages were targeted during the most time-sensitive period of the challenge. This is a novel observation at such short temporal scales, and it points to opportunities for devising viral incentive schemes that provide distance- or time-sensitive rewards to approach the target geography more rapidly. This observation of '12 hours of separation' between individuals has applications in multiple areas, from emergency preparedness to political mobilization.

  3. Management of post-traumatic retained hemothorax: a prospective, observational, multicenter AAST study.

    PubMed

    DuBose, Joseph; Inaba, Kenji; Demetriades, Demetrios; Scalea, Thomas M; O'Connor, James; Menaker, Jay; Morales, Carlos; Konstantinidis, Agathoklis; Shiflett, Anthony; Copwood, Ben

    2012-01-01

    The natural history and optimal management of retained hemothorax (RH) after chest tube placement is unknown. The intent of our study was to determine the practice patterns used and to identify independent predictors of the need for thoracotomy. An American Association for the Surgery of Trauma multicenter prospective observational trial was conducted, enrolling patients with placement of chest tube within 24 hours of trauma admission and RH on subsequent computed tomography of the chest. Demographics, interventions, and outcomes were analyzed. Logistic regression analysis was used to identify the independent predictors of successful intervention for each of the management choices chosen and complications. RH was identified in 328 patients from 20 centers. Video-assisted thoracoscopy (VATS) was the most commonly used initial procedure, in 33.5% of patients, but 26.5% required two and 5.4% required three procedures to clear RH or subsequent empyema. Thoracotomy was ultimately required in 20.4%. The strongest independent predictor of successful observation was estimated volume of RH ≤300 cc (odds ratio [OR], 3.7 [2.0-7.0]; p < 0.001). Independent predictors of successful VATS as definitive treatment were absence of an associated diaphragm injury (OR, 4.7 [1.6-13.7]; p = 0.005), use of periprocedural antibiotics for thoracostomy placement (OR, 3.3 [1.2-9.0]; p = 0.023), and volume of RH ≤900 cc (OR, 3.9 [1.4-13.2]; p = 0.03). No relationship between timing of VATS and success rate was identified. Independent predictors of the need for thoracotomy included diaphragm injury (OR, 4.9 [2.4-9.9]; p < 0.001), RH >900 cc (OR, 3.2 [1.4-7.5]; p = 0.007), and failure to give periprocedural antibiotics for initial chest tube placement (OR, 2.3 [1.2-4.6]; p = 0.015). The overall empyema and pneumonia rates for RH patients were 26.8% and 19.5%, respectively. RH in trauma is associated with high rates of empyema and pneumonia. VATS can be performed with high success rates, although optimal timing is unknown. Approximately 25% of patients require at least two procedures to effectively clear RH or subsequent pleural space infections, and 20.4% require thoracotomy.

  4. The wireless networking system of Earthquake precursor mobile field observation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

    The mobile field observation network must record and transmit large amounts of data reliably and in real time, strengthening the observation of physical signals in specific regions and during specific periods, and thereby improving monitoring capacity and the ability to track anomalies. Because current earthquake precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system. The communication system for earthquake precursor mobile field observation transmits large amounts of data reliably and in real time from the measuring points to the monitoring center through the connections between the instruments, the wireless access system, the broadband wireless access network, and the precursor mobile observation management center system, thereby enabling remote instrument monitoring and data transmission. At present, the earthquake precursor mobile field observation network technology has been applied to fluxgate magnetometer array geomagnetic observations at Tianzhu, Xichang, and Xinjiang, where it provides real-time monitoring of the working status of instruments deployed over a large area during the last two to three years of large-scale field operation. It can therefore yield geomagnetic field data for locally refined regions and provide high-quality observational data for tracking and forecasting impending earthquakes. Although wireless networking is well suited to mobile field observation because of its simple and flexible deployment, packet loss can occur when large volumes of observational data are transmitted over a relatively weak, narrow-bandwidth wireless link. For high-sampling-rate instruments, this project uses data compression, which effectively mitigates packet loss during data transmission; control commands, status data, and observational data are transmitted with different priorities and mechanisms, which keeps the packet loss rate within an acceptable range and does not affect the real-time observation curves. After field trial runs and application to earthquake tracking projects, the mobile field observation wireless networking system operates normally, its functions perform well and are easy to operate, and the quality of data transmission meets the system design requirements and plays a significant role in practical applications.
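
    The transmission scheme described above, compression of high-sampling-rate records plus class-based priorities, can be sketched as follows. The class name, numeric priority levels, and use of zlib compression are illustrative assumptions of this sketch, not the project's actual protocol.

```python
import heapq
import itertools
import zlib

# Priority levels for the three traffic classes described above.
# The numeric values and the use of zlib compression are illustrative
# assumptions, not the project's actual protocol.
PRIORITY = {"control": 0, "status": 1, "observation": 2}

class PrioritizedSender:
    """Toy transmit queue: control commands are sent before status data,
    which are sent before bulk observational data; high-sampling-rate
    observational records are compressed before queuing to reduce the
    amount of data pushed through a weak, narrow-bandwidth wireless link."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()  # tie-breaker keeps FIFO order within a class

    def submit(self, kind, payload):
        if kind == "observation":
            payload = zlib.compress(payload)  # shrink high-rate data before sending
        heapq.heappush(self._queue, (PRIORITY[kind], next(self._counter), kind, payload))

    def next_packet(self):
        """Return the next (kind, payload) to transmit, or None if the queue is empty."""
        if not self._queue:
            return None
        _, _, kind, payload = heapq.heappop(self._queue)
        return kind, payload

# Example: observational data queued first still goes out after a control command.
sender = PrioritizedSender()
sender.submit("observation", b"\x00" * 4096)
sender.submit("control", b"RESTART FLUXGATE 3")
print(sender.next_packet()[0])  # -> "control"
```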

  5. Extracellular space preservation aids the connectomic analysis of neural circuits

    PubMed Central

    Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L

    2015-01-01

    Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits. DOI: http://dx.doi.org/10.7554/eLife.08206.001 PMID:26650352

  6. In orbit adiabatic demagnetization refrigeration for bolometric and microcalorimetric detectors

    NASA Astrophysics Data System (ADS)

    Hepburn, I. D.; Ade, P. A. R.; Davenport, I.; Smith, A.; Sumner, T. J.

    1992-12-01

    The new generation of photon detectors for satellite-based mm/submm and X-ray astronomical observations requires cooling to temperatures in the range 60 to 300 mK. At present Adiabatic Demagnetization Refrigeration (ADR) is the best proposed technique for producing these temperatures in orbit due to its inherent simplicity and gravity-independent operation. For the efficient utilization of an ADR it is important to realize long operational times at base temperature with short recycle times. These criteria depend on several parameters: the required operating temperature, the cryogen bath temperature, the amount of heat leakage to the paramagnetic salt, the volume and type of salt, and the maximum obtainable magnetic field. For space application these parameters are restricted by the limitations imposed on the physical size, the mass, the available electrical power and the cooling power available. The design considerations required in order to match these parameters are described and test data from a working laboratory system are presented.
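
    As a rough guide to how those parameters trade off against one another, the hold time of an ADR stage can be estimated with the standard back-of-the-envelope relation below (a textbook estimate, not a result from this paper): the heat that the paramagnetic salt can absorb isothermally at the operating temperature, divided by the parasitic heat load.

```latex
% Illustrative ADR hold-time estimate (textbook relation, not from the paper):
%   n        : amount of paramagnetic salt (mol)
%   T_op     : operating temperature (K)
%   \Delta S : molar entropy change available at T_op after demagnetization (J mol^-1 K^-1)
%   \dot{Q}  : parasitic heat load on the salt pill (W)
t_{\mathrm{hold}} \;\approx\; \frac{n \, T_{\mathrm{op}} \, \Delta S}{\dot{Q}}
```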

  7. Our contaminated atmosphere: The danger of climate change, phases 1 and 2. [effect of atmospheric particulate matter on surface temperature and earth's radiation budget

    NASA Technical Reports Server (NTRS)

    Cimorelli, A. J.; House, F. B.

    1974-01-01

    The effects of increased concentrations of atmospheric particulate matter on average surface temperature and on the components of the earth's radiation budget are studied. An atmospheric model which couples particulate loading to surface temperature and to changes in the earth's radiation budget was used. A determination of the feasibility of using satellites to monitor the effect of increased atmospheric particulate concentrations is performed. It was found that: (1) a change in man-made particulate loading of a factor of 4 is sufficient to initiate an ice age; (2) variations in the global and hemispheric weighted averages of surface temperature, reflected radiant flux and emitted radiant flux are nonlinear functions of particulate loading; and (3) a black satellite sphere meets the requirement of nighttime measurement sensitivity, but not the required daytime sensitivity. A nonblack, spherical radiometer whose external optical properties are sensitive to either the reflected radiant flux or the emitted radiant flux meets the observational sensitivity requirements.

  8. Economy with the time delay of information flow—The stock market case

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, Janusz

    2012-02-01

    Any decision process requires information about the past and present state of the system, but in an economy acquiring and processing data is an expensive and time-consuming task. Therefore, the state of the system is often measured over some legally defined interval, analysed after the end of well-defined time periods, and the results are announced much later, before any strategic decision is made. The roles played by these time delays therefore have to be examined carefully. Here, a model of a stock market coupled with an economy is investigated to emphasise the effect of the time-delay span on information flow. It is shown that the larger the time delay, the more important the collective behaviour of agents becomes, since one observes time oscillations in the absolute log-return autocorrelations.
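
    The diagnostic quantity mentioned above, the autocorrelation of absolute log-returns, can be computed in a few lines. This is a generic illustration of the observable only, not the paper's delayed-information model; the synthetic random-walk prices are included just so the snippet runs on its own.

```python
import numpy as np

def abs_logreturn_autocorrelation(prices, max_lag=100):
    """Autocorrelation of absolute log-returns for lags 1..max_lag.

    Oscillations in this curve are the collective-behaviour signature
    discussed in the abstract (illustrative diagnostic only)."""
    prices = np.asarray(prices, dtype=float)
    r = np.abs(np.diff(np.log(prices)))          # absolute log-returns
    r = r - r.mean()
    var = np.dot(r, r) / len(r)
    return np.array([np.dot(r[:-lag], r[lag:]) / (len(r) * var)
                     for lag in range(1, max_lag + 1)])

# Synthetic geometric random walk, purely for demonstration
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(5000)))
print(abs_logreturn_autocorrelation(prices, max_lag=5))
```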

  9. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals which are present in the plume. The development of this software is at a sufficiently advanced stage where it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases though, the majority of the requirements are similar. New data from the spectrograph is generated at a rate of once every 0.5 sec or faster. All processing must be completed within this period of time to maintain real-time performance. Every 0.5 sec, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates as accurately as possible the real one. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long period of time to execute the software in a modern single-processor workstation, and therefore real-time operation is currently not possible. A different number of iterations may be required to perform spectral data fitting per spectral sample. Yet, the OPAD system must be designed to maintain real-time performance in all cases. Although faster single-processor workstations are available for execution of the fitting and SPECTRA software, this option is unattractive due to the excessive cost associated with very fast workstations and also due to the fact that such hardware is not easily expandable to accommodate future versions of the software which may require more processing power. Initial research has already demonstrated that the OPAD software can take advantage of a parallel computer architecture to achieve the necessary speedup. Current work has improved the software by converting it into a form which is easily parallelizable. Timing experiments have been performed to establish the computational complexity and execution speed of major components of the software. This work provides the foundation of future work which will create a fully parallel version of the software executing in a shared-memory multiprocessor system.
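
    The fitting loop described above can be sketched as a standard nonlinear least-squares inversion. In this sketch, `forward_model` stands in for the physics-based SPECTRA code and `initial_guess` for the neural-network estimate of the plume's metal composition; both names are assumptions of the illustration, and SciPy's optimiser is used here in place of whatever optimisation technique the OPAD software actually employs.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_metal_amounts(observed_spectrum, forward_model, initial_guess, tol=1e-3):
    """Sketch of an OPAD-style inverse problem (illustrative only).

    `forward_model(metal_amounts) -> synthetic_spectrum` plays the role of the
    SPECTRA code; the optimiser adjusts the (non-negative) metal amounts until
    the synthetic spectrum matches the observed one to within the tolerance."""

    def residuals(metal_amounts):
        return forward_model(metal_amounts) - observed_spectrum

    result = least_squares(residuals, x0=initial_guess,
                           bounds=(0.0, np.inf), xtol=tol, ftol=tol)
    return result.x  # best-fit metal amounts for this spectral sample
```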

  10. Sprouting characteristics and associated changes in nutritional composition of cowpea (Vigna unguiculata).

    PubMed

    Devi, Chingakham Basanti; Kushwaha, Archana; Kumar, Anil

    2015-10-01

    Cowpea (Vigna unguiculata) is an important arid legume and a good source of energy, protein, vitamins, minerals and dietary fibre. Sprouting of legumes enhances the bioavailability and digestibility of nutrients and therefore plays an important role in human nutrition. Improved varieties of grain cowpea, viz. Pant Lobia-1 (PL-1), Pant Lobia-2 (PL-2) and Pant Lobia-3 (PL-3), were examined for sprouting characteristics and associated changes in nutritional quality. Soaking time, sprouting time and sprouting temperature combinations for a desirable sprout length of ¼ to ½ inch for cowpea seed samples were standardized. All the observations were taken in triplicate except soaking time, where six observations were taken in a completely randomized design of three treatments. Results revealed that the optimum soaking time of PL-1 and PL-2 seed was 3 h, whereas PL-3 required 9 h. A sprouting period of 24 h at 25 °C was found to be desirable for obtaining good sprouts. Significant improvement in nutritional quality was observed after sprouting at 25 °C for 24 h; protein increased by 9-12 %, vitamin C increased by 4-38 times, phytic acid decreased by 4-16 times, and trypsin inhibitor activity decreased by 28-55 %, along with an increase of 8-20 % in in-vitro protein digestibility.

  11. A Prospective Comparison of Intraluminal and Extraluminal Placement of the 9-French Arndt Bronchial Blocker in Adult Thoracic Surgery Patients.

    PubMed

    Templeton, T Wesley; Morris, Benjamin N; Goenaga-Diaz, Eduardo J; Forest, Daniel J; Hadley, Rhett; Moore, Blake A; Bryan, Yvon F; Royster, Roger L

    2017-08-01

    To compare the standard intraluminal approach with the placement of the 9-French Arndt endobronchial blocker with an extraluminal approach by measuring the time to positioning and other relevant intraoperative and postoperative parameters. A prospective, randomized, controlled trial. University hospital. The study comprised 41 patients (20 intraluminal, 21 extraluminal) undergoing thoracic surgery. Placement of a 9-French Arndt bronchial blocker either intraluminally or extraluminally. Comparisons between the 2 groups included the following: (1) time for initial placement, (2) quality of isolation at 1-hour intervals during one-lung ventilation, (3) number of repositionings during one-lung ventilation, and (4) presence or absence of a sore throat on postoperative days 1 and 2 and, if present, its severity. Median time to placement (min:sec) in the extraluminal group was statistically faster at 2:42 compared with 6:24 in the intraluminal group (p < 0.05). Overall quality of isolation was similar between groups, even though a significant number of blockers in both groups required repositioning (extraluminal 47%, intraluminal 40%, p > 0.05), and 1 blocker ultimately had to be replaced intraoperatively. No differences in the incidence or severity of sore throat postoperatively were observed. A statistically significant reduction in time to placement using the extraluminal approach without any differences in the rate of postoperative sore throat was observed. Whether placed intraluminally or extraluminally, a significant percentage of Arndt endobronchial blockers required at least one intraoperative repositioning. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Observational Model for Precision Astrometry with the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava G.; Milman, Mark H.

    2000-01-01

    The Space Interferometry Mission (SIM) is a space-based 10-m baseline Michelson optical interferometer operating in the visible waveband that is designed to achieve astrometric accuracy in the single digits of the microarcsecond domain. Over a narrow field of view SIM is expected to achieve a mission accuracy of 1 microarcsecond. In this mode SIM will search for planetary companions to nearby stars by detecting the astrometric "wobble" relative to a nearby reference star. In its wide-angle mode, SIM will provide 4 microarcsecond precision absolute position measurements of stars, with parallaxes to comparable accuracy, at the end of its 5-year mission. The expected proper motion accuracy is around 3 microarcsecond/year, corresponding to a transverse velocity of 10 m/ s at a distance of 1 kpc. The basic astrometric observable of the SIM instrument is the pathlength delay. This measurement is made by a combination of internal metrology measurements that determine the distance the starlight travels through the two arms of the interferometer, and a measurement of the white light stellar fringe to find the point of equal pathlength. Because this operation requires a non-negligible integration time, the interferometer baseline vector is not stationary over this time period, as its absolute length and orientation are time varying. This paper addresses how the time varying baseline can be "regularized" so that it may act as a single baseline vector for multiple stars, as required for the solution of the astrometric equations.
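
    For reference, the basic single-baseline astrometric observable has the standard textbook form below (not quoted from this paper): the external delay is the projection of the baseline vector onto the unit vector toward the star, plus a constant instrumental offset. "Regularizing" the time-varying baseline then amounts to referring the measured delays to a single effective baseline vector so that this relation can be solved jointly for many stars.

```latex
% Standard single-baseline delay relation (textbook form, not quoted from the paper):
%   d        : measured pathlength delay (internal metrology + white-light fringe position)
%   \vec{B}  : interferometer baseline vector
%   \hat{s}  : unit vector toward the target star
%   C        : constant instrumental offset
d \;=\; \vec{B} \cdot \hat{s} \;+\; C
```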

  13. Development of a real time bistatic radar receiver using signals of opportunity

    NASA Astrophysics Data System (ADS)

    Rainville, Nicholas

    Passive bistatic radar remote sensing offers a novel method of monitoring the Earth's surface by observing reflected signals of opportunity. The Global Positioning System (GPS) has been used as a source of signals for these observations, and the scattering properties of GPS signals from rough surfaces are well understood. Recent work has extended GPS signal reflection observations and scattering models to include communications signals such as XM radio signals. However, the communication-signal reflectometry experiments to date have relied on collecting raw, high data-rate signals which are then post-processed after the end of the experiment. This thesis describes the development of a communication-signal bistatic radar receiver which computes a real-time correlation waveform that can be used to retrieve measurements of the Earth's surface. The real-time bistatic receiver greatly reduces the quantity of data that must be stored to perform the remote sensing measurements, as well as offering immediate feedback. This expands the applications for the receiver to include space- and bandwidth-limited platforms such as aircraft and satellites. It also makes possible the adjustment of flight plans to the observed conditions. The real-time receiver required the development of an FPGA-based signal processor, along with the integration of commercial Satellite Digital Audio Radio System (SDARS) components. The resulting device was tested both in a lab environment and on NOAA WP-3D and NASA WB-57 aircraft.
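
    The real-time correlation waveform referred to above is, in essence, a cross-correlation of the direct (reference) signal with the surface-reflected signal over a range of delays. The snippet below is a software illustration of that operation only; the actual receiver performs it in FPGA hardware on SDARS signals, and the function and variable names are assumptions of this sketch.

```python
import numpy as np

def correlation_waveform(reference, reflected, max_lag_samples):
    """Cross-correlate the direct signal with the reflected signal over a range
    of sample lags (illustrative software version of what the receiver does in
    hardware). Circular shifts are used for brevity; a streaming receiver would
    correlate buffered samples linearly."""
    n = min(len(reference), len(reflected))
    ref = np.array(reference[:n], dtype=float) - np.mean(reference[:n])
    refl = np.array(reflected[:n], dtype=float) - np.mean(reflected[:n])
    lags = np.arange(-max_lag_samples, max_lag_samples + 1)
    waveform = np.array([np.dot(np.roll(ref, lag), refl) / n for lag in lags])
    return lags, waveform

# Example with a synthetic delayed, attenuated echo of a random reference signal
rng = np.random.default_rng(1)
direct = rng.standard_normal(10000)
echo = 0.3 * np.roll(direct, 25) + 0.1 * rng.standard_normal(10000)
lags, wf = correlation_waveform(direct, echo, max_lag_samples=50)
print(lags[np.argmax(wf)])  # -> 25, the simulated delay of the reflection
```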

  14. A Microglitch in the Millisecond Pulsar PSR B1821-24 in M28

    NASA Astrophysics Data System (ADS)

    Cognard, Ismaël; Backer, Donald C.

    2004-09-01

    We report on the first observation of a very small glitch in a millisecond pulsar, PSR B1821-24, located in the globular cluster M28. Timing observations were mainly conducted with the Nançay radio telescope (France), and confirmation comes from data taken with the 140 ft radio telescope at Green Bank and the new Green Bank Telescope. This event is characterized by a rotation frequency step of 3 nHz, or 10^-11 in fractional frequency change, along with a short duration limited to a few days or a week. A marginally significant frequency derivative step was also found. This glitch follows the main characteristics of those in the slow-period pulsars but is 2 orders of magnitude smaller than the smallest ever recorded. Such an event must be very rare for millisecond pulsars, since no other glitches have been detected even though the cumulative number of years of millisecond pulsar timing observations up to 2001 is around 500 for all these objects. However, pulsar PSR B1821-24 is one of the youngest among the old recycled ones, and there is likely a correlation between age, or a related parameter, and timing noise. While this event happens on a much smaller scale, the required adjustment of the star to a new equilibrium figure as it spins down is a likely common cause for all glitches.
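
    The quoted fractional size of the glitch can be checked with one line of arithmetic, assuming the pulsar's catalogued spin frequency of roughly 327 Hz (a value taken from the pulsar literature, not stated in the abstract):

```latex
% Consistency check of the quoted fractional frequency step
% (spin frequency of ~327 Hz assumed from the literature):
\frac{\Delta\nu}{\nu} \;\approx\; \frac{3\ \mathrm{nHz}}{327\ \mathrm{Hz}}
\;\approx\; 9 \times 10^{-12} \;\sim\; 10^{-11}
```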

  15. Modeling volatility using state space models.

    PubMed

    Timmer, J; Weigend, A S

    1997-08-01

    In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
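
    The kind of model described above can be illustrated with a minimal one-dimensional state-space model for log-volatility, filtered with a scalar Kalman filter. This is a sketch under simplifying assumptions (a single AR(1) hidden state, Gaussian noise, known parameters), not the paper's estimated models, which use up to a two-dimensional hidden state.

```python
import numpy as np

def kalman_loglik(y, phi, q, r, mu):
    """Log-likelihood of a 1-D state-space model for log-volatility (sketch only).

    Hidden state:  h_t = mu + phi * (h_{t-1} - mu) + w_t,  w_t ~ N(0, q)  (dynamic noise)
    Observation:   y_t = h_t + v_t,                        v_t ~ N(0, r)  (observational noise)
    Here y_t would be, e.g., log squared returns."""
    h, p = mu, q / max(1e-12, 1.0 - phi ** 2)   # stationary initialisation
    loglik = 0.0
    for obs in y:
        h_pred = mu + phi * (h - mu)            # predict
        p_pred = phi ** 2 * p + q
        s = p_pred + r                          # innovation variance
        k = p_pred / s                          # Kalman gain
        innov = obs - h_pred
        loglik += -0.5 * (np.log(2 * np.pi * s) + innov ** 2 / s)
        h = h_pred + k * innov                  # update
        p = (1.0 - k) * p_pred
    return loglik

def relaxation_time(phi):
    """e-folding relaxation time (in observation steps) implied by the AR(1)
    coefficient; fits that ignore observational noise bias phi downward and
    therefore underestimate this time."""
    return -1.0 / np.log(phi)
```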

  16. A prospective audit of preprocedural fasting practices on a transplant ward: when fasting becomes starving.

    PubMed

    Vidot, Helen; Teevan, Kate; Carey, Sharon; Strasser, Simone; Shackel, Nicholas

    2016-03-01

    To investigate the prevalence and duration of preprocedural medically ordered fasting during a period of hospitalisation in an Australian population of patients with hepatic cirrhosis or following liver transplantation and to identify potential solutions to reduce fasting times. Protein-energy malnutrition is a common finding in patients with hepatic cirrhosis and can impact significantly on survival and quality of life. Protein and energy requirements in patients with cirrhosis are higher than those of healthy individuals. A significant feature of cirrhosis is the induction of starvation metabolism following seven to eight hours of food deprivation. Many investigative and interventional procedures for patients with cirrhosis necessitate a period of fasting to comply with anaesthesia guidelines. An observational study of the fasting episodes for 34 hospitalised patients with hepatic cirrhosis or following liver transplantation. Nutritional status was estimated using subjective global assessment and handgrip strength. The prevalence and duration of fasting practices for diagnostic or investigational procedures were estimated using electronic records and patient notes. Thirty-three patients (97%) were malnourished. Twenty-two patients (65%) were fasted during the observation period. There were 43 occasions of fasting with a median fasting time of 13·5 hours. On 40 occasions fasting times exceeded the maximum six-hour guideline recommended prior to the administration of anaesthesia by the majority of Anaesthesiology Societies. The majority of procedures (77%) requiring fasting occurred after midday. Eating breakfast on the day of the procedure reduced fasting time by 45%. Medically ordered preprocedural fasting times almost always exceed existing guidelines in this nutritionally compromised group. Adherence to fasting guidelines and eating breakfast before the procedure can reduce fasting times significantly and avoid the potential induction of starvation metabolism in this nutritionally at risk group. © 2016 John Wiley & Sons Ltd.

  17. Bud break responds more strongly to daytime than night-time temperature under asymmetric experimental warming.

    PubMed

    Rossi, Sergio; Isabel, Nathalie

    2017-01-01

    Global warming is diurnally asymmetric, leading to a less cold, rather than warmer, climate. We investigated the effects of asymmetric experimental warming on plant phenology by testing the hypothesis that daytime warming is more effective in advancing bud break than night-time warming. Bud break was monitored daily in Picea mariana seedlings belonging to 20 provenances from Eastern Canada and subjected to daytime and night-time warming in growth chambers at temperatures varying between 8 and 16 °C. The greatest advancement of bud break and the shortest times required to complete the phenological phases occurred with daytime warming. Seedlings responded to night-time warming, but still with less advancement of bud break than under daytime warming. No advancement was observed when night-time warming was associated with daytime cooling. The effect of the treatments was uniform across provenances. Our observations under controlled conditions demonstrate experimentally that bud break can advance under night-time warming, but to a lesser extent than under daytime warming. Prediction models using daily timescales could neglect the diverging influence of asymmetric warming and should be recalibrated for higher temporal resolutions. © 2016 John Wiley & Sons Ltd.

  18. Mandatory Nap Times and Group Napping Patterns in Child Care: An Observational Study.

    PubMed

    Staton, Sally L; Smith, Simon S; Hurst, Cameron; Pattinson, Cassandra L; Thorpe, Karen J

    2017-01-01

    Policy provision for naps is typical in child care settings, but there is variability in the practices employed. One practice that might modify children's early sleep patterns is the allocation of a mandatory nap time in which all children are required to lie on their beds without alternate activity permitted. There is currently limited evidence of the effects of such practices on children's napping patterns. This study examined the association between duration of mandatory nap times and group-level napping patterns in child care settings. Observations were undertaken in a community sample of 113 preschool rooms with a scheduled nap time (N = 2,114 children). Results showed that 83.5% of child care settings implemented a mandatory nap time (range = 15-145 min) while 14.2% provided alternate activities for children throughout the nap time period. Overall, 31% of children napped during nap times. Compared to rooms with ≤ 30 min of mandatory nap time, rooms with 31-60 min and > 60 min of mandatory nap time had a two-and-a-half and fourfold increase, respectively, in the proportion of children napping. Nap onset latency did not significantly differ across groups. Among preschool children, exposure to longer mandatory nap times in child care may increase incidence of napping.

  19. The SIRTF Legacy Observing Program

    NASA Astrophysics Data System (ADS)

    Greenhouse, M. A.; Leisawitz, D.; Gehrz, R. D.; Clemens, D. P.; Force, Sirtf Community Task

    1997-12-01

    Legacy Observations and General Observations (GO) are separate categories in which SIRTF observing time will be allocated through peer-reviewed community proposals. The Legacy Program will embrace several projects, each headed by a Legacy Principal Investigator. Legacy Observations are distinguished from General Observations by the following three criteria: [1] the project is a large, coherent investigation whose scientific goals cannot be met by a number of smaller, uncoordinated projects; [2] the data will be of both general and lasting importance to the broad astronomical community and of immediate utility in motivating and planning follow-on GO investigations with SIRTF; and [3] the data (unprocessed, fully processed, and at intermediate steps in processing) will be placed in a public database immediately and with no proprietary period. The goals of the SIRTF Legacy program are: [1] to enable community use of SIRTF for large coherent survey observations, [2] to provide prompt community access to SIRTF survey data, and [3] to enable GO program observations based on Legacy program results. A likely attribute (but not a requirement) for Legacy projects is that they may involve hundreds, and perhaps thousands, of hours of observing time. It is anticipated that as much as 6000 hours of telescope time will be allocated through the Legacy program. To meet Legacy program goal [3], allocation of as much as 70% of SIRTF's first year on orbit to Legacy projects may be necessary, and the observing phase of the Legacy program will be completed during the following year. A Legacy call for proposals will be issued 1 year prior to launch or sooner, and will be open to all scientists and science topics. In this poster, we display Legacy program definition and schedule items that will be of interest to those intending to propose under this unique opportunity.

  20. Request to monitor 0103+59 HT Cas, 0809-76 Z Cha, 1004-69 OY Car AND Request to monitor 2147+13 LS Peg AND Request to monitor 1743-12 V378 Ser (Nova Ser 2005)

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2005-06-01

    AAVSO Alert Notice 317 has three topics. First: Drs. Christopher Mauche (Lawrence Livermore National Laboratory), Peter Wheatley (Univ. of Leicester), and Koji Mukai (NASA GSFC) have obtained time on XMM-Newton to observe HT Cas, Z Cha, or OY Car in outburst. AAVSO assistance is requested in monitoring these stars closely so we can inform them promptly when any of them go into outburst. Very prompt notification is essential, because the satellite requires 2-4 days to move to the target after the observations are triggered, and the superoutbursts of OY Car and Z Cha last only about 10 days, while the HT Cas outbursts last only a little more than 2 days. Second: Dr. Darren Baskill (Univ. of Leicester) has requested optical observations of LS Peg (currently suspected as being a DQ Her nova-like) to coincide with upcoming observations by XMM-Newton. Observations are requested from now until July 8, with time series 12 hours before and after, and also during the XMM observation. Use an Ic or V filter (Ic preferred), maximum time precision, S/N=100. Third: Dr. Alon Retter (Penn State Univ.) has requested AAVSO assistance in observing V378 Ser (Nova Serpentis 2005). Please monitor V378 Ser over the coming weeks as the nova fades and report your observations to the AAVSO. Both visual and CCD observations are encouraged. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.
