Sample records for missing satellites problem

  1. Implications for the missing low-mass galaxies (satellites) problem from cosmic shear

    NASA Astrophysics Data System (ADS)

    Jimenez, Raul; Verde, Licia; Kitching, Thomas D.

    2018-06-01

    The number of observed dwarf galaxies with dark matter mass ≲ 10^11 M⊙ in the Milky Way or the Andromeda galaxy does not agree with predictions from the otherwise successful ΛCDM paradigm. To alleviate this problem, a suppression of dark matter clustering power on very small scales has been conjectured. However, the abundance of dark matter halos outside our immediate neighbourhood (the Local Group) seems to agree with the ΛCDM-expected abundance. Here we connect these problems to observations of weak lensing cosmic shear, pointing out that cosmic shear can make significant statements about the missing satellites problem in a statistical way. As an example and pedagogical application we use recent constraints on small-scale power suppression from measurements of the CFHTLenS data. We find that, on average, in a region of ~Gpc^3 there is no significant small-scale power suppression. This implies that suppression of small-scale power is not a viable solution to the `missing satellites problem' or, alternatively, that on average in this volume there is no `missing satellites problem' for dark matter masses ≳ 5 × 10^9 M⊙. Further analysis of current and future weak lensing surveys will probe much smaller scales, k > 10h Mpc^-1, corresponding roughly to masses M < 10^9 M⊙.

  2. A Solution to ``Too Big to Fail''

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-10-01

    It's a tricky business to reconcile simulations of our galaxy's formation with our current observations of the Milky Way and its satellites. In a recent study, scientists have addressed one discrepancy between simulations and observations: the so-called "too big to fail" problem.

    From Missing Satellites to Too Big to Fail

    The favored model of the universe is the lambda-cold-dark-matter (ΛCDM) cosmological model. This model does a great job of correctly predicting the large-scale structure of the universe, but there are still a few problems with it on smaller scales. [Image: Hubble image of UGC 5497, a dwarf galaxy associated with Messier 81. In the missing satellite problem, simulations of galaxy formation predict that there should be more such satellite galaxies than we observe. ESA/NASA]

    The first is the missing satellites problem: ΛCDM cosmology predicts that galaxies like the Milky Way should have significantly more satellite galaxies than we observe. A proposed solution to this problem is the argument that there may exist many more satellites than we've observed, but these dwarf galaxies have had their stars stripped from them during tidal interactions, which prevents us from being able to see them.

    This solution creates a new problem, though: the "too big to fail" problem. This problem states that many of the satellites predicted by ΛCDM cosmology are simply so massive that there's no way they couldn't have visible stars. Another way of looking at it: the observed satellites of the Milky Way are not massive enough to be consistent with predictions from ΛCDM. [Image: Artist's illustration of a supernova, a type of stellar feedback that can modify the dark-matter distribution of a satellite galaxy. NASA/CXC/M. Weiss]

    Density Profiles and Tidal Stirring

    Led by Mihai Tomozeiu (University of Zurich), a team of scientists has published a study in which they propose a solution to the too-big-to-fail problem. By running detailed cosmological zoom simulations of our galaxy's formation, Tomozeiu and collaborators modeled the dark matter and the stellar content of the galaxy, tracking the formation and evolution of dark-matter subhalos. Based on the results of their simulations, the team argues that the too-big-to-fail problem can be resolved by combining two effects:

    1. Stellar feedback in a satellite galaxy can modify its dark-matter distribution, lowering the dark-matter density in the galaxy's center and creating a shallower density profile. Satellites with such shallow density profiles evolve differently than those typically modeled, which have a high concentration of dark matter in their centers.
    2. After these satellites fall into the Milky Way's potential, tidal effects such as shocks and stripping modify the mass distribution of both the dark matter and the baryons even further.

    [Figure: Each curve represents a simulated satellite's circular velocity (which corresponds to its total mass) at z = 0. Left: results using typical dark-matter density profiles. Right: results using the shallower profiles expected when stellar feedback is included. Results from the shallower profiles are consistent with observed Milky Way satellites (black crosses). Adapted from Tomozeiu et al. 2016]

    A Match to Observations

    Tomozeiu and collaborators found that when they used traditional density profiles to model the satellites, the satellites at z = 0 in the simulation were much larger than those we observe around the Milky Way, consistent with the too-big-to-fail problem. When the team used shallower density profiles and took tidal effects into account, however, the simulations produced a distribution of satellites at z = 0 that is consistent with what we observe.

    This study provides a tidy potential solution to the too-big-to-fail problem, further strengthening the support for ΛCDM cosmology.

    Citation: Mihai Tomozeiu et al 2016 ApJ 827 L15. doi:10.3847/2041-8205/827/1/L15

  3. Ship detection in satellite imagery using rank-order greyscale hit-or-miss transforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Neal R; Porter, Reid B; Theiler, James

    2010-01-01

    Ship detection from satellite imagery has great utility in various communities: knowing where ships are and their types provides useful intelligence information. However, detecting and recognizing ships is a difficult problem, and existing techniques suffer from too many false alarms. We describe approaches we have taken in trying to build ship detection algorithms with reduced false alarms. Our approach uses a version of the grayscale morphological hit-or-miss transform. While this transform is well known and used in its standard form, we use a version in which a rank-order selection replaces the standard maximum and minimum operators in the dilation and erosion parts of the transform. This provides some slack in the fitting that the algorithm employs and provides a method for tuning the algorithm's performance for particular detection problems. We describe our algorithms, show the effect of the rank-order parameter on the algorithm's performance, and illustrate the use of this approach for real ship detection problems with panchromatic satellite imagery.
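    The rank-order substitution described above can be sketched in a few lines with `scipy`: the strict minimum (erosion) and maximum (dilation) over the structuring-element footprints are replaced by nearby rank values, which tolerates a few outlying pixels under each template. The footprints and the `rank_frac` slack parameter below are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.ndimage import rank_filter

def rank_order_hmt(image, fg, bg, rank_frac=0.1):
    """Rank-order greyscale hit-or-miss transform (sketch).

    fg, bg: boolean footprints for the foreground (object) and
    background (surround) structuring elements.  Instead of the strict
    min/max, a near-minimum rank is used for the erosion and a
    near-maximum rank for the dilation, giving slack in the fit."""
    n_fg, n_bg = int(fg.sum()), int(bg.sum())
    r_lo = int(rank_frac * (n_fg - 1))          # near-minimum rank
    r_hi = int((1 - rank_frac) * (n_bg - 1))    # near-maximum rank
    ero = rank_filter(image, rank=r_lo, footprint=fg)
    dil = rank_filter(image, rank=r_hi, footprint=bg)
    # Positive response where the foreground fit exceeds the background.
    return np.clip(ero - dil, 0, None)
```

With `rank_frac=0` this reduces to the standard transform (true min and max); raising it trades false alarms against missed detections.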

  4. A BARYONIC SOLUTION TO THE MISSING SATELLITES PROBLEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Alyson M.; Kuhlen, Michael; Zolotov, Adi

    2013-03-01

    It has been demonstrated that the inclusion of baryonic physics can alter the dark matter densities in the centers of low-mass galaxies, making the central dark matter slope more shallow than predicted in pure cold dark matter simulations. This flattening of the dark matter profile can occur in the most luminous subhalos around Milky Way mass galaxies. Zolotov et al. have suggested a correction to be applied to the central masses of dark matter-only satellites in order to mimic the effect of (1) the flattening of the dark matter cusp due to supernova feedback in luminous satellites and (2) enhanced tidal stripping due to the presence of a baryonic disk. In this paper, we apply this correction to the z = 0 subhalo masses from the high resolution, dark matter-only Via Lactea II (VL2) simulation, and find that the number of massive subhalos is dramatically reduced. After adopting a stellar mass to halo mass relationship for the VL2 halos, and identifying subhalos that are (1) likely to be destroyed by stripping and (2) likely to have star formation suppressed by photo-heating, we find that the number of massive, luminous satellites around a Milky Way mass galaxy is in agreement with the number of observed satellites around the Milky Way or M31. We conclude that baryonic processes have the potential to solve the missing satellites problem.

  6. 1.688 g/cm(3) satellite-related repeats: a missing link to dosage compensation and speciation.

    PubMed

    Gallach, Miguel

    2015-09-01

    Despite the important progress that has been made on dosage compensation (DC), a critical link in our understanding of the X chromosome recognition mechanisms is still missing. Recent studies in Drosophila indicate that the missing link could be a family of DNA repeats populating the euchromatin of the X chromosome. In this opinion article, I discuss how these findings add a fresh twist to the DC problem. In the following sections, I first summarize our understanding of DC in Drosophila and integrate these recent discoveries into our knowledge of the X chromosome recognition problem. Next, I introduce a model according to which 1.688 g/cm(3) satellite-related (SR) repeats would be the primary recognition elements for the dosage compensation complex. Contrary to the current belief, I suggest that the DC system in Drosophila is not conserved and static, but is continuously co-evolving with the target SR repeats. The potential role of the SR repeats in hybrid incompatibilities and speciation is also discussed. © 2015 John Wiley & Sons Ltd.

  7. Demise of faint satellites around isolated early-type galaxies

    NASA Astrophysics Data System (ADS)

    Park, Changbom; Hwang, Ho Seong; Park, Hyunbae; Lee, Jong Chul

    2018-02-01

    The hierarchical galaxy formation scenario in the Cold Dark Matter cosmology with a non-vanishing cosmological constant Λ and geometrically flat space (ΛCDM) has been very successful in explaining the large-scale distribution of galaxies. However, there have been claims that ΛCDM over-predicts the number of satellite galaxies associated with massive galaxies compared with observations, the missing satellite galaxy problem [1-3]. Isolated groups of galaxies hosted by passively evolving massive early-type galaxies are ideal laboratories for identifying the missing physics in the current theory [4-11]. Here, we report, based on a deep spectroscopic survey, that isolated massive and passive early-type galaxies without any signs of recent wet mergers or accretion episodes have almost no satellite galaxies fainter than the r-band absolute magnitude of about Mr = -14. If only early-type satellites are used, the cutoff is at the somewhat brighter magnitude of about Mr = -15. Such a cutoff has not been found in other nearby satellite galaxy systems hosted by late-type galaxies or those with merger features. Various physical properties of satellites depend strongly on the host-centric distance. Our observations indicate that the satellite galaxy luminosity function is largely determined by the interaction of satellites with the environment provided by their host.

  8. Guidance and Control System for a Satellite Constellation

    NASA Technical Reports Server (NTRS)

    Bryson, Jonathan Lamar; Cox, James; Mays, Paul Richard; Neidhoefer, James Christian; Ephrain, Richard

    2010-01-01

    A distributed guidance and control algorithm was developed for a constellation of satellites. The system repositions satellites as required, regulates satellites to desired orbits, and prevents collisions.

    1. Optimal methods are used to compute nominal transfers from orbit to orbit.
    2. Satellites are regulated to maintain the desired orbits once the transfers are complete.
    3. A simulator is used to predict potential collisions or near-misses.
    4. Each satellite computes perturbations to its controls so as to increase any unacceptable distances of nearest approach to other objects.
       a. The avoidance problem is recast in a distributed and locally-linear form to arrive at a tractable solution.
       b. Plant matrix values are approximated via simulation at each time step.
       c. The Linear Quadratic Gaussian (LQG) method is used to compute perturbations to the controls that will result in increased miss distances.
    5. Once all danger has passed, the satellites return to their original orbits, all the while avoiding each other as above.
    6. The delta-Vs are reasonable. The controller begins maneuvers as soon as practical to minimize delta-V.
    7. Despite the inclusion of trajectory simulations within the control loop, the algorithm is sufficiently fast for available satellite computer hardware.
    8. The required measurement accuracies are within the capabilities of modern inertial measurement devices and modern positioning devices.
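    The deterministic half of the LQG step in item 4c can be sketched as a discrete-time LQR gain computed from the locally-linear plant (the Gaussian estimation half is omitted here). This is a minimal illustration assuming `numpy` and `scipy`; the double-integrator plant stands in for the simulated plant matrices of item 4b and is not from the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def perturbation_gain(A, B, Q, R):
    """LQR gain K for x_{k+1} = A x_k + B u_k: the feedback
    u_k = -K x_k minimizes the sum of x'Qx + u'Ru."""
    P = solve_discrete_are(A, B, Q, R)
    return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Illustrative plant: relative along-track position/velocity as a
# double integrator with unit time step.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])
K = perturbation_gain(A, B, np.eye(2), np.array([[1.0]]))
```

In the paper's setting the state would encode the predicted miss distance and the cost weights would penalize unacceptably close approaches; the resulting closed loop A - BK is stable by construction.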

  9. Large Scale Crop Classification in Ukraine using Multi-temporal Landsat-8 Images with Missing Data

    NASA Astrophysics Data System (ADS)

    Kussul, N.; Skakun, S.; Shelestov, A.; Lavreniuk, M. S.

    2014-12-01

    At present, there are no globally available Earth observation (EO) derived products on crop maps. This issue is being addressed within the Sentinel-2 for Agriculture initiative, in which a number of test sites (including from JECAM) participate to provide coherent protocols and best practices for various global agriculture systems, and subsequently crop maps from Sentinel-2. One of the problems in dealing with optical images for large territories (more than 10,000 sq. km) is the presence of clouds and shadows that results in missing values in the data sets. In this abstract, a new approach to the classification of multi-temporal optical satellite imagery with missing data due to clouds and shadows is proposed. First, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of satellite imagery. SOMs are trained for each spectral band separately using non-missing values. Missing values are restored through a special procedure that substitutes an input sample's missing components with a neuron's weight coefficients. After missing-data restoration, a supervised classification is performed on the multi-temporal satellite images. For this, an ensemble of neural networks, in particular multilayer perceptrons (MLPs), is proposed. Ensembling of neural networks is done by the technique of average committee, i.e. calculating the average class probability over the classifiers and selecting the class with the highest average posterior probability for the given input sample. The proposed approach is applied to large-scale crop classification using multi-temporal Landsat-8 images for the JECAM test site in Ukraine [1-2]. It is shown that an ensemble of MLPs provides better performance than a single neural network in terms of overall classification accuracy and kappa coefficient. The obtained classification map is also validated through estimated crop and forest areas and comparison to official statistics.

    1. A.Yu. Shelestov et al., "Geospatial information system for agricultural monitoring," Cybernetics Syst. Anal., vol. 49, no. 1, pp. 124-132, 2013.
    2. J. Gallego et al., "Efficiency Assessment of Different Approaches to Crop Classification Based on Satellite and Ground Observations," J. Autom. Inform. Sci., vol. 44, no. 5, pp. 67-80, 2012.
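    Both stages described in the abstract, SOM-based substitution of missing components and the average-committee ensemble, can be sketched in a few lines of `numpy`. The toy codebook and probabilities below are illustrative assumptions, not the trained maps or classifiers from the study.

```python
import numpy as np

def som_fill(sample, weights):
    """Restore missing (NaN) components of a sample: find the
    best-matching SOM neuron using only the observed components,
    then substitute the neuron's weights for the missing ones."""
    obs = ~np.isnan(sample)
    dists = ((weights[:, obs] - sample[obs]) ** 2).sum(axis=1)
    bmu = int(np.argmin(dists))          # best-matching unit
    filled = sample.copy()
    filled[~obs] = weights[bmu, ~obs]
    return filled

def average_committee(prob_list):
    """Average the class probabilities over the ensemble members and
    select the class with the highest mean posterior per sample."""
    return np.mean(prob_list, axis=0).argmax(axis=1)
```

Usage: `som_fill(np.array([0.9, np.nan, 1.1]), codebook)` fills the gap from the closest codebook vector; `average_committee([p_mlp1, p_mlp2])` combines two MLPs' per-class probability matrices.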

  10. A unified solution to the small scale problems of the ΛCDM model II: introducing parent-satellite interaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popolo, A. Del; Delliou, M. Le, E-mail: adelpopolo@oact.inaf.it, E-mail: delliou@ift.unesp.br

    2014-12-01

    We continue the study of the impact of baryon physics on the small-scale problems of the ΛCDM model, based on a semi-analytical model (Del Popolo, 2009). With such a model, we show how the cusp/core, missing satellite (MSP) and Too Big to Fail (TBTF) problems and the angular momentum catastrophe can be reconciled with observations by adding parent-satellite interaction. Such interaction between dark matter (DM) and baryons through dynamical friction (DF) can sufficiently flatten the inner cusp of the density profiles to solve the cusp/core problem. Combining, in our model, a Zolotov et al. (2012)-like correction, similarly to Brooks et al. (2013), and effects of UV heating and tidal stripping, the number of massive, luminous satellites, as seen in the Via Lactea 2 (VL2) subhaloes, is in agreement with the numbers observed in the MW, thus resolving the MSP and TBTF problems. The model also produces a distribution of the angular spin parameter and angular momentum in agreement with observations of the dwarfs studied by van den Bosch, Burkert, and Swaters (2001).

  11. Synthetic-Aperture Silhouette Imaging (SASI)

    NASA Astrophysics Data System (ADS)

    Paxman, R.

    2016-09-01

    The problem of ground-based fine-resolution imaging of geosynchronous satellites continues to be an important unsolved space-surveillance problem. We are investigating a passive-illumination approach that is radically different from amplitude, intensity, or heterodyne interferometry approaches. The approach, called Synthetic-Aperture Silhouette Imaging (SASI), produces a fine-resolution image of the satellite silhouette. When plane-wave radiation emanating from a bright star is occluded by a GEO satellite, the light is diffracted and a moving diffraction pattern (shadow) is cast on the surface of the Earth. With prior knowledge of the satellite orbit and star location, the track of the moving shadow can be predicted with high precision. A linear array of inexpensive hobby telescopes can be deployed roughly perpendicular to the shadow track to collect a time history of the star intensity as the shadow passes by. A phase-retrieval algorithm, using the strong constraint that the occlusion of the satellite is a binary-valued silhouette, allows us to retrieve the missing phase and reconstruct a fine-resolution image of the silhouette. Silhouettes are highly informative, providing diagnostic information about deployment of antennas and solar panels, enabling satellite pose estimation, and revealing the presence and orientation of neighboring satellites in rendezvous and proximity operations.

  12. Resilient Sensor Networks with Spatiotemporal Interpolation of Missing Sensors: An Example of Space Weather Forecasting by Multiple Satellites

    PubMed Central

    Tokumitsu, Masahiro; Hasegawa, Keisuke; Ishida, Yoshiteru

    2016-01-01

    This paper attempts to construct a resilient sensor network model, with space weather forecasting as an example. The proposed model is based on a dynamic relational network. Space weather forecasting is vital for satellite operation, because an operational team needs to make decisions about providing its satellite service. The proposed model is resilient to sensor failures and to data missing due to satellite operations. In the proposed model, the missing data of a sensor are interpolated from the associated sensors. This paper demonstrates two examples of space weather forecasting that involve missing observations in some test cases. In these examples, the sensor network for space weather forecasting continues a diagnosis by replacing faulted sensors with virtual ones. The demonstrations showed that the proposed model is resilient against sensor failures caused by hardware faults or by suspension of observations for technical reasons. PMID:27092508

  14. Reconstruction and downscaling of Eastern Mediterranean OSCAR satellite surface current data using DINEOF

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Andreas; Stylianou, Stavros; Georgiou, Georgios; Hajimitsis, Diofantos; Gravanis, Elias; Akylas, Evangelos

    2015-04-01

    During the last decade, Rixen (2005) and Alvera-Azkarate (2010) presented DINEOF (Data Interpolating Empirical Orthogonal Functions), an EOF-based technique to reconstruct missing data in satellite images. The DINEOF method has proved relatively successful in various experimental trials (Wang and Liu, 2013; Nikolaidis et al., 2013; 2014) and tends to be an effective and computationally affordable solution to the problem of reconstructing missing data in geophysical fields derived from satellite data, such as chlorophyll-a, sea surface temperature, or salinity. Implementing this method in a GIS system provides a more complete, integrated approach and extends its applicability. This may be especially useful in studies where data of different kinds have to be examined together. For this purpose, in this study we have implemented and present a GIS toolbox that automates the usage of the algorithm, incorporating the DINEOF codes provided by GHER (the GeoHydrodynamics and Environment Research group of the University of Liège) into ArcGIS®. ArcGIS® is a well-known standard in Geographical Information Systems, used over the years for various remote sensing procedures in sea and land environments alike. A case study of filling in missing satellite-derived current data in the Eastern Mediterranean Sea over a monthly period is analyzed as an example of the effectiveness and simplicity of the toolbox. The study focuses on OSCAR satellite data (http://www.oscar.noaa.gov/) collected by the NOAA/NESDIS Operational Surface Current Processing and Data Center from the respective products of the OSCAR Project Office of the Earth and Space Research organization, which provides free online access at unfiltered (1/3 degree) resolution. All the 5-day mean product data coverages were successfully reconstructed.

    KEY WORDS: Remote Sensing, Cyprus, Mediterranean, DINEOF, ArcGIS, data reconstruction.

  15. Global Studies of Molecular Clouds in the Galaxy, The Magellanic Clouds, and M31

    NASA Technical Reports Server (NTRS)

    Thaddeus, Patrick

    1999-01-01

    Over the course of this grant we used various spacecraft surveys of the Galaxy and M31, in conjunction with our extensive CO spectral line surveys, to address central problems in galactic structure and the astrophysics of molecular clouds. These problems included the nature of the molecular ring and its relation to the spiral arms and central bar, the cosmic ray distribution, the origin of the diffuse X-ray background, the distribution and properties of X-ray sources and supernova remnants, and the Galactic stellar mass distribution. For many of these problems, the nearby spiral M31 provided an important complementary perspective. Our CO surveys of GMCs (Giant Molecular Clouds) were crucial for interpreting Galactic continuum surveys from satellites such as GRO (Gamma Ray Observatory), ROSAT (Roentgen Satellite), IRAS (Infrared Astronomy Satellite), and COBE (Cosmic Background Explorer Satellite) because they provided the missing dimension of velocity or kinematic distance. GMCs are a well-defined and widespread population of objects whose velocities we could readily measure throughout the Galaxy. Through various emission and absorption mechanisms involving their gas, dust, or associated Population I objects, GMCs modulate the galactic emission in virtually every major wavelength band. Furthermore, the visibility of GMCs at so many wavelengths provided various methods of resolving the kinematic distance ambiguity for these objects in the inner Galaxy. Summaries of our accomplishments in each of the major wavelength bands discussed in our original proposal are given.

  16. Satellite orbital conjunction reports assessing threatening encounters in space (SOCRATES)

    NASA Astrophysics Data System (ADS)

    Kelso, T. S.; Alfano, S.

    2006-05-01

    While many satellite operators are aware of the possibility of a collision between their satellite and another object in Earth orbit, most seem unaware of the frequency of near misses occurring each day. Until recently, no service existed to advise satellite operators of an impending conjunction of a satellite payload with another satellite, putting the responsibility for determining these occurrences squarely on the satellite operator's shoulders. This problem has been further confounded by the lack of a timely, comprehensive data set of satellite orbital element sets and of computationally efficient tools to provide predictions using industry-standard software. As a result, hundreds of conjunctions within 1 km occur each week, with little or no intervention, putting billions of dollars of space hardware at risk, along with their associated missions. As a service to the satellite operator community, the Center for Space Standards & Innovation (CSSI) offers SOCRATES (Satellite Orbital Conjunction Reports Assessing Threatening Encounters in Space). Twice each day, CSSI runs a list of all satellite payloads on orbit against a list of all objects on orbit, using the catalog of all unclassified NORAD two-line element sets, to look for conjunctions over the next seven days. The runs are made using STK/CAT (Satellite Tool Kit's Conjunction Analysis Tools) together with the NORAD SGP4 propagator in STK. This paper will discuss how SOCRATES works and how it can help satellite operators avoid undesired close approaches through advanced mission planning.

  17. The predicted luminous satellite populations around SMC- and LMC-mass galaxies - a missing satellite problem around the LMC?

    NASA Astrophysics Data System (ADS)

    Dooley, Gregory A.; Peter, Annika H. G.; Carlin, Jeffrey L.; Frebel, Anna; Bechtol, Keith; Willman, Beth

    2017-11-01

    Recent discovery of many dwarf satellite galaxies in the direction of the Small and Large Magellanic Clouds (SMC and LMC) provokes questions of their origins, and what they can reveal about galaxy evolution theory. Here, we predict the satellite stellar mass function of Magellanic Cloud-mass host galaxies using abundance matching and reionization models applied to the Caterpillar simulations. Specifically focusing on the volume within 50 kpc of the LMC, we predict a mean of four to eight satellites with stellar mass M* > 10^4 M⊙, and three to four satellites with 80 < M* ≤ 3000 M⊙. Surprisingly, all 12 currently known satellite candidates have stellar masses of 80 < M* ≤ 3000 M⊙. Reconciling the dearth of large satellites and profusion of small satellites is challenging and may require a combination of a major modification of the M*-M_halo relationship (steep, but with an abrupt flattening at 10^3 M⊙), late reionization for the Local Group (z_reion ≲ 9 preferred) and/or strong tidal stripping. We can more robustly predict that ∼53 per cent of satellites within this volume were accreted together with the LMC and SMC and ∼47 per cent were only ever Milky Way satellites. Observing satellites of isolated LMC-sized field galaxies is essential to place the LMC in context, and to better constrain the M*-M_halo relationship. Modelling known LMC-sized galaxies within 8 Mpc, we predict 1-6 (2-12) satellites with M* > 10^5 M⊙ (M* > 10^4 M⊙) within the virial volume of each, and 1-3 (1-7) within a single 1.5° diameter field of view, making their discovery likely.

  18. Condensing Massive Satellite Datasets For Rapid Interactive Analysis

    NASA Astrophysics Data System (ADS)

    Grant, G.; Gallaher, D. W.; Lv, Q.; Campbell, G. G.; Fowler, C.; LIU, Q.; Chen, C.; Klucik, R.; McAllister, R. A.

    2015-12-01

    Our goal is to enable users to interactively analyze massive satellite datasets, identifying anomalous data or values that fall outside of thresholds. To achieve this, the project seeks to create a derived database containing only the most relevant information, accelerating the analysis process. The database is designed to be an ancillary tool for the researcher, not an archival database to replace the original data. This approach is aimed at improving performance by reducing the overall size by way of condensing the data. The primary challenges of the project include:

    - The nature of the research question(s) may not be known ahead of time.
    - The thresholds for determining anomalies may be uncertain.
    - Problems associated with processing cloudy, missing, or noisy satellite imagery.
    - The contents and method of creation of the condensed dataset must be easily explainable to users.

    The architecture of the database will reorganize spatially-oriented satellite imagery into temporally-oriented columns of data (a.k.a. "data rods") to facilitate time-series analysis. The database itself is an open-source parallel database, designed to make full use of clustered server technologies. A demonstration of the system capabilities will be shown. Applications for this technology include quick-look views of the data, as well as the potential for on-board satellite processing of essential information, with the goal of reducing data latency.
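    The "data rods" reorganization described above is, at its core, a transpose of the image stack: the same values, re-indexed so each row is one pixel's full time series. A minimal sketch assuming `numpy` (function name illustrative):

```python
import numpy as np

def to_data_rods(stack):
    """Reorganize a (time, rows, cols) image stack into temporally
    oriented columns ("data rods"): one row per pixel, one column per
    time step, i.e. shape (rows*cols, time)."""
    t, r, c = stack.shape
    return stack.reshape(t, r * c).T
```

After this reorganization a per-pixel time-series scan (e.g. thresholding for anomalies) is a contiguous, cache-friendly pass over each rod instead of a strided walk across many images.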

  19. Reconstruction of Missing Pixels in Satellite Images Using the Data Interpolating Empirical Orthogonal Function (DINEOF)

    NASA Astrophysics Data System (ADS)

    Liu, X.; Wang, M.

    2016-02-01

    For coastal and inland waters, spatially complete and frequent satellite measurements are important in order to monitor and understand coastal biological and ecological processes and phenomena, such as diurnal variations. High-frequency images of the water diffuse attenuation coefficient at the wavelength of 490 nm (Kd(490)) derived from the Korean Geostationary Ocean Color Imager (GOCI) provide a unique opportunity to study diurnal variation of the water turbidity in coastal regions of the Bohai Sea, Yellow Sea, and East China Sea. However, there are many missing pixels in the original GOCI-derived Kd(490) images due to clouds and various other reasons. The Data Interpolating Empirical Orthogonal Function (DINEOF) is a method to reconstruct missing data in geophysical datasets based on Empirical Orthogonal Functions (EOFs). In this study, DINEOF is applied to GOCI-derived Kd(490) data in the Yangtze River mouth and the Yellow River mouth regions; the DINEOF-reconstructed Kd(490) data are used to fill in the missing pixels, and the spatial patterns and temporal functions of the first three EOF modes are also used to investigate the sub-diurnal variation due to tidal forcing. In addition, the DINEOF method is applied to data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on board the Suomi National Polar-orbiting Partnership (SNPP) satellite to reconstruct missing pixels in the daily Kd(490) and chlorophyll-a concentration images, and some application examples in the Chesapeake Bay and the Gulf of Mexico will be presented.
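    The core of DINEOF can be sketched as an iterative truncated-SVD reconstruction of the space-time data matrix: initialize the gaps, reconstruct from the leading EOF modes, refill the gaps, and repeat. This is a bare-bones illustration assuming `numpy`; the operational method additionally cross-validates to choose the number of modes and iterates each mode count to convergence.

```python
import numpy as np

def dineof(X, n_modes=2, n_iter=50):
    """Minimal DINEOF-style gap filling for a 2-D (space x time)
    data matrix X with NaNs marking missing values."""
    X = X.astype(float).copy()
    gaps = np.isnan(X)
    X[gaps] = np.nanmean(X)                  # crude initial guess
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # Reconstruction from the leading EOF modes only.
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        X[gaps] = recon[gaps]                # refill gaps, keep data
    return X
```

For satellite imagery each column would be one scene flattened to a vector, so a filled column is a gap-free map consistent with the dominant spatial modes.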

  20. A novel framework for objective detection and tracking of TC center from noisy satellite imagery

    NASA Astrophysics Data System (ADS)

    Johnson, Bibin; Thomas, Sachin; Rani, J. Sheeba

    2018-07-01

    This paper proposes a novel framework for automatically determining and tracking the center of a tropical cyclone (TC) during its entire life cycle from the thermal infrared (TIR) channel data of geostationary satellites. The proposed method handles meteorological images with noise and missing or partial information due to seasonal variability and a lack of significant spatial or vortex features. To retrieve the cyclone center under these circumstances, a synergistic approach based on objective measures and a Numerical Weather Prediction (NWP) model is proposed. The method employs a spatial gradient scheme to process missing and noisy frames, or a spatio-temporal gradient scheme for image sequences that are continuous and contain less noise. The initial estimate of the TC center from imagery with missing data is corrected by an NWP-model-based post-processing scheme. The validity of the framework is tested on infrared images of different cyclones obtained from various geostationary satellites such as Meteosat-7, INSAT-3D, Kalpana-1, etc. The computed track is compared with the actual track data obtained from the Joint Typhoon Warning Center (JTWC) and shows a reduction in mean track error of 11% compared with other state-of-the-art methods in the presence of missing and noisy frames. The proposed method is also successfully tested for simultaneous retrieval of TC centers from images containing multiple non-overlapping cyclones.
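    As a greatly simplified stand-in for the center-detection step, the sketch below locates the coldest smoothed region of a brightness-temperature field while masking missing pixels. The warm-fill treatment of gaps and the 3x3 smoothing window are assumptions; the paper's actual gradient schemes and NWP post-processing are not reproduced here.

```python
import numpy as np

def tc_center_estimate(bt):
    """bt: 2-D brightness-temperature array (K) with NaNs at missing pixels.
    Return (row, col) of the coldest pixel after 3x3 box smoothing."""
    filled = np.where(np.isnan(bt), np.nanmax(bt), bt)   # treat gaps as warm
    pad = np.pad(filled, 1, mode='edge')
    # 3x3 box smoothing to suppress single-pixel noise
    sm = sum(pad[i:i + filled.shape[0], j:j + filled.shape[1]]
             for i in range(3) for j in range(3)) / 9.0
    return np.unravel_index(np.argmin(sm), sm.shape)

field = np.full((10, 10), 280.0)       # warm background
field[3:6, 5:8] = 200.0                # cold cyclone cloud shield
field[0, 0] = np.nan                   # a missing pixel
center = tc_center_estimate(field)
```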

  1. Empirical likelihood method for non-ignorable missing data problems.

    PubMed

    Guan, Zhong; Qin, Jing

    2017-01-01

    The missing response problem is ubiquitous in survey sampling, medical, social science, and epidemiology studies. It is well known that non-ignorable missingness, in which the missingness of a response depends on its own value, is the most difficult missing data problem. In the statistical literature, unlike for the ignorable missing data problem, few papers on non-ignorable missing data are available beyond fully parametric model-based approaches. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters, but the underlying distributions are not specified. By employing Owen (1988)'s empirical likelihood method we obtain constrained maximum empirical likelihood estimators of the parameters in the missing probability and the mean response, which are shown to be asymptotically normal. Moreover, the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of data from a real AIDS trial shows that the missingness of CD4 counts at around two years is non-ignorable and that the sample mean based on observed data only is biased.
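    The closing claim, that the observed-data mean is biased under non-ignorable missingness while an estimator using the known missingness model is not, can be illustrated with a simulation. The sketch below uses plain inverse-probability weighting rather than the paper's constrained empirical likelihood estimator, and the logistic missingness model is an assumption.

```python
import numpy as np

# Non-ignorable mechanism: the chance a response is observed depends on
# its own value.  The logistic P(observed | y) below is the assumed,
# known missingness model.
rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, 200_000)            # complete responses (mean 2)
p_obs = 1.0 / (1.0 + np.exp(y - 2.0))        # larger y -> more likely missing
observed = rng.random(y.size) < p_obs

naive = y[observed].mean()                   # biased: ignores the mechanism
w = 1.0 / p_obs[observed]                    # inverse-probability weights
ipw = (w * y[observed]).sum() / w.sum()      # approximately unbiased for 2.0
```

    Because large responses are preferentially missing, the naive mean lands well below 2, while the weighted mean recovers it.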

  2. Extreme deconvolution: Inferring complete distribution functions from noisy, heterogeneous and incomplete observations

    NASA Astrophysics Data System (ADS)

    Bovy, Jo; Hogg, David W.; Roweis, Sam T.

    2011-06-01

    We generalize the well-known mixtures of Gaussians approach to density estimation and the accompanying Expectation-Maximization technique for finding the maximum likelihood parameters of the mixture to the case where each data point carries an individual d-dimensional uncertainty covariance and has unique missing data properties. This algorithm reconstructs the error-deconvolved or "underlying" distribution function common to all samples, even when the individual data points are samples from different distributions, obtained by convolving the underlying distribution with the heteroskedastic uncertainty distribution of the data point and projecting out the missing data directions. We show how this basic algorithm can be extended with conjugate priors on all of the model parameters and a "split-and-merge" procedure designed to avoid local maxima of the likelihood. We demonstrate the full method by applying it to the problem of inferring the three-dimensional velocity distribution of stars near the Sun from noisy two-dimensional, transverse velocity measurements from the Hipparcos satellite.
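    The one-dimensional, single-Gaussian case of this EM scheme, each point carrying its own known noise variance, can be sketched as follows; the mixture components, missing-data projections, and split-and-merge machinery of the full algorithm are omitted.

```python
import numpy as np

def xd_single_gaussian(y, s, n_iter=200):
    """EM for the underlying N(m, V) when y[i] = x[i] + noise[i] with known
    per-point noise variance s[i] (one-Gaussian, 1-D case of the method)."""
    m, v = y.mean(), y.var()
    for _ in range(n_iter):
        k = v / (v + s)                    # per-point shrinkage toward m
        b = m + k * (y - m)                # posterior means of the true x[i]
        bv = k * s                         # posterior variances
        m = b.mean()                       # M-step: mean of posterior means
        v = np.mean((b - m) ** 2 + bv)     # M-step: deconvolved variance
    return m, v

rng = np.random.default_rng(1)
x = rng.normal(1.0, 2.0, 100_000)          # underlying distribution: N(1, 4)
s = rng.uniform(0.5, 3.0, x.size)          # each point's known noise variance
y = x + rng.normal(0.0, np.sqrt(s))        # heteroskedastic observations
m_hat, v_hat = xd_single_gaussian(y, s)
```

    The recovered variance is close to the underlying 4, not the inflated variance of the noisy observations, which is the point of the deconvolution.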

  3. Dealing with missing data in remote sensing images within land and crop classification

    NASA Astrophysics Data System (ADS)

    Skakun, Sergii; Kussul, Nataliia; Basarab, Ruslan

    Optical remote sensing images from space provide valuable data for environmental monitoring, disaster management [1], agriculture mapping [2], and so forth. In many cases, a time series of satellite images is used to discriminate or estimate particular land parameters. One of the factors that influence the efficiency of satellite imagery is the presence of clouds, which leads to missing data that need to be addressed. Numerous approaches have been proposed to fill in missing data (or gaps); they can be categorized into inpainting-based, multispectral-based, and multitemporal-based methods. In [3], ancillary MODIS data are utilized for filling gaps and predicting Landsat data. In this paper we propose to use self-organizing Kohonen maps (SOMs) to restore missing data in time series of satellite imagery. Such an approach was previously used for MODIS data [4], but applying it to finer-spatial-resolution data such as Sentinel-2 and Landsat-8 represents a challenge. Moreover, in [4] the data for training the SOMs are selected manually, which complicates use of the method in an automatic mode. A SOM is a type of artificial neural network that is trained using unsupervised learning to produce a discretised representation of the input space of the training samples, called a map. The map seeks to preserve the topological properties of the input space. The reconstruction of satellite images is performed for each spectral band separately, i.e. a separate SOM is trained for each spectral band. Pixels that have no missing values in the time series are selected for training. Selecting the number of training pixels represents a trade-off: increasing the number of training samples increases SOM training time while improving the quality of restoration. Training data sets should also be selected automatically; we therefore propose to select training samples on a regular grid of pixels.
    Therefore, the SOM seeks to project a large number of non-missing data onto the subspace vectors in the map. Restoration of the missing values is performed in the following way. The multi-temporal pixel values (with gaps) are presented to the neural network. A neuron-winner (or best matching unit, BMU) in the SOM is selected based on a distance metric (for example, Euclidean). Missing values are omitted from the metric when selecting the BMU. When the BMU is selected, missing values are substituted by the corresponding components of the BMU. The efficiency of the proposed approach was tested on a time series of Landsat-8 images over the JECAM test site in Ukraine and Sich-2 images over Crimea (Sich-2 is a Ukrainian remote sensing satellite acquiring images at 8 m spatial resolution). Landsat-8 images were first converted to TOA reflectance and then atmospherically corrected, so each pixel value represents a surface reflectance in the range from 0 to 1. The error of reconstruction (error of quantization) on training data was: band-2: 0.015; band-3: 0.020; band-4: 0.026; band-5: 0.070; band-6: 0.060; band-7: 0.055. The reconstructed images were also used for crop classification using a multi-layer perceptron (MLP). Overall accuracy was 85.98% and Cohen's kappa was 0.83. References. 1. Skakun, S., Kussul, N., Shelestov, A. and Kussul, O. “Flood Hazard and Flood Risk Assessment Using a Time Series of Satellite Images: A Case Study in Namibia,” Risk Analysis, 2013, doi: 10.1111/risa.12156. 2. Gallego, F.J., Kussul, N., Skakun, S., Kravchenko, O., Shelestov, A., Kussul, O. “Efficiency assessment of using satellite data for crop area estimation in Ukraine,” International Journal of Applied Earth Observation and Geoinformation, vol. 29, pp. 22-30, 2014. 3.
Roy D.P., Ju, J., Lewis, P., Schaaf, C., Gao, F., Hansen, M., and Lindquist, E., “Multi-temporal MODIS-Landsat data fusion for relative radiometric normalization, gap filling, and prediction of Landsat data,” Remote Sensing of Environment, 112(6), pp. 3112-3130, 2008. 4. Latif, B.A., and Mercier, G., “Self-Organizing maps for processing of data with missing values and outliers: application to remote sensing images,” Self-Organizing Maps. InTech, pp. 189-210, 2010.
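    A minimal version of the BMU-based restoration step, with a toy 1-D SOM trained in plain NumPy, might look like the following. The map size, learning schedule, and two-cluster data are illustrative assumptions; the key detail from the abstract is that missing components are omitted from the distance metric when selecting the BMU.

```python
import numpy as np

def train_som(data, n_units=16, n_iter=2000, seed=0):
    """Train a toy 1-D SOM on complete multi-temporal pixel vectors."""
    rng = np.random.default_rng(seed)
    units = data[rng.integers(0, len(data), n_units)].astype(float)
    idx = np.arange(n_units)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = int(np.argmin(((units - x) ** 2).sum(axis=1)))
        lr = 0.5 * (1.0 - t / n_iter)                    # decaying rate
        sigma = max(0.5, (n_units / 2.0) * (1.0 - t / n_iter))
        h = np.exp(-((idx - bmu) ** 2) / (2.0 * sigma ** 2))
        units += lr * h[:, None] * (x - units)           # pull toward x
    return units

def restore(units, x):
    """Fill NaNs in pixel vector x from its best matching unit (BMU),
    omitting missing components from the distance metric."""
    miss = np.isnan(x)
    diff = units - np.where(miss, 0.0, x)
    diff[:, miss] = 0.0                                  # skip missing dims
    bmu = units[np.argmin((diff ** 2).sum(axis=1))]
    return np.where(miss, bmu, x)

# Toy data: two clusters of 4-date pixel series (reflectance-like values)
rng = np.random.default_rng(2)
data = np.vstack([rng.normal(0.1, 0.02, (100, 4)),
                  rng.normal(0.8, 0.02, (100, 4))])
units = train_som(data)
filled = restore(units, np.array([0.79, np.nan, 0.81, np.nan]))
```

    The two observed dates place the pixel in the bright cluster, so the gaps are filled with values near 0.8.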

  4. Satellite, climatological, and theoretical inputs for modeling of the diurnal cycle of fire emissions

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Reid, J. S.; Schmidt, C. C.; Giglio, L.; Prins, E.

    2009-12-01

    The diurnal cycle of fire activity is crucial for accurate simulation of atmospheric effects of fire emissions, especially at finer spatial and temporal scales. Estimating diurnal variability in emissions is also a critical problem for construction of emissions estimates from multiple sensors with variable coverage patterns. An optimal diurnal emissions estimate will use as much information as possible from satellite fire observations, compensate for known biases in those observations, and use detailed theoretical models of the diurnal cycle to fill in missing information. As part of ongoing improvements to the Fire Location and Monitoring of Burning Emissions (FLAMBE) fire monitoring system, we evaluated several different methods of integrating observations with different temporal sampling. We used geostationary fire detections from WF_ABBA, fire detection data from MODIS, empirical diurnal cycles from TRMM, and simple theoretical diurnal curves based on surface heating. Our experiments integrated these data in different combinations to estimate the diurnal cycles of emissions for each location and time. Hourly emissions estimates derived using these methods were tested using an aerosol transport model. We present results of this comparison, and discuss the implications of our results for the broader problem of multi-sensor data fusion in fire emissions modeling.
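    A simple theoretical diurnal curve of the kind mentioned above, used to spread a daily emissions total into hourly values, can be sketched as a normalized Gaussian peaking in early afternoon. The peak hour and width are assumptions for illustration, not FLAMBE's actual parameterization.

```python
import numpy as np

def diurnal_weights(peak_hour=13.0, width=2.5):
    """Hourly weights (summing to 1) from a surface-heating-style diurnal
    curve; the early-afternoon peak and the width are assumed values."""
    hours = np.arange(24)
    w = np.exp(-0.5 * ((hours - peak_hour) / width) ** 2)
    return w / w.sum()

# Spread a daily emissions total (arbitrary units) into hourly estimates
hourly = 480.0 * diurnal_weights()
```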

  5. Electronics Devices and Materials

    DTIC Science & Technology

    2008-03-17

    Molecular-beam epitaxy … MCNPX … software code … Misse6 … satellite expected to carry ORMatE-I … Misse7 … patterning using electron beam lithography, spaces (class 1000 clean benches), and skills (appropriate mix of skilled technicians and professionals) … Process samples for various projects such as Antimode Base High Electron Mobility Transistors (HEMT) and Double Heterojunction Bipolar Transistors

  6. The CACAO Method for Smoothing, Gap Filling, and Characterizing Seasonal Anomalies in Satellite Time Series

    NASA Technical Reports Server (NTRS)

    Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.

    2013-01-01

    Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology-fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters adjusted for each season allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared to the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Then, performance was analyzed over actual satellite LAI products derived from the AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods, the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT, revealed that CACAO achieved better performance in smoothing AVHRR time series characterized by a high level of noise and frequent missing observations. The resulting smoothed time series captures the vegetation dynamics well and shows no gaps, as compared to the 50-60% of data still missing after AG or SG reconstructions. Results of the simulation experiments, as well as comparison with actual AVHRR time series, indicate that the proposed CACAO method is more robust to noise and missing data than the AG and SG methods for phenology extraction.
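    The core CACAO idea, adjusting the climatology to the actual observations by fitting a time shift and a magnitude scale, can be sketched as a grid search over candidate shifts with a closed-form least-squares scale at each shift. This toy version omits CACAO's per-season windowing and quality weighting, and the synthetic climatology is an assumption.

```python
import numpy as np

def cacao_fit(t_obs, y_obs, t_grid, clim, shifts):
    """Fit y ≈ scale * clim(t - shift): grid search over time shifts with
    a closed-form least-squares scale at each candidate shift."""
    best = (np.inf, 0.0, 0.0)
    for d in shifts:
        c = np.interp(t_obs - d, t_grid, clim)       # shifted climatology
        scale = (c @ y_obs) / (c @ c)                # least-squares scale
        sse = ((y_obs - scale * c) ** 2).sum()
        if sse < best[0]:
            best = (sse, float(d), float(scale))
    return best[1], best[2]

# Synthetic seasonal (LAI-like) climatology and sparse observations that
# are shifted by 10 days and scaled by 1.5 relative to it
t_grid = np.arange(366.0)
clim = np.exp(-0.5 * ((t_grid - 180.0) / 40.0) ** 2)
t_obs = np.array([120.0, 150.0, 170.0, 190.0, 210.0, 240.0])
y_obs = 1.5 * np.exp(-0.5 * ((t_obs - 190.0) / 40.0) ** 2)
shift, scale = cacao_fit(t_obs, y_obs, t_grid, clim, np.arange(-30.0, 31.0))
```

    Once the shift and scale are known, evaluating `scale * clim(t - shift)` on the full grid yields a gap-free, smoothed series.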

  7. Australia's Domestic Communication Satellite and Education: Has Education Missed the Boat?

    ERIC Educational Resources Information Center

    White, Peter B.

    Educators have been criticized for being unable to develop any firm plans for the use of Australia's Domestic Communications Satellite (AUSSAT). However, conferences, talks, and papers have resulted in some significant achievements. First, it is now possible to raise issues of communications and telecommunications planning at the very highest…

  8. Characterizing Satellite Rainfall Errors based on Land Use and Land Cover and Tracing Error Source in Hydrologic Model Simulation

    NASA Astrophysics Data System (ADS)

    Gebregiorgis, A. S.; Peters-Lidard, C. D.; Tian, Y.; Hossain, F.

    2011-12-01

    Hydrologic modeling has benefited from operational production of high-resolution satellite rainfall products. The global coverage, near-real-time availability, and spatial and temporal sampling resolutions have advanced the application of physically based semi-distributed and distributed hydrologic models for a wide range of environmental decision-making processes. Despite these successes, uncertainties stemming from the indirect nature of satellite rainfall estimates, and from the hydrologic models themselves, remain a challenge to making meaningful predictions. This study breaks the total satellite rainfall error down into three independent components (hit bias, missed precipitation, and false alarms), characterizes them as a function of land use and land cover (LULC), and traces the sources of simulated soil moisture and runoff error in a physically based distributed hydrologic model. Here, we asked: in what way do the three independent components of the total bias affect the estimation of soil moisture and runoff in physically based hydrologic models? To address this question, we systematically characterized and decomposed the total satellite rainfall error as a function of land use and land cover in the Mississippi basin. This helps identify the major sources of soil moisture and runoff error in hydrologic model simulation and trace that information back to algorithm development and sensor type, which ultimately supports algorithm improvement and future application and data assimilation for GPM. For forest, woodland, and human land use systems, the soil moisture error was dictated mainly by the total bias for the 3B42-RT, CMORPH, and PERSIANN products. In contrast, runoff error was dominated largely by the hit bias rather than the total bias. 
    This difference arose because missed precipitation, a major contributor to the total bias in both the summer and winter seasons, most likely consists of light rain and rain over snow cover: it significantly affects soil moisture but is less capable of producing runoff, leaving runoff dependent on the hit bias only.
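    The three-way error decomposition used in this study can be sketched directly; the rain/no-rain threshold of exactly zero and the toy arrays are assumptions for illustration.

```python
import numpy as np

def decompose_error(sat, ref):
    """Split the total satellite rainfall bias into hit bias, missed
    precipitation, and false alarms relative to a reference (e.g. gauges)."""
    hit = (sat > 0) & (ref > 0)
    missed = (sat == 0) & (ref > 0)
    false = (sat > 0) & (ref == 0)
    hit_bias = (sat[hit] - ref[hit]).sum()
    missed_p = -ref[missed].sum()          # underestimate: rain not detected
    false_p = sat[false].sum()             # overestimate: spurious rain
    return hit_bias, missed_p, false_p, hit_bias + missed_p + false_p

sat = np.array([2.0, 0.0, 1.0, 0.0])       # satellite estimates
ref = np.array([1.5, 2.0, 0.0, 0.0])       # reference rainfall
hit_bias, missed_p, false_p, total = decompose_error(sat, ref)
```

    By construction the three components sum to the total bias, i.e. the difference between the satellite and reference totals.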

  9. Resolving the faint end of the satellite luminosity function for the nearest elliptical Centaurus A

    NASA Astrophysics Data System (ADS)

    Crnojevic, Denija

    2014-10-01

    We request HST/ACS imaging to follow up 15 new faint candidate dwarfs around the nearest elliptical Centaurus A (3.8 Mpc). The dwarfs were found via a systematic ground-based (Magellan/Megacam) survey out to ~150 kpc, designed to directly confront the "missing satellites" problem in a wholly new environment. Current Cold Dark Matter models for structure formation fail to reproduce the shallow slope of the satellite luminosity function in spiral-dominated groups for which dwarfs fainter than M_V<-14 have been surveyed (the Local Group and the nearby, interacting M81 group). Clusters of galaxies show a better agreement with cosmological predictions, suggesting an environmental dependence of the (poorly-understood) physical processes acting on the evolution of low mass galaxies (e.g., reionization). However, the luminosity function completeness for these rich environments quickly drops due to the faintness of the satellites and to the difficult cluster membership determination. We target a yet unexplored "intermediate" environment, a nearby group dominated by an elliptical galaxy, ideal due to its proximity: accurate (10%) distance determinations for its members can be derived from resolved stellar populations. The proposed observations of the candidate dwarfs will confirm their nature, group membership, and constrain their luminosities, metallicities, and star formation histories. We will obtain the first complete census of dwarf satellites of an elliptical down to an unprecedented M_V<-9. Our results will crucially constrain cosmological predictions for the faint end of the satellite luminosity function to achieve a more complete picture of the galaxy formation process.

  10. Understanding the Milky Way Halo through Large Surveys

    NASA Astrophysics Data System (ADS)

    Koposov, Sergey

    This thesis presents an extensive study of stellar substructure in the outskirts of the Milky Way (MW), combining data mining of SDSS with theoretical modeling. Such substructure, whether bound star clusters and satellite galaxies or tidally disrupted objects forming stellar streams, provides powerful diagnostics of the Milky Way's dynamics and formation history. I have developed an algorithmic technique for searching for stellar overdensities in the MW halo, based on SDSS catalogs. This led to the discovery of unusual ultra-faint (~1000 Lsun) globular clusters with very compact sizes and relaxation times << t_Hubble. The detailed analysis of a known stellar stream (GD-1) allowed me to make the first 6-D phase space map for such an object along 60 degrees on the sky. By modeling the stream's orbit I could place strong constraints on the Galactic potential, e.g. Vcirc(R0) = 224+/-13 km/s. The application of the algorithmic search for stellar overdensities to the SDSS dataset and to mock datasets allowed me to quantify SDSS's severe radial incompleteness in its search for ultra-faint dwarf galaxies and to determine the luminosity function of MW satellites down to luminosities of M_V ~ -3. I used a semi-analytical model to compare the CDM predictions for the MW satellite population with the observations; this comparison has shown that the recently increased census of MW satellites, a better understanding of the radial incompleteness, and the suppression of star formation after reionization can fully solve the "missing satellite problem".

  11. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
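    One common way to relate collision probability to miss distance is to integrate an assumed isotropic 2-D Gaussian distribution of the encounter-plane miss vector over the combined hard-body radius. This is a generic sketch, not the report's parametric method; the Gaussian model and the numerical scheme are assumptions.

```python
import math

def collision_probability(miss, sigma, radius, n=200):
    """P(relative position falls within the combined hard-body radius),
    modeling the encounter-plane miss vector as an isotropic 2-D Gaussian
    (standard deviation sigma) centered on the predicted miss distance.
    Midpoint-rule integration over the disk in polar coordinates."""
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * radius / n
        for j in range(n):
            th = (j + 0.5) * 2.0 * math.pi / n
            dx = r * math.cos(th) - miss
            dy = r * math.sin(th)
            total += r * math.exp(-(dx * dx + dy * dy) / (2.0 * sigma ** 2))
    area_element = (radius / n) * (2.0 * math.pi / n)
    return total * area_element / (2.0 * math.pi * sigma ** 2)

# The head-on case has the closed form 1 - exp(-R^2 / (2 sigma^2)),
# which provides a check on the quadrature
p_head_on = collision_probability(0.0, 100.0, 10.0)
```

    As expected, the probability decreases monotonically as the predicted miss distance grows relative to the position uncertainty.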

  12. Earth Radiation Imbalance from a Constellation of 66 Iridium Satellites: Climate Science Aspects

    NASA Technical Reports Server (NTRS)

    Wiscombe, W.; Chiu, CJ. Y.

    2012-01-01

    The "global warming hiatus" since the 1998 El Nino, highlighted by Meehl et al., and the resulting "missing energy" problem highlighted by Trenberth et al., have opened the door to a more fundamental view of climate change than mere surface air temperature. That new view is based on two variables which are strongly correlated: the rate of change of ocean heat content d(OHC)/dt; and Earth Radiation Imbalance (ERI) at the top of the atmosphere, whose guesstimated range is 0.4 to 0.9 watts per square meter (this imbalance being mainly due to increasing CO2). The Argo float array is making better and better measurements of OHC. But existing satellite systems cannot measure ERI to even one significant digit. So, climate model predictions of ERI are used in place of real measurements of it, and the satellite data are tuned to the climate model predictions. Some oceanographers say "just depend on Argo for understanding the global warming hiatus and the missing energy", but we don't think this is a good idea because d(OHC)/dt and ERI have different time scales and are never perfectly correlated. We think the ERB community needs to step up to measuring ERI correctly, just as oceanographers have deployed Argo to measure OHC correctly. This talk will overview a proposed constellation of 66 Earth radiation budget instruments, hosted on Iridium satellites, that will actually be able to measure ERI to at least one significant digit, thus enabling a crucial test of climate models. This constellation will also be able to provide ERI at two-hourly time scales and 500-km spatial scales without extrapolations from uncalibrated narrowband geostationary instruments, using the highly successful methods of GRACE to obtain spatial resolution. This high time resolution would make ERI a synoptic variable like temperature, and allow studies of ERI's response to fast-evolving phenomena like dust storms and hurricanes and even brief excursions of Total Solar Irradiance. 
Time permitting, we will also discuss the emerging view of clear vs. cloudy and its implications for the traditional ERB approach.

  13. Missing, Abducted, Runaway, and Thrownaway Children in America. First Report: Numbers and Characteristics, National Incidence Studies. Executive Summary.

    ERIC Educational Resources Information Center

    Finkelhor, David; And Others

    What has in the past been called the missing children problem is in reality a set of at least five distinct problems, each of which needs to be researched, analyzed, and treated separately. The problems are family abductions, nonfamily abductions, runaways, thrownaways, and lost, injured, or otherwise missing children. Many of the children in at…

  14. Implications of Systematic Nominator Missingness for Peer Nomination Data

    ERIC Educational Resources Information Center

    Babcock, Ben; Marks, Peter E. L.; van den Berg, Yvonne H. M.; Cillessen, Antonius H. N.

    2018-01-01

    Missing data are a persistent problem in psychological research. Peer nomination data present a unique missing data problem, because a nominator's nonparticipation results in missing data for other individuals in the study. This study examined the range of effects of systematic nonparticipation on the correlations between peer nomination data when…

  15. Searches for new Milky Way satellites from the first two years of data of the Subaru/Hyper Suprime-Cam survey: Discovery of Cetus III

    NASA Astrophysics Data System (ADS)

    Homma, Daisuke; Chiba, Masashi; Okamoto, Sakurako; Komiyama, Yutaka; Tanaka, Masayuki; Tanaka, Mikito; Ishigaki, Miho N.; Hayashi, Kohei; Arimoto, Nobuo; Garmilla, José A.; Lupton, Robert H.; Strauss, Michael A.; Miyazaki, Satoshi; Wang, Shiang-Yu; Murayama, Hitoshi

    2018-01-01

    We present the results from a search for new Milky Way (MW) satellites in the first two years of data from the Hyper Suprime-Cam (HSC) Subaru Strategic Program (SSP), covering ˜300 deg2, and report the discovery of a highly compelling ultra-faint dwarf galaxy candidate in Cetus. This is the second ultra-faint dwarf we have discovered, after Virgo I, which was reported in our previous paper. This satellite, Cetus III, has been identified as a statistically significant (10.7 σ) spatial overdensity of star-like objects selected with an isochrone filter designed for a metal-poor and old stellar population. This stellar system is located at a heliocentric distance of 251^{+24}_{-11} kpc with a most likely absolute magnitude of MV = -2.4 ± 0.6 mag estimated from a Monte Carlo analysis. Cetus III is extended, with a half-light radius of r_h = 90^{+42}_{-17} pc, suggesting that this is a faint dwarf satellite in the MW located beyond the detection limit of the Sloan Digital Sky Survey. Further spectroscopic studies are needed to assess the nature of this stellar system. We also revisit and update the parameters for Virgo I, finding M_V = -0.33^{+0.75}_{-0.87} mag and r_h = 47^{+19}_{-13} pc. Using simulations of Λ-dominated cold dark matter models, we predict that we should find one or two new MW satellites in ˜300 deg2 of HSC-SSP data, in rough agreement with the discovery rate so far. The further survey and completion of HSC-SSP over ˜1400 deg2 will provide robust insights into the missing satellites problem.

  16. Finding fixed satellite service orbital allotments with a k-permutation algorithm

    NASA Technical Reports Server (NTRS)

    Reilly, Charles H.; Mount-Campbell, Clark A.; Gonsalvez, David J. A.

    1990-01-01

    A satellite system synthesis problem, the satellite location problem (SLP), is addressed. In SLP, orbital locations (longitudes) are allotted to geostationary satellites in the fixed satellite service. A linear mixed-integer programming model is presented that views SLP as a combination of two problems: the problem of ordering the satellites and the problem of locating the satellites given some ordering. A special-purpose heuristic procedure, a k-permutation algorithm, has been developed to find solutions to SLPs. Solutions to small sample problems are presented and analyzed on the basis of calculated interferences.
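    For a tiny instance, the ordering subproblem can be solved exactly by brute force rather than by the k-permutation heuristic. The interference model below (pairwise weight divided by angular separation) and the slot values are illustrative assumptions, not the paper's formulation.

```python
from itertools import permutations

def best_ordering(weights, slots):
    """Brute-force the ordering subproblem for a tiny instance: assign each
    satellite to one orbital slot (longitude) so that the total pairwise
    interference, weights[i][j] / |slot_i - slot_j|, is minimized."""
    n = len(weights)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):        # perm[i] = slot of satellite i
        cost = sum(weights[i][j] / abs(slots[perm[i]] - slots[perm[j]])
                   for i in range(n) for j in range(i + 1, n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_perm, best_cost

# Satellites 0 and 1 interfere strongly, so the best allotment separates them
w = [[0, 10, 1],
     [10, 0, 1],
     [1, 1, 0]]
perm, cost = best_ordering(w, [0.0, 10.0, 20.0])
```

    Enumeration scales as n!, which is exactly why the paper resorts to a k-permutation heuristic for realistic problem sizes.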

  17. Longitudinal data analysis with non-ignorable missing data.

    PubMed

    Tseng, Chi-hong; Elashoff, Robert; Li, Ning; Li, Gang

    2016-02-01

    A common problem in longitudinal data analysis is missing data. Two types of missing patterns are generally considered in the statistical literature: monotone and non-monotone missing data. Non-monotone missing data occur when study participants intermittently miss scheduled visits, while monotone missing data can arise from discontinued participation, loss to follow-up, and mortality. Although many novel statistical approaches have been developed to handle missing data in recent years, few methods are available that provide inferences handling both types of missing data simultaneously. In this article, a latent random effects model is proposed to analyze longitudinal outcomes with both monotone and non-monotone missingness in the context of missing not at random. Another significant contribution of this article is a new computational algorithm for latent random effects models. To reduce the computational burden of the high-dimensional integration problem in latent random effects models, we develop a new computational algorithm that uses a new adaptive quadrature approach in conjunction with a Taylor series approximation of the likelihood function to simplify the E-step computation in the expectation-maximization algorithm. A simulation study is performed, and data from the scleroderma lung study are used to demonstrate the effectiveness of this method. © The Author(s) 2012.
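    The quadrature step can be illustrated on the simplest case: integrating a normal likelihood over a single normal random effect with Gauss-Hermite nodes, where the marginal has a closed form for checking. This is standard (non-adaptive) Gauss-Hermite quadrature, a simpler stand-in for the paper's adaptive scheme.

```python
import math
import numpy as np

def marginal_likelihood(y, sigma, tau, n_nodes=20):
    """Gauss-Hermite approximation of  ∫ N(y; b, sigma^2) N(b; 0, tau^2) db,
    the marginal likelihood of one outcome under a normal random effect b."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    b = math.sqrt(2.0) * tau * x                   # change of variables
    fy = np.exp(-0.5 * ((y - b) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
    return float((w * fy).sum() / math.sqrt(math.pi))

# The exact marginal is N(y; 0, sigma^2 + tau^2), so the result can be checked
approx = marginal_likelihood(0.7, 1.0, 0.5)
```

    In a real latent random effects model the integrand is a product over a subject's repeated measures, and the E-step applies the same quadrature to each subject.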

  18. A k-permutation algorithm for Fixed Satellite Service orbital allotments

    NASA Technical Reports Server (NTRS)

    Reilly, Charles H.; Mount-Campbell, Clark A.; Gonsalvez, David J. A.

    1988-01-01

    A satellite system synthesis problem, the satellite location problem (SLP), is addressed in this paper. In SLP, orbital locations (longitudes) are allotted to geostationary satellites in the Fixed Satellite Service. A linear mixed-integer programming model is presented that views SLP as a combination of two problems: (1) the problem of ordering the satellites and (2) the problem of locating the satellites given some ordering. A special-purpose heuristic procedure, a k-permutation algorithm, that has been developed to find solutions to SLPs formulated in the manner suggested is described. Solutions to small example problems are presented and analyzed.

  19. THE PRIMEVAL POPULATIONS OF THE ULTRA-FAINT DWARF GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Thomas M.; Tumlinson, Jason; Kalirai, Jason S.

    We present new constraints on the star formation histories of the ultra-faint dwarf (UFD) galaxies, using deep photometry obtained with the Hubble Space Telescope (HST). A galaxy class recently discovered in the Sloan Digital Sky Survey, the UFDs appear to be an extension of the classical dwarf spheroidals to low luminosities, offering a new front in efforts to understand the missing satellite problem. They are the least luminous, most dark-matter-dominated, and least chemically evolved galaxies known. Our HST survey of six UFDs seeks to determine if these galaxies are true fossils from the early universe. We present here the preliminary analysis of three UFD galaxies: Hercules, Leo IV, and Ursa Major I. Classical dwarf spheroidals of the Local Group exhibit extended star formation histories, but these three Milky Way satellites are at least as old as the ancient globular cluster M92, with no evidence for intermediate-age populations. Their ages also appear to be synchronized to within {approx}1 Gyr of each other, as might be expected if their star formation was truncated by a global event, such as reionization.

  20. Failures no More: The Radical Consequences of Realistic Stellar Feedback for Dwarf Galaxies, the Milky Way, and Reionization

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2016-06-01

    Many of the most fundamental unsolved questions in star and galaxy formation revolve around star formation and "feedback" from massive stars, inextricably linking galaxy formation and stellar evolution. I'll present simulations with unprecedented resolution of Milky-Way (MW) mass galaxies, followed cosmologically to redshift zero. For the first time, these simulations resolve the internal structure of small dwarf satellites around a MW-like host, with detailed models for stellar evolution including radiation pressure, supernovae, stellar winds, and photo-heating. I'll show that, without fine-tuning, these feedback processes naturally resolve the "missing satellites," "too big to fail," and "cusp-core" problems, and produce realistic galaxy populations. At high redshifts, however, the realistic ISM structure predicted, coupled to standard stellar population models, naively leads to the prediction that only ~1-2% of ionizing photons can ever escape galaxies, insufficient to ionize the Universe. But these models assume all stars are single: if we account for binary evolution, the escape fraction increases dramatically to ~20% for the small, low-metallicity galaxies believed to ionize the Universe.

  1. Simulation of solar array slewing of Indian remote sensing satellite

    NASA Astrophysics Data System (ADS)

    Maharana, P. K.; Goel, P. S.

    The effect of flexible arrays on sun tracking for the IRS satellite is studied. Equations of motion of satellites carrying a rotating flexible appendage are developed following the Newton-Euler approach and utilizing the constrained modes of the appendage. The drive torque, detent torque and friction torque in the SADA are included in the model. Extensive simulations of the slewing motion are carried out. The phenomena of back-stepping, step-missing, step-slipping and the influences of array flexibility in the acquisition mode are observed for certain combinations of parameters.

  2. Estimation of Item Response Theory Parameters in the Presence of Missing Data

    ERIC Educational Resources Information Center

    Finch, Holmes

    2008-01-01

    Missing data are a common problem in a variety of measurement settings, including responses to items on both cognitive and affective assessments. Researchers have shown that such missing data may create problems in the estimation of item difficulty parameters in the Item Response Theory (IRT) context, particularly if they are ignored. At the same…

  3. Treatment of Missing Data in Workforce Education Research

    ERIC Educational Resources Information Center

    Gemici, Sinan; Rojewski, Jay W.; Lee, In Heok

    2012-01-01

    Most quantitative analyses in workforce education are affected by missing data. Traditional approaches to remedy missing data problems often result in reduced statistical power and biased parameter estimates due to systematic differences between missing and observed values. This article examines the treatment of missing data in pertinent…

  4. Predicting the Magnetic Field of Earth-Impacting CMEs

    NASA Technical Reports Server (NTRS)

    Kay, C.; Gopalswamy, N.; Reinard, A.; Opher, M.

    2017-01-01

    Predicting the impact of coronal mass ejections (CMEs) and the southward component of their magnetic field is one of the key goals of space weather forecasting. We present a new model, the ForeCAT In situ Data Observer (FIDO), for predicting the in situ magnetic field of CMEs. We first simulate a CME using ForeCAT, a model for CME deflection and rotation resulting from the background solar magnetic forces. Using the CME position and orientation from ForeCAT, we then determine the passage of the CME over a simulated spacecraft. We model the CME's magnetic field using a force-free flux rope and we determine the in situ magnetic profile at the synthetic spacecraft. We show that FIDO can reproduce the general behavior of four observed CMEs. FIDO results are very sensitive to the CME's position and orientation, and we show that the uncertainty in a CME's position and orientation from coronagraph images corresponds to a wide range of in situ magnitudes and even polarities. This small range of positions and orientations also includes CMEs that entirely miss the satellite. We show that two derived parameters (the normalized angular distance between the CME nose and satellite position and the angular difference between the CME tilt and the position angle of the satellite with respect to the CME nose) can be used to reliably determine whether an impact or miss occurs. We find that the same criteria separate the impacts and misses for cases representing all four observed CMEs.

  5. Testing the Drake Equation in the Solar System

    NASA Astrophysics Data System (ADS)

    Chela-Flores, Julian

    Whereas Titan is an appropriate target for studying chemical evolution, the planet Mars and the Galilean satellites are favourable sites for the search for extraterrestrial life. The main encouragement for the search for life in the solar system is the possible evidence of liquid water in the early history of Mars and, at present, in the Galilean satellites. Hydrothermal vents on the Earth's sea floor have been found to sustain life forms. Possible analogous geologic activity on Europa, caused by tidal heating and decay of radioactive elements, makes this satellite the best target for identifying a separate evolutionary line. We explore Europa's likely degree of biological evolution by discussing experimental tests that have been suggested. The theoretical basis for the distribution of life in the universe is still missing, in spite of considerable technological progress in radioastronomy. We intend to demonstrate that the search for life on the Galilean satellites can provide a first step towards the still-missing theoretical insight: if f_i is the parameter in the Drake Equation denoting the fraction of life-bearing planets or satellites where biological evolution produces an intelligent species, then we suggest the equation f_i = k_1 f_e f_m, where k_1 is a constant of proportionality, and f_e and f_m denote the fractions of planets or satellites where eukaryogenesis or multicellularity, respectively, may occur. Our conjecture motivates the search in our solar system, particularly on Europa, for a hint that the key factor f_e is a non-vanishing parameter in at least one extraterrestrial environment.

  6. Validation of satellite daily rainfall estimates in complex terrain of Bali Island, Indonesia

    NASA Astrophysics Data System (ADS)

    Rahmawati, Novi; Lubczynski, Maciek W.

    2017-11-01

    Satellite rainfall products perform differently in different geographic regions under different physical and climatological conditions. In this study, the objective was to select the most reliable and accurate satellite rainfall products for the specific environmental conditions of Bali Island. The performances of four spatio-temporal satellite rainfall products, i.e., CMORPH25, CMORPH8, TRMM, and PERSIANN, were evaluated at the island, zonation (applying elevation and climatology as constraints), and pixel scales, using (i) descriptive statistics and (ii) categorical statistics, including bias decomposition. The results showed that all the satellite products had low accuracy because of spatial scale effects, the daily resolution and the complexity of the island. That accuracy was relatively lower in (i) dry seasons and dry climatic zones than in wet seasons and wet climatic zones; (ii) pixels jointly covered by sea and mountainous land than in pixels covered by land or by sea only; and (iii) topographically diverse than uniform terrains. CMORPH25, CMORPH8, and TRMM underestimated and PERSIANN overestimated rainfall when compared to gauged rain. CMORPH25 had relatively the best and PERSIANN the worst performance on Bali Island. CMORPH25 had the lowest statistical errors, the lowest miss and the highest hit rainfall events; it also had the lowest miss-rainfall bias and was relatively the most accurate in detecting the ≤ 20 mm day-1 rain events that are frequent in Bali. Lastly, the coarse CMORPH25 grid represented rainfall events from coastal to inland areas better than the other satellite products, including the finer-grid CMORPH8.
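    The categorical statistics used in such validations (hits, misses, false alarms, and the scores built from them) can be sketched as follows. This is a minimal illustration, not the study's code; the function name, the 1 mm/day event threshold, and the sample daily totals are assumptions:

    ```python
    import numpy as np

    def categorical_stats(sat, gauge, threshold=1.0):
        """Contingency-table scores for daily satellite vs. gauge rainfall.

        A day counts as a rain event when rainfall >= threshold (mm/day).
        Returns probability of detection (POD), false alarm ratio (FAR),
        and frequency bias -- standard categorical statistics.
        """
        sat = np.asarray(sat, dtype=float)
        gauge = np.asarray(gauge, dtype=float)
        sat_event = sat >= threshold
        gauge_event = gauge >= threshold
        hits = np.sum(sat_event & gauge_event)           # both detect rain
        misses = np.sum(~sat_event & gauge_event)        # gauge rain, satellite dry
        false_alarms = np.sum(sat_event & ~gauge_event)  # satellite rain, gauge dry
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        bias = (hits + false_alarms) / (hits + misses)
        return pod, far, bias

    # Example with made-up daily totals (mm/day):
    sat   = [0.0, 5.2, 12.0, 0.0, 3.1, 0.0]
    gauge = [0.0, 4.8,  0.0, 2.5, 3.0, 0.0]
    pod, far, bias = categorical_stats(sat, gauge)
    ```

    Bias decomposition, as used in the study, further splits the total satellite-minus-gauge bias into hit, miss, and false-alarm contributions; the event counts above are its starting point.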

  7. Feedback Loop of Data Infilling Using Model Result of Actual Evapotranspiration from Satellites and Hydrological Model

    NASA Astrophysics Data System (ADS)

    Murdi Hartanto, Isnaeni; Alexandridis, Thomas K.; van Andel, Schalk Jan; Solomatine, Dimitri

    2014-05-01

    Using satellite data in hydrological models has a long history, as satellites are a source of low-cost, regular data. The methods range from using satellite products as direct input, through model validation, to data assimilation. However, satellite data frequently suffer from missing values, whether due to cloud cover or limited temporal coverage. This problem can seriously affect their usefulness in a hydrological model, especially when the model uses them as direct input, so data infilling becomes an important part of the whole modelling exercise. In this research, an actual evapotranspiration product from satellite is used directly as input into a spatially distributed hydrological model and validated by comparing the catchment's end discharge with measured data. The instantaneous actual evapotranspiration is estimated from MODIS satellite images using a variation of the energy balance model for land (SEBAL). The eight-day cumulative actual evapotranspiration is then obtained by a temporal integration that uses the reference evapotranspiration calculated from meteorological data [1]. However, this method cannot fill in a cell that constantly holds no-data values during the eight-day period. The hydrological model requires a full set of data without no-data cells, so the no-data cells in the satellite's evapotranspiration map need to be filled in. To fill them, an output of the hydrological model is used: the model is first run with reference evapotranspiration as input to calculate discharge and actual evapotranspiration, the no-data cells in the eight-day cumulative satellite map are then filled in with the output of this first run, and the completed data set is finally used as input to the hydrological model to calculate discharge, thus creating a feedback loop.
The method is applied in a case study of Rijnland, the Netherlands, where persistent winter cloud cover leads to many no-data cells in the satellite products. The Rijnland area is a low-lying area with tightly controlled water systems. The satellite data are used as input to a SIMGRO model, a spatially distributed hydrological model that can handle the controlled water system and is suitable for the low-lying areas of the Netherlands. The application in the Rijnland area gives overall good results for total discharge. The method also improves the hydrological model in terms of its spatial hydrological states, whereas the original model was calibrated only to discharge at one location. [1] Alexandridis, T.K., Cherif, I., Chemin, Y., Silleos, G.N., Stavrinos, E. & Zalidis, G.C. (2009). Integrated Methodology for Estimating Water Use in Mediterranean Agricultural Areas. Remote Sensing, 1.
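The infilling step itself, keeping observed satellite cells and taking the first model run's actual evapotranspiration wherever the satellite map has no data, can be sketched as follows. The grid values and function name are invented for illustration and are not from the study:

```python
import numpy as np

def fill_nodata(sat_et, model_et):
    """Fill no-data cells of a satellite evapotranspiration grid with
    values from a hydrological-model run: cells the satellite observed
    are kept, and NaN gaps take the model value."""
    sat_et = np.asarray(sat_et, dtype=float)
    model_et = np.asarray(model_et, dtype=float)
    gap = np.isnan(sat_et)
    filled = np.where(gap, model_et, sat_et)
    return filled, gap

# Toy 3x3 eight-day cumulative ET maps (mm); NaN marks cloud gaps.
sat = np.array([[12.0, np.nan, 10.5],
                [np.nan, 11.2, np.nan],
                [ 9.8, 10.1, np.nan]])
model = np.full((3, 3), 10.0)  # first model run, driven by reference ET
filled, gap = fill_nodata(sat, model)
```

In the feedback loop described above, `filled` would then replace the gappy satellite map as input to the next model run.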

  8. Space Flight Experiments to Measure Polymer Erosion and Contamination on Spacecraft

    NASA Technical Reports Server (NTRS)

    Lillis, Maura C.; Youngstrom, Erica E.; Marx, Laura M.; Hammerstrom, Anne M.; Finefrock, Katherine D.; Youngstrom, Christiane A.; Kaminski, Carolyn; Fine, Elizabeth S.; Hunt, Patricia K.; deGroh, Kim K.

    2002-01-01

    Atomic oxygen erosion and silicone contamination are serious issues that could damage or destroy spacecraft components after orbiting for an extended period of time, such as on a space station or satellite. An experiment, the Polymer Erosion And Contamination Experiment (PEACE) will be conducted to study the effects of atomic oxygen (AO) erosion and silicone contamination, and it will provide information and contribute to a solution for these problems. PEACE will fly 43 different polymer materials that will be analyzed for AO erosion effects through two techniques: mass loss measurement and recession depth measurement. Pinhole cameras will provide information about the arrival direction of AO, and silicone contamination pinhole cameras will identify the source of silicone contamination on a spacecraft. All experimental hardware will be passively exposed to AO for up to two weeks in the actual space environment when it flies in the bay of a space shuttle. A second set of the PEACE Polymers is being exposed to the space environment for erosion yield determination as part of a second experiment, Materials International Space Station Experiment (MISSE). MISSE is a collaboration between several federal agencies and aerospace companies. During a space walk on August 16, 2001, MISSE was attached to the outside of the International Space Station (ISS) during an extravehicular activity (EVA), where it began its exposure to AO for approximately 1.5 years. The PEACE polymers, therefore, will be analyzed after both short-term and long-term AO exposures for a more complete study of AO effects.

  9. An Upper Bound on Orbital Debris Collision Probability When Only One Object has Position Uncertainty Information

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on the high-speed satellite collision probability, P_c, have been investigated. Previous methods assume an individual position error covariance matrix is available for each object; the two matrices are combined into a single, relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum P_c. If error covariance information for only one of the two objects was available, either some default shape has been used or nothing could be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but useful upper bound on P_c. There are various avenues along which an upper bound on the high-speed satellite collision probability has been pursued. Typically, for the collision-plane representation of the high-speed collision probability problem, the predicted miss position in the collision plane is assumed fixed. Then the shape (aspect ratio of the ellipse), the size (scaling of the standard deviations) or the orientation (rotation of the ellipse principal axes) of the combined position error ellipse is varied to obtain a maximum P_c. Regardless of the exact details of the approach, previously presented methods all assume that an individual position error covariance matrix is available for each object and that the two are combined into a single, relative position error covariance matrix. This combined position error covariance matrix is then modified according to the chosen scheme to arrive at a maximum P_c. But what if error covariance information for one of the two objects is not available? In that case the analyst has commonly defaulted to the situation in which only the relative miss position and velocity are known, without any corresponding state error covariance information. 
The usual methods of finding a maximum P_c then do no good, because the analyst defaults to no knowledge of the combined, relative position error covariance matrix. It is reasonable to think that, given an assumption of no covariance information, an analyst might still attempt to determine the error covariance matrix that results in an upper bound on P_c. Without some guidance on limits to the shape, size and orientation of the unknown covariance matrix, the limiting case is a degenerate ellipse lying along the relative miss vector in the collision plane. Unless the miss position is exceptionally large or the at-risk object is exceptionally small, this method results in a maximum P_c too large to be of practical use. For example, assuming that the miss distance is equal to the current ISS alert-volume along-track (+ or -) distance of 25 kilometers and that the at-risk area has a 70 meter radius, the maximum (degenerate ellipse) P_c is about 0.00136. At 40 kilometers, the maximum P_c would be 0.00085, which is still almost an order of magnitude larger than the ISS maneuver threshold of 0.0001. In fact, a miss distance of almost 340 kilometers is necessary to reduce the maximum P_c associated with this degenerate ellipse to the ISS maneuver threshold value. Such a result is frequently of no practical value to the analyst. Some improvement may be made by realizing that, while the position error covariance matrix of one of the objects (usually the debris object) may not be known, the position error covariance matrix of the other object (usually the asset) is almost always available. Making use of the position error covariance information for the one object provides an improvement in finding a maximum P_c which, in some cases, may offer real utility. The equations to be used are presented and their use discussed.
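The degenerate-ellipse numbers quoted above can be reproduced with a short calculation. For a target much smaller than the miss distance d, the worst-case 1-D Gaussian along the miss vector puts the most density on the target when its standard deviation equals d, giving P_c,max ≈ 2r / (d·sqrt(2πe)). This is my reconstruction of the arithmetic, not the author's code:

```python
import math

def max_degenerate_pc(miss_distance, hard_body_radius):
    """Upper-bound collision probability for the degenerate-ellipse case.

    With no covariance knowledge, the limiting case is a 1-D Gaussian
    along the miss vector; maximising the probability mass inside the
    at-risk region over sigma (optimum at sigma = miss distance) gives,
    for a target much smaller than the miss distance,
        Pc_max ~= 2 * r / (d * sqrt(2 * pi * e)).
    """
    d, r = miss_distance, hard_body_radius
    return 2.0 * r / (d * math.sqrt(2.0 * math.pi * math.e))

pc_25km = max_degenerate_pc(25_000.0, 70.0)  # ~0.00136, as quoted
pc_40km = max_degenerate_pc(40_000.0, 70.0)  # ~0.00085, as quoted

# Miss distance at which the bound drops to the ISS threshold of 1e-4:
d_threshold = 2.0 * 70.0 / (1e-4 * math.sqrt(2.0 * math.pi * math.e))
```

The last line gives roughly 339 km, matching the "almost 340 kilometers" figure in the abstract.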

  10. Transmission media appropriate laser-microwave solar power satellite system

    NASA Astrophysics Data System (ADS)

    Schäfer, C. A.; Gray, D.

    2012-10-01

    As a solution to the most critical problems with Solar Power Satellite (SPS) development, a system is proposed which uses laser power transmission in space to a receiver high in the atmosphere that relays the power to Earth by either cable or microwave power transmission. It has been shown in the past that such hybrid systems have the advantages of a reduction in the mass of equipment required in geostationary orbit and avoidance of radio frequency interference with other satellites and terrestrial communications systems. The advantage over a purely laser-beam SPS is that atmospheric absorption is avoided and outages due to clouds and precipitation will not occur, allowing for deployment in the equatorial zone and guaranteeing year-round operation. This proposal is supported by brief literature surveys and theoretical calculations to estimate crucial parameters. In relation to this concept, we build on a recently proposed method to collect solar energy by a tethered balloon at high altitude, because it enables a low-cost start for bringing the first watt of power to Earth, giving some quick return on investment, which is desperately missing in the traditional SPS concept. To tackle the significant problem for GW-class SPSs of high launch cost per kilogram of mass brought to space, this paper introduces a concept which aims to achieve a superior power-over-mass ratio compared to traditional satellite designs by the use of thin-film solar cells combined with optical fibres for power delivery. To minimise the aperture sizes and cost of the transmitting and receiving components of the satellite and high-altitude receiver, closed-loop laser beam pointing and target tracking is crucial for pointing a laser beam onto a target area that is of similar size to the beam's diameter. 
A recently developed technique based on optical phase conjugation is introduced and its applicability for maintaining power transmission between the satellite and high altitude receiver is assessed. It was found that the design of the high altitude receiver and the means of transporting the received power through the lower 21 km of the atmosphere are inextricably linked. It was concluded that an initial small scale low-cost demonstration flight of the receiver that delivers power using existing technology could be undertaken in the near future.

  11. Optimisation of sea surface current retrieval using a maximum cross correlation technique on modelled sea surface temperature

    NASA Astrophysics Data System (ADS)

    Heuzé, Céline; Eriksson, Leif; Carvajal, Gisela

    2017-04-01

    Using sea surface temperature from satellite images to retrieve sea surface currents is not a new idea, but so far its operational near-real-time implementation has not been possible. Validation studies are too region-specific or too uncertain, due to the errors induced by the images themselves. Moreover, the sensitivity of the most common retrieval method, the maximum cross correlation, to the three parameters that have to be set is unknown. Using model outputs instead of satellite images, biases induced by this method are assessed here for four different seas of Western Europe, and the best of nine settings and eight temporal resolutions are determined. For all regions, tracking a small 5 km pattern from the first image over a large 30 km region around its original location on a second image, separated from the first image by 6 to 9 hours, returned the most accurate results. Moreover, for all regions, the problem is not inaccurate results but missing results, where the velocity is too low to be picked up by the retrieval. The results are consistent both with limitations caused by ocean surface current dynamics and with the available satellite technology, indicating that automated sea surface current retrieval from sea surface temperature images is feasible now, for search and rescue operations, pollution confinement or even for more energy-efficient and comfortable ship navigation.
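    The maximum cross correlation retrieval itself is simple to sketch: a small pattern from the first image is sought within a larger search region of the second image, and the lag of peak normalized correlation gives the displacement (velocity = displacement / time separation). The toy example below uses an invented field and window sizes that only loosely echo the 5 km / 30 km settings above, and recovers a known shift:

    ```python
    import numpy as np

    def mcc_displacement(img1, img2, center, half_tpl=2, half_search=6):
        """Maximum cross correlation (MCC) sketch: find the lag (di, dj)
        at which a small template from img1, centred at `center`, best
        matches img2 within a (2*half_search+1)^2 search region."""
        ci, cj = center
        tpl = img1[ci - half_tpl:ci + half_tpl + 1,
                   cj - half_tpl:cj + half_tpl + 1]
        tpl = tpl - tpl.mean()
        best, best_lag = -np.inf, (0, 0)
        for di in range(-half_search, half_search + 1):
            for dj in range(-half_search, half_search + 1):
                win = img2[ci + di - half_tpl:ci + di + half_tpl + 1,
                           cj + dj - half_tpl:cj + dj + half_tpl + 1]
                win = win - win.mean()
                denom = np.sqrt((tpl ** 2).sum() * (win ** 2).sum())
                if denom == 0:
                    continue
                corr = (tpl * win).sum() / denom  # normalized correlation
                if corr > best:
                    best, best_lag = corr, (di, dj)
        return best_lag, best

    # Synthetic test: shift a random SST field by (1, 2) pixels.
    rng = np.random.default_rng(0)
    sst1 = rng.normal(15.0, 0.5, (40, 40))
    sst2 = np.roll(np.roll(sst1, 1, axis=0), 2, axis=1)
    lag, corr = mcc_displacement(sst1, sst2, center=(20, 20))
    ```

    The missing-result failure mode described in the abstract corresponds to a true lag of (0, 0): a displacement smaller than one pixel over the chosen time separation cannot be picked up by this discrete search.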

  12. Remedial Sheets for Progress Checks, Segments 19-40.

    ERIC Educational Resources Information Center

    New York Inst. of Tech., Old Westbury.

    The second part of the Self-Paced Physics Course remediation materials is presented for U. S. Naval Academy students who miss core problems on the progress check. The total of 101 problems is incorporated in this volume to match study segments 19 through 40. Each remedial sheet is composed of a statement of the missed problem and references to…

  13. Predicting the Magnetic Field of Earth-impacting CMEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kay, C.; Gopalswamy, N.; Reinard, A.

    Predicting the impact of coronal mass ejections (CMEs) and the southward component of their magnetic field is one of the key goals of space weather forecasting. We present a new model, the ForeCAT In situ Data Observer (FIDO), for predicting the in situ magnetic field of CMEs. We first simulate a CME using ForeCAT, a model for CME deflection and rotation resulting from the background solar magnetic forces. Using the CME position and orientation from ForeCAT, we then determine the passage of the CME over a simulated spacecraft. We model the CME’s magnetic field using a force-free flux rope and we determine the in situ magnetic profile at the synthetic spacecraft. We show that FIDO can reproduce the general behavior of four observed CMEs. FIDO results are very sensitive to the CME’s position and orientation, and we show that the uncertainty in a CME’s position and orientation from coronagraph images corresponds to a wide range of in situ magnitudes and even polarities. This small range of positions and orientations also includes CMEs that entirely miss the satellite. We show that two derived parameters (the normalized angular distance between the CME nose and satellite position and the angular difference between the CME tilt and the position angle of the satellite with respect to the CME nose) can be used to reliably determine whether an impact or miss occurs. We find that the same criteria separate the impacts and misses for cases representing all four observed CMEs.

  14. Improving the Automatic Inversion of Digital ISIS-2 Ionogram Reflection Traces into Topside Vertical Electron-Density Profiles

    NASA Technical Reports Server (NTRS)

    Benson, R. F.; Truhlik, V.; Huang, X.; Wang, Y.; Bilitza, D.

    2011-01-01

    The topside sounders on the four satellites of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. The resulting ionograms were displayed on 35-mm film for analysis by visual inspection. Each of these satellites, launched between 1962 and 1971, produced data for 10 to 20 years. A number of the original telemetry tapes from this large data set have been converted directly into digital records. Software known as the TOPside Ionogram Scaler with True-height (TOPIST) algorithm has been produced that enables the automatic inversion of ISIS-2 ionogram reflection traces into topside vertical electron-density profiles, Ne(h). More than a million digital Alouette/ISIS topside ionograms have been produced, and over 300,000 are from ISIS 2. Many of these ISIS-2 ionograms correspond to a passive mode of operation for the detection of natural radio emissions and thus do not contain ionospheric reflection traces. TOPIST, however, is not able to produce Ne(h) profiles from all of the ISIS-2 ionograms with reflection traces, because some of them do not contain frequency information. This information is missing due to difficulties encountered during the analog-to-digital conversion process in the detection of the ionogram frame-sync pulse and/or the frequency markers. Of the many digital topside ionograms that TOPIST was able to process, over 200 were found where direct comparisons could be made with Ne(h) profiles that were produced by manual scaling in the early days of the ISIS program. While many of these comparisons indicated excellent agreement (<10% average difference over the entire profile), there were also many cases with large differences (more than a factor of two). 
Here we report on two approaches to improving the automatic inversion process: (1) improving the quality of the digital ionogram database by remedying the missing-frequency-information problem when possible, and (2) using the above-mentioned comparisons as teaching examples of how to improve the original TOPIST software.

  15. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
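    The statistical-analysis step (relative error, absolute error, and root mean square error) can be sketched as below. The function name and sample values are illustrative, not taken from the package:

    ```python
    import numpy as np

    def error_stats(obs, sat):
        """Mean absolute error, mean relative error, and RMSE between
        satellite retrievals and reference observations."""
        obs = np.asarray(obs, dtype=float)
        sat = np.asarray(sat, dtype=float)
        diff = sat - obs
        abs_err = np.mean(np.abs(diff))              # mean absolute error
        rel_err = np.mean(np.abs(diff) / np.abs(obs))  # mean relative error
        rmse = np.sqrt(np.mean(diff ** 2))           # root mean square error
        return abs_err, rel_err, rmse

    # Toy comparison of three collocated values:
    obs = [10.0, 20.0, 30.0]
    sat = [12.0, 18.0, 33.0]
    abs_err, rel_err, rmse = error_stats(obs, sat)
    ```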

  16. Implementation of the DINEOF ArcGIS Toolbox: Case study of reconstruction of Chlorophyll-a missing data over the Mediterranean using MyOcean satellite data products.

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Andreas; Stylianou, Stavros; Georgiou, Georgios; Hadjimitsis, Diofantos; Akylas, Evangelos

    2014-05-01

    ArcGIS® is a well-known standard in Geographical Information Systems, used over the years for various remote sensing procedures. During the last decade, Rixen (2003) and Azcarate (2011) presented the DINEOF (Data Interpolating Empirical Orthogonal Functions) method, an EOF-based technique to reconstruct missing data in satellite images. Recent results of the DINEOF method in various experimental trials (Wang and Liu, 2013; Nikolaidis et al., 2013; 2014) showed that this computationally affordable method leads to effective reconstruction of missing data in geophysical fields derived from satellite data, such as chlorophyll-a, sea surface temperature or salinity. Implementing the method in a GIS leads to a complete and integrated approach, enhancing its applicability, and the inclusion of statistical tools within the GIS multiplies its effectiveness, providing interoperability with other sources in the same application environment. This may be especially useful in studies where various different kinds of data are of interest. For this purpose, we have implemented a new GIS toolbox that automates the usage of the algorithm, incorporating the DINEOF codes provided by GHER (GeoHydrodynamics and Environment Research Group of the University of Liège) into ArcGIS®. A case study of filling in missing chlorophyll-a data over the Mediterranean Sea for an 18-day period is analyzed as an example of the effectiveness and simplicity of the toolbox. More specifically, we focus on chlorophyll-a MODIS satellite data collected by CNR-ISAC (Italian National Research Council, Institute of Atmospheric Sciences and Climate), from the respective products of the MyOcean2® organization, which provides free online access to Level 3 data with 1 km resolution. All the daily products, with an initial data coverage of only 27%, were successfully reconstructed over the Mediterranean Sea. 
[1] Alvera-Azcárate A., Barth A., Sirjacobs D., Lenartz F., Beckers J.-M. Data Interpolating Empirical Orthogonal Functions (DINEOF): a tool for geophysical data analyses. Medit. Mar. Sci., 5-11, (2011). [2] Rixen M., Beckers J. M., EOF Calculations and Data Filling from Incomplete Oceanographic Datasets. Journal of Atmospheric and Oceanic Technology, Vol. 20(12), pp. 1839-1856, (2003). [3] Nikolaidis A., Georgiou G., Hadjimitsis D. and E. Akylas, Applying a DINEOF algorithm on cloudy sea-surface temperature satellite data over the eastern Mediterranean Sea, Central European Journal of Geosciences 6(1), pp. 1-16, (2014). [4] Nikolaidis A., Georgiou G., Hadjimitsis D. and E. Akylas, Applying DINEOF algorithm on cloudy sea-surface temperature satellite data over the eastern Mediterranean Sea, Proc. SPIE 8795, First International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2013), 87950L, 8-10 April 2013, Paphos, Cyprus, doi:10.1117/12.2029085. [5] Wang Y. and D. Liu, Reconstruction of satellite chlorophyll-a data using a modified DINEOF method: a case study in the Bohai and Yellow seas, China, International Journal of Remote Sensing, Vol. 35(1), 204-217, (2014).

  17. Silver Alerts and the Problem of Missing Adults with Dementia

    ERIC Educational Resources Information Center

    Carr, Dawn; Muschert, Glenn W.; Kinney, Jennifer; Robbins, Emily; Petonito, Gina; Manning, Lydia; Brown, J. Scott

    2010-01-01

    In the months following the introduction of the National AMBER (America's Missing: Broadcast Emergency Response) Alert plan used to locate missing and abducted children, Silver Alert programs began to emerge. These programs use the same infrastructure and approach to find a different missing population, cognitively impaired older adults. By late…

  18. A Cautious Note on Auxiliary Variables That Can Increase Bias in Missing Data Problems.

    PubMed

    Thoemmes, Felix; Rose, Norman

    2014-01-01

    The treatment of missing data in the social sciences has changed tremendously during the last decade. Modern missing data techniques such as multiple imputation and full-information maximum likelihood are used much more frequently. These methods assume that data are missing at random. One very common approach to increase the likelihood that the missing-at-random assumption holds consists of including many covariates as so-called auxiliary variables. These variables are included either based on data considerations or in an inclusive fashion; that is, taking all available auxiliary variables. In this article, we point out that there are some instances in which auxiliary variables exhibit the surprising property of increasing bias in missing data problems. In a series of focused simulation studies, we highlight some situations in which this type of biasing behavior can occur. We briefly discuss possible ways to avoid selecting bias-inducing covariates as auxiliary variables.

  19. Multiple Imputation for Multivariate Missing-Data Problems: A Data Analyst's Perspective.

    ERIC Educational Resources Information Center

    Schafer, Joseph L.; Olsen, Maren K.

    1998-01-01

    The key ideas of multiple imputation for multivariate missing data problems are reviewed. Software programs available for this analysis are described, and their use is illustrated with data from the Adolescent Alcohol Prevention Trial (W. Hansen and J. Graham, 1991). (SLD)

  20. A heuristic approach to worst-case carrier-to-interference ratio maximization in satellite system synthesis

    NASA Technical Reports Server (NTRS)

    Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Mount-Campbell, Clark A.; Olen, Carl A.

    1990-01-01

    Consideration is given to the problem of allotting GEO locations to communication satellites so as to maximize the smallest aggregate carrier-to-interference (C/I) ratio calculated at any test point (assumed earth station). The location allotted to each satellite must be within the satellite's service arc, and angular separation constraints are enforced for each pair of satellites to control single-entry EMI. Solutions to this satellite system synthesis problem (SSSP) are found by embedding two heuristic procedures for the satellite location problem (SLP) in a binary search routine that estimates the largest increment to the angular separation values which still permits a feasible solution to SLP and SSSP. Numerical results for a 183-satellite, 208-beam example problem are presented.
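    The binary-search outer loop, finding the largest angular-separation increment that still admits a feasible allotment, can be sketched generically. The feasibility predicate below is a stand-in for the paper's SLP heuristics and is assumed monotone (feasible at an increment implies feasible at any smaller one):

    ```python
    def largest_feasible(lo, hi, feasible, tol=1e-3):
        """Binary search for the largest parameter value at which
        `feasible` still holds, assuming feasible(lo) is True,
        feasible(hi) is False, and feasibility is monotone."""
        best = lo
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if feasible(mid):
                best, lo = mid, mid   # mid works: search higher
            else:
                hi = mid              # mid fails: search lower
        return best

    # Toy stand-in for the SLP heuristics: separation increments up to
    # 2.7 degrees (an invented value) remain feasible.
    delta = largest_feasible(0.0, 10.0, lambda inc: inc <= 2.7)
    ```

    In the paper's setting, each call to the predicate would run the SLP heuristics with the candidate increment added to all pairwise separation requirements and report whether a feasible allotment was found.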

  1. The Dragonfly Nearby Galaxies Survey. III. The Luminosity Function of the M101 Group

    NASA Astrophysics Data System (ADS)

    Danieli, Shany; van Dokkum, Pieter; Merritt, Allison; Abraham, Roberto; Zhang, Jielai; Karachentsev, I. D.; Makarova, L. N.

    2017-03-01

    We obtained follow-up HST observations of the seven low surface brightness galaxies discovered with the Dragonfly Telephoto Array in the field of the massive spiral galaxy M101. Out of the seven galaxies, only three were resolved into stars and are potentially associated with the M101 group at D = 7 Mpc. Based on HST ACS photometry in the broad F606W and F814W filters, we use a maximum likelihood algorithm to locate the Tip of the Red Giant Branch in galaxy color-magnitude diagrams. Distances are 6.38 ± 0.35, 6.87 (+0.21/−0.30), and 6.52 (+0.25/−0.27) Mpc, and we confirm that they are members of the M101 group. Combining the three confirmed low-luminosity satellites with previous results for brighter group members, we find the M101 galaxy group to be a sparsely populated galaxy group consisting of seven group members, down to M_V = −9.2 mag. We compare the M101 cumulative luminosity function to that of the Milky Way and M31. We find that they are remarkably similar; in fact, the cumulative luminosity function of the M101 group gets even flatter for fainter magnitudes, and we show that the M101 group might exhibit the two known small-scale flaws in the ΛCDM model, namely the “missing satellite” problem and the “too big to fail” problem. Kinematic measurements of M101's satellite galaxies are required to determine whether the “too big to fail” problem does in fact exist in the M101 group.

  2. ScienceCast 29: Did Earth Have Two Moons?

    NASA Image and Video Library

    2011-09-22

    Did our planet once have two moons? Some researchers say so. Moreover, the missing satellite might still be up there--splattered across the far side of the Moon. NASA's GRAIL mission could help confirm or refute the "two moon" hypothesis.

  3. A Review On Missing Value Estimation Using Imputation Algorithm

    NASA Astrophysics Data System (ADS)

    Armina, Roslan; Zain, Azlan Mohd; Azizah Ali, Nor; Sallehuddin, Roselina

    2017-09-01

    The presence of missing values in a data set has always been a major problem for precise prediction. Methods for imputing missing values need to minimize the effect of incomplete data sets on the prediction model. Many algorithms have been proposed as countermeasures to the missing value problem. In this review, we provide a comprehensive analysis of existing imputation algorithms, focusing on the techniques used and on whether global or local information in the data set is exploited for missing value estimation. In addition, validation methods for imputation results and ways to measure the performance of imputation algorithms are described. The objective of this review is to highlight possible improvements to existing methods, and it is hoped that it gives the reader a better understanding of trends in imputation methods.
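As a concrete example of an imputation algorithm that uses local rather than global information, the following stdlib-Python sketch (a generic k-nearest-neighbour imputer written for illustration, not any specific method from the review) fills each missing entry from the most similar rows instead of a global column mean:

```python
def knn_impute(rows, k=2):
    """Fill each missing entry (None) with the mean of that column over
    the k rows most similar on the jointly observed columns.
    Assumes every column has at least k observed donor rows."""
    completed = [row[:] for row in rows]
    for i, row in enumerate(rows):
        for j, val in enumerate(row):
            if val is not None:
                continue
            candidates = []
            for other in rows:
                if other is row or other[j] is None:
                    continue
                # Compare only on features both rows observe.
                shared = [(a, b) for a, b in zip(row, other)
                          if a is not None and b is not None]
                if not shared:
                    continue
                dist = sum((a - b) ** 2 for a, b in shared) / len(shared)
                candidates.append((dist, other[j]))
            candidates.sort()
            donors = [v for _, v in candidates[:k]]
            completed[i][j] = sum(donors) / len(donors)
    return completed

rows = [[1.0, 2.0], [1.1, None], [5.0, 10.0], [1.2, 2.2]]
filled = knn_impute(rows, k=2)
```

Here the missing entry is imputed from the two nearby rows rather than from the distant outlier, which is exactly the local-information behaviour a global mean would miss.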

  4. Satellite tagging and biopsy sampling of killer whales at subantarctic Marion Island: effectiveness, immediate reactions and long-term responses.

    PubMed

    Reisinger, Ryan R; Oosthuizen, W Chris; Péron, Guillaume; Cory Toussaint, Dawn; Andrews, Russel D; de Bruyn, P J Nico

    2014-01-01

    Remote tissue biopsy sampling and satellite tagging are becoming widely used in large marine vertebrate studies because they allow the collection of a diverse suite of otherwise difficult-to-obtain data which are critical in understanding the ecology of these species and to their conservation and management. Researchers must carefully consider their methods not only from an animal welfare perspective, but also to ensure the scientific rigour and validity of their results. We report methods for shore-based, remote biopsy sampling and satellite tagging of killer whales Orcinus orca at Subantarctic Marion Island. The performance of these methods is critically assessed using 1) the attachment duration of low-impact minimally percutaneous satellite tags; 2) the immediate behavioural reactions of animals to biopsy sampling and satellite tagging; 3) the effect of researcher experience on biopsy sampling and satellite tagging; and 4) the mid- (1 month) and long- (24 month) term behavioural consequences. To study mid- and long-term behavioural changes we used multievent capture-recapture models that accommodate imperfect detection and individual heterogeneity. We made 72 biopsy sampling attempts (resulting in 32 tissue samples) and 37 satellite tagging attempts (deploying 19 tags). Biopsy sampling success rates were low (43%), but tagging rates were high with improved tag designs (86%). The improved tags remained attached for 26 ± 14 days (mean ± SD). Individuals most often showed no reaction when attempts missed (66%) and a slight reaction (defined as a slight flinch, slight shake, short acceleration, or immediate dive) when hit (54%). Severe immediate reactions were never observed. Hit or miss and age-sex class were important predictors of the reaction, but the method (tag or biopsy) was unimportant.
Multievent trap-dependence modelling revealed considerable variation in individual sighting patterns; however, there were no significant mid- or long-term changes following biopsy sampling or tagging.

  6. A Review of Missing Data Handling Methods in Education Research

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb R.

    2014-01-01

    Missing data are a common occurrence in survey-based research studies in education, and the way missing values are handled can significantly affect the results of analyses based on such data. Despite known problems with performance of some missing data handling methods, such as mean imputation, many researchers in education continue to use those…

  7. Effects of Modified Schema-Based Instruction on Real-World Algebra Problem Solving of Students with Autism Spectrum Disorder and Moderate Intellectual Disability

    ERIC Educational Resources Information Center

    Root, Jenny Rose

    2016-01-01

    The current study evaluated the effects of modified schema-based instruction (SBI) on the algebra problem solving skills of three middle school students with autism spectrum disorder and moderate intellectual disability (ASD/ID). Participants learned to solve two types of group word problems: missing-whole and missing-part. The themes of the word…

  8. The Missing Curriculum in Physics Problem-Solving Education

    NASA Astrophysics Data System (ADS)

    Williams, Mobolaji

    2018-05-01

    Physics is often seen as an excellent introduction to science because it allows students to learn not only the laws governing the world around them, but also, through the problems students solve, a way of thinking which is conducive to solving problems outside of physics and even outside of science. In this article, we contest this latter idea and argue that in physics classes, students do not learn widely applicable problem-solving skills because physics education almost exclusively requires students to solve well-defined problems rather than the less-defined problems which better model problem solving outside of a formal class. Using personal, constructed, and historical accounts of Schrödinger's development of the wave equation and Feynman's development of path integrals, we argue that what is missing in problem-solving education is practice in identifying gaps in knowledge and in framing these knowledge gaps as questions of the kind answerable using techniques students have learned. We discuss why these elements are typically not taught as part of the problem-solving curriculum and end with suggestions on how to incorporate these missing elements into physics classes.

  9. Substructure of fuzzy dark matter haloes

    NASA Astrophysics Data System (ADS)

    Du, Xiaolong; Behrens, Christoph; Niemeyer, Jens C.

    2017-02-01

    We derive the halo mass function (HMF) for fuzzy dark matter (FDM) by solving the excursion set problem explicitly with a mass-dependent barrier function, which has not been done before. We find that compared to the naive approach of the Sheth-Tormen HMF for FDM, our approach has a higher cutoff mass and the cutoff mass changes less strongly with redshifts. Using merger trees constructed with a modified version of the Lacey & Cole formalism that accounts for suppressed small-scale power and the scale-dependent growth of FDM haloes and the semi-analytic GALACTICUS code, we study the statistics of halo substructure including the effects from dynamical friction and tidal stripping. We find that if the dark matter is a mixture of cold dark matter (CDM) and FDM, there will be a suppression on the halo substructure on small scales which may be able to solve the missing satellites problem faced by the pure CDM model. The suppression becomes stronger with increasing FDM fraction or decreasing FDM mass. Thus, it may be used to constrain the FDM model.

  10. Characterizing Longitude-Dependent Orbital Debris Congestion in the Geosynchronous Orbit Regime

    NASA Astrophysics Data System (ADS)

    Anderson, Paul V.

    The geosynchronous orbit (GEO) is a unique commodity of the satellite industry that is becoming increasingly contaminated with orbital debris, but is heavily populated with high-value assets from the civil, commercial, and defense sectors. The GEO arena is home to hundreds of communications, data transmission, and intelligence satellites collectively insured for an estimated 18.3 billion USD. As the lack of natural cleansing mechanisms at the GEO altitude renders the lifetimes of GEO debris essentially infinite, conjunction and risk assessment must be performed to safeguard operational assets from debris collisions. In this thesis, longitude-dependent debris congestion is characterized by predicting the number of near-miss events per day for every longitude slot at GEO, using custom debris propagation tools and a torus intersection metric. Near-miss events with the present-day debris population are assigned risk levels based on GEO-relative position and speed, and this risk information is used to prioritize the population for debris removal target selection. Long-term projections of debris growth under nominal launch traffic, mitigation practices, and fragmentation events are also discussed, and latitudinal synchronization of the GEO debris population is explained via node variations arising from luni-solar gravity. In addition to characterizing localized debris congestion in the GEO ring, this thesis further investigates the conjunction risk to operational satellites or debris removal systems applying low-thrust propulsion to raise orbit altitude at end-of-life to a super-synchronous disposal orbit. Conjunction risks as a function of thrust level, miss distance, longitude, and semi-major axis are evaluated, and a guidance method for evading conjuncting debris with continuous thrust by means of a thrust heading change via single-shooting is developed.

  11. Suspended Education in Massachusetts: Using Days of Lost Instruction Due to Suspension to Evaluate Our Schools

    ERIC Educational Resources Information Center

    Losen, Daniel J.; Sun, Wei-Ling; Keith, Michael A., II

    2017-01-01

    Missed instruction can have a devastating impact on educational outcomes. Some reasons for missed instruction are beyond the control of schools and districts: some students miss school due to mental or physical illness or injury, and transportation problems sometimes are to blame. One major reason for missed instruction that schools can directly…

  12. The continuing problem of missed test results in an integrated health system with an advanced electronic medical record.

    PubMed

    Wahls, Terry; Haugen, Thomas; Cram, Peter

    2007-08-01

    Missed results can cause needless treatment delays. However, there is little data about the magnitude of this problem and the systems that clinics use to manage test results. Surveys about potential problems related to test results management were developed and administered to clinical staff in a regional Veterans Administration (VA) health care network. The provider survey, conducted four times between May 2005 and October 2006, sampling VA staff physicians, physician assistants, nurse practitioners, and internal medicine trainees, asked questions about the frequency of missed results and diagnosis or treatment delays seen in the antecedent two weeks in their clinics, or if a trainee, the antecedent month. Clinical staff survey response rate was 39% (143 of 370), with 40% using standard operating procedures to manage test results. Forty-four percent routinely reported all results to patients. The provider survey response rate was 50% (441 of 884) overall, with responses often (37% overall; range 29% to 46%) indicating they had seen patients with diagnosis or treatment delays attributed to a missed result; 15% reported two or more such encounters. Even in an integrated health system with an advanced electronic medical record, missed test results and associated diagnosis or treatment delays are common. Additional study and measures of missed results and associated treatment delays are needed.

  13. Methods for Handling Missing Secondary Respondent Data

    ERIC Educational Resources Information Center

    Young, Rebekah; Johnson, David

    2013-01-01

    Secondary respondent data are underutilized because researchers avoid using these data in the presence of substantial missing data. The authors reviewed, evaluated, and tested solutions to this problem. Five strategies of dealing with missing partner data were reviewed: (a) complete case analysis, (b) inverse probability weighting, (c) correction…

  14. Why Missing Data Matter in the Longitudinal Study of Adolescent Development: Using the 4-H Study to Understand the Uses of Different Missing Data Methods

    ERIC Educational Resources Information Center

    Jelicic, Helena; Phelps, Erin; Lerner, Richard M.

    2010-01-01

    The study of adolescent development rests on methodologically appropriate collection and interpretation of longitudinal data. While all longitudinal studies of adolescent development involve missing data, the methods to treat missingness that have been recommended most often focus on missing data from cross-sectional studies. The problems of…

  15. Predicting the Velocity Dispersions of the Dwarf Satellite Galaxies of Andromeda

    NASA Astrophysics Data System (ADS)

    McGaugh, Stacy S.

    2016-05-01

    Dwarf Spheroidal galaxies in the Local Group are the faintest and most diffuse stellar systems known. They exhibit large mass discrepancies, making them popular laboratories for studying the missing mass problem. The PANDAS survey of M31 revealed dozens of new examples of such dwarfs. As these systems were discovered, it was possible to use the observed photometric properties to predict their stellar velocity dispersions with the modified gravity theory MOND. These predictions, made in advance of the observations, have since been largely confirmed. A unique feature of MOND is that a structurally identical dwarf will behave differently when it is or is not subject to the external field of a massive host like Andromeda. The role of this "external field effect" is critical in correctly predicting the velocity dispersions of dwarfs that deviate from empirical scaling relations. With continued improvement in the observational data, these systems could provide a test of the strong equivalence principle.

  16. Torrential Rain in China

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Concentric ovals of red, orange, yellow, and green are draped over southern China, showing rainfall totals for the week of June 4 through June 11, 2007. The rainfall totals are from the Goddard Space Flight Center Multi-satellite Precipitation Analysis, which is based on rainfall measurements taken by the Tropical Rainfall Measuring Mission (TRMM) satellite. Though seasonal rains are not unexpected in the area, the rain that fell during the week was torrential and relentless. As the image shows, a broad stretch of China received up to 200 millimeters (8 inches) of rain, and some areas were inundated with up to 500 millimeters (20 inches). Floods and landslides resulted, destroying crops and forcing some 643,000 people from their homes, reported the Xinhua News Agency on ReliefWeb. As of June 11, 71 people had died and 13 were missing. The most affected area was the southern coast, where rainfall totals are highest in this image. Heavy tropical rains combined with steep mountains make southeastern China prone to devastating landslides. Monitoring landslide-producing conditions typically requires extensive networks of ground-based rain gauges and weather instruments. But many developing countries in high-risk areas lack the resources to maintain such systems; heavy rains and flooding often wash away ground-based instruments. Robert Adler, a senior scientist in the Laboratory for Atmospheres at Goddard Space Flight Center, and Yang Hong, a research scientist at Goddard Earth Sciences Technology Center, are confronting the problem by developing a satellite-based system for predicting landslides. The system relies on TRMM data to predict when rainfall in different areas has reached a landslide-triggering threshold. The system makes data available on the Internet just a few hours after the satellite makes its observations. 
To read more about the landslide-monitoring system, please read the feature article Satellite Monitors Rains That Trigger Landslides, http://earthobservatory.nasa.gov/Study/LandslideWarning/. TRMM is a joint mission between NASA and the Japanese space agency, JAXA. NASA images produced by Hal Pierce (SSAI/NASA GSFC).

  17. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
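Of the four missing-data methods compared, the mean of adjacent observations is specific to time series and simple to sketch. The stdlib-Python toy below is an illustration of that method, not the authors' implementation:

```python
def impute_adjacent(series):
    """Replace each missing value (None) in a time series with the mean
    of the nearest observed neighbour on either side; fall back to the
    single available neighbour at the boundaries."""
    out = list(series)
    for i, v in enumerate(series):
        if v is not None:
            continue
        prev = next((series[j] for j in range(i - 1, -1, -1)
                     if series[j] is not None), None)
        nxt = next((series[j] for j in range(i + 1, len(series))
                    if series[j] is not None), None)
        vals = [x for x in (prev, nxt) if x is not None]
        out[i] = sum(vals) / len(vals)
    return out
```

Unlike overall mean substitution, this preserves local level and trend, which matters for the autocorrelation structure that ARIMA models estimate.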

  18. They Remember the "Lost" People.

    ERIC Educational Resources Information Center

    Klages, Karen

    Estimates of the number of children currently missing in the United States are only approximate because there is no effective central data bank to collect information on missing persons and unidentified bodies. However, the problem appears to have reached epidemic proportions. Some parents of missing persons have formed organizations in different…

  19. The Empirical Nature and Statistical Treatment of Missing Data

    ERIC Educational Resources Information Center

    Tannenbaum, Christyn E.

    2009-01-01

    Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…

  20. Interaction of marine geodesy, satellite technology and ocean physics

    NASA Technical Reports Server (NTRS)

    Mourad, A. G.; Fubara, D. M. J.

    1972-01-01

    The possible applications of satellite technology in marine geodesy and geodetic related ocean physics were investigated. Four major problems were identified in the areas of geodesy and ocean physics: (1) geodetic positioning and control establishment; (2) sea surface topography and geoid determination; (3) geodetic applications to ocean physics; and (4) ground truth establishment. It was found that satellite technology can play a major role in their solution. For solution of the first problem, the use of satellite geodetic techniques, such as Doppler and C-band radar ranging, is demonstrated to fix the three-dimensional coordinates of marine geodetic control if multi-satellite passes are used. The second problem is shown to require the use of satellite altimetry, along with accurate knowledge of ocean-dynamics parameters such as sea state, ocean tides, and mean sea level. The use of both conventional and advanced satellite techniques appeared to be necessary to solve the third and fourth problems.

  1. On orbital allotments for geostationary satellites

    NASA Technical Reports Server (NTRS)

    Gonsalvez, David J. A.; Reilly, Charles H.; Mount-Campbell, Clark A.

    1986-01-01

    The following satellite synthesis problem is addressed: communication satellites are to be allotted positions on the geostationary arc so that interference does not exceed a given acceptable level by enforcing conservative pairwise satellite separation. A desired location is specified for each satellite, and the objective is to minimize the sum of the deviations between the satellites' prescribed and desired locations. Two mixed integer programming models for the satellite synthesis problem are presented. Four solution strategies, branch-and-bound, Benders' decomposition, linear programming with restricted basis entry, and a switching heuristic, are used to find solutions to example synthesis problems. Computational results indicate the switching algorithm yields solutions of good quality in reasonable execution times when compared to the other solution methods. It is demonstrated that the switching algorithm can be applied to synthesis problems with the objective of minimizing the largest deviation between a prescribed location and the corresponding desired location. Furthermore, it is shown that the switching heuristic can use nonconservative, location-dependent satellite separations in order to satisfy interference criteria.
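The flavour of a cheap constructive heuristic for this kind of synthesis problem can be sketched as follows. This is a naive greedy placement invented for illustration only; it is far simpler than the paper's mixed integer programming models or switching heuristic:

```python
def allot_positions(desired, sep):
    """Greedy sketch: keep satellites in order of desired longitude and
    push each one east just enough to honour the minimum pairwise
    separation (degrees). Returns allotted longitudes in input order."""
    order = sorted(range(len(desired)), key=lambda i: desired[i])
    pos = {}
    prev = None
    for i in order:
        p = desired[i] if prev is None else max(desired[i], prev + sep)
        pos[i] = p
        prev = p
    return [pos[i] for i in range(len(desired))]

allotted = allot_positions([10.0, 10.5, 20.0], 2.0)
```

Every pair ends up at least the required separation apart, at the cost of pushing some satellites away from their desired slots; the deviation-minimizing MIP formulations in the paper trade off those pushes optimally.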

  2. Reliability and Validity in Measuring the Value Added of Schools

    ERIC Educational Resources Information Center

    van de Grift, Wim

    2009-01-01

    Instability in the school population between school entrance and school leaving is not "just a problem of missing data" but often the visible result of the educational problems in some schools and is, therefore, not merely to be treated as missing data but as indicator for the quality of educational processes. Even the most superior…

  3. A Couple of "Lim (h[right arrow]0)-Is-Missing" Problems

    ERIC Educational Resources Information Center

    Lau, Ko Hin

    2007-01-01

    Since most students "hate" the concept of limit, in order to make them "happier," this article suggests a couple of naive "lim (h[right arrow]0)-is-missing" problems for them to try for fun. Indeed, differential functional equations that are related to difference quotients in calculus are studied in this paper. In particular, two interesting…

  4. Addressing the Missing Instructional Data Problem: Using a Teacher Log to Document Tier 1 Instruction

    ERIC Educational Resources Information Center

    Kurz, Alexander; Elliott, Stephen N.; Roach, Andrew T.

    2015-01-01

    Response-to-intervention (RTI) systems posit that Tier 1 consists of high-quality general classroom instruction using evidence-based methods to address the needs of most students. However, data on the extent to which general education teachers provide such instruction are rarely collected. This missing instructional data problem may result in RTI…

  5. Momentum Flux Estimates for South Georgia Island Mountain Waves in the Stratosphere Observed via Satellite

    NASA Technical Reports Server (NTRS)

    Alexander, M. Joan; Eckermann, Stephen D.; Broutman, Dave; Ma, Jun

    2009-01-01

    We show high-resolution satellite observations of mountain wave events in the stratosphere above South Georgia Island in the remote southern Atlantic Ocean and compute the wave momentum fluxes for these events. The fluxes are large, and they imply important drag forces on the circulation. Small island orography is generally neglected in mountain wave parameterizations used in global climate models because limited model resolution treats the grid cell containing the island as ocean rather than land. Our results show that satellite observations can be used to quantitatively constrain mountain wave momentum fluxes, and they suggest that mountain waves from island topography may be an important missing source of drag on the atmospheric circulation.

  6. Sparse subspace clustering for data with missing entries and high-rank matrix completion.

    PubMed

    Fan, Jicong; Chow, Tommy W S

    2017-09-01

    Many methods have recently been proposed for subspace clustering, but they are often unable to handle incomplete data because of missing entries. Using matrix completion methods to recover missing entries is a common way to solve the problem. Conventional matrix completion methods require that the matrix should be of low-rank intrinsically, but most matrices are of high-rank or even full-rank in practice, especially when the number of subspaces is large. In this paper, a new method called Sparse Representation with Missing Entries and Matrix Completion is proposed to solve the problems of incomplete-data subspace clustering and high-rank matrix completion. The proposed algorithm alternately computes the matrix of sparse representation coefficients and recovers the missing entries of a data matrix. The proposed algorithm recovers missing entries through minimizing the representation coefficients, representation errors, and matrix rank. Thorough experimental study and comparative analysis based on synthetic data and natural images were conducted. The presented results demonstrate that the proposed algorithm is more effective in subspace clustering and matrix completion compared with other existing methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
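The alternating scheme described, fitting a factorization to the observed entries only and reading imputed values off the reconstruction, can be illustrated with a rank-1 stdlib-Python toy. This is a generic alternating-least-squares completion for demonstration, not the proposed sparse-representation algorithm:

```python
def complete_rank1(M, iters=500):
    """Alternating least squares for rank-1 matrix completion:
    approximate M ~ u v^T using only the observed entries (None marks a
    missing entry). Assumes every row and column has at least one
    observed entry."""
    m, n = len(M), len(M[0])
    u = [1.0] * m
    v = [1.0] * n
    for _ in range(iters):
        for i in range(m):  # update u with v fixed
            num = sum(v[j] * M[i][j] for j in range(n) if M[i][j] is not None)
            den = sum(v[j] ** 2 for j in range(n) if M[i][j] is not None)
            u[i] = num / den
        for j in range(n):  # update v with u fixed
            num = sum(u[i] * M[i][j] for i in range(m) if M[i][j] is not None)
            den = sum(u[i] ** 2 for i in range(m) if M[i][j] is not None)
            v[j] = num / den
    return [[u[i] * v[j] for j in range(n)] for i in range(m)]

# A rank-1 matrix with one entry missing; completion recovers it.
M = [[1.0, 2.0], [2.0, 4.0], [3.0, None]]
completed = complete_rank1(M)
```

For genuinely high-rank data this rank-1 (or any fixed low-rank) model fails, which is precisely the limitation the record's proposed method targets by combining completion with sparse representation.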

  7. Wetland monitoring with Global Navigation Satellite System reflectometry

    PubMed Central

    Zuffada, Cinzia; Shah, Rashmi; Chew, Clara; Lowe, Stephen T.; Mannucci, Anthony J.; Cardellach, Estel; Brakenridge, G. Robert; Geller, Gary; Rosenqvist, Ake

    2017-01-01

    Information about wetland dynamics remains a major gap in characterizing, understanding, and projecting changes in atmospheric methane and terrestrial water storage. A review of current satellite methods to delineate and monitor wetland change shows some recent advances, but much improved sensing technologies are still needed for wetland mapping, not only to provide more accurate global inventories but also to examine changes spanning multiple decades. Global Navigation Satellite Systems Reflectometry (GNSS-R) signatures from aircraft over the Ebro River Delta in Spain and satellite measurements over the Mississippi River and adjacent watersheds demonstrate that inundated wetlands can be identified under different vegetation conditions, including a dense rice canopy and a thick forest with tall trees, where optical sensors and monostatic radars provide limited capabilities. Advantages as well as constraints of GNSS-R are presented, and the synergy with various satellite observations is considered to achieve a breakthrough capability for multidecadal wetland dynamics monitoring with frequent global coverage at multiple spatial and temporal scales. PMID:28331894

  8. Strategies for Dealing with Missing Accelerometer Data.

    PubMed

    Stephens, Samantha; Beyene, Joseph; Tremblay, Mark S; Faulkner, Guy; Pullnayegum, Eleanor; Feldman, Brian M

    2018-05-01

    Missing data is a universal research problem that can affect studies examining the relationship between physical activity measured with accelerometers and health outcomes. Statistical techniques are available to deal with missing data; however, available techniques have not been synthesized. A scoping review was conducted to summarize the advantages and disadvantages of identified methods of dealing with missing data from accelerometers. Missing data poses a threat to the validity and interpretation of trials using physical activity data from accelerometry. Imputation using multiple imputation techniques is recommended to deal with missing data and improve the validity and interpretation of studies using accelerometry. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Do word-problem features differentially affect problem difficulty as a function of students' mathematics difficulty with and without reading difficulty?

    PubMed

    Powell, Sarah R; Fuchs, Lynn S; Fuchs, Douglas; Cirino, Paul T; Fletcher, Jack M

    2009-01-01

    This study examined whether and, if so, how word-problem features differentially affect problem difficulty as a function of mathematics difficulty (MD) status: no MD (n = 109), MD only (n = 109), or MD in combination with reading difficulties (MDRD; n = 109). The problem features were problem type (total, difference, or change) and position of missing information in the number sentence representing the word problem (first, second, or third position). Students were assessed on 14 word problems near the beginning of third grade. Consistent with the hypothesis that mathematical cognition differs as a function of MD subtype, problem type affected problem difficulty differentially for MDRD versus MD-only students; however, the position of missing information in word problems did not. Implications for MD subtyping and for instruction are discussed.

  10. A comparison of model-based imputation methods for handling missing predictor values in a linear regression model: A simulation study

    NASA Astrophysics Data System (ADS)

    Hasan, Haliza; Ahmad, Sanizah; Osman, Balkish Mohd; Sapri, Shamsiah; Othman, Nadirah

    2017-08-01

    In regression analysis, missing covariate data is a common problem. Many researchers use ad hoc methods to overcome it because they are easy to implement; however, these methods require assumptions about the data that rarely hold in practice. Model-based methods such as Maximum Likelihood (ML) using the expectation-maximization (EM) algorithm and Multiple Imputation (MI) are more promising for dealing with the difficulties caused by missing data. Conversely, inappropriate methods of missing-value imputation can introduce serious bias that severely affects the parameter estimates. The main objective of this study is to provide a better understanding of missing-data concepts to help researchers select appropriate imputation methods. A simulation study was performed to assess the effects of different missing-data techniques on the performance of a regression model. The covariate data were generated from an underlying multivariate normal distribution, and the dependent variable was generated as a combination of the explanatory variables. Missing values in the covariates were simulated under a missing at random (MAR) mechanism, with four levels of missingness (10%, 20%, 30% and 40%) imposed. The ML and MI techniques available in SAS software were investigated. A linear regression model was fitted, and the performance measures MSE and R-squared were obtained. Results showed that MI is superior in handling missing data, with the highest R-squared and lowest MSE, when the percentage of missingness is less than 30%. Neither method handled missingness levels above 30% well.
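
    The MAR design described above can be sketched in a few lines. A minimal, illustrative simulation (not the authors' SAS code; all variable names and coefficients are invented) shows why ad hoc mean imputation is risky while, under MAR that depends only on observed covariates, even complete-case analysis remains unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Covariates from a multivariate normal, dependent variable from a
# linear combination of them, as in the simulation design.
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0.0, 1.0, n)

# MAR mechanism: the probability that x2 is missing depends only on the
# always-observed x1 (roughly 30% missing overall).
p_miss = 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.8)))
miss = rng.random(n) < p_miss
X_obs = X.copy()
X_obs[miss, 1] = np.nan

def fit_ols(X, y):
    """Least-squares fit with an intercept; returns the coefficients."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Complete-case analysis: drop rows with a missing x2.
beta_cc = fit_ols(X_obs[~miss], y[~miss])

# Ad hoc mean imputation: fill missing x2 with the observed mean,
# which attenuates the estimated slope for x2.
X_mean = X_obs.copy()
X_mean[miss, 1] = np.nanmean(X_obs[:, 1])
beta_mean = fit_ols(X_mean, y)
```

    With these settings the complete-case slope for x2 stays near its true value of 3, while the mean-imputed slope is pulled toward zero; proper MI (repeated draws from the conditional distribution of x2 given x1) would additionally give honest standard errors.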

  11. Missing Data and Multiple Imputation in the Context of Multivariate Analysis of Variance

    ERIC Educational Resources Information Center

    Finch, W. Holmes

    2016-01-01

    Multivariate analysis of variance (MANOVA) is widely used in educational research to compare means on multiple dependent variables across groups. Researchers faced with the problem of missing data often use multiple imputation of values in place of the missing observations. This study compares the performance of 2 methods for combining p values in…

  12. SPSS Syntax for Missing Value Imputation in Test and Questionnaire Data

    ERIC Educational Resources Information Center

    van Ginkel, Joost R.; van der Ark, L. Andries

    2005-01-01

    A well-known problem in the analysis of test and questionnaire data is that some item scores may be missing. Advanced methods for the imputation of missing data are available, such as multiple imputation under the multivariate normal model and imputation under the saturated logistic model (Schafer, 1997). Accompanying software was made available…

  13. Genetics Home Reference: Fraser syndrome

    MedlinePlus

    ... them, or they may be small ( microphthalmia ) or missing (anophthalmia). Eye abnormalities typically lead to impairment or ... other problems related to abnormal eye development, including missing eyebrows or eyelashes or a patch of hair ...

  14. The "Missing Males" and Other Gender-Related Issues in Music Education: A Critical Analysis of Evidence from the Music Supervisors' Journal, 1914-1924.

    ERIC Educational Resources Information Center

    Koza, Julia Eklund

    Boys' reluctance to participate in music education programs, particularly in school singing groups -- termed in this paper the "missing males" problem -- is just one among many pressing gender problems in music education. In order to discover whether boys' lack of participation in music, along with other gender-related issues, are merely…

  15. Multi-task Gaussian process for imputing missing data in multi-trait and multi-environment trials.

    PubMed

    Hori, Tomoaki; Montcho, David; Agbangla, Clement; Ebana, Kaworu; Futakuchi, Koichi; Iwata, Hiroyoshi

    2016-11-01

    A method based on a multi-task Gaussian process using self-measuring similarity gave increased accuracy for imputing missing phenotypic data in multi-trait and multi-environment trials. Multi-environment trial (MET) data often encounter the problem of missing data. Accurate imputation of missing data makes subsequent analysis more effective and the results easier to understand. Moreover, accurate imputation may help to reduce the cost of phenotyping for thinned-out lines tested in METs. METs are generally performed for multiple traits that are correlated to each other. Correlation among traits can be useful information for imputation, but single-trait-based methods cannot utilize the information shared by correlated traits. In this paper, we propose imputation methods based on a multi-task Gaussian process (MTGP) using self-measuring similarity kernels that reflect relationships among traits, genotypes, and environments. This framework allows us to use genetic correlation among multi-trait multi-environment data and also to combine MET data and marker genotype data. We compared the accuracy of three MTGP methods and iterative regularized PCA using rice MET data. Two scenarios for the generation of missing data at various missing rates were considered. The MTGP methods achieved better imputation accuracy than regularized PCA, especially at high missing rates. Under the 'uniform' scenario, in which missing data arise randomly, inclusion of marker genotype data increased the imputation accuracy at high missing rates. Under the 'fiber' scenario, in which missing data arise in all traits for some combinations of genotypes and environments, the inclusion of marker genotype data decreased the imputation accuracy for most traits while remarkably increasing it for a few traits. The proposed methods will be useful for solving the missing data problem in MET data.
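
    The GP posterior mean underlying such imputation can be sketched with a toy Kronecker-structured kernel over genotypes and traits. Everything below is illustrative (a shortcut trait kernel from the full data, invented sizes and noise), not the authors' exact self-measuring construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy MET-like data: 30 genotypes x 3 strongly correlated traits.
n_geno, n_trait = 30, 3
g = rng.normal(size=n_geno)                    # latent genotypic value
Y = np.outer(g, [1.0, 0.9, 0.8]) + 0.1 * rng.normal(size=(n_geno, n_trait))

# Vectorize and knock out ~20% of entries at random ('uniform' scenario).
y = Y.ravel()                                  # index = genotype * 3 + trait
mask = rng.random(y.size) < 0.2

# Similarity kernel between entries as the product of a genotype kernel
# and a trait kernel (a Kronecker structure). Using the full Y for the
# trait kernel is an illustrative shortcut.
Kg = np.exp(-0.5 * (g[:, None] - g[None, :]) ** 2)
Kt = np.corrcoef(Y.T)
K = np.kron(Kg, Kt)

# GP posterior mean for the missing entries given the observed ones.
o, m = ~mask, mask
K_oo = K[np.ix_(o, o)] + 0.01 * np.eye(o.sum())
y_hat = K[np.ix_(m, o)] @ np.linalg.solve(K_oo, y[o])

rmse = np.sqrt(np.mean((y_hat - y[m]) ** 2))
```

    The Kronecker kernel lets information flow across traits of the same genotype and across similar genotypes, which is the mechanism that makes multi-trait imputation outperform single-trait methods.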

  16. A MAP-based image interpolation method via Viterbi decoding of Markov chains of interpolation functions.

    PubMed

    Vedadi, Farhang; Shirani, Shahram

    2014-01-01

    A new method of image resolution up-conversion (image interpolation) based on maximum a posteriori sequence estimation is proposed. Instead of making a hard decision about the value of each missing pixel, we estimate the missing pixels in groups. At each missing pixel of the high resolution (HR) image, we consider an ensemble of candidate interpolation methods (interpolation functions). The interpolation functions are interpreted as states of a Markov model. In other words, the proposed method undergoes state transitions from one missing pixel position to the next. Accordingly, the interpolation problem is translated to the problem of estimating the optimal sequence of interpolation functions corresponding to the sequence of missing HR pixel positions. We derive a parameter-free probabilistic model for this to-be-estimated sequence of interpolation functions. Then, we solve the estimation problem using a trellis representation and the Viterbi algorithm. Using directional interpolation functions and sequence estimation techniques, we classify the new algorithm as an adaptive directional interpolation using soft-decision estimation techniques. Experimental results show that the proposed algorithm yields images with higher or comparable peak signal-to-noise ratios compared with some benchmark interpolation methods in the literature while being efficient in terms of implementation and complexity considerations.
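
    The sequence-estimation step can be illustrated with a generic Viterbi decoder over a small set of hypothetical interpolator "states" (the state labels, transition stickiness, and per-pixel evidence below are invented for illustration, not the paper's actual interpolation functions):

```python
import numpy as np

def viterbi(log_prior, log_trans, log_like):
    """Most probable state sequence for a Markov chain of states.

    log_prior: (S,)   log prior over states at the first position
    log_trans: (S, S) log transition probabilities between states
    log_like:  (T, S) per-position log-likelihood of each state
    """
    T, S = log_like.shape
    delta = log_prior + log_like[0]
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans          # (from, to)
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(S)] + log_like[t]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(delta))
    for t in range(T - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path

# Three hypothetical interpolators: 0 = horizontal, 1 = vertical, 2 = diagonal.
S = 3
log_prior = np.log(np.full(S, 1.0 / S))
# Sticky transitions: neighbouring missing pixels tend to share an interpolator.
trans = np.full((S, S), 0.2)
np.fill_diagonal(trans, 0.6)
log_trans = np.log(trans)
# Per-pixel evidence for each interpolator (larger = more likely).
log_like = np.log(np.array([
    [0.8, 0.1, 0.1],
    [0.7, 0.2, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
]))
path = viterbi(log_prior, log_trans, log_like)   # -> [0, 0, 1, 1, 2]
```

    Instead of a hard per-pixel choice, the decoder trades local evidence against transition costs, which is the "soft-decision" behaviour the abstract describes.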

  17. Singularity free N-body simulations called 'Dynamic Universe Model' don't require dark matter

    NASA Astrophysics Data System (ADS)

    Naga Parameswara Gupta, Satyavarapu

    The calculations of the Dynamic Universe Model can be successfully applied to finding the trajectories of the Pioneer satellites (the Pioneer anomaly) and of the New Horizons satellite en route to Pluto. No dark matter is assumed within the solar system radius; the effect on masses around the Sun appears as though there were an extra gravitational pull toward the Sun. The model solves the dynamics of extra-solar planets such as Planet X and of satellites such as Pioneer and New Horizons, giving 3-position, 3-velocity, and 3-acceleration for their masses, considering the complex situation of multiple planets, stars, galaxy parts, the Galactic centre, and other galaxies, using simple Newtonian physics. It has already been applied to the missing-mass problem in galaxies observed through circular velocity curves. Historically, King Oscar II of Sweden announced a prize for a solution of the N-body problem in 1887, on the advice of Gösta Mittag-Leffler: 'Given a system of arbitrarily many mass points that attract each other according to Newton's law, under the assumption that no two points ever collide, try to find a representation of the coordinates of each point as a series in a variable that is some known function of time and for all of whose values the series converges uniformly.' The deadline was 1 June 1888. After that deadline, on 21 January 1889, the great mathematician Poincaré claimed the prize; he later sent a telegram to the journal Acta Mathematica to stop printing the special issue after finding an error in his solution, for to such a man of science reputation mattered more than money [see 'Celestial Mechanics: The Waltz of the Planets' by Alessandra Celletti and Ettore Perozzi, p. 27]. He realized he had been wrong in his general stability result. The problem has remained unsolved, and later proposed solutions resulted in singularities and collisions of masses.
    The Dynamic Universe Model addresses this classical N-body problem using only Newtonian gravitation and classical physics. The solution converges at all points; there are no multiple values, diverging solutions, or divide-by-zero singularities. Collisions of masses depend only on the physical values of the masses and their spatial distribution, not on internal problems of the model: if the mass distribution is homogeneous and isotropic, the masses will collide; if it is heterogeneous and anisotropic, they do not. This approach addresses problems that cannot otherwise be solved by general relativity, the steady-state universe model, etc.

  18. Kalman Filtering for Genetic Regulatory Networks with Missing Values

    PubMed Central

    Liu, Qiuhua; Lai, Tianyue; Wang, Wu

    2017-01-01

    The filtering problem with missing values for genetic regulatory networks (GRNs) is addressed, in which noise exists in both the state dynamics and the measurement equations; furthermore, the correlation between process noise and measurement noise is also taken into consideration. To deal with the filtering problem, a class of discrete-time GRNs with missing values, noise correlation, and time delays is established. A new observation model is then proposed to decrease the adverse effect caused by the missing values and to decouple the correlation between process noise and measurement noise in theory. Finally, a Kalman filter is used to estimate the states of the GRNs. A typical example is provided to verify the effectiveness of the proposed method, and it turns out that the concentrations of mRNA and protein can be estimated accurately. PMID:28814967
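
    The basic mechanism for filtering with gaps can be sketched with the textbook linear-Gaussian Kalman filter, where a missing measurement simply skips the update step. This omits the paper's noise-correlation handling and GRN-specific observation model; the matrices below are an invented constant-velocity tracking example:

```python
import numpy as np

def kalman_missing(F, H, Q, R, x0, P0, zs):
    """Kalman filter that skips the update when a measurement is missing.

    zs is a list of measurement vectors, with None marking a missing
    value; at those steps only the time-update (prediction) is applied.
    """
    x, P = x0, P0
    out = []
    for z in zs:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update only if the measurement is present
        if z is not None:
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Position/velocity model with every third measurement missing.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])
rng = np.random.default_rng(0)
true_pos = 0.5 * np.arange(20)
zs = [None if k % 3 == 2 else np.array([p + 0.5 * rng.normal()])
      for k, p in enumerate(true_pos)]
xs = kalman_missing(F, H, Q, R, np.zeros(2), np.eye(2), zs)
```

    During a gap the state estimate coasts on the dynamics model and its covariance grows, which is exactly the behaviour a missing-value observation model has to reproduce.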

  19. Concerning sources of O/1D/ in Aurora - Electron impact and dissociative recombination

    NASA Technical Reports Server (NTRS)

    Sharp, W. E.; Ortland, D.; Cageao, R.

    1983-01-01

    The present investigation is concerned with two questions. One is related to the possibility that the O(1D) level is produced by an as yet unidentified process in aurora. The second question is concerned with the need for an additional source and the altitude over which it is required. The data base of the AE satellite (AE-D in particular) was examined for this study. It is found that dissociative recombination and electron impact are inadequate sources of O(1D) in aurora. Nearly 90% of the source function is unidentified below 200 km and about 55% is missing above 250 km. The possibility that thermal electron impact could provide the missing source above 250 km was examined. Calculations showed that the missing source above 250 km could be explained by thermal electron impact if the electron temperatures were approximately 2900 K.

  20. Low-cost approaches to problem-driven hydrologic research: The case of Arkavathy watershed, India.

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Ballukraya, P. N.; Jeremiah, K.; R, A.

    2014-12-01

    Groundwater depletion is a major problem in the Arkavathy Basin and is the probable cause of declining flows in the Arkavathy River. However, investigating groundwater trends and groundwater-surface water linkages is extremely challenging in a data-scarce environment where basins are largely ungauged, so there is very little historical data; often the data are missing, flawed, or biased. Moreover, hard-rock aquifer data are very difficult to interpret. In the absence of reliable data, establishing a trend, let alone the causal linkages, is a severe challenge. We used a combination of low-cost, participatory, satellite-based, and conventional data collection methods to maximize the spatial and temporal coverage of data. For instance, long-term groundwater trends are biased because only a few dug wells with non-representative geological conditions still have water; the vast majority of the monitoring wells drilled in the 1970s and 1980s have dried up. Instead, we relied on "barefoot hydrology" techniques. By conducting a comprehensive well census, engaging farmers in participatory groundwater monitoring, and using locally available commercial borewell scanning techniques, we have been able to better establish groundwater trends and spatial patterns.

  1. Missing Children's Assistance Act. Hearings before the Subcommittee on Juvenile Justice of the Committee on the Judiciary. United States Senate. Ninety-Eighth Congress, Second Session on S. 2014, a Bill to Amend the Juvenile Justice and Delinquency Prevention Act of 1974 to Provide for Assistance in Locating Missing Children (February 7 and 21; March 8, 13, and 21, 1984).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.

    This document presents testimony and proceedings from Congressional hearings on the problem of missing children and the remedies proposed by the Missing Children's Assistance Act. Opening testimony by Senators Arlen Specter and Paula Hawkins is presented, as is the text of the Missing Children's Assistance Act of 1983. Prepared testimony from…

  2. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    PubMed

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. 
The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).

  3. Menstrual Cycle Problems

    MedlinePlus

    ... MoreDepression in Children and TeensRead MoreBMI Calculator Menstrual Cycle ProblemsFrom missed periods to painful periods, menstrual cycle problems are common, but usually not serious. Follow ...

  4. Galaxy Evolution at High Redshift: Obscured Star Formation, GRB Rates, Cosmic Reionization, and Missing Satellites

    NASA Astrophysics Data System (ADS)

    Lapi, A.; Mancuso, C.; Celotti, A.; Danese, L.

    2017-01-01

    We provide a holistic view of galaxy evolution at high redshifts z ≳ 4 that incorporates the constraints from various astrophysical/cosmological probes, including the estimate of the cosmic star formation rate (SFR) density from UV/IR surveys and long gamma-ray burst (GRB) rates, the cosmic reionization history following the latest Planck measurements, and the missing satellites issue. We achieve this goal in a model-independent way by exploiting the SFR functions derived by Mancuso et al. on the basis of an educated extrapolation of the latest UV/far-IR data from HST/Herschel, already tested against a number of independent observables. Our SFR functions integrated down to a UV magnitude limit M_UV ≲ -13 (or SFR limit around 10^-2 M⊙ yr^-1) produce a cosmic SFR density in excellent agreement with recent determinations from IR surveys and, taking into account a metallicity ceiling Z ≲ Z⊙/2, with the estimates from long GRB rates. They also yield a cosmic reionization history consistent with that implied by the recent Planck measurement of the electron scattering optical depth τ_es ≈ 0.058; remarkably, this result is obtained under a conceivable assumption regarding the average value f_esc ≈ 0.1 of the escape fraction for ionizing photons. We demonstrate via the abundance-matching technique that the above constraints concurrently imply galaxy formation becoming inefficient within dark matter halos of mass below a few 10^8 M⊙; pleasingly, such a limit is also required so as not to run into the missing satellites issue. Finally, we predict a downturn of the galaxy luminosity function faintward of M_UV ≲ -12, and stress that its detailed shape, plausibly to be probed in the near future by the JWST, will be extremely informative on the astrophysics of galaxy formation in small halos, or even on the microscopic nature of the dark matter.

  5. Missing value imputation: with application to handwriting data

    NASA Astrophysics Data System (ADS)

    Xu, Zhen; Srihari, Sargur N.

    2015-01-01

    Missing values make pattern analysis difficult, particularly with limited available data. In longitudinal research, missing values accumulate, thereby aggravating the problem. Here we consider how to deal with temporal data with missing values in handwriting analysis. In the task of studying development of individuality of handwriting, we encountered the fact that feature values are missing for several individuals at several time instances. Six algorithms, i.e., random imputation, mean imputation, most likely independent value imputation, and three methods based on Bayesian network (static Bayesian network, parameter EM, and structural EM), are compared with children's handwriting data. We evaluate the accuracy and robustness of the algorithms under different ratios of missing data and missing values, and useful conclusions are given. Specifically, static Bayesian network is used for our data which contain around 5% missing data to provide adequate accuracy and low computational cost.
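
    One of the compared model-based approaches (parameter EM under a multivariate normal model) can be sketched as iterated conditional-mean imputation. This is a simplified flavour of EM, not the paper's Bayesian-network implementation, and it omits the conditional-covariance correction of a full E-step:

```python
import numpy as np

def em_impute(X, n_iter=25):
    """Iterated conditional-mean imputation under a multivariate normal
    model (simplified EM): fill NaNs with their conditional means given
    the observed entries, then re-estimate the mean and covariance."""
    X = X.copy()
    miss = np.isnan(X)
    col_mean = np.nanmean(X, axis=0)
    X[miss] = np.take(col_mean, np.where(miss)[1])   # mean-impute to start
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        S = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        for i in range(X.shape[0]):
            m, o = miss[i], ~miss[i]
            if m.any() and o.any():
                X[i, m] = mu[m] + S[np.ix_(m, o)] @ np.linalg.solve(
                    S[np.ix_(o, o)], X[i, o] - mu[o])
    return X

# Correlated toy features with ~10% of values missing at random.
rng = np.random.default_rng(0)
full = rng.multivariate_normal([0, 0], [[1.0, 0.9], [0.9, 1.0]], size=500)
obs = full.copy()
holes = rng.random(obs.shape) < 0.1
obs[holes] = np.nan
imputed = em_impute(obs)
```

    Because the features are strongly correlated, the conditional-mean fill recovers missing entries far better than column-mean imputation, which is the gap between the "most likely value" family and the model-based family the abstract compares.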

  6. What to Do when Data Are Missing in Group Randomized Controlled Trials. NCEE 2009-0049

    ERIC Educational Resources Information Center

    Puma, Michael J.; Olsen, Robert B.; Bell, Stephen H.; Price, Cristofer

    2009-01-01

    This NCEE Technical Methods report examines how to address the problem of missing data in the analysis of data in Randomized Controlled Trials (RCTs) of educational interventions, with a particular focus on the common educational situation in which groups of students such as entire classrooms or schools are randomized. Missing outcome data are a…

  7. Working with Missing Data in Higher Education Research: A Primer and Real-World Example

    ERIC Educational Resources Information Center

    Cox, Bradley E.; McIntosh, Kadian; Reason, Robert D.; Terenzini, Patrick T.

    2014-01-01

    Nearly all quantitative analyses in higher education draw from incomplete datasets-a common problem with no universal solution. In the first part of this paper, we explain why missing data matter and outline the advantages and disadvantages of six common methods for handling missing data. Next, we analyze real-world data from 5,905 students across…

  8. Postmodeling Sensitivity Analysis to Detect the Effect of Missing Data Mechanisms

    ERIC Educational Resources Information Center

    Jamshidian, Mortaza; Mata, Matthew

    2008-01-01

    Incomplete or missing data is a common problem in almost all areas of empirical research. It is well known that simple and ad hoc methods such as complete case analysis or mean imputation can lead to biased and/or inefficient estimates. The method of maximum likelihood works well; however, when the missing data mechanism is not one of missing…

  9. Modeling missing data in knowledge space theory.

    PubMed

    de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio

    2015-12-01

    Missing data are a well-known issue in statistical inference, because some responses may be missing even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, missingness is analyzed in knowledge space theory, in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: the former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. IMBLIM and MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: if the missing data-generating process is missing completely at random, then either IMBLIM or MissBLIM provides an adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data. (c) 2015 APA, all rights reserved.
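
    The distinction the two extensions target (missingness ignorable vs. dependent on what is unobserved) can be made concrete with a short simulation; the distribution and cutoffs are illustrative, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, 100_000)   # latent "true responses"

# MCAR: each value is missing with a fixed probability, independent of x.
mcar = rng.random(x.size) < 0.3
# MNAR: larger values are more likely to be missing (depends on x itself).
mnar = rng.random(x.size) < 1.0 / (1.0 + np.exp(-(x - 1.0)))

mean_mcar = x[~mcar].mean()   # close to the true mean of 1.0
mean_mnar = x[~mnar].mean()   # biased downward: large values vanished
```

    Under MCAR the observed data remain representative, so an "ignorable" model like IMBLIM suffices; under MNAR only a correctly specified missingness model can undo the selection bias visible in `mean_mnar`.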

  10. Fitting the multitemporal curve: a fourier series approach to the missing data problem in remote sensing analysis

    Treesearch

    Evan Brooks; Valerie Thomas; Wynne Randolph; John Coulston

    2012-01-01

    With the advent of free Landsat data stretching back decades, there has been a surge of interest in utilizing remotely sensed data in multitemporal analysis for estimation of biophysical parameters. Such analysis is confounded by cloud cover and other image-specific problems, which result in missing data at various aperiodic times of the year. While there is a wealth...
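
    The core idea, fitting a harmonic curve through whatever observations survive cloud screening, can be sketched by ordinary least squares on sine/cosine features. The period, harmonic count, and synthetic "NDVI-like" signal below are illustrative assumptions, not the authors' exact model:

```python
import numpy as np

def fit_fourier(t, y, period=365.0, n_harmonics=2):
    """Least-squares Fourier-series fit to irregularly sampled data.

    Builds a design matrix of sine/cosine harmonics of the annual cycle
    and solves it by ordinary least squares, so missing (e.g., cloudy)
    dates simply contribute no rows."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / period
        cols += [np.cos(w), np.sin(w)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef, coef            # fitted values at t, coefficients

# Synthetic seasonal signal with ~30% of acquisitions lost to cloud.
rng = np.random.default_rng(0)
t_all = np.arange(0.0, 365.0, 8.0)   # nominal 8-day compositing
keep = rng.random(t_all.size) > 0.3
t = t_all[keep]
y = 0.5 + 0.3 * np.sin(2.0 * np.pi * t / 365.0) + 0.02 * rng.normal(size=t.size)
y_fit, coef = fit_fourier(t, y)
```

    Once the coefficients are estimated, the same design matrix evaluated at the cloudy dates yields gap-filled values, which is the multitemporal-curve fitting the title describes.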

  11. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary Earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, and they differ significantly from traditional all-chemical satellites in orbit-raising, station-keeping, radiation damage protection, power budget, etc. The design optimization task for an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations, and solving it poses significant challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate-assisted MDO framework consisting of several modules: MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate-assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. The framework can also provide a valuable reference for other all-electric spacecraft system designs.
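
    The optimization strategy can be illustrated with a toy adaptive response-surface loop: fit a cheap quadratic surrogate to all expensive evaluations so far, minimize the surrogate, evaluate the true model at that point, and refine. The two-variable objective below is an invented stand-in for the satellite MDA, not the actual disciplinary models:

```python
import numpy as np

def expensive_mda(x):
    """Stand-in for the costly multidisciplinary analysis (hypothetical)."""
    return (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2 + 0.1 * x[0] * x[1]

def quad_features(X):
    """Quadratic response-surface basis in two design variables."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(10, 2))          # initial space-filling samples
y = np.array([expensive_mda(x) for x in X])

for _ in range(10):
    # Fit a quadratic response surface to all evaluations so far.
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    # Minimize the cheap surrogate over a dense candidate set.
    cand = rng.uniform(-3, 3, size=(2000, 2))
    x_new = cand[np.argmin(quad_features(cand) @ beta)]
    # Evaluate the true model there and refine the surrogate.
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_mda(x_new))

best = X[np.argmin(y)]
```

    Each iteration spends exactly one expensive evaluation, which is why a gradually refined surrogate keeps the overall computational cost of the MDO loop moderate.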

  12. Alternative mathematical programming formulations for FSS synthesis

    NASA Technical Reports Server (NTRS)

    Reilly, C. H.; Mount-Campbell, C. A.; Gonsalvez, D. J. A.; Levis, C. A.

    1986-01-01

    A variety of mathematical programming models and two solution strategies are suggested for the problem of allocating orbital positions to (synthesizing) satellites in the Fixed Satellite Service. Mixed integer programming and almost linear programming formulations are presented in detail for each of two objectives: (1) positioning satellites as closely as possible to specified desired locations, and (2) minimizing the total length of the geostationary arc allocated to the satellites whose positions are to be determined. Computational results for mixed integer and almost linear programming models, with the objective of positioning satellites as closely as possible to their desired locations, are reported for three six-administration test problems and a thirteen-administration test problem.
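
    The first objective has a clean structure worth illustrating: if the ordering of satellites along the arc is fixed, minimizing (here, squared) deviation from the desired longitudes subject to a minimum separation reduces to isotonic regression, solvable by pool-adjacent-violators. This is a simplified relative of the paper's mixed integer and almost linear programming formulations, with invented numbers:

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: best non-decreasing fit to y in least squares."""
    blocks = []  # each block is [mean, count]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, c2 = blocks.pop()
            v1, c1 = blocks[-1]
            blocks[-1] = [(v1 * c1 + v2 * c2) / (c1 + c2), c1 + c2]
    return np.concatenate([[v] * c for v, c in blocks])

def place_satellites(desired, sep):
    """Longitudes minimizing squared deviation from 'desired' (sorted),
    subject to consecutive spacing >= sep degrees. Substituting
    u_i = x_i - i*sep turns the spacing constraint into monotonicity."""
    d = np.sort(np.asarray(desired, dtype=float))
    shift = sep * np.arange(len(d))
    return pava(d - shift) + shift

# Three closely bunched desired positions and one isolated one.
x = place_satellites([10.0, 11.0, 11.5, 20.0], sep=2.0)
```

    The bunched satellites get spread out symmetrically around their desired cluster while the isolated one stays put; the paper's MIP formulations additionally let the solver choose the ordering itself.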

  13. Is Spending More Time Associated With Less Missed Care?: A Comparison of Time Use and Missed Care Across 15 Nursing Units at 2 Hospitals.

    PubMed

    McNair, Norma; Baird, Jennifer; Grogan, Tristan R; Walsh, Catherine M; Liang, Li-Jung; Worobel-Luk, Pamela; Needleman, Jack; Nuckols, Teryl K

    2016-09-01

    The aim of this study is to examine the relationship between nursing time use and perceptions of missed care. Recent literature has highlighted the problem of missed nursing care, but little is known about how nurses' time use patterns are associated with reports of missed care. In 15 nursing units at 2 hospitals, we assessed registered nurse (RN) perceptions of missed care, observed time use by RNs, and examined the relationship between time spent and degree of missed care at the nursing unit level. Patterns of time use were similar across hospitals, with 25% of time spent on documentation. For 6 different categories of nursing tasks, no association was detected between time use, including time spent on documentation, and the degree of missed care at the nursing unit level. Nursing time use cannot fully explain variation in missed care across nursing units. Further work is needed to account for patterns of missed care.

  14. Mathematical programming formulations for satellite synthesis

    NASA Technical Reports Server (NTRS)

    Bhasin, Puneet; Reilly, Charles H.

    1987-01-01

    The problem of satellite synthesis can be described as optimally allotting locations and sometimes frequencies and polarizations, to communication satellites so that interference from unwanted satellite signals does not exceed a specified threshold. In this report, mathematical programming models and optimization methods are used to solve satellite synthesis problems. A nonlinear programming formulation which is solved using Zoutendijk's method and a gradient search method is described. Nine mixed integer programming models are considered. Results of computer runs with these nine models and five geographically compatible scenarios are presented and evaluated. A heuristic solution procedure is also used to solve two of the models studied. Heuristic solutions to three large synthesis problems are presented. The results of our analysis show that the heuristic performs very well, both in terms of solution quality and solution time, on the two models to which it was applied. It is concluded that the heuristic procedure is the best of the methods considered for solving satellite synthesis problems.

  15. Planning and Scheduling for Fleets of Earth Observing Satellites

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Jonsson, Ari; Morris, Robert; Smith, David E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    We address the problem of scheduling observations for a collection of earth observing satellites. This scheduling task is a difficult optimization problem, potentially involving many satellites, hundreds of requests, constraints on when and how to service each request, and resources such as instruments, recording devices, transmitters, and ground stations. High-fidelity models are required to ensure the validity of schedules; at the same time, the size and complexity of the problem makes it unlikely that systematic optimization search methods will be able to solve them in a reasonable time. This paper presents a constraint-based approach to solving the Earth Observing Satellites (EOS) scheduling problem, and proposes a stochastic heuristic search method for solving it.
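
    The flavour of a stochastic heuristic for this task can be conveyed with a toy version: greedy earliest-fit placement of observation requests under time-window constraints, restarted from random request orders. The requests, windows, and priorities are invented, and a real EOS model adds instruments, recorders, transmitters, and ground-station constraints:

```python
import random

# Toy model: each request has a visibility window, a duration, a priority.
REQUESTS = [  # (id, window_start, window_end, duration, priority)
    ("A", 0, 10, 4, 3), ("B", 2, 8, 3, 5), ("C", 5, 15, 6, 2),
    ("D", 9, 20, 4, 4), ("E", 12, 18, 5, 1),
]

def schedule(order):
    """Greedily place requests (in the given order) at the earliest
    feasible start time; return (total priority served, placements)."""
    busy = []                     # committed (start, end) intervals
    total, placed = 0, {}
    for rid, ws, we, dur, pri in order:
        t = ws
        for s, e in sorted(busy):
            if t + dur <= s:      # fits before this interval
                break
            t = max(t, e)         # otherwise wait until it ends
        if t + dur <= we:
            busy.append((t, t + dur))
            placed[rid] = t
            total += pri
    return total, placed

def stochastic_search(n_restarts=200, seed=0):
    """Random-restart search over greedy placement orders."""
    rng = random.Random(seed)
    best = (-1, None)
    for _ in range(n_restarts):
        order = REQUESTS[:]
        rng.shuffle(order)
        best = max(best, schedule(order), key=lambda r: r[0])
    return best

total, placed = stochastic_search()
```

    Because the greedy placement is cheap, many random orders can be tried within a tight time budget, which is the practical appeal of stochastic heuristics over systematic optimization for large EOS instances.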

  16. A Novel Strategy Using Factor Graphs and the Sum-Product Algorithm for Satellite Broadcast Scheduling Problems

    NASA Astrophysics Data System (ADS)

    Chen, Jung-Chieh

    This paper presents a low-complexity algorithmic framework for finding a broadcasting schedule in a low-altitude satellite system, i.e., the satellite broadcast scheduling (SBS) problem, based on the recent modeling and computational methodology of factor graphs. Inspired by the huge success of low-density parity-check (LDPC) codes in the field of error control coding, we transform the SBS problem into an LDPC-like problem through a factor graph, instead of using conventional neural network approaches. Within this factor graph framework, soft information, describing the probability that each satellite will broadcast to a terminal in a specific time slot, is exchanged among the local processing units via the sum-product algorithm to iteratively optimize the satellite broadcasting schedule. Numerical results show that the proposed approach not only obtains optimal solutions but also enjoys a low complexity suitable for integrated-circuit implementation.

  17. THE MASSIVE SATELLITE POPULATION OF MILKY-WAY-SIZED GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez-Puebla, Aldo; Avila-Reese, Vladimir; Drory, Niv, E-mail: apuebla@astro.unam.mx

    2013-08-20

    Several occupational distributions for satellite galaxies more massive than m* ≈ 4 × 10⁷ M⊙ around Milky-Way (MW)-sized hosts are presented and used to predict the internal dynamics of these satellites as a function of m*. For the analysis, a large galaxy-group mock catalog is constructed on the basis of (sub)halo-to-stellar mass relations fully constrained with currently available observations, namely the galaxy stellar mass function decomposed into centrals and satellites, and the two-point correlation functions at different masses. We find that 6.6% of MW-sized galaxies host two satellites in the mass range of the Small and Large Magellanic Clouds (SMC and LMC, respectively). The probabilities of a MW-sized galaxy having one satellite equal to or larger than the LMC, two satellites equal to or larger than the SMC, or three satellites equal to or larger than Sagittarius (Sgr) are ≈ 0.26, 0.14, and 0.14, respectively. The cumulative satellite mass function of the MW, N_s(≥ m*), down to the mass of the Fornax dwarf is within the 1σ distribution of all the MW-sized galaxies. We find that MW-sized hosts with three satellites more massive than Sgr (as the MW) are among the most common cases. However, the most and second most massive satellites in these systems are smaller than the LMC and SMC by roughly 0.7 and 0.8 dex, respectively. We conclude that the distribution N_s(≥ m*) for MW-sized galaxies is quite broad, the particular case of the MW being of low frequency but not an outlier. The halo mass of MW-sized galaxies correlates only weakly with N_s(≥ m*); it is therefore not possible to accurately determine the MW halo mass by means of its N_s(≥ m*). From our catalog, we constrain a lower limit of 1.38 × 10¹² M⊙ at the 1σ level.
Our analysis strongly suggests that the abundance of massive subhalos should agree with the abundance of massive satellites in all MW-sized hosts, i.e., there is no missing (massive) satellite problem for the ΛCDM cosmology. However, we confirm that the maximum circular velocity, v_max, of the subhalos of satellites smaller than m* ≈ 10⁸ M⊙ is systematically larger than the v_max inferred from current observational studies of the MW bright dwarf satellites; unlike previous works, this conclusion is based on an analysis of the overall population of MW-sized galaxies. Some pieces of evidence suggest that the issue may concern only satellite dwarfs and not central dwarfs, in which case environmental processes acting on dwarfs inside host halos, combined with supernova-driven core expansion, would underlie the lowered v_max.

  18. Modeling Carbon Exchange

    NASA Technical Reports Server (NTRS)

    Sellers, Piers

    2012-01-01

    Model results will be reviewed to assess different methods for bounding the terrestrial role in the global carbon cycle. It is proposed that a series of climate model runs could be scoped that would tighten the limits on the "missing sink" of terrestrial carbon and could also direct future satellite image analyses to search for its geographical location and understand its seasonal dynamics.

  19. Considerations of multiple imputation approaches for handling missing data in clinical trials.

    PubMed

    Quan, Hui; Qi, Li; Luo, Xiaodong; Darchy, Loic

    2018-07-01

    Missing data exist in all clinical trials, and they pose a serious problem for the interpretability of trial results. There is no universally applicable solution to all missing data problems: the methods used depend on the circumstances, particularly the assumptions made about the missing data mechanism. In recent years, when the missing-at-random mechanism cannot be assumed, conservative approaches such as control-based and return-to-baseline multiple imputation have been applied. In this paper, we focus on the variability in data analysis under these approaches. As demonstrated by examples, the choice of the variability can change the conclusion of the analysis. Besides methods for continuous endpoints, we also discuss methods for binary and time-to-event endpoints as well as considerations for non-inferiority assessment. Copyright © 2018. Published by Elsevier Inc.
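    A toy numerical sketch of the "return to baseline" idea follows; the sample sizes, effect, and noise levels are all invented, and real control-based imputation is model-based and pools variance as well as point estimates via Rubin's rules:

    ```python
    import numpy as np

    # Toy sketch of "return to baseline" multiple imputation.
    rng = np.random.default_rng(7)
    n, m = 200, 20                          # patients, number of imputed datasets
    baseline = rng.normal(50.0, 10.0, n)
    effect = 5.0                            # assumed true treatment effect
    final = baseline + effect + rng.normal(0.0, 5.0, n)
    dropout = rng.random(n) < 0.3           # ~30% missing final values
    final[dropout] = np.nan

    complete_case = float(np.nanmean(final - baseline))  # ignores dropouts

    estimates = []
    for _ in range(m):
        imputed = final.copy()
        # Conservative assumption: after dropout the response reverts to baseline.
        imputed[dropout] = baseline[dropout] + rng.normal(0.0, 5.0, dropout.sum())
        estimates.append(np.mean(imputed - baseline))

    # Pooling the point estimates (the simple half of Rubin's rules) shows the
    # effect pulled toward zero relative to the complete-case estimate.
    pooled = float(np.mean(estimates))
    ```

    The pooled estimate sits near effect × (1 − dropout rate), illustrating why such imputation schemes are considered conservative.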

  20. The Effect of a Brief Acceptance and Commitment Therapy Intervention on the Near-Miss Effect in Problem Gamblers

    ERIC Educational Resources Information Center

    Nastally, Becky L.; Dixon, Mark R.

    2012-01-01

    In the current study, 3 participants with a history of problem gambling were exposed to computerized slot machine play consisting of outcomes that depicted wins, losses, and near misses (2 out of 3 identical slot machine symbols). Participants were asked to rate each type of outcome in terms of its closeness to a win on a scale of 1 to 10 before…

  1. Biweekly Maps of Wind Stress for the North Pacific from the ERS-1 Scatterometer

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The European Remote-sensing Satellite (ERS-1) was launched in July 1991 and carried several instruments for observing the Earth's ocean, including a wind scatterometer. The scatterometer measurements were processed by the European Space Agency (ESA) and the Jet Propulsion Laboratory (JPL). JPL reprocessed (Freilich and Dunbar, 1992) the ERS-1 backscatter measurements to produce a 'value added' data set that contained the ESA wind vector as well as a set of up to four ambiguities. These ambiguities were further processed using a maximum-likelihood estimation (MLE) and a median filter to produce a 'selected vector.' This report describes a technique developed to produce time-averaged wind field estimates with their expected errors using only scatterometer wind vectors. The processing described in this report involved extracting regions of interest from the data tapes, checking the quality, and creating the wind field estimate. The analysis also includes the derivation of biweekly average wind vectors over the North Pacific Ocean at a resolution of 0.5° x 0.5°. This was done with an optimal averaging algorithm temporally and an over-determined biharmonic spline spatially. There have been other attempts at creating gridded wind fields from ERS-1 winds, e.g., kriging techniques (Bentamy et al., 1996) and successive-correction schemes (Tang and Liu, 1996). There are several inherent problems with the ERS-1 scatterometer. Since this is a multidisciplinary mission, the satellite is flown in different orbits optimized for each phase of the mission. The scatterometer also shares several sub-systems with the Synthetic Aperture Radar (SAR) and cannot be operated while the SAR is in operation. The scatterometer is also a single-sided instrument and only measures backscatter along the right side of the satellite. The processing described here generates biweekly wind maps for the two-year analysis period regardless of the satellite orbit or missing data.

  2. Tisserand's polynomials and inclination functions in the theory of artificial earth satellites

    NASA Astrophysics Data System (ADS)

    Aksenov, E. P.

    1986-03-01

    The connection between Tisserand's polynomials and inclination functions in the theory of motion of artificial earth satellites is established in the paper. The most important properties of these special functions of celestial mechanics are presented. The problem of expanding the perturbation function in satellite problems is discussed.

  3. Relative tracking control of constellation satellites considering inter-satellite link

    NASA Astrophysics Data System (ADS)

    Fakoor, M.; Amozegary, F.; Bakhtiari, M.; Daneshjou, K.

    2017-11-01

    In this article, two main issues related to the large-scale relative motion of satellites in a constellation, namely the dynamics and control problems, are investigated with the aim of establishing an Inter-Satellite Link (ISL). On the dynamics side, a detailed and effective analytical solution is provided for the problem of satellite relative motion in the presence of perturbations, obtained via the direct geometric method in spherical coordinates. Simulations show that the solution obtained from the geometric method calculates the relative motion of the satellites with high accuracy, so the proposed analytical solution is applicable and effective. On the control side, a relative tracking control system between two satellites is designed, using the analytical solution for the relative motion of the satellites with respect to the reference trajectory, in order to establish a communication link between them. A sliding mode control approach is employed to develop the relative tracking control system for body-to-body and payload-to-payload tracking, and its efficiency is compared with PID and LQR controllers. Two types of payload-to-payload tracking control, with and without a payload degree of freedom, are designed, and the one suitable for practical ISL applications is identified. A fuzzy controller is also utilized to eliminate chattering in the sliding mode control input.
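    The sliding mode principle can be illustrated on a double integrator (a drastic simplification of the paper's relative-motion dynamics; the plant and gains below are invented for illustration):

    ```python
    # Minimal sliding mode tracking sketch on a double integrator (x'' = u).
    def simulate(k=5.0, lam=2.0, dt=1e-3, T=10.0):
        x, v = 1.0, 0.0                     # initial tracking error and its rate
        for _ in range(int(T / dt)):
            s = v + lam * x                 # sliding surface s = x' + lam * x
            u = -k if s > 0 else k          # switching control drives s toward 0
            v += u * dt                     # Euler integration of x'' = u
            x += v * dt
        return x

    final_error = simulate()                # chatters near zero after sliding
    ```

    Once the state reaches the surface s = 0, the error decays as exp(−λt); the residual high-frequency chattering of the switching term is exactly what the fuzzy controller mentioned in the abstract is meant to smooth.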

  4. The Search for RR Lyrae Variables in the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Nielsen, Chandler; Marshall, Jennifer L.; Long, James

    2017-01-01

    RR Lyrae variables are stars with a characteristic relationship between magnitude and phase and whose distances can be easily determined, making them extremely valuable for mapping and analyzing galactic substructure. We present our method of searching for RR Lyrae variable stars using data extracted from the Dark Energy Survey (DES). The DES probes stars as faint as i = 24.3. Finding such distant RR Lyrae allows for the discovery of objects such as dwarf spheroidal tidal streams and dwarf galaxies; in fact, at least one RR Lyrae has been discovered in each of the probed dwarf spheroidal galaxies orbiting the Milky Way (Baker & Willman 2015). In turn, these discoveries may ultimately help resolve the well-known missing satellite problem, in which theoretical simulations predict many more dwarf satellites than are observed in the local Universe. Using the Lomb-Scargle periodogram to determine each candidate's period, we plotted magnitude against phase and visually determined whether the star was an RR Lyrae. We began the search in frequently observed regions of the DES footprint, known as the supernova fields, and then moved to dwarf galaxies discovered during the second year of the DES. We did not discover RR Lyrae in the probed dwarf galaxies; the method should be tried again once more observations are taken in the DES.
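    The period-finding step can be sketched with SciPy's Lomb-Scargle implementation; the light curve below is a synthetic, unevenly sampled sinusoid standing in for real DES photometry:

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Synthetic light curve: invented values standing in for an RR Lyrae series.
    rng = np.random.default_rng(42)
    t = np.sort(rng.uniform(0.0, 20.0, 120))       # observation times (days)
    true_period = 0.55                             # days, a typical RR Lyrae scale
    mag = 0.4 * np.sin(2 * np.pi * t / true_period) \
          + 0.02 * rng.standard_normal(t.size)

    # Scan candidate periods and take the periodogram peak.
    periods = np.linspace(0.3, 1.0, 5000)
    ang_freqs = 2 * np.pi / periods                # lombscargle expects angular freqs
    power = lombscargle(t, mag - mag.mean(), ang_freqs)
    best_period = float(periods[np.argmax(power)])

    # Phase-folding at the recovered period yields the magnitude-phase curve
    # that would be inspected visually for the RR Lyrae shape.
    phase = (t / best_period) % 1.0
    ```

    The Lomb-Scargle periodogram is the standard choice here precisely because survey photometry is unevenly sampled, which rules out an ordinary FFT.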

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capela, Fabio; Ramazanov, Sabir, E-mail: fc403@cam.ac.uk, E-mail: Sabir.Ramazanov@ulb.ac.be

    At large scales and for sufficiently early times, dark matter is described as a pressureless perfect fluid (dust) that does not interact with Standard Model fields. These features are captured by a simple model with two scalars: a Lagrange multiplier and another playing the role of the velocity potential. That model arises naturally in some gravitational frameworks, e.g., the mimetic dark matter scenario. We consider an extension of the model by means of higher-derivative terms, such that the dust solutions are preserved at the background level but there is a non-zero sound speed at the linear level. We associate this Modified Dust with dark matter and study the linear evolution of cosmological perturbations in that picture. The most prominent effect is the suppression of their power spectrum at sufficiently large cosmological momenta. This can be relevant in view of the problems that cold dark matter faces at sub-galactic scales, e.g., the missing satellites problem. At even shorter scales, however, perturbations of Modified Dust are enhanced compared to the predictions of more common particle dark matter scenarios; this is a peculiarity of their evolution in a radiation-dominated background. We also briefly discuss the clustering of Modified Dust. We write the system of equations in the Newtonian limit and sketch a possible mechanism that could prevent the appearance of caustic singularities. The same mechanism may be relevant in light of the core-cusp problem.

  6. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    PubMed

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations; most existing algorithms cannot deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic properties of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective when a particular time point contains many missing values or is missing entirely. Experimental results suggest that the proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
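    The core autoregressive step can be sketched in a few lines. This is a simplification invented for illustration, not the full ARLSimpute algorithm (which also exploits similarity across genes): fit AR coefficients by least squares on the observed part of a series, then predict a missing time point from its p predecessors.

    ```python
    import numpy as np

    def fit_ar(series, p):
        # Lagged design matrix: row t holds series[t], ..., series[t+p-1],
        # used to predict series[t+p] by least squares.
        X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
        y = series[p:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    def impute_next(history, coeffs):
        # One-step-ahead AR prediction for the missing value.
        p = len(coeffs)
        return float(np.dot(history[-p:], coeffs))

    # Toy series: an AR(2) process with known coefficients and small noise.
    rng = np.random.default_rng(0)
    true = np.array([0.6, 0.3])                    # weights on lags 2 and 1
    series = np.zeros(200)
    for i in range(2, 200):
        series[i] = true[0] * series[i - 2] + true[1] * series[i - 1] \
                    + 0.1 * rng.standard_normal()

    coeffs = fit_ar(series[:150], p=2)             # fit on the observed part
    estimate = impute_next(series[:150], coeffs)   # predict the "missing" point 150
    ```

    With enough observed points, the least-squares fit recovers the generating coefficients and the one-step prediction lands within the noise level of the held-out value.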

  7. Satellite passive remote sensing of off-shore pollutants, volume 2

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Satellite detection and monitoring of off-shore dumped pollutants, other than oil, are discussed. Summaries of satellite sensor performance in three spectral bands (visible, infrared, and microwave) are presented. The bulk of the report gives all the calculations, trade-offs and limitations of the three sensor systems. It is asserted that the problem of pollution monitoring is not a sensor problem but a problem of mathematical modeling and data processing.

  8. The Impact of the Nursing Practice Environment on Missed Nursing Care.

    PubMed

    Hessels, Amanda J; Flynn, Linda; Cimiotti, Jeannie P; Cadmus, Edna; Gershon, Robyn R M

    2015-12-01

    Missed nursing care is an emerging problem that negatively impacts patient outcomes, and there are gaps in our knowledge of the factors associated with it. The aim of this study was to determine the relationship between the nursing practice environment and missed nursing care in acute care hospitals. This is a secondary analysis of cross-sectional data from a survey of over 7,000 nurses from 70 hospitals on the workplace and process of care. Ordinary least squares and multiple regression models were constructed to examine the relationship between the nursing practice environment and missed nursing care while controlling for characteristics of nurses and hospitals. Nurses missed delivering a significant amount of necessary patient care (10-27%). Inadequate staffing and inadequate resources were the practice environment factors most strongly associated with missed nursing care events. This multi-site study examined the risk and risk factors associated with missed nursing care. Improvements targeting modifiable risk factors may reduce the risk of missed nursing care.

  9. Analysis of space radiation data of semiconductor memories

    NASA Technical Reports Server (NTRS)

    Stassinopoulos, E. G.; Brucker, G. J.; Stauffer, C. A.

    1996-01-01

    This article presents an analysis of radiation effects for several select device types and technologies aboard the Combined Release and Radiation Effects Satellite (CRRES) satellite. These space-flight measurements covered a period of about 14 months of mission lifetime. Single Event Upset (SEU) data of the investigated devices from the Microelectronics Package (MEP) were processed and analyzed. Valid upset measurements were determined by correcting for invalid readings, hard failures, missing data tapes (thus voids in data), and periods over which devices were disabled from interrogation. The basic resolution time of the measurement system was confirmed to be 2 s. Lessons learned, important findings, and recommendations are presented.

  10. Refined Use of Satellite Aerosol Optical Depth Snapshots to Constrain Biomass Burning Emissions in the GOCART Model

    NASA Astrophysics Data System (ADS)

    Petrenko, Mariya; Kahn, Ralph; Chin, Mian; Limbacher, James

    2017-10-01

    Simulations of biomass burning (BB) emissions in global chemistry and aerosol transport models depend on external inventories, which provide location and strength for BB aerosol sources. Our previous work shows that to first order, satellite snapshots of aerosol optical depth (AOD) near the emitted smoke plume can be used to constrain model-simulated AOD and, effectively, the smoke source strength. We now refine the satellite-snapshot method and investigate where applying simple multiplicative emission adjustment factors alone to the widely used Global Fire Emission Database version 3 emission inventory can achieve regional-scale consistency between Moderate Resolution Imaging Spectroradiometer (MODIS) AOD snapshots and the Goddard Chemistry Aerosol Radiation and Transport model. The model and satellite AOD are compared globally, over a set of BB cases observed by the MODIS instrument during the 2004 and 2006-2008 biomass burning seasons. Regional discrepancies between the model and satellite are diverse around the globe yet quite consistent within most ecosystems. We refine our approach to address physically based limitations of our earlier work (1) by expanding the number of fire cases from 124 to almost 900, (2) by using scaled reanalysis-model simulations to fill missing AOD retrievals in the MODIS observations, (3) by distinguishing the BB components of the total aerosol load from background aerosol in the near-source regions, and (4) by including emissions from fires too small to be identified explicitly in the satellite observations. The small-fire emission adjustment shows the complementary nature of correcting for source strength and adding geographically distinct missing sources. Our analysis indicates that the method works best for fire cases where the BB fraction of total AOD is high, primarily evergreen or deciduous forests.
In heavily polluted or agricultural burning regions, where smoke and background AOD values tend to be comparable, this approach encounters large uncertainties, and in some regions other model- or measurement-related factors might contribute significantly to model-satellite discrepancies. This work sets the stage for a larger study within the Aerosol Comparisons between Observations and Models (AeroCom) multimodel biomass burning experiment. By comparing multiple model results using the refined technique presented here, we aim to separate BB-inventory contributions from model-specific contributions to the remaining discrepancies.
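    To first order, the multiplicative adjustment described above amounts to a ratio of satellite to model AOD after removing background aerosol. All numbers below are invented for illustration:

    ```python
    import numpy as np

    # Invented near-plume AOD values for a handful of fire cases in one region.
    aod_modis = np.array([0.80, 0.65, 0.90])   # satellite snapshots near plumes
    aod_model = np.array([0.40, 0.50, 0.30])   # model AOD at matching times/places
    background = 0.10                          # non-BB aerosol, subtracted first

    # Per-case scaling of the BB component, summarized robustly by the median.
    scale = (aod_modis - background) / (aod_model - background)
    emission_factor = float(np.median(scale))  # regional emission adjustment
    ```

    Subtracting the background before taking the ratio matters most in the polluted regions the abstract flags, where background AOD rivals the smoke signal and the ratio becomes unstable.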

  11. Microwave vs optical crosslink study

    NASA Technical Reports Server (NTRS)

    Kwong, Paulman W.; Bruno, Ronald C.; Marshalek, Robert G.

    1992-01-01

    Intersatellite links (ISLs) at geostationary orbit are currently a missing link in commercial satellite services. Prior studies have found that applying ISLs to domestic, regional, and global satellites would provide more cost-effective services than non-ISL (i.e., multiple-hop) systems. In addition, ISLs can improve and expand existing satellite services in several respects. For example, ISLs can conserve the scarce spectrum allocated for fixed satellite services (FSS) by avoiding multiple hops through relay stations. ISLs can also conserve prime orbit slots by effectively expanding the geostationary arc. As a result of the coverage extension provided by ISLs, more users will have direct access to the satellite network, with reduced signal propagation delay and improved signal quality. Given the potential benefits of ISL systems, it is of interest to determine the appropriate implementations for some potential ISL architectures. The selected ISL network architectures, as supplied by NASA, are summarized. The projected high data rate requirements (greater than 400 Mbps) suggest that high-frequency RF or optical implementations are natural approaches. Both RF and optical systems have their own merits and weaknesses, which make the choice between them dependent on the specific application. Due to its relatively mature technology base, the implementation risk associated with RF (at least at 32 GHz) is lower than that of optical ISLs. However, the relatively large antenna size required by an RF ISL payload may cause real-estate problems on the host spacecraft. In addition, because of frequency sharing (for duplex multiple-channel communications) within the limited allocated bandwidth, RF ISLs are more susceptible to inter-system and inter-channel interference. On the other hand, optical ISLs can offer interference-free transmission and a compact payload.
However, the extremely narrow beamwidths (on the order of 10 microradians) associated with optical ISLs impose very stringent pointing, acquisition, and tracking requirements on the system. Even when the RF and optical systems are considered separately, questions remain regarding the selection of RF frequency, direct versus coherent optical detection, etc., in implementing an ISL for a particular network architecture. These and other issues are studied.

  12. Parallel satellite orbital situational problems solver for space missions design and control

    NASA Astrophysics Data System (ADS)

    Atanassov, Atanas Marinov

    2016-11-01

    Solving different scientific problems for space applications demands observations, measurements, or active experiments during time intervals in which specific geometric and physical conditions are fulfilled. Solving the situational problems that determine the time intervals when satellite instruments can work optimally is an important part of every stage of the preparation and realization of space missions. A universal, flexible, and robust approach to situation analysis, easily portable to new satellite missions, is significant for reducing mission preparation times and costs. Every situational problem can be based on one or more situational conditions. Simultaneously solving different kinds of situational problems, each based on a different number and type of situational conditions and each satisfied on different segments of the satellite orbit, requires irregular calculations. Three formal approaches are presented. The first concerns the description of situational problems, which allows flexibility in assembling them and representing them in computer memory. The second concerns a situational problem solver organized as a processor that executes specific code for every particular situational condition. The third concerns the parallelization of the solver using threads and dynamic scheduling based on a 'pool of threads' abstraction, which ensures a good load balance. The developed situational problems solver is intended for incorporation into multi-physics, multi-satellite space mission design and simulation tools.

  13. The Ebb and Flow of Tidal Science, and the Impact of Satellite Altimetry

    NASA Technical Reports Server (NTRS)

    Ray, Richard; Egbert, Gary

    2006-01-01

    In the years immediately preceding the launches of Geosat and Topex/Poseidon, tidal science had lapsed into a period of uncertainty and discouragement, brought about by the failure of once-exciting new ideas that eventually proved overly optimistic. A long list of outstanding problems presented themselves, but progress had reached a "low water mark". What was lacking was a high-quality global dataset of tidal measurements, which satellite altimetry -- and especially Topex/Poseidon -- provided. With these data in hand, a "flood tide" of marked progress resulted. In this paper we review some of that progress. An important area of progress, with potentially important implications for other areas of physical oceanography, falls under the topic of "energy dissipation." With precise global constraints provided by altimetry -- combined with precise laser tracking of the altimeter, other geodetic satellites like Lageos, as well as the moon -- the planetary energy budgets of both Earth and ocean tides are now well determined. Moreover, the local energy balances, and thus local estimates of tidal dissipation, have now been mapped, although somewhat coarsely, throughout the ocean. This work has pointed to internal-tide generation in the deep ocean as the once missing sink of tidal energy, and has led to a plethora of new observational and theoretical studies of internal tides, and their role in vertical mixing of the deep ocean. The discovery that internal tides, or some part of them, can be directly mapped with an altimeter opens new lines of research on this topic. Low-mode internal tides have been found, at least in some regions, to propagate several thousand kilometers across open ocean. The study of such waves with altimetry gives us a global view heretofore unattainable, allowing strong observational constraints to be placed on possible ocean mixing processes, such as subharmonic instabilities.

  14. Solving the small-scale structure puzzles with dissipative dark matter

    NASA Astrophysics Data System (ADS)

    Foot, Robert; Vagnozzi, Sunny

    2016-07-01

    Small-scale structure is studied in the context of dissipative dark matter, arising for instance in models with a hidden unbroken Abelian sector, so that dark matter couples to a massless dark photon. The dark sector interacts with ordinary matter via gravity and photon-dark photon kinetic mixing. Mirror dark matter is a theoretically constrained special case where all parameters are fixed except for the kinetic mixing strength, ε. In these models, the dark matter halo around spiral and irregular galaxies takes the form of a dissipative plasma which evolves in response to various heating and cooling processes. It has been argued previously that such dynamics can account for the inferred cored density profiles of galaxies and other related structural features. Here we focus on the apparent deficit of nearby small galaxies (the "missing satellite problem"), which these dissipative models have the potential to address through small-scale power suppression by acoustic and diffusion damping. Using a variant of the extended Press-Schechter formalism, we evaluate the halo mass function for the special case of mirror dark matter. Considering a simplified model where M_baryons ∝ M_halo, we relate the halo mass function to more directly observable quantities, and find that for ε ≈ 2 × 10⁻¹⁰ such a simplified description is compatible with the measured galaxy luminosity and velocity functions. On scales M_halo ≲ 10⁸ M⊙, diffusion damping exponentially suppresses the halo mass function, suggesting a nonprimordial origin for dwarf spheroidal satellite galaxies, which we speculate were formed via a top-down fragmentation process as the result of nonlinear dissipative collapse of larger density perturbations. This could explain the planar orientation of satellite galaxies around Andromeda and the Milky Way.
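    For reference, the standard Press-Schechter mass function on which the extended formalism builds takes the textbook form (quoted here for orientation, not reproduced from the paper):

    ```latex
    \frac{dn}{dM}
      = \sqrt{\frac{2}{\pi}}\,
        \frac{\bar{\rho}}{M}\,
        \frac{\delta_c}{\sigma^2(M)}
        \left|\frac{d\sigma}{dM}\right|
        \exp\!\left[-\frac{\delta_c^2}{2\sigma^2(M)}\right]
    ```

    Here \(\bar{\rho}\) is the mean matter density, \(\sigma(M)\) the rms linear density fluctuation smoothed on mass scale \(M\), and \(\delta_c \approx 1.686\) the critical collapse overdensity. Diffusion damping in the dissipative models cuts off \(\sigma(M)\) at small \(M\), which is what exponentially suppresses the low-mass end of \(dn/dM\).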

  15. Application of a novel hybrid method for spatiotemporal data imputation: A case study of the Minqin County groundwater level

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongrong; Yang, Xuan; Li, Hao; Li, Weide; Yan, Haowen; Shi, Fei

    2017-10-01

    Techniques for data analysis have developed widely in past years; however, missing data remain a ubiquitous problem in many scientific fields. In particular, dealing with missing spatiotemporal data presents an enormous challenge. Nonetheless, in recent years a considerable amount of research has focused on spatiotemporal problems, making spatiotemporal missing-data imputation methods increasingly indispensable. In this paper, a novel spatiotemporal hybrid method is proposed to verify and impute spatiotemporal missing values. The new method, termed SOM-FLSSVM, flexibly combines three advanced techniques: self-organizing feature map (SOM) clustering, the fruit fly optimization algorithm (FOA), and the least squares support vector machine (LSSVM). We employ a cross-validation (CV) procedure and the FOA swarm-intelligence optimization strategy to search the available parameters and determine the optimal imputation model. Spatiotemporal groundwater data for Minqin County, China, were selected to test the reliability and imputation ability of SOM-FLSSVM. We carried out a validation experiment and compared three well-studied models with SOM-FLSSVM on the same data set, using missing-data ratios from 0.1 to 0.8. The results demonstrate that the new hybrid method performs well in terms of both robustness and accuracy for spatiotemporal missing data.

  16. Simulation-based sensitivity analysis for non-ignorably missing data.

    PubMed

    Yin, Peng; Shi, Jian Q

    2017-01-01

    Sensitivity analysis is popular for dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism; we call models subject to this uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need simple and interpretable statistical quantities with which to assess the sensitivity models and make evidence-based analyses. We propose a novel approach for investigating the plausibility of each missing-data-mechanism model assumption, by comparing datasets simulated from various MNAR models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation for each sensitivity parameter, to select plausible values and reject unlikely ones, instead of considering all proposed values of the sensitivity parameters as in conventional sensitivity analysis. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, the analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
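    The plausibility idea can be sketched numerically. This is a simplification invented for illustration (one-dimensional data, a pure location shift as the sensitivity parameter), not the paper's exact procedure:

    ```python
    import numpy as np

    def mean_nn_distance(observed, simulated):
        # For each observed point, the distance to its nearest simulated point.
        d = np.abs(observed[:, None] - simulated[None, :])
        return float(d.min(axis=1).mean())

    rng = np.random.default_rng(1)
    observed = rng.normal(loc=1.0, scale=1.0, size=1000)  # stands in for observed data

    # Each candidate sensitivity parameter shifts the simulated distribution;
    # small nearest-neighbour distances mark plausible values.
    scores = {}
    for delta in [0.0, 0.5, 1.0, 2.0]:
        simulated = rng.normal(loc=delta, scale=1.0, size=1000)
        scores[delta] = mean_nn_distance(observed, simulated)
    ```

    Candidates whose simulated datasets sit close to the observed sample are retained; badly mismatched shifts produce visibly larger scores and are rejected, which is the selection step the abstract describes.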

  17. A novel approach for incremental uncertainty rule generation from databases with missing values handling: application to dynamic medical databases.

    PubMed

    Konias, Sokratis; Chouvarda, Ioanna; Vlahavas, Ioannis; Maglaveras, Nicos

    2005-09-01

    Current approaches for mining association rules usually assume that mining is performed on a static database, where the problem of missing attribute values practically does not exist. However, these assumptions do not hold in some medical databases, such as a home care system. In this paper, a novel uncertainty rule algorithm is presented, namely URG-2 (Uncertainty Rule Generator), which addresses the problem of mining dynamic databases containing missing values. The algorithm requires only one pass over the initial dataset in order to generate the item sets, and uses new metrics corresponding to the notions of Support and Confidence. URG-2 was evaluated on two medical databases, randomly introducing multiple missing values for each record's attributes (rates of 5-20% in 5% increments) in the initial dataset. Compared with the classical approach (ignoring records with missing values), the proposed algorithm was more robust in mining rules from datasets containing missing values. In all cases, the difference in preserving the initial rules ranged between 30% and 60% in favour of URG-2. Moreover, due to its incremental nature, URG-2 saved over 90% of the time required for thorough re-mining. Thus, the proposed algorithm offers a preferable solution for mining in dynamic relational databases.

  18. Consequences of severe obstetric complications on women's health in Morocco: please, listen to me!

    PubMed

    Assarag, Bouchra; Dujardin, Bruno; Essolbi, Amina; Cherkaoui, Imad; De Brouwere, Vincent

    2015-11-01

    In Morocco, medical care for women with severe obstetric complications (near-miss cases) ends at discharge from the hospital. Little information exists regarding what happens after returning home. The aim of the study was to assess the physical and mental health consequences of near-miss events on Moroccan women 8 months after childbirth. A prospective cohort study of 76 near-miss women was conducted in three hospitals. For every case, we recruited at least two women from the same hospital who had uncomplicated deliveries (n = 169). We used a mixed-methods approach. For the quantitative part, we analysed sociodemographic characteristics collected via a questionnaire and medical complications extracted from the medical records during a medical consultation at 8 months post-partum. Forty in-depth interviews were also conducted with 20 near-miss cases and 20 women with uncomplicated deliveries. The near-miss women were poorer and less educated than those who had uncomplicated deliveries. The proportion of physical consequences (serious illness) was higher among near-miss cases (22%) than uncomplicated deliveries (6%, P = 0.001). The risk of depression was significantly higher among near-miss cases with perinatal death (OR = 7.16; [95% CI: 2.85-17.98]) than among those who had an uncomplicated delivery. Interviews revealed that the economic burden of near-miss care contributed to social problems among the women and their households. A near-miss event has consequences that go beyond the first days after delivery. Developing new mechanisms for maternal and newborn health follow-up is essential and should address the mother's physical and mental health problems and involve husbands and family members. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  19. Data Quality and Reliability Analysis of U.S. Marine Corps Ground Vehicle Maintenance Records

    DTIC Science & Technology

    2015-06-01

    Corporation conducted a study on data quality issues present in U.S. Army logistics data (Galway & Hanks, 1996). The study breaks data issues into three...categories: operational, conceptual, and organizational problems (Galway & Hanks, 1996). Operational data problems relate to the number of missing or...codes (EIC) are left blank (Galway & Hanks, 1996, p. 26). Missing entries are attributed to an assumed lack of significance of the EIC. The issue is

  20. Some Legal Problems of Satellite Transmission.

    ERIC Educational Resources Information Center

    Siebert, Fred S.

    Now that the technical aspects of satellite transmission have been solved, there remain the more complex and difficult problems of maintaining both order in outer space and the rights of nations and individuals as these rights may be affected by broadcasts transmitted by satellite stations. These broadcasts, whether beamed to a ground station or…

  1. Toward Continuous GPS Carrier-Phase Time Transfer: Eliminating the Time Discontinuity at an Anomaly

    PubMed Central

    Yao, Jian; Levine, Judah; Weiss, Marc

    2015-01-01

    The wide application of Global Positioning System (GPS) carrier-phase (CP) time transfer is limited by the problem of boundary discontinuity (BD). The discontinuity has two categories. One is “day boundary discontinuity,” which has been studied extensively and can be solved by multiple methods [1–8]. The other category of discontinuity, called “anomaly boundary discontinuity (anomaly-BD),” comes from a GPS data anomaly. The anomaly can be a data gap (i.e., missing data), a GPS measurement error (i.e., bad data), or a cycle slip. Initial study of the anomaly-BD shows that we can fix the discontinuity if the anomaly lasts no more than 20 min, using the polynomial curve-fitting strategy to repair the anomaly [9]. However, sometimes the data anomaly lasts longer than 20 min, so a better curve-fitting strategy is needed. In addition, a cycle slip, as another type of data anomaly, can occur and lead to an anomaly-BD. To solve these problems, this paper proposes a new strategy, i.e., the satellite-clock-aided curve-fitting strategy with the function of cycle slip detection. Basically, this new strategy applies the satellite clock correction to the GPS data; we then fit polynomial curves to the code and phase data, as before. Our study shows that the phase-data residual is only ~3 mm for all GPS satellites. The new strategy also detects and finds the number of cycle slips by searching for the minimum curve-fitting residual. Extensive examples show that this new strategy enables us to repair up to a 40-min GPS data anomaly, regardless of whether the anomaly is due to a data gap, a cycle slip, or a combination of the two. We also find that interference of the GPS signal, known as “jamming,” can lead to a time-transfer error, and that this new strategy can compensate for jamming outages. Thus, the new strategy can eliminate the impact of jamming on time transfer. As a whole, we greatly improve the robustness of the GPS CP time transfer.
PMID:26958451
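
    The core move — fit a polynomial to the clean data before the anomaly, then pick the integer cycle slip that minimizes the post-gap fit residual — can be sketched on synthetic data. The trend shape, noise level, and gap length below are invented; this is not the paper's satellite-clock-aided implementation, only the residual-search idea.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(100, dtype=float)
    # Smooth clock-like phase trend (in cycles) plus measurement noise
    phase = 0.5 * t + 0.002 * t**2 + rng.normal(scale=0.01, size=t.size)

    # Inject an anomaly: a 10-sample data gap followed by an integer cycle slip
    phase[60:70] = np.nan
    true_slip = 3
    phase[70:] += true_slip

    # Fit a polynomial to the clean data before the gap ...
    coef = np.polyfit(t[:60], phase[:60], deg=2)
    predicted = np.polyval(coef, t[70:])

    # ... then search for the integer slip minimizing the curve-fitting residual
    candidates = list(range(-5, 6))
    rms = [np.sqrt(np.mean((phase[70:] - k - predicted) ** 2)) for k in candidates]
    best = candidates[int(np.argmin(rms))]
    print("estimated cycle slip:", best)  # should recover the injected slip of 3
    ```

    In the real method, applying the satellite clock correction first flattens the data so that the polynomial extrapolation stays accurate over longer (up to ~40 min) anomalies.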

  2. Causal inference with missing exposure information: Methods and applications to an obstetric study.

    PubMed

    Zhang, Zhiwei; Liu, Wei; Zhang, Bo; Tang, Li; Zhang, Jun

    2016-10-01

    Causal inference in observational studies is frequently challenged by the occurrence of missing data, in addition to confounding. Motivated by the Consortium on Safe Labor, a large observational study of obstetric labor practice and birth outcomes, this article focuses on the problem of missing exposure information in a causal analysis of observational data. This problem can be approached from different angles (i.e. missing covariates and causal inference), and useful methods can be obtained by drawing upon the available techniques and insights in both areas. In this article, we describe and compare a collection of methods based on different modeling assumptions, under standard assumptions for missing data (i.e. missing-at-random and positivity) and for causal inference with complete data (i.e. no unmeasured confounding and another positivity assumption). These methods involve three models: one for treatment assignment, one for the dependence of outcome on treatment and covariates, and one for the missing data mechanism. In general, consistent estimation of causal quantities requires correct specification of at least two of the three models, although there may be some flexibility as to which two models need to be correct. Such flexibility is afforded by doubly robust estimators adapted from the missing covariates literature and the literature on causal inference with complete data, and by a newly developed triply robust estimator that is consistent if any two of the three models are correct. The methods are applied to the Consortium on Safe Labor data and compared in a simulation study mimicking the Consortium on Safe Labor. © The Author(s) 2013.

  3. The planes of satellite galaxies problem, suggested solutions, and open questions

    NASA Astrophysics Data System (ADS)

    Pawlowski, Marcel S.

    2018-02-01

    Satellite galaxies of the Milky Way and of the Andromeda galaxy have been found to preferentially align in significantly flattened planes of satellite galaxies, and available velocity measurements are indicative of a preference of satellites in those structures to co-orbit. There is increasing evidence that such kinematically correlated satellite planes are also present around more distant hosts. Detailed comparisons show that similarly anisotropic phase-space distributions of sub-halos are exceedingly rare in cosmological simulations based on the ΛCDM paradigm. Analogs to the observed systems have frequencies of ≤ 0.5% in such simulations. In contrast to other small-scale problems, the satellite planes issue is not strongly affected by baryonic processes because the distribution of sub-halos on scales of hundreds of kpc is dominated by gravitational effects. This makes the satellite planes one of the most serious small-scale problems for ΛCDM. This review summarizes the observational evidence for planes of satellite galaxies in the Local Group and beyond, and provides an overview of how they compare to cosmological simulations. It also discusses scenarios which aim at explaining the coherence of satellite positions and orbits, and why they all are currently unable to satisfactorily resolve the issue.

  4. Engineering calculations for communications satellite systems planning

    NASA Technical Reports Server (NTRS)

    Reilly, C. H.; Levis, C. A.; Mount-Campbell, C.; Gonsalvez, D. J.; Wang, C. W.; Yamamura, Y.

    1985-01-01

    Computer-based techniques for optimizing communications-satellite orbit and frequency assignments are discussed. A gradient-search code was tested against a BSS scenario derived from the RARC-83 data. Improvement was obtained, but each iteration requires about 50 minutes of IBM-3081 CPU time. Gradient-search experiments on a small FSS test problem, consisting of a single service area served by 8 satellites, showed quickest convergence when the satellites were all initially placed near the center of the available orbital arc with moderate spacing. A transformation technique is proposed for investigating the surface topography of the objective function used in the gradient-search method. A new synthesis approach is based on transforming single-entry interference constraints into corresponding constraints on satellite spacings. These constraints are used with linear objective functions to formulate the co-channel orbital assignment task as a linear-programming (LP) problem or mixed integer programming (MIP) problem. Globally optimal solutions are always found with the MIP problems, but not necessarily with the LP problems. The MIP solutions can be used to evaluate the quality of the LP solutions. The initial results are very encouraging.

  5. Accounting for missing data in the estimation of contemporary genetic effective population size (N(e) ).

    PubMed

    Peel, D; Waples, R S; Macbeth, G M; Do, C; Ovenden, J R

    2013-03-01

    Theoretical models are often applied to population genetic data sets without fully considering the effect of missing data. Researchers can deal with missing data by removing individuals that have failed to yield genotypes and/or by removing loci that have failed to yield allelic determinations, but despite their best efforts, most data sets still contain some missing data. As a consequence, realized sample size differs among loci, and this poses a problem for unbiased methods that must explicitly account for random sampling error. One commonly used solution for the calculation of contemporary effective population size (N(e) ) is to calculate the effective sample size as an unweighted mean or harmonic mean across loci. This is not ideal because it fails to account for the fact that loci with different numbers of alleles have different information content. Here we consider this problem for genetic estimators of contemporary effective population size (N(e) ). To evaluate bias and precision of several statistical approaches for dealing with missing data, we simulated populations with known N(e) and various degrees of missing data. Across all scenarios, one method of correcting for missing data (fixed-inverse variance-weighted harmonic mean) consistently performed the best for both single-sample and two-sample (temporal) methods of estimating N(e) and outperformed some methods currently in widespread use. The approach adopted here may be a starting point to adjust other population genetics methods that include per-locus sample size components. © 2012 Blackwell Publishing Ltd.
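
    The basic issue — realized sample size varying across loci — is easy to see numerically. The sketch below contrasts the arithmetic and harmonic means of per-locus sample sizes (the counts are hypothetical); the harmonic mean is pulled down sharply by loci with heavy missingness, which is why the choice of averaging, and the paper's additional inverse-variance weighting (not reproduced here), matters for N(e) estimation.

    ```python
    import numpy as np

    # Realized sample size per locus after removing failed genotypes
    # (hypothetical counts for 5 loci typed in 50 individuals)
    n_per_locus = np.array([50, 48, 45, 30, 12])

    arithmetic = n_per_locus.mean()
    harmonic = len(n_per_locus) / np.sum(1.0 / n_per_locus)
    print(f"arithmetic mean S = {arithmetic:.1f}")  # 37.0
    print(f"harmonic mean S   = {harmonic:.1f}")    # ~27.8, dominated by the worst locus
    ```

    An unweighted harmonic mean still treats all loci alike; the estimator favoured in the paper additionally weights loci by their information content (e.g., number of alleles), which neither simple average captures.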

  6. Replacing missing values using trustworthy data values from web data sources

    NASA Astrophysics Data System (ADS)

    Izham Jaya, M.; Sidi, Fatimah; Mat Yusof, Sharmila; Suriani Affendey, Lilly; Ishak, Iskandar; Jabar, Marzanah A.

    2017-09-01

    In practice, collected data are usually incomplete and contain missing values. Existing approaches to managing missing values overlook the importance of trustworthy data values as replacements. Given that trusted, complete data are essential for data analysis, we propose a framework for missing-value replacement using trustworthy data values from web data sources. The proposed framework adopts an ontology to map data values from web data sources to the incomplete dataset. Because data from the web conflict with each other, we propose a trust score measurement based on data accuracy and data reliability. The trust score is then used to select trustworthy data values from web data sources for missing-value replacement. We implemented the proposed framework on a financial dataset and present the findings in this paper. Our experiment shows that replacing missing values with trustworthy data values is important, especially when conflicting data must be reconciled to solve the missing-values problem.
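
    The selection step can be sketched as follows. The candidate values, the accuracy/reliability scores, and the equal weighting are all hypothetical; the paper's actual trust score measurement may combine its components differently.

    ```python
    # Hypothetical candidate values for one missing field, gathered from
    # several web sources, each pre-scored for accuracy and reliability in [0, 1]
    candidates = [
        {"source": "A", "value": 102.5, "accuracy": 0.9, "reliability": 0.6},
        {"source": "B", "value": 99.1,  "accuracy": 0.7, "reliability": 0.9},
        {"source": "C", "value": 250.0, "accuracy": 0.3, "reliability": 0.4},
    ]

    def trust_score(c, w_acc=0.5, w_rel=0.5):
        """Illustrative weighted trust score (weights are an assumption)."""
        return w_acc * c["accuracy"] + w_rel * c["reliability"]

    # Replace the missing value with the candidate from the most trusted source
    best = max(candidates, key=trust_score)
    print(f"fill with {best['value']} from source {best['source']}")
    ```

    The point of scoring rather than, say, majority voting is visible in the example: the sources disagree, and the outlier from the low-trust source C is rejected even though no value has a majority.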

  7. Transitioning to multiple imputation : a new method to impute missing blood alcohol concentration (BAC) values in FARS

    DOT National Transportation Integrated Search

    2002-01-01

    The National Center for Statistics and Analysis (NCSA) of the National Highway Traffic Safety : Administration (NHTSA) has undertaken several approaches to remedy the problem of missing blood alcohol : test results in the Fatality Analysis Reporting ...

  8. The dynamic phenomena of a tethered satellite: NASA's first Tethered Satellite Mission, TSS-1

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Mowery, D. K.; Tomlin, D. D.

    1993-01-01

    The tethered satellite system (TSS) was envisioned as a means of extending a satellite from its base (space shuttle, space station, space platform) into a lower or higher altitude in order to more efficiently acquire data and perform science experiments. This is accomplished by attaching the satellite to a tether, deploying it, then reeling it in. When its mission is completed, the satellite can be returned to its base for reuse. If the tether contains a conductor, it can also be used as a means to generate and flow current to and from the satellite to the base. When current is flowed, the tether interacts with the Earth's magnetic field, deflecting the tether. When the current flows in one direction, the system becomes a propulsive system that can be used to boost the orbiting system. In the other direction, it is a power generating system. Pulsing the current sets up a dynamic oscillation in the tether, which can upset the satellite attitude and preclude docking. A basic problem occurs around 400-m tether length, during satellite retrieval when the satellite's pendulous (rotational) mode gets in resonance with the first lateral tether string mode. The problem's magnitude is determined by the amount of skiprope present coming into this resonance condition. This paper deals with the tethered satellite, its dynamic phenomena, and how the resulting problems were solved for the first tethered satellite mission (TSS-1). Proposals for improvements for future tethered satellite missions are included. Results from the first tethered satellite flight are summarized.

  9. mvp - an open-source preprocessor for cleaning duplicate records and missing values in mass spectrometry data.

    PubMed

    Lee, Geunho; Lee, Hyun Beom; Jung, Byung Hwa; Nam, Hojung

    2017-07-01

    Mass spectrometry (MS) data are used to analyze biological phenomena based on chemical species. However, these data often contain unexpected duplicate records and missing values due to technical or biological factors. These 'dirty data' problems increase the difficulty of performing MS analyses because they lead to performance degradation when statistical or machine-learning tests are applied to the data. Thus, we have developed missing values preprocessor (mvp), an open-source software for preprocessing data that might include duplicate records and missing values. mvp uses the property of MS data in which identical chemical species present the same or similar values for key identifiers, such as the mass-to-charge ratio and intensity signal, and forms cliques via graph theory to process dirty data. We evaluated the validity of the mvp process via quantitative and qualitative analyses and compared the results from a statistical test that analyzed the original and mvp-applied data. This analysis showed that using mvp reduces problems associated with duplicate records and missing values. We also examined the effects of using unprocessed data in statistical tests and examined the improved statistical test results obtained with data preprocessed using mvp.
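
    The clique idea — link records whose key identifiers (such as m/z) agree within tolerance, then collapse each connected group to one clean record — can be sketched with a union-find structure. The tolerance, the toy records, and the mean-based merging rule are assumptions for illustration, not mvp's actual parameters.

    ```python
    from collections import defaultdict

    # Hypothetical (m/z, intensity) records; None marks a missing intensity
    records = [
        (100.001, 5.2), (100.002, None), (100.003, 5.1),  # one species, duplicated
        (250.500, 9.9),
    ]
    TOL = 0.01  # m/z tolerance for "same chemical species" (assumed)

    parent = list(range(len(records)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)

    # Link every pair of records whose m/z values agree within tolerance
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if abs(records[i][0] - records[j][0]) <= TOL:
                union(i, j)

    # Collapse each clique: mean m/z, mean of the observed intensities
    groups = defaultdict(list)
    for i in range(len(records)):
        groups[find(i)].append(records[i])

    cleaned = []
    for members in groups.values():
        mzs = [m for m, _ in members]
        ints = [v for _, v in members if v is not None]
        cleaned.append((sum(mzs) / len(mzs),
                        sum(ints) / len(ints) if ints else None))
    print(cleaned)
    ```

    Merging this way handles duplicates and missing intensities in one pass: the duplicated species collapses to a single record, and its missing intensity is filled from the other members of its clique.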

  10. Missed and Delayed Diagnosis of Dementia in Primary Care: Prevalence and Contributing Factors

    PubMed Central

    Bradford, Andrea; Kunik, Mark E.; Schulz, Paul; Williams, Susan P.; Singh, Hardeep

    2009-01-01

    Dementia is a growing public health problem for which early detection may be beneficial. Currently, the diagnosis of dementia in primary care depends mostly on clinical suspicion based on patient symptoms or caregivers’ concerns and is prone to be missed or delayed. We conducted a systematic review of the literature to ascertain the prevalence and contributing factors for missed and delayed dementia diagnoses in primary care. Prevalence of missed and delayed diagnosis was estimated by abstracting quantitative data from studies of diagnostic sensitivity among primary care providers. Possible predictors and contributory factors were determined from the text of quantitative and qualitative studies of patient-, caregiver-, provider-, and system-related barriers. Overall estimates of diagnostic sensitivity varied among studies and appeared to be in part a function of dementia severity, degree of patient impairment, dementia subtype, and frequency of patient-provider contact. Major contributory factors included problems with attitudes and patient-provider communication, educational deficits, and system resource constraints. The true prevalence of missed and delayed diagnoses of dementia is unknown but appears to be high. Until the case for dementia screening becomes more compelling, efforts to promote timely detection should focus on removing barriers to diagnosis. PMID:19568149

  11. Angular aberration in the problem of power beaming to geostationary satellites through the atmosphere.

    PubMed

    Baryshnikov, F F

    1995-10-20

    The influence of angular aberration of radiation as a result of the difference in speed of a geostationary satellite and the speed of the Earth's surface on laser power beaming to satellites is considered. Angular aberration makes it impossible to direct the energy to the satellite, and additional beam rotation is necessary. Because the Earth's rotation may cause bad phase restoration, we face a serious problem: how to transfer incoherent radiation to remote satellites. In the framework of the Kolmogorov turbulence model simple conditions of energy transfer are derived and discussed.
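
    The magnitude of the effect is easy to estimate: for an equatorial ground station, both the station and a geostationary satellite corotate with the Earth, but at different radii and hence different speeds, and the one-way aberration (lead) angle is roughly the transverse velocity difference divided by c. The sketch below is a back-of-the-envelope estimate with standard constants, not the paper's Kolmogorov-turbulence analysis; a round-trip pointing analysis roughly doubles the angle.

    ```python
    C = 299_792_458.0    # speed of light, m/s
    R_E = 6.371e6        # mean Earth radius, m
    R_GEO = 4.2164e7     # geostationary orbital radius, m
    OMEGA = 7.2921e-5    # Earth rotation rate, rad/s

    v_sat = OMEGA * R_GEO      # ~3.07 km/s
    v_ground = OMEGA * R_E     # ~0.46 km/s at the equator
    aberration = (v_sat - v_ground) / C
    print(f"one-way lead angle ~ {aberration * 1e6:.1f} microradians")
    ```

    A beam of, say, 1 m diameter at the transmitter diverges far less than the ~330 m this angle corresponds to at GEO range, which is why the additional beam rotation discussed in the paper is unavoidable.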

  12. Sensitivity Analysis of Multiple Informant Models When Data are Not Missing at Random

    PubMed Central

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae; Scaramella, Laura; Leve, Leslie; Reiss, David

    2014-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups may be retained even if only one member of a group contributes data. Statistical inference is based on the assumption that data are missing completely at random or missing at random. Importantly, whether or not data are missing is assumed to be independent of the missing data. A saturated correlates model that incorporates correlates of the missingness or the missing data into an analysis and multiple imputation that may also use such correlates offer advantages over the standard implementation of SEM when data are not missing at random because these approaches may result in a data analysis problem for which the missingness is ignorable. This paper considers these approaches in an analysis of family data to assess the sensitivity of parameter estimates to assumptions about missing data, a strategy that may be easily implemented using SEM software. PMID:25221420

  13. Application of heuristic satellite plan synthesis algorithms to requirements of the WARC-88 allotment plan

    NASA Technical Reports Server (NTRS)

    Heyward, Ann O.; Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Olen, Carl

    1990-01-01

    Creation of an Allotment Plan for the Fixed Satellite Service at the 1988 Space World Administrative Radio Conference (WARC) represented a complex satellite plan synthesis problem, involving a large number of planned and existing systems. Solutions to this problem at WARC-88 required the use of both automated and manual procedures to develop an acceptable set of system positions. Development of an Allotment Plan may also be attempted through solution of an optimization problem, known as the Satellite Location Problem (SLP). Three automated heuristic procedures, developed specifically to solve SLP, are presented. The heuristics are then applied to two specific WARC-88 scenarios. Solutions resulting from the fully automated heuristics are then compared with solutions obtained at WARC-88 through a combination of both automated and manual planning efforts.

  14. Interference problems for nongeostationary satellites

    NASA Technical Reports Server (NTRS)

    Sollfrey, W.

    1984-01-01

    The interference problems faced by nongeostationary satellites may be of major significance. A general discussion indicates the scope of the problems and describes several configurations of importance. Computer programs are described, which are employed by NASA/JPL and the U.S. Air Force Satellite Control Facility to provide interference-free scheduling of commands and data transmission. Satellite system mission planners are not concerned with the precise prediction of interference episodes, but rather with the expected total amount of interference, the mean and maximum duration of events, and the mean spacing between episodes. The procedures in the theory of probability developed by the author which permit calculation of such quantities are described and applied to several real cases. It may be anticipated that the problems will become steadily worse in the future as more and more data transmissions attempt to occupy the same frequency band.

  15. Constraining the Mass of the Local Group through Proper Motion Measurements of Local Group Galaxies

    NASA Astrophysics Data System (ADS)

    Sohn, S. Tony; van der Marel, R.; Anderson, J.

    2012-01-01

    The Local Group and its two dominant spiral galaxies have been the benchmark for testing many aspects of cosmological and galaxy formation theories. This includes, e.g., dark halo profiles and shapes, substructure and the "missing satellite" problem, and the minimum mass for galaxy formation. But despite the extensive work in all of these areas, our knowledge of the mass of the Milky Way and M31, and thus the total mass of the Local Group remains one of the most poorly established astronomical parameters (uncertain by a factor of 4). One important reason for this problem is the lack of information in tangential motions of galaxies, which can be only obtained through proper motion measurements. In this study, we introduce our projects for measuring absolute proper motions of (1) the dwarf spheroidal galaxy Leo I, (2) M31, and (3) the 4 dwarf galaxies near the edge of the Local Group (Cetus, Leo A, Tucana, and Sag DIG). Results from these three independent measurements will provide important clues to the mass of the Milky Way, M31, and the Local Group as a whole, respectively. We also present our proper motion measurement technique that uses compact background galaxies as astrometric reference sources.

  16. Florence Nightingale and the Salisbury incident.

    PubMed

    Palmer, I S

    1976-01-01

    Florence Nightingale's astute handling of mismanagement in Free Gifts Stores during the Crimean War underscored her administrative ability. Miss Nightingale went to Scutari ostensibly to nurse the British soldiers, and while there encountered innumerable instances of administrative and managerial ineffectiveness and difficulties. Among these were the problems in the accountability and deployment of supplies as well as the assignment and supervision of female personnel-an untried situation. The article identifies the misdirected organizational management which occasioned the introduction of women into British war nursing and the voluntary participation of the British citizenry in providing supplies and comfort for the Army. Through analysis of Miss Nightingale's and others' private correspondence, the problems of personnel management and supply distribution are brought into sharp focus. The interplay of policies and principles to which Miss Nightingale subscribed, the human frailty of one of her women, Miss Nightingale's illness, and the confusion and stress which characterized the Crimean War are discussed. The compassion, understanding, and rectitude as well as the human values to which Miss Nightingale subscribed in protecting a woman guilty of a breach of trust and felony and the troublesome slanderous attack to which Miss Nightingale was subjected at the instigation of her foes on the home front provide a background for the presentation of the Salisbury affair as an interesting aspect of historical research into the life of the Victorian heroine.

  17. Restoration of HST images with missing data

    NASA Technical Reports Server (NTRS)

    Adorf, Hans-Martin

    1992-01-01

    Missing data are a fairly common problem when restoring Hubble Space Telescope observations of extended sources. On Wide Field and Planetary Camera images, cosmic ray hits and CCD hot spots are the prevalent causes of data losses, whereas on Faint Object Camera images data are lost due to reseaux marks, blemishes, areas of saturation, and the omnipresent frame edges. This contribution discusses a technique for 'filling in' missing data by statistical inference using information from the surrounding pixels. The major gain lies in minimizing adverse spill-over effects on the restoration in areas neighboring those where data are missing. When the mask delineating the support of 'missing data' is made dynamic, cosmic ray hits, etc. can be detected on the fly during restoration.
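
    A minimal version of the fill-in idea — replace each masked pixel by an estimate inferred from its observed neighbours, iterating until the mask is exhausted — can be sketched as follows. The smooth test image, the 5% random "cosmic ray" mask, and the 4-neighbour mean are illustrative assumptions, far simpler than the statistical inference used for HST restoration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    truth = np.fromfunction(lambda y, x: np.sin(x / 5) + np.cos(y / 7), (32, 32))
    img = truth.copy()
    mask = rng.random(img.shape) < 0.05   # 5% of pixels hit by "cosmic rays"
    img[mask] = np.nan

    # Iteratively replace each missing pixel by the mean of its observed
    # 4-neighbours until no gaps remain
    filled = img.copy()
    while np.isnan(filled).any():
        padded = np.pad(filled, 1, constant_values=np.nan)
        neigh = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                          padded[1:-1, :-2], padded[1:-1, 2:]])
        sums = np.nansum(neigh, axis=0)
        counts = (~np.isnan(neigh)).sum(axis=0)
        est = sums / np.maximum(counts, 1)
        holes = np.isnan(filled) & (counts > 0)   # fillable this pass
        filled[holes] = est[holes]
    ```

    Because the filled values blend smoothly into their surroundings, a subsequent deconvolution no longer rings around the defects — the "minimizing spill-over" gain the abstract describes.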

  18. Ocean observer study: A proposed national asset to augment the future U.S. operational satellite system

    USGS Publications Warehouse

    Cunningham, J.D.; Chambers, D.; Davis, C.O.; Gerber, A.; Helz, R.; McGuire, J.P.; Pichel, W.

    2003-01-01

    The next generation of U.S. polar orbiting environmental satellites is now under development. These satellites, jointly developed by the Department of Defense (DoD), the Department of Commerce (DOC), and the National Aeronautics and Space Administration (NASA), will be known as the National Polar-orbiting Operational Environmental Satellite System (NPOESS). It is expected that the first of these satellites will be launched in 2010. NPOESS has been designed to meet the operational needs of the U.S. civilian meteorological, environmental, climatic, and space environmental remote sensing programs, and the Global Military Space and Geophysical Environmental remote sensing programs. This system, however, did not meet all the needs of the user community interested in operational oceanography (particularly in coastal regions). Beginning in the fall of 2000, the Integrated Program Office (IPO), a joint DoD, DOC, and NASA office responsible for the NPOESS development, initiated the Ocean Observer Study (OOS). The purpose of this study was to assess and recommend how best to measure the missing or inadequately sampled ocean parameters. This paper summarizes the ocean measurement requirements documented in the OOS, describes the national need to measure these parameters, and describes the satellite instrumentation required to make those measurements.

  19. Seventh Grade Students' Problem Solving Success Rates on Proportional Reasoning Problems

    ERIC Educational Resources Information Center

    Pelen, Mustafa Serkan; Artut, Perihan Dinç

    2016-01-01

    This research was conducted to investigate 7th grade students' problem solving success rates on proportional reasoning problems and whether these success rates change with different problem types. 331 randomly selected students of grade seven participated in this study. A problem test which contains three different types of missing value (direct…

  20. Reducing Noise in the MSU Daily Lower-Tropospheric Global Temperature Dataset

    NASA Technical Reports Server (NTRS)

    Christy, John R.; Spencer, Roy W.; McNider, Richard T.

    1996-01-01

    The daily global-mean values of the lower-tropospheric temperature determined from microwave emissions measured by satellites are examined in terms of their signal, noise, and signal-to-noise ratio. Daily and 30-day average noise estimates are reduced by almost 50% and 35%, respectively, by analyzing and adjusting (if necessary) for errors due to 1) missing data, 2) residual harmonics of the annual cycle unique to particular satellites, 3) lack of filtering, and 4) spurious trends. After adjustments, the decadal trend of the lower-tropospheric global temperature from January 1979 through February 1994 becomes -0.058 C, or about 0.03 C per decade cooler than previously calculated.

  2. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on the high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information is available for only one of the two objects, either some default shape has been used or nothing could be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
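
    The flavour of the problem can be shown with a deliberately simplified toy model, not the paper's method: project the encounter onto a 2D plane, model the missing object's uncertainty as an isotropic sigma, and sweep that sigma to find the worst-case (maximum) collision probability. The radii, miss distance, and known sigma below are invented numbers.

    ```python
    import numpy as np

    R = 20.0                        # combined hard-body radius, m (assumed)
    miss = np.array([200.0, 0.0])   # projected miss vector in encounter plane, m
    sigma_known = 50.0              # known object's isotropic position sigma, m

    def pc(s_missing):
        """Collision probability for a candidate sigma of the no-covariance
        object, by brute-force grid integration over the hard-body disk."""
        var = sigma_known**2 + s_missing**2
        g = np.linspace(-R, R, 201)
        xx, yy = np.meshgrid(g, g)
        inside = xx**2 + yy**2 <= R**2
        dens = np.exp(-((xx - miss[0])**2 + (yy - miss[1])**2) / (2 * var)) \
               / (2 * np.pi * var)
        return float((dens * inside).sum() * (g[1] - g[0])**2)

    # Sweep the unknown sigma and keep the worst case as an upper bound
    sweep = np.linspace(1.0, 400.0, 200)
    probs = [pc(s) for s in sweep]
    s_star = sweep[int(np.argmax(probs))]
    print(f"max Pc ~ {max(probs):.2e} at missing-object sigma ~ {s_star:.0f} m")
    ```

    The maximum occurs near a combined variance of half the squared miss distance, the familiar "dilution" peak; an upper-bound method like the paper's exploits the existence of such a critical value rather than sweeping blindly.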

  3. Influence of tides in viscoelastic bodies of planet and satellite on the satellite's orbital motion

    NASA Astrophysics Data System (ADS)

    Emelyanov, N. V.

    2018-06-01

    The problem of the influence of tidal friction in both planetary and satellite bodies upon the satellite's orbital motion is considered. Using the differential equations in the satellite's rectangular planetocentric coordinates, the differential equations describing the changes in semimajor axis and eccentricity are derived. The equations in rectangular coordinates were taken from earlier works on the problem. The calculations carried out for a number of test examples prove that the averaged solutions of equations in coordinates and precise solutions of averaged equations in the Keplerian elements are identical. For the problem of tides raised on the planet's body, it was found that, if the satellite's mean motion n is equal to 11/18 Ω, where Ω is the planet's angular rotation rate, the orbital eccentricity does not change. This conclusion is in agreement with the results of other authors. It was also found that there is an essential discrepancy between the equations in the elements obtained in this paper and analogous equations published by earlier researchers.

  4. ON THE PERSISTENCE OF TWO SMALL-SCALE PROBLEMS IN ΛCDM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlowski, Marcel S.; Famaey, Benoit; Merritt, David

    2015-12-10

    We investigate the degree to which the inclusion of baryonic physics can overcome two long-standing problems of the standard cosmological model on galaxy scales: (1) the problem of satellite planes around Local Group galaxies, and (2) the “too big to fail” problem. By comparing dissipational and dissipationless simulations, we find no indication that the addition of baryonic physics results in more flattened satellite distributions around Milky-Way-like systems. Recent claims to the contrary are shown to derive in part from a non-standard metric for the degree of flattening, which ignores the satellites’ radial positions. If the full 3D positions of the satellite galaxies are considered, none of the simulations we analyze reproduce the observed flattening nor the observed degree of kinematic coherence of the Milky Way satellite system. Our results are consistent with the expectation that baryonic physics should have little or no influence on the structure of satellite systems on scales of hundreds of kiloparsecs. Claims that the “too big to fail” problem can be resolved by the addition of baryonic physics are also shown to be problematic.

  5. GALAXY EVOLUTION AT HIGH REDSHIFT: OBSCURED STAR FORMATION, GRB RATES, COSMIC REIONIZATION, AND MISSING SATELLITES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lapi, A.; Mancuso, C.; Celotti, A.

    We provide a holistic view of galaxy evolution at high redshifts z ≳ 4, which incorporates the constraints from various astrophysical/cosmological probes, including the estimate of the cosmic star formation rate (SFR) density from UV/IR surveys and long gamma-ray burst (GRB) rates, the cosmic reionization history following the latest Planck measurements, and the missing satellites issue. We achieve this goal in a model-independent way by exploiting the SFR functions derived by Mancuso et al. on the basis of an educated extrapolation of the latest UV/far-IR data from HST/Herschel, and already tested against a number of independent observables. Our SFR functions integrated down to a UV magnitude limit M_UV ≲ −13 (or SFR limit around 10^−2 M_⊙ yr^−1) produce a cosmic SFR density in excellent agreement with recent determinations from IR surveys and, taking into account a metallicity ceiling Z ≲ Z_⊙/2, with the estimates from long GRB rates. They also yield a cosmic reionization history consistent with that implied by the recent measurements by the Planck mission of the electron scattering optical depth τ_es ≈ 0.058; remarkably, this result is obtained under a conceivable assumption regarding the average value f_esc ≈ 0.1 of the escape fraction for ionizing photons. We demonstrate via the abundance-matching technique that the above constraints concurrently imply galaxy formation becoming inefficient within dark matter halos of mass below a few 10^8 M_⊙; pleasingly, such a limit is also required so as not to run into the missing satellites issue. Finally, we predict a downturn of the galaxy luminosity function faintward of M_UV ≲ −12, and stress that its detailed shape, to be plausibly probed in the near future by the JWST, will be extremely informative on the astrophysics of galaxy formation in small halos, or even on the microscopic nature of the dark matter.

  6. Full-Coverage High-Resolution Daily PM(sub 2.5) Estimation using MAIAC AOD in the Yangtze River Delta of China

    NASA Technical Reports Server (NTRS)

    Xiao, Qingyang; Wang, Yujie; Chang, Howard H.; Meng, Xia; Geng, Guannan; Lyapustin, Alexei Ivanovich; Liu, Yang

    2017-01-01

    Satellite aerosol optical depth (AOD) has been used to assess population exposure to fine particulate matter (PM(sub 2.5)). The emerging high-resolution satellite aerosol product, Multi-Angle Implementation of Atmospheric Correction (MAIAC), provides a valuable opportunity to characterize local-scale PM(sub 2.5) at 1-km resolution. However, non-random missing AOD due to cloud or snow cover or high surface reflectance makes this task challenging. Previous studies filled the data gap by spatially interpolating neighboring PM(sub 2.5) measurements or predictions. This strategy ignored the effect of cloud cover on aerosol loadings and has been shown to exhibit poor performance when monitoring stations are sparse or when there is seasonal large-scale missingness. Using the Yangtze River Delta of China as an example, we present a Multiple Imputation (MI) method that combines the MAIAC high-resolution satellite retrievals with chemical transport model (CTM) simulations to fill missing AOD. A two-stage statistical model driven by gap-filled AOD, meteorology, and land use information was then fitted to estimate daily ground PM(sub 2.5) concentrations in 2013 and 2014 at 1 km resolution with complete coverage in space and time. The daily MI models have an average R(exp 2) of 0.77, with an inter-quartile range of 0.71 to 0.82 across days. The overall MI model 10-fold cross-validation R(exp 2) (root mean square error) was 0.81 (25 μg/m(exp 3)) and 0.73 (18 μg/m(exp 3)) for years 2013 and 2014, respectively. Predictions with only observational AOD or only imputed AOD showed similar accuracy. Compared with previous gap-filling methods, the MI method presented in this study performed better, with higher coverage, higher accuracy, and the ability to fill missing PM(sub 2.5) predictions without ground PM(sub 2.5) measurements. This method can provide reliable PM(sub 2.5) predictions with complete coverage that can reduce bias in exposure assessment in air pollution and health studies.
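The core gap-filling idea of leaning on a gap-free CTM field can be sketched with a toy single-covariate regression imputation. This is a much simpler stand-in for the paper's Multiple Imputation model, and all data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
ctm = rng.gamma(2.0, 0.2, size=500)          # gap-free CTM-simulated AOD
aod = ctm + rng.normal(0.0, 0.05, size=500)  # "satellite" AOD (truth + noise)
gap = rng.random(500) < 0.4                  # knock out 40% as missing
aod_obs = np.where(gap, np.nan, aod)

# fit AOD ~ CTM on observed pixels, then predict the gaps
obs = ~np.isnan(aod_obs)
A = np.column_stack([np.ones(obs.sum()), ctm[obs]])
coef, *_ = np.linalg.lstsq(A, aod_obs[obs], rcond=None)
aod_filled = aod_obs.copy()
aod_filled[gap] = coef[0] + coef[1] * ctm[gap]
```

A real MI implementation would draw multiple plausible fills and propagate the imputation uncertainty into the downstream PM(sub 2.5) model, rather than a single regression prediction.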

  7. Reconstructing missing daily precipitation data using regression trees and artificial neural networks

    USDA-ARS's Scientific Manuscript database

    Incomplete meteorological data has been a problem in environmental modeling studies. The objective of this work was to develop a technique to reconstruct missing daily precipitation data in the central part of Chesapeake Bay Watershed using regression trees (RT) and artificial neural networks (ANN)....
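The two reconstruction approaches can be sketched as a regression tree and a small neural network predicting a target gauge from neighboring gauges. The synthetic gauge data and the scikit-learn models are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# toy daily precipitation at 3 neighboring gauges and 1 target gauge
neighbors = rng.gamma(0.5, 8.0, size=(730, 3))
target = neighbors.mean(axis=1) + rng.normal(0.0, 1.0, 730)

miss = rng.random(730) < 0.2                 # 20% of target days missing
Xtr, ytr = neighbors[~miss], target[~miss]

# regression tree (RT) and artificial neural network (ANN) reconstructions
rt = DecisionTreeRegressor(max_depth=5, random_state=0).fit(Xtr, ytr)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(Xtr, ytr)

filled_rt = rt.predict(neighbors[miss])
filled_ann = ann.predict(neighbors[miss])
```

In practice one would also compare the two fills against held-out observed days before choosing which reconstruction to trust.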

  8. Dark matter self-interactions and small scale structure

    NASA Astrophysics Data System (ADS)

    Tulin, Sean; Yu, Hai-Bo

    2018-02-01

    We review theories of dark matter (DM) beyond the collisionless paradigm, known as self-interacting dark matter (SIDM), and their observable implications for astrophysical structure in the Universe. Self-interactions are motivated, in part, due to the potential to explain long-standing (and more recent) small scale structure observations that are in tension with collisionless cold DM (CDM) predictions. Simple particle physics models for SIDM can provide a universal explanation for these observations across a wide range of mass scales spanning dwarf galaxies, low and high surface brightness spiral galaxies, and clusters of galaxies. At the same time, SIDM leaves intact the success of ΛCDM cosmology on large scales. This report covers the following topics: (1) small scale structure issues, including the core-cusp problem, the diversity problem for rotation curves, the missing satellites problem, and the too-big-to-fail problem, as well as recent progress in hydrodynamical simulations of galaxy formation; (2) N-body simulations for SIDM, including implications for density profiles, halo shapes, substructure, and the interplay between baryons and self-interactions; (3) semi-analytic Jeans-based methods that provide a complementary approach for connecting particle models with observations; (4) merging systems, such as cluster mergers (e.g., the Bullet Cluster) and minor infalls, along with recent simulation results for mergers; (5) particle physics models, including light mediator models and composite DM models; and (6) complementary probes for SIDM, including indirect and direct detection experiments, particle collider searches, and cosmological observations. We provide a summary and critical look for all current constraints on DM self-interactions and an outline for future directions.

  9. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). 
Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.

  10. Real-Time and Seamless Monitoring of Ground-Level PM2.5 Using Satellite Remote Sensing

    NASA Astrophysics Data System (ADS)

    Li, Tongwen; Zhang, Chengyue; Shen, Huanfeng; Yuan, Qiangqiang; Zhang, Liangpei

    2018-04-01

    Satellite remote sensing has been reported to be a promising approach for the monitoring of atmospheric PM2.5. However, the satellite-based monitoring of ground-level PM2.5 is still challenging. First, the previously used polar-orbiting satellite observations, which can usually be acquired only once per day, are hard to use for monitoring PM2.5 in real time. Second, many data gaps exist in satellite-derived PM2.5 due to cloud contamination. In this paper, hourly geostationary satellite (i.e., Himawari-8) observations were adopted for the real-time monitoring of PM2.5 in a deep learning architecture. On this basis, the satellite-derived PM2.5 in conjunction with ground PM2.5 measurements is incorporated into a spatio-temporal fusion model to fill the data gaps. Using the Wuhan Urban Agglomeration as an example, we have successfully derived real-time and seamless PM2.5 distributions. The results demonstrate that the Himawari-8-based deep learning model achieves a satisfactory performance (out-of-sample cross-validation R2 = 0.80, RMSE = 17.49 μg/m3) for the estimation of PM2.5. The missing data in satellite-derived PM2.5 are accurately recovered, with an R2 between recoveries and ground measurements of 0.75. Overall, this study provides an effective strategy for the real-time and seamless monitoring of ground-level PM2.5.

  11. Removing the solar exclusion with high altitude satellites [Orbital strategies to mitigate the Solar Exclusion Effect on Space-Based Observation of the Geosynchronous Belt]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vallado, David A.; Cefola, Paul J.; Kiziah, Rex R.

    Here, observing geosynchronous satellites has numerous applications. Lighting conditions near the equinoxes routinely cause problems for traditional observations by sensors near the equator – the solar exclusion. We investigate using sensors on satellites (in polar and high-altitude orbits) to observe satellites that are in geosynchronous orbit. It is hoped that these satellite configurations will alleviate many of these problems. The orbit insertion and station-keeping requirements are important to understand. We summarize the literature to understand the relevant perturbing forces and assess the delta-v requirements.

  12. Removing the solar exclusion with high altitude satellites [Orbital strategies to mitigate the Solar Exclusion Effect on Space-Based Observation of the Geosynchronous Belt]

    DOE PAGES

    Vallado, David A.; Cefola, Paul J.; Kiziah, Rex R.; ...

    2016-09-09

    Here, observing geosynchronous satellites has numerous applications. Lighting conditions near the equinoxes routinely cause problems for traditional observations by sensors near the equator – the solar exclusion. We investigate using sensors on satellites (in polar and high-altitude orbits) to observe satellites that are in geosynchronous orbit. It is hoped that these satellite configurations will alleviate many of these problems. The orbit insertion and station-keeping requirements are important to understand. We summarize the literature to understand the relevant perturbing forces and assess the delta-v requirements.

  13. VIGAN: Missing View Imputation with Generative Adversarial Networks.

    PubMed

    Shang, Chao; Palmer, Aaron; Sun, Jiangwen; Chen, Ko-Shin; Lu, Jin; Bi, Jinbo

    2017-01-01

    In an era when big data are becoming the norm, there is less concern with the quantity but more with the quality and completeness of the data. In many disciplines, data are collected from heterogeneous sources, resulting in multi-view or multi-modal datasets. The missing data problem has been challenging to address in multi-view data analysis. Especially, when certain samples miss an entire view of data, it creates the missing view problem. Classic multiple imputation or matrix completion methods are hardly effective here, since there is no information in the missing view on which to base the imputation for such samples. The commonly used simple method of removing samples with a missing view can dramatically reduce sample size, thus diminishing the statistical power of a subsequent analysis. In this paper, we propose a novel approach for view imputation via generative adversarial networks (GANs), which we name VIGAN. This approach first treats each view as a separate domain and identifies domain-to-domain mappings via a GAN using randomly-sampled data from each view, and then employs a multi-modal denoising autoencoder (DAE) to reconstruct the missing view from the GAN outputs based on paired data across the views. Then, by optimizing the GAN and DAE jointly, our model enables the knowledge integration for domain mappings and view correspondences to effectively recover the missing view. Empirical results on benchmark datasets validate the VIGAN approach by comparing against the state of the art. The evaluation of VIGAN in a genetic study of substance use disorders further proves the effectiveness and usability of this approach in life science.

  14. Dual transponder time synchronization at C band using ATS-3.

    NASA Technical Reports Server (NTRS)

    Mazur, W. E., Jr.

    1972-01-01

    The use of artificial satellites for time synchronization of geographically distant clocks is hindered by problems due to satellite motion or equipment delay measurements. The ATS-3 satellite with its two C-band transponder channels helps solve these problems through techniques for synchronization to accuracies of tenths of microseconds. Portable cesium clocks were used to verify the accuracy of the described system.

  15. Order-restricted inference for means with missing values.

    PubMed

    Wang, Heng; Zhong, Ping-Shou

    2017-09-01

    Missing values appear very often in many applications, but the problem of missing values has not received much attention in testing order-restricted alternatives. Under the missing at random (MAR) assumption, we impute the missing values nonparametrically using kernel regression. For data with imputation, the classical likelihood ratio test designed for testing the order-restricted means is no longer applicable since the likelihood does not exist. This article proposes a novel method for constructing test statistics for assessing means with an increasing order or a decreasing order based on jackknife empirical likelihood (JEL) ratio. It is shown that the JEL ratio statistic evaluated under the null hypothesis converges to a chi-bar-square distribution, whose weights depend on missing probabilities and nonparametric imputation. Simulation study shows that the proposed test performs well under various missing scenarios and is robust for normally and nonnormally distributed data. The proposed method is applied to an Alzheimer's disease neuroimaging initiative data set for finding a biomarker for the diagnosis of the Alzheimer's disease. © 2017, The International Biometric Society.
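The nonparametric imputation step in the abstract (kernel regression under MAR) can be sketched with a Nadaraya-Watson estimator; the JEL test itself is not shown. The sine curve, bandwidth, and completely random gaps below are made-up stand-ins for MAR data:

```python
import numpy as np

def nw_impute(x, y, bandwidth=0.3):
    """Fill missing y values with a Nadaraya-Watson (Gaussian-kernel)
    regression estimate built from the observed (x, y) pairs."""
    obs = ~np.isnan(y)
    out = y.copy()
    for i in np.where(~obs)[0]:
        w = np.exp(-0.5 * ((x[obs] - x[i]) / bandwidth) ** 2)
        out[i] = np.sum(w * y[obs]) / np.sum(w)
    return out

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 3.0, 400)
y = np.sin(x) + rng.normal(0.0, 0.1, 400)
y_miss = y.copy()
y_miss[rng.random(400) < 0.3] = np.nan      # punch 30% gaps for the demo
y_hat = nw_impute(x, y_miss)
```

A genuinely MAR mechanism would let the missingness probability depend on the observed x; the imputation code is unchanged in that case.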

  16. Spacecraft intercept guidance using zero effort miss steering

    NASA Astrophysics Data System (ADS)

    Newman, Brett

    The suitability of proportional navigation, or an equivalent zero effort miss formulation, for spacecraft intercepts during midcourse guidance, followed by a ballistic coast to the endgame, is addressed. The problem is formulated in terms of relative motion in a general 3D framework. The proposed guidance law for the commanded thrust vector orientation consists of the sum of two terms: (1) along the line of sight unit direction and (2) along the zero effort miss component perpendicular to the line of sight and proportional to the miss itself and a guidance gain. If the guidance law is to be suitable for longer range targeting applications with significant ballistic coasting after burnout, determination of the zero effort miss must account for the different gravitational accelerations experienced by each vehicle. The proposed miss determination techniques employ approximations for the true differential gravity effect. Theoretical results are applied to a numerical engagement scenario and the resulting performance is evaluated in terms of the miss distances determined from nonlinear simulation.
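The guidance law described above (thrust along the line of sight plus a term proportional to the perpendicular zero effort miss) can be sketched in a flat-gravity, short-range form. This omits the differential-gravity correction the abstract emphasizes for long ballistic coasts, and the gain and geometry are invented for illustration:

```python
import numpy as np

def zem_accel(r_rel, v_rel, t_go, gain=3.0):
    """Zero-effort-miss steering: predict the miss assuming no further
    maneuvering, keep the component perpendicular to the line of sight,
    and command an acceleration proportional to it."""
    zem = r_rel + v_rel * t_go              # straight-line miss prediction
    los = r_rel / np.linalg.norm(r_rel)     # line-of-sight unit vector
    zem_perp = zem - np.dot(zem, los) * los # miss component off the LOS
    return gain * zem_perp / t_go**2        # commanded acceleration

# toy closing geometry: closing along x with a small cross-track drift in y
r = np.array([1.0e5, 0.0, 0.0])   # relative position, m
v = np.array([-1.0e3, 5.0, 0.0])  # relative velocity, m/s
a_cmd = zem_accel(r, v, t_go=100.0)
```

For the longer-range case in the paper, `zem` would be replaced by a prediction that propagates each vehicle under its own gravitational acceleration.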

  17. Spatial Scale Gap Filling Using an Unmanned Aerial System: A Statistical Downscaling Method for Applications in Precision Agriculture.

    PubMed

    Hassan-Esfahani, Leila; Ebtehaj, Ardeshir M; Torres-Rua, Alfonso; McKee, Mac

    2017-09-14

    Applications of satellite-borne observations in precision agriculture (PA) are often limited due to the coarse spatial resolution of satellite imagery. This paper uses high-resolution airborne observations to increase the spatial resolution of satellite data for related applications in PA. A new variational downscaling scheme is presented that uses coincident aerial imagery products from "AggieAir", an unmanned aerial system, to increase the spatial resolution of Landsat satellite data. This approach is primarily tested for downscaling individual band Landsat images that can be used to derive normalized difference vegetation index (NDVI) and surface soil moisture (SSM). Quantitative and qualitative results demonstrate promising capabilities of the downscaling approach enabling effective increase of the spatial resolution of Landsat imageries by orders of 2 to 4. Specifically, the downscaling scheme retrieved the missing high-resolution feature of the imageries and reduced the root mean squared error by 15, 11, and 10 percent in visual, near infrared, and thermal infrared bands, respectively. This metric is reduced by 9% in the derived NDVI and remains negligibly for the soil moisture products.

  18. Spatial Scale Gap Filling Using an Unmanned Aerial System: A Statistical Downscaling Method for Applications in Precision Agriculture

    PubMed Central

    Hassan-Esfahani, Leila; Ebtehaj, Ardeshir M.; McKee, Mac

    2017-01-01

    Applications of satellite-borne observations in precision agriculture (PA) are often limited due to the coarse spatial resolution of satellite imagery. This paper uses high-resolution airborne observations to increase the spatial resolution of satellite data for related applications in PA. A new variational downscaling scheme is presented that uses coincident aerial imagery products from “AggieAir”, an unmanned aerial system, to increase the spatial resolution of Landsat satellite data. This approach is primarily tested for downscaling individual band Landsat images that can be used to derive normalized difference vegetation index (NDVI) and surface soil moisture (SSM). Quantitative and qualitative results demonstrate promising capabilities of the downscaling approach enabling effective increase of the spatial resolution of Landsat imageries by orders of 2 to 4. Specifically, the downscaling scheme retrieved the missing high-resolution feature of the imageries and reduced the root mean squared error by 15, 11, and 10 percent in visual, near infrared, and thermal infrared bands, respectively. This metric is reduced by 9% in the derived NDVI and remains negligibly for the soil moisture products. PMID:28906428

  19. Aerosol loading in the Southeastern United States: reconciling surface and satellite observations

    NASA Astrophysics Data System (ADS)

    Ford, B.; Heald, C. L.

    2013-04-01

    We investigate the seasonality in aerosols over the Southeastern United States using observations from several satellite instruments (MODIS, MISR, CALIOP) and surface network sites (IMPROVE, SEARCH, AERONET). We find that the strong summertime enhancement in satellite-observed aerosol optical depth (factor 2-3 enhancement over wintertime AOD) is not present in surface mass concentrations (25-55% summertime enhancement). Goldstein et al. (2009) previously attributed this seasonality in AOD to biogenic organic aerosol; however, surface observations show that organic aerosol only accounts for ~35% of PM2.5 mass and exhibits similar seasonality to total PM2.5. The GEOS-Chem model generally reproduces these surface aerosol measurements, but underrepresents the AOD seasonality observed by satellites. We show that seasonal differences in water uptake cannot sufficiently explain the magnitude of the AOD increase. CALIOP profiles indicate the presence of additional aerosol in the lower troposphere (below 700 hPa) which cannot be explained by vertical mixing; we conclude that the discrepancy is due to a missing source of aerosols above the surface in summer.

  20. Small Fire Detection Algorithm Development using VIIRS 375m Imagery: Application to Agricultural Fires in Eastern China

    NASA Astrophysics Data System (ADS)

    Zhang, Tianran; Wooster, Martin

    2016-04-01

    Until recently, crop residues have been the second largest industrial waste product produced in China, and field-based burning of crop residues is considered to remain extremely widespread, with impacts on air quality and potential negative effects on health and public transportation. However, due to the small size and perhaps short-lived nature of the individual burns, the extent of the activity and its spatial variability remain somewhat unclear. Satellite EO data have been used to gauge the timing and magnitude of Chinese crop burning, but current approaches very likely miss significant amounts of the activity because the individual burned areas are too small to detect with frequently acquired moderate-spatial-resolution data such as MODIS. The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the Suomi-NPP (National Polar-orbiting Partnership) satellite, launched in October 2011, has one set of multi-spectral channels providing full global coverage at 375 m nadir spatial resolution. It is expected that the 375 m spatial resolution "I-band" imagery provided by VIIRS will allow active fires to be detected that are ~10× smaller than those that can be detected by MODIS. In this study a new small fire detection algorithm is built on the VIIRS I-band global fire detection algorithm and the hot spot detection algorithm of the BIRD satellite mission. VIIRS I-band imagery is used to identify agricultural fire activity across Eastern China. A 30 m spatial resolution global land cover map is used for false alarm masking. Ground-based validation is performed using images taken from a UAV. The fire detection result is compared with the active fire product from the long-standing MODIS sensor onboard the TERRA and AQUA satellites, which shows that small fires missed by the traditional MODIS fire product may account for over 1/3 of total fire energy in Eastern China.

  1. Reducing Missed Laboratory Results: Defining Temporal Responsibility, Generating User Interfaces for Test Process Tracking, and Retrospective Analyses to Identify Problems

    PubMed Central

    Tarkan, Sureyya; Plaisant, Catherine; Shneiderman, Ben; Hettinger, A. Zachary

    2011-01-01

    Researchers have conducted numerous case studies reporting the details on how laboratory test results of patients were missed by the ordering medical providers. Given the importance of timely test results in an outpatient setting, there is limited discussion of electronic versions of test result management tools to help clinicians and medical staff with this complex process. This paper presents three ideas to reduce missed results with a system that facilitates tracking laboratory tests from order to completion as well as during follow-up: (1) define a workflow management model that clarifies responsible agents and associated time frame, (2) generate a user interface for tracking that could eventually be integrated into current electronic health record (EHR) systems, (3) help identify common problems in past orders through retrospective analyses. PMID:22195201

  2. A mission-oriented orbit design method of remote sensing satellite for region monitoring mission based on evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Zhang, Jing; Yao, Huang

    2015-12-01

    Remote sensing satellites play an increasingly prominent role in environmental monitoring and disaster rescue. Taking advantage of nearly identical sunshine conditions over the same place and of global coverage, most of these satellites are operated in sun-synchronous orbit. However, this inevitably brings some problems, the most significant being that the temporal resolution of a sun-synchronous orbit satellite cannot satisfy the demands of specific region monitoring missions. To overcome these disadvantages, two methods are exploited: the first is to build a satellite constellation containing multiple sun-synchronous satellites, as the CHARTER mechanism has done; the second is to design a non-predetermined orbit based on the concrete mission demand. An effective method for remote sensing satellite orbit design based on a multi-objective evolutionary algorithm is presented in this paper. The orbit design problem is converted into a multi-objective optimization problem, and a fast and elitist multi-objective genetic algorithm is utilized to solve it. Firstly, the demands of the mission are transformed into multiple objective functions, and the six orbital elements of the satellite are taken as genes in the design space; then a simulated evolution process is performed. An optimal solution can be obtained after a specified number of generations via the evolution operations (selection, crossover, and mutation). To examine the validity of the proposed method, a case study is introduced: orbit design of an optical satellite for regional disaster monitoring, where the mission demands include two objectives, one of which is minimizing the average revisit time interval. The simulation result shows that the solution obtained by our method meets the users' demand. We can conclude that the method presented in this paper is efficient for remote sensing orbit design.
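The evolutionary loop described above can be sketched in a stripped-down, single-objective form (the paper uses a fast and elitist multi-objective genetic algorithm; here only selection and mutation are shown, and the scalar cost function and element bounds are invented placeholders for a real revisit-interval objective):

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(el):
    """Hypothetical scalarized cost; a real design would propagate the
    orbit and score the average revisit interval over the target region."""
    a, e, i, raan, argp, m0 = el
    return (a - 7000.0) ** 2 / 1e6 + 10.0 * e + abs(i - 97.5) / 10.0

# genes: the six orbital elements (a, e, i, RAAN, argp, M0)
lo = np.array([6800.0, 0.0, 90.0, 0.0, 0.0, 0.0])
hi = np.array([7200.0, 0.05, 105.0, 360.0, 360.0, 360.0])
pop = rng.uniform(lo, hi, size=(40, 6))

for _ in range(100):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[np.argsort(scores)[:10]]                      # selection
    kids = elite[rng.integers(0, 10, size=30)] + rng.normal(
        0.0, [5.0, 0.002, 0.2, 2.0, 2.0, 2.0], size=(30, 6))  # mutation
    pop = np.vstack([elite, np.clip(kids, lo, hi)])

best = pop[np.argmin([fitness(p) for p in pop])]
```

A multi-objective version would replace the scalar sort with non-dominated sorting and crowding-distance selection over the competing objectives.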

  3. An Integrated, Problem-Based Learning Material: The "Satellite" Module

    ERIC Educational Resources Information Center

    Selcuk, Gamze Sezgin; Emiroglu, Handan Byacioglu; Tarakci, Mehmet; Ozel, Mustafa

    2011-01-01

    The purpose of this study is to introduce a problem-based learning material, the Satellite Module, that has integrated some of the subjects included in the disciplines of physics and mathematics at an introductory level in undergraduate education. The reason why this modular and problem-based material has been developed is to enable students to…

  4. Comparison of Modern Methods for Analyzing Repeated Measures Data with Missing Values

    ERIC Educational Resources Information Center

    Vallejo, G.; Fernandez, M. P.; Livacic-Rojas, P. E.; Tuero-Herrero, E.

    2011-01-01

    Missing data are a pervasive problem in many psychological applications in the real world. In this article we study the impact of dropout on the operational characteristics of several approaches that can be easily implemented with commercially available software. These approaches include the covariance pattern model based on an unstructured…

  5. Solutions for Missing Data in Structural Equation Modeling

    ERIC Educational Resources Information Center

    Carter, Rufus Lynn

    2006-01-01

    Many times in both educational and social science research it is impossible to collect data that is complete. When administering a survey, for example, people may answer some questions and not others. This missing data causes a problem for researchers using structural equation modeling (SEM) techniques for data analyses. Because SEM and…

  6. Use of Missing Data Methods in Longitudinal Studies: The Persistence of Bad Practices in Developmental Psychology

    ERIC Educational Resources Information Center

    Jelicic, Helena; Phelps, Erin; Lerner, Richard M.

    2009-01-01

    Developmental science rests on describing, explaining, and optimizing intraindividual changes and, hence, empirically requires longitudinal research. Problems of missing data arise in most longitudinal studies, thus creating challenges for interpreting the substance and structure of intraindividual change. Using a sample of reports of longitudinal…

  7. Principal Component Analysis with Incomplete Data: A Simulation of the R pcaMethods Package in Constructing an Environmental Quality Index with Missing Data

    EPA Science Inventory

    Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
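As a rough illustration of the point, the sketch below mean-imputes the missing entries of a toy two-variable data set and then extracts the leading principal component by power iteration on the covariance matrix. It is not the R pcaMethods package (which provides more principled methods such as probabilistic PCA); the data and helper names are made up for the example.

```python
# Toy data: two correlated variables, one missing entry (None)
data = [[2.0, 1.9], [1.0, 1.1], [3.0, None], [4.0, 4.2], [0.0, 0.1]]

def mean_impute(rows):
    """Replace each None by its column mean -- the simplest imputation."""
    cols = list(zip(*rows))
    means = [sum(v for v in c if v is not None) /
             sum(v is not None for v in c) for c in cols]
    return [[v if v is not None else means[j] for j, v in enumerate(r)]
            for r in rows]

def leading_component(rows, iters=200):
    """Leading principal component via power iteration on the covariance."""
    n, d = len(rows), len(rows[0])
    means = [sum(c) / n for c in zip(*rows)]
    x = [[v - m for v, m in zip(r, means)] for r in rows]
    cov = [[sum(x[k][i] * x[k][j] for k in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    w = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * w[j] for j in range(d)) for i in range(d)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
    return w

pc1 = leading_component(mean_impute(data))
```

Since the two toy variables are strongly correlated, the recovered first component loads roughly equally on both; with heavier missingness, mean imputation increasingly distorts that direction, which is exactly the concern the abstract raises.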

  8. Least-Squares Approximation of an Improper Correlation Matrix by a Proper One.

    ERIC Educational Resources Information Center

    Knol, Dirk L.; ten Berge, Jos M. F.

    1989-01-01

    An algorithm, based on a solution for C. I. Mosier's oblique Procrustes rotation problem, is presented for the best least-squares fitting correlation matrix approximating a given missing value or improper correlation matrix. Results are of interest for missing value and tetrachoric correlation, indefinite matrix correlation, and constrained…

  9. Growth Problems

    MedlinePlus

    ... hormones needed to grow and develop. For example, Turner syndrome is a genetic condition (due to a problem ... a missing or abnormal X chromosome. Girls with Turner syndrome tend to be short and don't usually ...

  10. Reconstructing (super)trees from data sets with missing distances: not all is lost.

    PubMed

    Kettleborough, George; Dicks, Jo; Roberts, Ian N; Huber, Katharina T

    2015-06-01

    The wealth of phylogenetic information accumulated over many decades of biological research, coupled with recent technological advances in molecular sequence generation, presents significant opportunities for researchers to investigate relationships across and within the kingdoms of life. However, to make best use of this data wealth, several problems must first be overcome. One key problem is finding effective strategies to deal with missing data. Here, we introduce Lasso, a novel heuristic approach for reconstructing rooted phylogenetic trees from distance matrices with missing values, for data sets where a molecular clock may be assumed. Contrary to other phylogenetic methods on partial data sets, Lasso possesses desirable properties such as its reconstructed trees being both unique and edge-weighted. These properties are achieved by Lasso restricting its leaf set to a large subset of all possible taxa, which in many practical situations is the entire taxa set. Furthermore, the Lasso approach is distance-based, rendering it very fast to run and suitable for data sets of all sizes, including large data sets such as those generated by modern Next Generation Sequencing technologies. To better understand the performance of Lasso, we assessed it by means of artificial and real biological data sets, showing its effectiveness in the presence of missing data. Furthermore, by formulating the supermatrix problem as a particular case of the missing data problem, we assessed Lasso's ability to reconstruct supertrees. We demonstrate that, although not specifically designed for such a purpose, Lasso performs better than or comparably with five leading supertree algorithms on a challenging biological data set. Finally, we make freely available a software implementation of Lasso so that researchers may, for the first time, perform both rooted tree and supertree reconstruction with branch lengths on their own partial data sets. © The Author 2015. 
Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
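The abstract's key idea, restricting the leaf set to a large subset of taxa on which the needed pairwise distances are all known, can be illustrated with a toy greedy heuristic. This is only a sketch of the general idea, not the Lasso algorithm itself; the distance data and the drop-worst-taxon rule are invented for the example.

```python
# Partial distance data keyed by unordered taxon pairs; None means missing.
D = {
    frozenset({"A", "B"}): 2.0,
    frozenset({"A", "C"}): 4.0,
    frozenset({"B", "C"}): 4.0,
    frozenset({"A", "D"}): None,   # missing
    frozenset({"B", "D"}): 6.0,
    frozenset({"C", "D"}): None,   # missing
}

def complete_subset(taxa, dist):
    """Greedily drop the taxon with the most missing distances until
    every remaining pair has a known distance."""
    taxa = set(taxa)
    def missing_count(t):
        return sum(1 for u in taxa
                   if u != t and dist.get(frozenset({t, u})) is None)
    while any(missing_count(t) for t in taxa):
        taxa.remove(max(taxa, key=missing_count))
    return taxa

kept = complete_subset({"A", "B", "C", "D"}, D)
```

On the toy matrix the heuristic drops taxon "D" (two missing distances) and keeps a fully specified triple on which a standard distance method could then build an edge-weighted rooted tree.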

  11. Research to Assembly Scheme for Satellite Deck Based on Robot Flexibility Control Principle

    NASA Astrophysics Data System (ADS)

    Guo, Tao; Hu, Ruiqin; Xiao, Zhengyi; Zhao, Jingjing; Fang, Zhikai

    2018-03-01

    Deck assembly is a critical quality-control point in the final satellite assembly process, and the cable extrusion and structural collision problems that arise during assembly directly affect the development quality and schedule of the satellite. To address these problems, an assembly scheme for the satellite deck based on the robot flexibility (compliance) control principle is proposed in this paper. The scheme is introduced first; second, its key technologies of end-effector force perception and flexible docking control are studied; then the implementation process of the assembly scheme is described in detail; finally, an actual application case is given. The results show that, compared with the traditional assembly scheme, the proposed scheme has obvious advantages in work efficiency, reliability, universality, and other aspects.

  12. Determination of the number of navigation satellites within satellite acquisition range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurenkov, Vladimir I., E-mail: kvi.48@mail.ru, E-mail: ask@ssau.ru; Kucherov, Alexander S., E-mail: kvi.48@mail.ru, E-mail: ask@ssau.ru; Gordeev, Alexey I., E-mail: exactoone@yahoo.com

    2014-12-10

    The paper considers the problem of determining the number of navigation satellites within acquisition range, taking into account the antenna system configuration and the stochastic maneuvers of a land remote sensing satellite. The distribution function and the density function of the number of navigation satellites within acquisition range are obtained.
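A Monte Carlo sketch of the quantity the paper studies, the distribution of the number of navigation satellites within acquisition range, might look as follows. All numbers (orbit radii, constellation size, mask angle) and the pole-based visibility geometry are simplifying assumptions for illustration; the paper's treatment additionally models the antenna configuration and stochastic maneuvers.

```python
import math
import random

random.seed(3)

# Receiver at the north pole; satellites uniform on a spherical shell.
RE, RS = 6371.0, 26560.0          # Earth radius, GPS-like orbit radius [km]
N_SATS, MASK_DEG, TRIALS = 24, 5.0, 2000

def visible_count():
    """Count satellites whose elevation above the local horizon
    exceeds the mask angle."""
    count = 0
    for _ in range(N_SATS):
        cos_th = random.uniform(-1.0, 1.0)   # uniform polar angle on the sphere
        rng = math.sqrt(RS**2 + RE**2 - 2.0 * RS * RE * cos_th)
        sin_el = max(-1.0, min(1.0, (RS * cos_th - RE) / rng))
        if math.degrees(math.asin(sin_el)) > MASK_DEG:
            count += 1
    return count

# Empirical probability mass function of the number of visible satellites
pmf = {}
for _ in range(TRIALS):
    k = visible_count()
    pmf[k] = pmf.get(k, 0) + 1.0 / TRIALS

mean_visible = sum(k * p for k, p in pmf.items())
```

The dictionary `pmf` is the empirical counterpart of the density function the paper derives, and its cumulative sums give the distribution function.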

  13. Mind the Gap: The Prospects of Missing Data.

    PubMed

    McConnell, Meghan; Sherbino, Jonathan; Chan, Teresa M

    2016-12-01

    The increasing use of workplace-based assessments (WBAs) in competency-based medical education has led to large data sets that assess resident performance longitudinally. With large data sets, problems that arise from missing data are increasingly likely. The purpose of this study is to examine (1) whether data are missing at random across various WBAs, and (2) the relationship between resident performance and the proportion of missing data. During 2012-2013, a total of 844 WBAs of CanMEDS Roles were completed for 9 second-year emergency medicine residents. To identify whether missing data were randomly distributed across various WBAs, the total number of missing data points was calculated for each Role. To examine whether the amount of missing data was related to resident performance, 5 faculty members rank-ordered the residents based on performance. A median rank score was calculated for each resident and was correlated with the proportion of missing data. More data were missing for Health Advocate and Professional WBAs relative to other competencies (P < .001). Furthermore, resident rankings were not related to the proportion of missing data points (r = 0.29, P > .05). The results of the present study illustrate that some CanMEDS Roles are less likely to be assessed than others. At the same time, the amount of missing data did not correlate with resident performance, suggesting that lower-performing residents are no more likely to have missing data than their higher-performing peers. This article discusses several approaches to dealing with missing data.
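One common way to check point (1), whether missing data are spread evenly across assessment categories, is a chi-square goodness-of-fit test against a uniform expectation. The counts below are hypothetical, and the hard-coded critical value is for df = 6 at α = .001; this is a sketch of the kind of test implied by the reported P < .001 result, not the authors' analysis.

```python
# Hypothetical counts of missing data points per CanMEDS Role
missing = {"Medical Expert": 12, "Communicator": 15, "Collaborator": 14,
           "Leader": 13, "Health Advocate": 41, "Scholar": 16,
           "Professional": 38}

def chi_square_uniform(counts):
    """Goodness-of-fit statistic against a uniform expectation,
    i.e. data missing completely at random across categories."""
    values = list(counts.values())
    expected = sum(values) / len(values)
    return sum((o - expected) ** 2 / expected for o in values)

stat = chi_square_uniform(missing)
# chi-square critical value for df = 6 at alpha = .001 is about 22.46
uniform_rejected = stat > 22.46
```

With the two inflated Roles in the toy counts the statistic far exceeds the critical value, mirroring the paper's finding that Health Advocate and Professional assessments go missing disproportionately often.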

  14. Constructing a Coherent Problem Model to Facilitate Algebra Problem Solving in a Chemistry Context

    ERIC Educational Resources Information Center

    Ngu, Bing Hiong; Yeung, Alexander Seeshing; Phan, Huy P.

    2015-01-01

    An experiment using a sample of 11th graders compared text editing and worked examples approaches in learning to solve dilution and molarity algebra word problems in a chemistry context. Text editing requires students to assess the structure of a word problem by specifying whether the problem text contains sufficient, missing, or irrelevant…

  15. "Asia's missing women" as a problem in applied evolutionary psychology?

    PubMed

    Brooks, Robert

    2012-12-20

    In many parts of Asia, the Middle East and North Africa, women and children are so undervalued, neglected, abused, and so often killed, that sex ratios are now strongly male biased. In recent decades, sex-biased abortion has exacerbated the problem. In this article I highlight several important insights from evolutionary biology into both the origin and the severe societal consequences of "Asia's missing women", paying particular attention to interactions between evolution, economics and culture. Son preferences and associated cultural practices like patrilineal inheritance, patrilocality and the Indian Hindu dowry system arise among the wealthy and powerful elites for reasons consistent with models of sex-biased parental investment. Those practices then spread via imitation as technology gets cheaper and economic development allows the middle class to grow rapidly. I will consider evidence from India, China and elsewhere that grossly male-biased sex ratios lead to increased crime, violence, local warfare, political instability, drug abuse, prostitution and trafficking of women. The problem of Asia's missing women presents a challenge for applied evolutionary psychology to help us understand and ameliorate sex ratio biases and their most severe consequences.

  16. Sampled-data H∞ filtering for Markovian jump singularly perturbed systems with time-varying delay and missing measurements

    NASA Astrophysics Data System (ADS)

    Yan, Yifang; Yang, Chunyu; Ma, Xiaoping; Zhou, Linna

    2018-02-01

    In this paper, the sampled-data H∞ filtering problem is considered for Markovian jump singularly perturbed systems with time-varying delay and missing measurements. The sampled-data system is represented by a time-delay system, and the missing-measurement phenomenon is described by an independent Bernoulli random process. By constructing an ɛ-dependent stochastic Lyapunov-Krasovskii functional, delay-dependent sufficient conditions are derived such that the filter error system satisfies the prescribed H∞ performance for all possible missing measurements. An H∞ filter design method is then proposed in terms of linear matrix inequalities. Finally, numerical examples are given to illustrate the feasibility and advantages of the obtained results.
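The Bernoulli missing-measurement model used here is easy to simulate: each measurement arrives with probability p and is otherwise lost, and the estimator falls back on pure prediction whenever a sample is missing. The scalar system, the fixed-gain update, and all numbers below are illustrative assumptions, not the paper's H∞ filter.

```python
import random

random.seed(7)

# Scalar system x_{k+1} = a*x_k + w_k, measured as y_k = x_k + v_k,
# but y_k only arrives when gamma_k ~ Bernoulli(p) equals 1.
a, p, steps = 0.9, 0.8, 200

x, est = 1.0, 1.0
errors = []
for _ in range(steps):
    x = a * x + random.gauss(0.0, 0.05)          # process noise
    if random.random() < p:                      # measurement received
        y = x + random.gauss(0.0, 0.1)           # measurement noise
        est = 0.5 * (a * est) + 0.5 * y          # fixed-gain correction
    else:                                        # measurement missing
        est = a * est                            # predict only
    errors.append(abs(x - est))

mean_err = sum(errors) / len(errors)
```

The H∞ design in the paper replaces the ad hoc gain of 0.5 with one computed from linear matrix inequalities so that the error system's disturbance attenuation is guaranteed for every admissible pattern of dropped measurements.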

  17. The problems of formation and conservation of the green frame (green framework) of the satellite city (on the example of Zelenodolsk)

    NASA Astrophysics Data System (ADS)

    Zakirova, J.; Khusnutdinova, S.

    2018-01-01

    The article studies the problems of forming and conserving the green frame (green framework) of a satellite city in a monocentric urban agglomeration. Nowadays green spaces fulfill not only ecological but also social and economic functions, which is especially important for mono-industrial and satellite cities. Zelenodolsk, a satellite city of the Kazan agglomeration, has significant natural, geographical, and industrial assets. The article shows the possibilities for forming the green frame of the city and for its socio-economic use.

  18. [Theme Issue: Communications Satellites]

    ERIC Educational Resources Information Center

    Howkins, John, Ed.

    1976-01-01

    One section of this journal is devoted to issues involving broadcast satellites. Separate articles discuss the need for international planning of satellite broadcasting, decisions made at the 1971 World Administrative Radio Conference for Space Telecommunications, potential problems in satellite broadcasting, a series of proposals drawn up by the…

  19. The Eccentric Satellites Problem: Comparing Milky Way Satellite Orbital Properties to Simulation Results

    NASA Astrophysics Data System (ADS)

    Haji, Umran; Pryor, Carlton; Applebaum, Elaad; Brooks, Alyson

    2018-01-01

    We compare the orbital properties of the satellite galaxies of the Milky Way to those of satellites found in simulated Milky Way-like systems as a means of testing cosmological simulations of galaxy formation. The particular problem that we are investigating is a discrepancy in the distribution of orbital eccentricities. Previous studies of Milky Way-mass systems analyzed in a semi-analytic ΛCDM cosmological model have found that the satellites tend to have significantly larger fractions of their kinetic energy invested in radial motion with respect to their central galaxy than do the real-world Milky Way satellites. We analyze several high-resolution ("zoom-in") hydrodynamical simulations of Milky Way-mass galaxies and their associated satellite systems to investigate why previous works found Milky Way-like systems to be rare. We find a possible relationship between a quiescent galactic assembly history and a distribution of satellite kinematics resembling that of the Milky Way. This project has been supported by funding from National Science Foundation grant PHY-1560077.

  20. Limb Loss

    MedlinePlus

    ... in amputation. Injuries, including from traffic accidents and military combat Cancer Birth defects Some amputees have phantom pain, which is the feeling of pain in the missing limb. Other physical problems include surgical complications and skin problems, if you ...

  1. Aerosols increase upper tropospheric humidity over the North Western Pacific

    NASA Astrophysics Data System (ADS)

    Riuttanen, Laura; Bister, Marja; John, Viju; Sundström, Anu-Maija; Dal Maso, Miikka; Räisänen, Jouni; de Leeuw, Gerrit; Kulmala, Markku

    2014-05-01

    Water vapour in the upper troposphere is highly important for global radiative transfer. The source of upper tropospheric humidity is deep convection, and aerosol effects on it have received attention only recently; for example, aerosol effects on deep convective clouds have been missing from general circulation models (Quaas et al., 2009). In deep convection, the aerosol effect on cloud microphysics may lead to more ice precipitation and less warm rain (Khain et al., 2005), and thus more water vapour in the upper troposphere (Bister & Kulmala, 2011). The China outflow region over the Pacific Ocean was chosen for a more detailed study, with latitudes 25-45 N and three longitude slots: 120-149 E, 150-179 E and 150-179 W. In this study, we used satellite measurements of aerosol optical depth (AOD) and upper tropospheric humidity (UTH). AOD was obtained from the MODIS instrument onboard the Terra satellite, which crosses the equator southward at 10:30 AM local solar time (Remer et al., 2005). UTH was obtained from a microwave humidity sounder (MHS) onboard the MetOp-A satellite, with an overpass at 9:30 PM local solar time; it measures the relative humidity of a layer extending approximately from 500 to 200 hPa. We binned the AOD and UTH data according to the daily rainfall product 3B42 from the Tropical Rainfall Measuring Mission (TRMM) satellite. Binning the data according to the amount of precipitation gives us a new way to account for the possible aerosol invigoration effect on convection and to alleviate the contamination and causality problems in aerosol indirect effect studies. In this study we show for the first time, based on satellite data, that there is a connection between upper tropospheric humidity and aerosols: anthropogenic aerosols from China increase upper tropospheric humidity, which causes a significant positive local radiative forcing in the libRadtran radiative transfer model (Mayer & Kylling, 2005). References: Bister, M. & Kulmala, M. (2011). Atmos. Chem. Phys., 11, 4577-4586. Khain, A., Rosenfeld, D. & Pokrovsky, A. (2005). Q. J. R. Meteorol. Soc., 131, 2639-2663. Mayer, B. & Kylling, A. (2005). Atmos. Chem. Phys., 5, 1855-1877. Remer, L. A. et al. (2005). J. Atmos. Sci., 62, 947-973. Quaas, J. et al. (2009). Atmos. Chem. Phys., 9, 8697-8717.

  2. H∞ filtering for discrete-time systems subject to stochastic missing measurements: a decomposition approach

    NASA Astrophysics Data System (ADS)

    Gu, Zhou; Fei, Shumin; Yue, Dong; Tian, Engang

    2014-07-01

    This paper deals with the problem of H∞ filtering for discrete-time systems with stochastic missing measurements. A new missing-measurement model is developed by decomposing the interval of the missing rate into several segments, with the probability of the missing rate in each subsegment governed by a corresponding random variable. We aim to design a linear full-order filter such that the estimation error converges to zero exponentially in the mean square with reduced conservatism, while the disturbance rejection attenuation is constrained to a given level by means of an H∞ performance index. Based on Lyapunov theory, the filter parameters are characterised in terms of the feasibility of a set of linear matrix inequalities. Finally, a numerical example is provided to demonstrate the effectiveness and applicability of the proposed design approach.

  3. The effectiveness of supermarket posters in helping to find missing children.

    PubMed

    Lampinen, James Michael; Arnal, Jack; Hicks, Jason L

    2009-03-01

    One approach used to help find missing children is to place posters of them at the exits of supermarkets. The present research addresses the question of how effective that approach is likely to be. Posters of 8 missing children were displayed on a bulletin board at a cooperating grocery store. Customers leaving the store completed a survey and took a recognition memory test for the children. Most customers thought the problem of missing children was an important issue. However, the majority of customers also reported either not looking at the posters or only briefly looking at the posters. Recognition memory for children depicted in the posters did not reliably differ from chance. It appears that there is much room for improvement when it comes to increasing the attention paid to posters meant to help find missing children.

  4. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    NASA Astrophysics Data System (ADS)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    Nowadays, observing the ground from satellites in space is a major method of obtaining ground information. With the development of space technology, fields such as the military and the economy place ever greater demands on it, because satellites offer wide coverage, timeliness, and freedom from area and national boundaries. At the same time, the wide use of satellites, sensors, relay satellites, and ground receiving stations confronts ground control systems with great challenges, so making the best use of satellite resources has become an important problem for ground control systems. Satellite scheduling distributes resources to all tasks without conflict, so as to complete as many tasks as possible and meet users' requirements while respecting the constraints of the satellites, sensors, and ground receiving stations. By the size of the observed area, tasks can be divided into point tasks and area tasks; this paper considers only point targets. The paper first describes the satellite scheduling problem, briefly introduces the theory of satellite scheduling, analyzes the resource and task constraints involved, and outlines the input and output flow of the scheduling process. On this basis, we put forward a scheduling model, a multi-variable optimization model for multi-satellite, point-target tasks in swinging mode, in which the scheduling problem is transformed into a parametric optimization problem; the parameter to be optimized is the swinging angle of every time window.
    With a view to efficiency and accuracy, several important aspects of satellite scheduling, such as the angular relation between satellites and ground targets, positive and negative swinging angles, and the computation of time windows, are analyzed and discussed, and several strategies to improve the efficiency of the model are put forward. To solve the model, we introduce the concept of an activity sequence map, with which the choice of an activity and its start time can be separated, and three neighborhood operators for searching the solution space; the front-movement and back-movement remaining times are used to analyze the feasibility of generating solutions from these operators. An algorithm based on a genetic algorithm is then put forward to solve the problem and the model: population initialization, a crossover operator, a mutation operator, individual evaluation, a collision-decrease operator, a selection operator, and a collision-elimination operator are designed in the paper. Finally, the scheduling result and a simulation of a practical example with 5 satellites and 100 point targets in swinging mode are given, and the scheduling performance is analyzed for swinging angles of 0, 5, 10, 15, and 25 degrees. The results show that the model and the algorithm are more effective than those without swinging mode.
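A heavily simplified version of the genetic-algorithm idea, a chromosome of swing angles (one per time window) scored by how many point targets the plan observes, can be sketched as follows. The coverage table, fitness, and operators are toy stand-ins; the paper's algorithm additionally uses activity sequence maps, collision-decrease and collision-elimination operators, and real visibility computation.

```python
import random

random.seed(1)

# Toy instance: 10 time windows; in each window the satellite may swing
# to one of five angles, and each (window, angle) pair sees a fixed set
# of point targets (random stand-ins for real visibility geometry).
ANGLES = [0, 5, 10, 15, 25]
COVERAGE = {(w, a): set(random.sample(range(100), 4))
            for w in range(10) for a in ANGLES}

def fitness(plan):
    """Number of distinct point targets observed by a swing plan."""
    seen = set()
    for w, a in enumerate(plan):
        seen |= COVERAGE[(w, a)]
    return len(seen)

def genetic_schedule(pop_size=30, generations=80):
    pop = [[random.choice(ANGLES) for _ in range(10)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # evaluation + elitist selection
        pop = pop[:pop_size // 2]
        while len(pop) < pop_size:
            p1, p2 = random.sample(pop[:5], 2)
            cut = random.randrange(1, 10)
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if random.random() < 0.2:         # mutation
                child[random.randrange(10)] = random.choice(ANGLES)
            pop.append(child)
    return max(pop, key=fitness)

best_plan = genetic_schedule()
```

Allowing a larger set of swing angles enlarges the search space but also the reachable coverage, which is the trade-off the paper's 0-25 degree sweep quantifies.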

  5. VITMO - A Powerful Tool to Improve Discovery in the Magnetospheric and Ionosphere-Thermosphere Domains

    NASA Astrophysics Data System (ADS)

    Schaefer, R. K.; Morrison, D.; Potter, M.; Stephens, G.; Barnes, R. J.; Talaat, E. R.; Sarris, T.

    2017-12-01

    With the advent of the NASA Magnetospheric Multiscale Mission and the Van Allen Probes we have space missions that probe the Earth's magnetosphere and radiation belts. These missions fly at far distances from the Earth in contrast to the larger number of near-Earth satellites. Both of the satellites make in situ measurements. Energetic particles flow along magnetic field lines from these measurement locations down to the ionosphere/thermosphere region. Discovering other data that may be used with these satellites is a difficult and complicated process. To solve this problem, we have developed a series of light-weight web services that can provide a new data search capability for the Virtual Ionosphere Thermosphere Mesosphere Observatory (VITMO). The services consist of a database of spacecraft ephemerides and instrument fields of view; an overlap calculator to find times when the fields of view of different instruments intersect; and a magnetic field line tracing service that maps in situ and ground based measurements for a number of magnetic field models and geophysical conditions. These services run in real-time when the user queries for data and allow the non-specialist user to select data that they were previously unable to locate, opening up analysis opportunities beyond the instrument teams and specialists, making it easier for future students who come into the field. Each service on their own provides a useful new capability for virtual observatories; operating together they provide a powerful new search tool. The ephemerides service was built using the Navigation and Ancillary Information Facility (NAIF) SPICE toolkit (http://naif.jpl.nasa.gov/naif/index.html) allowing them to be extended to support any Earth orbiting satellite with the addition of the appropriate SPICE kernels. The overlap calculator uses techniques borrowed from computer graphics to identify overlapping measurements in space and time. 
The calculator will allow a user defined uncertainty to be selected to allow "near misses" to be found. The magnetic field tracing service will feature a database of pre-calculated field line tracings of ground stations but will also allow dynamic tracing of arbitrary coordinates.

  6. WHERE ARE THE FOSSILS OF THE FIRST GALAXIES? II. TRUE FOSSILS, GHOST HALOS, AND THE MISSING BRIGHT SATELLITES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bovill, Mia S.; Ricotti, Massimo, E-mail: msbovill@astro.umd.edu

    We use a new set of cold dark matter simulations of the local universe to investigate the distribution of fossils of primordial dwarf galaxies within and around the Milky Way. Throughout, we build upon previous results showing agreement between the observed stellar properties of a subset of the ultra-faint dwarfs and our simulated fossils. Here, we show that fossils of the first galaxies have galactocentric distributions and cumulative luminosity functions consistent with observations. In our model, we predict ~300 luminous satellites orbiting the Milky Way, 50%-70% of which are well-preserved fossils. Within the Milky Way virial radius, the majority of these fossils have luminosities L_V < 10^6 L_sun. Despite our multidimensional agreement with observations at low masses and luminosities, the primordial model produces an overabundance of bright dwarf satellites (L_V > 10^4 L_sun) with respect to observations where observations are nearly complete. The 'bright satellite problem' is most evident in the outer parts of the Milky Way. We estimate that, although relatively bright, the primordial stellar populations are very diffuse, producing a population with surface brightnesses below surveys' detection limits, and are easily stripped by tidal forces.
    Although we cannot yet present unmistakable evidence for the existence of the fossils of first galaxies in the Local Group, the results of our studies suggest observational strategies that may demonstrate their existence: (1) the detection of 'ghost halos' of primordial stars around isolated dwarfs would prove that stars formed in minihalos (M < 10^8 M_sun) before reionization and strongly suggest that at least a fraction of the ultra-faint dwarfs are fossils of the first galaxies; and (2) the existence of a yet unknown population of ~150 Milky Way ultra-faints with half-light radii r_hl ~ 100-1000 pc and luminosities L_V < 10^4 L_sun, detectable by future deep surveys. These undetected dwarfs would have the mass-to-light ratios, stellar velocity dispersions, and metallicities predicted in this work.

  7. Improving Discoverability Between the Magnetosphere and Ionosphere/Thermosphere Domains

    NASA Astrophysics Data System (ADS)

    Schaefer, R. K.; Morrison, D.; Potter, M.; Barnes, R. J.; Talaat, E. R.; Sarris, T.

    2016-12-01

    With the advent of the NASA Magnetospheric Multiscale Mission and the Van Allen Probes we have space missions that probe the Earth's magnetosphere and radiation belts. These missions fly at far distances from the Earth in contrast to the larger number of near-Earth satellites. Both of the satellites make in situ measurements. Energetic particles flow along magnetic field lines from these measurement locations down to the ionosphere/thermosphere region. Discovering other data that may be used with these satellites is a difficult and complicated process. To solve this problem we have developed a series of light-weight web services that can provide a new data search capability for the Virtual Ionosphere Thermosphere Mesosphere Observatory (VITMO). The services consist of a database of spacecraft ephemerides and instrument fields of view; an overlap calculator to find times when the fields of view of different instruments intersect; and a magnetic field line tracing service that maps in situ and ground based measurements for a number of magnetic field models and geophysical conditions. These services run in real-time when the user queries for data and allow the non-specialist user to select data that they were previously unable to locate, opening up analysis opportunities beyond the instrument teams and specialists. Each service on their own provides a useful new capability for virtual observatories; operating together they will provide a powerful new search tool. The ephemerides service is being built using the Navigation and Ancillary Information Facility (NAIF) SPICE toolkit (http://naif.jpl.nasa.gov) allowing them to be extended to support any Earth orbiting satellite with the addition of the appropriate SPICE kernels. The overlap calculator uses techniques borrowed from computer graphics to identify overlapping measurements in space and time. The calculator will allow a user defined uncertainty to be selected to allow "near misses" to be found. 
The magnetic field tracing service will feature a database of pre-calculated field line tracings of ground stations but will also allow dynamic tracing of arbitrary coordinates with a user selected choice of magnetic field models.

  8. A collinearity diagnosis of the GNSS geocenter determination

    NASA Astrophysics Data System (ADS)

    Rebischung, Paul; Altamimi, Zuheir; Springer, Tim

    2014-01-01

    The problem of observing geocenter motion from global navigation satellite system (GNSS) solutions through the network shift approach is addressed from the perspective of collinearity (or multicollinearity) among the parameters of a least-squares regression. A collinearity diagnosis, based on the notion of variance inflation factor, is therefore developed and allows handling several peculiarities of the GNSS geocenter determination problem. Its application reveals that the determination of all three components of geocenter motion with GNSS suffers from serious collinearity issues, with a comparable level as in the problem of determining the terrestrial scale simultaneously with the GNSS satellite phase center offsets. The inability of current GNSS, as opposed to satellite laser ranging, to properly sense geocenter motion is mostly explained by the estimation, in the GNSS case, of epoch-wise station and satellite clock offsets simultaneously with tropospheric parameters. The empirical satellite accelerations, as estimated by most Analysis Centers of the International GNSS Service, slightly amplify the collinearity of the geocenter coordinate, but their role remains secondary.
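The variance inflation factor underlying the paper's collinearity diagnosis has a simple definition: regress one parameter (column) on all the others and set VIF = 1/(1 − R²). The sketch below computes it from scratch for a made-up design matrix with one nearly collinear pair; it is a generic VIF calculation, not the paper's GNSS-specific diagnosis.

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def vif(columns, j):
    """VIF of column j: regress it (with intercept) on the other
    columns via the normal equations and return 1 / (1 - R^2)."""
    y = columns[j]
    n = len(y)
    X = [[1.0] + [col[k] for i, col in enumerate(columns) if i != j]
         for k in range(n)]
    p = len(X[0])
    XtX = [[sum(X[k][a] * X[k][b] for k in range(n)) for b in range(p)]
           for a in range(p)]
    Xty = [sum(X[k][a] * y[k] for k in range(n)) for a in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * x for b, x in zip(beta, row)) for row in X]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 / (1.0 - r2)

# Hypothetical columns: the second is nearly collinear with the first.
cols = [[1.0, 2.0, 3.0, 4.0, 5.0],
        [1.1, 1.9, 3.2, 3.9, 5.1],
        [5.0, 3.0, 4.0, 1.0, 2.0]]
vifs = [vif(cols, j) for j in range(3)]
```

The near-duplicate pair receives very large VIFs, which is the signature the paper looks for in the geocenter, clock, and troposphere parameters of the GNSS normal equations.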

  9. Global design of satellite constellations: a multi-criteria performance comparison of classical walker patterns and new design patterns

    NASA Astrophysics Data System (ADS)

    Lansard, Erick; Frayssinhes, Eric; Palmade, Jean-Luc

    Basically, the problem of designing a multisatellite constellation exhibits many parameters with many possible combinations: the total number of satellites, the orbital parameters of each individual satellite, the number of orbital planes, the number of satellites in each plane, the spacing between satellites in each plane, the spacing between orbital planes, and the relative phasing between consecutive orbital planes. Fortunately, some authors have theoretically solved this complex problem under simplified assumptions: the permanent (or continuous) coverage of the whole Earth and of zonal areas by single and multiple satellites has been entirely solved from a purely geometrical point of view. These solutions exhibit strong symmetry properties (e.g. the Walker, Ballard, Rider, and Draim constellations): altitude and inclination are identical, orbital planes and satellites are regularly spaced, etc. The problem with such constellations is their oversimplified and restrictive geometrical assumptions. In fact, the evaluation function used implicitly takes into account only the point-to-point visibility between users and satellites, and does not deal with very important constraints and considerations that become mandatory when designing a real satellite system (e.g. robustness to satellite failures, total system cost, common view between satellites and ground stations, service availability and satellite reliability, the launch and early operations phase, production constraints, etc.). An original and global methodology relying on a powerful optimization tool based on genetic algorithms has been developed at ALCATEL ESPACE. In this approach, symmetrical constellations can be used as initial conditions of the optimization process together with specific evaluation functions. A multi-criteria performance analysis is conducted and presented here in a parametric way in order to identify and evaluate the main sensitive parameters.
Quantitative results are given for three examples in the fields of navigation, telecommunication and multimedia satellite systems. In particular, a new design pattern with very efficient properties in terms of robustness to satellite failures is presented and compared with classical Walker patterns.
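
    The genetic-algorithm approach described above can be illustrated with a deliberately tiny toy problem (this is a sketch, not the ALCATEL ESPACE tool: the fitness function, population scheme, and all parameter values here are invented for illustration). We evolve the in-plane phase angles of satellites on a single ring so as to minimize the worst angular coverage gap; the known optimum is uniform spacing.

```python
import numpy as np

def max_gap(phases):
    """Largest angular gap (rad) between consecutive satellites on a ring."""
    s = np.sort(phases % (2 * np.pi))
    gaps = np.diff(np.concatenate([s, [s[0] + 2 * np.pi]]))
    return gaps.max()

def evolve_spacing(n_sats=8, pop_size=40, generations=300, sigma=0.05, seed=0):
    """Toy genetic optimization: evolve in-plane phase angles so that the
    worst coverage gap on the ring shrinks (optimum: uniform spacing)."""
    rng = np.random.default_rng(seed)
    population = rng.uniform(0, 2 * np.pi, size=(pop_size, n_sats))
    for _ in range(generations):
        fitness = -np.array([max_gap(ind) for ind in population])
        # Elitist selection: keep the better half, refill with mutated copies.
        survivors = population[np.argsort(fitness)[pop_size // 2:]]
        children = survivors + rng.normal(0, sigma, size=survivors.shape)
        population = np.vstack([survivors, children])
    return min(population, key=max_gap)

best = evolve_spacing()
```

    A real constellation-design fitness would replace `max_gap` with the multi-criteria evaluation the abstract mentions (failure robustness, cost, ground-station visibility, etc.); the GA machinery itself is unchanged.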

  10. Covariance Structure Model Fit Testing under Missing Data: An Application of the Supplemented EM Algorithm

    ERIC Educational Resources Information Center

    Cai, Li; Lee, Taehun

    2009-01-01

    We apply the Supplemented EM algorithm (Meng & Rubin, 1991) to address a chronic problem with the "two-stage" fitting of covariance structure models in the presence of ignorable missing data: the lack of an asymptotically chi-square distributed goodness-of-fit statistic. We show that the Supplemented EM algorithm provides a…

  11. The case of the missing third.

    PubMed

    Robertson, Robin

    2005-01-01

    How is it that form arises out of chaos? In attempting to deal with this primary question, time and again a "Missing Third" is posited that lies between extremes. The problem of the "Missing Third" can be traced through nearly the entire history of thought. The form it takes, the problems that arise from it, the solutions suggested for resolving it, are each representative of an age. This paper traces the issue from Plato and Parmenides in the 4th--5th centuries, B.C.; to Neoplatonism in the 3rd--5th centuries; to Locke and Descartes in the 17th century; on to Berkeley and Kant in the 18th century; Fechner and Wundt in the 19th century; to behaviorism and Gestalt psychology, Jung, early in the 20th century, ethology and cybernetics later in the 20th century, then culminates late in the 20th century, with chaos theory.

  12. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. Particularly, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
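
    The MI-then-ROC workflow the abstract studies can be sketched as follows (a minimal illustration on simulated data, assuming an MCAR mechanism and a simple per-class normal imputation model; the paper's actual parametric and non-parametric imputation techniques are more elaborate). The AUC is computed nonparametrically via the Mann-Whitney statistic and pooled by averaging across imputations.

```python
import numpy as np

def auc(scores, labels):
    """Nonparametric AUC via the Mann-Whitney statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(1)
n = 200
labels = np.repeat([0, 1], n)
# Biomarker: healthy ~ N(0,1), diseased ~ N(1,1); true AUC is about 0.76.
marker = np.concatenate([rng.normal(0, 1, n), rng.normal(1, 1, n)])

# Make 25% of the marker values missing completely at random (MCAR).
missing = rng.random(2 * n) < 0.25
observed = np.where(missing, np.nan, marker)

# Parametric multiple imputation: draw each missing value from a normal
# model fitted per class to the observed data, then pool AUC estimates.
M = 20
aucs = []
for _ in range(M):
    imp = observed.copy()
    for c in (0, 1):
        obs = observed[(labels == c) & ~missing]
        fill = (labels == c) & missing
        imp[fill] = rng.normal(obs.mean(), obs.std(ddof=1), fill.sum())
    aucs.append(auc(imp, labels))
pooled_auc = float(np.mean(aucs))
```

    Under MCAR with a correctly specified imputation model the pooled AUC stays close to the complete-data value; the paper's point is precisely that this calibration can break when the imputation model and the missingness mechanism disagree.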

  13. Diagnostic tolerance for missing sensor data

    NASA Technical Reports Server (NTRS)

    Scarl, Ethan A.

    1989-01-01

    For practical automated diagnostic systems to continue functioning after failure, they must not only be able to diagnose sensor failures but also be able to tolerate the absence of data from the faulty sensors. It is shown that conventional (associational) diagnostic methods will have combinatoric problems when trying to isolate faulty sensors, even if they adequately diagnose other components. Moreover, attempts to extend the operation of diagnostic capability past sensor failure will necessarily compound those difficulties. Model-based reasoning offers a structured alternative that has no special problems diagnosing faulty sensors and can operate gracefully when sensor data is missing.

  14. Scaling Analysis of Ocean Surface Turbulent Heterogeneities from Satellite Remote Sensing: Use of 2D Structure Functions.

    PubMed

    Renosh, P R; Schmitt, Francois G; Loisel, Hubert

    2015-01-01

    Satellite remote sensing observations allow the ocean surface to be sampled synoptically over large spatio-temporal scales. The images provided by visible and thermal infrared satellite observations are widely used in physical, biological, and ecological oceanography. The present work proposes a method to understand the multi-scaling properties of satellite products such as Chlorophyll-a (Chl-a) and Sea Surface Temperature (SST), which have rarely been studied in this respect. The specific objective of this study is to show how the small-scale heterogeneities of satellite images can be characterised using tools borrowed from the field of turbulence. For that purpose, we show how the structure function, which is classically used in the frame of scaling time-series analysis, can also be used in 2D. The main advantage of this method is that it can be applied to images which have missing data. Based on both simulated and real images, we demonstrate that coarse-graining (CG) of a gradient modulus transform of the original image does not provide correct scaling exponents. We show, using a fractional Brownian simulation in 2D, that the structure function (SF) can be used with randomly sampled pairs of points, and verify that one million pairs of points provide enough statistics.
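
    The random-pair structure-function estimator described above can be sketched in a few lines (an illustrative implementation on a synthetic field, not the authors' code; bin edges, pair counts, and the test field are arbitrary choices). The key property is that pairs touching missing (NaN) pixels are simply discarded, so data gaps do not bias the estimate.

```python
import numpy as np

def structure_function_2d(field, q=2, n_pairs=200_000, n_bins=8, seed=0):
    """Estimate the order-q structure function S_q(r) = <|f(x+r) - f(x)|^q>
    from randomly sampled pixel pairs, skipping any pair that touches a
    missing (NaN) pixel."""
    rng = np.random.default_rng(seed)
    ny, nx = field.shape
    p1 = np.column_stack([rng.integers(0, ny, n_pairs), rng.integers(0, nx, n_pairs)])
    p2 = np.column_stack([rng.integers(0, ny, n_pairs), rng.integers(0, nx, n_pairs)])
    a, b = field[p1[:, 0], p1[:, 1]], field[p2[:, 0], p2[:, 1]]
    r = np.hypot(*(p1 - p2).T.astype(float))
    ok = np.isfinite(a) & np.isfinite(b) & (r > 0)
    incr, r = np.abs(a - b)[ok] ** q, r[ok]
    # Average the increments in separation bins between 1 pixel and half the image.
    edges = np.linspace(1, min(ny, nx) / 2, n_bins + 1)
    which = np.digitize(r, edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    sf = np.array([incr[which == k + 1].mean() for k in range(n_bins)])
    return centers, sf

# Smooth linear ramp f(i, j) = i, with a square block of missing data:
field = np.tile(np.arange(64.0), (64, 1)).T
field[10:20, 10:20] = np.nan
r, s2 = structure_function_2d(field)
```

    For a smooth field like this ramp, S_2(r) grows steeply with r; on a turbulent field the log-log slope of S_q(r) yields the scaling exponents the paper analyses.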

  15. A Methodology For Measuring Resilience in a Satellite-Based Communication Network

    DTIC Science & Technology

    2014-03-27

    solving the Travelling Salesman Problem (TSP) (Solnon p. 1). Based upon swarm intelligence, in a travelling salesman problem ants are sent out from...developed for the Travelling Salesman Problem (TSP) in 1992 (Solnon p. 1), this metaheuristic shows its roots in the original formulations. Given v, the...is lost. To tackle this problem, a common LEO orbit type is examined, the polar orbit. Polar LEO satellites travel from the south pole to the

  16. Use of missing data methods in longitudinal studies: the persistence of bad practices in developmental psychology.

    PubMed

    Jelicić, Helena; Phelps, Erin; Lerner, Richard M

    2009-07-01

    Developmental science rests on describing, explaining, and optimizing intraindividual changes and, hence, empirically requires longitudinal research. Problems of missing data arise in most longitudinal studies, thus creating challenges for interpreting the substance and structure of intraindividual change. Using a sample of reports of longitudinal studies obtained from three flagship developmental journals-Child Development, Developmental Psychology, and Journal of Research on Adolescence-we examined the number of longitudinal studies reporting missing data and the missing data techniques used. Of the 100 longitudinal studies sampled, 57 either reported having missing data or had discrepancies in sample sizes reported for different analyses. The majority of these studies (82%) used missing data techniques that are statistically problematic (either listwise deletion or pairwise deletion) and not among the methods recommended by statisticians (i.e., the direct maximum likelihood method and the multiple imputation method). Implications of these results for developmental theory and application, and the need for understanding the consequences of using statistically inappropriate missing data techniques with actual longitudinal data sets, are discussed.
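
    The cost of the "statistically problematic" techniques the article criticizes is easy to demonstrate (a toy simulation, not the article's data: the MAR mechanism, sample size, and regression-imputation stand-in for the recommended ML/MI methods are all illustrative assumptions). When dropout depends on an observed covariate, listwise deletion biases even a simple mean, while a model using that covariate does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(0, 1, n)          # e.g. a baseline measurement
y = x + rng.normal(0, 1, n)      # follow-up outcome; true E[y] = 0

# MAR mechanism: the follow-up is far more likely to be missing when x is large.
miss = (x > 0) & (rng.random(n) < 0.8)

# Listwise deletion: drop every case with a missing outcome.
cc_mean = y[~miss].mean()        # biased low, because kept cases have low x

# Regression imputation using x (valid under MAR): fit on complete cases,
# predict y for the deleted cases, then average over everyone.
slope, intercept = np.polyfit(x[~miss], y[~miss], 1)
y_imp = y.copy()
y_imp[miss] = intercept + slope * x[miss]
imp_mean = y_imp.mean()
```

    The complete-case mean lands well below zero while the imputation-based mean stays near the truth; full-information maximum likelihood or proper multiple imputation, the methods the article recommends, achieve the same correction with valid standard errors.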

  17. Missing Data in Clinical Studies: Issues and Methods

    PubMed Central

    Ibrahim, Joseph G.; Chu, Haitao; Chen, Ming-Hui

    2012-01-01

    Missing data are a prevailing problem in any type of data analyses. A participant variable is considered missing if the value of the variable (outcome or covariate) for the participant is not observed. In this article, various issues in analyzing studies with missing data are discussed. Particularly, we focus on missing response and/or covariate data for studies with discrete, continuous, or time-to-event end points in which generalized linear models, models for longitudinal data such as generalized linear mixed effects models, or Cox regression models are used. We discuss various classifications of missing data that may arise in a study and demonstrate in several situations that the commonly used method of throwing out all participants with any missing data may lead to incorrect results and conclusions. The methods described are applied to data from an Eastern Cooperative Oncology Group phase II clinical trial of liver cancer and a phase III clinical trial of advanced non–small-cell lung cancer. Although the main area of application discussed here is cancer, the issues and methods we discuss apply to any type of study. PMID:22649133

  18. VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA

    PubMed Central

    Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu

    2009-01-01

    We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190

  19. Outlier Removal in Model-Based Missing Value Imputation for Medical Datasets.

    PubMed

    Huang, Min-Wei; Lin, Wei-Chao; Tsai, Chih-Fong

    2018-01-01

    Many real-world medical datasets contain some proportion of missing (attribute) values. In general, missing value imputation can be performed to solve this problem, which is to provide estimations for the missing values by a reasoning process based on the (complete) observed data. However, if the observed data contain some noisy information or outliers, the estimations of the missing values may not be reliable or may even be quite different from the real values. The aim of this paper is to examine whether a combination of instance selection from the observed data and missing value imputation offers better performance than performing missing value imputation alone. In particular, three instance selection algorithms, DROP3, GA, and IB3, and three imputation algorithms, KNNI, MLP, and SVM, are used in order to find out the best combination. The experimental results show that performing instance selection can have a positive impact on missing value imputation over the numerical data type of medical datasets, and specific combinations of instance selection and imputation methods can improve the imputation results over the mixed data type of medical datasets. However, instance selection does not have a definitely positive impact on the imputation result for categorical medical datasets.
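
    Of the three imputation algorithms compared, KNNI is the simplest to sketch (a minimal illustration on made-up cluster data; the paper's experimental setup, distance weighting, and datasets are not reproduced here). Each missing attribute is filled with the mean of that attribute over the k complete rows closest on the jointly observed attributes.

```python
import numpy as np

def knn_impute(X, k=3):
    """K-nearest-neighbour imputation (KNNI): fill each missing attribute
    from the k complete rows nearest on the observed attributes."""
    X = X.astype(float)
    out = X.copy()
    complete = X[~np.isnan(X).any(axis=1)]
    for i, row in enumerate(X):
        holes = np.isnan(row)
        if not holes.any():
            continue
        # Euclidean distance on the attributes this row actually has.
        d = np.sqrt(((complete[:, ~holes] - row[~holes]) ** 2).sum(axis=1))
        nearest = complete[np.argsort(d)[:k]]
        out[i, holes] = nearest[:, holes].mean(axis=0)
    return out

# Two well-separated clusters; the NaN should be filled from its own cluster.
X = np.array([[0.1, 0.2, 0.0], [0.0, 0.1, 0.1], [0.2, 0.0, 0.2],
              [10.1, 10.0, 9.9], [9.9, 10.2, 10.0], [10.2, 10.1, 10.1],
              [10.0, 9.8, np.nan]])
X_filled = knn_impute(X)
```

    The paper's instance-selection step (DROP3, GA, IB3) would prune noisy rows from `complete` before this imputation runs, which is exactly where the reported gains on numerical datasets come from.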

  20. A Bayesian approach to truncated data sets: An application to Malmquist bias in Supernova Cosmology

    NASA Astrophysics Data System (ADS)

    March, Marisa Cristina

    2018-01-01

    A problem commonly encountered in statistical analysis of data is that of truncated data sets. A truncated data set is one in which a number of data points are completely missing from a sample; this is in contrast to a censored sample, in which partial information is missing from some data points. In astrophysics this problem is commonly seen in a magnitude-limited survey, such that the survey is incomplete at fainter magnitudes; that is, certain faint objects are simply not observed. The effect of this `missing data' is manifested as Malmquist bias and can result in biases in parameter inference if it is not accounted for. In Frequentist methodologies the Malmquist bias is often corrected for by analysing many simulations and computing the appropriate correction factors. One problem with this methodology is that the corrections are model dependent. In this poster we derive a Bayesian methodology for accounting for truncated data sets in problems of parameter inference and model selection. We first show the methodology for a simple Gaussian linear model and then go on to show the method for accounting for a truncated data set in the case of cosmological parameter inference with a magnitude-limited supernova Ia survey.
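
    The essence of the correction can be shown on the simple Gaussian case mentioned above (a toy sketch with a known sigma and a maximum-likelihood grid search standing in for the full Bayesian treatment; the truncation limit and sample size are arbitrary). The likelihood of each observed point is renormalized by the selection probability P(X < c | mu), which is what removes the Malmquist bias.

```python
import numpy as np
from math import erf, log, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(2)
full = rng.normal(0.0, 1.0, 4000)   # true mean 0, sigma assumed known = 1
c = 0.5                             # "magnitude limit": only x < c is observed
x = full[full < c]

naive_mean = x.mean()               # Malmquist-biased estimate (about -0.5)

# Truncated-normal log-likelihood: each point's density is renormalized
# by P(X < c | mu); without this term the estimate stays biased.
def loglik(mu):
    return -0.5 * np.sum((x - mu) ** 2) - len(x) * log(norm_cdf(c - mu))

grid = np.linspace(-1.0, 1.0, 201)
mu_hat = grid[np.argmax([loglik(mu) for mu in grid])]
```

    In the Bayesian version this truncation-corrected likelihood is multiplied by a prior and explored over all model parameters rather than maximized on a grid, but the normalizing term plays the identical role.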

  1. Drug Abuse

    MedlinePlus

    ... drugs, including opioids Drug abuse also plays a role in many major social problems, such as drugged driving, violence, stress, and child abuse. Drug abuse can lead to homelessness, crime, and missed work or problems with keeping a job. It harms ...

  2. Math Is Not a Problem...When You Know How to Visualize It.

    ERIC Educational Resources Information Center

    Nelson, Dennis W.

    1983-01-01

    Visualization is an effective technique for determining exactly what students must do to solve a mathematics problem. Pictures and charts can be used to help children understand which mathematics facts are present and which are missing--an important step toward problem solving. (PP)

  3. Date Sensitive Computing Problems: Understanding the Threat

    DTIC Science & Technology

    1998-08-29

    equipment on Earth.3 It can also interfere with electromagnetic signals from such devices as cell phones, radio, television, and radar. By itself, the ...spacecraft. Debris from impacted satellites will add to the existing orbital debris problem, and could eventually cause damage to other satellites...Date Sensitive Computing Problems Understanding the Threat Aug. 17, 1998 Revised Aug. 29, 1998 Prepared by: The National Crisis Response

  4. Tensor completion for estimating missing values in visual data.

    PubMed

    Liu, Ji; Musialski, Przemyslaw; Wonka, Peter; Ye, Jieping

    2013-01-01

    In this paper, we propose an algorithm to estimate missing values in tensors of visual data. The values can be missing due to problems in the acquisition process or because the user manually identified unwanted outliers. Our algorithm works even with a small amount of samples and it can propagate structure to fill larger missing regions. Our methodology is built on recent studies about matrix completion using the matrix trace norm. The contribution of our paper is to extend the matrix case to the tensor case by proposing the first definition of the trace norm for tensors and then by building a working algorithm. First, we propose a definition for the tensor trace norm that generalizes the established definition of the matrix trace norm. Second, similarly to matrix completion, the tensor completion is formulated as a convex optimization problem. Unfortunately, the straightforward problem extension is significantly harder to solve than the matrix case because of the dependency among multiple constraints. To tackle this problem, we developed three algorithms: simple low rank tensor completion (SiLRTC), fast low rank tensor completion (FaLRTC), and high accuracy low rank tensor completion (HaLRTC). The SiLRTC algorithm is simple to implement and employs a relaxation technique to separate the dependent relationships and uses the block coordinate descent (BCD) method to achieve a globally optimal solution; the FaLRTC algorithm utilizes a smoothing scheme to transform the original nonsmooth problem into a smooth one and can be used to solve a general tensor trace norm minimization problem; the HaLRTC algorithm applies the alternating direction method of multipliers (ADMMs) to our problem. Our experiments show potential applications of our algorithms and the quantitative evaluation indicates that our methods are more accurate and robust than heuristic approaches. 
The efficiency comparison indicates that FaLRTC and HaLRTC are more efficient than SiLRTC; between FaLRTC and HaLRTC, the former is more efficient for obtaining a low-accuracy solution and the latter is preferred if a high-accuracy solution is desired.
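
    The core idea behind all three algorithms, trace-norm minimization subject to the observed entries, is easiest to see in the matrix special case (this is a generic soft-impute-style sketch, not SiLRTC/FaLRTC/HaLRTC themselves, and `lam` and `iters` are arbitrary illustrative values). The tensor algorithms generalize this by applying the same soft-thresholded SVD to each mode's unfolding.

```python
import numpy as np

def soft_impute(M, mask, lam=0.5, iters=300):
    """Matrix analogue of trace-norm completion: alternate a
    soft-thresholded SVD (the proximal step for the trace norm)
    with re-imposing the observed entries."""
    X = np.where(mask, M, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low_rank = (U * np.maximum(s - lam, 0.0)) @ Vt  # shrink singular values
        X = np.where(mask, M, low_rank)                 # keep observed entries
    return X

rng = np.random.default_rng(3)
A = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 50))  # rank-2 ground truth
mask = rng.random(A.shape) < 0.6                         # 60% of entries observed
A_hat = soft_impute(A, mask)
rel_err = np.linalg.norm((A_hat - A)[~mask]) / np.linalg.norm(A[~mask])
```

    For a low-rank matrix with enough observed entries, the missing 40% is recovered to small relative error; the paper's contribution is making the analogous tensor problem tractable despite the coupled constraints across unfoldings.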

  5. Inferential Precision in Single-Case Time-Series Data Streams: How Well Does the EM Procedure Perform When Missing Observations Occur in Autocorrelated Data?

    PubMed Central

    Smith, Justin D.; Borckardt, Jeffrey J.; Nash, Michael R.

    2013-01-01

    The case-based time-series design is a viable methodology for treatment outcome research. However, the literature has not fully addressed the problem of missing observations with such autocorrelated data streams. Mainly, to what extent do missing observations compromise inference when observations are not independent? Do the available missing data replacement procedures preserve inferential integrity? Does the extent of autocorrelation matter? We use Monte Carlo simulation modeling of a single-subject intervention study to address these questions. We find power sensitivity to be within acceptable limits across four proportions of missing observations (10%, 20%, 30%, and 40%) when missing data are replaced using the Expectation-Maximization Algorithm, more commonly known as the EM Procedure (Dempster, Laird, & Rubin, 1977). This applies to data streams with lag-1 autocorrelation estimates under 0.80. As autocorrelation estimates approach 0.80, the replacement procedure yields an unacceptable power profile. The implications of these findings and directions for future research are discussed. PMID:22697454
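
    An EM-flavoured replacement scheme for an autocorrelated stream can be sketched on an AR(1) toy series (an illustration of the idea, not the study's simulation code: series length, missing rate, and the Gauss-Seidel handling of consecutive gaps are all simplifying assumptions). Missing points are imputed by their conditional mean given the neighbours, E[x_t | x_{t-1}, x_{t+1}] = phi(x_{t-1} + x_{t+1}) / (1 + phi^2), alternating with re-estimation of phi.

```python
import numpy as np

rng = np.random.default_rng(4)
T, phi_true = 500, 0.6
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + rng.normal()   # AR(1) stream

miss = rng.random(T) < 0.2                      # 20% missing observations
miss[[0, T - 1]] = False                        # keep the endpoints observed
y = np.where(miss, np.nan, x)

# EM-flavoured loop: estimate phi from the current completed series, then
# refill each gap with its AR(1) conditional mean given the neighbours
# (consecutive gaps use the latest imputed values, Gauss-Seidel style).
z = np.where(miss, 0.0, y)                      # start from mean-imputation
for _ in range(20):
    phi = np.sum(z[1:] * z[:-1]) / np.sum(z[:-1] ** 2)  # lag-1 LS estimate
    for t in np.flatnonzero(miss):
        z[t] = phi * (z[t - 1] + z[t + 1]) / (1.0 + phi ** 2)

rmse_em = np.sqrt(np.mean((z[miss] - x[miss]) ** 2))
rmse_mean = np.sqrt(np.mean(x[miss] ** 2))      # plain mean (=0) imputation
```

    Exploiting the autocorrelation beats mean-imputation at moderate phi; the study's finding is that this advantage degrades as the lag-1 autocorrelation approaches 0.80.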

  6. Improving Long-term Quality and Continuity of Landsat-7 Data Through Inpainting of Lost Data Based on the Nonconvex Model of Dynamic Dictionary Learning

    NASA Astrophysics Data System (ADS)

    Miao, J.; Zhou, Z.; Zhou, X.; Huang, T.

    2017-12-01

    On May 31, 2003, the scan line corrector (SLC) of the Enhanced Thematic Mapper Plus (ETM+) on board the Landsat-7 satellite broke down, resulting in strips of lost data in Landsat-7 images, which seriously affected the quality and continued application of the ETM+ data for space and Earth science. This paper proposes a new inpainting method for repairing the Landsat-7 ETM+ images, taking into account the physical characteristics and geometric features of the ground areas whose data are missing. Firstly, the two geometric slopes of the boundaries of each missing stripe of the georeferenced ETM+ image are calculated by the Hough transform, ignoring the slope of the parts of the missing stripe that lie on the edges of the whole image. Secondly, an adaptive dictionary was developed and trained using a large number of Landsat-7 ETM+ SLC-ON images. When the adaptive dictionary is used to restore an image with missing data, the dictionary is in effect dynamic. Then the data-missing strips were repaired along their slope directions by using the logdet(.) low-rank non-convex model along with the dynamic dictionary. Imperfect points are defined as pixels whose values differ greatly from the surrounding pixel values. They can be real values but most likely are noise. Lastly, the imperfect points remaining after the second step were replaced by using the method of sparse restoration of overlapping groups. We take the Landsat ETM+ image of June 10, 2002 as the test image for our algorithm evaluation. There is no data missing in this image. Therefore we extract the same missing stripes from images of the same WRS path and WRS row as the 2002 image but acquired after 2003 to form the missing-stripe model. Then we overlay the missing-stripe model over the image of 2002 to get the simulated missing image. Fig.1(a)-(c) show the simulated missing images of Bands 1, 3, and 5 of the 2002 ETM+ image data. We apply the algorithm to restore the missing stripes. Fig.1(d)-(f) show the restored images of Bands 1, 3, and 5, corresponding to images (a)-(c). The repaired images are then compared with the original images band by band, and the algorithm is found to work very well. We will show applications of the algorithm to other images and the details of the comparison.

  7. Ring Around the Sun

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Our 'constant' sun is really more like a spherical sea of incredibly hot plasma, changing all the time. Astronomers like to keep a good eye on it, so no dramatic change goes by unnoticed. One amazing occurrence happened on Dec 7, 2007 and was seen by one of the two STEREO satellites. STEREO, as you recall, consists of a pair of satellites which observe the sun from different angles and allow astronomers to get a '3-D' view of the solar atmosphere and solar outflows. On December 7 one of the STEREO satellites captured a view (in the extreme ultraviolet part of the electromagnetic spectrum) of a Coronal Mass Ejection that released a huge amount of energy into the solar atmosphere, and a huge amount of matter into interplanetary space. A sort of atmospheric 'sunquake'. One result of this 'sunquake' was the production of a giant wave rippling through almost the entire solar atmosphere. The image above shows a snapshot of this unbelievable wave, slightly enhanced for viewability. Don't miss the movie. What damps the wave?

  8. Sampling design for an integrated socioeconomic and ecological survey by using satellite remote sensing and ordination

    PubMed Central

    Binford, Michael W.; Lee, Tae Jeong; Townsend, Robert M.

    2004-01-01

    Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data was used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on the ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability. PMID:15254298

  9. The effect of radio frequency interference on the 136- to 138-MHz return link and 400.5- to 401.5-MHz forward link of the Tracking and Data Relay Satellite System

    NASA Technical Reports Server (NTRS)

    Jenny, J.; Lyttle, J. D.

    1973-01-01

    The purpose is to update the RFI estimates in the 136- to 138-MHz VHF band and to make estimates for the first time for the 400.5- to 401.5-MHz UHF band. These preliminary predictions are based primarily on ITU frequency-registration data, with missing data bridged by engineering judgement.

  10. Integrative missing value estimation for microarray data.

    PubMed

    Hu, Jianjun; Li, Haifeng; Waterman, Michael S; Zhou, Xianghong Jasmine

    2006-10-12

    Missing value estimation is an important preprocessing step in microarray analysis. Although several methods have been developed to solve this problem, their performance is unsatisfactory for datasets with high rates of missing data, high measurement noise, or limited numbers of samples. In fact, more than 80% of the time-series datasets in the Stanford Microarray Database contain less than eight samples. We present the integrative Missing Value Estimation method (iMISS) by incorporating information from multiple reference microarray datasets to improve missing value estimation. For each gene with missing data, we derive a consistent neighbor-gene list by taking reference datasets into consideration. To determine whether the given reference datasets are sufficiently informative for integration, we use a submatrix imputation approach. Our experiments showed that iMISS can significantly and consistently improve the accuracy of the state-of-the-art Local Least Square (LLS) imputation algorithm by up to 15% in our benchmark tests. We demonstrated that the order-statistics-based integrative imputation algorithms can achieve significant improvements over state-of-the-art missing value estimation approaches such as LLS, and are especially good for imputing microarray datasets with a limited number of samples, high rates of missing data, or very noisy measurements. With the rapid accumulation of microarray datasets, the performance of our approach can be further improved by incorporating larger and more appropriate reference datasets.

  11. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Algebra Word Problem Solving Approaches in a Chemistry Context: Equation Worked Examples versus Text Editing

    ERIC Educational Resources Information Center

    Ngu, Bing Hiong; Yeung, Alexander Seeshing

    2013-01-01

    Text editing directs students' attention to the problem structure as they classify whether the texts of word problems contain sufficient, missing or irrelevant information for working out a solution. Equation worked examples emphasize the formation of a coherent problem structure to generate a solution. Its focus is on the construction of three…

  13. Gender Differences in Solution of Algebraic Word Problems Containing Irrelevant Information.

    ERIC Educational Resources Information Center

    Low, Renae; Over, Ray

    1993-01-01

    Female tenth graders (n=217) were less likely than male tenth graders (n=219) to identify missing or irrelevant information in algebra problems. Female eleventh graders (n=234) were less likely than male eleventh graders (n=287) to solve problems with irrelevant information. Results indicate sex differences in knowledge of problem structure. (SLD)

  14. Guide to Understanding Moebius Syndrome

    MedlinePlus

    ... due to upper body weakness • Strabismus (crossed eyes) • Dry eyes and irritability • Dental problems • High palate • Cleft palate • Hand and feet problems including club foot and missing or fused fingers (syndactyly) • Hearing problems • Poland’s syndrome (chest wall and upper limb anomalies) Although they ...

  15. Teaching Pronunciation in the Learner-Centered Classroom.

    ERIC Educational Resources Information Center

    Lin, Hsiang-Pao; And Others

    Specific tools and techniques to help students of English as a Second Language overcome pronunciation problems are presented. The selection of problems addressed is based on the frequency and seriousness of errors that many native Chinese-speaking learners produce. Ways to resolve various problems (e.g., missing final consonants, misplaced stress…

  16. Modeling of afforestation possibilities on one part of Hungary

    NASA Astrophysics Data System (ADS)

    Bozsik, Éva; Riczu, Péter; Tamás, János; Burriel, Charles; Helilmeier, Hermann

    2015-04-01

    Agroforestry systems are part of the history of European Union rural landscapes, but the regional increase in the size of agricultural parcels had a significant effect on European land use in the 20th century, radically reducing the coverage of natural forest. This causes conflicts between the interests of the agricultural and forestry sectors. Agroforestry land uses could be a solution to this conflict. One real (ecological) problem with remnant forests and new forest plantations is that the network function is partly missing without connecting ecological green corridors; the other problem is verifiability for the agroforestry payment system, i.e. monitoring arable lands and plantations. Remote sensing methods are currently used to supervise European Union payments. Nowadays, in addition to satellite imagery, airborne hyperspectral and LiDAR (Light Detection And Ranging) remote sensing technologies are becoming more widespread in nature, environmental, forest, and agricultural protection, conservation, and monitoring, and they are effective tools for monitoring biomass production. In this Hungarian case study we built a Spatial Decision Support System (SDSS) to create an agroforestry site-selection model. The aim of the model was to ensure the continuity of ecological green corridors and maintain land use appropriate to regional endowments. The investigation tools were the increasingly widely used hyperspectral and airborne LiDAR remote sensing technologies, which can provide appropriate data acquisition and data processing capabilities to build a decision support system.

  17. Attitude stability of spinning satellites

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1980-01-01

    Some problems of attitude stability of spinning satellites are treated in a rigorous manner. With certain restrictions, linearized stability analysis correctly predicts the attitude stability of spinning satellites, even in the critical cases of the Liapunov-Poincare stability theory.

  18. Prediction of missing common genes for disease pairs using network based module separation on incomplete human interactome.

    PubMed

    Akram, Pakeeza; Liao, Li

    2017-12-06

    Identification of common genes associated with comorbid diseases can be critical in understanding their pathobiological mechanism. This work presents a novel method to predict missing common genes associated with a disease pair. Searching for missing common genes is formulated as an optimization problem that minimizes network-based module separation between the two subgraphs produced by mapping the genes associated with each disease onto the interactome. Using cross-validation on more than 600 disease pairs, our method achieves a significantly higher average receiver operating characteristic (ROC) score of 0.95, compared to a baseline ROC score of 0.60 using randomized data. Prediction of missing common genes aims to complete the gene set associated with a comorbid disease for better understanding of biological intervention. It will also be useful for gene-targeted therapeutics related to comorbid diseases. This method can further be applied to predict missing edges to complete the subgraph associated with a disease pair.

  19. An Analytical Singularity-Free Solution to the J2 Perturbation Problem

    NASA Technical Reports Server (NTRS)

    Bond, V. R.

    1979-01-01

    The development of a singularity-free solution of the J2 problem in satellite theory is presented. The procedure resembles that of Lyddane, who rederived Brouwer's satellite theory using Poincare elements. A comparable procedure is used in this report in which the satellite theory of Scheifele, who used elements similar to the Delaunay elements but in the extended phase space, is rederived using Poincare elements also in the extended phase space. Only the short-period effects due to J2 are included.

  20. X-ray lasers: Strategic problems and potential as an in-orbit exoatmospheric ballistic missile defense system

    NASA Astrophysics Data System (ADS)

    Perusich, Karl Anthony

    1986-12-01

    The problems and potential of a single proposed ballistic missile defense system, the X-ray laser-armed satellite, are examined in this research. Specifically, the X-ray laser satellite system is examined to determine its impact on the issues of cost-effectiveness and crisis stability. To examine the cost-effectiveness and crisis stability of the X-ray laser satellites, a simulation of a nuclear exchange was constructed. The X-ray laser satellites were assumed to be vulnerable to attack from energy satellites with limited satellite-to-satellite lethal ranges. Symmetric weapons and force postures were used. Five principal weapon classes were used in the model: ICBMs, SLBMs, X-ray laser satellites, bombers, and endo-atmospheric silo defenses. Also, the orbital dynamics of the ballistic missiles and satellites were simulated. The cost-effectiveness of the X-ray laser satellites was determined for two different operational capabilities: damage limitation and assured destruction. The following conclusions were reached. The effects of deployment of a new weapon system on the Triad as a whole should be examined. The X-ray laser was found to have little effectiveness as a damage-limiting weapon for a defender. For an assured-destruction capability, X-ray laser satellites could be part of a minimum-cost force mix with that capability.

  1. GRACE Accelerometer data transplant

    NASA Astrophysics Data System (ADS)

    Bandikova, T.; McCullough, C. M.; Kruizinga, G. L. H.

    2017-12-01

    The Gravity Recovery and Climate Experiment (GRACE) has recently celebrated its 15th anniversary. The aging of the satellites brings new challenges for both mission operation and science data delivery. Since September 2016, the accelerometer (ACC) onboard GRACE-B has been permanently turned off in order to reduce the battery load. The absence of information about the non-gravitational forces acting on the spacecraft dramatically decreases the accuracy of the monthly gravity field solutions. The missing GRACE-B accelerometer data, however, can be recovered from the GRACE-A accelerometer measurements with satisfactory accuracy. In the current GRACE data processing, a simple ACC data transplant is used, which includes only attitude and time corrections. The full ACC data transplant requires not only the attitude and time corrections, but also modeling of the residual accelerations due to thruster firings, which is the most challenging part. The residual linear accelerations ("thruster spikes") are caused by thruster imperfections such as misalignment of a thruster pair, force imbalance, or differences in reaction time. The thruster spikes are among the most dominant high-frequency signals in the ACC measurements. The shape and amplitude of the thruster spikes are unique for each thruster pair, each firing duration (30 ms - 1000 ms), each x, y, z component of the ACC linear acceleration, and each spacecraft. In our approach, the thruster spike model is an analytical function obtained by the inverse Laplace transform of the ACC transfer function. The model shape parameters (amplitude, width and time delay) are estimated using the least-squares method. The ACC data transplant is validated on days when ACC data from both satellites were available. The fully transplanted data fit the original GRACE-B measurements very well. The full ACC data transplant results in significantly reduced high-frequency noise compared to the simple ACC transplant (i.e. without thruster spike modeling). The full ACC data transplant is a promising solution which will allow GRACE to deliver high-quality science data despite the serious problems related to satellite aging.
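The spike-fitting step can be illustrated with a toy model. The damped-ramp shape below is a stand-in assumption, not the actual inverse-Laplace-derived transfer-function response, and the grid search replaces the authors' least-squares estimator; it only sketches how amplitude, width and time delay of a pulse can be recovered from sampled accelerometer data:

```python
import math

def spike(t, amp, width, delay):
    # Hypothetical spike shape: zero before the delay, then a damped ramp
    # peaking at `amp` one `width` after onset. The real GRACE model comes
    # from the inverse Laplace transform of the ACC transfer function.
    if t < delay:
        return 0.0
    x = (t - delay) / width
    return amp * x * math.exp(1.0 - x)

def fit_spike(ts, ys, widths, delays):
    # Grid search over (width, delay); for a fixed shape the model is linear
    # in the amplitude, so amp is solved in closed form at each grid point.
    best = None
    for w in widths:
        for t0 in delays:
            s = [spike(t, 1.0, w, t0) for t in ts]
            ss = sum(v * v for v in s)
            if ss == 0.0:
                continue
            amp = sum(y * v for y, v in zip(ys, s)) / ss
            resid = sum((y - amp * v) ** 2 for y, v in zip(ys, s))
            if best is None or resid < best[0]:
                best = (resid, amp, w, t0)
    return best[1], best[2], best[3]

# Synthetic 200 Hz sampling of a single noise-free thruster spike.
ts = [i * 0.005 for i in range(101)]
ys = [spike(t, 2.0, 0.05, 0.1) for t in ts]
amp, width, delay = fit_spike(ts, ys, [0.03, 0.04, 0.05, 0.06],
                              [i * 0.01 for i in range(21)])
```

With noise-free data and the true parameters on the grid, the fit recovers amplitude 2.0, width 0.05 s and delay 0.1 s; a gradient-based least-squares solver would replace the grid in practice.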

  2. Impact of tracking loop settings of the Swarm GPS receiver on gravity field recovery

    NASA Astrophysics Data System (ADS)

    Dahle, C.; Arnold, D.; Jäggi, A.

    2017-06-01

    The Swarm mission consists of three identical satellites equipped with GPS receivers and orbiting in near-polar low Earth orbits. Thus, they can be used to determine the Earth's gravity field by means of high-low satellite-to-satellite tracking (hl-SST). However, first results by several groups have revealed systematic errors in both the precise science orbits and the resulting gravity field solutions, caused by ionospheric disturbances affecting the quality of Swarm GPS observations. In the gravity field solutions, the errors lead to systematic artefacts located in two bands north and south of the geomagnetic equator. In order to reduce these artefacts, erroneous GPS observations can be identified and rejected before orbit and gravity field processing, but this may also lead to slight degradations of orbit and low-degree gravity field coefficient quality. Since the problems were believed to be receiver-specific, the GPS tracking loop bandwidths onboard Swarm have been widened several times, starting in May 2015. The influence of these tracking loop updates on Swarm orbits and, particularly, gravity field solutions is investigated in this work. The main findings are that the first updates, increasing the bandwidth from 0.25 Hz to 0.5 Hz, significantly improve the quality of Swarm gravity fields, and that the improvements are even larger than those achieved by GPS data rejection. It is also shown that these improvements are indeed due to an improved quality of GPS observations around the geomagnetic equator, and not due to missing observations in these regions. As the ionospheric activity has been rather low in the most recent months, the effect of the tracking loop updates in summer 2016 cannot be properly assessed yet.
Nevertheless, the quality of Swarm gravity field solutions has already improved after the first updates which is especially beneficial in view of filling the upcoming gap between the GRACE and GRACE Follow-on missions with hl-SST gravity products.

  3. Quantification of CO2 and CH4 megacity emissions using portable solar absorption spectrometers

    NASA Astrophysics Data System (ADS)

    Frey, Matthias; Hase, Frank; Blumenstock, Thomas; Morino, Isamu; Shiomi, Kei

    2017-04-01

    Urban areas already account for over 50% of the global population, and the share of the worldwide population living in metropolitan areas is continuously growing. Thus, precise knowledge of urban greenhouse gas (GHG) emissions is of utmost importance. However, whereas GHG emissions on a nationwide to continental scale can be estimated relatively precisely using satellite observations (and fossil fuel consumption statistics), reliable estimation of local- to regional-scale emissions poses a bigger problem, owing to the lack of temporally and spatially highly resolved satellite data and to possible biases of passive spectroscopic nadir observations (e.g. enhanced aerosol scattering in a city plume). Furthermore, emission inventories on the city scale might be missing contributions (e.g. methane leakage from gas pipes). Here, newly developed mobile low-resolution Fourier transform spectrometers (Bruker EM27/SUN) are utilized to quantify small-scale emissions. This novel technique was successfully tested before by KIT and partners during campaigns in Berlin, Paris and Colorado for detecting emissions from various sources. We present results from a campaign carried out in February - April 2016 in the Tokyo Bay area, one of the biggest metropolitan areas worldwide. We positioned two EM27/SUN spectrometers on the outer perimeter of Tokyo along the prevailing wind axis, upwind and downwind of the city source. Before and after the campaign, calibration measurements were performed in Tsukuba with a collocated high-resolution FTIR spectrometer from the Total Carbon Column Observing Network (TCCON). During the campaign the observed XCO2 and XCH4 values varied significantly, and intraday variations were observed at both sites. Furthermore, an enhancement due to the Tokyo-area GHG emissions is clearly visible for both XCO2 and XCH4. The observed signals are significantly higher than in prior campaigns targeting other major cities.
We perform a rough estimate of the source strength. Finally, a comparison with an observation from the OCO-2 satellite is shown.

  4. System aspects of spacecraft charging

    NASA Technical Reports Server (NTRS)

    Bower, S. P.

    1977-01-01

    Satellites come in a variety of sizes and configurations, including spinning satellites and three-axis stabilized satellites. All of these characteristics have a significant effect on spacecraft charging considerations. There are, however, certain fundamentals which, when considered, indicate the nature and extent of the problem. The Global Positioning System satellite serves to illustrate certain characteristics.

  5. Filamentary field-aligned currents at the polar cap region during northward interplanetary magnetic field derived with the Swarm constellation

    PubMed Central

    Lühr, Hermann; Huang, Tao; Wing, Simon; Kervalishvili, Guram; Rauberg, Jan; Korth, Haje

    2017-01-01

    ESA’s Swarm constellation mission makes it possible for the first time to determine field-aligned currents (FACs) in the ionosphere uniquely. In particular at high latitudes, the dual-satellite approach can reliably detect some FAC structures which are missed by the traditional single-satellite technique. These FAC events occur preferentially poleward of the auroral oval and during times of northward interplanetary magnetic field (IMF) orientation. Most events appear on the nightside. They are not related to the typical FAC structures poleward of the cusp, commonly termed NBZ. Simultaneously observed precipitating particle spectrograms and auroral images from Defense Meteorological Satellite Program (DMSP) satellites are consistent with the detected FACs and indicate that they occur on closed field lines mostly adjacent to the auroral oval. We suggest that the FACs are associated with Sun-aligned filamentary auroral arcs. Here we introduce in an initial study features of the high-latitude FAC structures which have been observed during the early phase of the Swarm mission. A more systematic survey over longer times is required to fully characterize the so far undetected field aligned currents. PMID:29056833

  6. Aerosol loading in the Southeastern United States: reconciling surface and satellite observations

    NASA Astrophysics Data System (ADS)

    Ford, B.; Heald, C. L.

    2013-09-01

    We investigate the seasonality in aerosols over the Southeastern United States using observations from several satellite instruments (MODIS, MISR, CALIOP) and surface network sites (IMPROVE, SEARCH, AERONET). We find that the strong summertime enhancement in satellite-observed aerosol optical depth (AOD) (factor 2-3 enhancement over wintertime AOD) is not present in surface mass concentrations (25-55% summertime enhancement). Goldstein et al. (2009) previously attributed this seasonality in AOD to biogenic organic aerosol; however, surface observations show that organic aerosol only accounts for ∼35% of fine particulate matter (smaller than 2.5 μm in aerodynamic diameter, PM2.5) and exhibits similar seasonality to total surface PM2.5. The GEOS-Chem model generally reproduces these surface aerosol measurements, but underrepresents the AOD seasonality observed by satellites. We show that seasonal differences in water uptake cannot sufficiently explain the magnitude of AOD increase. As CALIOP profiles indicate the presence of additional aerosol in the lower troposphere (below 700 hPa), which cannot be explained by vertical mixing, we conclude that the discrepancy is due to a missing source of aerosols above the surface layer in summer.

  7. Optimal design of the satellite constellation arrangement reconfiguration process

    NASA Astrophysics Data System (ADS)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimization of the overall fuel cost, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions on the target constellation. To implement the reconfiguration phase of the satellite constellation arrangement at minimal cost, the hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites existing in the constellation. Also, the dynamic model of the problem is formulated in such a way that the optimal assignment of satellites to the initial and target orbits and the optimal orbital transfer are combined in one step. Finally, we argue that our proposed idea, i.e. coupled non-simultaneous flight of the satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that the presented method considerably reduces the cost of the reconfiguration process.

  8. Missed injuries during the initial assessment in a cohort of 1124 level-1 trauma patients.

    PubMed

    Giannakopoulos, G F; Saltzherr, T P; Beenen, L F M; Reitsma, J B; Bloemers, F W; Goslings, J C; Bakker, F C

    2012-09-01

    Despite the presence of diagnostic guidelines for the initial evaluation in trauma, the reported incidence of missed injuries is considerable. The aim of this study was to assess the missed injuries in a large cohort of trauma patients originating from two European Level-1 trauma centres. We analysed the 1124 patients included in the randomised REACT trial. Missed injuries were defined as injuries not diagnosed or suspected during initial clinical and radiological evaluation in the trauma room. We assessed the frequency, type, consequences and the phase in which the missed injuries were diagnosed and used univariate analysis to identify potential contributing factors. Eight hundred and three patients were male, median age was 38 years and 1079 patients sustained blunt trauma. Overall, 122 injuries were missed in 92 patients (8.2%). Most injuries concerned the extremities. Sixteen injuries had an AIS grade of ≥ 3. Patients with missed injuries had significantly higher injury severity scores (ISSs) (median of 15 versus 5, p<0.001). Factors associated with missed injuries were severe traumatic brain injury (GCS ≤ 8) and multitrauma (ISS ≥ 16). Seventy-two missed injuries remained undetected during tertiary survey (59%). In total, 31 operations were required for 26 initially missed injuries. Despite guidelines to avoid missed injuries, this problem is hard to prevent, especially in the severely injured. The present study showed that the rate of missed injuries was comparable with the literature and their consequences not severe. A high index of suspicion remains warranted, especially in multitrauma patients. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Protocol Support for a New Satellite-Based Airspace Communication Network

    NASA Technical Reports Server (NTRS)

    Shang, Yadong; Hadjitheodosiou, Michael; Baras, John

    2004-01-01

    We recommend suitable transport protocols for an aeronautical network supporting Internet and data services via satellite. We study the characteristics of an aeronautical satellite hybrid network and focus on the problems that cause dramatically degraded performance of the transport protocol. We discuss various extensions to standard TCP that alleviate some of these performance problems. Through simulation, we identify those TCP implementations that can be expected to perform well. Based on the observation that it is difficult for an end-to-end solution to solve these problems effectively, we propose a new TCP-splitting protocol, termed the Aeronautical Transport Control Protocol (AeroTCP). The main idea of this protocol is to use a fixed window for flow control and a single duplicated acknowledgement (ACK) for fast recovery. Our simulation results show that AeroTCP can maintain higher utilization of the satellite link than end-to-end TCP, especially in high bit-error-rate (BER) environments.

  10. STS-72 Flight Day 2

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On this second day of the STS-72 mission, the flight crew, Cmdr. Brian Duffy, Pilot Brent W. Jett, and Mission Specialists Leroy Chiao, Daniel T. Barry, Winston E. Scott, and Koichi Wakata (NASDA), awakened to music from the motion picture 'Star Wars.' The crew performed a systems checkout, prepared for the retrieval of the Japanese Space Flyer Unit (SFU), tested the spacesuits for the EVA, and activated some of the secondary experiments. An in-orbit news interview was conducted with the crew via satellite downlinking. Questions asked ranged from the logistics of the mission to the avoidance procedures the Endeavour orbiter performed to avoid hitting the inactive Air Force satellite, nicknamed 'Misty' (MSTI). Earth views included cloud cover, several storm systems, and various land masses, with several views of the shuttle's open cargo bay in the foreground.

  11. Assessing the population coverage of a health demographic surveillance system using satellite imagery and crowd-sourcing.

    PubMed

    Di Pasquale, Aurelio; McCann, Robert S; Maire, Nicolas

    2017-01-01

    Remotely sensed data can serve as an independent source of information about the location of residential structures in areas under demographic and health surveillance. We report on results obtained by combining satellite imagery, imported from Bing, with location data routinely collected using the built-in GPS sensors of tablet computers, to assess completeness of population coverage in a Health and Demographic Surveillance System in Malawi. The Majete Malaria Project Health and Demographic Surveillance System, in Malawi, started in 2014 to support a project studying the reduction of malaria using an integrated control approach, rolling out insecticide-treated nets and improved case management, supplemented with house improvement and larval source management. In order to support the monitoring of the trial, a Health and Demographic Surveillance System was established in the area surrounding the Majete Wildlife Reserve (1600 km2), using the OpenHDS data system. We compared house locations obtained using GPS recordings on mobile devices during the demographic surveillance census round with those acquired from satellite imagery. Volunteers were recruited through the crowdcrafting.org platform to identify building structures on the images, which enabled the compilation of a database with coordinates of potential residences. For every building identified on these satellite images by the volunteers (11,046 buildings identified, of which 3424 (ca. 30%) were part of the censused area), we calculated the distance to the nearest house enumerated on the ground by fieldworkers during the census round of the HDSS. A random sample of buildings (85 structures) identified on satellite images without a nearby location enrolled in the census was visited by a fieldworker to determine how many, if any, were missed during the baseline census survey. The findings from this ground-truthing effort suggest that high population coverage was achieved in the census survey; however, the crowd-sourcing did not locate many of the inhabited structures (52.3% of the 6543 recorded during the census round). We conclude that auxiliary data can play a useful role in quality assurance for population-based health surveillance, but improved algorithms would be needed if crowd-sourced house locations are to be used as the basis of population databases.
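The distance-to-nearest-house computation can be sketched as follows. The record does not state which distance formula was used, so a standard spherical (haversine) distance is assumed, and the coordinates below are hypothetical placeholders in southern Malawi:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres on a spherical Earth (R = 6371 km).
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2)
           * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_census_distances(buildings, census_houses):
    # For each building spotted on imagery, the distance to the closest
    # GPS-enumerated house; large values flag candidate missed structures.
    return [min(haversine_m(blat, blon, clat, clon)
                for clat, clon in census_houses)
            for blat, blon in buildings]

# Hypothetical coordinates (lat, lon) near the Majete area.
buildings = [(-15.900, 34.700), (-15.905, 34.712)]
census = [(-15.9001, 34.7001), (-15.930, 34.690)]
gaps = nearest_census_distances(buildings, census)
```

Buildings whose nearest-census distance exceeds some cutoff (the study visited a sample of such structures) become candidates for households missed during the census round.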

  12. Model of load distribution for earth observation satellite

    NASA Astrophysics Data System (ADS)

    Tu, Shumin; Du, Min; Li, Wei

    2017-03-01

    For a system with multiple types of Earth Observing Satellites (EOS), ensuring that each type of payload carried by the group of EOS is used efficiently and reasonably is a vital issue in the astronautics field. Currently, most research on the configuration of satellites and payloads focuses on scheduling for already-launched satellites. However, the assignment of payloads for un-launched satellites, which is just as crucial as task scheduling, has received little attention. Moreover, current models of satellite resource scheduling lack generality. Drawing on the idea of role-based access control (RBAC) in information systems, this paper puts forward a model based on RBAC role-mining to improve the generality and foresight of satellite-payload assignment methods. In this way, the satellite-payload assignment can be mapped onto the role-mining problem. A novel method, based on biclique combination from graph theory and evolutionary algorithms from intelligent computing, is introduced to address the role-mining formulation of satellite-payload assignment. Simulation experiments are performed to verify the novel method. Finally, the work of this paper is concluded.

  13. Consensus of satellite cluster flight using an energy-matching optimal control method

    NASA Astrophysics Data System (ADS)

    Luo, Jianjun; Zhou, Liang; Zhang, Bo

    2017-11-01

    This paper presents an optimal control method for consensus of satellite cluster flight under a kind of energy-matching condition. Firstly, the relation between energy matching and periodically bounded relative motion of satellites is analyzed, and the satellite energy-matching principle is applied to configure the initial conditions. Then, period-delayed errors are adopted as state variables to establish the period-delayed error dynamics models of a single satellite and of the cluster. Next, a novel satellite cluster feedback control protocol with coupling gain is designed, so that the satellite cluster periodically bounded relative motion consensus problem (the period-delayed error state consensus problem) is transformed into a stability problem for a set of matrices of the same low dimension. Based on consensus-region theory from research on multi-agent system consensus, the coupling gain can be chosen to satisfy the consensus-region requirement and to decouple the satellite cluster information topology from the feedback control gain matrix, which can be determined by the linear quadratic regulator (LQR) optimal method. This method realizes consensus of the satellite cluster period-delayed errors, leading to consistency of the semi-major axes (SMA) and energy matching across the cluster, so that the satellites exhibit globally coordinated cluster behavior. Finally, the feasibility and effectiveness of the presented energy-matching optimal consensus method for satellite cluster flight are verified through numerical simulations.
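The feedback gains in such a design come from an LQR problem. As a much-reduced illustration (a scalar system, not the paper's coupled period-delayed error dynamics), the LQR gain can be obtained in closed form from the continuous-time algebraic Riccati equation:

```python
import math

def lqr_gain_scalar(a, b, q, r):
    # LQR for xdot = a*x + b*u with cost integral of (q*x^2 + r*u^2) dt.
    # The scalar algebraic Riccati equation 2*a*P - (b*b/r)*P*P + q = 0
    # has the stabilizing (positive) root:
    p = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    return p * b / r  # optimal feedback u = -k*x

# Example: undamped drift (a = 0) with unit weights gives k = 1 and a
# stable closed loop xdot = -x.
k = lqr_gain_scalar(0.0, 1.0, 1.0, 1.0)
```

For the matrix-valued cluster dynamics one would instead solve the matrix Riccati equation (e.g. with a numerical ARE solver) and apply the resulting gain through the cluster's coupling topology, as the abstract describes.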

  14. Overcoming an obstacle in expanding a UMLS semantic type extent.

    PubMed

    Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James

    2012-02-01

    This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. Overcoming an Obstacle in Expanding a UMLS Semantic Type Extent

    PubMed Central

    Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James

    2011-01-01

    This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. PMID:21925287

  16. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    Accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and for small/medium-scale mapping of large areas abroad or with large volumes of imagery. In this paper, aiming at the geometric features of optical satellite imagery, and building on the Alternating Direction Method of Multipliers (ADMM), a widely used method for constrained optimization, together with RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for super-large-scale domestic high-resolution optical satellite imagery - GISIBA (GCP-Independent Satellite Imagery Block Adjustment), which is easy to parallelize and highly efficient. In this method, virtual "average" control points are constructed to solve the rank-defect problem and to support qualitative and quantitative analysis of block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaicking problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments using GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy and performance of the developed procedure are presented and studied.
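The role of the virtual "average" control points can be seen in a toy two-parameter adjustment (an illustration of the rank-defect fix only, not of the RFM or ADMM machinery): tie points between two images constrain only the difference of their bias parameters, so the normal matrix is singular until a virtual observation of their average is appended.

```python
def normal_system(rows, obs):
    # Accumulate N = A^T A and u = A^T y for a two-parameter adjustment.
    n = [[0.0, 0.0], [0.0, 0.0]]
    u = [0.0, 0.0]
    for (a0, a1), y in zip(rows, obs):
        n[0][0] += a0 * a0; n[0][1] += a0 * a1
        n[1][0] += a1 * a0; n[1][1] += a1 * a1
        u[0] += a0 * y;     u[1] += a1 * y
    return n, u

def solve2(n, u):
    # Cramer's rule; a vanishing determinant signals a rank defect.
    det = n[0][0] * n[1][1] - n[0][1] * n[1][0]
    if abs(det) < 1e-12:
        raise ValueError("singular normal matrix (rank defect)")
    return ((u[0] * n[1][1] - u[1] * n[0][1]) / det,
            (n[0][0] * u[1] - n[1][0] * u[0]) / det)

# A tie point observing only the bias difference b1 - b2 = 3:
rows, obs = [(1.0, -1.0)], [3.0]
# solve2(*normal_system(rows, obs)) would raise: rank-deficient system.
# A virtual "average" control point b1 + b2 = 0 restores full rank:
rows.append((1.0, 1.0)); obs.append(0.0)
b1, b2 = solve2(*normal_system(rows, obs))  # → (1.5, -1.5)
```

In the real adjustment the parameters are RFM bias-compensation terms per image and the virtual observations tie the block to its a-priori geolocation; the principle of restoring rank with weak pseudo-observations is the same.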

  17. An estimate of the suspended particulate matter (SPM) transport in the southern North Sea using SeaWiFS images, in situ measurements and numerical model results

    NASA Astrophysics Data System (ADS)

    Fettweis, Michael; Nechad, Bouchra; Van den Eynde, Dries

    2007-06-01

    A study is presented in which satellite images (SeaWiFS), in situ measurements (tidal-cycle and snapshot) and a 2D hydrodynamic numerical model are combined to calculate the long-term SPM (Suspended Particulate Matter) transport through the Dover Strait and in the southern North Sea. The total amount of SPM supplied to the North Sea through the Dover Strait is estimated to be 31.74×10^6 t. The satellite images provide synoptic views of the SPM concentration distribution but do not remove the uncertainty of the SPM transport calculation, because SPM concentration varies as a function of tides, wind, spring-neap tidal cycles and seasons. The short-term variations (tidal and spring-neap cycles) were not found in the satellite images; seasonal variations, however, are clearly visible. Furthermore, the SPM concentration in the satellite images is generally lower than in the in situ measurements. The representativeness of satellite-derived SPM concentration maps for calculating long-term transports has therefore been investigated by comparing the SPM concentration variability in the in situ measurements with that of the remote sensing data. The most important constraints of satellite images are that satellite data are acquired only under clear-sky conditions, whereas in situ measurements from a vessel can also be carried out during rougher meteorological conditions, and that, owing to the low temporal resolution of the satellite images, SPM concentration peaks are often missed. It is underlined that SPM concentration measurements should be carried out during at least one tidal cycle in high-turbidity areas to obtain representative values of SPM concentration.
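The unit bookkeeping behind such a long-term transport figure can be sketched as a simple flux sum (a simplification: the study combined 2D model currents with satellite- and ship-based concentration fields; the values below are invented):

```python
def spm_transport_tonnes(conc_mg_per_l, discharge_m3_per_s, dt_s):
    # Time-integrated SPM flux: concentration × water flux summed over steps.
    # 1 mg/l = 1 g/m^3, so the sum is in grams; convert to tonnes.
    total_g = sum(c * q * dt_s
                  for c, q in zip(conc_mg_per_l, discharge_m3_per_s))
    return total_g / 1e6

# One day of hourly values at a hypothetical 10 mg/l and 1e5 m^3/s of water
# flux through the strait cross-section:
tonnes = spm_transport_tonnes([10.0] * 24, [1e5] * 24, 3600.0)  # → 86400.0
```

The point made in the abstract is precisely that the concentration series entering such a sum is biased if it is sampled only under clear skies and too infrequently to catch tidal peaks.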

  18. Saturn's Misbegotten Moonlets

    NASA Astrophysics Data System (ADS)

    Spitale, Joseph N.

    2017-06-01

    Saturn's rings are interspersed with numerous narrow (tens of km wide) gaps. Two of the largest of these gaps -- Encke and Keeler -- contain satellites -- Pan and Daphnis -- that maintain their respective gaps via the classical Goldreich/Tremaine-style shepherding mechanism, wherein angular momentum is transferred across the essentially empty gap via torques acting between the satellites and the ring. Other prominent gaps are shepherded by resonances with external satellites or planetary modes: Mimas shepherds the outer edge of the B ring, clearing the inner part of the Cassini Division; Titan shepherds the Columbo ringlet/gap; and the Maxwell ringlet/gap is likely maintained by a resonance with a planetary mode. Prior to Cassini, it was expected that all of the gaps would be shepherded in a similar manner. However, many small gaps do not correspond with known resonances, and no satellites were spotted within those gaps during Cassini's prime and extended missions. To address this issue, a series of Cassini imaging observations was planned to examine 11 gaps in the C ring and Cassini Division at a resolution and longitudinal coverage sufficient to either discover the shepherds or rule out their presence. The survey discovered no embedded satellites. Longitudinal coverage was incomplete, but within the longitudes covered by the survey, satellites are ruled out down to sizes in the 100-m range, far too small to keep the observed gaps open. It is possible (about even odds) that a larger satellite resides at a longitude not covered in the survey, but the probability that the survey was unfortunate enough to miss significant satellites in all 11 gaps is exceedingly small (~0.002%). Moreover, these gaps appear in earlier imaging sequences, with some high-resolution coverage, so the true probability is smaller yet. Therefore, a new theory is likely needed to explain the presence of the gaps.

  19. NRL’s Forward Technology Solar Cell Experiment Flies as Part of MISSE-5 Aboard Space Shuttle Discovery Mission

    DTIC Science & Technology

    2006-01-01

    Satellite Service in cooperation with ARISS (Amateur Radio on the International Space Station) and provides a PSK-31 multiuser transponder, an FM voice...interference with existing ARISS missions. PCSat2 has quad redundant transmit inhibits for extravehicular activity safety issues, thus it is easy...to deactivate to avoid any issues with other UHF ARISS experiments that may be activated in the future. Acknowledgments: The authors acknowledge

  20. Year Five of Southeast Atlantic Coastal Ocean Observing System (SEACOOS) Implementation

    DTIC Science & Technology

    2007-12-15

    137 total]. Alvera-Azcarate, A., A. Barth, J.M. Beckers, and R.H. Weisberg, 2007. Multivariate reconstruction of missing data in sea surface...temperature, chlorophyll and wind satellite fields. Jour. Geophys. Res., 112, C03008, doi: 10.1029/2006JC003660. Alvera-Azcarate, A., A. Barth, and R.H...A., J.-M. Beckers, A. Alvera-Azcarate, and R. H. Weisberg, 2007. Filtering inertia-gravity waves from the initial conditions of the linear shallow

  1. Mind the gap: The impact of missing data on the calculation of phytoplankton phenology metrics

    NASA Astrophysics Data System (ADS)

    Cole, Harriet; Henson, Stephanie; Martin, Adrian; Yool, Andrew

    2012-08-01

    Annual phytoplankton blooms are key events in marine ecosystems, and interannual variability in bloom timing has important implications for carbon export and the marine food web. The degree of match or mismatch between the timing of phytoplankton and zooplankton annual cycles may impact larval survival, with knock-on effects at higher trophic levels. Interannual variability in phytoplankton bloom timing may also be used to monitor changes in the pelagic ecosystem that are either naturally or anthropogenically forced. Seasonality metrics that use satellite ocean color data have been developed to quantify the timing of phenological events, allowing for objective comparisons between different regions and over long periods of time. However, satellite data sets are subject to frequent gaps due to clouds and atmospheric aerosols, or persistent data gaps in winter due to low sun angle. Here we quantify the impact of these gaps on determining the start and peak timing of phytoplankton blooms. We use the NASA Ocean Biogeochemical Model that assimilates SeaWiFS data as a gap-free time series and derive an empirical relationship between the percentage of missing data and the error in the phenology metric. Applied globally, we find that the majority of subpolar regions have typical errors of 30 days for the bloom initiation date and 15 days for the peak date. The errors introduced by intermittent data must be taken into account in phenological studies.
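    The abstract does not spell out the metric definitions; as a hedged illustration, the sketch below uses a common threshold-type convention (bloom initiation as the first observation exceeding the annual median by 5% -- an assumption, not necessarily the authors' metric) to show how a cloud gap at the onset shifts the estimated initiation date:

```python
# Minimal sketch of a threshold-based bloom-initiation metric and the effect
# of data gaps. The 5%-above-median convention is a hypothetical stand-in
# for the metrics used in the study.

def bloom_start(chl, threshold_factor=1.05):
    """Index of first value exceeding threshold_factor * median,
    skipping missing observations (None)."""
    valid = sorted(v for v in chl if v is not None)
    if not valid:
        return None
    median = valid[len(valid) // 2]   # middle value (series lengths here are odd)
    for i, v in enumerate(chl):
        if v is not None and v > threshold_factor * median:
            return i
    return None

# Idealized gap-free annual cycle (weekly values) with a spring bloom.
full = [0.2] * 10 + [0.5, 1.2, 2.0, 1.5, 0.8] + [0.3] * 10

# Same series with cloud gaps over the two weeks when the bloom starts.
gappy = list(full)
for i in (10, 11):
    gappy[i] = None

print(bloom_start(full))    # true initiation index: 10
print(bloom_start(gappy))   # estimate shifted later by the gap: 12
```

    The gap pushes the detected initiation two observations later, which is exactly the kind of error the empirical relationship in the study quantifies.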

  2. Exploring quantum computing application to satellite data assimilation

    NASA Astrophysics Data System (ADS)

    Cheung, S.; Zhang, S. Q.

    2015-12-01

    This is an exploratory work on a potential application of quantum computing to a scientific data optimization problem. On classical computational platforms, the physical domain of a satellite data assimilation problem is represented by a discrete variable transform, and classical minimization algorithms are employed to find the optimal solution of the analysis cost function. The computation becomes intensive and time-consuming when the problem involves a large number of variables and data. Quantum computers open a very different approach, both in conceptual programming and in hardware architecture, for solving optimization problems. In order to explore whether we can utilize the quantum computing machine architecture, we formulate a satellite data assimilation experimental case in the form of a quadratic programming optimization problem. We find a transformation of the problem that maps it into the Quadratic Unconstrained Binary Optimization (QUBO) framework. A Binary Wavelet Transform (BWT) is applied to the data assimilation variables for its invertible decomposition, and all calculations in the BWT are performed by Boolean operations. The transformed problem is then solved as QUBO instances defined on the Chimera graphs of the quantum computer.
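    As a toy illustration of the QUBO framework mentioned above (not the authors' assimilation mapping), the sketch below minimizes x^T Q x over binary vectors by exhaustive search; a quantum annealer would instead embed Q on its Chimera graph:

```python
from itertools import product

# Toy QUBO instance: minimize E(x) = x^T Q x over binary vectors x.
# Q is a small symmetric matrix standing in for a (hypothetical) discretized
# assimilation cost; real problems are far too large for exhaustive search.
Q = [[-1.0,  0.5,  0.0],
     [ 0.5, -2.0,  1.0],
     [ 0.0,  1.0, -1.5]]

def qubo_energy(x, Q):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

best = min(product((0, 1), repeat=len(Q)), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))   # (1, 0, 1) with energy -2.5
```

    The exhaustive minimizer plays the role of the annealer's ground-state search; the QUBO mapping itself (cost function to Q matrix) is the part the paper contributes for data assimilation.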

  3. Dual S and Ku-band tracking feed for a TDRS reflector antenna

    NASA Technical Reports Server (NTRS)

    Pullara, J. C.; Bales, C. W.; Kefalas, G. P.; Uyehara, M.

    1974-01-01

    The results are presented of a trade study designed to identify a synchronous satellite antenna system suitable for receiving and transmitting data from lower orbiting satellites at both S- and Ku-bands simultaneously as part of the Tracking and Data Relay Satellite System. All related problems associated with maintaining a data link between two satellites with a Ku-band half-power beamwidth of 0.4 deg are considered, including data link maintenance techniques, beam pointing accuracies, gimbal and servo errors, solar heating, angle tracking schemes, acquisition problems and aids, tracking accuracies versus SNR, antenna feed designs, equipment designs, weight and power budgets, and detailed candidate antenna system designs.

  4. Fidelity of Problem Solving in Everyday Practice: Typical Training May Miss the Mark

    ERIC Educational Resources Information Center

    Ruby, Susan F.; Crosby-Cooper, Tricia; Vanderwood, Michael L.

    2011-01-01

    With national attention on scaling up the implementation of Response to Intervention, problem solving teams remain one of the central components for development, implementation, and monitoring of school-based interventions. Studies have shown that problem solving teams evidence a sound theoretical base and demonstrated efficacy; however, limited…

  5. Preservice Middle and High School Mathematics Teachers' Strategies When Solving Proportion Problems

    ERIC Educational Resources Information Center

    Arican, Muhammet

    2018-01-01

    The purpose of this study was to investigate eight preservice middle and high school mathematics teachers' solution strategies when solving single and multiple proportion problems. Real-world missing-value word problems were used in an interview setting to collect information about preservice teachers' (PSTs) reasoning about proportional…

  6. Exoatmospheric intercepts using zero effort miss steering for midcourse guidance

    NASA Astrophysics Data System (ADS)

    Newman, Brett

    The suitability of proportional navigation, or an equivalent zero effort miss formulation, for exoatmospheric intercepts during midcourse guidance, followed by a ballistic coast to the endgame, is addressed. The problem is formulated in terms of relative motion in a general, three dimensional framework. The proposed guidance law for the commanded thrust vector orientation consists of the sum of two terms: (1) along the line of sight unit direction and (2) along the zero effort miss component perpendicular to the line of sight and proportional to the miss itself and a guidance gain. If the guidance law is to be suitable for longer range targeting applications with significant ballistic coasting after burnout, determination of the zero effort miss must account for the different gravitational accelerations experienced by each vehicle. The proposed miss determination techniques employ approximations for the true differential gravity effect and thus are less accurate than a direct numerical propagation of the governing equations, but more accurate than a baseline determination, which assumes equal accelerations for both vehicles. Approximations considered are constant, linear, quadratic, and linearized inverse square models. Theoretical results are applied to a numerical engagement scenario and the resulting performance is evaluated in terms of the miss distances determined from nonlinear simulation.
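    The constant-differential-gravity variant described above can be sketched as follows; the state vectors, gain, and numbers are illustrative assumptions, not values from the paper:

```python
import math

# Small vector helpers (pure Python, 3-vectors as lists).
def sub(a, b): return [x - y for x, y in zip(a, b)]
def add(a, b): return [x + y for x, y in zip(a, b)]
def scale(a, s): return [x * s for x in a]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def zem_constant_gravity(r, v, dg, t_go):
    """Zero effort miss under the 'constant' differential-gravity model:
    coast the relative state ballistically for t_go with a constant
    gravity difference dg (target minus interceptor)."""
    return add(add(r, scale(v, t_go)), scale(dg, 0.5 * t_go ** 2))

def guidance_accel(r, v, dg, t_go, gain=3.0):
    """Commanded acceleration: proportional to the ZEM component
    perpendicular to the line of sight (gain value is illustrative)."""
    zem = zem_constant_gravity(r, v, dg, t_go)
    u = scale(r, 1.0 / norm(r))                 # line-of-sight unit vector
    zem_perp = sub(zem, scale(u, dot(zem, u)))  # remove along-LOS component
    return scale(zem_perp, gain / t_go ** 2)

# Example relative state (km, km/s), 100 s to go, small constant dg (km/s^2).
r = [100.0, 0.0, 0.0]
v = [-1.0, 0.02, 0.0]
dg = [0.0, -1e-5, 0.0]
print(zem_constant_gravity(r, v, dg, 100.0))
print(guidance_accel(r, v, dg, 100.0))
```

    Swapping in the linear, quadratic, or linearized inverse-square models mentioned in the abstract would only change how `zem_constant_gravity` propagates the relative state.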

  7. Study on miss distance based on projectile shock wave sensor

    NASA Astrophysics Data System (ADS)

    Gu, Guohua; Cheng, Gang; Zhang, Chenjun; Zhou, Lei

    2017-05-01

    The paper establishes miss-distance models based on the physical characteristics of shock waves. Aerodynamic theory shows that the shock wave of a supersonic projectile in flight is generated by the projectile compressing and expanding the ambient atmosphere. The method derives the miss distance from the arrival intervals at the sensors that first catch the shock wave, and addresses problems such as noise filtering against a severe background, dynamic processing of amplifier vibration signals, and electromagnetic compatibility, in order to improve the precision and reliability of capturing the N-wave signals. For the first time, the system can automatically identify the types of projectiles and the firing units, and measure miss distance and azimuth while projectiles are being fired. Applications show that its tactical and technical performance is internationally advanced.

  8. [School absenteeism in Germany: prevalence of excused and unexcused absenteeism and its correlation with emotional and behavioural problems].

    PubMed

    Lenzen, Christoph; Fischer, Gloria; Jentzsch, Anika; Kaess, Michael; Parzer, Peter; Carli, Vladimir; Wasserman, Danuta; Resch, Franz; Brunner, Romuald

    2013-01-01

    Data about the prevalence of school absenteeism and its correlation with emotional and behavioural problems in Germany are scarce, in particular regarding excused absenteeism. This study aims to close this gap by examining a sample of 2,679 pupils attending the different types of secondary school (Hauptschule, Realschule, Gymnasium) who participated in a clinical trial for the prevention of truancy (WE-STAY Project). Pupils' mean age was 14 years (M = 13.94, SD = 0.85, range = 11-19) and the gender distribution was balanced (49.35% males, 50.65% females). Using a self-report questionnaire, pupils were asked on how many days per month, on average, they had missed school during the last school year (excused and unexcused). Emotional and behavioural problems were measured using the "Strengths and Difficulties Questionnaire" (SDQ). 4.1% of the pupils reported having missed school without a valid excuse on more than four days per month (unexcused absenteeism). 6.1% had missed school with an excuse on more than ten days per month (excused absenteeism). Both unexcused and excused absenteeism showed an increase in emotional and behavioural problems with the intensity of absenteeism. In conclusion, these findings show the relevance of school absenteeism in Germany. In the future, more attention should be given to pupils with excused absenteeism as well.

  9. Manifold regularized matrix completion for multi-label learning with ADMM.

    PubMed

    Liu, Bin; Li, Yingming; Xu, Zenglin

    2018-05-01

    Multi-label learning is a common machine learning problem arising from numerous real-world applications in diverse fields, e.g., natural language processing, bioinformatics, and information retrieval. Among various multi-label learning methods, matrix completion has been regarded as a promising approach to transductive multi-label learning. By constructing a joint matrix comprising the feature matrix and the label matrix, the missing labels of test samples are regarded as missing values of the joint matrix. Under the low-rank assumption on the constructed joint matrix, the missing labels can be recovered by minimizing its rank. Despite their success, most matrix completion based approaches ignore the smoothness assumption on unlabeled data, i.e., that neighboring instances should share a similar set of labels, and may thus underexploit the intrinsic structure of the data. In addition, solving the matrix completion problem can be inefficient. To this end, we propose to efficiently solve the multi-label learning problem with an enhanced matrix completion model with manifold regularization, where the graph Laplacian is used to ensure label smoothness over the data manifold. To speed up the convergence of our model, we develop an efficient iterative algorithm that solves the resulting nuclear norm minimization problem with the alternating direction method of multipliers (ADMM). Experiments on both synthetic and real-world data show the promising results of the proposed approach.
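    The paper's nuclear-norm model is more elaborate, but the ADMM splitting pattern it relies on can be shown on a minimal l1-regularized problem whose closed-form solution (soft-thresholding) makes convergence easy to check. This is a generic sketch of ADMM, not the authors' algorithm:

```python
def soft(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return [max(abs(x) - t, 0.0) * (1 if x > 0 else -1) for x in v]

def admm_l1(b, lam, rho=1.0, iters=200):
    """ADMM for min_x 0.5*||x - b||^2 + lam*||x||_1, split as x = z.
    The alternating x / z / dual updates are the same pattern used (with a
    singular-value-thresholding z-step) for nuclear-norm matrix completion."""
    n = len(b)
    x, z, u = [0.0] * n, [0.0] * n, [0.0] * n
    for _ in range(iters):
        # x-update: minimize the smooth term plus the augmented penalty.
        x = [(bi + rho * (zi - ui)) / (1.0 + rho) for bi, zi, ui in zip(b, z, u)]
        # z-update: proximal step on the non-smooth term.
        z = soft([xi + ui for xi, ui in zip(x, u)], lam / rho)
        # Scaled dual update.
        u = [ui + xi - zi for ui, xi, zi in zip(u, x, z)]
    return z

b = [3.0, -0.2, 1.5, -2.0]
print(admm_l1(b, lam=0.5))   # should approach soft(b, 0.5) = [2.5, 0.0, 1.0, -1.5]
```

    In the matrix-completion setting, the z-step's elementwise soft-threshold is replaced by soft-thresholding the singular values of the joint matrix, and the manifold-regularization term enters the x-step.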

  10. An illustrative analysis of technological alternatives for satellite communications

    NASA Technical Reports Server (NTRS)

    Metcalfe, M. R.; Cazalet, E. G.; North, D. W.

    1979-01-01

    The demand for satellite communications services in the domestic market is discussed. Two approaches to increasing system capacity are the expansion of service into frequencies presently allocated but not yet used for satellite communications, and the development of technologies that provide a greater level of service within the currently used frequency bands. The development of economic models and analytic techniques for evaluating capacity expansion alternatives such as these is presented. The satellite orbit-spectrum problem is examined, and some suitable analytic approaches are outlined. An illustrative analysis of domestic communications satellite technology options for providing increased levels of service is also presented. The analysis illustrates the use of probabilities and decision trees in analyzing alternatives, and provides insight into the important aspects of the orbit-spectrum problem that would warrant inclusion in a larger scale analysis.

  11. ESPA: EELV secondary payload adapter with whole-spacecraft isolation for primary and secondary payloads

    NASA Astrophysics Data System (ADS)

    Maly, Joseph R.; Haskett, Scott A.; Wilke, Paul S.; Fowler, E. C.; Sciulli, Dino; Meink, Troy E.

    2000-04-01

    ESPA, the Secondary Payload Adapter for Evolved Expendable Launch Vehicles, addresses two of the major problems currently facing the launch industry: the vibration environment of launch vehicles, and the high cost of putting satellites into orbit. (1) During the 1990s, billions of dollars have been lost due to satellite malfunctions, resulting in total or partial mission failure, which can be directly attributed to vibration loads experienced by payloads during launch. Flight data from several recent launches have shown that whole-spacecraft launch isolation is an excellent solution to this problem. (2) Despite growing worldwide interest in small satellites, launch costs continue to hinder the full exploitation of small satellite technology. Many small satellite users are faced with shrinking budgets, limiting the scope of what can be considered an 'affordable' launch opportunity.

  12. Satellite Fault Diagnosis Using Support Vector Machines Based on a Hybrid Voting Mechanism

    PubMed Central

    Yang, Shuqiang; Zhu, Xiaoqian; Jin, Songchang; Wang, Xiang

    2014-01-01

    Satellite fault diagnosis has an important role in enhancing the safety, reliability, and availability of the satellite system. However, the problems of enormous numbers of parameters and multiple faults pose a challenge to satellite fault diagnosis. The interactions between parameters and the misclassifications arising from multiple faults increase the false alarm rate and the false negative rate. On the other hand, for each satellite fault there is not enough fault data for training, which degrades the performance of most classification algorithms. In this paper, we propose an improved SVM based on a hybrid voting mechanism (HVM-SVM) to deal with the problems of enormous parameters, multiple faults, and small samples. Extensive experimental results show that the accuracy of fault diagnosis using HVM-SVM is improved. PMID:25215324
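    The abstract does not detail the hybrid voting mechanism; the hedged sketch below only illustrates the generic one-vs-one voting idea behind multi-class SVM ensembles, with hypothetical single-telemetry-channel rules standing in for trained SVMs:

```python
from collections import Counter

# One-vs-one voting for multi-class fault diagnosis (sketch). Each pairwise
# "classifier" here is a hypothetical rule on a telemetry sample
# (temperature, bus current); a plain majority vote stands in for the
# paper's hybrid voting mechanism.

def ovo_classifiers():
    return [
        # temperature vs nominal
        lambda s: "thermal_fault" if s[0] > 40.0 else "nominal",
        # bus current vs nominal
        lambda s: "power_fault" if s[1] > 2.5 else "nominal",
        # thermal vs power: which excursion dominates (illustrative scaling)
        lambda s: "thermal_fault" if s[0] - 40.0 > (s[1] - 2.5) * 10 else "power_fault",
    ]

def predict(sample):
    votes = Counter(c(sample) for c in ovo_classifiers())
    return votes.most_common(1)[0][0]

print(predict([55.0, 1.0]))   # high temperature, normal current -> thermal_fault
print(predict([20.0, 4.0]))   # normal temperature, high current -> power_fault
```

    Replacing each lambda with a binary SVM trained on one pair of fault classes recovers the standard one-vs-one SVM scheme that HVM-SVM builds on.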

  13. Alaska/Yukon Geoid Improvement by a Data-Driven Stokes's Kernel Modification Approach

    NASA Astrophysics Data System (ADS)

    Li, Xiaopeng; Roman, Daniel R.

    2015-04-01

    Geoid modeling over Alaska (USA) and Yukon (Canada), being a trans-national issue, faces great challenges, primarily due to the inhomogeneous surface gravity data (Saleh et al., 2013), the dynamic geology (Freymueller et al., 2008), and the region's complex geological rheology. A previous study (Roman and Li, 2014) used updated satellite models (Bruinsma et al., 2013) and newly acquired aerogravity data from the GRAV-D project (Smith, 2007) to capture the gravity field changes in the target areas, primarily at middle-to-long wavelengths. In CONUS, the geoid model was largely improved. However, the precision of the resulting geoid model in Alaska was still at the decimeter level: 19 cm at the 32 tide bench marks and 24 cm at the 202 GPS/leveling bench marks, giving a total of 23.8 cm at all of these calibrated surface control points after the datum bias was removed. Conventional kernel modification methods in this area (Li and Wang, 2011) had limited effect on improving the precision of the geoid models. To compensate for the geoid misfits, a new Stokes's kernel modification method based on a data-driven technique is presented in this study. First, the method was tested on simulated data sets (Fig. 1), where the geoid errors were reduced by two orders of magnitude (Fig. 2). For the real data sets, some iteration steps are required to overcome the rank deficiency problem caused by the limited control data that are irregularly distributed in the target area. For instance, after 3 iterations, the standard deviation dropped by about 2.7 cm (Fig. 3). Modification at other critical degrees can further minimize the geoid model misfits caused either by the gravity error or by the remaining datum error in the control points.

  14. THE HALO MASS FUNCTION CONDITIONED ON DENSITY FROM THE MILLENNIUM SIMULATION: INSIGHTS INTO MISSING BARYONS AND GALAXY MASS FUNCTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faltenbacher, A.; Finoguenov, A.; Drory, N.

    2010-03-20

    The baryon content of high-density regions in the universe is relevant to two critical unanswered questions: the workings of nurture effects on galaxies and the whereabouts of the missing baryons. In this paper, we analyze the distribution of dark matter and semianalytical galaxies in the Millennium Simulation to investigate these problems. Applying the same density field reconstruction schemes as used for the overall matter distribution to the matter locked in halos, we study the mass contribution of halos to the total mass budget at various background field densities, i.e., the conditional halo mass function. In this context, we present a simple fitting formula for the cumulative mass function accurate to <~5% for halo masses between 10^10 and 10^15 h^-1 M_sun. We find that in dense environments the halo mass function becomes top heavy and present corresponding fitting formulae for different redshifts. We demonstrate that the major fraction of matter in high-density fields is associated with galaxy groups. Since current X-ray surveys are able to nearly recover the universal baryon fraction within groups, our results indicate that the major part of the so-far undetected warm-hot intergalactic medium resides in low-density regions. Similarly, we show that the differences in galaxy mass functions with environment seen in observed and simulated data stem predominantly from differences in the mass distribution of halos. In particular, the hump in the galaxy mass function is associated with the central group galaxies, and the bimodality observed in the galaxy mass function is therefore interpreted as that of central galaxies versus satellites.

  15. The role of service areas in the optimization of FSS orbital and frequency assignments

    NASA Technical Reports Server (NTRS)

    Levis, C. A.; Wang, C. W.; Yamamura, Y.; Reilly, C. H.; Gonsalvez, D. J.

    1985-01-01

    A relationship is derived, on a single-entry interference basis, for the minimum allowable spacing between two satellites as a function of electrical parameters and service-area geometries. For circular beams, universal curves relate the topocentric satellite spacing angle to the service-area separation angle measured at the satellite. The corresponding geocentric spacing depends only weakly on the mean longitude of the two satellites, and this is true also for elliptical antenna beams. As a consequence, if frequency channels are preassigned, the orbital assignment synthesis of a satellite system can be formulated as a mixed-integer programming (MIP) problem or approximated by a linear programming (LP) problem, with the interference protection requirements enforced by constraints while some linear function is optimized. Possible objective-function choices are discussed and explicit formulations are presented for the choice of the sum of the absolute deviations of the orbital locations from some prescribed ideal location set. A test problem is posed consisting of six service areas, each served by one satellite, all using elliptical antenna beams and the same frequency channels. Numerical results are given for the three ideal location prescriptions for both the MIP and LP formulations. The resulting scenarios also satisfy reasonable aggregate interference protection requirements.
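    A toy version of the assignment synthesis can make the formulation concrete; here exhaustive search stands in for the MIP/LP solvers, and the slots, ideal locations, and spacing value are illustrative assumptions, not the test problem of the paper:

```python
from itertools import product

# Toy orbital-assignment synthesis. Each satellite gets a slot (degrees of
# longitude) from a discrete grid; a pairwise minimum spacing (as would come
# from the single-entry interference analysis) is enforced as a constraint,
# and the objective is the sum of absolute deviations from the ideal
# locations, the objective choice discussed in the abstract.

ideal = [100.0, 103.0, 106.0]          # prescribed ideal longitudes (deg)
grid = [98.0 + k for k in range(12)]   # candidate slots: 98..109 deg
min_spacing = 4.0                      # required geocentric separation (deg)

def feasible(slots):
    return all(abs(a - b) >= min_spacing
               for i, a in enumerate(slots) for b in slots[i + 1:])

def objective(slots):
    return sum(abs(s, ) if False else abs(s - t) for s, t in zip(slots, ideal))

best = min((s for s in product(grid, repeat=len(ideal)) if feasible(s)),
           key=objective)
print(best, objective(best))   # (99.0, 103.0, 107.0) with deviation 2.0
```

    Note that the ideal spacing of 3 deg is infeasible under the 4 deg constraint, so the optimizer spreads the outer satellites outward -- the same trade-off the MIP/LP formulations resolve at full scale.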

  16. High voltage plasma sheath analysis related to TSS-1

    NASA Technical Reports Server (NTRS)

    Sheldon, John W.

    1990-01-01

    On the first mission of the Tethered Satellite System (TSS-1), a 1.8 m diameter spherical satellite will be deployed a distance of 20 km above the Space Shuttle Orbiter on an insulated conducting tether. The satellite will be held at electric potentials up to 5000 volts positive with respect to the ambient plasma. Due to the passage of the conducting tether through the Earth's magnetic field, an electromotive force (EMF) will be created, driving electrons down the tether to the Orbiter, out through an electron gun into the ionosphere, and back into the positively biased satellite. The main problem addressed here is the current-voltage characteristics of the ionospheric interaction with the satellite. The first problem is that, while the satellite will be capable of measuring charged particle flow to the surface at several locations, the detectors have a limited range of acceptance angles. The second problem is that the angle of incidence of the incoming electrons will have to be determined relative to the local normal. This will be important in order to predict the magnitude of the detectable current at each detector location so the detector gain can be pre-set to the correct range. The plasma sheath was analyzed mathematically, and subroutines were written to solve relevant finite element, Taylor-Vlasov, and Poisson equations.

  17. Mission planning optimization of video satellite for ground multi-object staring imaging

    NASA Astrophysics Data System (ADS)

    Cui, Kaikai; Xiang, Junhua; Zhang, Yulin

    2018-03-01

    This study investigates the emergency scheduling problem of ground multi-object staring imaging for a single video satellite. In the proposed mission scenario, the ground objects require a specified duration of staring imaging by the video satellite. The planning horizon is not long, i.e., it is usually shorter than one orbit period. A binary decision variable and the imaging order are used as the design variables, and the total observation revenue combined with the influence of the total attitude maneuvering time is regarded as the optimization objective. Based on the constraints of the observation time windows, satellite attitude adjustment time, and satellite maneuverability, a constraint satisfaction mission planning model is established for ground object staring imaging by a single video satellite. Further, a modified ant colony optimization algorithm with tabu lists (Tabu-ACO) is designed to solve this problem. The proposed algorithm can fully exploit the intelligence and local search ability of ACO. Based on full consideration of the mission characteristics, the design of the tabu lists can reduce the search range of ACO and improve the algorithm efficiency significantly. The simulation results show that the proposed algorithm outperforms the conventional algorithm in terms of optimization performance, and it can obtain satisfactory scheduling results for the mission planning problem.
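    The details of Tabu-ACO are not given in the abstract; the sketch below shows only the generic ACO machinery with a per-ant tabu list, applied to a TSP-like core of the problem (minimizing total attitude-maneuver time over a visit order) with made-up numbers. The paper's full model also handles time windows and observation revenue:

```python
import random

random.seed(7)

# Symmetric maneuver-time matrix between 5 hypothetical imaging targets (s).
T = [[0, 20, 35, 12, 25],
     [20, 0, 18, 30, 16],
     [35, 18, 0, 22, 28],
     [12, 30, 22, 0, 14],
     [25, 16, 28, 14, 0]]
n = len(T)
pher = [[1.0] * n for _ in range(n)]     # pheromone trails

def tour_cost(tour):
    return sum(T[a][b] for a, b in zip(tour, tour[1:]))

def build_tour(alpha=1.0, beta=2.0):
    tour, tabu = [0], {0}                # tabu list: targets already scheduled
    while len(tour) < n:
        cur = tour[-1]
        cand = [j for j in range(n) if j not in tabu]
        w = [pher[cur][j] ** alpha * (1.0 / T[cur][j]) ** beta for j in cand]
        nxt = random.choices(cand, weights=w)[0]
        tour.append(nxt)
        tabu.add(nxt)
    return tour

best = None
for _ in range(40):                      # colony iterations
    for t in (build_tour() for _ in range(10)):
        if best is None or tour_cost(t) < tour_cost(best):
            best = t
    for i in range(n):                   # evaporation
        for j in range(n):
            pher[i][j] *= 0.9
    for a, b in zip(best, best[1:]):     # deposit on best-so-far tour
        pher[a][b] += 1.0 / tour_cost(best)
        pher[b][a] += 1.0 / tour_cost(best)

print(best, tour_cost(best))
```

    The paper's contribution is in shrinking the candidate lists (its tabu-list design) and encoding mission constraints, not in the basic pheromone loop shown here.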

  18. Spatial resolution and frequency of satellite data acquisition for multi-temporal analysis of environment

    NASA Astrophysics Data System (ADS)

    Tanaka, S.; Sugimura, T.; Kameda, K.

    1992-07-01

    The environmental monitoring capacity by satellite depends upon the spatial resolution and the acquisition frequency it provides. The information on environmental change obtained by Landsat, the first earth observation satellite, was a rectangular reclamation area on Tokyo Bay meaning only a few square kilometers. However, multi-temporal SPOT/HRV data enables newly built small buildings meaning just ten square meters or so to be detected. Environmental changes of the global dimensions are today attracting world attention. In Japan, the major environmental problems are decaying cedar forests due to acid rain, decaying pine forests due to the pine beetle, landslides due to left-cut forests and problem resulting from agricultural chemicals on golf courses. All of these pose a national problem, but each is a phenomenon which covers an area of a few meters square at the largest. The existing earth observation satellites are unable to monitor these seemingly small sized environmental changes. For this, satellites with a spatial resolution of a few meters only or less than a meter are required. This situation becomes apparent when specific cases are examined, and it is expected considering the speed of past sensor development satellite observation systems providing this capacity will most probably be developed by the year 2020.

  19. An asymptotic method for estimating the vertical ozone distribution in the Earth's atmosphere from satellite measurements of backscattered solar UV-radiation

    NASA Technical Reports Server (NTRS)

    Ishov, Alexander G.

    1994-01-01

    An asymptotic approach to solution of the inverse problems of remote sensing is presented. It consists in changing integral operators characteristic of outgoing radiation into their asymptotic analogues. Such approach does not add new principal uncertainties into the problem and significantly reduces computation time that allows to develop the real (or about) time algorithms for interpretation of satellite measurements. The asymptotic approach has been realized for estimating vertical ozone distribution from satellite measurements of backscatter solar UV radiation in the Earth's atmosphere.

  20. Inferential precision in single-case time-series data streams: how well does the em procedure perform when missing observations occur in autocorrelated data?

    PubMed

    Smith, Justin D; Borckardt, Jeffrey J; Nash, Michael R

    2012-09-01

    The case-based time-series design is a viable methodology for treatment outcome research. However, the literature has not fully addressed the problem of missing observations in such autocorrelated data streams. Namely, to what extent do missing observations compromise inference when observations are not independent? Do the available missing data replacement procedures preserve inferential integrity? Does the extent of autocorrelation matter? We use Monte Carlo simulation modeling of a single-subject intervention study to address these questions. We find power sensitivity to be within acceptable limits across four proportions of missing observations (10%, 20%, 30%, and 40%) when missing data are replaced using the Expectation-Maximization algorithm, more commonly known as the EM Procedure (Dempster, Laird, & Rubin, 1977). This applies to data streams with lag-1 autocorrelation estimates under 0.80. As autocorrelation estimates approach 0.80, the replacement procedure yields an unacceptable power profile. The implications of these findings and directions for future research are discussed.
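    A simplified EM-style iteration for an AR(1) stream can illustrate the kind of procedure studied above; this sketch alternates conditional-expectation imputation with re-estimation of the lag-1 autocorrelation, and is a simplification, not the full EM Procedure of Dempster et al.:

```python
import random

# Simplified EM-style imputation for an autocorrelated (zero-mean AR(1))
# data stream: alternate between (E) filling each missing interior point
# with its conditional expectation phi*(x[t-1] + x[t+1]) / (1 + phi^2)
# under the current AR(1) fit, and (M) re-estimating phi.

random.seed(1)
phi_true, n = 0.6, 400
x = [random.gauss(0, 1)]
for _ in range(n - 1):
    x.append(phi_true * x[-1] + random.gauss(0, 1))

# Remove isolated interior points (their neighbors stay observed).
missing = set(range(20, n - 20, 10))
y = [None if t in missing else v for t, v in enumerate(x)]

def lag1(series):
    """Lag-1 autocorrelation estimate (Yule-Walker ratio form)."""
    return sum(a * b for a, b in zip(series, series[1:])) / \
           sum(a * a for a in series)

# Initialize missing values to the process mean (0), then iterate E and M.
z = [0.0 if v is None else v for v in y]
phi = 0.0
for _ in range(20):
    phi = lag1(z)                        # M-step: re-fit phi
    for t in missing:                    # E-step: conditional expectations
        z[t] = phi * (z[t - 1] + z[t + 1]) / (1 + phi * phi)

print(round(phi, 3))                     # recovered lag-1 estimate, near 0.6
```

    As the abstract notes, performance of such replacement degrades as the autocorrelation grows; rerunning the sketch with `phi_true` near 0.8 and larger gap fractions shows the imputed values shrinking the estimate.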

  1. Solar array stepping problems in satellites and solutions

    NASA Astrophysics Data System (ADS)

    Maharana, P. K.; Goel, P. S.

    1992-01-01

    The dynamics problems arising from the stepping motion of spacecraft solar arrays are studied. To overcome these problems, design improvements in the drive logic, based on phase plane analysis, are suggested. The improved designs are applied to the Solar Array Drive Assembly (SADA) of the IRS-1B and INSAT-2A satellites. In addition, an alternate torquing strategy for successful slewing of the arrays with minimum excitation of flexible modes is proposed.

  2. Experiments on the CMB Spectrum, Big Jets Model and Their Implications for the Missing Half of the Universe

    NASA Astrophysics Data System (ADS)

    Hsu, Leonardo; Hsu, Jong-Ping

    2018-01-01

    Based on the limiting continuation of Lorentz-Poincaré invariance, we propose an alternative formulation of the generalized Planck distribution for inertial and noninertial frames. The Lorentz invariant Planck distribution law leads to a new physical interpretation of the dipole anisotropy of the Cosmic Microwave Background. The Big Jets model predicts a distant `antimatter blackbody,' whose radiation could make 50% of the sky very slightly warmer than the isotropic CMB temperature TCMB, with a cosine angular dependence. The other 50% of the sky has the same isotropic temperature TCMB. Thus, we could have a pseudo-dipole anisotropy because the microwaves emitted from the antimatter blackbody are totally absorbed by our matter blackbody. We suggest that accurate data from satellite experiments might be used to search for the pseudo-dipole anisotropy and the missing half of the antimatter universe.

  3. Materials samples face rigors of space.

    PubMed

    Flinn, Edward D

    2002-07-01

    The Materials International Space Station Experiment (MISSE) is described. This project is designed to conduct long-duration materials tests on samples attached to the ISS. A batch of 750 material samples was delivered on STS-105 and attached to the ISS airlock. They will be exposed to the space environment for 18 months and are slated to return on STS-114. A second batch of 750 samples is being prepared. The experiment containers were used originally for the Mir Environmental Effects Payload, which tested a variety of substances, including some slated for use on the ISS. Researchers are particularly interested in the effects of atomic oxygen on the samples. Some samples are being tested to determine their use in radiation protection. As part of the MISSE project, ultrathin tether materials are being tested for use on the Propulsive Small Expendable Deployer System (ProSEDS), which will use a tether system to change a satellite's orbital altitude.

  4. Magnitude of dental caries, missing and filled teeth in Malawi: National Oral Health Survey.

    PubMed

    Msyamboza, Kelias Phiri; Phale, Enock; Namalika, Jessie Mlotha; Mwase, Younam; Samonte, Gian Carlo; Kajirime, Doubt; Sumani, Sewedi; Chalila, Pax D; Potani, Rennie; Chithope-Mwale, George; Kathyola, Damson; Mukiwa, Weston

    2016-03-09

    Oral health problems are a significant cause of morbidity, particularly in sub-Saharan Africa. In Malawi, routine health management information system data over the years showed that oral health problems were one of the top ten reasons for outpatient attendance. However, to date, no national oral health survey has been carried out to determine the prevalence of oral health problems. A national population-based cross-sectional survey was conducted in 2013. A total of 130 enumeration areas (EAs) were randomly selected and, from each EA, 40 participants were randomly selected as per the WHO STEPS survey protocol. Eligible participants were 12, 15, 35-44 and 65-74 years old. A multi-stage sampling design was used to obtain a nationally representative sample of these age groups. Oral examination was based on WHO diagnostic criteria (2010). A total of 5400 participants were enrolled in the survey. Of these: 3304 (61.3 %) were females and 2090 (38.7 %) were males; 327 (6.9 %) were from urban and 4386 (93.1 %) from rural areas; 1115 (20.6 %), 993 (17.3 %), 2306 (42.7 %) and 683 (12.6 %) were aged 12, 15, 35-44 and 65-74 years respectively. Among the 12 year-old, 15 year-old, 35-44 and 65-74 year age groups, the prevalence of dental caries was 19.1, 21.9, 49.0 and 49.2 % respectively, overall 37.4 %. Prevalence of missing teeth was 2.7, 5.2, 47.7 and 79.9 %, overall 35.2 %. Prevalence of filled teeth was 0.2 %, 1.3 %, 8.7 % and 12.7 %, overall 6.5 %. Prevalence of bleeding gums was 13.0, 11.8, 30.8 and 36.1 %, overall 23.5 %. Toothache, dental caries and missing teeth were more common in females than males: 46.5 % vs 37.9 %, 40.5 % vs 32.4 % and 37.7 % vs 30.1 % respectively, all p < 0.05. Prevalence of dental caries and missing teeth in urban areas was as high as in rural areas: 33.3 % vs 37.4 % and 30.9 % vs 33.7 % respectively, all p > 0.05. The mean number of decayed, missing and filled teeth (DMFT) in 12, 15, 35-44 and 65-74 year olds was 0.67, 0.71, 3.11 and 6.87 respectively.
Self-reported brushing of teeth was poor, with only 35.2 % of people brushing their teeth twice a day, and tobacco smoking was high, particularly among adult males, where one in five (22.9 %) was a smoker. This study demonstrated that oral health problems are a major public health problem in Malawi. One in five (21 %) adolescents aged 12-15 years and half (49 %) of adults aged 35 years or more had dental caries, and half (48 %) and 80 % of the population aged 35-44 and 65-74 years, respectively, had missing teeth. Toothache, dental caries and missing teeth were more prevalent in females than males, and prevalence in urban areas was as high as in rural areas. Oral hygiene was poor, with less than 40 % of the population brushing their teeth twice a day, and tobacco smoking was high, particularly in men, where prevalence was 23 %. These findings could be used to develop an evidence-informed national policy, an action and resource mobilization plan, and community-based interventions to reduce the prevalence of oral health problems in Malawi.

  5. Japanese propagation experiments with ETS-5

    NASA Technical Reports Server (NTRS)

    Ikegami, Tetsushi

    1989-01-01

    Propagation experiments for maritime, aeronautical, and land mobile satellite communications were performed using the Engineering Test Satellite-Five (ETS-5). The propagation experiments are one of the major missions of the Experimental Mobile Satellite System (EMSS), which is aimed at establishing basic technology for future general mobile satellite communication systems. A brief introduction is presented to the experimental results on propagation problems of ETS-5/EMSS.

  6. Multiple imputation of missing fMRI data in whole brain analysis

    PubMed Central

    Vaden, Kenneth I.; Gebregziabher, Mulugeta; Kuchinsky, Stefanie E.; Eckert, Mark A.

    2012-01-01

    Whole brain fMRI analyses rarely include the entire brain because of missing data that result from data acquisition limits and susceptibility artifact, in particular. This missing data problem is typically addressed by omitting voxels from analysis, which may exclude brain regions that are of theoretical interest and increase the potential for Type II error at cortical boundaries or Type I error when spatial thresholds are used to establish significance. Imputation could significantly expand statistical map coverage, increase power, and enhance interpretations of fMRI results. We examined multiple imputation for group level analyses of missing fMRI data using methods that leverage the spatial information in fMRI datasets for both real and simulated data. Available case analysis, neighbor replacement, and regression based imputation approaches were compared in a general linear model framework to determine the extent to which these methods quantitatively (effect size) and qualitatively (spatial coverage) increased the sensitivity of group analyses. In both real and simulated data analysis, multiple imputation provided 1) variance that was most similar to estimates for voxels with no missing data, 2) fewer false positive errors in comparison to mean replacement, and 3) fewer false negative errors in comparison to available case analysis. Compared to the standard analysis approach of omitting voxels with missing data, imputation methods increased brain coverage in this study by 35% (from 33,323 to 45,071 voxels). In addition, multiple imputation increased the size of significant clusters by 58% and number of significant clusters across statistical thresholds, compared to the standard voxel omission approach. While neighbor replacement produced similar results, we recommend multiple imputation because it uses an informed sampling distribution to deal with missing data across subjects that can include neighbor values and other predictors. 
Multiple imputation is anticipated to be particularly useful for 1) large fMRI data sets with inconsistent missing voxels across subjects and 2) addressing the problem of increased artifact at ultra-high field, which significantly limit the extent of whole brain coverage and interpretations of results. PMID:22500925
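    The regression-based multiple imputation with spatial information described above can be sketched in miniature. This is a hypothetical illustration, not the authors' implementation: the neighbor-mean predictor, function names, and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def impute_voxel(values, neighbors, n_imputations=5):
    """Multiply impute missing subject values for one voxel.

    values    : (n_subjects,) array with np.nan marking missing entries
    neighbors : (n_subjects, n_neighbors) values of spatially adjacent voxels
    Returns a list of n_imputations completed copies of `values`.
    """
    missing = np.isnan(values)
    if not missing.any():
        return [values.copy() for _ in range(n_imputations)]
    obs = ~missing
    # regress the voxel on the mean of its neighbors using observed subjects
    x = neighbors.mean(axis=1)
    slope, intercept = np.polyfit(x[obs], values[obs], 1)
    resid_sd = np.std(values[obs] - (slope * x[obs] + intercept))
    completed = []
    for _ in range(n_imputations):
        draw = values.copy()
        # add residual noise so between-imputation variance is preserved
        draw[missing] = (slope * x[missing] + intercept
                         + rng.normal(0, resid_sd, missing.sum()))
        completed.append(draw)
    return completed
```

Each completed copy would then be analyzed in the group-level general linear model and the results pooled across imputations.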

  7. A comparison of multiple imputation methods for handling missing values in longitudinal data in the presence of a time-varying covariate with a non-linear association with time: a simulation study.

    PubMed

    De Silva, Anurika Priyanjali; Moreno-Betancur, Margarita; De Livera, Alysha Madhu; Lee, Katherine Jane; Simpson, Julie Anne

    2017-07-25

    Missing data is a common problem in epidemiological studies, and is particularly prominent in longitudinal data, which involve multiple waves of data collection. Traditional multiple imputation (MI) methods (fully conditional specification (FCS) and multivariate normal imputation (MVNI)) treat repeated measurements of the same time-dependent variable as just another 'distinct' variable for imputation and therefore do not make the most of the longitudinal structure of the data. Only a few studies have explored extensions to the standard approaches to account for the temporal structure of longitudinal data. One suggestion is the two-fold fully conditional specification (two-fold FCS) algorithm, which restricts the imputation of a time-dependent variable to time blocks where the imputation model includes measurements taken at the specified and adjacent times. To date, no study has investigated the performance of two-fold FCS and standard MI methods for handling missing data in a time-varying covariate with a non-linear trajectory over time - a commonly encountered scenario in epidemiological studies. We simulated 1000 datasets of 5000 individuals based on the Longitudinal Study of Australian Children (LSAC). Three missing data mechanisms: missing completely at random (MCAR), and a weak and a strong missing at random (MAR) scenarios were used to impose missingness on body mass index (BMI) for age z-scores; a continuous time-varying exposure variable with a non-linear trajectory over time. We evaluated the performance of FCS, MVNI, and two-fold FCS for handling up to 50% of missing data when assessing the association between childhood obesity and sleep problems. The standard two-fold FCS produced slightly more biased and less precise estimates than FCS and MVNI. We observed slight improvements in bias and precision when using a time window width of two for the two-fold FCS algorithm compared to the standard width of one. 
We recommend the use of FCS or MVNI in similar longitudinal settings and, when convergence issues arise due to a large number of time points or variables with missing values, the two-fold FCS algorithm with exploration of a suitable time window width.
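    A minimal sketch of the two-fold FCS idea, restricting each wave's imputation model to waves within a given time window, might look like the following. The function, its parameters, and the linear-regression-with-noise imputation model are illustrative assumptions, not the algorithm as implemented in the study.

```python
import numpy as np
import pandas as pd

def two_fold_fcs(wide, n_sweeps=5, window=1, rng=None):
    """Sketch of two-fold FCS: impute each wave of a repeated measure
    using only the waves within `window` positions of it.

    wide : DataFrame with one column per wave (NaN marks missing values).
    """
    rng = rng or np.random.default_rng(0)
    data = wide.copy()
    waves = list(data.columns)
    # bootstrap with simple mean imputation, then iterate
    filled = data.apply(lambda c: c.fillna(c.mean()))
    for _ in range(n_sweeps):
        for t, col in enumerate(waves):
            miss = data[col].isna()
            if not miss.any():
                continue
            # imputation model uses only adjacent waves (the time block)
            block = [waves[j]
                     for j in range(max(0, t - window),
                                    min(len(waves), t + window + 1))
                     if j != t]
            X = np.column_stack([np.ones(len(filled)),
                                 filled[block].to_numpy()])
            y = filled[col].to_numpy()
            obs = ~miss.to_numpy()
            beta, *_ = np.linalg.lstsq(X[obs], y[obs], rcond=None)
            pred = X @ beta
            resid_sd = np.std(y[obs] - pred[obs])
            # stochastic draw, not a deterministic prediction
            filled.loc[miss, col] = pred[~obs] + rng.normal(
                0, resid_sd, miss.sum())
    return filled
```

In a full multiple-imputation analysis this sweep would be repeated to produce several completed data sets, with estimates pooled by Rubin's rules.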

  8. Automating Web Collection and Validation of GPS data for Longitudinal Urban Travel Studies

    DOT National Transportation Integrated Search

    2012-08-01

    Traditional paper and phone travel surveys are expensive, time-consuming, and have problems of missing trips, illogical trip sequences, and imprecise travel time. GPS-based travel surveys can avoid many of these problems and are becoming increasing...

  9. Clustering and variable selection in the presence of mixed variable types and missing data.

    PubMed

    Storlie, C B; Myers, S M; Katusic, S K; Weaver, A L; Voigt, R G; Croarkin, P E; Stoeckel, R E; Port, J D

    2018-05-17

    We consider the problem of model-based clustering in the presence of many correlated, mixed continuous, and discrete variables, some of which may have missing values. Discrete variables are treated with a latent continuous variable approach, and the Dirichlet process is used to construct a mixture model with an unknown number of components. Variable selection is also performed to identify the variables that are most influential for determining cluster membership. The work is motivated by the need to cluster patients thought to potentially have autism spectrum disorder on the basis of many cognitive and/or behavioral test scores. There are a modest number of patients (486) in the data set along with many (55) test score variables (many of which are discrete valued and/or missing). The goal of the work is to (1) cluster these patients into similar groups to help identify those with similar clinical presentation and (2) identify a sparse subset of tests that inform the clusters in order to eliminate unnecessary testing. The proposed approach compares very favorably with other methods via simulation of problems of this type. The results of the autism spectrum disorder analysis suggested 3 clusters to be most likely, while only 4 test scores had high (>0.5) posterior probability of being informative. This will result in much more efficient and informative testing. The need to cluster observations on the basis of many correlated, continuous/discrete variables with missing values is a common problem in the health sciences as well as in many other disciplines. Copyright © 2018 John Wiley & Sons, Ltd.
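    The abstract gives no implementation details, but the Dirichlet-process construction that yields a mixture with an unknown number of components can be illustrated by simulating its induced clustering prior, the Chinese restaurant process. This is a hypothetical sketch of the prior only, not the authors' inference code.

```python
import numpy as np

def crp_partition(n, alpha, rng):
    """Draw a random partition of n items from the Chinese restaurant
    process, the clustering prior implied by a Dirichlet process with
    concentration parameter alpha; the number of clusters is not fixed."""
    assignments = [0]      # first item starts the first cluster
    counts = [1]           # cluster occupancy counts
    for _ in range(1, n):
        # join an existing cluster with prob. proportional to its size,
        # or open a new cluster with prob. proportional to alpha
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts
```

Larger alpha favors more clusters a priori; the data then concentrate the posterior on a particular number of components (three, in the analysis reported above).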

  10. A regressive methodology for estimating missing data in rainfall daily time series

    NASA Astrophysics Data System (ADS)

    Barca, E.; Passarella, G.

    2009-04-01

    The presence of gaps in environmental data time series is a very common but extremely critical problem, since it can produce biased results (Rubin, 1976). Missing data plague almost all surveys; the problem is how to deal with them once it has been deemed impossible to recover the actual missing values. Apart from the amount of missing data, another issue which plays an important role in the choice of a recovery approach is the evaluation of the missingness mechanism. When missingness is conditioned by some other variable observed in the data set (Schafer, 1997), the mechanism is called MAR (Missing At Random). When the missingness mechanism depends on the actual value of the missing data, it is called NMAR (Not Missing At Random); this is the most difficult condition to model. In the last decade, interest arose in estimating missing data by regression (single imputation). More recently, multiple imputation has also become available, which returns a distribution of estimated values (Scheffer, 2002). In this paper an automatic methodology for estimating missing data is presented. In practice, given a gauging station affected by missing data (the target station), the methodology checks the randomness of the missing data and quantifies the similarity between the target station and the other gauging stations spread over the study area. Among the different methods for defining the degree of similarity, whose effectiveness strongly depends on the data distribution, the Spearman correlation coefficient was chosen. Once the similarity matrix is defined, a suitable nonparametric, univariate regression method, the Theil method (Theil, 1950), is applied to estimate the missing data in the target station. Even though the methodology proved to be rather reliable, the estimation of missing data can be improved by generalization.
A first possible improvement consists in extending the univariate technique to a multivariate approach. Another follows the paradigm of multiple imputation (Rubin, 1987; Rubin, 1988), which consists in using a set of similar stations instead of only the most similar one. In this way, an estimation range can be determined, allowing uncertainty to be introduced. Finally, time series can be grouped on the basis of monthly rainfall rates into classes of wetness (i.e., dry, moderately rainy and rainy), so that the estimation uses homogeneous data subsets. We expect that integrating the methodology with these enhancements will certainly improve its reliability. The methodology was applied to the daily rainfall time series registered in the Candelaro River Basin (Apulia, South Italy) from 1970 to 2001. REFERENCES D.B. Rubin, 1976. Inference and Missing Data. Biometrika 63, 581-592. D.B. Rubin, 1987. Multiple Imputation for Nonresponse in Surveys. New York: John Wiley & Sons, Inc. D.B. Rubin, 1988. An overview of multiple imputation. In Survey Research Section, pp. 79-84, American Statistical Association. J.L. Schafer, 1997. Analysis of Incomplete Multivariate Data. Chapman & Hall. J. Scheffer, 2002. Dealing with Missing Data. Res. Lett. Inf. Math. Sci. 3, 153-160. Available online at http://www.massey.ac.nz/~wwiims/research/letters/ H. Theil, 1950. A rank-invariant method of linear and polynomial regression analysis. Indagationes Mathematicae, 12, pp. 85-91.
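    The core estimation step, selecting the most Spearman-similar station and applying the nonparametric Theil (Theil-Sen) regression line, can be sketched with SciPy as follows. This is a simplified, hypothetical illustration; the function name and the single-best-station choice are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr, theilslopes

def fill_target_gaps(target, stations):
    """Estimate gaps (NaNs) in the `target` rainfall series from the most
    Spearman-similar station, using the robust Theil (Theil-Sen) line."""
    best, best_rho = None, -np.inf
    for s in stations:
        ok = ~np.isnan(target) & ~np.isnan(s)
        rho, _ = spearmanr(target[ok], s[ok])
        if rho > best_rho:
            best, best_rho = s, rho
    # fit the Theil-Sen line on jointly observed days
    ok = ~np.isnan(target) & ~np.isnan(best)
    slope, intercept, *_ = theilslopes(target[ok], best[ok])
    filled = target.copy()
    gaps = np.isnan(target) & ~np.isnan(best)
    filled[gaps] = intercept + slope * best[gaps]
    return filled
```

The multivariate and multiple-imputation extensions discussed above would replace the single best station with a set of similar stations, yielding a range of estimates rather than a point value.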

  11. Automatic Annotation Method on Learners' Opinions in Case Method Discussion

    ERIC Educational Resources Information Center

    Samejima, Masaki; Hisakane, Daichi; Komoda, Norihisa

    2015-01-01

    Purpose: The purpose of this paper is to automatically annotate learners' opinions with an attribute of a problem, a solution, or no annotation, in order to support the learners' discussion without a facilitator. The case method aims at discussing problems and solutions in a target case. However, the learners miss discussing some of the problems and solutions.…

  12. Students' Performance on Missing-Value Word Problems: A Cross-National Developmental Study

    ERIC Educational Resources Information Center

    Jiang, Ronghuan; Li, Xiaodong; Fernández, Ceneida; Fu, Xinchen

    2017-01-01

    This study investigates Spanish and Chinese students' performance on both addition problems and proportion problems from a cross-national perspective. The effects of number structure and the nature of quantities were also considered. Nine hundred twenty-five 4th to 8th graders (453 Chinese, 472 Spanish) took a test composed of addition…

  13. Event-Based Variance-Constrained $\mathcal{H}_{\infty}$ Filtering for Stochastic Parameter Systems Over Sensor Networks With Successive Missing Measurements.

    PubMed

    Wang, Licheng; Wang, Zidong; Han, Qing-Long; Wei, Guoliang

    2018-03-01

    This paper is concerned with the distributed filtering problem for a class of discrete time-varying stochastic parameter systems with error variance constraints over a sensor network whose sensor outputs are subject to successive missing measurements. The phenomenon of successive missing measurements at each sensor is modeled via a sequence of mutually independent random variables obeying the Bernoulli distribution. To reduce the frequency of unnecessary data transmissions and alleviate the communication burden, an event-triggered mechanism is introduced for each sensor node such that only vitally important data are transmitted to its neighboring sensors when specific events occur. The objective is to design a time-varying filter such that both the $\mathcal{H}_{\infty}$ performance requirements and the variance constraints are guaranteed over a given finite horizon against the random parameter matrices, successive missing measurements, and stochastic noises. By resorting to stochastic analysis techniques, sufficient conditions are established to ensure the existence of the time-varying filters, whose gain matrices are then explicitly characterized in terms of the solutions to a series of recursive matrix inequalities. A numerical simulation example is provided to illustrate the effectiveness of the developed event-triggered distributed filter design strategy.
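    The two measurement-side ingredients, the Bernoulli missing-measurement model and a send-on-delta event trigger, can be illustrated together in a toy sensor model. Everything here (names, the zero-output convention for missing data, the threshold rule) is a hypothetical sketch, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sensor_stream(y_true, p_arrive, threshold):
    """Simulate one sensor: each output arrives with Bernoulli probability
    p_arrive (missing modeled as a zero output), and a value is transmitted
    only when it differs from the last transmitted value by more than
    `threshold` (the event-triggering condition)."""
    last_sent = 0.0
    sent = []
    for y in y_true:
        gamma = rng.random() < p_arrive   # Bernoulli arrival indicator
        y_recv = y if gamma else 0.0      # successive missing measurement
        if abs(y_recv - last_sent) > threshold:
            last_sent = y_recv
            sent.append(y_recv)           # event fires: transmit
        else:
            sent.append(None)             # no transmission this step
    return sent
```

Raising the threshold lowers the communication rate at the cost of filter accuracy, which is the trade-off the event-triggered design manages.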

  14. Preliminary PANSAT ground station software design and use of an expert system to analyze telemetry

    NASA Astrophysics Data System (ADS)

    Lawrence, Gregory W.

    1994-03-01

    The Petite Amateur Navy Satellite (PANSAT) is a communications satellite designed to be used by civilian amateur radio operators. A master ground station is being built at the Naval Postgraduate School. This computer system performs satellite commands, displays telemetry, troubleshoots problems, and passes messages. The system also controls an open loop tracking antenna. This paper concentrates on the telemetry display, decoding, and interpretation through artificial intelligence (AI). The telemetry is displayed in an easily interpretable format, so that any user can understand the current health of the satellite and be cued as to any problems and possible solutions. Only the master ground station has the ability to receive all telemetry and send commands to the spacecraft; civilian ham users do not have access to this information. The telemetry data is decommutated and analyzed before it is displayed to the user, so that the raw data will not have to be interpreted by ground users. The analysis will use CLIPS embedded in the code, and derive its inputs from telemetry decommutation. The program is an expert system using a forward chaining set of rules based on the expected operation and parameters of the satellite. By building the rules during the construction and design of the satellite, the telemetry can be well understood and interpreted after the satellite is launched and the designers may no longer be available to provide input to the problem.
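    The forward-chaining rule evaluation described above can be shown in miniature. The actual system uses CLIPS; this Python sketch with made-up rule names and telemetry fields only illustrates the idea of firing every rule whose condition holds on the decommutated telemetry.

```python
def evaluate_telemetry(telemetry, rules):
    """Tiny forward-chaining pass: `rules` is a list of (condition,
    diagnosis) pairs; fire every rule whose condition holds for the
    decommutated telemetry dictionary and collect the diagnoses."""
    findings = []
    for condition, diagnosis in rules:
        if condition(telemetry):
            findings.append(diagnosis)
    return findings
```

A real expert system would also chain: fired rules assert new facts that can trigger further rules, which is what CLIPS provides out of the box.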

  15. Missing-value estimation using linear and non-linear regression with Bayesian gene selection.

    PubMed

    Zhou, Xiaobo; Wang, Xiaodong; Dougherty, Edward R

    2003-11-22

    Data from microarray experiments are usually in the form of large matrices of expression levels of genes under different experimental conditions. Owing to various reasons, there are frequently missing values. Estimating these missing values is important because they affect downstream analysis, such as clustering, classification and network design. Several methods of missing-value estimation are in use. The problem has two parts: (1) selection of genes for estimation and (2) design of an estimation rule. We propose Bayesian variable selection to obtain genes to be used for estimation, and employ both linear and nonlinear regression for the estimation rule itself. Fast implementation issues for these methods are discussed, including the use of QR decomposition for parameter estimation. The proposed methods are tested on data sets arising from hereditary breast cancer and small round blue-cell tumors. The results compare very favorably with currently used methods based on the normalized root-mean-square error. The appendix is available from http://gspsnap.tamu.edu/gspweb/zxb/missing_zxb/ (user: gspweb; passwd: gsplab).
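    The linear part of the estimation rule, regressing the target gene on a small set of selected genes and solving the least-squares problem via QR decomposition as mentioned in the fast-implementation discussion, can be sketched as follows. The function and its interface are hypothetical; the Bayesian gene selection step is assumed to have already produced `predictor_genes`.

```python
import numpy as np

def estimate_missing(expr, target_gene, condition, predictor_genes):
    """Least-squares estimate of the missing entry expr[target_gene,
    condition] from selected predictor genes, using the other conditions
    as training data and a QR decomposition to solve for the coefficients."""
    n_cond = expr.shape[1]
    train = [c for c in range(n_cond) if c != condition]
    # design matrix: intercept plus predictor-gene expression levels
    X = np.column_stack([np.ones(len(train)),
                         expr[np.ix_(predictor_genes, train)].T])
    y = expr[target_gene, train]
    Q, R = np.linalg.qr(X)
    beta = np.linalg.solve(R, Q.T @ y)
    x_new = np.concatenate(([1.0], expr[predictor_genes, condition]))
    return float(x_new @ beta)
```

The nonlinear variant in the paper would replace the linear design matrix with a nonlinear regression of the same selected genes.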

  16. Automated Telerobotic Inspection Of Surfaces

    NASA Technical Reports Server (NTRS)

    Balaram, J.; Prasad, K. Venkatesh

    1996-01-01

    Method of automated telerobotic inspection of surfaces undergoing development. Apparatus implementing method includes video camera that scans over surfaces to be inspected, in manner of mine detector. Images of surfaces compared with reference images to detect flaws. Developed for inspecting external structures of Space Station Freedom for damage from micrometeorites and debris from prior artificial satellites. On Earth, applied to inspection for damage, missing parts, contamination, and/or corrosion on interior surfaces of pipes or exterior surfaces of bridges, towers, aircraft, and ships.
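    The compare-against-reference step can be illustrated with a deliberately simple pixel-difference check. This is a hypothetical sketch of the idea only; the names, threshold rule, and parameters are assumptions, not the developed apparatus.

```python
import numpy as np

def flag_flaws(image, reference, diff_threshold=0.1, min_pixels=25):
    """Compare a scanned surface image with its registered reference image
    and flag a possible flaw when enough pixels differ by more than the
    threshold (a crude stand-in for the comparison stage)."""
    changed = np.abs(image.astype(float)
                     - reference.astype(float)) > diff_threshold
    return int(changed.sum()) >= min_pixels
```

A practical system would first register the images and normalize illumination before differencing, since the camera scans the surface from varying positions.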

  17. REVIEWS OF TOPICAL PROBLEMS: Satellites of asteroids

    NASA Astrophysics Data System (ADS)

    Prokof'eva, Valentina V.; Tarashchuk, V. P.; Gor'kavyi, N. N.

    1995-06-01

    More than 6000 asteroids in the Solar System have now been discovered and enumerated, and about 500 of them have been investigated in detail by different methods. This review gives observational evidence which indicates that no fewer than 10% of asteroids may be composed of two or more bodies. This was supported by the detection of a satellite of the asteroid Ida by the Galileo spacecraft. This discovery symbolises the change of both observational and theoretical paradigms. Space and ground observations of asteroids by modern techniques may give extensive new data for modelling double asteroids. The analysis of problems of stability, formation and dynamics of asteroid satellites shows that their sphere of stable motion extends up to several hundred asteroid radii. The idea that the origin of the asteroid satellites may be explained in the frame of a unified accretion model of planetary satellite formation is proposed and justified.

  18. Small satellite attitude determination based on GPS/IMU data fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovan, Andrey; Cepe, Ali

    In this paper, we present the mathematical models and algorithms that describe the problem of attitude determination for a small satellite using measurements from three angular rate sensors (ARS) and aiding measurements from multiple GPS receivers/antennas rigidly attached to the platform of the satellite.

  19. Satellite orbit computation methods

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Mathematical and algorithmical techniques for the solution of problems in satellite dynamics were developed, along with solutions for satellite orbital motion. Dynamical analysis of shuttle on-orbit operations was conducted. Computer software routines for use in shuttle mission planning were developed and analyzed, while mathematical models of atmospheric density were formulated.

  20. Signal Conditioning for the Kalman Filter: Application to Satellite Attitude Estimation with Magnetometer and Sun Sensors

    PubMed Central

    Esteban, Segundo; Girón-Sierra, Jose M.; Polo, Óscar R.; Angulo, Manuel

    2016-01-01

    Most satellites use an on-board attitude estimation system, based on available sensors. In the case of low-cost satellites, which are of increasing interest, it is usual to use magnetometers and Sun sensors. A Kalman filter is commonly recommended for the estimation, to simultaneously exploit the information from sensors and from a mathematical model of the satellite motion. It would be also convenient to adhere to a quaternion representation. This article focuses on some problems linked to this context. The state of the system should be represented in observable form. Singularities due to alignment of measured vectors cause estimation problems. Accommodation of the Kalman filter originates convergence difficulties. The article includes a new proposal that solves these problems, not needing changes in the Kalman filter algorithm. In addition, the article includes assessment of different errors, initialization values for the Kalman filter; and considers the influence of the magnetic dipole moment perturbation, showing how to handle it as part of the Kalman filter framework. PMID:27809250

  1. Signal Conditioning for the Kalman Filter: Application to Satellite Attitude Estimation with Magnetometer and Sun Sensors.

    PubMed

    Esteban, Segundo; Girón-Sierra, Jose M; Polo, Óscar R; Angulo, Manuel

    2016-10-31

    Most satellites use an on-board attitude estimation system, based on available sensors. In the case of low-cost satellites, which are of increasing interest, it is usual to use magnetometers and Sun sensors. A Kalman filter is commonly recommended for the estimation, to simultaneously exploit the information from sensors and from a mathematical model of the satellite motion. It would be also convenient to adhere to a quaternion representation. This article focuses on some problems linked to this context. The state of the system should be represented in observable form. Singularities due to alignment of measured vectors cause estimation problems. Accommodation of the Kalman filter originates convergence difficulties. The article includes a new proposal that solves these problems, not needing changes in the Kalman filter algorithm. In addition, the article includes assessment of different errors, initialization values for the Kalman filter; and considers the influence of the magnetic dipole moment perturbation, showing how to handle it as part of the Kalman filter framework.
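    The singularity discussed above arises when the two measured reference directions (magnetic field and Sun) become nearly parallel, in which case they no longer determine attitude. A simple pre-filter check for this degeneracy can be sketched as follows; the function and its threshold are illustrative assumptions, not the proposal in the article.

```python
import numpy as np

def vectors_degenerate(b_meas, s_meas, min_angle_deg=10.0):
    """Return True when the measured magnetic-field and Sun directions are
    close enough to (anti-)alignment that the attitude estimate from this
    vector pair becomes ill-conditioned."""
    b = b_meas / np.linalg.norm(b_meas)
    s = s_meas / np.linalg.norm(s_meas)
    # |dot| treats aligned and anti-aligned directions the same
    angle = np.degrees(np.arccos(np.clip(abs(b @ s), 0.0, 1.0)))
    return angle < min_angle_deg
```

During such intervals a filter can rely on propagation of the gyro/dynamics model alone until the measurement geometry recovers.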

  2. Optical data communication for Earth observation satellite systems

    NASA Astrophysics Data System (ADS)

    Fischer, J.; Loecherbach, E.

    1991-10-01

    The current development status of optical communication engineering in comparison with conventional microwave systems, and the different configurations of optical data communication for Earth observation satellite systems, are described. An outlook on future optical communication satellite systems is given. During the last decade Earth observation became more and more important for extending our knowledge about the planet and the human influence on nature. Today pictures taken by satellites are used, for example, to discover mineral resources or to predict harvests, crops, climate and environmental variations and their influence on the population. A new and up-to-date application for Earth observation satellites is the verification of disarmament arrangements and the monitoring of crisis areas. To solve these tasks a system of Earth observing satellites with sensors tailored to the envisaged mission is necessary. Besides these low Earth orbiting satellites, a global Earth observation system consists of at least two data relay satellites. The communication between the satellites will be established via Inter-Satellite Links (ISL) and Inter-Orbit Links (IOL). On these links, bitrates up to 1 Gbit/s must be taken into account. The increasing scarcity of suitable frequencies, which will probably force the use of higher carrier frequencies, and possible interference with terrestrial radio relay systems are two main problems for a realization in microwave technology. One important step in tackling these problems is the use of optical frequencies for IOLs and ISLs.

  3. Engineering calculations for solving the orbital allotment problem

    NASA Technical Reports Server (NTRS)

    Reilly, C.; Walton, E. K.; Mount-Campbell, C.; Caldecott, R.; Aebker, E.; Mata, F.

    1988-01-01

    Four approaches for calculating downlink interferences for shaped-beam antennas are described. An investigation of alternative mixed-integer programming models for satellite synthesis is summarized. Plans for coordinating the various programs developed under this grant are outlined. Two procedures for ordering satellites to initialize the k-permutation algorithm are proposed. Results are presented for the k-permutation algorithms. Feasible solutions are found for 5 of the 6 problems considered. Finally, it is demonstrated that the k-permutation algorithm can be used to solve arc allotment problems.

  4. Bayesian Inference in Satellite Gravity Inversion

    NASA Technical Reports Server (NTRS)

    Kis, K. I.; Taylor, Patrick T.; Wittmann, G.; Kim, Hyung Rae; Torony, B.; Mayer-Guerr, T.

    2005-01-01

    To solve a geophysical inverse problem means applying measurements to determine the parameters of the selected model. The inverse problem is formulated as Bayesian inference, with Gaussian probability density functions applied in Bayes's equation. The CHAMP satellite gravity data are determined at an altitude of 400 kilometers over the southern part of the Pannonian basin. The interpretation model is a right vertical cylinder. The parameters of the model are obtained by solving the resulting minimization problem with the Simplex method.
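    With Gaussian densities, the Bayesian inversion reduces to minimizing a negative log posterior (data misfit plus prior penalty), which the derivative-free Nelder-Mead simplex method can handle. The sketch below is a generic, hypothetical illustration with a placeholder forward model, not the cylinder gravity computation used in the study.

```python
import numpy as np
from scipy.optimize import minimize

def map_estimate(forward, data, prior_mean, prior_sd, noise_sd, x0):
    """Gaussian Bayesian inversion: the maximum a posteriori parameters
    minimize the sum of the data misfit and the Gaussian prior penalty,
    found here with the derivative-free Nelder-Mead simplex method."""
    def neg_log_post(p):
        misfit = np.sum((forward(p) - data) ** 2) / (2 * noise_sd ** 2)
        prior = np.sum((p - prior_mean) ** 2 / (2 * prior_sd ** 2))
        return misfit + prior
    return minimize(neg_log_post, x0, method="Nelder-Mead").x
```

In the study the forward model maps cylinder parameters (radius, depth, density contrast) to the gravity anomaly at satellite altitude; here `forward` is any callable with that signature.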

  5. The application of generalized, cyclic, and modified numerical integration algorithms to problems of satellite orbit computation

    NASA Technical Reports Server (NTRS)

    Chesler, L.; Pierce, S.

    1971-01-01

    Generalized, cyclic, and modified multistep numerical integration methods are developed and evaluated for application to problems of satellite orbit computation. Generalized methods are compared with the presently utilized Cowell methods; new cyclic methods are developed for special second-order differential equations; and several modified methods are developed and applied to orbit computation problems. Special computer programs were written to generate coefficients for these methods, and subroutines were written which allow use of these methods with NASA's GEOSTAR computer program.

  6. Corrigendum to "Dynamics of a flexible tethered satellite system utilising various materials for coplanar and non-coplanar models" [Adv. Space Res. 56 (2015) 648-663]

    NASA Astrophysics Data System (ADS)

    Hong, Aaron Aw Teik; Varatharajoo, Renuganth

    2015-12-01

    The authors would like to thank Dr. N.A. Ismail for some of the discussions found in her thesis, as these discussions helped achieve some of the results published in this article. Therefore, Ismail, N.A., "The Dynamics of a Flexible Motorised Momentum Exchange Tether (MMET)", PhD thesis, University of Glasgow, UK, pp. 26-41, 2012 is cited accordingly herein. The thesis was omitted from the reference list in the original version of this article due to an oversight, with no other intention. Similarly, the thesis by Stevens, R.E., "Optimal Control of Electrodynamic Tether Satellites", PhD thesis, Air Force Institute of Technology, USA, pp. 87-96, 2008 is cited for further completeness.

  7. Improving record linkage performance in the presence of missing linkage data.

    PubMed

    Ong, Toan C; Mannino, Michael V; Schilling, Lisa M; Kahn, Michael G

    2014-12-01

    Existing record linkage methods do not handle missing linking field values in an efficient and effective manner. The objective of this study is to investigate three novel methods for improving the accuracy and efficiency of record linkage when record linkage fields have missing values. By extending the Fellegi-Sunter scoring implementations available in the open-source Fine-grained Record Linkage (FRIL) software system we developed three novel methods to solve the missing data problem in record linkage, which we refer to as: Weight Redistribution, Distance Imputation, and Linkage Expansion. Weight Redistribution removes fields with missing data from the set of quasi-identifiers and redistributes the weight from the missing attribute based on relative proportions across the remaining available linkage fields. Distance Imputation imputes the distance between the missing data fields rather than imputing the missing data value. Linkage Expansion adds previously considered non-linkage fields to the linkage field set to compensate for the missing information in a linkage field. We tested the linkage methods using simulated data sets with varying field value corruption rates. The methods developed had sensitivity ranging from .895 to .992 and positive predictive values (PPV) ranging from .865 to 1 in data sets with low corruption rates. Increased corruption rates lead to decreased sensitivity for all methods. These new record linkage algorithms show promise in terms of accuracy and efficiency and may be valuable for combining large data sets at the patient level to support biomedical and clinical research. Copyright © 2014 Elsevier Inc. All rights reserved.
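    Of the three methods, Weight Redistribution is the most direct to sketch: fields missing in either record are dropped from the comparison and their weight is spread proportionally over the remaining fields. The function below is a hypothetical, simplified illustration, not the FRIL implementation.

```python
def linkage_score(rec_a, rec_b, field_weights, compare):
    """Fellegi-Sunter-style match score with Weight Redistribution:
    linkage fields missing (None) in either record are excluded, and
    their weight is redistributed proportionally across the remaining
    comparable fields so the total weight is preserved."""
    usable = {f: w for f, w in field_weights.items()
              if rec_a.get(f) is not None and rec_b.get(f) is not None}
    if not usable:
        return 0.0
    # scale factor restores the full weight budget over usable fields
    scale = sum(field_weights.values()) / sum(usable.values())
    return sum(w * scale * compare(rec_a[f], rec_b[f])
               for f, w in usable.items())
```

With this scheme a record pair that agrees on every available field scores the same whether or not a field is missing, which is the intended behavior: missingness alone neither penalizes nor rewards a candidate match.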

  8. Reporting and dealing with missing quality of life data in RCTs: has the picture changed in the last decade?

    PubMed

    Fielding, S; Ogbuagu, A; Sivasubramaniam, S; MacLennan, G; Ramsay, C R

    2016-12-01

    Missing data are a major problem in the analysis of data from randomised trials, affecting power and potentially producing biased treatment effects. Focussing specifically on quality of life (QoL) outcomes, we aimed to report, for four leading medical journals, the amount of missing data, whether imputation was used and which methods, and whether the missingness mechanism was discussed, and to compare the picture to our previous review nearly a decade ago. A random selection (50 %) of all RCTs published during 2013-2014 in BMJ, JAMA, Lancet and NEJM was obtained. RCTs reported in research letters, cluster RCTs, non-randomised designs, review articles and meta-analyses were excluded. We included 87 RCTs in the review; in 35 % of these the amount of missing primary QoL data was unclear, and 31 (36 %) used imputation. Only 23 % discussed the missing data mechanism. Nearly half used complete case analysis. Reporting was more unclear for secondary QoL outcomes. Compared to the previous review, multiple imputation was used more prominently, but mainly in sensitivity analyses. Inadequate reporting and handling of missing QoL data in RCTs are still an issue. There is a large gap between statistical methods research relating to missing data and the use of those methods in applications. A sensitivity analysis should be undertaken to explore the sensitivity of the main results to different missing data assumptions. Medical journals can help to improve the situation by requiring higher standards of reporting and analytical methods to deal with missing data, and by issuing guidance to authors on expected standards.

  9. [Missed lessons, missed opportunities: a role for public health services in medical absenteeism in young people].

    PubMed

    Vanneste, Y T M; van de Goor, L A M; Feron, F J M

    2016-01-01

    Young people who often miss school for health reasons are not only missing education, but also the daily routine of school, and social intercourse with their classmates. Medical absenteeism among students merits greater attention. For a number of years, in various regions in the Netherlands, students with extensive medical absenteeism have been invited to see a youth healthcare specialist. The MASS intervention (Medical Advice of Students reported Sick; in Dutch: Medische Advisering van de Ziekgemelde Leerling, abbreviated as M@ZL) has been developed by the West Brabant Regional Public Health Service together with secondary schools to address school absenteeism due to reporting sick. In this paper we discuss the MASS intervention and explain why attention should be paid by public health services to the problem of school absenteeism, especially absenteeism on health grounds.

  10. Time series change detection: Algorithms for land cover change

    NASA Astrophysics Data System (ADS)

    Boriah, Shyam

    The climate and earth sciences have recently undergone a rapid transformation from a data-poor to a data-rich environment. In particular, climate and ecosystem related observations from remote sensors on satellites, as well as outputs of climate or earth system models from large-scale computational platforms, provide terabytes of temporal, spatial and spatio-temporal data. These massive and information-rich datasets offer huge potential for advancing the science of land cover change, climate change and anthropogenic impacts. One important area where remote sensing data can play a key role is in the study of land cover change. Specifically, the conversion of natural land cover into human-dominated cover types continues to be a change of global proportions with many unknown environmental consequences. In addition, being able to assess the carbon risk of changes in forest cover is of critical importance for both economic and scientific reasons. In fact, changes in forests account for as much as 20% of the greenhouse gas emissions in the atmosphere, an amount second only to fossil fuel emissions. Thus, there is a need in the earth science domain to systematically study land cover change in order to understand its impact on local climate, radiation balance, biogeochemistry, hydrology, and the diversity and abundance of terrestrial species. Land cover conversions include tree harvests in forested regions, urbanization, and agricultural intensification in former woodland and natural grassland areas. These types of conversions also have significant public policy implications due to issues such as water supply management and atmospheric CO2 output.
In spite of the importance of this problem and the considerable advances made over the last few years in high-resolution satellite data, data mining, and online mapping tools and services, end users still lack practical tools to help them manage and transform this data into actionable knowledge of changes in forest ecosystems that can be used for decision making and policy planning purposes. In particular, previous change detection studies have primarily relied on examining differences between two or more satellite images acquired on different dates. Thus, a technological solution that detects global land cover change using high temporal resolution time series data will represent a paradigm-shift in the field of land cover change studies. To realize these ambitious goals, a number of computational challenges in spatio-temporal data mining need to be addressed. Specifically, analysis and discovery approaches need to be cognizant of climate and ecosystem data characteristics such as seasonality, non-stationarity/inter-region variability, multi-scale nature, spatio-temporal autocorrelation, high-dimensionality and massive data size. This dissertation, a step in that direction, translates earth science challenges to computer science problems, and provides computational solutions to address these problems. In particular, three key technical capabilities are developed: (1) Algorithms for time series change detection that are effective and can scale up to handle the large size of earth science data; (2) Change detection algorithms that can handle large numbers of missing and noisy values present in satellite data sets; and (3) Spatio-temporal analysis techniques to identify the scale and scope of disturbance events.
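As a toy illustration of a change-detection scheme that tolerates missing observations (a sketch only, not one of the dissertation's algorithms), each "year" of a periodic time series can be scored against a baseline year while time steps with missing values are simply skipped:

```python
# Hypothetical sketch: score each year of a seasonal time series against
# the first year, ignoring time steps where either value is missing (None)
# rather than imputing them.

def yearly_change_scores(series, period=12):
    """Return one deviation score per year after the baseline year."""
    years = [series[i:i + period] for i in range(0, len(series), period)]
    baseline = years[0]
    scores = []
    for year in years[1:]:
        diffs = [abs(a - b) for a, b in zip(year, baseline)
                 if a is not None and b is not None]
        scores.append(sum(diffs) / len(diffs) if diffs else float("nan"))
    return scores

# Three "years" of four observations each, a missing value in every year,
# and a large anomaly in the final year.
series = [1, 2, None, 2, 1, 2, None, 2, 9, 2, None, 2]
scores = yearly_change_scores(series, period=4)
print(scores)  # the final year's score stands out
```

Real land-cover change detection must additionally handle noise, inter-region variability and spatial context, as the dissertation emphasizes; this only shows the missing-value bookkeeping.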

  11. [Near miss outcomes in gambling games].

    PubMed

    Pecsenye, Zsuzsa; Kurucz, Gyozo

    2017-01-01

    Games of chance operate with an intermittent reinforcement schedule in which the number of games it takes the player to win differs from turn to turn, so the player cannot predict when the next positive reinforcement will arrive. The near miss outcome (close to winning, but actually a losing outcome) can be interpreted as a secondary (built-in) reinforcement within the variable-ratio reinforcement schedule that presumably contributes to the development and maintenance of gambling addiction. The aim of this publication is to introduce near miss outcomes and to summarize and critically analyze the literature connected to this issue. We searched internet databases using the term "near miss" and analysed articles focusing on gambling games. Based on numerous authors' results, a near miss rate set at around 30% increases the desire to continue playing, both among gamblers and among players who have no former gambling experience. Some studies have demonstrated that this effect might be related to the extent to which the player has the situation under control during the gambling session. The hypothesised inhibiting effect of a 45% near miss ratio has not yet been proven. Neurobiological research shows middle-cerebral activity during near miss outcomes; furthermore, similar physiological patterns have been discovered following near miss and winning outcomes. Regarding the connection between intrapsychic variables (cognitive and personality factors) and near misses, there are very few studies. The fact that different authors interpret near miss outcomes differently, even when studying the same game, leads to problems in interpreting their results. It follows from the foregoing empirical results that near miss outcomes contribute to the development and maintenance of pathological gambling, but we have little information on the factors implementing this effect.

  12. Missing persons-missing data: the need to collect antemortem dental records of missing persons.

    PubMed

    Blau, Soren; Hill, Anthony; Briggs, Christopher A; Cordner, Stephen M

    2006-03-01

    The subject of missing persons is of great concern to the community with numerous associated emotional, financial, and health costs. This paper examines the forensic medical issues raised by the delayed identification of individuals classified as "missing" and highlights the importance of including dental data in the investigation of missing persons. Focusing on Australia, the current approaches employed in missing persons investigations are outlined. Of particular significance is the fact that each of the eight Australian states and territories has its own Missing Persons Unit that operates within distinct state and territory legislation. Consequently, there is a lack of uniformity within Australia about the legal and procedural framework within which investigations of missing persons are conducted, and the interaction of that framework with coronial law procedures. One of the main investigative problems in missing persons investigations is the lack of forensic medical, particularly odontological, input. Forensic odontology has been employed in numerous cases in Australia where identity is unknown or uncertain because of remains being skeletonized, incinerated, or partly burnt. The routine employment of the forensic odontologist to assist in missing person inquiries has, however, been ignored. The failure to routinely employ forensic odontology in missing persons inquiries has resulted in numerous delays in identification. Three Australian cases are presented where the investigation of individuals whose identity was uncertain or unknown was prolonged due to the failure to utilize the appropriate (and available) dental resources. In light of the outcomes of these cases, we suggest that a national missing persons dental records database be established for future missing persons investigations. Such a database could be easily managed between a coronial system and a forensic medical institute.
In Australia, a national missing persons dental records database could be incorporated into the National Coroners Information System (NCIS) managed, on behalf of Australia's Coroners, by the Victorian Institute of Forensic Medicine. The existence of the NCIS would ensure operational collaboration in the implementation of the system and cost savings to Australian policing agencies involved in missing person inquiries. The implementation of such a database would facilitate timely and efficient reconciliation of clinical and postmortem dental records and have subsequent social and financial benefits.

  13. Effective Interpolation of Incomplete Satellite-Derived Leaf-Area Index Time Series for the Continental United States

    NASA Technical Reports Server (NTRS)

    Jasinski, Michael F.; Borak, Jordan S.

    2008-01-01

    Many earth science modeling applications employ continuous input data fields derived from satellite data. Environmental factors, sensor limitations and algorithmic constraints lead to data products of inherently variable quality. This necessitates interpolation of one form or another in order to produce high quality input fields free of missing data. The present research tests several interpolation techniques as applied to satellite-derived leaf area index, an important quantity in many global climate and ecological models. The study evaluates and applies a variety of interpolation techniques for the Moderate Resolution Imaging Spectroradiometer (MODIS) Leaf-Area Index Product over the time period 2001-2006 for a region containing the conterminous United States. Results indicate that the accuracy of an individual interpolation technique depends upon the underlying land cover. Spatial interpolation provides better results in forested areas, while temporal interpolation performs more effectively over non-forest cover types. Combination of spatial and temporal approaches offers superior interpolative capabilities to any single method, and in fact, generation of continuous data fields requires a hybrid approach such as this.
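The hybrid idea, filling temporally first and falling back to a spatial estimate, can be illustrated with a minimal sketch. This is an invented toy, not the study's MODIS LAI gap-filling code; the neighbour choices (adjacent time steps, scene-wide mean) are assumptions made for brevity:

```python
# Toy hybrid gap filler for a gridded time series: temporal pass first
# (mean of the adjacent scenes at each pixel), then a crude spatial
# fallback (scene-wide mean of valid pixels) for anything still missing.
import numpy as np

def fill_gaps(cube):
    """cube: 3-D array (time, y, x) with NaN marking missing values."""
    out = cube.copy()
    t = cube.shape[0]
    for k in range(t):
        missing = np.isnan(out[k])
        if not missing.any():
            continue
        # Temporal pass: average the previous/next scenes where they exist.
        stack = [cube[j] for j in (k - 1, k + 1) if 0 <= j < t]
        temporal = np.nanmean(np.stack(stack), axis=0)
        out[k][missing] = temporal[missing]
        # Spatial fallback for pixels the temporal pass could not fill.
        still = np.isnan(out[k])
        if still.any() and not np.isnan(cube[k]).all():
            out[k][still] = np.nanmean(cube[k])
    return out

cube = np.array([[[1.0, 2.0]],
                 [[np.nan, 2.0]],
                 [[3.0, 2.0]]])
filled = fill_gaps(cube)
```

The paper's finding that the best method depends on land cover would translate here into choosing, per pixel class, whether the temporal or the spatial estimate is trusted first.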

  14. Fire and Smoke Monitoring at NOAA's Satellite Service; Applications to Smoke Forecasting

    NASA Astrophysics Data System (ADS)

    Stephens, G.; Ruminski, M.

    2005-12-01

    The Hazard Mapping System (HMS), developed and run operationally by NOAA's Satellite Services Division (SSD), is a multiplatform remote sensing approach to detecting fires and smoke over the US and adjacent areas of Canada and Mexico. The system utilizes sensors on 7 different NOAA and NASA satellites. Automated detection algorithms are employed for each of the satellites for the fire detects while smoke is delineated by an image analyst. Analyses are quality controlled by an analyst who inspects all available imagery and automated fire detects, deleting suspected false detects and adding fires that the automated routines miss. Graphical, text, and GIS compatible analyses are posted to a web site as soon as updates are performed, and a final product for a given day is posted early the following morning. All products are archived at NOAA's National Geophysical Data Center. Areal extent of detectable smoke is outlined using animated visible imagery, for input to a dispersion and transport model, the HYbrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT), developed by NOAA's Air Resources Laboratory (ARL). Resulting smoke forecasts will soon be used as input to NOAA's Air Quality forecasts. The GOES Aerosol and Smoke Product (GASP) is an experimental GOES imagery based aerosol optical depth (AOD) product developed by the NESDIS Office of Research and Applications, being implemented for evaluation by the NESDIS Satellite Analysis Branch for use in smoke and volcanic ash monitoring. Currently, research is underway in NESDIS' Office of Research and Applications to objectivize smoke delineation using GASP and MODIS AOD retrievals. NOAA's Operational Significant Event Imagery (OSEI) program processes satellite imagery of environmentally significant events, including fire, smoke and volcanic ash, visible in operational satellite data. This imagery is often referred to by fire managers and air quality agencies.
Future plans include the integration of high resolution global data from the European Space Agency's MetOp satellite and global geostationary satellites.

  15. Radiant coolers - Theory, flight histories, design comparisons and future applications

    NASA Technical Reports Server (NTRS)

    Donohoe, M. J.; Sherman, A.; Hickman, D. E.

    1975-01-01

    Radiant coolers have been developed for application to the cooling of infrared detectors aboard NASA earth observation systems and as part of the Defense Meteorological Satellite Program. The prime design constraints for these coolers are the location of the cooler aboard the satellite and the satellite orbit. Flight data from several coolers indicates that, in general, design temperatures are achieved. However, potential problems relative to the contamination of cold surfaces are also revealed by the data. A comparison among the various cooler designs and flight performances indicates design improvements that can minimize the contamination problem in the future.

  16. Characterizing Dw1335-29, a Recently Discovered Dwarf Satellite of M83

    NASA Astrophysics Data System (ADS)

    Carrillo, Andreia Jessica; Bell, Eric F.; Bailin, Jeremy; Monachesi, Antonela

    2016-01-01

    Simulations of galaxy formation in a cosmological context predict that galaxies should be surrounded by hundreds of relatively massive dark matter subhalos, each of which was expected to host a dwarf satellite galaxy. Large numbers of luminous dwarf galaxies do not exist around the Milky Way or M31 - this has been termed the missing satellite problem. There are a number of possible physical drivers of this discrepancy, some of which might predict significant differences from galaxy to galaxy. Accordingly, there are a number of efforts whose goal is to solidify and augment the census of dwarf satellites of external galaxies, outside the Local Group. Recently, Mueller, Jergen & Bingelli (2015; arXiv:1509.04931) presented 16 dwarf galaxy candidates in the vicinity of M83 using the Dark Energy Camera (DECam). With a field from the HST/GHOSTS survey that partly covers dw1335-29 (Radburn-Smith et al. 2011; ApJS, 195, 18), in conjunction with complementary ground-based images from VIMOS that cover the whole dwarf, we confirm that one of the candidates, dw1335-29, is a dwarf satellite of M83, at a projected distance of 26 kpc from M83 and with a distance modulus of m-M = 28.5 (+0.3/-0.1), placing it in the M83 group. From our VIMOS imaging that covers the entire dwarf, we estimate an absolute magnitude of MV = -9.8 (+0.3/-0.1), show that it is elongated with an ellipticity of 0.35 +/- 0.15, and has a half-light radius of 500 +/- 50 pc. Dw1335-29 has both a somewhat irregular shape and superimposed young stars in the resolved stellar population maps, leading us to classify this galaxy as a faint dwarf irregular or transition dwarf. This is especially curious, as with a projected distance of only 26 kpc from M83, our prior expectation from study of the Local Group (following e.g., Grebel et al. 2003; AJ, 125, 1926, Slater & Bell 2013; ApJ, 772, 15) would be that dw1335-29 would lack recent star formation.
Further study of M83's dwarf population will reveal if star formation in its dwarfs is commonplace (suggesting a lack of a hot gas envelope for M83 that would quench star formation) or rare (suggesting that dw1335-29 is at much larger 3D distance from M83, and is fortuitously projected to small radii).

  17. Design, implementation and reporting strategies to reduce the instance and impact of missing patient-reported outcome (PRO) data: a systematic review

    PubMed Central

    Mercieca-Bebber, Rebecca; Palmer, Michael J; Brundage, Michael; Stockler, Martin R; King, Madeleine T

    2016-01-01

    Objectives Patient-reported outcomes (PROs) provide important information about the impact of treatment from the patients' perspective. However, missing PRO data may compromise the interpretability and value of the findings. We aimed to report: (1) a non-technical summary of problems caused by missing PRO data; and (2) a systematic review by collating strategies to: (A) minimise rates of missing PRO data, and (B) facilitate transparent interpretation and reporting of missing PRO data in clinical research. Our systematic review does not address statistical handling of missing PRO data. Data sources MEDLINE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases (inception to 31 March 2015), and citing articles and reference lists from relevant sources. Eligibility criteria English articles providing recommendations for reducing missing PRO data rates, or strategies to facilitate transparent interpretation and reporting of missing PRO data were included. Methods 2 reviewers independently screened articles against eligibility criteria. Discrepancies were resolved with the research team. Recommendations were extracted and coded according to framework synthesis. Results 117 sources (55% discussion papers, 26% original research) met the eligibility criteria. Design and methodological strategies for reducing rates of missing PRO data included: incorporating PRO-specific information into the protocol; carefully designing PRO assessment schedules and defining termination rules; minimising patient burden; appointing a PRO coordinator; PRO-specific training for staff; ensuring PRO studies are adequately resourced; and continuous quality assurance. Strategies for transparent interpretation and reporting of missing PRO data include utilising auxiliary data to inform analysis; transparently reporting baseline PRO scores, rates and reasons for missing data; and methods for handling missing PRO data. 
Conclusions The instance of missing PRO data and its potential to bias clinical research can be minimised by implementing thoughtful design, rigorous methodology and transparent reporting strategies. All members of the research team have a responsibility in implementing such strategies. PMID:27311907

  18. Using High-Resolution Satellite Aerosol Optical Depth To Estimate Daily PM2.5 Geographical Distribution in Mexico City.

    PubMed

    Just, Allan C; Wright, Robert O; Schwartz, Joel; Coull, Brent A; Baccarelli, Andrea A; Tellez-Rojo, Martha María; Moody, Emily; Wang, Yujie; Lyapustin, Alexei; Kloog, Itai

    2015-07-21

    Recent advances in estimating fine particle (PM2.5) ambient concentrations use daily satellite measurements of aerosol optical depth (AOD) for spatially and temporally resolved exposure estimates. Mexico City is a dense megacity that differs from other previously modeled regions in several ways: it has bright land surfaces, a distinctive climatological cycle, and an elevated semi-enclosed air basin with a unique planetary boundary layer dynamic. We extend our previous satellite methodology to the Mexico City area, a region with higher PM2.5 than most U.S. and European urban areas. Using a novel 1 km resolution AOD product from the MODIS instrument, we constructed daily predictions across the greater Mexico City area for 2004-2014. We calibrated the association of AOD to PM2.5 daily using municipal ground monitors, land use, and meteorological features. Predictions used spatial and temporal smoothing to estimate AOD when satellite data were missing. Our model performed well, resulting in an out-of-sample cross-validation R2 of 0.724. Cross-validated root-mean-squared prediction error (RMSPE) of the model was 5.55 μg/m3. This novel model reconstructs long- and short-term spatially resolved exposure to PM2.5 for epidemiological studies in Mexico City.
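The two skill metrics reported here, out-of-sample R2 and root-mean-squared prediction error (RMSPE), are simple to compute from held-out observation/prediction pairs. The numbers below are invented toy values, not the study's data:

```python
# Sketch of the two reported skill metrics, computed on held-out data.

def r2_and_rmspe(obs, pred):
    """Out-of-sample R^2 and root-mean-squared prediction error."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot, (ss_res / n) ** 0.5

# Invented held-out observations and predictions (e.g., PM2.5 in ug/m3).
obs = [10.0, 20.0, 30.0, 40.0]
pred = [12.0, 18.0, 33.0, 39.0]
r2, rmspe = r2_and_rmspe(obs, pred)
```

In a cross-validated setting, `obs` and `pred` would be pooled across the folds so that every prediction is made by a model that never saw that observation.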

  19. Using high-resolution satellite aerosol optical depth to estimate daily PM2.5 geographical distribution in Mexico City

    PubMed Central

    Just, Allan C.; Wright, Robert O.; Schwartz, Joel; Coull, Brent A.; Baccarelli, Andrea A.; Tellez-Rojo, Martha María; Moody, Emily; Wang, Yujie; Lyapustin, Alexei; Kloog, Itai

    2015-01-01

    Recent advances in estimating fine particle (PM2.5) ambient concentrations use daily satellite measurements of aerosol optical depth (AOD) for spatially and temporally resolved exposure estimates. Mexico City is a dense megacity that differs from other previously modeled regions in several ways: it has bright land surfaces, a distinctive climatological cycle, and an elevated semi-enclosed air basin with a unique planetary boundary layer dynamic. We extend our previous satellite methodology to the Mexico City area, a region with higher PM2.5 than most US and European urban areas. Using a novel 1 km resolution AOD product from the MODIS instrument, we constructed daily predictions across the greater Mexico City area for 2004–2014. We calibrated the association of AOD to PM2.5 daily using municipal ground monitors, land use, and meteorological features. Predictions used spatial and temporal smoothing to estimate AOD when satellite data were missing. Our model performed well, resulting in an out-of-sample cross validation R2 of 0.724. Cross-validated root mean squared prediction error (RMSPE) of the model was 5.55 μg/m3. This novel model reconstructs long- and short-term spatially resolved exposure to PM2.5 for epidemiological studies in Mexico City. PMID:26061488

  20. ChinaSpec: a network of SIF observations to bridge flux measurements and remote sensing data

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Wang, S.; Liu, L.; Ju, W.; Zhu, X.

    2017-12-01

    Accurately quantifying atmosphere-biosphere interactions across multiple scales remains a challenge. Remote sensing, especially satellite data, has been widely used to resolve the broad-scale estimation of carbon flux by upscaling the point measurements of the eddy covariance (EC) technique. However, critical gaps remain between the EC observations and coarse satellite data due to the scale mismatch. In this regard, it is necessary to build a network of in situ optical observations to bridge the scale mismatch between EC measurements and satellite remote sensing data. Internationally, a few such networks have already been established (e.g., SpecNet and EuroSpec), but they are still at an early stage. ChinaSpec is a network linking in situ spectral measurements, especially sun-induced chlorophyll fluorescence (SIF), with point EC observations for a better understanding of atmosphere-biosphere interactions. One main focus of ChinaSpec is to conduct continuous field SIF measurements at multiple EC sites across mainland China. This will help us better understand the mechanisms of SIF and photosynthesis, and close the gaps between recent SIF retrievals from coarse satellite data and EC observations. In this presentation, we introduce the background, current status, and development of the ChinaSpec network.

  1. Basic research and data analysis for the national geodetic satellite program and for the earth and ocean physics applications program

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Activities related to the National Geodetic Satellite Program are reported and include a discussion of Ohio State University's OSU275 set of tracking station coordinates and transformation parameters, determination of network distortions, and plans for data acquisition and processing. The problems encountered in the development of the LAGEOS satellite are reported in an account of activities related to the Earth and Ocean Physics Applications Program. The LAGEOS problem involves transmission and reception of the laser pulse designed to make accurate determinations of the earth's crustal and rotational motions. Pulse motion, ephemeris, arc range measurements, and accuracy estimates are discussed in view of the problem. Personnel involved in the two programs are also listed, along with travel activities and reports published to date.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baghram, Shant; Abolhasani, Ali Akbar; Firouzjahi, Hassan

    We study the predictions of anomalous inflationary models for the abundance of structures in large-scale structure observations. The anomalous features encoded in the primordial curvature perturbation power spectrum are (a) a localized feature in momentum space, (b) hemispherical asymmetry and (c) statistical anisotropies. We present a model-independent expression relating the number density of structures to the changes in the matter density variance. Models with a localized feature can alleviate the tension between observations and numerical simulations of cold dark matter structures on galactic scales as a possible solution to the missing satellite problem. In models with hemispherical asymmetry we show that the abundance of structures becomes asymmetric depending on the direction of observation on the sky. In addition, we study the effects of a scale-dependent dipole amplitude on the abundance of structures. Using the quasar data and adopting the power-law scaling k^(n_A - 1) for the amplitude of the dipole we find the upper bound n_A < 0.6 for the spectral index of the dipole asymmetry. In all cases there is a critical mass scale M_c such that for M < M_c (M > M_c) the enhancement in variance induced by the anomalous feature decreases (increases) the abundance of dark matter structures in the Universe.

  3. Rain/No-Rain Identification from Bispectral Satellite Information using Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Tao, Y.

    2016-12-01

    Satellite-based precipitation estimation products have the advantage of high resolution and global coverage. However, they still suffer from insufficient accuracy. To estimate precipitation accurately from satellite data, two aspects are most important: sufficient precipitation information in the satellite observations and proper methodologies to extract that information effectively. This study applies state-of-the-art machine learning methodologies to bispectral satellite information for rain/no-rain detection. Specifically, we use deep neural networks to extract features from the infrared and water vapor channels and connect them to precipitation identification. To evaluate the effectiveness of the methodology, we first apply it to the infrared data only (Model DL-IR only), the most commonly used input for satellite-based precipitation estimation. We then incorporate water vapor data (Model DL-IR + WV) to further improve the prediction performance. The radar Stage IV dataset is used as the ground measurement for parameter calibration. The operational product Precipitation Estimation from Remotely Sensed Information Using Artificial Neural Networks Cloud Classification System (PERSIANN-CCS) is used as a reference to compare the performance of both models in both winter and summer seasons. The experiments show significant improvement for both models in precipitation identification. The overall performance gains in the Critical Success Index (CSI) over the verification periods are 21.60% and 43.66% for Model DL-IR only and Model DL-IR + WV, respectively, compared to PERSIANN-CCS. Moreover, specific case studies show that the water vapor channel information and the deep neural networks effectively help recover a large number of missing precipitation pixels under warm clouds while reducing false alarms under cold clouds.
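The Critical Success Index used in this comparison is a standard verification score for binary rain/no-rain maps: CSI = hits / (hits + misses + false alarms), so correct "no rain" pixels do not inflate the score. A minimal sketch with invented pixel labels:

```python
# Critical Success Index (threat score) for binary rain/no-rain maps.

def csi(predicted, observed):
    """predicted/observed: sequences of 0/1 rain flags per pixel."""
    hits = sum(bool(p) and bool(o) for p, o in zip(predicted, observed))
    misses = sum((not p) and bool(o) for p, o in zip(predicted, observed))
    false_alarms = sum(bool(p) and (not o) for p, o in zip(predicted, observed))
    return hits / (hits + misses + false_alarms)

pred = [1, 1, 0, 0, 1, 0]  # invented model rain flags
obs  = [1, 0, 0, 1, 1, 0]  # invented radar rain flags
print(csi(pred, obs))  # 2 hits, 1 miss, 1 false alarm -> prints 0.5
```

The percentage gains quoted in the abstract compare each model's CSI against the PERSIANN-CCS baseline over the verification periods.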

  4. Refined Use of Satellite Aerosol Optical Depth Snapshots to Constrain Biomass Burning Emissions in the GOCART Model

    NASA Technical Reports Server (NTRS)

    Petrenko, Mariya; Kahn, Ralph; Chin, Mian; Limbacher, James

    2017-01-01

    Simulations of biomass burning (BB) emissions in global chemistry and aerosol transport models depend on external inventories, which provide the location and strength of burning aerosol sources. Our previous work (Petrenko et al., 2012) shows that satellite snapshots of aerosol optical depth (AOD) near the emitted smoke plume can be used to constrain model-simulated AOD and, effectively, the assumed source strength. We now refine the satellite-snapshot method and investigate whether applying simple multiplicative emission correction factors to the widely used Global Fire Emission Database version 3 (GFEDv3) emission inventory can achieve regional-scale consistency between MODIS AOD snapshots and the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model. The model and satellite AOD are compared over a set of more than 900 BB cases observed by the MODIS instrument during the 2004 and 2006-2008 biomass burning seasons. The AOD comparison presented here shows that regional discrepancies between the model and satellite are diverse around the globe yet quite consistent within most ecosystems. Additional analysis including a small-fire emission correction shows the complementary nature of correcting for source strength and adding missing sources, and also indicates that in some regions other factors may be significant in explaining model-satellite discrepancies. This work sets the stage for a larger intercomparison within the Aerosol Inter-comparisons between Observations and Models (AeroCom) multi-model biomass burning experiment. We discuss here some of the other possible factors affecting the remaining discrepancies between model simulations and observations, but await comparisons with other AeroCom models to draw further conclusions.

  5. A multi-model evaluation of aerosols over South Asia: common problems and possible causes

    NASA Astrophysics Data System (ADS)

    Pan, X.; Chin, M.; Gautam, R.; Bian, H.; Kim, D.; Colarco, P. R.; Diehl, T. L.; Takemura, T.; Pozzoli, L.; Tsigaridis, K.; Bauer, S.; Bellouin, N.

    2015-05-01

    Atmospheric pollution over South Asia attracts special attention due to its effects on regional climate, water cycle and human health. These effects are potentially growing owing to rising trends of anthropogenic aerosol emissions. In this study, the spatio-temporal aerosol distributions over South Asia from seven global aerosol models are evaluated against aerosol retrievals from NASA satellite sensors and ground-based measurements for the period of 2000-2007. Overall, substantial underestimations of aerosol loading over South Asia are found systematically in most model simulations. Averaged over the entire South Asia, the annual mean aerosol optical depth (AOD) is underestimated by a range of 15 to 44% across models compared to MISR (Multi-angle Imaging SpectroRadiometer), which is the lowest bound among various satellite AOD retrievals (from MISR, SeaWiFS (Sea-Viewing Wide Field-of-View Sensor), MODIS (Moderate Resolution Imaging Spectroradiometer) Aqua and Terra). In particular during the post-monsoon and wintertime periods (i.e., October-January), when agricultural waste burning and anthropogenic emissions dominate, models fail to capture AOD and aerosol absorption optical depth (AAOD) over the Indo-Gangetic Plain (IGP) compared to ground-based Aerosol Robotic Network (AERONET) sunphotometer measurements. The underestimations of aerosol loading in models generally occur in the lower troposphere (below 2 km) based on the comparisons of aerosol extinction profiles calculated by the models with those from Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data. Furthermore, surface concentrations of all aerosol components (sulfate, nitrate, organic aerosol (OA) and black carbon (BC)) from the models are found much lower than in situ measurements in winter.
Several possible causes for these common underestimations during the post-monsoon and wintertime periods are identified: (1) aerosol hygroscopic growth and the formation of secondary inorganic aerosol are suppressed because relative humidity (RH) in the boundary layer is biased far too low, so foggy conditions are poorly represented in current models; (2) nitrate aerosol is either missing or inadequately accounted for; and (3) emissions from agricultural waste burning and biofuel usage are too low in the emission inventories. These common problems and possible causes found in multiple models point out directions for future model improvements in this important region.

  6. Modeling and Analysis of Wholesale Electricity Market Design. Understanding the Missing Money Problem. December 2013 - January 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papalexopoulos, A.; Hansen, C.; Perrino, D.

    This project examined the impact of renewable energy sources, which have zero incremental energy costs, on the sustainability of conventional generation. This “missing money” problem refers to market outcomes in which infra-marginal energy revenues in excess of operations and maintenance (O&M) costs are systematically lower than the amortized costs of new entry for a marginal generator. The problem is caused by two related factors: (1) conventional generation is dispatched less, and (2) the price that conventional generation receives for its energy is lower. This lower revenue stream may not be sufficient to cover both the variable and fixed costs of conventional generation. In fact, this study showed that higher wind penetrations in the Electric Reliability Council of Texas (ERCOT) system could cause many conventional generators to become uneconomic.

  7. A method to estimate the additional uncertainty in gap-filled NEE resulting from long gaps in the CO2 flux record

    Treesearch

    Andrew D. Richardson; David Y. Hollinger

    2007-01-01

    Missing values in any data set create problems for researchers. The process by which missing values are replaced, and the data set is made complete, is generally referred to as imputation. Within the eddy flux community, the term "gap filling" is more commonly applied. A major challenge is that random errors in measured data result in uncertainty in the gap-...

  8. Snags hit tethered satellite mission

    NASA Technical Reports Server (NTRS)

    Schuiling, Roelof

    1993-01-01

    The processing and course of the STS-46 Space Shuttle Atlantis mission are described. Problems experienced by the astronaut team in deploying the Tethered Satellite System during the mission are recounted.

  9. Satellite sound broadcasting system, portable reception

    NASA Technical Reports Server (NTRS)

    Golshan, Nasser; Vaisnys, Arvydas

    1990-01-01

    Studies are underway at JPL in the emerging area of Satellite Sound Broadcast Service (SSBS) for direct reception by low-cost portable, semi-portable, mobile and fixed radio receivers. This paper addresses the portable reception of digital broadcasting of monophonic audio with source material band-limited to 5 kHz (source audio comparable to commercial AM broadcasting). The proposed system provides transmission robustness, uniformity of performance over the coverage area and excellent frequency reuse. Propagation problems associated with indoor portable reception are considered in detail, and innovative antenna concepts are suggested to mitigate these problems. It is shown that, with the proper combination of technologies, a single medium-power satellite can provide substantial direct satellite audio broadcast capability to CONUS in the UHF or L bands, for high-quality portable indoor reception by low-cost radio receivers.

  10. A review of the handling of missing longitudinal outcome data in clinical trials

    PubMed Central

    2014-01-01

    The aim of this review was to establish the frequency with which trials take into account missingness, and to discover what methods trialists use for adjustment in randomised controlled trials with longitudinal measurements. Failing to address the problems that can arise from missing outcome data can result in misleading conclusions. Missing data should be addressed, at a minimum, through a sensitivity analysis of the complete-case analysis results. One hundred publications of randomised controlled trials with longitudinal measurements were selected randomly from trial publications from the years 2005 to 2012. Information was extracted from these trials, including whether reasons for dropout were reported, what methods were used for handling the missing data, whether there was any explanation of the methods for missing data handling, and whether a statistician was involved in the analysis. The main focus of the review was on missing data post dropout rather than missing interim data. Of all the papers in the study, 9 (9%) had no missing data. More than half of the papers included in the study failed to make any attempt to explain the reasons for their choice of missing data handling method. Of the papers with clear missing data handling methods, 44 papers (50%) used adequate methods of missing data handling, whereas 30 (34%) of the papers used missing data methods which may not have been appropriate. In the remaining 17 papers (19%), it was difficult to assess the validity of the methods used. An imputation method was used in 18 papers (20%). Multiple imputation methods were introduced in 1987 and are an efficient way of accounting for missing data in general, and yet only 4 papers used these methods. Out of the 18 papers which used imputation, only 7 displayed the results as a sensitivity analysis of the complete-case analysis results. Of the papers that used imputation, 61% explained the reasons for their chosen method. 
Just under a third of the papers made no reference to reasons for missing outcome data. There was little consistency in reporting of missing data within longitudinal trials. PMID:24947664
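
    The multiple imputation approach mentioned in the record above can be sketched with a minimal, hypothetical example (not code from the reviewed trials): each missing outcome is filled by stochastic regression imputation, the analysis is repeated on each completed data set, and the estimates are pooled.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated data: outcome y depends on a fully observed covariate x;
    # about 30% of outcomes are missing completely at random.
    n = 200
    x = rng.normal(size=n)
    y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=n)
    missing = rng.random(n) < 0.3
    y_obs = np.where(missing, np.nan, y)

    def impute_once(x, y_obs, rng):
        """One stochastic regression imputation: fit y ~ x on complete cases,
        then draw imputed values from the fitted line plus residual noise."""
        obs = ~np.isnan(y_obs)
        X = np.column_stack([np.ones(obs.sum()), x[obs]])
        beta, *_ = np.linalg.lstsq(X, y_obs[obs], rcond=None)
        resid = y_obs[obs] - X @ beta
        sigma = resid.std(ddof=2)
        y_imp = y_obs.copy()
        miss = np.isnan(y_obs)
        y_imp[miss] = beta[0] + beta[1] * x[miss] + rng.normal(scale=sigma, size=miss.sum())
        return y_imp

    # Rubin's rules (point estimate): analyse each completed data set, then pool.
    m = 20
    means = [impute_once(x, y_obs, rng).mean() for _ in range(m)]
    pooled_mean = np.mean(means)
    ```

    In practice the between- and within-imputation variances would also be pooled to obtain valid standard errors.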

  11. Mining Missing Hyperlinks from Human Navigation Traces: A Case Study of Wikipedia.

    PubMed

    West, Robert; Paranjape, Ashwin; Leskovec, Jure

    Hyperlinks are an essential feature of the World Wide Web. They are especially important for online encyclopedias such as Wikipedia: an article can often only be understood in the context of related articles, and hyperlinks make it easy to explore this context. But important links are often missing, and several methods have been proposed to alleviate this problem by learning a linking model based on the structure of the existing links. Here we propose a novel approach to identifying missing links in Wikipedia. We build on the fact that the ultimate purpose of Wikipedia links is to aid navigation. Rather than merely suggesting new links that are in tune with the structure of existing links, our method finds missing links that would immediately enhance Wikipedia's navigability. We leverage data sets of navigation paths collected through a Wikipedia-based human-computation game in which users must find a short path from a start to a target article by only clicking links encountered along the way. We harness human navigational traces to identify a set of candidates for missing links and then rank these candidates. Experiments show that our procedure identifies missing links of high quality.
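
    The candidate-mining step described above can be illustrated with a toy sketch (the paths and link set here are invented, and the real method's ranking model is more sophisticated): ordered article pairs that users connect only indirectly on navigation paths, and that are not already linked, are counted as missing-link candidates.

    ```python
    from collections import Counter

    # Toy navigation paths (sequences of clicked articles) and existing links.
    paths = [
        ["Physics", "Energy", "Solar_power"],
        ["Physics", "Energy", "Solar_power"],
        ["Chemistry", "Energy", "Solar_power"],
    ]
    existing_links = {("Physics", "Energy"), ("Energy", "Solar_power"),
                      ("Chemistry", "Energy")}

    def missing_link_candidates(paths, existing_links):
        """Count (source, target) pairs that appear on a path but are not
        directly linked; higher counts suggest more useful missing links."""
        counts = Counter()
        for path in paths:
            for i, src in enumerate(path):
                for dst in path[i + 2:]:          # skip the directly clicked link
                    if (src, dst) not in existing_links:
                        counts[(src, dst)] += 1
        return counts.most_common()

    ranking = missing_link_candidates(paths, existing_links)
    ```

    Here ("Physics", "Solar_power") ranks first because two navigation traces reached Solar_power from Physics without a direct link.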

  12. Exemplar-based inpainting as a solution to the missing wedge problem in electron tomography.

    PubMed

    Trampert, Patrick; Wang, Wu; Chen, Delei; Ravelli, Raimond B G; Dahmen, Tim; Peters, Peter J; Kübel, Christian; Slusallek, Philipp

    2018-04-21

    A new method for dealing with incomplete projection sets in electron tomography is proposed. The approach is inspired by exemplar-based inpainting techniques in image processing and heuristically generates data for missing projection directions. The method has been extended to work on three-dimensional data. In general, electron tomography reconstructions suffer from elongation artifacts along the beam direction. These artifacts can be seen in the corresponding Fourier domain as a missing wedge. The new method synthetically generates projections for the missing directions with the help of a dictionary-based approach that is able to convey both structure and texture at the same time. It constitutes a preprocessing step that can be combined with any tomographic reconstruction algorithm. The new algorithm was applied to phantom data, to a real electron tomography dataset taken from a catalyst, and to a real dataset containing solely colloidal gold particles. Visually, the synthetic projections, reconstructions, and corresponding Fourier power spectra showed a decrease in the typical missing wedge artifacts. Quantitatively, the inpainting method is capable of reducing missing wedge artifacts and improves tomogram quality with respect to full-width-at-half-maximum measurements. Copyright © 2018. Published by Elsevier B.V.
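
    The "missing wedge" itself is easy to visualise: a tilt series limited to roughly ±60° leaves a wedge-shaped region of Fourier space unmeasured around the beam axis. A minimal sketch (the geometry and 2D simplification are illustrative assumptions, not the paper's code):

    ```python
    import numpy as np

    def missing_wedge_mask(shape, max_tilt_deg=60.0):
        """Boolean mask of Fourier coefficients covered by a +/- max_tilt
        tilt series in a 2D slice; the complement is the missing wedge.
        The first axis (kz) is taken to be the beam direction."""
        nz, nx = shape
        kz = np.fft.fftfreq(nz)[:, None]
        kx = np.fft.fftfreq(nx)[None, :]
        angle = np.degrees(np.arctan2(np.abs(kz), np.abs(kx)))  # angle from kx axis
        return angle <= max_tilt_deg

    mask = missing_wedge_mask((128, 128), max_tilt_deg=60.0)
    covered = mask.mean()   # fraction of Fourier space actually measured
    ```

    Inpainting methods like the one above aim to fill the `~mask` region with plausible data before reconstruction.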

  13. Mining Missing Hyperlinks from Human Navigation Traces: A Case Study of Wikipedia

    PubMed Central

    West, Robert; Paranjape, Ashwin; Leskovec, Jure

    2015-01-01

    Hyperlinks are an essential feature of the World Wide Web. They are especially important for online encyclopedias such as Wikipedia: an article can often only be understood in the context of related articles, and hyperlinks make it easy to explore this context. But important links are often missing, and several methods have been proposed to alleviate this problem by learning a linking model based on the structure of the existing links. Here we propose a novel approach to identifying missing links in Wikipedia. We build on the fact that the ultimate purpose of Wikipedia links is to aid navigation. Rather than merely suggesting new links that are in tune with the structure of existing links, our method finds missing links that would immediately enhance Wikipedia's navigability. We leverage data sets of navigation paths collected through a Wikipedia-based human-computation game in which users must find a short path from a start to a target article by only clicking links encountered along the way. We harness human navigational traces to identify a set of candidates for missing links and then rank these candidates. Experiments show that our procedure identifies missing links of high quality. PMID:26634229

  14. Forecasting E > 50-MeV proton events with the proton prediction system (PPS)

    NASA Astrophysics Data System (ADS)

    Kahler, Stephen W.; White, Stephen M.; Ling, Alan G.

    2017-11-01

    Forecasting solar energetic (E > 10-MeV) particle (SEP) events is an important element of space weather. While several models have been developed for use in forecasting such events, satellite operations are particularly vulnerable to higher-energy (≥50-MeV) SEP events. Here we validate one model, the proton prediction system (PPS), which extends to that energy range. We first develop a database of E ≥ 50-MeV proton events exceeding 1.0 proton flux units (pfu) observed on the GOES satellite over the period 1986-2016. We modify the PPS to forecast proton events at the reduced level of 1 pfu and run PPS for four different solar input parameters: (1) all ≥M5 solar X-ray flares; (2) all ≥200 sfu 8800-MHz bursts with associated ≥M5 flares; (3) all ≥500 sfu 8800-MHz bursts; and (4) all ≥5000 sfu 8800-MHz bursts. The validation contingency tables and skill scores are calculated for all groups and used as a guide to the use of the PPS. We plot the false alarms and missed events as functions of solar source longitude, and argue that the longitude dependence employed by PPS does not match modern observations. Use of the radio fluxes as the PPS driver tends to result in too many false alarms at the 500 sfu threshold, and misses more events than the soft X-ray predictor at the 5000 sfu threshold.
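
    The contingency-table skill scores used in validations like this one follow standard definitions. A small sketch (the counts below are made up for illustration):

    ```python
    def skill_scores(hits, misses, false_alarms, correct_negatives):
        """Standard 2x2 forecast-verification metrics of the kind used to
        validate event forecasts such as proton-event predictions."""
        pod = hits / (hits + misses)                  # probability of detection
        far = false_alarms / (hits + false_alarms)    # false alarm ratio
        csi = hits / (hits + misses + false_alarms)   # critical success index
        total = hits + misses + false_alarms + correct_negatives
        # Heidke skill score: accuracy relative to random chance.
        expected = ((hits + misses) * (hits + false_alarms)
                    + (correct_negatives + misses)
                    * (correct_negatives + false_alarms)) / total
        hss = (hits + correct_negatives - expected) / (total - expected)
        return {"POD": pod, "FAR": far, "CSI": csi, "HSS": hss}

    scores = skill_scores(hits=30, misses=10, false_alarms=20, correct_negatives=140)
    ```

    With these example counts, POD is 0.75 and FAR is 0.40, i.e. three quarters of events are caught but 40% of alerts are false.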

  15. Educational Television Via Satellite: Studies of Antecedents and Projects, Preliminary Plan. Volume One.

    ERIC Educational Resources Information Center

    Comision Nacional de Investigaciones Espaciales, Buenos Aires (Argentina).

    A proposed satellite-aided educational television (ETV) system for Argentina is described in this Spanish-language report. The requirements and advantages of such a system are discussed, and some other studies of satellite-aided ETV are summarized. International and legal considerations, and problems of integrating existing Argentine TV stations…

  16. Attitude guidance and simulation with animation of a land-survey satellite motion

    NASA Astrophysics Data System (ADS)

    Somova, Tatyana

    2017-01-01

    We consider problems of synthesis of the vector spline attitude guidance laws for a land-survey satellite and an in-flight support of the satellite attitude control system with the use of computer animation of its motion. We have presented the results on the efficiency of the developed algorithms.

  17. The Future of Satellite Communications. Resource Management and the Needs of Nations.

    ERIC Educational Resources Information Center

    Hinchman, Walter R.; Dunn, D. A.

    Recent events suggest that Intelsat (the 68-nation International Telecommunications Satellite Consortium) will coordinate a number of domestic and regional systems that provide satellite communications services, some of which will be maintained by Intelsat and some of which will be independent. This report addresses the problems of conflict in…

  18. Aeronautical satellite antenna steering using magnetic field sensors

    NASA Technical Reports Server (NTRS)

    Sydor, John; Dufour, Martial

    1993-01-01

    Designers of aeronautical satellite terminals are often faced with the problem of steering a directive antenna from an airplane or helicopter. This problem is usually solved by using aircraft orientation information derived from inertial sensors on-board the aircraft in combination with satellite ephemeris information calculated from geographic coordinates. This procedure works well but relies heavily on avionics that are external to the terminal. For the majority of small aircraft and helicopters which will form the bulk of future aeronautical satcom users, such avionics either do not exist or are difficult for the satellite terminal to interface with. At the Communications Research Center (CRC), work has been undertaken to develop techniques that use the geomagnetic field and satellite antenna pointing vectors (both of which are stationary in a local geographical area) to track the position of a satellite relative to a moving platform such as an aircraft. The performance of this technique is examined and a mathematical steering transformation is developed within this paper. Details are given regarding the experimental program that will be undertaken to test the concepts proposed herein.

  19. Assessing temporally and spatially resolved PM 2.5 exposures for epidemiological studies using satellite aerosol optical depth measurements

    NASA Astrophysics Data System (ADS)

    Kloog, Itai; Koutrakis, Petros; Coull, Brent A.; Lee, Hyung Joo; Schwartz, Joel

    2011-11-01

    Land use regression (LUR) models provide good estimates of spatially resolved long-term exposures, but are poor at capturing short term exposures. Satellite-derived Aerosol Optical Depth (AOD) measurements have the potential to provide spatio-temporally resolved predictions of both long and short term exposures, but previous studies have generally showed relatively low predictive power. Our objective was to extend our previous work on day-specific calibrations of AOD data using ground PM 2.5 measurements by incorporating commonly used LUR variables and meteorological variables, thus benefiting from both the spatial resolution from the LUR models and the spatio-temporal resolution from the satellite models. We later use spatial smoothing to predict PM 2.5 concentrations for day/location pairs with missing AOD measures. We used mixed models with random slopes for day to calibrate AOD data for 2000-2008 across New England with monitored PM 2.5 measurements. We then used a generalized additive mixed model with spatial smoothing to estimate PM 2.5 in location-day pairs with missing AOD, using regional measured PM 2.5, AOD values in neighboring cells, and land use. Finally, local (100 m) land use terms were used to model the difference between grid cell prediction and monitored value to capture very local traffic particles. Out-of-sample ten-fold cross-validation was used to quantify the accuracy of our predictions. For days with available AOD data we found high out-of-sample R2 (mean out-of-sample R2 = 0.830, year to year variation 0.725-0.904). For days without AOD values, our model performance was also excellent (mean out-of-sample R2 = 0.810, year to year variation 0.692-0.887). Importantly, these R2 are for daily, rather than monthly or yearly, values. Our model allows one to assess short term and long-term human exposures in order to investigate both the acute and chronic effects of ambient particles, respectively.
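
    The core idea of day-specific calibration can be sketched very simply. The paper fits mixed models with random slopes for day; the toy below substitutes separate per-day ordinary least squares fits (an assumption made purely for illustration, with simulated data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data: PM2.5 vs AOD with a calibration slope that varies by day.
    days = np.repeat(np.arange(10), 30)
    aod = rng.uniform(0.05, 0.6, size=days.size)
    day_slope = 25 + 5 * rng.standard_normal(10)      # day-specific true slope
    pm25 = 4.0 + day_slope[days] * aod + rng.normal(scale=1.0, size=days.size)

    def daily_calibration(days, aod, pm25):
        """Fit a separate intercept/slope per day (a simple stand-in for a
        mixed model with random slopes for day); returns day -> (b0, b1)."""
        fits = {}
        for d in np.unique(days):
            sel = days == d
            X = np.column_stack([np.ones(sel.sum()), aod[sel]])
            beta, *_ = np.linalg.lstsq(X, pm25[sel], rcond=None)
            fits[int(d)] = beta
        return fits

    fits = daily_calibration(days, aod, pm25)
    pred = fits[0][0] + fits[0][1] * aod[days == 0]   # predictions for day 0
    ```

    A true mixed model additionally shrinks each day's slope toward the overall mean, which stabilises days with few collocated measurements.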

  20. The mean-square error optimal linear discriminant function and its application to incomplete data vectors

    NASA Technical Reports Server (NTRS)

    Walker, H. F.

    1979-01-01

    In many pattern recognition problems, data vectors are classified although one or more of the data vector elements are missing. This problem occurs in remote sensing when the ground is obscured by clouds. Optimal linear discrimination procedures for classifying incomplete data vectors are discussed.
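
    One simple way to discriminate with incomplete vectors is to evaluate the class scores on the observed coordinates only, marginalising out the missing ones. A hedged sketch under simplifying assumptions (Gaussian classes with known means and a shared diagonal covariance, which is not necessarily the procedure in the record above):

    ```python
    import numpy as np

    def classify_incomplete(x, means, var):
        """Classify a vector with NaN-marked missing elements by evaluating
        the Gaussian discriminant on the observed coordinates only."""
        obs = ~np.isnan(x)
        scores = []
        for mu in means:
            d = x[obs] - mu[obs]
            scores.append(-0.5 * np.sum(d * d / var[obs]))
        return int(np.argmax(scores))

    # Two hypothetical classes in 3 dimensions; the middle feature is missing.
    means = [np.array([0.0, 0.0, 0.0]), np.array([3.0, 3.0, 3.0])]
    var = np.array([1.0, 1.0, 1.0])
    label = classify_incomplete(np.array([2.8, np.nan, 3.2]), means, var)
    ```

    With correlated features, the marginalisation would instead use the sub-blocks of the full covariance matrix.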

  1. Minimax Rate-optimal Estimation of High-dimensional Covariance Matrices with Incomplete Data*

    PubMed Central

    Cai, T. Tony; Zhang, Anru

    2016-01-01

    Missing data occur frequently in a wide range of applications. In this paper, we consider estimation of high-dimensional covariance matrices in the presence of missing observations under a general missing completely at random model in the sense that the missingness is not dependent on the values of the data. Based on incomplete data, estimators for bandable and sparse covariance matrices are proposed and their theoretical and numerical properties are investigated. Minimax rates of convergence are established under the spectral norm loss and the proposed estimators are shown to be rate-optimal under mild regularity conditions. Simulation studies demonstrate that the estimators perform well numerically. The methods are also illustrated through an application to data from four ovarian cancer studies. The key technical tools developed in this paper are of independent interest and potentially useful for a range of related problems in high-dimensional statistical inference with missing data. PMID:27777471
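
    Under MCAR, a natural starting point (a generic pairwise-complete estimator sketched here for illustration, not the paper's rate-optimal procedure) averages cross-products over the rows where both coordinates are observed:

    ```python
    import numpy as np

    def cov_incomplete(X):
        """Generalised sample covariance from data with NaN entries under
        MCAR: entry (j, k) averages centred products over the rows where
        both coordinates j and k are observed."""
        obs = ~np.isnan(X)
        mu = np.nanmean(X, axis=0)
        Xc = np.where(obs, X - mu, 0.0)                   # zero out missing
        counts = obs.astype(float).T @ obs.astype(float)  # n_jk per entry
        return (Xc.T @ Xc) / counts

    rng = np.random.default_rng(2)
    true_cov = np.array([[1.0, 0.5], [0.5, 2.0]])
    X = rng.multivariate_normal([0, 0], true_cov, size=2000)
    X[rng.random(X.shape) < 0.2] = np.nan                 # MCAR missingness
    S = cov_incomplete(X)
    ```

    For bandable or sparse targets, this raw estimate would then be tapered or thresholded, which is where the minimax analysis in the paper comes in.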

  2. Minimax Rate-optimal Estimation of High-dimensional Covariance Matrices with Incomplete Data.

    PubMed

    Cai, T Tony; Zhang, Anru

    2016-09-01

    Missing data occur frequently in a wide range of applications. In this paper, we consider estimation of high-dimensional covariance matrices in the presence of missing observations under a general missing completely at random model in the sense that the missingness is not dependent on the values of the data. Based on incomplete data, estimators for bandable and sparse covariance matrices are proposed and their theoretical and numerical properties are investigated. Minimax rates of convergence are established under the spectral norm loss and the proposed estimators are shown to be rate-optimal under mild regularity conditions. Simulation studies demonstrate that the estimators perform well numerically. The methods are also illustrated through an application to data from four ovarian cancer studies. The key technical tools developed in this paper are of independent interest and potentially useful for a range of related problems in high-dimensional statistical inference with missing data.

  3. Getting patients in the door: medical appointment reminder preferences.

    PubMed

    Crutchfield, Trisha M; Kistler, Christine E

    2017-01-01

    Between 23% and 34% of outpatient appointments are missed annually. Patients who frequently miss medical appointments have poorer health outcomes and are less likely to use preventive health care services. Missed appointments result in unnecessary costs and organizational inefficiencies. Appointment reminders may help reduce missed appointments; particular types may be more effective than other types. We used a survey with a discrete choice experiment (DCE) to learn why individuals miss appointments and to assess appointment reminder preferences. We enrolled a national sample of adults from an online survey panel to complete demographic and appointment habit questions as well as a 16-task DCE designed in Sawtooth Software's Discover tool. We assessed preferences for four reminder attributes - initial reminder type, arrival of initial reminder, reminder content, and number of reminders. We derived utilities and importance scores. We surveyed 251 adults nationally, with a mean age of 43 (range 18-83) years: 51% female, 84% White, and 8% African American. Twenty-three percent of individuals missed one or more appointments in the past 12 months. Two primary reasons given for missing an appointment include transportation problems (28%) and forgetfulness (26%). Participants indicated the initial reminder type (21%) was the most important attribute, followed by the number of reminders (10%). Overall, individuals indicated a preference for a single reminder, arriving via email, phone call, or text message, delivered less than 2 weeks prior to an appointment. Preferences for reminder content were less clear. The number of missed appointments and reasons for missing appointments are consistent with prior research. Patient-centered appointment reminders may improve appointment attendance by addressing some of the reasons individuals report missing appointments and by meeting patients' needs. 
Future research is necessary to determine if preferred reminders used in practice will result in improved appointment attendance in clinical settings.

  4. Informatively missing quality of life and unmet needs sex data for immigrant and Anglo-Australian cancer patients and survivors.

    PubMed

    Bell, Melanie L; Butow, Phyllis N; Goldstein, David

    2013-12-01

    Although cancer can seriously affect peoples' sexual well-being, survivors and patients may be reluctant to answer questions about sex. This reluctance may be stronger for immigrants. This study aimed to investigate missing sex data rates and predictors of missingness in two large studies on immigrants and Anglo-Australian controls with cancer, and to investigate whether those with missing sex data may have worse sexual outcomes than those with complete data. We carried out two studies aimed at describing the quality of life (QoL) and unmet needs amongst Arabic, Chinese and Greek immigrant versus Anglo-Australian cancer survivors (n = 596, recruited from cancer registries) and patients (n = 845). Logistic regression was used to model the probability of having missing sex data in either of the questionnaires. We compared the mean of the unmet sex needs responses of those who had missing QoL sex data (but not needs) to those who had completed both, and vice versa. Missing sex data rates were as high as 65%, with immigrants more likely to skip sex items than Anglo-Australians (p = 0.02 for registry study, p < 0.0001 for hospital study). Women, older participants and participants with more advanced disease had increased odds of missingness. There was evidence that data were informatively missing. Additionally, the questionnaire which stated that the sex questions are optional had higher missing data rates. High missing data rates and informatively missing data can lead to biased results. Using questionnaires that state that the sex items are optional may lead to an underestimation of sexual problems or an overestimation of quality of life.

  5. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
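
    The clustering-based, label-free idea can be sketched with a toy k-means outlier score (simulated data and a deliberately simple algorithm, not the framework's actual implementation):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy "measurements": two dense regimes plus two injected outliers.
    normal = np.vstack([rng.normal(0.0, 0.3, (200, 2)),
                        rng.normal(5.0, 0.3, (200, 2))])
    outliers = np.array([[2.5, 2.5], [2.0, 3.0]])
    data = np.vstack([normal, outliers])

    def kmeans_anomaly_scores(X, k=2, iters=20):
        """Unsupervised anomaly score: distance to the nearest k-means
        centroid. Centres are seeded by farthest-point traversal and then
        refined with Lloyd's iterations; no labels are required."""
        centers = [X[0]]
        for _ in range(k - 1):                 # farthest-point seeding
            d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
            centers.append(X[int(d.argmax())])
        centers = np.array(centers, dtype=float)
        for _ in range(iters):                 # Lloyd's algorithm
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        return d.min(axis=1)

    scores = kmeans_anomaly_scores(data)
    top2 = np.argsort(scores)[-2:]            # the two most anomalous points
    ```

    Points far from every cluster centre get the highest scores; a spatio-temporal version would additionally group high-scoring points into events.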

  6. A new Ellipsoidal Gravimetric-Satellite Altimetry Boundary Value Problem; Case study: High Resolution Geoid of Iran

    NASA Astrophysics Data System (ADS)

    Ardalan, A.; Safari, A.; Grafarend, E.

    2003-04-01

    A new ellipsoidal gravimetric-satellite altimetry boundary value problem has been developed and successfully tested. This boundary value problem has been constructed for gravity observables of the types (i) gravity potential, (ii) gravity intensity, (iii) deflection of the vertical and (iv) satellite altimetry data. The developed boundary value problem is ellipsoidal in nature and as such can take advantage of high-precision GPS observations in the set-up of the problem. The highlights of the solution are as follows: (1) application of ellipsoidal harmonic expansion up to degree/order and an ellipsoidal centrifugal field for the reduction of global gravity and isostasy effects from the gravity observables at the surface of the Earth; (2) application of the ellipsoidal Newton integral on the equal-area map projection surface for the reduction of residual mass effects within a radius of 55 km around the computational point; (3) ellipsoidal harmonic downward continuation of the residual observables from the surface of the Earth down to the surface of the reference ellipsoid, using the ellipsoidal heights of the observation points derived from GPS; (4) restoration of the removed effects at the application points on the surface of the reference ellipsoid; (5) conversion of the satellite-altimetry-derived heights of the water bodies into potential; (6) combination of the downward-continued gravity information with the potential equivalent of the satellite-altimetry-derived heights of the water bodies; (7) application of the ellipsoidal Bruns formula for converting the potential values on the surface of the reference ellipsoid into geoidal heights (i.e. ellipsoidal heights of the geoid) with respect to the reference ellipsoid. The computation of a high-resolution geoid of Iran has successfully tested this new methodology.

  7. Some environmental problems and their satellite monitoring. [anthropogenic modifications of earth surface

    NASA Technical Reports Server (NTRS)

    Otterman, J.

    1975-01-01

    Anthropogenic modification of the earth's surface is discussed in two problem areas: (1) land use changes and overgrazing, and how it affects albedo and land surface-atmosphere interactions, and (2) water and land surface pollution, especially oil slicks. A literature survey evidences the importance of these problems. The need for monitoring is stressed, and it is suggested that with some modifications to the sensors, ERTS (Landsat) series satellites can provide approximate monitoring information. The European Landsat receiving station in Italy will facilitate data collection for the tasks described.

  8. Evaluation of high resolution global satellite precipitation products using daily raingauge data over the Upper Blue Nile Basin

    NASA Astrophysics Data System (ADS)

    Sahlu, Dejene; Moges, Semu; Anagnostou, Emmanouil; Nikolopoulos, Efthymios; Hailu, Dereje; Mei, Yiwen

    2017-04-01

    Water resources assessment, planning and management in Africa is often constrained by the lack of reliable spatio-temporal rainfall data. Satellite products are steadily growing and offer useful alternative datasets of rainfall globally. The aim of this paper is to examine the error characteristics of the main available global satellite precipitation products, with the view of improving the reliability of wet season (June to September) and small rainy season rainfall datasets over the Upper Blue Nile Basin. The study utilized six satellite-derived precipitation datasets at 0.25-deg spatial grid size and daily temporal resolution: (1) the near real-time (3B42_RT) and gauge-adjusted (3B42_V7) products of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA), (2) the gauge-adjusted and unadjusted Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) products, and (3) the gauge-adjusted and unadjusted products of the National Oceanic and Atmospheric Administration (NOAA) Climate Prediction Center Morphing technique (CMORPH), over the period of 2000 to 2013. The error analysis utilized statistical techniques based on bias ratio (Bias), correlation coefficient (CC) and root-mean-square error (RMSE). Mean relative error (MRE), CC and RMSE metrics are further examined for six categories of 10th, 25th, 50th, 75th, 90th and 95th percentile rainfall thresholds. The skill of the satellite estimates is evaluated using the categorical error metrics of missed rainfall volume fraction (MRV), falsely detected rainfall volume fraction (FRV), probability of detection (POD) and false alarm ratio (FAR). Results showed that all six satellite-based rainfall products underestimated wet season (June to September) gauge precipitation, with the exception of non-adjusted PERSIANN, which overestimated rainfall in the initial part of the rainy season (March to May). 
During the wet season, adjusted CMORPH has a relatively better bias ratio (89%), followed by 3B42_V7 (88%) and adjusted PERSIANN (81%); the non-adjusted products have relatively lower bias ratios. The CC statistic ranges from 0.34 to 0.43 for the wet season, with the adjusted products having slightly higher values; the initial rainy season has relatively higher CC than the wet season. Results from the categorical error metrics showed that the CMORPH products have the highest POD (91%), i.e., they are best at detecting rainfall events in the wet season. For the initial rainy season, POD is below 50% for PERSIANN, while the TMPA and CMORPH products are nearly equivalent (63-67%). FAR is below 0.1% for all products in the initial rainy season, while in the wet season it is higher (10-25%). In terms of the rainfall volume of missed and falsely detected rainfall, CMORPH exhibited a lower MRV (4.5%) than the TMPA and PERSIANN products (11-19%) in the wet season. MRV for the initial rainy season was 20% for the TMPA and CMORPH products and above 30% for the PERSIANN products. All products are nearly equivalent in the wet season in terms of FRV (<0.2%). The magnitude of MRE increases with the gauge rainfall threshold categories, with 3B42_V7 and adjusted CMORPH having lower magnitudes, showing that the underestimation of rainfall increases with increasing rainfall magnitude. CC also decreases with the gauge rainfall threshold categories, with the CMORPH products having slightly higher values. Overall, all satellite products underestimated (overestimated) the lower (higher) quantiles. We observed that, among the six satellite rainfall products, adjusted CMORPH has relatively better potential to improve the wet season rainfall estimate, and 3B42_V7 the initial rainy season estimate, in the Upper Blue Nile Basin.
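
    The continuous and categorical metrics used above have simple definitions. A hedged sketch on an invented paired gauge/satellite daily series (the 0.1 mm rain/no-rain threshold is an assumption for illustration):

    ```python
    import numpy as np

    def validation_metrics(gauge, sat, rain_thresh=0.1):
        """Bias ratio, CC, RMSE, POD, FAR, and missed/false rainfall volume
        fractions for a satellite product against gauge data."""
        gauge, sat = np.asarray(gauge, float), np.asarray(sat, float)
        bias_ratio = sat.sum() / gauge.sum()
        cc = np.corrcoef(gauge, sat)[0, 1]
        rmse = np.sqrt(np.mean((sat - gauge) ** 2))
        g_rain, s_rain = gauge > rain_thresh, sat > rain_thresh
        pod = np.sum(g_rain & s_rain) / g_rain.sum()     # probability of detection
        far = np.sum(~g_rain & s_rain) / s_rain.sum()    # false alarm ratio
        # Volume fractions: rain the product missed / falsely reported.
        mrv = gauge[g_rain & ~s_rain].sum() / gauge.sum()
        frv = sat[~g_rain & s_rain].sum() / sat.sum()
        return dict(bias=bias_ratio, cc=cc, rmse=rmse, pod=pod, far=far,
                    mrv=mrv, frv=frv)

    gauge = [0.0, 5.0, 10.0, 0.0, 2.0, 0.0]
    sat   = [0.0, 4.0,  8.0, 1.0, 0.0, 0.0]
    m = validation_metrics(gauge, sat)
    ```

    Note that MRV and FRV weight events by rainfall amount, so a product can have a high POD yet still miss a large share of the rainfall volume.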

  9. An alternative empirical likelihood method in missing response problems and causal inference.

    PubMed

    Ren, Kaili; Drummond, Christopher A; Brewster, Pamela S; Haller, Steven T; Tian, Jiang; Cooper, Christopher J; Zhang, Biao

    2016-11-30

    Missing responses are a common problem in medical, social, and economic studies. When responses are missing at random, a complete-case analysis may be biased. A popular debiasing method is the inverse probability weighting proposed by Horvitz and Thompson. To improve efficiency, Robins et al. proposed an augmented inverse probability weighting method. The augmented inverse probability weighting estimator has a double-robustness property and achieves the semiparametric efficiency lower bound when the regression model and the propensity score model are both correctly specified. In this paper, we introduce an empirical likelihood-based estimator as an alternative to that of Qin and Zhang (2007). Our proposed estimator is also doubly robust and locally efficient. Simulation results show that the proposed estimator performs better when the propensity score is correctly modeled. Moreover, the proposed method can be applied to the estimation of the average treatment effect in observational causal inference. Finally, we apply our method to an observational study of smoking, using data from the Cardiovascular Outcomes in Renal Atherosclerotic Lesions clinical trial. Copyright © 2016 John Wiley & Sons, Ltd.
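
    The Horvitz-Thompson inverse probability weighting (IPW) and augmented IPW (AIPW) ideas the abstract builds on can be sketched on simulated data. This is a toy illustration, not the paper's empirical-likelihood estimator: the propensity and outcome models are taken as known for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)
y = 2.0 + x + rng.normal(size=n)           # target: E[y] = 2
p = 1.0 / (1.0 + np.exp(-(0.5 + x)))       # P(response | x): missing at random
r = rng.random(n) < p                      # response indicator (True = observed)

# Complete-case mean is biased: observation probability rises with x, and so does y
cc_mean = y[r].mean()

# Horvitz-Thompson IPW: weight observed responses by 1/propensity
ipw_mean = np.mean(r * y / p)

# AIPW: augment with an outcome regression m(x) (here the true one);
# consistent if either the propensity or the regression model is correct
m = 2.0 + x
aipw_mean = np.mean(m + r * (y - m) / p)
```

    With this simulated missingness the complete-case mean overshoots the true mean of 2, while the IPW and AIPW estimates recover it (AIPW with smaller variance).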

  10. [Health status, use of health services and reported morbidity: application of correspondence analysis].

    PubMed

    Espinàs, J A; Riba, M D; Borràs, J M; Sánchez, V

    1995-01-01

    The study of the relationship between self-reported morbidity, health status and health care utilization presents methodological problems due to the variety of illnesses and medical conditions that one individual may report. In this article, correspondence analysis was used to analyse these relationships. Data from the Spanish National Health Survey pertaining to the region of Catalonia were studied. The statistical analysis comprised multiple correspondence analysis (MCA) followed by cluster analysis. The first factor extracted is defined by self-assessed health perception; the second, by limitation of activities; and the third is related to self-reported morbidity caused by chronic and acute health problems. The fourth and fifth factors capture residual variability and missing values. Acute problems are more related to a perception of poor health, while chronic problems are related to a perception of fair health. Also, it may be possible to distinguish self-reported morbidity due to relapses of chronic diseases from true acute health problems. Cluster analysis classified individuals into four groups: 1) healthy people; 2) people who assess their health as poor and those with acute health problems; 3) people with chronic health problems, limited activity and a perception of fair health; and 4) missing values. Correspondence analysis is a useful tool for analyzing qualitative variables like those in a health survey.
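
    The core of simple correspondence analysis (the factor extraction step described above) is an SVD of the standardized residuals of a contingency table. A hedged toy sketch on a made-up health-perception-by-morbidity table (not the survey data):

```python
import numpy as np

# Toy contingency table: self-assessed health (rows) x reported morbidity (cols)
N = np.array([[60.0, 10.0,  5.0],
              [20.0, 40.0, 15.0],
              [ 5.0, 20.0, 45.0]])

P = N / N.sum()                 # correspondence matrix
r = P.sum(axis=1)               # row masses
c = P.sum(axis=0)               # column masses
# Standardized residuals S = D_r^{-1/2} (P - r c^T) D_c^{-1/2}
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, s, Vt = np.linalg.svd(S, full_matrices=False)

inertia = s ** 2                             # principal inertia per axis
row_coords = (U * s) / np.sqrt(r)[:, None]   # principal row coordinates
```

    The total inertia equals the table's chi-square statistic divided by the grand total, so the leading axes summarize the strongest departures from row-column independence.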

  11. The CEOS Atmospheric Composition Constellation (ACC), an Integrated Observing System

    NASA Astrophysics Data System (ADS)

    Hilsenrath, E.; Langen, J.; Zehner, C.

    2008-05-01

    The Atmospheric Composition (AC) Constellation is one of four pilot projects initiated by the Committee on Earth Observation Satellites (CEOS) to bring about technical and scientific cooperation among space agencies that meets the goals of GEO and complies with the CEOS member agencies' national programs. The Constellation concept has been endorsed in the GEO Work Plan, 2007-2009. The AC Constellation goal is to collect and deliver data to develop and improve monitoring, assessment and predictive capabilities for changes in the ozone layer, air quality and climate forcing associated with changes in the environment. These data will support five of the nine GEO SBAs: Health, Energy, Climate, Hazards, and Ecosystems. At present, ESA, EC, CSA, CNES, JAXA, DLR, NIVR, NASA, NOAA and Eumetsat are participating in the Constellation study and have major assets in orbit, including 17 instruments on seven platforms. One goal of the Constellation study is to identify the capabilities that will be lost when the present orbiting research satellite missions end and that are not included in the next-generation operational missions. Missing observations include the very accurate, high-spatial-resolution measurements needed to track trends in atmospheric composition and understand their relationship to climate change. The following are the top-level objectives for the AC Constellation Concept Study: • Develop a virtual constellation of existing and upcoming missions using synergies among the instruments, and identify missing capabilities. • Study advanced architectures with new space assets and varying orbits, with the expectation that new technology could also be brought forward to best meet user requirements. • Ensure data-system interoperability so that data are useful, properly targeted, and easily accessible. 
To demonstrate that the Constellation concept can provide value-added data products, the ACC has initiated three projects supported by the participating space agencies: 1) time-of-day changes in NO2 using Aura/OMI and Metop/GOME-2; 2) near-real-time fire detection and smoke forecasts using multiple satellites (A-Train, GOES, GOME-2, MSG, etc.) and trajectory models; and 3) improved volcanic ash alerts for aviation hazard avoidance from satellite SO2 and ash data from SCIAMACHY, OMI, GOME-2, AIRS and SEVIRI. Each of the three projects will address the GEO SBAs with consideration given to the discovery and interoperability of their data products. The status of the ACC studies will be reviewed, with a progress report on the above three projects.

  12. Explicating the Conditions Under Which Multilevel Multiple Imputation Mitigates Bias Resulting from Random Coefficient-Dependent Missing Longitudinal Data.

    PubMed

    Gottfredson, Nisha C; Sterba, Sonya K; Jackson, Kristina M

    2017-01-01

    Random coefficient-dependent (RCD) missingness is a non-ignorable mechanism through which missing data can arise in longitudinal designs. RCD, which cannot be tested for, is a problematic form of missingness that occurs when subject-specific random effects correlate with the propensity for missingness or dropout. Particularly when covariate missingness is a problem, investigators typically handle missing longitudinal data with single-level multiple imputation procedures implemented either with long-format data, which ignores within-person dependency entirely, or with wide-format (i.e., multivariate) data, which ignores some aspects of within-person dependency. When either of these standard approaches is used, RCD missingness leads to parameter bias and incorrect inference. We explain why multilevel multiple imputation (MMI) should alleviate bias induced by an RCD missing data mechanism under conditions that contribute to stronger determinacy of the random coefficients. We evaluate our hypothesis with a simulation study. Three design factors are considered: intraclass correlation (ICC; ranging from .25 to .75), number of waves (ranging from 4 to 8), and percent of missing data (ranging from 20 to 50%). We find that MMI greatly outperforms the single-level wide-format (multivariate) method of imputation under an RCD mechanism. For the MMI analyses, bias was most alleviated when the ICC was high, when there were more waves of data, and when less data were missing. Practical recommendations for handling longitudinal missing data are suggested.
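
    Whatever imputation engine is used (single-level or multilevel), multiply imputed analyses are combined the same way, with Rubin's rules. A minimal sketch of that pooling step, with made-up per-imputation estimates (this is the standard combining formula, not the authors' MMI procedure):

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Pool point estimates and their variances from M imputed datasets
    using Rubin's rules: total variance = within + (1 + 1/M) * between."""
    estimates = np.asarray(estimates, float)
    variances = np.asarray(variances, float)
    m = len(estimates)
    qbar = estimates.mean()        # pooled point estimate
    ubar = variances.mean()        # within-imputation variance
    b = estimates.var(ddof=1)      # between-imputation variance
    t = ubar + (1 + 1 / m) * b     # total variance of the pooled estimate
    return qbar, t

# Made-up results from M = 5 imputed datasets
est = [1.02, 0.98, 1.05, 0.95, 1.00]
var = [0.04, 0.05, 0.04, 0.05, 0.04]
qbar, t = pool_rubin(est, var)
```

    The between-imputation term is what propagates the uncertainty due to the missing data; ignoring it (as a single filled-in dataset would) understates the pooled variance.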

  13. Quantitative Analysis of the Contributing Factors Affecting Specialty Care No-Show Rates at Brooke Army Medical Center

    DTIC Science & Technology

    2007-03-30

    2002). In the Vein Treatment Surgery Center in Texas, failure to properly cancel cosmetic appointments will result in forfeiture of the patients' $100... appointments. This problem affects more than just the United States. Missed appointments cost the National Health Service (NHS) in England a... significant amount of money last year. Official figures from the NHS showed 5.7 million appointments were missed in 2004-2005 (Carvel, 2006). When patients

  14. A public service communications satellite user brochure

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The capabilities of a proposed communications satellite that would be devoted to experiments and demonstrations of various public services are described. A Public Service Communications Satellite study was undertaken at the NASA Goddard Space Flight Center (GSFC) to define the problems and opportunities of a renewed NASA role and the form such NASA involvement should take. The concept that has evolved resulted from careful consideration of experiments already undertaken on existing satellites.

  15. Generalizing MOND to explain the missing mass in galaxy clusters

    NASA Astrophysics Data System (ADS)

    Hodson, Alistair O.; Zhao, Hongsheng

    2017-02-01

    Context. MOdified Newtonian Dynamics (MOND) is a gravitational framework designed to explain astronomical observations in the Universe without invoking particle dark matter. MOND, in its current form, cannot explain the missing mass in galaxy clusters without the inclusion of some extra mass, be it in the form of neutrinos or non-luminous baryonic matter. We investigate whether the MOND framework can be generalized to account for the missing mass in galaxy clusters by boosting gravity in regions of high gravitational potential. We examine and review Extended MOND (EMOND), which was designed to increase the MOND acceleration scale in high-potential regions, thereby boosting gravity in clusters. Aims: We seek to investigate galaxy cluster mass profiles in the context of MOND, with the primary aim of explaining the missing mass problem fully without the need for dark matter. Methods: Under the assumption that the clusters are in hydrostatic equilibrium, we can compute the dynamical mass of each cluster and compare the result to the predicted mass of the EMOND formalism. Results: We find that EMOND has some success in fitting some clusters, but overall it struggles to explain the mass deficit fully. We also investigate an empirical relation to solve the cluster problem, which is found by analysing the cluster data and is based on the MOND paradigm. We discuss the limitations in the text.
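
    The hydrostatic-equilibrium dynamical mass referred to in the Methods is conventionally M(<r) = -(k_B T r / (G mu m_p)) (dln n_e/dln r + dln T/dln r), from the gas temperature and density profiles. A hedged numerical sketch with an assumed isothermal beta-model (toy parameter values, not the paper's cluster sample or its EMOND fits):

```python
import numpy as np

# Physical constants (SI)
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23   # Boltzmann constant, J/K
M_P = 1.673e-27   # proton mass, kg
MU = 0.6          # mean molecular weight of cluster gas (assumed)

def hydrostatic_mass(r, T, ne):
    """Dynamical mass M(<r) [kg] of a cluster in hydrostatic equilibrium,
    from gas temperature T(r) [K] and electron density ne(r) on radii r [m]."""
    lnr = np.log(r)
    dlnne = np.gradient(np.log(ne), lnr)   # d ln ne / d ln r
    dlnT = np.gradient(np.log(T), lnr)     # d ln T  / d ln r
    return -(K_B * T * r) / (G * MU * M_P) * (dlnne + dlnT)

# Toy isothermal beta-model: ne ~ (1 + (r/rc)^2)^(-3*beta/2), T constant
kpc = 3.086e19  # metres per kiloparsec
r = np.linspace(100, 1000, 200) * kpc
rc, beta = 200 * kpc, 2.0 / 3.0
ne = (1 + (r / rc) ** 2) ** (-1.5 * beta)  # normalization cancels in dln/dln
T = np.full_like(r, 8e7)                   # ~7 keV cluster (assumed)
M = hydrostatic_mass(r, T, ne)
```

    With these toy numbers the enclosed mass at 1 Mpc comes out at a few times 10^14 solar masses, the right order for a rich cluster; the EMOND comparison in the paper replaces the Newtonian G with a potential-dependent effective gravity.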

  16. The SPS interference problem-electronic system effects and mitigation techniques

    NASA Technical Reports Server (NTRS)

    Juroshek, J. R.

    1980-01-01

    The potential for interference between solar power satellites (SPS) and other Earth satellite operations was examined, along with interference problems involving specific electronic devices. The conclusions indicate that interference is likely in the 2500 MHz to 2690 MHz direct broadcast satellite band adjacent to SPS. Estimates of the adjacent-channel noise from SPS in this band are as high as -124 dBc/4 kHz and -100 dBc/MHz, where dBc represents decibels relative to the total power in the fundamental. A second potential problem is the 7350 MHz 3rd harmonic from SPS, which falls within the 7300 MHz to 7450 MHz space-to-Earth government satellite assignment. Catastrophic failures can be produced in integrated circuits when the microwave power levels coupled into inputs and power leads reach 1 to 100 watts. The failures are typically due to bonding-wire melting, metallization failures, and junction shorting. Nondestructive interaction, or interference, generally occurs with coupled power levels of the order of 10 milliwatts. This interaction is due to the rectification of microwave energy by the numerous pn junctions within these circuits.

  17. Sixteen years of ICPC use in Norwegian primary care: looking through the facts

    PubMed Central

    2010-01-01

    Background The International Classification for Primary Care (ICPC) standard aims to facilitate simultaneous and longitudinal comparisons of clinical primary care practice within and across country borders; it is also used for administrative purposes. This study evaluates the use of the original ICPC-1 and the more complete ICPC-2 Norwegian versions in electronic patient records. Methods We performed a retrospective study of approximately 1.5 million ICPC codes and diagnoses collected over a 16-year period at 12 primary care sites in Norway. In the first phase of this period (transition phase, 1992-1999) physicians were allowed to omit the ICPC code in their practice, while in the second phase (regular phase, 2000-2008) use of an ICPC code was mandatory. The ICPC codes and diagnoses defined a problem event for each patient in the PROblem-oriented electronic MEDical record (PROMED). The main outcome measure of our analysis was the percentage of problem events in PROMEDs with inappropriate (or missing) ICPC codes and of diagnoses that did not map to the latest ICPC-2 classification. Specific problem areas (pneumonia, anaemia, tonsillitis and diabetes) were examined in the same context. Results Codes were missing in 6.2% of the problem events; incorrect codes were observed in 4.0% of the problem events, and a text mismatch between the recorded diagnoses and the expected ICPC-2 diagnosis text in 53.8% of the problem events. Missing codes were observed only during the transition phase, while incorrect and inappropriate codes were used throughout the 16-year period. The physicians created diagnoses that did not exist in ICPC. These 'new' diagnoses were used with varying frequency; many of them were used only once. Inappropriate ICPC-2 codes were also observed in the selected problem areas and in both phases. Conclusions Our results strongly suggest that physicians did not adhere to the ICPC standard because of its incompleteness, i.e. its lack of many clinically important diagnoses. This indicates that ICPC is inadequate for classifying problem events and clinical practice in primary care. PMID:20181271

  18. Banded whistlers observed on OGO-4

    NASA Technical Reports Server (NTRS)

    Paymar, E. M.

    1972-01-01

    Inspection of broadband VLF records from OGO-4 shows that some whistlers exhibit a banded structure in which one or more bands of frequencies are missing from the whistler's spectrum. The phenomenon is commonly observed by satellites on midlatitude field lines at all local times and at various longitudes around the world. The dispersion of banded whistlers (BW) is several tens of sec^(1/2), indicating that they originated in the opposite hemisphere and are propagating downward at the satellite. BW are generally spread in time (tenths of seconds) rather than sharply defined and tend to occur at random. The frequency spacing of the bands may be either uniform or irregular, and may vary radically between successive events. Several possible explanations for BW are considered. In particular, an analysis of the interaction of plane electromagnetic waves traveling in an anisotropic plasma with a field-aligned slab of enhanced ionization is presented, with promising results.

  19. Centralized Networks to Generate Human Body Motions

    PubMed Central

    Vakulenko, Sergei; Radulescu, Ovidiu; Morozov, Ivan

    2017-01-01

    We consider continuous-time recurrent neural networks as dynamical models for the simulation of human body motions. These networks consist of a few centers and many satellites connected to them. The centers evolve in time as periodical oscillators with different frequencies. The center states define the satellite neurons’ states by a radial basis function (RBF) network. To simulate different motions, we adjust the parameters of the RBF networks. Our network includes a switching module that allows for turning from one motion to another. Simulations show that this model allows us to simulate complicated motions consisting of many different dynamical primitives. We also use the model for learning human body motion from markers’ trajectories. We find that center frequencies can be learned from a small number of markers and can be transferred to other markers, such that our technique seems to be capable of correcting for missing information resulting from sparse control marker settings. PMID:29240694
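
    The architecture described (a few periodic center oscillators whose states drive many satellite neurons through a radial basis function network) can be sketched minimally as follows. All frequencies, sizes and weights here are illustrative assumptions, not the authors' learned parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Centers: periodic oscillators with different frequencies (assumed values)
freqs = np.array([1.0, 1.7, 2.3])  # rad/s

def center_states(t):
    """One sinusoidal state per center at time t."""
    return np.sin(freqs * t)

# Satellites: their states are a Gaussian RBF network of the center states.
# To simulate a different motion, one would adjust `weights`.
n_rbf, n_sat = 10, 5
rbf_centers = rng.uniform(-1, 1, size=(n_rbf, len(freqs)))
weights = rng.normal(size=(n_sat, n_rbf))

def satellite_states(t, width=0.5):
    c = center_states(t)
    d2 = ((rbf_centers - c) ** 2).sum(axis=1)  # squared distances to RBF centers
    phi = np.exp(-d2 / (2 * width ** 2))       # Gaussian RBF activations
    return weights @ phi                       # satellite neuron states

# A simulated "motion": satellite trajectories over 10 s
traj = np.array([satellite_states(t) for t in np.linspace(0, 10, 200)])
```

    Because the centers are low-dimensional and periodic, a small number of observed trajectories (markers) suffices in principle to recover the frequencies, which is the property the abstract exploits for sparse marker settings.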

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saide, Pablo E.; Peterson, David A.; de Silva, Arlindo

    We couple airborne, ground-based, and satellite observations; conduct regional simulations; and develop and apply an inversion technique to constrain hourly smoke emissions from the Rim Fire, the third largest observed in California, USA. Emissions constrained with multiplatform data show notable nocturnal enhancements (sometimes over a factor of 20), correlate better with daily burned area data, and are a factor of 2–4 higher than a priori estimates, highlighting the need for improved characterization of diurnal profiles and day-to-day variability when modeling extreme fires. Constraining only with satellite data results in smaller enhancements, mainly due to missing retrievals near the emissions source, suggesting that top-down emission estimates for these events could be underestimated and a multiplatform approach is required to resolve them. Predictions driven by emissions constrained with multiplatform data present significant variations in downwind air quality and in aerosol feedback on meteorology, emphasizing the need for improved emissions estimates during exceptional events.

  1. Satellite techniques yield insight into devastating rainfall from Hurricane Mitch

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Vicente, G.; Ba, M.; Gruber, A.; Scofield, R.; Li, Q.; Weldon, R.

    Hurricane Mitch may prove to be one of the most devastating tropical cyclones to affect the western hemisphere. Heavy rains over Central America from October 28, 1998, to November 1, 1998, caused widespread flooding and mud slides in Nicaragua and Honduras, resulting in thousands of deaths and missing persons. News reports described entire towns being swept away, destruction of national economies and infrastructure, and widespread disease in the aftermath of the storm, which some estimates suggested dropped as much as 1300 mm of rain. However, in view of the widespread damage, it is difficult to determine the actual amounts and distribution of rainfall. More accurate means of determining the rainfall associated with Mitch are vital for diagnosing and understanding the evolution of this disaster and for developing new mitigation strategies for future tropical cyclones. Satellite data may prove to be a reliable resource for accurate rainfall analysis and have yielded apparently reliable figures for Hurricane Mitch.

  2. Centralized Networks to Generate Human Body Motions.

    PubMed

    Vakulenko, Sergei; Radulescu, Ovidiu; Morozov, Ivan; Weber, Andres

    2017-12-14

    We consider continuous-time recurrent neural networks as dynamical models for the simulation of human body motions. These networks consist of a few centers and many satellites connected to them. The centers evolve in time as periodical oscillators with different frequencies. The center states define the satellite neurons' states by a radial basis function (RBF) network. To simulate different motions, we adjust the parameters of the RBF networks. Our network includes a switching module that allows for turning from one motion to another. Simulations show that this model allows us to simulate complicated motions consisting of many different dynamical primitives. We also use the model for learning human body motion from markers' trajectories. We find that center frequencies can be learned from a small number of markers and can be transferred to other markers, such that our technique seems to be capable of correcting for missing information resulting from sparse control marker settings.

  3. Space debris tracking at San Fernando laser station

    NASA Astrophysics Data System (ADS)

    Catalán, M.; Quijano, M.; Pazos, A.; Martín Davila, J.; Cortina, L. M.

    2016-12-01

    For years to come, space debris will be a major issue for society. It has a negative impact on active artificial satellites and implications for future missions. Tracking space debris as accurately as possible is the first step towards controlling this problem, yet it presents a challenge for science. The main limitation is the relatively low accuracy of the methods used to date for tracking these objects; clearly, improving the predicted orbit accuracy is crucial to avoiding unnecessary anti-collision maneuvers. Our satellite laser ranging station recently opened a new field of research: tracking decommissioned artificial satellites equipped with retroreflectors. To this end we work in conjunction with international space agencies, which are paying increasing attention to this problem. We have thus proposed to share the station's observing schedule so as to obtain data that would make orbital element predictions far more accurate (metre-level accuracy), while maintaining our tracking routines for active satellites. This manuscript reports on the actions carried out so far.

  4. A preliminary assessment of GPM-based multi-satellite precipitation estimates over a monsoon dominated region

    NASA Astrophysics Data System (ADS)

    Prakash, Satya; Mitra, Ashis K.; AghaKouchak, Amir; Liu, Zhong; Norouzi, Hamidreza; Pai, D. S.

    2018-01-01

    Following the launch of the Global Precipitation Measurement (GPM) Core Observatory, two advanced high-resolution multi-satellite precipitation products, namely Integrated Multi-satellitE Retrievals for GPM (IMERG) and Global Satellite Mapping of Precipitation (GSMaP) version 6, were released. A critical evaluation of these newly released precipitation datasets is very important for both end users and data developers. This study provides a comprehensive assessment of the IMERG research product and GSMaP estimates over India at a daily scale for the southwest monsoon season (June to September 2014). The GPM-based precipitation products are inter-compared with the widely used TRMM Multi-satellite Precipitation Analysis (TMPA) and with gauge-based observations over India. Results show that the IMERG estimates represent the mean monsoon rainfall and its variability more realistically than the gauge-adjusted TMPA and GSMaP data. However, GSMaP has a relatively smaller root-mean-square error than IMERG and TMPA, especially over the low mean rainfall regimes and along the west coast of India. An entropy-based approach is employed to evaluate the distributions of the selected precipitation products. The results indicate that the distribution of precipitation in IMERG and GSMaP has improved markedly, especially for low precipitation rates. IMERG shows a clear improvement in missed and false precipitation bias over India. However, all three satellite-based rainfall estimates show notably smaller correlation coefficients, larger RMSE, and larger negative total and hit biases over northeast India, where precipitation is dominated by orographic effects. Similarly, the three satellite-based estimates show larger false precipitation over southeast peninsular India, which is a rain-shadow region. The categorical verification confirms that these satellite-based rainfall estimates have difficulty detecting rain over the southeast peninsula and northeast India. These preliminary results need to be confirmed for other monsoon seasons in future studies, when fully retrospectively processed GPM-based IMERG data prior to 2014 become available.

  5. Satellite recovery - Attitude dynamics of the targets

    NASA Technical Reports Server (NTRS)

    Cochran, J. E., Jr.; Lahr, B. S.

    1986-01-01

    The problems of categorizing and modeling the attitude dynamics of uncontrolled artificial earth satellites which may be targets in recovery attempts are addressed. Methods of classification presented are based on satellite rotational kinetic energy, rotational angular momentum and orbit and on the type of control present prior to the benign failure of the control system. The use of approximate analytical solutions and 'exact' numerical solutions to the equations governing satellite attitude motions to predict uncontrolled attitude motion is considered. Analytical and numerical results are presented for the evolution of satellite attitude motions after active control termination.

  6. Tools for Understanding Space Weather Impacts to Satellites

    NASA Astrophysics Data System (ADS)

    Green, J. C.; Shprits, Y.; Likar, J. J.; Kellerman, A. C.; Quinn, R. A.; Whelan, P.; Reker, N.; Huston, S. L.

    2017-12-01

    Space weather causes dramatic changes in the near-Earth radiation environment. Intense particle fluxes can damage electronic components on satellites, causing temporary malfunctions, degraded performance, or a complete system/mission loss. Understanding whether space weather is the cause of such problems expedites investigations and guides successful design improvements resulting in a more robust satellite architecture. Here we discuss our progress in developing tools for satellite designers, manufacturers, and decision makers - tools that summarize space weather impacts to specific satellite assets and enable confident identification of the cause and right solution.

  7. A recursively formulated first-order semianalytic artificial satellite theory based on the generalized method of averaging. Volume 1: The generalized method of averaging applied to the artificial satellite problem

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.

    1977-01-01

    A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order average equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.

  8. Reading Profiles in Multi-Site Data With Missingness.

    PubMed

    Eckert, Mark A; Vaden, Kenneth I; Gebregziabher, Mulugeta

    2018-01-01

    Children with reading disability exhibit varied deficits in reading and cognitive abilities that contribute to their reading comprehension problems. Some children exhibit primary deficits in phonological processing, while others exhibit deficits in oral language and executive functions that affect comprehension. This behavioral heterogeneity is problematic when missing data prevent the characterization of different reading profiles, which often occurs in retrospective data sharing initiatives without coordinated data collection. Here we show that reading profiles can be reliably identified based on Random Forest classification of incomplete behavioral datasets, after the missForest method is used to multiply impute missing values. Results from simulation analyses showed that reading profiles could be accurately classified across degrees of missingness (e.g., ∼5% classification error for 30% missingness across the sample). The application of missForest to a real multi-site dataset with missingness (n = 924) showed that reading disability profiles significantly and consistently differed in reading and cognitive abilities for cases with and without missing data. The results of validation analyses indicated that the reading profiles (cases with and without missing data) exhibited significant differences on an independent set of behavioral variables that were not used to classify reading profiles. Together, the results show how multiple imputation can be applied to the classification of cases with missing data and can increase the integrity of results from multi-site open-access datasets.
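
    The pipeline described (random-forest-based imputation of an incomplete behavioral dataset, followed by Random Forest classification of profiles) can be approximated in scikit-learn, where missForest's iterative scheme is mimicked by `IterativeImputer` driven by random forest regressors. A hedged sketch on simulated two-profile data, not the study's dataset or its exact R pipeline:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

rng = np.random.default_rng(0)
n = 300
# Two latent "reading profiles" differing in mean scores on 4 measures
labels = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, 4)) + labels[:, None] * 2.0

# Knock out ~30% of entries completely at random
mask = rng.random(X.shape) < 0.3
X_missing = X.copy()
X_missing[mask] = np.nan

# missForest-style imputation: iteratively regress each variable on the
# others with a random forest, filling in predicted values
imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=30, random_state=0),
    max_iter=5, random_state=0)
X_imp = imputer.fit_transform(X_missing)

# Classify profiles from the imputed data
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_imp, labels)
acc = clf.score(X_imp, labels)  # training accuracy on the imputed data
```

    In practice one would evaluate with held-out data and, as in the paper, validate the recovered profiles against variables not used for classification.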

  9. Controlling misses and false alarms in a machine learning framework for predicting uniformity of printed pages

    NASA Astrophysics Data System (ADS)

    Nguyen, Minh Q.; Allebach, Jan P.

    2015-01-01

    In our previous work [1], we presented a block-based technique to analyze printed page uniformity both visually and metrically. The features learned from the models were then employed in a Support Vector Machine (SVM) framework to classify pages into one of two categories: acceptable and unacceptable quality. In this paper, we introduce a set of tools for machine learning in the assessment of printed page uniformity. This work is primarily targeted at the printing industry, specifically the ubiquitous laser electrophotographic printer. We use features that are well correlated with the rankings of expert observers to develop a novel machine learning framework that allows one to achieve the minimum "false alarm" rate, subject to a chosen "miss" rate. Surprisingly, most research on machine learning does not consider this framework. During the process of developing a new product, test engineers will print hundreds of test pages, which can be scanned and then analyzed by an autonomous algorithm. Among these pages, most may be of acceptable quality. The objective is to find the ones that are not; these provide critically important information to systems designers regarding issues that need to be addressed in improving the printer design. A "miss" is defined to be a page that is not of acceptable quality to an expert observer but that the prediction algorithm declares to be a "pass". Misses are a serious problem, since they represent problems that will not be seen by the systems designers. On the other hand, "false alarms" correspond to pages that an expert observer would declare to be of acceptable quality, but which are flagged by the prediction algorithm as "fails". In a typical printer testing and development scenario, such pages would be examined by an expert and found to be of acceptable quality after all. "False alarm" pages result in extra pages to be examined by expert observers, which increases labor cost. 
But "false alarms" are not nearly as catastrophic as "misses", which represent potentially serious problems that are never seen by the systems developers. This scenario motivates us to develop a machine learning framework that will achieve the minimum "false alarm" rate subject to a specified "miss" rate. In order to construct such a set of receiver operating characteristic (ROC) curves [2], we examine various tools for the prediction, ranging from an exhaustive search over the space of nonlinear discriminants to a Cost-Sensitive SVM framework [3]. We then compare the curves obtained from these methods. Our work shows promise for applying a standard framework to obtain a full ROC curve when tackling other machine learning problems in industry.
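
    The "minimum false-alarm rate subject to a miss-rate ceiling" idea can be sketched with a cost-sensitive SVM: sweep the misclassification cost on the "fail" class, trace the resulting (miss, false-alarm) operating points, and keep the lowest false-alarm point that meets the miss constraint. This is a hedged toy sketch on synthetic two-feature "page quality" data, not the authors' features, classifier, or cost-sensitive formulation:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, size=n)                  # 1 = unacceptable page ("fail")
X = rng.normal(size=(n, 2)) + y[:, None] * 1.5  # partially overlapping classes

max_miss = 0.1  # tolerated miss rate (true fails declared "pass")
best = None
for w in [1, 2, 5, 10, 20, 50, 100]:            # cost of missing a true "fail"
    clf = SVC(kernel="rbf", class_weight={1: w}).fit(X, y)
    pred = clf.predict(X)                        # training-set operating point
    miss = np.mean(pred[y == 1] == 0)            # fails declared "pass"
    fa = np.mean(pred[y == 0] == 1)              # good pages flagged "fail"
    if miss <= max_miss and (best is None or fa < best[2]):
        best = (w, miss, fa)                     # lowest FA meeting the ceiling
```

    Plotting (miss, fa) across the cost sweep traces the ROC-style curve the paper describes; heavier costs on the "fail" class push the miss rate down at the price of more false alarms.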

  10. Missing Oral Health-Related Data in the interRAI-HC - Associations with Selected Variables of General Health and the Effect of Multiple Imputation on the Relationship between Oral and General Health.

    PubMed

    Krausch-Hofmann, Stefanie; Bogaerts, Kris; Hofmann, Michael; de Almeida Mello, Johanna; Fávaro Moreira, Nádia Cristina; Lesaffre, Emmanuel; Declerck, Dominique; Declercq, Anja; Duyck, Joke

    2015-01-01

    Missing data within the comprehensive geriatric assessment of the interRAI suite of assessment instruments potentially imply the under-detection of conditions that require care, as well as the risk of biased statistical results. Impaired oral health in older individuals has to be registered accurately, as it causes pain and discomfort and is related to general health status. This study was based on interRAI-Home Care (HC) baseline data from 7590 subjects (mean age 81.2 years, SD 6.9) in Belgium. It was investigated whether missingness of the oral health-related items was associated with selected variables of general health. It was also determined whether multiple imputation of missing data affected the associations between oral and general health. Multivariable logistic regression was used to determine whether the prevalence of missingness in the oral health-related variables was associated with activities of daily life (ADLH), cognitive performance (CPS2) and depression (DRS). Associations between oral health and ADLH, CPS2 and DRS were determined, with missing data treated by (1) the complete-case technique and (2) multiple imputation, and the results were compared. The individual oral health-related variables had a similar proportion of missing values, ranging from 16.3% to 17.2%. The prevalence of missing data in all oral health-related variables was significantly associated with symptoms of depression (dental prosthesis use OR 1.66, CI 1.41-1.95; damaged teeth OR 1.74, CI 1.48-2.04; chewing problems OR 1.74, CI 1.47-2.05; dry mouth OR 1.65, CI 1.40-1.94). Missingness in damaged teeth (OR 1.27, CI 1.08-1.48), chewing problems (OR 1.22, CI 1.04-1.44) and dry mouth (OR 1.23, CI 1.05-1.44) occurred more frequently in cognitively impaired subjects. ADLH was not associated with the prevalence of missing data. 
When comparing the complete-case technique with the multiple imputation approach, nearly identical odds ratios characterized the associations between oral and general health. Cognitively impaired and depressive individuals had a higher risk of missing oral health-related information. Associations between oral health and ADLH, CPS2 and DRS were not influenced by multiple imputation of missing data. Further research should concentrate on the mechanisms that mediate the occurrence of missingness to develop preventative strategies.

  11. A multi-satellite orbit determination problem in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Deakyne, M. S.; Anderle, R. J.

    1988-01-01

    The Engineering Orbit Analysis Unit at GE Valley Forge used an Intel Hypercube parallel processor to investigate the performance of parallel processors, and to gain experience with them, on a multi-satellite orbit determination problem. A general study was selected in which major blocks of computation for the multi-satellite orbit computations were used as units to be assigned to the various processors on the Hypercube, so that problems encountered or successes achieved in addressing the orbit determination problem would be more likely to be transferable to other parallel processors. The prime objective was to study the algorithm to allow processing of observations later in time than those employed in the state update. Expertise in ephemeris determination was exploited in addressing these problems, and the facility was used to bring a realism to the study that would highlight problems which might not otherwise be anticipated. Secondary objectives were to gain experience with a non-trivial problem in a parallel processing environment; to explore the necessary interplay of serial and parallel sections of the algorithm in terms of timing studies; and to explore granularity (coarse vs. fine grain), in order to discover the upper limit above which there would be a risk of starvation, with the majority of nodes idle, and the lower limit below which the overhead associated with splitting the problem may require more work and communication time than is useful.

  12. Data acquisition and processing history for the Explorer 33 (AIMP-D) satellite

    NASA Technical Reports Server (NTRS)

    Karras, T. J.

    1972-01-01

    The quality control monitoring system, using accounting and quality control data bases, made it possible to perform an in-depth analysis. Results show that the percentage of usable data files for experimenter analysis was 97.7%; only 0.4% of the data sequences supplied to the experimenter exhibited missing data. The 50th-percentile probability delay values (referenced to station record data) indicate that the analog tapes arrived within 11 days, the data were digitized within 4.2 weeks, and the experimenter tapes were delivered in 8.95 weeks or less.

  13. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or asteroid. Scientists seek to understand the origin and evolution of our solar system by studying the moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in the summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe applications of various image processing and computational algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore, we hope that this communication will enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared to perform the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.

  14. Breast Cancer and Modifiable Lifestyle Factors in Argentinean Women: Addressing Missing Data in a Case-Control Study

    PubMed Central

    Coquet, Julia Becaria; Tumas, Natalia; Osella, Alberto Ruben; Tanzi, Matteo; Franco, Isabella; Diaz, Maria Del Pilar

    2016-01-01

    A number of studies have evidenced the effect of modifiable lifestyle factors such as diet, breastfeeding and nutritional status on breast cancer risk. However, none have addressed the missing data problem in nutritional epidemiologic research in South America. Missing data is a frequent problem in breast cancer studies and epidemiological settings in general. Estimates of effect obtained from these studies may be biased if no appropriate method for handling missing data is applied. We performed multiple imputation for missing values on covariates in a breast cancer case-control study of Córdoba (Argentina) to optimize risk estimates. Data were obtained from a breast cancer case-control study conducted from 2008 to 2015 (318 cases, 526 controls). Complete-case analysis and multiple imputation using chained equations were the methods applied to estimate the effects of a Traditional dietary pattern and other recognized factors associated with breast cancer. Physical activity and socioeconomic status were imputed. Logistic regression models were performed. When complete-case analysis was performed, only 31% of the women were considered. Although a positive association of the Traditional dietary pattern with breast cancer was observed with both approaches (complete-case analysis OR=1.3, 95%CI=1.0-1.7; multiple imputation OR=1.4, 95%CI=1.2-1.7), effects of other covariates, like BMI and breastfeeding, were only identified when multiple imputation was considered. A Traditional dietary pattern, BMI and breastfeeding are associated with the occurrence of breast cancer in this Argentinean population when multiple imputation is appropriately performed. Multiple imputation is recommended for epidemiologic studies in Latin America to optimize effect estimates in the future. PMID:27892664
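
As a toy illustration of the contrast the study draws, the sketch below compares complete-case analysis of a mean with a deliberately simplified multiple imputation that resamples observed values. This is only the impute-analyse-pool cycle in miniature; the study itself used multiple imputation by chained equations over several covariates within logistic regression models, and the function names and data here are made up:

```python
import random
import statistics

def complete_case_mean(values):
    # Complete-case analysis: drop every record with a missing (None)
    # entry and analyse only what remains.
    return statistics.mean(v for v in values if v is not None)

def multiple_imputation_mean(values, m=20, seed=0):
    # Toy multiple imputation: fill each missing entry by drawing from
    # the observed values, repeat m times to reflect imputation
    # uncertainty, then pool the m estimates.
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    estimates = []
    for _ in range(m):
        completed = [v if v is not None else rng.choice(observed)
                     for v in values]
        estimates.append(statistics.mean(completed))
    return statistics.mean(estimates)  # pooled point estimate

data = [2.0, None, 3.0, 4.0, None, 3.5]   # hypothetical covariate
cc = complete_case_mean(data)             # uses 4 of 6 records
mi = multiple_imputation_mean(data)       # uses all 6 records
```

With real data the two can diverge substantially when missingness is not completely at random, which is exactly why the study compares the two approaches.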

  15. Harnessing data structure for recovery of randomly missing structural vibration responses time history: Sparse representation versus low-rank structure

    NASA Astrophysics Data System (ADS)

    Yang, Yongchao; Nagarajaiah, Satish

    2016-06-01

    Randomly missing data in structural vibration response time histories often occur in structural dynamics and health monitoring. For example, structural vibration responses are often corrupted by outliers or erroneous measurements due to sensor malfunction; in wireless sensing platforms, data loss during wireless communication is a common issue. Besides, to alleviate the wireless data sampling or communication burden, certain amounts of data are often discarded during sampling or before transmission. In these and other applications, recovery of the randomly missing structural vibration responses from the available, incomplete data is essential for system identification and structural health monitoring; it is, however, an ill-posed inverse problem. This paper explicitly harnesses the structure of the data itself (the structural vibration responses) to address this inverse problem. What is relevant is an empirical, but often practically true, observation: typically there are only a few modes active in the structural vibration responses; hence a sparse representation (in the frequency domain) of the single-channel data vector, or a low-rank structure (by singular value decomposition) of the multi-channel data matrix. Exploiting such prior knowledge of the data structure (intra-channel sparse or inter-channel low-rank), the new theories of ℓ1-minimization sparse recovery and nuclear-norm-minimization low-rank matrix completion enable recovery of the randomly missing or corrupted structural vibration response data. The performance of these two alternatives, in terms of recovery accuracy and computational time under different data missing rates, is investigated on a few structural vibration response data sets: the seismic responses of the super high-rise Canton Tower and the structural health monitoring accelerations of a real large-scale cable-stayed bridge. Encouraging results are obtained, and the applicability and limitations of the presented methods are discussed.
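
For intuition about the multi-channel low-rank route, the sketch below completes a rank-1 matrix by alternating least squares on the observed entries. This is a toy stand-in, not the nuclear-norm minimization the paper uses, and assumes the missing entries are marked `None`:

```python
def complete_rank1(M, iters=50):
    # Fill missing (None) entries of a matrix assumed to be
    # approximately rank-1, i.e. M[i][j] ~ u[i]*v[j], by alternating
    # least squares fitted only to the observed entries. With one
    # dominant mode, the channels of a multi-channel vibration record
    # are (idealised here as) scaled copies of each other.
    m, n = len(M), len(M[0])
    u, v = [1.0] * m, [1.0] * n
    for _ in range(iters):
        for i in range(m):
            num = sum(M[i][j] * v[j] for j in range(n) if M[i][j] is not None)
            den = sum(v[j] ** 2 for j in range(n) if M[i][j] is not None)
            if den:
                u[i] = num / den
        for j in range(n):
            num = sum(M[i][j] * u[i] for i in range(m) if M[i][j] is not None)
            den = sum(u[i] ** 2 for i in range(m) if M[i][j] is not None)
            if den:
                v[j] = num / den
    return [[u[i] * v[j] for j in range(n)] for i in range(m)]

# Three channels observing the same mode with different gains; one
# sample lost in channel 2. The true matrix is u=(1,2,3) x v=(1,2,3).
M = [[1.0, 2.0, 3.0],
     [2.0, None, 6.0],
     [3.0, 6.0, 9.0]]
filled = complete_rank1(M)   # filled[1][1] recovers ~4.0
```

The nuclear-norm approach generalises this idea: instead of fixing the rank in advance, it finds the minimum-nuclear-norm matrix consistent with the observed entries.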

  16. Energy-Efficient Optimal Power Allocation in Integrated Wireless Sensor and Cognitive Satellite Terrestrial Networks

    PubMed Central

    Li, Guangxia; An, Kang; Gao, Bin; Zheng, Gan

    2017-01-01

    This paper proposes novel satellite-based wireless sensor networks (WSNs), which integrate the WSN with the cognitive satellite terrestrial network. With their ability to provide seamless network access and alleviate spectrum scarcity, cognitive satellite terrestrial networks are considered a promising candidate for future wireless networks, with emerging requirements for ubiquitous broadband applications and increasing demand for spectral resources. With the emerging environmental and energy cost concerns in communication systems, explicit concerns about energy-efficient resource allocation in satellite networks have also recently received considerable attention. In this regard, this paper proposes energy-efficient optimal power allocation schemes in cognitive satellite terrestrial networks for non-real-time and real-time applications, respectively, which maximize the energy efficiency (EE) of the cognitive satellite user while keeping the interference at the primary terrestrial user below an acceptable level. Specifically, an average interference power (AIP) constraint is employed to protect the communication quality of the primary terrestrial user, while an average transmit power (ATP) or peak transmit power (PTP) constraint is adopted to regulate the transmit power of the satellite user. Since the energy-efficient power allocation optimization problem belongs to the class of nonlinear concave fractional programming problems, we solve it by combining Dinkelbach's method with the Lagrange duality method. Simulation results demonstrate that the fading severity of the terrestrial interference link is favorable to the satellite user, which can achieve an EE gain under the ATP constraint compared to the PTP constraint. PMID:28869546
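
The fractional structure described here (a rate divided by total power consumed) is what Dinkelbach's method handles: it turns max f(x)/g(x) into a sequence of parametric problems max f(x) - λ·g(x). A minimal sketch with made-up numbers; the grid search stands in for the paper's Lagrange-duality inner solver, and the channel gain and circuit power are illustrative assumptions:

```python
import math

def dinkelbach(f, g, grid, tol=1e-9, max_iter=100):
    # Dinkelbach's method for max f(x)/g(x) with g > 0: repeatedly
    # solve the parametric problem max_x f(x) - lam*g(x) (here by grid
    # search) and update lam to the achieved ratio; a parametric
    # optimum of ~0 certifies that lam is the maximal ratio.
    lam = 0.0
    for _ in range(max_iter):
        x = max(grid, key=lambda p: f(p) - lam * g(p))
        gap = f(x) - lam * g(x)
        lam = f(x) / g(x)
        if abs(gap) < tol:
            break
    return x, lam

# Toy energy-efficiency problem: rate f(p) = log(1 + h*p) over total
# power g(p) = p + p_circuit, with transmit power p on [0, p_max].
h, p_circuit, p_max = 4.0, 0.5, 10.0
grid = [i * p_max / 10000 for i in range(10001)]
p_star, ee = dinkelbach(lambda p: math.log1p(h * p),
                        lambda p: p + p_circuit, grid)
# p_star lands near 0.65, and ee (the maximised ratio) near 1.11
```

The paper's schemes additionally enforce the AIP and ATP/PTP constraints inside the parametric subproblem, which is where the Lagrange duality machinery comes in.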

  17. Contribution of Equal-Sign Instruction beyond Word-Problem Tutoring for Third-Grade Students with Mathematics Difficulty

    ERIC Educational Resources Information Center

    Powell, Sarah R.; Fuchs, Lynn S.

    2010-01-01

    Elementary school students often misinterpret the equal sign (=) as an operational rather than a relational symbol. Such misunderstanding is problematic because solving equations with missing numbers may be important for the development of higher order mathematics skills, including solving word problems. Research indicates equal-sign instruction…

  18. Language and Communication-Related Problems of Aviation Safety.

    ERIC Educational Resources Information Center

    Cushing, Steven

    A study of the problems posed by the use of natural language in various aspects of aviation is presented. The study, part of a larger investigation of the feasibility of voice input/output interfaces for communication in aviation, looks at representative real examples of accidents and near misses resulting from language confusions and omissions.…

  19. Addressing the Curriculum Problem in Doctoral Education

    ERIC Educational Resources Information Center

    Green, Bill

    2012-01-01

    How best to understand the curriculum problem in doctoral research education: that is the question that this paper engages. It begins by noting that curriculum as such is little referenced and inadequately theorised in higher education and certainly in doctoral education, and indeed has been described as a "missing term". The paper then…

  20. Improved satellite constellations for CONUS ATC coverage

    DOT National Transportation Integrated Search

    1974-05-01

    The report examines the problem of designing a constellation of orbiting satellites capable of supporting an aircraft navigation/surveillance service over CONUS. It is assumed that the aircraft positions are determined by hyperbolic multilateration u...

  1. Persons with dementia missing in the community: is it wandering or something unique?

    PubMed

    Rowe, Meredeth A; Vandeveer, Sydney S; Greenblum, Catherine A; List, Cassandra N; Fernandez, Rachael M; Mixson, Natalie E; Ahn, Hyo C

    2011-06-05

    At some point in the disease process many persons with dementia (PWD) will have a missing incident and be unable to safely return to their care setting. In previous research studies, researchers have begun to question whether this phenomenon should continue to be called wandering, since the antecedents and characteristics of a missing incident are dissimilar to accepted definitions of wandering in dementia. The purpose of this study was to confirm previous findings regarding the antecedents and characteristics of missing incidents, understand the differences between those found dead and alive, and compare the characteristics of a missing incident to those of wandering. A retrospective design was used to analyse 325 newspaper reports of PWD missing in the community. The primary antecedent to a missing incident, particularly in community-dwelling PWD, was becoming lost while conducting a normal and permitted activity alone in the community. The other common antecedent was a lapse in supervision, with the expectation that the PWD would remain in a safe location, which they did not. Deaths most commonly occurred in unpopulated areas due to exposure and drowning. Those who died were found closer to the place last seen and took longer to find, but there were no significant differences in gender or age. The key characteristics of a missing incident were that it was unpredictable, non-repetitive, temporally appropriate but spatially disordered, and occurred while using multiple means of movement (walking, car, public transportation). Missing incidents occurred without the discernible pattern present in wandering, such as lapping or pacing, which is repetitive and temporally disordered. This research supports the mounting evidence that the concept of wandering, in its formal sense, and missing incidents are two distinct concepts. It will be important to further develop the concept of missing incidents by identifying the differences and similarities from wandering. 
This will allow a more targeted assessment and intervention strategy for each problem.

  2. Persons with dementia missing in the community: Is it wandering or something unique?

    PubMed Central

    2011-01-01

    Background At some point in the disease process many persons with dementia (PWD) will have a missing incident and be unable to safely return to their care setting. In previous research studies, researchers have begun to question whether this phenomenon should continue to be called wandering, since the antecedents and characteristics of a missing incident are dissimilar to accepted definitions of wandering in dementia. The purpose of this study was to confirm previous findings regarding the antecedents and characteristics of missing incidents, understand the differences between those found dead and alive, and compare the characteristics of a missing incident to those of wandering. Methods A retrospective design was used to analyse 325 newspaper reports of PWD missing in the community. Results The primary antecedent to a missing incident, particularly in community-dwelling PWD, was becoming lost while conducting a normal and permitted activity alone in the community. The other common antecedent was a lapse in supervision, with the expectation that the PWD would remain in a safe location, which they did not. Deaths most commonly occurred in unpopulated areas due to exposure and drowning. Those who died were found closer to the place last seen and took longer to find, but there were no significant differences in gender or age. The key characteristics of a missing incident were that it was unpredictable, non-repetitive, temporally appropriate but spatially disordered, and occurred while using multiple means of movement (walking, car, public transportation). Missing incidents occurred without the discernible pattern present in wandering, such as lapping or pacing, which is repetitive and temporally disordered. Conclusions This research supports the mounting evidence that the concept of wandering, in its formal sense, and missing incidents are two distinct concepts. It will be important to further develop the concept of missing incidents by identifying the differences and similarities from wandering. 
This will allow a more targeted assessment and intervention strategy for each problem. PMID:21639942

  3. Developing a framework to review near-miss maternal morbidity in India: a structured review and key stakeholder analysis.

    PubMed

    Bhattacharyya, Sanghita; Srivastava, Aradhana; Knight, Marian

    2014-11-13

    In India there is a thrust towards promoting institutional delivery, resulting in problems of overcrowding and compromised quality of care. Review of near-miss obstetric events has been suggested as a useful way to investigate health system functioning, complementing maternal death reviews. The aim of this project was to identify the key elements required for a near-miss review programme for India. A structured review was conducted to identify methods used in assessing near-miss cases. The findings of the structured review were used to develop a suggested framework for conducting near-miss reviews in India. A pool of experts in near-miss review methods in low- and middle-income countries (LMICs) was identified for vetting the framework developed. Opinions were sought about the feasibility of implementing near-miss reviews in India, the processes to be followed, the factors that made implementation successful and the associated challenges. A draft of the framework was revised based on the experts' opinions. Five broad methods of near-miss case review/audit were identified: facility-based near-miss case review, confidential enquiries, criterion-based clinical audit, structured case review (the South African model) and home-based interviews. The opinions of the 11 stakeholders highlighted that the methods a facility adopts should depend on the type and number of cases the facility handles, the availability and maintenance of a good documentation system, and local leadership and commitment of staff. A proposed framework for conducting near-miss reviews was developed that included a combination of criterion-based clinical audit and near-miss review methods. The approach allowed for the development of a framework for researchers and planners seeking to improve the quality of maternal care not only at the facility level but also beyond, encompassing community health workers and referral. 
Further work is needed to evaluate the implementation of this framework to determine its efficacy in improving the quality of care and hence maternal and perinatal morbidity and mortality.

  4. Development of a PC-based ground support system for a small satellite instrument

    NASA Astrophysics Data System (ADS)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.

  5. The fusion of satellite and UAV data: simulation of high spatial resolution band

    NASA Astrophysics Data System (ADS)

    Jenerowicz, Agnieszka; Siok, Katarzyna; Woroszkiewicz, Malgorzata; Orych, Agata

    2017-10-01

    Remote sensing techniques used in precision agriculture and farming that apply imagery data obtained with sensors mounted on UAV platforms have become more popular in the last few years due to the availability of low-cost UAV platforms and low-cost sensors. Data obtained from low altitudes with low-cost sensors can be characterised by high spatial and radiometric resolution but rather low spectral resolution; therefore the application of imagery data obtained with such technology is quite limited and can be used only for basic land cover classification. To enrich the spectral resolution of imagery data acquired with low-cost sensors from low altitudes, the authors proposed the fusion of RGB data obtained with a UAV platform with multispectral satellite imagery. The fusion is based on the pansharpening process, which aims to integrate the spatial details of the high-resolution panchromatic image with the spectral information of lower-resolution multispectral or hyperspectral imagery to obtain multispectral or hyperspectral images with high spatial resolution. The key to pansharpening is to properly estimate the missing spatial details of the multispectral images while preserving their spectral properties. In this research, the authors present the fusion of RGB images (with high spatial resolution) obtained with sensors mounted on low-cost UAV platforms and multispectral imagery acquired with satellite sensors, i.e., Landsat 8 OLI. To perform the fusion of UAV data with satellite imagery, a simulation of the panchromatic band from the RGB data, based on a linear combination of the spectral channels, was conducted. Next, for the simulated bands and multispectral satellite images, the Gram-Schmidt pansharpening method was applied. As a result of the fusion, the authors obtained several multispectral images with very high spatial resolution and then analysed the spatial and spectral accuracies of the processed images.
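
The two processing steps, simulating a panchromatic band as a linear combination of the RGB channels and then sharpening the multispectral bands with it, can be sketched as follows. The luma-style weights and the Brovey-style fusion are illustrative assumptions: the paper derives its own band combination and applies Gram-Schmidt pansharpening:

```python
def simulate_pan(rgb, weights=(0.299, 0.587, 0.114)):
    # Simulate a panchromatic band as a weighted sum of the RGB
    # channels. These are the common luma coefficients, used here only
    # for illustration; matching a real sensor's spectral response
    # requires fitting the weights to that sensor.
    return [[sum(w * c for w, c in zip(weights, px)) for px in row]
            for row in rgb]

def brovey_sharpen(ms_bands, pan):
    # Brovey-style fusion, a simple stand-in for Gram-Schmidt: scale
    # each multispectral band by pan/intensity, so spatial detail comes
    # from pan while the ratios between bands (their spectral
    # character) are preserved. Assumes nonzero intensity everywhere.
    rows, cols = len(pan), len(pan[0])
    intensity = [[sum(b[i][j] for b in ms_bands) / len(ms_bands)
                  for j in range(cols)] for i in range(rows)]
    return [[[b[i][j] * pan[i][j] / intensity[i][j]
              for j in range(cols)] for i in range(rows)]
            for b in ms_bands]

# One-pixel toy: pan simulated from a pure-white RGB pixel (~1.0,
# since the weights sum to 1), then two MS bands sharpened.
pan = simulate_pan([[(1.0, 1.0, 1.0)]])
sharp = brovey_sharpen([[[2.0]], [[4.0]]], [[6.0]])
```

Gram-Schmidt replaces this per-pixel ratio with an orthogonalisation of the bands against a simulated low-resolution pan component, but the ingredients (a simulated pan band and an intensity-like component) are the same.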

  6. Design, implementation and reporting strategies to reduce the instance and impact of missing patient-reported outcome (PRO) data: a systematic review.

    PubMed

    Mercieca-Bebber, Rebecca; Palmer, Michael J; Brundage, Michael; Calvert, Melanie; Stockler, Martin R; King, Madeleine T

    2016-06-15

    Patient-reported outcomes (PROs) provide important information about the impact of treatment from the patients' perspective. However, missing PRO data may compromise the interpretability and value of the findings. We aimed to report: (1) a non-technical summary of problems caused by missing PRO data; and (2) a systematic review by collating strategies to: (A) minimise rates of missing PRO data, and (B) facilitate transparent interpretation and reporting of missing PRO data in clinical research. Our systematic review does not address statistical handling of missing PRO data. MEDLINE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases (inception to 31 March 2015), and citing articles and reference lists from relevant sources. English articles providing recommendations for reducing missing PRO data rates, or strategies to facilitate transparent interpretation and reporting of missing PRO data were included. 2 reviewers independently screened articles against eligibility criteria. Discrepancies were resolved with the research team. Recommendations were extracted and coded according to framework synthesis. 117 sources (55% discussion papers, 26% original research) met the eligibility criteria. Design and methodological strategies for reducing rates of missing PRO data included: incorporating PRO-specific information into the protocol; carefully designing PRO assessment schedules and defining termination rules; minimising patient burden; appointing a PRO coordinator; PRO-specific training for staff; ensuring PRO studies are adequately resourced; and continuous quality assurance. Strategies for transparent interpretation and reporting of missing PRO data include utilising auxiliary data to inform analysis; transparently reporting baseline PRO scores, rates and reasons for missing data; and methods for handling missing PRO data. 
The instance of missing PRO data and its potential to bias clinical research can be minimised by implementing thoughtful design, rigorous methodology and transparent reporting strategies. All members of the research team have a responsibility in implementing such strategies.

  7. A hybrid frame concealment algorithm for H.264/AVC.

    PubMed

    Yan, Bo; Gharavi, Hamid

    2010-01-01

    In packet-based video transmissions, packet loss due to channel errors may result in the loss of a whole video frame. Recently, many error concealment algorithms have been proposed to combat channel errors; however, most of the existing algorithms can only deal with the loss of macroblocks and are not able to conceal a whole missing frame. In order to resolve this problem, in this paper we propose a new hybrid motion vector extrapolation (HMVE) algorithm to recover the whole missing frame, which is able to provide more accurate estimation of the motion vectors of the missing frame than other conventional methods. Simulation results show that it is highly effective and significantly outperforms other existing frame recovery methods.

  8. Are Expectations the Missing Link between Life History Strategies and Psychopathology?

    PubMed

    Kavanagh, Phillip S; Kahl, Bianca L

    2018-01-01

    Despite advances in knowledge and thinking about using life history theory to explain psychopathology, there is still a missing link. That is, we all have a life history strategy, but not all of us develop mental health problems. We propose that the missing link is expectations: a mismatch between the expected environmental conditions (including social conditions) set by variations in life history strategies and the current environmental conditions. The mismatch hypothesis has been applied at the biological level in terms of health and disease, and we believe that it can also be applied more broadly at the psychological level, in terms of perceived expectations in the social environment and the resulting distress (psychopathology) that manifests when our expectations are not met.

  9. Communications network design and costing model technical manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.

  10. Convolutional Neural Network for Multi-Source Deep Learning Crop Classification in Ukraine

    NASA Astrophysics Data System (ADS)

    Lavreniuk, M. S.

    2016-12-01

    Land cover and crop type maps are among the most essential inputs for environmental and agricultural monitoring tasks [1]. For a long time, the neural network (NN) approach was one of the most efficient and popular approaches for many applications, including crop classification from remote sensing data, achieving high overall accuracy (OA) [2]. In recent years, the most popular and efficient method for multi-sensor and multi-temporal land cover classification has been the convolutional neural network (CNN). To account for the presence of clouds in optical data, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of optical imagery from the Landsat-8 satellite. After missing-data restoration, the optical data from Landsat-8 were merged with Sentinel-1A radar data for better crop type discrimination [3]. An ensemble of CNNs is proposed for supervised classification of multi-temporal satellite images. Each CNN in the ensemble is a 1-D CNN with 4 layers, implemented using Google's TensorFlow library. The efficiency of the proposed approach was tested on a time series of Landsat-8 and Sentinel-1A images over the JECAM test site (Kyiv region) in Ukraine in 2015. The overall classification accuracy of the CNN ensemble was 93.5%, which outperformed an ensemble of multi-layer perceptrons (MLPs) by 0.8% and allowed us to better discriminate summer crops, in particular maize and soybeans. For 2016, we plan to validate this method using Sentinel-1 and Sentinel-2 data over the territory of Ukraine within the ESA country-level demonstration project Sen2Agri. 1. A. Kolotii et al., "Comparison of biophysical and satellite predictors for wheat yield forecasting in Ukraine," Int. Arch. Photogramm., Rem. Sens. and Spatial Inform. Sci., vol. 40, no. 7, pp. 39-44, 2015. 2. F. Waldner et al., "Towards a set of agrosystem-specific cropland mapping methods to address the global cropland diversity," Int. Journal of Rem. Sens., vol. 37, no. 14, pp. 3196-3231, 2016. 3. S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observ. and Rem. Sens., 2015, DOI: 10.1109/JSTARS.2015.2454297.
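
    As a much-simplified stand-in for the SOM-based restoration step described above, the sketch below fills a cloud-masked date in one pixel's time series using the reference series that is most similar on the jointly observed dates. The data and the nearest-neighbour rule are illustrative assumptions; the paper's actual method trains a Kohonen map on the imagery.

```python
# Illustrative gap-filling: replace a missing (cloudy) value in one pixel's
# time series with the value, at the same date, of the most similar
# fully/partly observed reference series. Hypothetical NDVI-like numbers.

def fill_missing(series, reference_series):
    """Replace None entries in `series` using the reference series with the
    smallest mean squared difference on jointly observed dates."""
    def msd(a, b):
        pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
        return sum((x - y) ** 2 for x, y in pairs) / len(pairs)

    filled = list(series)
    for t, v in enumerate(series):
        if v is None:
            best = min(reference_series, key=lambda r: msd(series, r))
            filled[t] = best[t]
    return filled

pixel = [0.12, None, 0.45, 0.60]      # series with one cloud-masked date
refs = [[0.11, 0.30, 0.44, 0.61],     # similar crop trajectory
        [0.70, 0.65, 0.20, 0.10]]     # dissimilar trajectory
print(fill_missing(pixel, refs))      # missing date taken from the closest reference
```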

  11. Selection-Fusion Approach for Classification of Datasets with Missing Values

    PubMed Central

    Ghannad-Rezaie, Mostafa; Soltanian-Zadeh, Hamid; Ying, Hao; Dong, Ming

    2010-01-01

    This paper proposes a new approach based on missing value pattern discovery for classifying incomplete data. The approach is particularly designed for classification of datasets with a small number of samples and a high percentage of missing values, where available missing value treatment approaches do not usually work well. Based on the pattern of the missing values, the proposed approach finds subsets of samples for which most of the features are available and trains a classifier for each subset. Then, it combines the outputs of the classifiers. Subset selection is translated into a clustering problem, allowing derivation of a mathematical framework for it. A trade-off is established between the computational complexity (number of subsets) and the accuracy of the overall classifier. To deal with this trade-off, a numerical criterion is proposed for predicting the overall performance. The proposed method is applied to seven datasets from the popular University of California, Irvine data mining archive and an epilepsy dataset from Henry Ford Hospital, Detroit, Michigan (eight datasets in total). Experimental results show that the classification accuracy of the proposed method is superior to those of the widely used multiple imputation method and four other methods. They also show that the level of superiority depends on the pattern and percentage of missing values. PMID:20212921
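
    The subset-selection step can be illustrated by its first stage: discovering the missing-value patterns and grouping the samples that share one, so that each group can support a classifier trained only on its observed features. A toy sketch (the paper goes further, casting pattern grouping as a clustering problem):

```python
# Group samples by their pattern of observed features. Each group could then
# train its own classifier on the features that are actually present.
# Toy data; None marks a missing value.
from collections import defaultdict

def group_by_missing_pattern(samples):
    """Returns {pattern: list of sample indices}, where the pattern is a
    tuple of booleans marking which features are observed."""
    groups = defaultdict(list)
    for i, x in enumerate(samples):
        pattern = tuple(v is not None for v in x)
        groups[pattern].append(i)
    return dict(groups)

data = [[1.0, None, 3.0],
        [2.0, None, 1.0],
        [0.5, 0.7, None],
        [1.5, 2.0, 2.5]]
print(group_by_missing_pattern(data))
```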

  12. Automatic tracking of dynamical evolutions of oceanic mesoscale eddies with satellite observation data

    NASA Astrophysics Data System (ADS)

    Sun, Liang; Li, Qiu-Yang

    2017-04-01

    The oceanic mesoscale eddies play a major role in the ocean climate system. To analyse the spatiotemporal dynamics of oceanic mesoscale eddies, the Genealogical Evolution Model (GEM) based on satellite data is developed; it is an efficient logical model used to track the dynamic evolution of mesoscale eddies in the ocean. It can distinguish different dynamic processes (e.g., merging and splitting) within a dynamic evolution pattern, which is difficult to accomplish using other tracking methods. To this end, a mononuclear eddy detection method was first developed with simple segmentation strategies, e.g., the watershed algorithm. The algorithm is very fast, searching along the steepest descent path. Second, the GEM uses a two-dimensional similarity vector (i.e., a pair of ratios of the overlap area between two eddies to the area of each eddy) rather than a scalar to measure the similarity between eddies, which effectively solves the "missing eddy" problem (an eddy temporarily lost during tracking). Third, for tracking when an eddy splits, GEM uses both "parent" (the original eddy) and "child" (an eddy split from the parent), and the dynamic processes are described as the birth and death of different generations. Additionally, a new look-ahead approach with selection rules effectively simplifies computation and recording. All of the computational steps are linear and do not include iteration. Given the pixel number of the target region L, the maximum number of eddies M, the number N of look-ahead time steps, and the total number of time steps T, the total computing time is O(LM(N+1)T). The tracking of each eddy is very smooth because we require that the snapshots of each eddy on adjacent days overlap one another. Although eddy splitting and merging are ubiquitous in the ocean, they have different geographic distributions in the Northern Pacific Ocean. Both the merging and splitting rates of the eddies are high, especially at the western boundary, in currents and in "eddy deserts".
GEM is useful not only for satellite-based observational data but also for numerical simulation outputs. It is potentially useful for studying dynamic processes in other related fields, e.g., the dynamics of cyclones in meteorology.
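
    The two-dimensional similarity vector is straightforward to compute when eddies are represented as sets of pixels; a minimal sketch with hypothetical pixel footprints:

```python
# GEM's similarity measure for two eddies given as sets of pixel coordinates:
# a pair of ratios of the overlap area to each eddy's own area, rather than
# a single scalar. Toy pixel sets for illustration.

def similarity_vector(eddy_a, eddy_b):
    overlap = len(eddy_a & eddy_b)
    return (overlap / len(eddy_a), overlap / len(eddy_b))

day1 = {(x, y) for x in range(4) for y in range(4)}        # 16-pixel eddy
day2 = {(x, y) for x in range(2, 6) for y in range(2, 6)}  # shifted 16-pixel eddy
print(similarity_vector(day1, day2))  # -> (0.25, 0.25)
```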

  13. Handling missing Mini-Mental State Examination (MMSE) values: Results from a cross-sectional long-term-care study.

    PubMed

    Godin, Judith; Keefe, Janice; Andrew, Melissa K

    2017-04-01

    Missing values are commonly encountered on the Mini-Mental State Examination (MMSE), particularly when administered to frail older people. This presents challenges for MMSE scoring in research settings. We sought to describe missingness in MMSEs administered in long-term-care facilities (LTCF) and to compare and contrast approaches to dealing with missing items. As part of the Care and Construction project in Nova Scotia, Canada, LTCF residents completed an MMSE. Different methods of dealing with missing values (e.g., use of raw scores, raw scores divided by the number of items attempted, scale-level multiple imputation [MI], and blended approaches) are compared to item-level MI. The MMSE was administered to 320 residents living in 23 LTCF. The sample was predominantly female (73%), and 38% of participants were aged >85 years. At least one item was missing from 122 (38.2%) of the MMSEs. Data were not Missing Completely at Random (MCAR), χ²(1110) = 1,351, p < 0.001. Using raw scores for those missing <6 items, in combination with scale-level MI, resulted in the regression coefficients and standard errors closest to item-level MI. Patterns of missing items often suggest systematic problems, such as trouble with manual dexterity, literacy, or visual impairment. While these observations may be relatively easy to take into account in clinical settings, non-random missingness presents challenges for research and must be considered in statistical analyses. We present suggestions for dealing with missing MMSE data based on the extent of missingness and the goal of the analyses. Copyright © 2016 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
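
    One of the simple scoring options compared above, the raw score divided by the number of items attempted and rescaled to the 30-point range, can be sketched as follows. The MMSE is modelled here as 30 one-point items, an illustrative simplification:

```python
# Prorated MMSE score: score on attempted items, rescaled to the full range.
# Treats the MMSE as 30 one-point items (a simplification for illustration).

def prorated_mmse(points, max_total=30):
    """points: one entry per scoreable point (0 or 1), None if not attempted."""
    attempted = [p for p in points if p is not None]
    return sum(attempted) / len(attempted) * max_total

points = [1] * 15 + [0] * 5 + [None] * 10   # 20 points attempted, 15 earned
print(prorated_mmse(points))  # -> 22.5
```

Note that proration implicitly assumes the skipped items would have been answered as well as the attempted ones, which is exactly what non-random missingness (e.g., visual impairment) violates.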

  14. Evaluation of missing value methods for predicting ambient BTEX concentrations in two neighbouring cities in Southwestern Ontario Canada

    NASA Astrophysics Data System (ADS)

    Miller, Lindsay; Xu, Xiaohong; Wheeler, Amanda; Zhang, Tianchu; Hamadani, Mariam; Ejaz, Unam

    2018-05-01

    High density air monitoring campaigns provide spatial patterns of pollutant concentrations, which are integral in exposure assessment. Such analysis can assist with the determination of links between air quality and health outcomes; however, problems due to missing data can threaten to compromise these studies. This research evaluates four methods: mean value imputation, inverse distance weighting (IDW), inter-species ratios, and regression, for addressing missing spatial concentration data ranging from one missing data point up to 50% missing data. BTEX (benzene, toluene, ethylbenzene, and xylenes) concentrations were measured in Windsor and Sarnia, Ontario in the fall of 2005. Concentrations and inter-species ratios were generally similar between the two cities. Benzene (B) was observed to be higher in Sarnia, whereas toluene (T) and the T/B ratios were higher in Windsor. Using these urban, industrialized cities as case studies, this research demonstrates that using inter-species ratios or regression of the data for which there is complete information, along with one measured concentration (i.e., benzene) to predict missing concentrations (i.e., TEX), results in good agreement between predicted and measured values. In both cities, the general trend remains that the best agreement is observed for the leave-one-out scenario, followed by 10% and 25% missing, with the least agreement for the 50% missing cases. In the absence of any known concentrations, IDW can provide reasonable agreement between observed and estimated concentrations for the BTEX species, and was superior to mean value imputation, which was not able to preserve the spatial trend. The proposed methods can be used to fill in missing data while preserving the general characteristics and rank order of the data, which are sufficient for epidemiologic studies.
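
    Of the four methods, inverse distance weighting is the easiest to sketch: a missing site's concentration is estimated from measured sites, each weighted by the inverse of its distance raised to a power. Coordinates and concentration values below are hypothetical:

```python
# Inverse distance weighting (IDW): estimate a pollutant concentration at an
# unmonitored site from measured sites, weighting each by 1/d**power.
# Hypothetical site coordinates and benzene values; assumes the target does
# not coincide with a measured site (d > 0).

def idw(target, sites, power=2):
    """sites: list of ((x, y), value); returns the IDW estimate at `target`."""
    num = den = 0.0
    for (x, y), value in sites:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

sites = [((0, 0), 1.0), ((2, 0), 3.0)]
print(idw((1, 0), sites))  # equidistant sites -> simple average, 2.0
```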

  15. A novel application of the Intent to Attend assessment to reduce bias due to missing data in a randomized controlled clinical trial

    PubMed Central

    Rabideau, Dustin J; Nierenberg, Andrew A; Sylvia, Louisa G; Friedman, Edward S.; Bowden, Charles L.; Thase, Michael E.; Ketter, Terence; Ostacher, Michael J.; Reilly-Harrington, Noreen; Iosifescu, Dan V.; Calabrese, Joseph R.; Leon, Andrew C.; Schoenfeld, David A

    2014-01-01

    Background Missing data are unavoidable in most randomized controlled clinical trials, especially when measurements are taken repeatedly. If strong assumptions about the missing data are not accurate, crude statistical analyses are biased and can lead to false inferences. Furthermore, if we fail to measure all predictors of missing data, we may not be able to model the missing data process sufficiently. In longitudinal randomized trials, measuring a patient's intent to attend future study visits may help to address both of these problems. Leon et al. developed and included the Intent to Attend assessment in the Lithium Treatment—Moderate dose Use Study (LiTMUS), aiming to remove bias due to missing data from the primary study hypothesis [1]. Purpose The purpose of this study is to assess the performance of the Intent to Attend assessment with regard to its use in a sensitivity analysis of missing data. Methods We fit marginal models to assess whether a patient's self-rated intent predicted actual study adherence. We applied inverse probability of attrition weighting (IPAW) coupled with patient intent to assess whether there existed treatment group differences in response over time. We compared the IPAW results to those obtained using other methods. Results Patient-rated intent predicted missed study visits, even when adjusting for other predictors of missing data. On average, the hazard of retention increased by 19% for every one-point increase in intent. We also found that more severe mania, male gender, and a previously missed visit predicted subsequent absence. Although we found no difference in response between the randomized treatment groups, IPAW increased the estimated group difference over time. Limitations LiTMUS was designed to limit missed study visits, which may have attenuated the effects of adjusting for missing data. 
Additionally, IPAW can be less efficient and less powerful than maximum likelihood or Bayesian estimators, provided that the parametric model is well-specified. Conclusions In LiTMUS, the Intent to Attend assessment predicted missed study visits. This item was incorporated into our IPAW models and helped reduce bias due to informative missing data. This analysis should both encourage and facilitate future use of the Intent to Attend assessment along with IPAW to address missing data in a randomized trial. PMID:24872362
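
    The core of IPAW is easy to state: each patient still observed at a visit is weighted by the inverse of their estimated probability of remaining under observation, so retained patients stand in for similar patients who dropped out. A minimal sketch with made-up retention probabilities; in LiTMUS these would come from a model that includes the Intent to Attend rating:

```python
# Inverse probability of attrition weights: observed patients are up-weighted
# by 1 / P(still observed). Probabilities here are hypothetical.

def ipaw_weights(retention_probs):
    return [1.0 / p for p in retention_probs]

probs = [0.8, 0.5, 1.0]        # estimated P(observed) for three patients
print(ipaw_weights(probs))     # -> [1.25, 2.0, 1.0]
```

A patient with only a 50% estimated chance of being retained counts double, compensating for the similar patient who, on average, was lost.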

  16. The treatment of missing data in a large cardiovascular clinical outcomes study.

    PubMed

    Little, Roderick J; Wang, Julia; Sun, Xiang; Tian, Hong; Suh, Eun-Young; Lee, Michael; Sarich, Troy; Oppenheimer, Leonard; Plotnikov, Alexei; Wittes, Janet; Cook-Bruns, Nancy; Burton, Paul; Gibson, C Michael; Mohanty, Surya

    2016-06-01

    The potential impact of missing data on the results of clinical trials has received heightened attention recently. A National Research Council study provides recommendations for limiting missing data in clinical trial design and conduct, and principles for analysis, including the need for sensitivity analyses to assess robustness of findings to alternative assumptions about the missing data. A Food and Drug Administration advisory committee raised missing data as a serious concern in their review of results from the ATLAS ACS 2 TIMI 51 study, a large clinical trial that assessed rivaroxaban for its ability to reduce the risk of cardiovascular death, myocardial infarction or stroke in patients with acute coronary syndrome. This case study describes a variety of measures that were taken to address concerns about the missing data. A range of analyses are described to assess the potential impact of missing data on conclusions. In particular, measures of the amount of missing data are discussed, and the fraction of missing information from multiple imputation is proposed as an alternative measure. The sensitivity analysis in the National Research Council study is modified in the context of survival analysis where some individuals are lost to follow-up. The impact of deviations from ignorable censoring is assessed by differentially increasing the hazard of the primary outcome in the treatment groups and multiply imputing events between dropout and the end of the study. Tipping-point analyses are described, where the deviation from ignorable censoring that results in a reversal of significance of the treatment effect is determined. A study to determine the vital status of participants lost to follow-up was also conducted, and the results of including this additional information are assessed. 
Sensitivity analyses suggest that findings of the ATLAS ACS 2 TIMI 51 study are robust to missing data; this robustness is reinforced by the follow-up study, since inclusion of data from this study had little impact on the study conclusions. Missing data are a serious problem in clinical trials. The methods presented here, namely, the sensitivity analyses, the follow-up study to determine survival of missing cases, and the proposed measurement of missing data via the fraction of missing information, have potential application in other studies involving survival analysis where missing data are a concern. © The Author(s) 2016.
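
    The "fraction of missing information" measure proposed above comes from Rubin's combining rules for multiple imputation: with m imputed datasets, the within-imputation variance W and between-imputation variance B are pooled, and the fraction of the total variance attributable to missingness is reported. A minimal sketch using the simple large-sample form, with hypothetical per-imputation estimates:

```python
# Fraction of missing information (FMI) via Rubin's rules, large-sample form.
# Toy per-imputation estimates and variances; in the study these would come
# from the survival analysis repeated on each imputed dataset.

def fraction_missing_information(estimates, variances):
    m = len(estimates)
    qbar = sum(estimates) / m                               # pooled estimate
    w = sum(variances) / m                                  # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    total = w + (1 + 1 / m) * b
    return (1 + 1 / m) * b / total

est = [0.10, 0.12, 0.11, 0.13, 0.09]      # e.g. log hazard ratios from 5 imputations
var = [0.004, 0.004, 0.005, 0.004, 0.005]
print(round(fraction_missing_information(est, var), 3))  # -> 0.064
```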

  17. Quarantine constraints as applied to satellites

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.; Stavro, W.; Gonzalez, C. C.

    1973-01-01

    Plans for unmanned missions to planets beyond Mars in the 1970s include satellite encounters. Recently published observations of data for Titan, a satellite of Saturn, indicate that conditions may be hospitable for the growth of microorganisms. Therefore, the problem of satisfying possible quarantine constraints for outer planet satellites was investigated. This involved determining the probability of impacting a satellite of Jupiter or Saturn by a spacecraft for a planned satellite encounter during an outer planet mission. Mathematical procedures were formulated which determine the areas in the aim-plane that would result in trajectories that impact the satellite and provide a technique for numerically integrating the navigation error function over the impact area to obtain impact probabilities. The results indicate which of the planned spacecraft trajectory correction maneuvers are most critical in terms of satellite quarantine violation.
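
    The impact-probability computation described above can be caricatured in a few lines: draw aim-plane positions from a Gaussian navigation-error distribution and count the fraction that fall inside the region of aim points that impact the satellite, here idealized as a circular disc. All numbers are hypothetical, and the study integrates the error function numerically over the actual impact area rather than by Monte Carlo:

```python
# Monte Carlo analogue of the quarantine calculation: probability that the
# dispersed aim point lands in the (idealized, circular) impact region.
# Units are normalized to the navigation error sigma; all values hypothetical.
import random

def impact_probability(aim, sigma, disc_center, disc_radius, n=200_000, seed=1):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    hits = 0
    for _ in range(n):
        x = rng.gauss(aim[0], sigma)   # dispersed aim point, B-plane x
        y = rng.gauss(aim[1], sigma)   # dispersed aim point, B-plane y
        if (x - disc_center[0]) ** 2 + (y - disc_center[1]) ** 2 <= disc_radius ** 2:
            hits += 1
    return hits / n

# Aiming 3 sigma away from a disc of radius 1 sigma: small but nonzero risk.
p = impact_probability(aim=(0, 0), sigma=1.0, disc_center=(3, 0), disc_radius=1.0)
print(p)
```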

  18. Wildland fire management. Volume 2: Wildland fire control 1985-1995. [satellite information system for California fire problems

    NASA Technical Reports Server (NTRS)

    Saveker, D. R. (Editor)

    1973-01-01

    The preliminary design of a satellite plus computer earth resources information system is proposed for potential uses in fire prevention and control in the wildland fire community. Suggested are satellite characteristics, sensor characteristics, discrimination algorithms, data communication techniques, data processing requirements, display characteristics, and costs in achieving the integrated wildland fire information system.

  19. Telepsychiatry, the satellite system and family consultation.

    PubMed

    Paul, N L

    1997-01-01

    A pilot telepsychiatry session was conducted with the US Department of Defense Satellite Communication System. The subjects were a family incompletely divorced many years before. There were two satellite interviews with this family. Bringing together all members of the original family so that questions could be addressed as to what happened when the children were very young unblocked a 13-year-old communication problem.

  20. Integrating multisensor satellite data merging and image reconstruction in support of machine learning for better water quality management.

    PubMed

    Chang, Ni-Bin; Bai, Kaixu; Chen, Chi-Farn

    2017-10-01

    Monitoring water quality changes in lakes, reservoirs, estuaries, and coastal waters is critical in response to the needs for sustainable development. This study develops a remote sensing-based multiscale modeling system by integrating multi-sensor satellite data merging and image reconstruction algorithms in support of feature extraction with machine learning, leading to automated continuous water quality monitoring in environmentally sensitive regions. This new Earth observation platform, termed "cross-mission data merging and image reconstruction with machine learning" (CDMIM), is capable of merging imagery from multiple satellites to provide daily water quality monitoring through a series of image processing, enhancement, reconstruction, and data mining/machine learning techniques. Two existing key algorithms, the Spectral Information Adaptation and Synthesis Scheme (SIASS) and SMart Information Reconstruction (SMIR), are highlighted to support feature extraction and content-based mapping. Whereas SIASS can support various efforts to merge images collected from cross-mission satellite sensors, SMIR can overcome data gaps by reconstructing the information of value-missing pixels due to impacts such as cloud obstruction. Practical implementation of CDMIM was assessed by predicting the water quality over seasons in terms of the concentrations of nutrients and chlorophyll-a, as well as water clarity in Lake Nicaragua, providing synergistic efforts to better monitor the aquatic environment and offer insightful lake watershed management strategies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Lessons Learned in the Flight Qualification of the S-NPP and NOAA-20 Solar Array Mechanisms

    NASA Technical Reports Server (NTRS)

    Helfrich, Daniel; Sexton, Adam

    2018-01-01

    Deployable solar arrays are the energy source used on almost all Earth orbiting spacecraft and their release and deployment are mission-critical; fully testing them on the ground is a challenging endeavor. The 8 meter long deployable arrays flown on two sequential NASA weather satellites were each comprised of three rigid panels almost 2 meters wide. These large panels were deployed by hinges comprised of stacked constant force springs and eddy current dampers, and were restrained through launch by a set of four releasable hold-downs using shape memory alloy release devices. The ground qualification testing of such unwieldy deployable solar arrays, whose design was optimized for orbital operations, proved to be quite challenging and provides numerous lessons learned. A paperwork review and follow-up inspection after hardware storage determined that there were negative torque margins and missing lubricant; this paper explains how these unexpected issues were overcome. The paper also provides details on how the hinge subassemblies, the fully-assembled array, and mechanical ground support equipment were subsequently improved and qualified for a follow-on flight with considerably less difficulty. The solar arrays built by Ball Aerospace Corp. for the Suomi National Polar Partnership (S-NPP) satellite and the Joint Polar Satellite System (JPSS-1) satellite (now NOAA-20) were both successfully deployed on-orbit and are performing well.

  2. Solar radio proxies for improved satellite orbit prediction

    NASA Astrophysics Data System (ADS)

    Yaya, Philippe; Hecker, Louis; Dudok de Wit, Thierry; Fèvre, Clémence Le; Bruinsma, Sean

    2017-12-01

    Specification and forecasting of solar drivers to thermosphere density models is critical for satellite orbit prediction and debris avoidance. Satellite operators routinely forecast orbits up to 30 days into the future. This requires forecasts of the drivers to these orbit prediction models, such as the solar Extreme-UV (EUV) flux and geomagnetic activity. Most density models use the 10.7 cm radio flux (F10.7 index) as a proxy for solar EUV. However, daily measurements at other centimetric wavelengths have also been performed by the Nobeyama Radio Observatory (Japan) since the 1950s, thereby offering prospects for improving orbit modeling. Here we present a pre-operational service at the Collecte Localisation Satellites company that collects these different observations in one single homogeneous dataset and provides a 30-day forecast on a daily basis. Interpolation and preprocessing algorithms were developed to fill in missing data and remove anomalous values. We compared various empirical time series prediction techniques and selected a multi-wavelength non-recursive analogue neural network. The prediction of the 30 cm flux, and to a lesser extent that of the 10.7 cm flux, performs better than NOAA's present prediction of the 10.7 cm flux, especially during periods of high solar activity. In addition, we find that the DTM-2013 density model (Drag Temperature Model) performs better with (past and predicted) values of the 30 cm radio flux than with the 10.7 cm flux.

  4. Are We Really Missing Small Galaxies?

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2018-02-01

    One long-standing astrophysical puzzle is that of so-called missing dwarf galaxies: the number of small dwarf galaxies that we observe is far fewer than that predicted by theory. New simulations, however, suggest that perhaps there's no mystery after all. Missing Dwarfs: Dark-matter cosmological simulations predict many small galaxy halos for every large halo that forms [The Via Lactea project]. Models of a lambda-cold-dark-matter (CDM) universe predict the distribution of galaxy halo sizes throughout the universe, suggesting there should be many more small galaxies than large ones. In what has become known as the missing dwarf problem, however, we find that while we observe the expected numbers of galaxies at the larger end of the scale, we don't see nearly enough small galaxies to match the predictions. Are these galaxies actually missing? Are our predictions wrong? Or are the galaxies there and we're just not spotting them? A recent study led by Alyson Brooks (Rutgers University) uses new simulations to explore what's causing the difference between theory and observation. According to the authors' simulations, the fraction of detectable halos drops precipitously below a velocity of 35 km/s [Brooks et al. 2017]. Simulating Galactic Velocities: Because we can't weigh a galaxy directly, one proxy used for galaxy mass is its circular velocity; the more massive a galaxy, the faster gas and stars rotate around its center. The discrepancy between models and observations lies in what's known as the galaxy velocity function, which describes the number density of galaxies for a given circular velocity. While theory and observations agree for galaxies with circular velocities above 100 km/s, theory predicts far more dwarfs below this velocity than we observe. To investigate this problem, Brooks and collaborators ran a series of cosmological simulations based on our understanding of a CDM universe. 
Instead of exploring the result using only dark matter, however, the team included baryons in their simulations. They then produced mock observations of the resulting galaxy velocities to see what an observed velocity function would look like for their simulated galaxies. No Problem After All? Comparing theoretical velocity functions to observations, the original dark-matter-only model predictions disagree with the data, while the authors' new model, which includes the effects of detectability and the inclusion of baryons, matches the observations well [Brooks et al. 2017]. Based on their baryon-inclusive simulations, Brooks and collaborators argue that two main factors have contributed to the seeming theory/observation mismatch of the missing dwarf problem. First, galaxies with low velocities aren't detectable by our current surveys: the authors found that the detectable fraction of their simulated galaxies plunges as soon as galaxy velocity drops below 35 km/s, and they conclude that we're probably unable to see a large fraction of the smallest galaxies. Second, we're not correctly inferring the circular velocity of the galaxies: circular velocity is usually measured by looking at the line width of a gas tracer like HI, but the authors find that this doesn't trace the full potential wells of the dwarf galaxies, resulting in an incorrect interpretation of their velocities. The authors show that the inclusion of these effects in the theoretical model significantly changes the predicted shape of the galaxy velocity function. This new function beautifully matches observations, neatly eliminating the missing dwarf problem. Perhaps this long-standing mystery has been a problem of interpretation all along! Citation: Alyson M. Brooks et al. 2017 ApJ 850 97. doi:10.3847/1538-4357/aa9576

  5. Satellites for distress alerting and locating: Report by Interagency Committee for Search and Rescue Ad Hoc Working Group

    NASA Technical Reports Server (NTRS)

    Ehrlich, E.

    1976-01-01

    The background behind the congressional legislation that led to the requirement for the Emergency Locator Transmitter (ELT) and the Emergency Position-Indicating Radio Beacon (EPIRB) to be installed on certain types of aircraft and inspected marine vessels, respectively, is discussed. The distress alerting and locating (DAL) problem is discussed for existing ELT and EPIRB equipped aircraft and ships. It is recognized that the DAL requirements for CONUS and Alaska and for the maritime regions are not identical. In order to address the serious DAL problem which currently exists in CONUS and Alaska, a low orbiting satellite system emerges as the most viable and cost effective alternative that satisfies the overall SAR system design requirements. A satellite system designed to meet the needs of the maritime regions could be either low orbiting or geostationary. The conclusions drawn from this report support the recommendation to proceed with the implementation of a SAR orbiting satellite system.

  6. Three challenges described for identifying participants with missing data in trials reports, and potential solutions suggested to systematic reviewers.

    PubMed

    Akl, Elie A; Kahale, Lara A; Ebrahim, Shanil; Alonso-Coello, Pablo; Schünemann, Holger J; Guyatt, Gordon H

    2016-08-01

    To categorize the challenges in determining the extent of missing participant data in randomized trials and suggest potential solutions for systematic review authors. During the process of updating a series of Cochrane systematic reviews on the topic of anticoagulation in patients with cancer, we identified challenges, used an iterative approach to refine them, and used a consensus process to agree on the challenges identified and to suggest potential ways of dealing with them. The five systematic reviews included 58 trials and 75 meta-analyses for patient-important dichotomous outcomes with 27,037 randomized participants. We identified three categories of challenges: (1) Although systematic reviewers require information about missing data to be reported by outcome, trialists typically report the information by participant; (2) It is not always clear whether the trialists followed up participants in certain categories (e.g., noncompliers), that is, whether some categories of participants did or did not have missing data; (3) It is not always clear how the trialists dealt with missing data in their analysis (e.g., exclusion from the denominator vs. assumptions made for the numerator). We discuss potential solutions for each one of these challenges and suggest further research work. Current reporting of missing data is often not explicit and transparent, and although our potential solutions to problems of suboptimal reporting may be helpful, reliable and valid characterization of the extent and nature of missing data remains elusive. Reporting of missing data in trials needs further improvement. Copyright © 2016 Elsevier Inc. All rights reserved.
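
    For challenge (3), one standard reviewer response (a common sensitivity analysis, not a method prescribed by these authors) is to bound each arm's event risk under extreme assumptions about the participants with missing data. A sketch with hypothetical counts:

```python
# Best-case / worst-case bounds on the event risk in one trial arm, given a
# dichotomous outcome and participants with missing data. Counts hypothetical.

def risk_bounds(events, observed, missing):
    """Risk among all randomized participants under extreme assumptions
    about the outcomes of the `missing` participants."""
    n = observed + missing
    return (events / n,              # none of the missing had the event
            (events + missing) / n)  # all of the missing had the event

low, high = risk_bounds(events=30, observed=150, missing=50)
print(low, high)  # -> 0.15 0.4
```

If a meta-analytic conclusion holds across both extremes, missing data cannot overturn it; if not, the reporting gaps described above genuinely matter.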

  7. Striatal connectivity changes following gambling wins and near-misses: Associations with gambling severity.

    PubMed

    van Holst, Ruth J; Chase, Henry W; Clark, Luke

    2014-01-01

    Frontostriatal circuitry is implicated in the cognitive distortions associated with gambling behaviour. 'Near-miss' events, where unsuccessful outcomes are proximal to a jackpot win, recruit overlapping neural circuitry with actual monetary wins. Personal control over a gamble (e.g., via choice) is also known to increase confidence in one's chances of winning (the 'illusion of control'). Using psychophysiological interaction (PPI) analyses, we examined changes in functional connectivity as regular gamblers and non-gambling participants played a slot-machine game that delivered wins, near-misses and full-misses, and manipulated personal control. We focussed on connectivity with striatal seed regions, and associations with gambling severity, using voxel-wise regression. For the interaction term of near-misses (versus full-misses) by personal choice (participant-chosen versus computer-chosen), ventral striatal connectivity with the insula, bilaterally, was positively correlated with gambling severity. In addition, some effects for the contrast of wins compared to all non-wins were observed at an uncorrected (p < .001) threshold: there was an overall increase in connectivity between the striatal seeds and left orbitofrontal cortex and posterior insula, and a negative correlation for gambling severity with the connectivity between the right ventral striatal seed and left anterior cingulate cortex. These findings corroborate the 'non-categorical' nature of reward processing in gambling: near-misses and full-misses are objectively identical outcomes that are processed differentially. Ventral striatal connectivity with the insula correlated positively with gambling severity in the illusion of control contrast, which could be a risk factor for the cognitive distortions and loss-chasing that are characteristic of problem gambling.

  8. Reconstruction from EOF analysis of SMOS salinity data in Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Parard, Gaelle; Alvera-Azcárate, Aida; Barth, Alexander; Olmedo, Estrella; Turiel, Antonio; Becker, Jean-Marie

    2017-04-01

    Sea Surface Salinity (SSS) data from the Soil Moisture and Ocean Salinity (SMOS) mission are reconstructed in the North Atlantic and the Mediterranean Sea using DINEOF (Data Interpolating Empirical Orthogonal Functions). We used Level 2 satellite data from the SMOS Barcelona Expert Centre between 2011 and 2015. DINEOF is a technique that reconstructs missing data and removes noise by retaining only an optimal set of EOFs; the analysis is also used to detect and remove outliers from the daily SMOS SSS field. The higher spatial and temporal resolution obtained by applying DINEOF to the L2 SMOS data over 2011-2015 allows the SSS variability to be studied from daily to seasonal scales. To improve the SMOS salinity reconstruction, we combine it with other satellite-measured parameters such as chlorophyll, sea surface temperature, precipitation and CDOM variability. After validating the reconstruction against in situ data (CTD and Argo float salinity measurements) in the North Atlantic and Mediterranean Sea, the main SSS processes and their variability are studied. The improved resolution of the SMOS salinity data also gives access to the characteristics of oceanic structures in the North Atlantic and Mediterranean Sea.
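The core DINEOF loop (fill the gaps, truncate to a few leading EOFs, refill the gaps from the truncation, iterate) can be sketched in a few lines. The rank-1 truncation and the power-iteration SVD below are deliberate simplifications for illustration, not the implementation used in the study; real DINEOF selects the optimal number of EOFs by cross-validation.

```python
# DINEOF-style gap filling, minimal sketch (illustrative assumptions:
# rank-1 truncation, small dense matrices, pure-Python power iteration).
def rank1_approx(A, iters=100):
    """Best rank-1 approximation of matrix A via power iteration."""
    m, n = len(A), len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        u = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
        nu = sum(x * x for x in u) ** 0.5
        u = [x / nu for x in u]
        w = [sum(A[i][j] * u[i] for i in range(m)) for j in range(n)]
        s = sum(x * x for x in w) ** 0.5   # leading singular value
        v = [x / s for x in w]
    return [[s * u[i] * v[j] for j in range(n)] for i in range(m)]

def dineof_fill(data, missing, outer_iters=50):
    """Fill the (i, j) positions in `missing` by iterating between an
    EOF truncation and a refill of the gaps, as DINEOF does."""
    obs = [data[i][j] for i in range(len(data)) for j in range(len(data[0]))
           if (i, j) not in missing]
    mean = sum(obs) / len(obs)
    A = [row[:] for row in data]
    for i, j in missing:
        A[i][j] = mean                     # first guess: overall mean
    for _ in range(outer_iters):
        R = rank1_approx(A)
        for i, j in missing:
            A[i][j] = R[i][j]              # refill gaps from the truncation
    return A
```

On a small rank-1 test field the missing entry is recovered almost exactly by this scheme.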

  9. New energy conversion techniques in space, applicable to propulsion

    NASA Technical Reports Server (NTRS)

    Hertzberg, A.; Sun, K. C.

    1989-01-01

    The powering of aircraft with laser energy from a solar power satellite may be a promising new approach to the critical problem of the rising cost of fuel for aircraft transportation systems. The result is a nearly fuelless, pollution-free flight transportation system which is cost-competitive with the fuel-conservative airplane of the future. The major components of this flight system include a laser power satellite, relay satellites, laser-powered turbofans and a conventional airframe. The relay satellites are orbiting optical systems which intercept the beam from a power satellite and refocus and redirect the beam to its next target.

  10. Dealing with Divorce

    MedlinePlus

    ... to a serious problem like drinking, abuse, or gambling. Sometimes nothing bad happens, but parents just decide ... distance. Even a quick email saying "I'm thinking of you" helps ease the feelings of missing ...

  11. A hybrid online scheduling mechanism with revision and progressive techniques for autonomous Earth observation satellite

    NASA Astrophysics Data System (ADS)

    Li, Guoliang; Xing, Lining; Chen, Yingwu

    2017-11-01

    The autonomy of self-scheduling on Earth observation satellites and the increasing scale of satellite networks have attracted much attention from researchers in recent decades. In practice, the limited onboard computational resource presents a challenge for online scheduling algorithms. This study considered the online scheduling problem for a single autonomous Earth observation satellite within a satellite network environment, specifically the case where urgent tasks arrive stochastically during the scheduling horizon. We described the problem and proposed a hybrid online scheduling mechanism with revision and progressive techniques to solve it. The mechanism includes two decision policies: a when-to-schedule policy combining periodic scheduling with critical-cumulative-number-based event-driven rescheduling, and a how-to-schedule policy combining progressive and revision approaches to accommodate two categories of task, normal and urgent. We developed two heuristic (re)scheduling algorithms and compared them with other commonly used techniques. Computational experiments indicated that the percentage of urgent tasks admitted into the schedule is much higher under the proposed mechanism than under a periodic scheduling mechanism, and that the specific performance depends strongly on several mechanism-relevant and task-relevant factors. For online scheduling, the modified weighted shortest imaging time first and dynamic profit system benefit heuristics outperformed the others on total profit and on the percentage of successfully scheduled urgent tasks.
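For flavour, a plain "weighted shortest imaging time first" ordering can be sketched as below. The tuple layout, names and greedy horizon fill are assumptions for illustration; the paper's modified heuristic and its dynamic profit system benefit companion are more involved.

```python
# Hypothetical WSTF-style greedy scheduler (illustrative, not the paper's):
# rank candidate tasks by profit per unit of imaging time, then fill the
# scheduling horizon greedily.
def wstf_schedule(tasks, horizon):
    """tasks: list of (name, imaging_time, profit); returns chosen names."""
    ranked = sorted(tasks, key=lambda t: t[2] / t[1], reverse=True)
    chosen, used = [], 0.0
    for name, duration, profit in ranked:
        if used + duration <= horizon:
            chosen.append(name)
            used += duration
    return chosen
```

With tasks ("a", 2, 10), ("b", 1, 9), ("c", 3, 3) and a horizon of 3, the rule admits "b" (ratio 9) and then "a" (ratio 5).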

  12. Merging Satellite Precipitation Products for Improved Streamflow Simulations

    NASA Astrophysics Data System (ADS)

    Maggioni, V.; Massari, C.; Barbetta, S.; Camici, S.; Brocca, L.

    2017-12-01

    Accurate quantitative precipitation estimation is of great importance for water resources management, agricultural planning and forecasting and monitoring of natural hazards such as flash floods and landslides. In situ observations are limited around the Earth, especially in remote areas (e.g., complex terrain, dense vegetation), but currently available satellite precipitation products are able to provide global precipitation estimates with an accuracy that depends upon many factors (e.g., type of storms, temporal sampling, season, etc.). The recent SM2RAIN approach proposes to estimate rainfall by using satellite soil moisture observations. As opposed to traditional satellite precipitation methods, which sense cloud properties to retrieve instantaneous estimates, this new bottom-up approach makes use of two consecutive soil moisture measurements for obtaining an estimate of the fallen precipitation within the interval between two satellite overpasses. As a result, the nature of the measurement is different and complementary to the one of classical precipitation products and could provide a different valid perspective to substitute or improve current rainfall estimates. Therefore, we propose to merge SM2RAIN and the widely used TMPA 3B42RT product across Italy for a 6-year period (2010-2015) at daily/0.25deg temporal/spatial scale. Two conceptually different merging techniques are compared to each other and evaluated in terms of different statistical metrics, including hit bias, threat score, false alarm rates, and missed rainfall volumes. The first is based on the maximization of the temporal correlation with a reference dataset, while the second is based on a Bayesian approach, which provides a probabilistic satellite precipitation estimate derived from the joint probability distribution of observations and satellite estimates. 
The merged precipitation products show a better performance with respect to the parental satellite-based products in terms of categorical statistics, as well as bias reduction and correlation coefficient, with the Bayesian approach being superior to other methods. A study case in the Tiber river basin is also presented to discuss the performance of forcing a hydrological model with the merged satellite precipitation product to simulate streamflow time series.
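For intuition only: the simplest Gaussian special case of such a Bayesian combination is inverse-variance weighting, in which the posterior mean of two independent, unbiased estimates is their precision-weighted average. The study's approach, built on the joint distribution of observations and satellite estimates, is considerably richer than this sketch.

```python
# Posterior mean of two independent Gaussian estimates of the same rainfall
# value (a stand-in for the full Bayesian merge described in the abstract).
def merge_estimates(x1, var1, x2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2     # precisions
    return (w1 * x1 + w2 * x2) / (w1 + w2)
```

For example, merge_estimates(10.0, 1.0, 20.0, 4.0) returns 12.0, pulled toward the lower-variance estimate.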

  13. Assessment of BSRN radiation records for the computation of monthly means

    NASA Astrophysics Data System (ADS)

    Roesch, A.; Wild, M.; Ohmura, A.; Dutton, E. G.; Long, C. N.; Zhang, T.

    2011-02-01

    The integrity of the Baseline Surface Radiation Network (BSRN) radiation monthly averages is assessed by investigating the impact on monthly means due to the frequency of data gaps caused by missing or discarded high time resolution data. The monthly statistics, especially means, are considered to be important and useful values for climate research, model performance evaluations and for assessing the quality of satellite (time- and space-averaged) data products. The study investigates the spread in different algorithms that have been applied for the computation of monthly means from 1-min values. The paper reveals that the computation of monthly means from 1-min observations distinctly depends on the method utilized to account for the missing data. The intra-method difference generally increases with an increasing fraction of missing data. We found that a substantial fraction of the radiation fluxes observed at BSRN sites is either missing or flagged as questionable. The percentage of missing data is 4.4%, 13.0%, and 6.5% for global radiation, direct shortwave radiation, and downwelling longwave radiation, respectively. Most flagged data in the shortwave are due to nighttime instrumental noise and can reasonably be set to zero after correcting for thermal offsets in the daytime data. The study demonstrates that the handling of flagged data clearly impacts monthly mean estimates obtained with different methods. We showed that the spread of monthly shortwave fluxes is generally much higher than that of downwelling longwave radiation. Overall, BSRN observations provide sufficient accuracy and completeness for reliable estimates of monthly mean values. However, the value of future data could be further increased by reducing the frequency of data gaps and the number of outliers. It is shown that two independent methods for accounting for the diurnal and seasonal variations in the missing data permit consistent monthly means to within less than 1 W m-2 in most cases. The authors suggest using a standardized method for the computation of monthly means which addresses diurnal variations in the missing data, in order to avoid inconsistencies among future published monthly mean radiation fluxes from BSRN. The application of robust statistics would probably lead to less biased results for data records with frequent gaps and/or flagged data and outliers. The currently applied empirical methods should, therefore, be complemented by the development of robust methods.
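One minimal way to account for diurnal variations in the missing data (an assumed sketch, not one of the BSRN methods compared above) is to average within each hour of day first and only then across hours, so that gaps clustering at particular times of day do not bias the monthly mean:

```python
# Diurnally weighted monthly mean (illustrative sketch): missing samples
# simply do not appear in the input list.
def monthly_mean_diurnal(samples):
    """samples: list of (hour_of_day, value) pairs for one month."""
    by_hour = {}
    for hour, value in samples:
        by_hour.setdefault(hour, []).append(value)
    hourly = [sum(vals) / len(vals) for vals in by_hour.values()]
    return sum(hourly) / len(hourly)   # equal weight per observed hour
```

If hour 0 is observed twice (both 0.0) and hour 12 once (10.0), the naive mean is about 3.33 while the diurnal mean is 5.0.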

  14. Satellite Power System (SPS) international agreements

    NASA Technical Reports Server (NTRS)

    Grove, S.

    1978-01-01

    The problems in obtaining international agreements on geostationary orbit availability, microwave frequency allocations and microwave frequency standards for satellites transmitting solar power are considered. The various U.S. policy options, strategies and time frames with respect to key issues are analyzed.

  15. A Mathematical Model for the Height of a Satellite.

    ERIC Educational Resources Information Center

    Thoemke, Sharon S.; And Others

    1993-01-01

    Emphasizes a real-world-problem situation using sine law and cosine law. Angles of elevation from two tracking stations located in the plane of the equator determine height of a satellite. Calculators or computers can be used. (LDR)
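The classroom geometry can be worked through with the law of sines. In this sketch the satellite is assumed to lie beyond station B, in the plane containing both stations, with Earth curvature ignored; the article's exact configuration may differ.

```python
import math

# Height of a satellite from two elevation angles measured at stations A
# and B a known baseline apart (flat-geometry sketch; names assumed).
def satellite_height(baseline, elev_a_deg, elev_b_deg):
    a = math.radians(elev_a_deg)
    b = math.radians(elev_b_deg)
    # Triangle ABS has angle a at A, pi - b at B, hence b - a at S.
    bs = baseline * math.sin(a) / math.sin(b - a)  # law of sines
    return bs * math.sin(b)                        # height above the baseline
```

With a 100 km baseline and elevations of 30 and 60 degrees, the satellite is 100 km from station B and about 86.6 km high.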

  16. An analysis of satellite state vector observability using SST tracking data

    NASA Technical Reports Server (NTRS)

    Englar, T. S., Jr.; Hammond, C. L.

    1976-01-01

    Observability of satellite state vectors using only satellite-to-satellite tracking (SST) data was investigated by covariance analysis under a variety of satellite and station configurations. The results indicate very precarious observability in most short-arc cases. The consequences of this are large variances on many state components, such as the downrange component of the relay satellite position. To illustrate the impact of observability problems, an example is given of two distinct satellite orbit pairs generating essentially the same data arc. The physical bases for unobservability are outlined and related to proposed TDRSS configurations. Results are relevant to any mission depending upon TDRSS to determine satellite state. The required mathematical analysis and the software used are described.

  17. Do the methods used to analyse missing data really matter? An examination of data from an observational study of Intermediate Care patients.

    PubMed

    Kaambwa, Billingsley; Bryan, Stirling; Billingham, Lucinda

    2012-06-27

    Missing data is a common statistical problem in healthcare datasets from populations of older people. Some argue that arbitrarily assuming the mechanism responsible for the missingness, and therefore the method for dealing with it, is not the best option, but is this always true? This paper explores what happens when extra information suggesting that a particular mechanism is responsible for missing data is disregarded and methods for dealing with the missing data are chosen arbitrarily. Regression models based on 2,533 intermediate care (IC) patients from the largest evaluation of IC done and published in the UK to date were used to explain variation in costs, EQ-5D and Barthel index. Three methods for dealing with missingness were utilised, each assuming a different mechanism as being responsible for the missing data: complete case analysis (assuming missing completely at random, MCAR), multiple imputation (assuming missing at random, MAR) and a Heckman selection model (assuming missing not at random, MNAR). Differences in results were gauged by examining the signs of coefficients as well as the sizes of both coefficients and associated standard errors. Extra information strongly suggested that missing cost data were MCAR. The results show that MCAR- and MAR-based methods yielded similar results, with the sizes of most coefficients and standard errors differing by less than 3.4%, while those based on MNAR methods were statistically different (up to 730% bigger). Significant variables in all regression models also had the same direction of influence on costs. All three mechanisms of missingness were shown to be potential causes of the missing EQ-5D and Barthel data. The method chosen to deal with missing data did not seem to have any significant effect on the results for these data, as they led to broadly similar conclusions, with sizes of coefficients and standard errors differing by less than 54% and 322%, respectively.
Arbitrary selection of methods to deal with missing data should be avoided. Using extra information gathered during the data collection exercise about the cause of missingness to guide this selection would be more appropriate.
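The MCAR case above can be illustrated with a toy simulation (entirely synthetic, not the IC data): under MCAR, complete case analysis recovers the true regression slope, while naively imputing the observed mean attenuates it.

```python
import random

def ols_slope(xs, ys):
    """Slope of a simple least-squares regression of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

random.seed(0)
x = [random.uniform(0, 10) for _ in range(2000)]
y = [2.0 * xi + random.gauss(0, 1) for xi in x]      # true slope = 2
miss = [random.random() < 0.4 for _ in x]            # MCAR: ~40% of y missing

obs_x = [xi for xi, m in zip(x, miss) if not m]
obs_y = [yi for yi, m in zip(y, miss) if not m]
slope_cc = ols_slope(obs_x, obs_y)                   # complete case: near 2

ymean = sum(obs_y) / len(obs_y)
y_imp = [ymean if m else yi for yi, m in zip(y, miss)]
slope_imp = ols_slope(x, y_imp)                      # mean imputation: attenuated
```

Because the imputed values carry no information about x, the imputed-data slope shrinks roughly in proportion to the fraction of cases observed.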

  18. Convolutional neural network features based change detection in satellite images

    NASA Astrophysics Data System (ADS)

    Mohammed El Amin, Arabi; Liu, Qingjie; Wang, Yunhong

    2016-07-01

    With the popular use of high resolution remote sensing (HRRS) satellite images, a huge research effort has been placed on the change detection (CD) problem. An effective feature selection method can significantly boost the final result. While it has proven difficult to hand-design features that effectively capture high- and mid-level representations, recent developments in machine learning (deep learning) avoid this problem by learning hierarchical representations in an unsupervised manner, directly from data and without human intervention. In this letter, we propose approaching the change detection problem from a feature learning perspective. A novel change detection method for HR satellite images, based on deep Convolutional Neural Network (CNN) features, is proposed. The main idea is to produce a change detection map directly from two images using a pretrained CNN, thereby avoiding the limited performance of hand-crafted features. First, CNN features are extracted through different convolutional layers. Then, after a normalization step, a concatenation step produces a single higher-dimensional feature map. Finally, a change map is computed using pixel-wise Euclidean distance. Our method has been validated on real bitemporal HRRS satellite images through qualitative and quantitative analyses. The results obtained confirm the interest of the proposed method.
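The final pixel-wise distance step is compact enough to sketch directly. Feature extraction and normalization are omitted here, and the nested-list H × W × C layout is an assumption.

```python
# Change map from two per-pixel feature maps via pixel-wise Euclidean
# distance (the last step of the pipeline described above).
def change_map(feat_a, feat_b):
    """feat_a, feat_b: H x W x C nested lists of per-pixel feature vectors."""
    return [[sum((a - b) ** 2 for a, b in zip(pa, pb)) ** 0.5
             for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(feat_a, feat_b)]
```

A threshold on the resulting map would then separate changed from unchanged pixels.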

  19. How to deal with missing longitudinal data in cost of illness analysis in Alzheimer's disease-suggestions from the GERAS observational study.

    PubMed

    Belger, Mark; Haro, Josep Maria; Reed, Catherine; Happich, Michael; Kahle-Wrobleski, Kristin; Argimon, Josep Maria; Bruno, Giuseppe; Dodel, Richard; Jones, Roy W; Vellas, Bruno; Wimo, Anders

    2016-07-18

    Missing data are a common problem in prospective studies with a long follow-up, and the volume, pattern and reasons for missing data may be relevant when estimating the cost of illness. We aimed to evaluate the effects of different methods for dealing with missing longitudinal cost data and for costing caregiver time on total societal costs in Alzheimer's disease (AD). GERAS is an 18-month observational study of costs associated with AD. Total societal costs included patient health and social care costs, and caregiver health and informal care costs. Missing data were classified as missing completely at random (MCAR), missing at random (MAR) or missing not at random (MNAR). Simulation datasets were generated from baseline data with 10-40 % missing total cost data for each missing data mechanism. Datasets were also simulated to reflect the missing cost data pattern at 18 months using MAR and MNAR assumptions. Naïve and multiple imputation (MI) methods were applied to each dataset and results compared with complete GERAS 18-month cost data. Opportunity and replacement cost approaches were used for caregiver time, which was costed with and without supervision included and with time for working caregivers only being costed. Total costs were available for 99.4 % of 1497 patients at baseline. For MCAR datasets, naïve methods performed as well as MI methods. For MAR, MI methods performed better than naïve methods. All imputation approaches were poor for MNAR data. For all approaches, percentage bias increased with missing data volume. For datasets reflecting 18-month patterns, a combination of imputation methods provided more accurate cost estimates (e.g. bias: -1 % vs -6 % for single MI method), although different approaches to costing caregiver time had a greater impact on estimated costs (29-43 % increase over base case estimate). 
Methods used to impute missing cost data in AD will impact the accuracy of cost estimates, although varying approaches to costing informal caregiver time have the greatest impact on total costs. Tailoring imputation methods to the reason for missing data will further our understanding of the best analytical approach for studies involving cost outcomes.

  20. Teachers' and Students' Perceptions of Students' Problem-Solving Difficulties in Physics: Implications for Remediation

    ERIC Educational Resources Information Center

    Ogunleye, Ayodele O.

    2009-01-01

    In recent times, science education researchers have identified many instruments for evaluating conceptual understanding as well as students' attitudes and beliefs about physics; unfortunately, there are no broad-based evaluation instruments in the field of problem-solving in physics. This missing tool is an indication of the complexity…

  1. Factors Contributing to the Problem of Student Absenteeism in a Rural School

    ERIC Educational Resources Information Center

    Durborow, Angela

    2017-01-01

    Student attendance would seem to be a vital link in measuring student success in school. If students are not in school, they miss instruction from the teacher. Without instruction it seems incredibly difficult to complete the work needed to pass classes and be successful in school. The research explored the problem of practice of student…

  2. Correlation of Electronic Health Records Use and Reduced Prevalence of Diabetes Co-Morbidities

    ERIC Educational Resources Information Center

    Eller, James D.

    2013-01-01

    The general problem is that Native American tribes have high prevalence rates of diabetes. The specific problem is that the failure of IHS sites to adopt EHR may cause health care providers to miss critical opportunities to improve screening and triage processes that result in quality improvement. The purpose of the quantitative correlational study was to…

  3. Mental health workers. Graduation daze.

    PubMed

    Lewis, Carol

    2003-09-11

    PCTs are likely to miss the national target on employment of graduate mental health workers. Pilots are showing success in reducing referrals. Managers must address career progression problems and define roles more clearly.

  4. Dentistry to the rescue of missing children: A review.

    PubMed

    Vij, Nitika; Kochhar, Gulsheen Kaur; Chachra, Sanjay; Kaur, Taranjot

    2016-01-01

    Today's society is becoming increasingly unsafe for children: we frequently hear about new incidents of missing children, which lead to emotional trauma for the loved ones and expose systemic failures of law and order. Parents can take extra precautions to ensure the safety of their children by educating them about ways to protect themselves and keep important records of the child such as updated color photographs, fingerprints, deoxyribonucleic acid (DNA) samples, etc., handy. However, in spite of all efforts, the problem of missing children still remains. Developments in the field of dentistry have empowered dentists with various tools and techniques to play a pivotal role in tracing a missing child. One such tool is Toothprints, a patented arch-shaped thermoplastic dental impression wafer developed by Dr. David Tesini, a paediatric dentist from Massachusetts. Toothprints enables a unique identification of the missing children not only through the bite impression but also through salivary DNA. Besides the use of Toothprints, a dentist can assist investigating agencies in identifying the missing children in multiple ways, including postmortem dental profiling, labeled dental fixtures, DNA extraction from teeth, and serial number engraving on the children's teeth. More importantly, all these tools cause minimal inconvenience to the individual, making a dentist's role in tracking a missing child even more significant. Thus, the simple discipline of maintaining timely dental records with the help of their dentists can save potential hassles for the parents in the future.

  5. Real-time positioning in logging: Effects of forest stand characteristics, topography, and line-of-sight obstructions on GNSS-RF transponder accuracy and radio signal propagation.

    PubMed

    Zimbelman, Eloise G; Keefe, Robert F

    2018-01-01

    Real-time positioning on mobile devices using global navigation satellite system (GNSS) technology paired with radio frequency (RF) transmission (GNSS-RF) may help to improve safety on logging operations by increasing situational awareness. However, GNSS positional accuracy for ground workers in motion may be reduced by multipath error, satellite signal obstruction, or other factors. Radio propagation of GNSS locations may also be impacted due to line-of-sight (LOS) obstruction in remote, forested areas. The objective of this study was to characterize the effects of forest stand characteristics, topography, and other LOS obstructions on the GNSS accuracy and radio signal propagation quality of multiple Raveon Atlas PT GNSS-RF transponders functioning as a network in a range of forest conditions. Because most previous research with GNSS in forestry has focused on stationary units, we chose to analyze units in motion by evaluating the time-to-signal accuracy of geofence crossings in 21 randomly-selected stands on the University of Idaho Experimental Forest. Specifically, we studied the effects of forest stand characteristics, topography, and LOS obstructions on (1) the odds of missed GNSS-RF signals, (2) the root mean squared error (RMSE) of Atlas PTs, and (3) the time-to-signal accuracy of safety geofence crossings in forested environments. Mixed-effects models used to analyze the data showed that stand characteristics, topography, and obstructions in the LOS affected the odds of missed radio signals while stand variables alone affected RMSE. Both stand characteristics and topography affected the accuracy of geofence alerts.

  6. Real-time positioning in logging: Effects of forest stand characteristics, topography, and line-of-sight obstructions on GNSS-RF transponder accuracy and radio signal propagation

    PubMed Central

    2018-01-01

    Real-time positioning on mobile devices using global navigation satellite system (GNSS) technology paired with radio frequency (RF) transmission (GNSS-RF) may help to improve safety on logging operations by increasing situational awareness. However, GNSS positional accuracy for ground workers in motion may be reduced by multipath error, satellite signal obstruction, or other factors. Radio propagation of GNSS locations may also be impacted due to line-of-sight (LOS) obstruction in remote, forested areas. The objective of this study was to characterize the effects of forest stand characteristics, topography, and other LOS obstructions on the GNSS accuracy and radio signal propagation quality of multiple Raveon Atlas PT GNSS-RF transponders functioning as a network in a range of forest conditions. Because most previous research with GNSS in forestry has focused on stationary units, we chose to analyze units in motion by evaluating the time-to-signal accuracy of geofence crossings in 21 randomly-selected stands on the University of Idaho Experimental Forest. Specifically, we studied the effects of forest stand characteristics, topography, and LOS obstructions on (1) the odds of missed GNSS-RF signals, (2) the root mean squared error (RMSE) of Atlas PTs, and (3) the time-to-signal accuracy of safety geofence crossings in forested environments. Mixed-effects models used to analyze the data showed that stand characteristics, topography, and obstructions in the LOS affected the odds of missed radio signals while stand variables alone affected RMSE. Both stand characteristics and topography affected the accuracy of geofence alerts. PMID:29324794
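At its core, a geofence alert of the kind timed in this study reduces to a predicate on consecutive position fixes. Below is a minimal planar sketch with an assumed circular geofence; the Atlas PT network behaviour studied above is, of course, more involved.

```python
import math

def inside(p, center, radius):
    """True if planar point p lies within the circular geofence."""
    return math.hypot(p[0] - center[0], p[1] - center[1]) <= radius

def entered_geofence(prev_fix, curr_fix, center, radius):
    """Alert condition: the previous fix was outside, the current one inside."""
    return (not inside(prev_fix, center, radius)
            and inside(curr_fix, center, radius))
```

Time-to-signal accuracy is then the delay between the first fix satisfying this predicate and the alert reaching other units.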

  7. Congenitally missing teeth (hypodontia): A review of the literature concerning the etiology, prevalence, risk factors, patterns and treatment

    PubMed Central

    Rakhshan, Vahid

    2015-01-01

    Congenitally missing teeth (CMT), or as usually called hypodontia, is a highly prevalent and costly dental anomaly. Besides an unfavorable appearance, patients with missing teeth may suffer from malocclusion, periodontal damage, insufficient alveolar bone growth, reduced chewing ability, inarticulate pronunciation and other problems. Treatment might be usually expensive and multidisciplinary. This highly frequent and yet expensive anomaly is of interest to numerous clinical, basic science and public health fields such as orthodontics, pediatric dentistry, prosthodontics, periodontics, maxillofacial surgery, anatomy, anthropology and even the insurance industry. This essay reviews the findings on the etiology, prevalence, risk factors, occurrence patterns, skeletal changes and treatments of congenitally missing teeth. It seems that CMT usually appears in females and in the permanent dentition. It is not conclusive whether it tends to occur more in the maxilla or mandible and also in the anterior versus posterior segments. It can accompany various complications and should be attended by expert teams as soon as possible. PMID:25709668

  8. On the Use of Local Assessments for Monitoring Centrally Reviewed Endpoints with Missing Data in Clinical Trials*

    PubMed Central

    Brummel, Sean S.; Gillen, Daniel L.

    2014-01-01

    Due to ethical and logistical concerns it is common for data monitoring committees to periodically monitor accruing clinical trial data to assess the safety, and possibly efficacy, of a new experimental treatment. When formalized, monitoring is typically implemented using group sequential methods. In some cases regulatory agencies have required that primary trial analyses should be based solely on the judgment of an independent review committee (IRC). The IRC assessments can produce difficulties for trial monitoring given the time lag typically associated with receiving assessments from the IRC. This results in a missing data problem wherein a surrogate measure of response may provide useful information for interim decisions and future monitoring strategies. In this paper, we present statistical tools that are helpful for monitoring a group sequential clinical trial with missing IRC data. We illustrate the proposed methodology in the case of binary endpoints under various missingness mechanisms including missing completely at random assessments and when missingness depends on the IRC’s measurement. PMID:25540717

  9. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955

  10. A research on the application of software defined networking in satellite network architecture

    NASA Astrophysics Data System (ADS)

    Song, Huan; Chen, Jinqiang; Cao, Suzhi; Cui, Dandan; Li, Tong; Su, Yuxing

    2017-10-01

    Software defined networking (SDN) is a new type of network architecture that decouples the control plane from the data plane of traditional networks, offers flexible configuration, and is a direction of next-generation terrestrial Internet development. The satellite network is an important part of the space-ground integrated information network, but traditional satellite networks suffer from difficult network topology maintenance and slow configuration. Applying SDN technology to satellite networks can solve these problems. At present, research on the application of SDN technology in satellite networks is still at a preliminary stage. In this paper, we start by introducing SDN technology and satellite network architecture. We then focus on software defined satellite network architectures, comparing the different proposed architectures and discussing satellite network virtualization. Finally, the present research status and development trends of SDN technology in satellite networks are analyzed.

  11. Quantum Heterogeneous Computing for Satellite Positioning Optimization

    NASA Astrophysics Data System (ADS)

    Bass, G.; Kumar, V.; Dulny, J., III

    2016-12-01

    Hard optimization problems occur in many fields of academic study and practical situations. We present results in which quantum heterogeneous computing is used to solve a real-world optimization problem: satellite positioning. Optimization problems like this can scale very rapidly with problem size, and become unsolvable with traditional brute-force methods. Typically, such problems have been approximately solved with heuristic approaches; however, these methods can take a long time to calculate and are not guaranteed to find optimal solutions. Quantum computing offers the possibility of producing significant speed-up and improved solution quality. There are now commercially available quantum annealing (QA) devices that are designed to solve difficult optimization problems. These devices have 1000+ quantum bits, but they have significant hardware size and connectivity limitations. We present a novel heterogeneous computing stack that combines QA and classical machine learning and allows the use of QA on problems larger than the quantum hardware could solve in isolation. We begin by analyzing the satellite positioning problem with a heuristic solver, the genetic algorithm. The classical computer's comparatively large available memory can explore the full problem space and converge to a solution relatively close to the true optimum. The QA device can then evolve directly to the optimal solution within this more limited space. Preliminary experiments, using the Quantum Monte Carlo (QMC) algorithm to simulate QA hardware, have produced promising results. Working with problem instances with known global minima, we find a solution within 8% in a matter of seconds, and within 5% in a few minutes. Future studies include replacing QMC with commercially available quantum hardware and exploring more problem sets and model parameters. 
Our results have important implications for how heterogeneous quantum computing can be used to solve difficult optimization problems in any field.
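The classical stage of the stack described above can be illustrated with a toy genetic algorithm. This is a minimal sketch, not the authors' implementation: the objective (spreading satellites evenly in longitude by minimizing the largest angular gap) and all parameters are hypothetical stand-ins for the real satellite-positioning problem.

```python
import random

# Hypothetical stand-in objective: place n satellites at longitudes
# (degrees) so as to minimize the largest angular gap between neighbours.
def fitness(longitudes):
    s = sorted(l % 360 for l in longitudes)
    gaps = [s[i + 1] - s[i] for i in range(len(s) - 1)]
    gaps.append(360 - s[-1] + s[0])  # wrap-around gap
    return -max(gaps)  # higher fitness = smaller worst gap

def genetic_algorithm(n_sats=4, pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 360) for _ in range(n_sats)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_sats)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                # Gaussian mutation
                i = rng.randrange(n_sats)
                child[i] = (child[i] + rng.gauss(0, 10)) % 360
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_algorithm()
# For 4 satellites the ideal spacing is 90 degrees between neighbours.
```

In the heterogeneous stack the abstract describes, a population like `best` would define the reduced search space handed to the quantum annealer for final refinement.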

  12. Satellite Photometric Error Determination

    DTIC Science & Technology

    2015-10-18

…the GEO Color Photometry Catalog (GCPC), which tried to remedy some of the problems listed above for geostationary satellites. In particular, we… Geostationary objects over the continental US… Develop new feature extraction methods such as albedo-area measurements… Understanding of seasonal… In Fig. 4 and Fig. 5 there are examples of color photometry signatures for a 3-axis stabilized geostationary satellite. The data were taken with a…

  13. Dynamics of the Uranian Rings

    NASA Technical Reports Server (NTRS)

    Dermott, S. F.

    1984-01-01

Some of the problems of the shepherding satellite model of Goldreich and Tremaine are discussed. The following topics are studied: (1) optical depths of all the observed narrow rings; (2) satellite and ring separation timescales; (3) ring edge sharpness; (4) shock formation in narrow rings; (5) the existence of small satellites near the Uranian rings; and (6) the apse and node alignments of the eccentric and inclined rings.

  14. A Permanent Magnet Hall Thruster for Orbit Control of Lunar Polar Satellites

    NASA Astrophysics Data System (ADS)

    Ferreira, Jose Leonardo; Silva Moraes, Bruno; Soares Ferreira, Ivan; Cardozo Mour, Decio; Winter, Othon

    Future Moon missions devoted to lunar surface remote sensing and to many other scientific exploration topics will require finer and higher-precision orbit control. It is well known that lunar satellites in polar orbits suffer a large increase in eccentricity due to the gravitational perturbation of the Earth. Without proper orbit correction the satellite's lifetime decreases and ends in a collision with the lunar surface. Many authors have pointed out that this effect is a natural consequence of the Lidov-Kozai resonance. In the present work, we propose a precise method of orbit eccentricity control based on the use of a low-thrust Hall plasma thruster. The proposed method is intended to keep the orbital eccentricity of the satellite at low values. A previous work on this subject used numerical integration of two systems: the three-body problem Moon-Earth-satellite and the four-body problem Moon-Earth-Sun-satellite [1]. In such simulations it is possible to follow the evolution of the satellite's eccentricity and to find empirical expressions for the length of time before the collision with the Moon occurs. In this work, a satellite orbit eccentricity control maneuver is proposed, based on working parameters of a low-thrust permanent magnet Hall plasma thruster (PMHT), which is being developed at the University of Brasilia, Brazil. We studied different arcs of active lunar satellite propulsion in order to introduce a correction of the eccentricity at each cycle. The calculations were made for a set of thrust values, from 0.1 N up to 0.4 N, which can be obtained with the PMHT. In each calculation we measured the amount of eccentricity correction provided by active propulsion. From these results we obtained empirical expressions for the time needed for the corrections as a function of the initial altitude and of the thrust value.
1. Winter, O. C. et al., Controlling the Eccentricity of Polar Lunar Orbits with Low Thrust Propulsion, Mathematical Problems in Engineering, special issue on Space Dynamics, 2009.

  15. Link Prediction in Criminal Networks: A Tool for Criminal Intelligence Analysis

    PubMed Central

    Berlusconi, Giulia; Calderoni, Francesco; Parolini, Nicola; Verani, Marco; Piccardi, Carlo

    2016-01-01

    The problem of link prediction has recently received increasing attention from scholars in network science. In social network analysis, one of its aims is to recover missing links, namely connections among actors which are likely to exist but have not been reported because data are incomplete or subject to various types of uncertainty. In the field of criminal investigations, problems of incomplete information are encountered almost by definition, given the obvious anti-detection strategies set up by criminals and the limited investigative resources. In this paper, we work on a specific dataset obtained from a real investigation, and we propose a strategy to identify missing links in a criminal network on the basis of the topological analysis of the links classified as marginal, i.e. removed during the investigation procedure. The main assumption is that missing links should have opposite features with respect to marginal ones. Measures of node similarity turn out to provide the best characterization in this sense. The inspection of the judicial source documents confirms that the predicted links, in most instances, do relate actors with large likelihood of co-participation in illicit activities. PMID:27104948
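The node-similarity scoring the abstract describes can be sketched with a Jaccard index over shared neighbours. The toy graph below is hypothetical (the investigation's network is not public), and Jaccard is only one of several similarity measures such studies compare:

```python
from itertools import combinations

# Hypothetical undirected toy network of actors.
edges = {("a", "b"), ("a", "c"), ("b", "c"), ("b", "d"), ("c", "d"), ("d", "e")}

def neighbors(graph_edges):
    nbrs = {}
    for u, v in graph_edges:
        nbrs.setdefault(u, set()).add(v)
        nbrs.setdefault(v, set()).add(u)
    return nbrs

def jaccard_scores(graph_edges):
    """Score each non-adjacent pair by neighbour overlap (Jaccard index):
    |N(u) & N(v)| / |N(u) | N(v)|. High scores flag candidate missing links."""
    nbrs = neighbors(graph_edges)
    scores = {}
    for u, v in combinations(sorted(nbrs), 2):
        if (u, v) in graph_edges or (v, u) in graph_edges:
            continue  # only score pairs not already connected
        union = nbrs[u] | nbrs[v]
        if union:
            scores[(u, v)] = len(nbrs[u] & nbrs[v]) / len(union)
    return scores

scores = jaccard_scores(edges)
best_pair = max(scores, key=scores.get)  # top candidate missing link
```

Here `a` and `d` share two of three combined neighbours, so the pair (`a`, `d`) ranks first among the unconnected pairs.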

  16. Getting patients in the door: medical appointment reminder preferences

    PubMed Central

    Crutchfield, Trisha M; Kistler, Christine E

    2017-01-01

    Purpose Between 23% and 34% of outpatient appointments are missed annually. Patients who frequently miss medical appointments have poorer health outcomes and are less likely to use preventive health care services. Missed appointments result in unnecessary costs and organizational inefficiencies. Appointment reminders may help reduce missed appointments; particular types may be more effective than others. We used a survey with a discrete choice experiment (DCE) to learn why individuals miss appointments and to assess appointment reminder preferences. Methods We enrolled a national sample of adults from an online survey panel to complete demographic and appointment habit questions as well as a 16-task DCE designed in Sawtooth Software’s Discover tool. We assessed preferences for four reminder attributes – initial reminder type, arrival of the initial reminder, reminder content, and number of reminders. We derived utilities and importance scores. Results We surveyed 251 adults nationally, with a mean age of 43 (range 18–83) years: 51% female, 84% White, and 8% African American. Twenty-three percent of individuals missed one or more appointments in the past 12 months. Two primary reasons given for missing an appointment were transportation problems (28%) and forgetfulness (26%). Participants indicated the initial reminder type (21%) was the most important attribute, followed by the number of reminders (10%). Overall, individuals indicated a preference for a single reminder, arriving via email, phone call, or text message, delivered less than 2 weeks prior to an appointment. Preferences for reminder content were less clear. Conclusion The number of missed appointments and reasons for missing appointments are consistent with prior research. Patient-centered appointment reminders may improve appointment attendance by addressing some of the reasons individuals report missing appointments and by meeting patients’ needs. 
Future research is necessary to determine if preferred reminders used in practice will result in improved appointment attendance in clinical settings. PMID:28182131

  17. Multiple imputation to deal with missing EQ-5D-3L data: Should we impute individual domains or the actual index?

    PubMed

    Simons, Claire L; Rivero-Arias, Oliver; Yu, Ly-Mee; Simon, Judit

    2015-04-01

    Missing data are a well-known and widely documented problem in cost-effectiveness analyses alongside clinical trials using individual patient-level data. Current methodological research recommends multiple imputation (MI) to deal with missing health outcome data, but there is little guidance on whether MI for multi-attribute questionnaires, such as the EQ-5D-3L, should be carried out at domain or at summary score level. In this paper, we evaluated the impact of imputing individual domains versus imputing index values to deal with missing EQ-5D-3L data using a simulation study, and developed recommendations for future practice. We simulated missing data in a patient-level dataset with complete EQ-5D-3L data at one point in time from a large multinational clinical trial (n = 1,814). Different proportions of missing data were generated using a missing at random (MAR) mechanism, and three different scenarios were studied. The performance of each method was evaluated using the root mean squared error and mean absolute error of the actual versus predicted EQ-5D-3L indices. In large sample sizes (n > 500) and with a missing data pattern that follows mainly unit non-response, imputing domains or the index produced similar results. However, domain imputation became more accurate than index imputation when the pattern of missingness followed item non-response. For smaller sample sizes (n < 100), index imputation was more accurate. When MI models were misspecified, both domain and index imputations were inaccurate for any proportion of missing data. The decision between imputing the domains or the EQ-5D-3L index scores depends on the observed missing data pattern and the sample size available for analysis. Analysts conducting this type of exercise should also evaluate the sensitivity of the analysis to the MAR assumption and whether the imputation model is correctly specified.
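The two imputation levels compared in the study can be illustrated with a deliberately simple mean-imputation stand-in. This is only a sketch: the paper uses multiple imputation, and the real EQ-5D-3L index is computed from a published tariff, not the toy linear scoring assumed here.

```python
# Toy (hypothetical) index function mapping five 1-3 domain levels to a
# score; NOT the official EQ-5D-3L tariff.
def toy_index(domains):
    return 1.0 - 0.1 * sum(d - 1 for d in domains)

# Complete responders (hypothetical data).
patients = [
    [1, 1, 2, 1, 1],
    [2, 2, 2, 1, 3],
    [1, 3, 1, 2, 2],
    [3, 1, 1, 1, 2],
]
# One partial responder: fifth domain unobserved (item non-response).
missing = [1, 1, 1, 1, None]

# Domain-level imputation: fill the missing domain with the observed
# column mean, then compute the index from the completed domains.
col_mean = sum(p[4] for p in patients) / len(patients)
domain_imputed = missing[:4] + [col_mean]
index_from_domains = toy_index(domain_imputed)

# Index-level imputation: ignore the four observed domains and impute
# the index directly from the complete responders' index values.
index_from_index = sum(toy_index(p) for p in patients) / len(patients)
```

With item non-response, domain-level imputation keeps the four observed (healthy) levels and yields 0.9, while index-level imputation discards them and yields the sample mean 0.675 — the intuition behind the paper's finding that domain imputation is more accurate under item non-response.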

  18. Using Deep Learning for Targeted Data Selection, Improving Satellite Observation Utilization for Model Initialization

    NASA Astrophysics Data System (ADS)

    Lee, Y. J.; Bonfanti, C. E.; Trailovic, L.; Etherton, B.; Govett, M.; Stewart, J.

    2017-12-01

    At present, a fraction of all satellite observations is ultimately used for model assimilation. The satellite data assimilation process is computationally expensive, and data are often reduced in resolution to allow timely incorporation into the forecast. This problem is only exacerbated by the recent launch of the Geostationary Operational Environmental Satellite (GOES)-16 and by future satellites providing several orders of magnitude increase in data volume. At the NOAA Earth System Research Laboratory (ESRL) we are researching the use of machine learning to improve the initial selection of satellite data to be used in the model assimilation process. In particular, we are investigating the use of deep learning. Deep learning is being applied to many image processing and computer vision problems with great success. Through our research, we are using a convolutional neural network to find and mark regions of interest (ROIs), leading to intelligent extraction of observations from satellite observation systems. These targeted observations will be used to improve the quality of data selected for model assimilation and ultimately improve the impact of satellite data on weather forecasts. Our preliminary efforts to identify the ROIs are focused in two areas: applying and comparing state-of-the-art convolutional neural network models using analysis data from the National Center for Environmental Prediction (NCEP) Global Forecast System (GFS) weather model, and using these results as a starting point to optimize a convolutional neural network model for pattern recognition on the higher-resolution water vapor data from GOES-West and other satellites. This presentation will provide an introduction to our convolutional neural network model to identify and process these ROIs, along with the challenges of data preparation, training the model, and parameter optimization.

  19. Materials on the International Space Station - Forward Technology Solar Cell Experiment

    NASA Technical Reports Server (NTRS)

    Walters, R. J.; Garner, J. C.; Lam, S. N.; Vazquez, J. A.; Braun, W. R.; Ruth, R. E.; Lorentzen, J. R.; Bruninga, R.; Jenkins, P. P.; Flatico, J. M.

    2005-01-01

    This paper describes a space solar cell experiment currently being built by the Naval Research Laboratory (NRL) in collaboration with NASA Glenn Research Center (GRC), and the US Naval Academy (USNA). The experiment has been named the Forward Technology Solar Cell Experiment (FTSCE), and the purpose is to rapidly put current and future generation space solar cells on orbit and provide validation data for these technologies. The FTSCE is being fielded in response to recent on-orbit and ground test anomalies associated with space solar arrays that have raised concern over the survivability of new solar technologies in the space environment and the validity of present ground test protocols. The FTSCE is being built as part of the Fifth Materials on the International Space Station (MISSE) Experiment (MISSE-5), which is a NASA program to characterize the performance of new prospective spacecraft materials when subjected to the synergistic effects of the space environment. Telemetry, command, control, and communication (TNC) for the FTSCE will be achieved through the Amateur Satellite Service using the PCSat2 system, which is an Amateur Radio system designed and built by the USNA. In addition to providing an off-the-shelf solution for FTSCE TNC, PCSat2 will provide a communications node for the Amateur Radio satellite system. The FTSCE and PCSat2 will be housed within the passive experiment container (PEC), which is an approximately 2 ft x 2 ft x 4 in metal container built by NASA Langley Research Center (NASA LaRC) as part of the MISSE-5 program. NASA LaRC has also supplied a thin film materials experiment that will fly on the exterior of the thermal blanket covering the PCSat2. The PEC is planned to be transported to the ISS on a Shuttle flight. The PEC will be mounted on the exterior of the ISS by an astronaut during an extravehicular activity (EVA). After nominally one year, the PEC will be retrieved and returned to Earth. 
At the time of writing this paper, the subsystems of the experiment are being integrated at NRL, and we are preparing to commence environmental testing.

  20. Quarantine constraints as applied to satellites.

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.; Stavro, W.; Gonzalez, C.

    1973-01-01

    Plans for unmanned missions to planets beyond Mars in the 1970s include satellite encounters. Recently published observations of data for Titan, a satellite of Saturn, indicate that conditions may be hospitable for the growth of microorganisms. Therefore, the problem of satisfying possible quarantine constraints for outer planet satellites was investigated. This involved determining the probability of impacting a satellite of Jupiter or Saturn by a spacecraft for a planned satellite encounter during an outer planet mission. Mathematical procedures were formulated which (1) determine the areas in the aim-plane that would result in trajectories that impact the satellite and (2) provide a technique for numerically integrating the navigation error function over the impact area to obtain impact probabilities. The results indicate which of the planned spacecraft trajectory correction maneuvers are most critical in terms of satellite quarantine violation.

  1. Introducing consultant outpatient clinics to community settings to improve access to paediatrics: an observational impact study.

    PubMed

    McLeod, Hugh; Heath, Gemma; Cameron, Elaine; Debelle, Geoff; Cummins, Carole

    2015-06-01

    In line with a national policy to move care 'closer to home', a specialist children's hospital in the National Health Service in England introduced consultant-led 'satellite' clinics to two community settings for general paediatric outpatient services. Objectives were to reduce non-attendance at appointments by providing care in more accessible locations and to create new physical clinic capacity. This study evaluated these satellite clinics to inform further development and identify lessons for stakeholders. Impact of the satellite clinics was assessed by comparing community versus hospital-based clinics across the following measures: (1) non-attendance rates and associated factors (including patient characteristics and travel distance) using a logistic regression model; (2) percentage of appointments booked within local catchment area; (3) contribution to total clinic capacity; (4) time allocated to clinics and appointments; and (5) clinic efficiency, defined as the ratio of income to staff-related costs. Satellite clinics did not increase attendance beyond their contribution to shorter travel distance, which was associated with higher attendance. Children living in the most-deprived areas were 1.8 times more likely to miss appointments compared with those from least-deprived areas. The satellite clinics' contribution to activity in catchment areas and to total capacity was small. However, one of the two satellite clinics was efficient compared with most hospital-based clinics. Outpatient clinics were relocated in pragmatically chosen community settings using a 'drag and drop' service model. Such clinics have potential to improve access to specialist paediatric healthcare, but do not provide a panacea. Work is required to improve attendance as part of wider efforts to support vulnerable families. Satellite clinics highlight how improved management could contribute to better use of existing capacity. Published by the BMJ Publishing Group Limited. 

  2. Comparison of methods for dealing with missing values in the EPV-R.

    PubMed

    Paniagua, David; Amor, Pedro J; Echeburúa, Enrique; Abad, Francisco J

    2017-08-01

    The development of an effective instrument to assess the risk of partner violence is a topic of great social relevance. This study evaluates the “Predicción del Riesgo de Violencia Grave Contra la Pareja – Revisada” (EPV-R, Severe Intimate Partner Violence Risk Prediction Scale-Revised), a tool developed in Spain that faces the problem, common in this type of scale, of how to treat a high rate of missing values. First, responses to the EPV-R in a sample of 1215 male abusers who were reported to the police were used to analyze the patterns of occurrence of missing values, as well as the factor structure. Second, we analyzed the performance of various imputation methods using simulated data that emulate the missing data mechanism found in the empirical database. The imputation procedure originally proposed by the authors of the scale provides acceptable results, although a method based on Item Response Theory could provide greater accuracy and offers some additional advantages. Item Response Theory appears to be a useful tool for imputing missing data in this type of questionnaire.

  3. A stochastic multiple imputation algorithm for missing covariate data in tree-structured survival analysis.

    PubMed

    Wallace, Meredith L; Anderson, Stewart J; Mazumdar, Sati

    2010-12-20

    Missing covariate data present a challenge to tree-structured methodology due to the fact that a single tree model, as opposed to an estimated parameter value, may be desired for use in a clinical setting. To address this problem, we suggest a multiple imputation algorithm that adds draws of stochastic error to a tree-based single imputation method presented by Conversano and Siciliano (Technical Report, University of Naples, 2003). Unlike previously proposed techniques for accommodating missing covariate data in tree-structured analyses, our methodology allows the modeling of complex and nonlinear covariate structures while still resulting in a single tree model. We perform a simulation study to evaluate our stochastic multiple imputation algorithm when covariate data are missing at random and compare it to other currently used methods. Our algorithm is advantageous for identifying the true underlying covariate structure when complex data and larger percentages of missing covariate observations are present. It is competitive with other current methods with respect to prediction accuracy. To illustrate our algorithm, we create a tree-structured survival model for predicting time to treatment response in older, depressed adults. Copyright © 2010 John Wiley & Sons, Ltd.

  4. An Overview and Evaluation of Recent Machine Learning Imputation Methods Using Cardiac Imaging Data.

    PubMed

    Liu, Yuzhe; Gopalakrishnan, Vanathi

    2017-03-01

    Many clinical research datasets have a large percentage of missing values that directly impacts their usefulness in yielding high accuracy classifiers when used for training in supervised machine learning. While missing value imputation methods have been shown to work well with smaller percentages of missing values, their ability to impute sparse clinical research data can be problem specific. We previously attempted to learn quantitative guidelines for ordering cardiac magnetic resonance imaging during the evaluation for pediatric cardiomyopathy, but missing data significantly reduced our usable sample size. In this work, we sought to determine if increasing the usable sample size through imputation would allow us to learn better guidelines. We first review several machine learning methods for estimating missing data. Then, we apply four popular methods (mean imputation, decision tree, k-nearest neighbors, and self-organizing maps) to a clinical research dataset of pediatric patients undergoing evaluation for cardiomyopathy. Using Bayesian Rule Learning (BRL) to learn ruleset models, we compared the performance of imputation-augmented models versus unaugmented models. We found that all four imputation-augmented models performed similarly to unaugmented models. While imputation did not improve performance, it did provide evidence for the robustness of our learned models.
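One of the four methods the review compares, k-nearest-neighbours imputation, can be sketched in a few lines. This is a minimal illustration with hypothetical data, not the authors' pipeline; real implementations (e.g. scaled features, weighted neighbours) add refinements omitted here.

```python
from math import sqrt, isnan

NAN = float("nan")  # marker for a missing cell

def knn_impute(rows, k=2):
    """Fill each missing cell with the mean of that column over the k
    rows nearest in the columns observed in both rows (Euclidean)."""
    filled = [row[:] for row in rows]
    for i, row in enumerate(rows):
        for j, val in enumerate(row):
            if not isnan(val):
                continue
            dists = []
            for m, other in enumerate(rows):
                if m == i or isnan(other[j]):
                    continue  # donor must have this column observed
                shared = [(a, b) for a, b in zip(row, other)
                          if not isnan(a) and not isnan(b)]
                if shared:
                    d = sqrt(sum((a - b) ** 2 for a, b in shared))
                    dists.append((d, other[j]))
            nearest = sorted(dists)[:k]
            filled[i][j] = sum(v for _, v in nearest) / len(nearest)
    return filled

data = [
    [1.0, 2.0, 3.0],
    [1.1, 2.1, 3.2],
    [5.0, 6.0, 7.0],
    [1.05, 2.05, NAN],   # third feature missing for this patient
]
imputed = knn_impute(data, k=2)
```

The last row is closest to the first two rows in the observed columns, so its missing value is imputed as the mean of their third features rather than being pulled toward the distant outlier row.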

  5. Artificial Limbs

    MedlinePlus

    … a number of reasons. Common ones include: circulation problems from atherosclerosis or diabetes, which may cause you to need an amputation; traumatic injuries, including from traffic accidents and military combat; cancer; and birth defects. If you are missing …

  6. High voltage space plasma interactions. [charging the solar power satellites

    NASA Technical Reports Server (NTRS)

    Mccoy, J. E.

    1980-01-01

    Two primary problems result from plasma interactions: one of concern to operations in geosynchronous orbit (GEO), the other in low Earth orbit (LEO). The two problems are not the same. Spacecraft charging has become widely recognized as a problem, particularly for communications satellites operating in GEO. The very thin thermal plasmas at GEO are insufficient to bleed off voltage buildups due to higher-energy charged-particle radiation collected on outer surfaces. The resulting differential charging/discharging causes electrical transients, spurious command signals, and possible direct overload damage. An extensive NASA/Air Force program has been underway for several years to address this problem. At lower altitudes, the denser plasmas of the plasmasphere/ionosphere provide sufficient thermal current to limit such charging to a few volts or less. Unfortunately, these thermal plasma currents, which solve the GEO spacecraft charging problem, can become large enough to cause just the opposite problem in LEO.

  7. Cosmological simulations of decaying dark matter: implications for small-scale structure of dark matter haloes

    NASA Astrophysics Data System (ADS)

    Wang, Mei-Yu; Peter, Annika H. G.; Strigari, Louis E.; Zentner, Andrew R.; Arant, Bryan; Garrison-Kimmel, Shea; Rocha, Miguel

    2014-11-01

    We present a set of N-body simulations of a class of models in which an unstable dark matter particle decays into a stable dark matter particle and a non-interacting light particle, with decay lifetime comparable to the Hubble time. We study the effects of the recoil kick velocity (V_k) received by the stable dark matter on the structures of dark matter haloes ranging from galaxy-cluster to Milky Way-mass scales. For Milky Way-mass haloes, we use high-resolution, zoom-in simulations to explore the effects of decays on Galactic substructure. In general, haloes with circular velocities comparable to the magnitude of the kick velocity are most strongly affected by decays. We show that models with lifetimes Γ^{-1} ~ H_0^{-1} and recoil speeds V_k ~ 20-40 km s^{-1} can significantly reduce both the abundance of Galactic subhaloes and their internal densities. We find that decaying dark matter models that do not violate current astrophysical constraints can significantly mitigate both the `missing satellites problem' and the more recent `too big to fail problem'. These decaying models predict significant time evolution of haloes, which implies that at high redshifts they exhibit a sequence of structure formation similar to that of cold dark matter. Thus, decaying dark matter models are significantly less constrained by high-redshift phenomena than warm dark matter models. We conclude that models of decaying dark matter make predictions that are relevant for the interpretation of observations of small galaxies in the Local Group and can be tested by forthcoming large-scale surveys.

  8. Problems related to menstruation amongst adolescent girls.

    PubMed

    Sharma, Pragya; Malhotra, Chetna; Taneja, D K; Saha, Renuka

    2008-02-01

    To study the types and frequency of problems related to menstruation in adolescent girls and the effect of these problems on daily routine. Girls in the age group 13-19 years who had had menarche for at least one year at the time of study. 198 adolescent girls were studied. Data were collected by personal interviews using a pre-tested, semi-structured questionnaire. The questions covered menstrual problems, regularity of menses in the last three menstrual cycles, and the effect of these problems on the daily routine. Analysis was done using SPSS version 12; percentages were calculated for drawing inferences. More than a third (35.9%) of the study subjects were in the age group 13-15 years, followed by the 17-19 and 15-17 year groups. The mean age of the study participants was 16.2 years. Dysmenorrhea (67.2%) was the commonest problem, and 63.1% had one or more symptoms of premenstrual syndrome (PMS). Other related problems were present in 55.1% of study subjects. The daily routine of 60% of the girls was affected through prolonged bed rest, missed social activities or commitments, disturbed sleep, and decreased appetite. 17.24% had to miss a class and 25% had to abstain from work. Mothers and friends were the most common sources of information on the issue. Screen adolescent girls for menstruation-related problems and provide them with counseling services and relevant information on possible treatment options. Besides, there is a need to emphasize designing menstrual health programmes for adolescents.

  9. Bayesian Network Structure Learning for Urban Land Use Classification from Landsat ETM+ and Ancillary Data

    NASA Astrophysics Data System (ADS)

    Park, M.; Stenstrom, M. K.

    2004-12-01

    Recognizing urban information from satellite imagery is problematic due to the diverse features and dynamic changes of urban land use. The use of Landsat imagery for urban land use classification involves inherent uncertainty due to its spatial resolution and the low separability among land uses. To resolve this uncertainty problem, we investigated the performance of Bayesian networks for classifying urban land use, since Bayesian networks provide a quantitative way of handling uncertainty and have been successfully used in many areas. In this study, we developed optimized networks for urban land use classification from Landsat ETM+ images of the Marina del Rey area based on USGS land cover/use classification level III. The networks started from a tree structure based on mutual information between variables, then added links to improve accuracy. This methodology offers several advantages: (1) The network structure shows the dependency relationships between variables. The class node value can be predicted even with particular band information missing due to sensor system error; the missing information can be inferred from other dependent bands. (2) The network structure provides information about which variables are important for the classification, which is not available from conventional classification methods such as neural networks and maximum likelihood classification. In our case, for example, bands 1, 5 and 6 are the most important inputs in determining the land use of each pixel. (3) The networks can be reduced to those input variables important for classification, which shrinks the problem by not considering all possible variables. We also examined the effect of incorporating ancillary data: geospatial information, such as the X and Y coordinate values of each pixel and DEM data, and vegetation indices, such as NDVI and the Tasseled Cap transformation. 
The results showed that the locational information improved overall accuracy (81%) and the kappa coefficient (76%), and lowered the omission and commission errors compared with using only spectral data (accuracy 71%, kappa coefficient 62%). Incorporating DEM data did not significantly improve overall accuracy (74%) or the kappa coefficient (66%), but lowered the omission and commission errors. Incorporating NDVI did little to improve the overall accuracy (72%) and kappa coefficient (65%). Including the Tasseled Cap transformation reduced the accuracy (accuracy 70%, kappa 61%). Therefore, the additional information from the DEM and vegetation indices was not as useful as the locational ancillary data.
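The mutual-information quantity that seeds the tree structure (Chow-Liu style) can be computed directly for discretized band values. The pixel data below are hypothetical two-level toy bands, not the Landsat data:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length discrete sequences:
    sum over (x, y) of p(x, y) * log2(p(x, y) / (p(x) * p(y)))."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Hypothetical discretized pixel values for three spectral bands.
band1 = [0, 0, 1, 1, 0, 0, 1, 1]
band2 = [0, 0, 1, 1, 0, 0, 1, 1]   # identical to band1: MI = 1 bit
band3 = [0, 1, 0, 1, 0, 1, 0, 1]   # independent of band1: MI = 0 bits

mi_identical = mutual_information(band1, band2)
mi_independent = mutual_information(band1, band3)
```

A Chow-Liu-style construction would compute these scores for every band pair and keep a maximum-weight spanning tree as the starting network.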

  10. Calculation of double-lunar swingby trajectories: Part 2: Numerical solutions in the restricted problem of three bodies

    NASA Technical Reports Server (NTRS)

    Stalos, S.

    1990-01-01

    The double-lunar swingby trajectory is a method for maintaining alignment of an Earth satellite's line of apsides with the Sun-Earth line. From a Keplerian point of view, successive close encounters with the Moon cause discrete, instantaneous changes in the satellite's eccentricity and semimajor axis. Numerical solutions of the planar restricted three-body problem that correspond to double-lunar swingby trajectories are identified. The method of solution is described and the results are compared to the Keplerian formulation.
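A minimal sketch of the kind of numerical solution described: the planar circular restricted three-body problem integrated in the rotating frame with a fixed-step RK4 (nondimensional units). The initial condition is hypothetical, an orbit about the Earth rather than a double-lunar swingby; conservation of the Jacobi constant provides a built-in accuracy check.

```python
from math import sqrt

MU = 0.01215  # Moon/(Earth+Moon) mass ratio; primaries at (-MU,0), (1-MU,0)

def deriv(state):
    """Rotating-frame equations of motion for state [x, y, vx, vy]."""
    x, y, vx, vy = state
    r1 = sqrt((x + MU) ** 2 + y ** 2)       # distance to Earth
    r2 = sqrt((x - 1 + MU) ** 2 + y ** 2)   # distance to Moon
    ax = x + 2 * vy - (1 - MU) * (x + MU) / r1**3 - MU * (x - 1 + MU) / r2**3
    ay = y - 2 * vx - (1 - MU) * y / r1**3 - MU * y / r2**3
    return [vx, vy, ax, ay]

def rk4_step(state, h):
    k1 = deriv(state)
    k2 = deriv([s + 0.5 * h * k for s, k in zip(state, k1)])
    k3 = deriv([s + 0.5 * h * k for s, k in zip(state, k2)])
    k4 = deriv([s + h * k for s, k in zip(state, k3)])
    return [s + h / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def jacobi_constant(state):
    """C = x^2 + y^2 + 2(1-mu)/r1 + 2 mu/r2 - v^2, conserved along orbits."""
    x, y, vx, vy = state
    r1 = sqrt((x + MU) ** 2 + y ** 2)
    r2 = sqrt((x - 1 + MU) ** 2 + y ** 2)
    return x**2 + y**2 + 2 * (1 - MU) / r1 + 2 * MU / r2 - vx**2 - vy**2

state = [0.5, 0.0, 0.0, 0.8]   # hypothetical initial condition near Earth
c0 = jacobi_constant(state)
for _ in range(2000):           # integrate to t = 2 (about one orbit)
    state = rk4_step(state, 1e-3)
drift = abs(jacobi_constant(state) - c0)
```

With this step size the Jacobi constant drifts by far less than 1e-6 over the integration, a quick sanity check on the integrator before studying swingby geometry.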

  11. Techniques for computing regional radiant emittances of the earth-atmosphere system from observations by wide-angle satellite radiometers, phase 3

    NASA Technical Reports Server (NTRS)

    Pina, J. F.; House, F. B.

    1975-01-01

    Radiometers on Earth-orbiting satellites measure the exchange of radiant energy between the earth-atmosphere (E-A) system and space at observation points external to the E-A system. Observations by wide-angle spherical and flat radiometers are analyzed and interpreted with regard to the general problem of the earth energy budget (EEB) and to the problem of determining the energy budget of regions smaller than the field of view (FOV) of these radiometers.

  12. Implications for the UK of solar-power satellites (SPS) as an energy source

    NASA Technical Reports Server (NTRS)

    Shelton, R. M.

    1980-01-01

    The solar power satellite concept, which would make the Sun's radiation available on Earth as a source of energy, is discussed. Attention is given to the concept currently under evaluation in the USA and also, to a lesser extent, in Europe. The advantages and problems associated with its adoption by the UK as a major source of electrical energy are discussed. The discussion covers topics such as sizing, the reference system, construction, costs, and problem areas.

  13. Insufficient sleep rather than the apnea-hypopnea index can be associated with sleepiness-related driving problems of Japanese obstructive sleep apnea syndrome patients residing in metropolitan areas.

    PubMed

    Matsui, Kentaro; Sasai-Sakuma, Taeko; Ishigooka, Jun; Inoue, Yuichi

    2017-05-01

    Obstructive sleep apnea syndrome (OSAS) and insufficient sleep might increase the risk of drowsy driving and sleepiness-related vehicular accidents. This study retrospectively investigated the factors associated with these driving problems, particularly addressing OSAS severity and sleep amounts of affected drivers. This study examined 161 patients (146 male and 15 female) with OSAS (apnea-hypopnea index [AHI] ≥ 5) who drove on a routine basis and who completed study questionnaires. To investigate factors associated with drowsy driving during the prior year and sleepiness-related vehicular accidents or near-miss events during the prior five years, logistic regression analyses were performed with age, body mass index, monthly driving distance, habitual sleep duration on weekdays, the Japanese version of Epworth Sleepiness Scale score, AHI, and periodic limb movement index as independent variables. Of the patients, 68 (42.2%) reported drowsy driving experiences, and 86 (53.4%) reported sleepiness-related vehicular accidents or near-miss events. Analyses revealed the following: older age (46-65 years, ≥66 years) was negatively associated with drowsy driving (p <0.05, p <0.05), and habitually shorter sleep duration on weekdays (≤6 hours) was positively associated with drowsy driving (p <0.01). Habitual sleep duration of ≤6 hours (p <0.01) and Epworth Sleepiness Scale score of ≥11 (p <0.01) were positively associated with sleepiness-related vehicular accidents and near-miss events. However, AHI was not associated with these driving problems. Insufficient sleep, rather than severity of OSAS, was associated with sleepiness-related driving problems in these Japanese OSAS patients. Copyright © 2016 Elsevier B.V. All rights reserved.
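    The study's core analysis is a multivariate logistic regression relating binary risk factors to driving outcomes. The odds-ratio logic can be sketched as follows on synthetic data; the two predictors stand in for "habitual sleep ≤ 6 h" and "ESS ≥ 11", and all coefficients and sample sizes are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for two of the study's predictors (values invented):
    # x1 = 1 if habitual weekday sleep <= 6 h, x2 = 1 if ESS score >= 11.
    n = 400
    x1 = rng.integers(0, 2, n)
    x2 = rng.integers(0, 2, n)
    logit = -1.0 + 1.2 * x1 + 0.9 * x2        # assumed "true" effects
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    X = np.column_stack([np.ones(n), x1, x2])  # intercept + predictors
    beta = np.zeros(3)
    for _ in range(5000):                      # plain gradient ascent on the log-likelihood
        p = 1 / (1 + np.exp(-X @ beta))
        beta += 0.01 * X.T @ (y - p) / n

    odds_ratios = np.exp(beta[1:])             # OR > 1 => positive association with the outcome
    ```

    A dedicated statistics package would also report confidence intervals and p-values; the point here is only that exponentiated logistic coefficients are the odds ratios the abstract reasons with.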

  14. Albinism

    MedlinePlus

    ... normal skin and hair Patches of missing skin color Many forms of albinism are associated with the following symptoms: Crossed eyes Light sensitivity Rapid eye movements Vision problems, or functional blindness Exams and Tests Genetic testing offers the most ...

  15. Tooth formation - delayed or absent

    MedlinePlus

    ... missing teeth they never developed. Cosmetic or orthodontic dentistry can correct this problem. Causes Specific diseases can ... process. In: Dean JA, ed. McDonald and Avery's Dentistry for the Child and Adolescent . 10th ed. St. ...

  16. Multi-view non-negative tensor factorization as relation learning in healthcare data.

    PubMed

    Hang Wu; Wang, May D

    2016-08-01

    Discovering patterns in co-occurrence data between objects and groups of concepts is a useful task in many domains, such as healthcare data analysis, information retrieval, and recommender systems. These relational representations come from objects' behaviors in different views, posing a challenging task of integrating information from these views to uncover the shared latent structures. The problem is further complicated by the high dimension of data and the large ratio of missing data. We propose a new paradigm of learning semantic relations using tensor factorization, by jointly factorizing multi-view tensors and searching for a consistent underlying semantic space across views. We formulate the idea as an optimization problem and propose efficient optimization algorithms, with special treatment of missing data as well as high-dimensional data. Experimental results show the potential and effectiveness of our algorithms.
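    The "special treatment of missing data" can be illustrated with a matrix simplification of the multi-view tensor model: non-negative factorization with multiplicative updates evaluated only on observed entries via a binary mask. The data, rank, and iteration counts below are invented; this is a sketch of the masking idea, not the paper's algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy co-occurrence matrix with missing entries (mask M: 1 = observed).
    W_true = rng.random((30, 4))
    H_true = rng.random((4, 20))
    X = W_true @ H_true
    M = (rng.random(X.shape) < 0.7).astype(float)   # ~30% of entries missing

    # Masked multiplicative updates for non-negative factorization X ~ W H,
    # fitting only the observed entries.
    W = rng.random((30, 4)) + 0.1
    H = rng.random((4, 20)) + 0.1
    eps = 1e-9
    for _ in range(300):
        W *= ((M * X) @ H.T) / ((M * (W @ H)) @ H.T + eps)
        H *= (W.T @ (M * X)) / (W.T @ (M * (W @ H)) + eps)

    # Relative reconstruction error on the observed entries only.
    err_obs = np.linalg.norm(M * (X - W @ H)) / np.linalg.norm(M * X)
    ```

    The same masking pattern generalizes to tensors by applying it inside each mode's update; the factors also impute the missing entries via `W @ H`.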

  17. Case report of unexplained hypocalcaemia in a slightly haemolysed sample.

    PubMed

    Cornes, Michael

    2017-06-15

    The case presented highlights a common pre-analytical problem identified in the laboratory that was initially missed. It concerns a young, generally healthy adult patient with no significant medical or family history. They presented with common flu-like symptoms to their primary care clinician, who considered this most likely a viral illness that would pass with time. The clinician, however, ordered some routine bloods to reassure the patient despite the lack of clinical indication. When analysed, the sample was haemolysed, with a strikingly low calcium. This led to the patient being called into hospital for urgent repeat investigations, all of which turned out to be within normal ranges. On further investigation the original sample was found to be contaminated. This result would normally have been flagged but was missed due to the complication of haemolysis.

  18. Accelerating electron tomography reconstruction algorithm ICON with GPU.

    PubMed

    Chen, Yu; Wang, Zihao; Zhang, Jingrong; Li, Lun; Wan, Xiaohua; Sun, Fei; Zhang, Fa

    2017-01-01

    Electron tomography (ET) plays an important role in studying in situ cell ultrastructure in three-dimensional space. Due to limited tilt angles, ET reconstruction always suffers from the "missing wedge" problem. With a validation procedure, iterative compressed-sensing optimized NUFFT reconstruction (ICON) demonstrates its power in the restoration of validated missing information for low-SNR biological ET datasets. However, the huge computational demand has become a major obstacle to the application of ICON. In this work, we analyzed the framework of ICON and classified the operations of the major steps of ICON reconstruction into three types. Accordingly, we designed parallel strategies and implemented them on graphics processing units (GPUs) to produce a parallel program, ICON-GPU. With high accuracy, ICON-GPU achieves a large speedup over its CPU version, up to 83.7×, greatly relieving ICON's dependence on computing resources.
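    The "missing wedge" itself is easy to demonstrate: with tilt angles limited to, say, ±60°, a wedge of Fourier coefficients is never measured, and zeroing it distorts the reconstruction. The sketch below (a toy 2D phantom, not ICON's compressed-sensing recovery) shows the effect.

    ```python
    import numpy as np

    # Toy illustration of the "missing wedge": zeroing the Fourier samples in a
    # +/-30 degree wedge (tilt range limited to +/-60 degrees) distorts the image.
    n = 64
    img = np.zeros((n, n))
    img[24:40, 24:40] = 1.0                        # simple square "specimen"

    F = np.fft.fftshift(np.fft.fft2(img))
    ky, kx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
    angle = np.degrees(np.arctan2(ky, kx))         # direction of each frequency sample

    mask = np.abs(np.abs(angle) - 90) < 30         # wedge of directions near the ky axis
    F_wedge = F.copy()
    F_wedge[mask] = 0                              # discard the unmeasured coefficients
    recon = np.real(np.fft.ifft2(np.fft.ifftshift(F_wedge)))

    loss = np.linalg.norm(img - recon) / np.linalg.norm(img)   # relative distortion
    ```

    Methods like ICON aim to restore exactly this zeroed wedge, subject to sparsity constraints; the GPU work in the abstract parallelizes that iterative restoration.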

  19. Oceanography from space

    NASA Technical Reports Server (NTRS)

    Stewart, R. H.

    1982-01-01

    Active and passive spaceborne instruments that can observe the sea are discussed. Attention is given to satellite observations of ocean surface temperature and heating, wind speed and direction, ocean currents, wave height, ocean color, and sea ice. Specific measurements now being made from space are described, the accuracy of various instruments is considered, and problems associated with the analysis of satellite data are examined. It is concluded that the satellites and techniques used by different nations should be sufficiently standard that data from one satellite can be directly compared with data from another and that accurate calibration and overlap of satellite data are necessary to confirm the continuity and homogeneity of the data.

  20. Quad-Tree Visual-Calculus Analysis of Satellite Coverage

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.; Hockney, George; Kwan, Bruce

    2003-01-01

    An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.
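    The quad-tree idea from the satellite-to-ground perspective can be sketched as adaptive subdivision of latitude/longitude cells: cells whose corners agree are resolved at once, and only mixed cells are split. The footprint model, radii, and depths below are invented for illustration and ignore map-projection area weighting.

    ```python
    import math

    def covered(lat, lon, sat=(0.0, 0.0), radius_km=2000.0):
        """True if a ground point lies inside a (hypothetical) satellite footprint,
        modelled as a great-circle disc around the sub-satellite point."""
        R = 6371.0
        p1, p2 = math.radians(lat), math.radians(sat[0])
        dlon = math.radians(lon - sat[1])
        cos_d = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dlon)
        return math.acos(min(1.0, max(-1.0, cos_d))) * R <= radius_km

    def coverage_fraction(lat0, lat1, lon0, lon1, depth=6):
        """Quad-tree estimate of the covered fraction of a lat/lon cell.
        All-covered cells resolve immediately; other cells are split (even
        all-uncovered ones, since the disc may lie wholly inside a big cell)."""
        corners = [covered(la, lo) for la in (lat0, lat1) for lo in (lon0, lon1)]
        if all(corners):
            return 1.0
        if depth == 0:
            return sum(corners) / 4.0
        lam, lom = (lat0 + lat1) / 2, (lon0 + lon1) / 2
        return 0.25 * (coverage_fraction(lat0, lam, lon0, lom, depth - 1) +
                       coverage_fraction(lat0, lam, lom, lon1, depth - 1) +
                       coverage_fraction(lam, lat1, lon0, lom, depth - 1) +
                       coverage_fraction(lam, lat1, lom, lon1, depth - 1))

    frac = coverage_fraction(-30.0, 30.0, -30.0, 30.0)   # cell straddling the footprint
    ```

    The adaptive refinement is what makes the satellite-to-ground formulation fast: uniform-depth sampling would spend the same effort on cells the footprint never touches.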

  1. Small satellites and space debris issues

    NASA Astrophysics Data System (ADS)

    Yakovlev, M.; Kulik, S.; Agapov, V.

    2001-10-01

    The objective of this report is the analysis of tendencies in the design of small satellites (SS) and the effect of small satellites on the space debris population. It is shown that SS, including nano- and pico-satellites, should be considered a particularly dangerous source of space debris when elaborating international standards and legal documents concerning the space debris problem, in particular the "International Space Debris Mitigation Standard". These issues are in accordance with the IADC goals in its main activity areas and should be carefully considered within the IADC framework.

  2. Payload spin assembly for the commercial Titan launch vehicle

    NASA Technical Reports Server (NTRS)

    Robinson, Wilf; Pech, Greg

    1991-01-01

    A contract was completed to design, build, and test a Payload Spin Assembly (PSA) for installation onto the Martin Marietta Titan 3 Commercial launch vehicle. This assembly provides launch support for satellite payloads up to 5783 kilograms (6.37 tons) and controls release, spin-up, and final separation of the satellite from the second stage. Once separated, the satellite's Perigee Kick Motor (PKM) boosts the satellite into its transfer orbit. The first successful flight occurred December 31, 1989. Requirements, design, test, and problems associated with this mechanical assembly are discussed.

  3. Observation of nuclear reactors on satellites with a balloon-borne gamma-ray telescope

    NASA Technical Reports Server (NTRS)

    O'Neill, Terrence J.; Kerrick, Alan D.; Ait-Ouamer, Farid; Tumer, O. Tumay; Zych, Allen D.

    1989-01-01

    Four Soviet nuclear-powered satellites flying over a double Compton gamma-ray telescope resulted in the detection of gamma rays with 0.3-8.0 MeV energies on April 15, 1988, as the balloonborne telescope searched, from a 35-km altitude, for celestial gamma-ray sources. The satellites included Cosmos 1900 and 1932. The USSR is the only nation currently employing moderated nuclear reactors for satellite power; reactors in space may cause significant problems for gamma-ray astronomy by increasing backgrounds, especially in the case of gamma-ray bursts.

  4. Network reconstruction via graph blending

    NASA Astrophysics Data System (ADS)

    Estrada, Rolando

    2016-05-01

    Graphs estimated from empirical data are often noisy and incomplete due to the difficulty of faithfully observing all the components (nodes and edges) of the true graph. This problem is particularly acute for large networks where the number of components may far exceed available surveillance capabilities. Errors in the observed graph can render subsequent analyses invalid, so it is vital to develop robust methods that can minimize these observational errors. Errors in the observed graph may include missing and spurious components, as well as fused (multiple nodes are merged into one) and split (a single node is misinterpreted as many) nodes. Traditional graph reconstruction methods are only able to identify missing or spurious components (primarily edges, and to a lesser degree nodes), so we developed a novel graph blending framework that allows us to cast the full estimation problem as a simple edge addition/deletion problem. Armed with this framework, we systematically investigate the viability of various topological graph features, such as the degree distribution or the clustering coefficients, and existing graph reconstruction methods for tackling the full estimation problem. Our experimental results suggest that incorporating any topological feature as a source of information actually hinders reconstruction accuracy. We provide a theoretical analysis of this phenomenon and suggest several avenues for improving this estimation problem.

  5. On the Tesseral-Harmonics Resonance Problem in Artificial-Satellite Theory, Part 2

    NASA Technical Reports Server (NTRS)

    Romanowicz, B. A.

    1976-01-01

    Equations were derived for the perturbations on an artificial satellite when the motion of the satellite is commensurable with that of the earth. This was done by first selecting the tesseral harmonics that contribute the most to the perturbations and then applying Hori's method by use of Lie series. Here, some modifications to the perturbations are introduced, which result in better agreement with numerical integration.

  6. Development of a Remotely Operated Autonomous Satellite Tracking System

    DTIC Science & Technology

    2010-03-01

    ability of Commercial-Off-The-Shelf (COTS) optical observation equipment to track and image Low Earth Orbiting (LEO) satellites. Using radar data in...SOR operates one of the world's premier adaptive-optics telescopes capable of tracking low-earth orbiting satellites. The telescope has a 3.5-meter...student) published his thesis Initial Determination of Low Earth Orbits Using Commercial Telescopes. According to this document's Problem Statement

  7. Heuristic approach to Satellite Range Scheduling with Bounds using Lagrangian Relaxation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Arguello, Bryan; Nozick, Linda Karen

    This paper focuses on scheduling antennas to track satellites using a heuristic method. In order to validate the performance of the heuristic, bounds are developed using Lagrangian relaxation. The performance of the algorithm is established using several illustrative problems.
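    The validation pattern in the abstract — a fast heuristic bracketed from above by a Lagrangian-relaxation bound — can be sketched on a toy assignment version of range scheduling. The instance, multipliers, and step sizes below are invented; for a maximization problem, the relaxed objective L(λ) is an upper bound on the optimum for any λ ≥ 0, so the gap to the heuristic certifies its quality.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    v = rng.random((8, 3)) * 10        # value of tracking request r on antenna a (toy data)

    def greedy(v):
        """Heuristic: repeatedly take the highest-value feasible (request, antenna) pair."""
        total = 0.0
        free_r, free_a = set(range(v.shape[0])), set(range(v.shape[1]))
        for val, r, a in sorted(((val, r, a) for (r, a), val in np.ndenumerate(v)),
                                reverse=True):
            if r in free_r and a in free_a:
                total += val
                free_r.discard(r)
                free_a.discard(a)
        return total

    def lagrangian_bound(v, iters=200, step=0.05):
        """Relax the one-request-per-antenna constraints with multipliers lam >= 0:
        L(lam) = sum_r max(0, max_a (v_ra - lam_a)) + sum_a lam_a >= OPT,
        tightened by subgradient steps on lam."""
        lam = np.zeros(v.shape[1])
        best = np.inf
        for _ in range(iters):
            red = v - lam                                   # reduced values
            pick = (red == red.max(axis=1, keepdims=True)) & (red > 0)
            L = np.maximum(red.max(axis=1), 0).sum() + lam.sum()
            best = min(best, L)
            g = 1 - pick.sum(axis=0)                        # subgradient w.r.t. lam
            lam = np.maximum(0.0, lam - step * g)
        return best

    h = greedy(v)
    ub = lagrangian_bound(v)                                # certified: h <= OPT <= ub
    ```

    The real scheduling problem also has time windows and visibility constraints, but the bounding logic is the same: any feasible multiplier vector yields a valid certificate.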

  8. Ionospheric range-rate effects in satellite-to-satellite tracking

    NASA Technical Reports Server (NTRS)

    Lipofsky, J. R.; Bent, R. B.; Llewellyn, S. K.; Schmid, P. E.

    1977-01-01

    Ionospheric range and range-rate corrections in satellite-to-satellite tracking were investigated. The major problems were cited, and the magnitude of the errors that must be considered for communications between satellites and related experiments was defined. The results point to the need for a sophisticated modeling approach incorporating daily solar data and, where possible, actual ionospheric measurements as update information, since a simple median model cannot account for the complex interaction of the many variables. The findings provide a basis from which the residual errors can be estimated after ionospheric modeling is incorporated in the reduction. Simulations were performed for satellites at various heights (Apollo, Geos, and Nimbus tracked by ATS-6) and in two different geometric configurations: coplanar and perpendicular orbits.

  9. Research on anti - interference based on GNSS

    NASA Astrophysics Data System (ADS)

    Yu, Huanran; Liu, Yijun

    2017-05-01

    Satellite navigation systems are widely used in military and civil fields. They operate in all weather, provide continuous, real-time service, and supply precise position, velocity, and timing information to users. The environments in which satellite navigation receivers work have become more and more complex, and the satellite signals are susceptible to intentional or unintentional interference, so anti-jamming capability has become key to a satellite navigation receiver's ability to work normally. In this paper, we study a DOA estimation algorithm based on a linear symmetric matrix to improve the anti-jamming capability of the satellite navigation receiver; this is of great significance for improving the performance of satellite navigation systems in complex electromagnetic environments and enhancing their applicability in various settings.
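    The abstract does not detail its algorithm, but the role of DOA estimation in anti-jamming (find the jammer directions, then null them) can be illustrated with the classic MUSIC method on a uniform linear array. Everything below — array size, jammer directions, noise level — is invented for the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    M, d, snaps = 8, 0.5, 200                  # sensors, spacing (wavelengths), snapshots
    true_deg = [-20.0, 35.0]                   # assumed jammer directions

    def steering(theta_deg):
        """Array response of a uniform linear array toward angle theta."""
        return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.radians(theta_deg)))

    # Two narrowband jammers plus white noise at the array.
    A = np.column_stack([steering(t) for t in true_deg])
    S = rng.standard_normal((2, snaps)) + 1j * rng.standard_normal((2, snaps))
    N = 0.1 * (rng.standard_normal((M, snaps)) + 1j * rng.standard_normal((M, snaps)))
    X = A @ S + N

    R = X @ X.conj().T / snaps                 # sample covariance
    w, V = np.linalg.eigh(R)                   # eigenvalues ascending
    En = V[:, :M - 2]                          # noise subspace (2 sources assumed known)

    grid = np.arange(-90.0, 90.0, 0.5)
    p = np.array([1 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid])

    # Pick the two strongest local maxima of the MUSIC spectrum.
    idx = [i for i in range(1, len(p) - 1) if p[i] > p[i - 1] and p[i] > p[i + 1]]
    idx.sort(key=lambda i: -p[i])
    est = sorted(grid[i] for i in idx[:2])     # estimated jammer directions (deg)
    ```

    Once the interference directions are known, an adaptive beamformer can place pattern nulls on them while keeping gain toward the navigation satellites.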

  10. From Addition to Multiplication ... and Back: The Development of Students' Additive and Multiplicative Reasoning Skills

    ERIC Educational Resources Information Center

    Van Dooren, Wim; De Bock, Dirk; Verschaffel, Lieven

    2010-01-01

    This study builds on two lines of research that have so far developed largely separately: the use of additive methods to solve proportional word problems and the use of proportional methods to solve additive word problems. We investigated the development with age of both kinds of erroneous solution methods. We gave a test containing missing-value…

  11. Pairing Nurses and Social Workers in Schools: North Carolina's School-Based Child and Family Support Teams

    ERIC Educational Resources Information Center

    Gifford, Elizabeth J.; Wells, Rebecca; Bai, Yu; Troop, Tony O.; Miller, Shari; Babinski, Leslie M.

    2010-01-01

    When children are struggling in school, underlying causes often include physical or behavioral health problems, poverty, abuse, and/or neglect. Children's poor physical health status has been linked to deficits in memory and reading ability. Children with behavioral problems are much more likely than others to have lower grades, miss school, be…

  12. An Educational Multimedia Presentation on the Introduction to Spacecraft Charging

    NASA Technical Reports Server (NTRS)

    Lin, E.; dePayrebrune, M.

    2004-01-01

    Over the last few decades, significant knowledge has been gained in how to protect spacecraft from charging; however, the continuing technical advancement in the design and build of satellites requires on-going effort in the study of spacecraft charging. A situation that we have encountered is that not all satellite designers and builders are familiar with the problem of spacecraft charging. The design of a satellite involves many talented people with diverse backgrounds, ranging from manufacturing and assembly to engineering and program management. The complex design and build of a satellite system requires people with highly specialized skills such that cross-specialization is often not achievable. As a result, designers and builders of satellites are not usually familiar with problems outside their specialization. This is also true for spacecraft charging. Not everyone is familiar with the definition of spacecraft charging and the damage that spacecraft charging can cause. Understanding the problem is an important first step in getting everyone involved in addressing the appropriate spacecraft charging issues during the satellite design and build phases. To address this important first step, an educational multimedia presentation has been created to inform the general engineering community about the basics of spacecraft charging. The content of this educational presentation is based on relevant published technical papers. The presentation was developed using Macromedia Flash. This software produces a more dynamic learning environment than a typical slide show, resulting in a more effective learning experience. The end result is that the viewer will have learned the basics of spacecraft charging. This presentation is available to the public through our website, www.dplscience.com, free of charge. Viewers are encouraged to pass this presentation to colleagues within their own work environment. 
This paper describes the content of the multimedia presentation.

  13. Downscaling of Remotely Sensed Land Surface Temperature with multi-sensor based products

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Baik, J.; Choi, M.

    2016-12-01

    Remotely sensed satellite data provide a bird's-eye view, which allows us to understand the spatiotemporal behavior of hydrologic variables at global scale. In particular, geostationary satellites continuously observing specific regions are useful for monitoring fluctuations of hydrologic variables as well as meteorological factors. However, there are still problems regarding spatial resolution, namely whether fine-scale land cover can be represented at the spatial resolution of the satellite sensor, especially in areas of complex topography. To solve these problems, many researchers have tried to establish relationships among various hydrological factors and to combine images from multiple sensors to downscale land surface products. One geostationary satellite, the Communication, Ocean and Meteorological Satellite (COMS), carries a Meteorological Imager (MI) and a Geostationary Ocean Color Imager (GOCI). The MI, performing the meteorological mission, produces Rainfall Intensity (RI), Land Surface Temperature (LST), and many other products every 15 minutes. Even though it has high temporal resolution, the low spatial resolution of MI data is treated as a major research problem in many studies. This study suggests a methodology to downscale the 4 km LST datasets derived from MI to a finer resolution (500 m) by using GOCI datasets over Northeast Asia. The Normalized Difference Vegetation Index (NDVI), recognized as a variable with a significant relationship to LST, is chosen to estimate LST at the finer resolution. Pixels of NDVI and LST are separated according to land cover, provided by the MODerate resolution Imaging Spectroradiometer (MODIS), to achieve a more accurate relationship. The downscaled LST is compared with LST observed by the Automated Synoptic Observing System (ASOS) to assess its accuracy. The downscaled LST results of this study, coupled with the advantages of geostationary satellites, can be applied to observe hydrologic processes efficiently.
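    The NDVI-based downscaling concept can be sketched with a TsHARP-style scheme on synthetic grids: fit LST against NDVI at the coarse scale, apply the fit at the fine scale, and add back the coarse residual so coarse-pixel means are preserved. All grid sizes, coefficients, and noise levels are invented; this is a stand-in for the paper's method, not a reproduction of it.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic fine-scale NDVI (think 500 m pixels) on a 32x32 grid; LST is
    # assumed to fall linearly with NDVI plus noise (coefficients invented).
    ndvi_fine = np.clip(rng.normal(0.5, 0.15, (32, 32)), 0, 1)
    lst_fine_true = 320.0 - 20.0 * ndvi_fine + rng.normal(0, 0.5, (32, 32))

    # Aggregate 8x8 blocks to mimic the coarse (think 4 km) MI pixels.
    coarse = lambda a: a.reshape(4, 8, 4, 8).mean(axis=(1, 3))
    ndvi_c, lst_c = coarse(ndvi_fine), coarse(lst_fine_true)

    # Fit LST ~ NDVI at coarse scale, apply at fine scale, and add back the
    # coarse residual so each coarse pixel's mean is preserved.
    b1, b0 = np.polyfit(ndvi_c.ravel(), lst_c.ravel(), 1)
    resid_c = lst_c - (b0 + b1 * ndvi_c)
    lst_down = b0 + b1 * ndvi_fine + np.repeat(np.repeat(resid_c, 8, 0), 8, 1)

    rmse = np.sqrt(np.mean((lst_down - lst_fine_true) ** 2))   # vs. the "truth"
    ```

    Stratifying the regression by land-cover class, as the abstract describes, amounts to fitting separate (b0, b1) pairs per class before recombining.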

  14. A statistical inference approach for the retrieval of the atmospheric ozone profile from simulated satellite measurements of solar backscattered ultraviolet radiation

    NASA Technical Reports Server (NTRS)

    Bonavito, N. L.; Gordon, C. L.; Inguva, R.; Serafino, G. N.; Barnes, R. A.

    1994-01-01

    NASA's Mission to Planet Earth (MTPE) will address important interdisciplinary and environmental issues such as global warming, ozone depletion, deforestation, acid rain, and the like with its long term satellite observations of the Earth and with its comprehensive Data and Information System. Extensive sets of satellite observations supporting MTPE will be provided by the Earth Observing System (EOS), while more specific process related observations will be provided by smaller Earth Probes. MTPE will use data from ground and airborne scientific investigations to supplement and validate the global observations obtained from satellite imagery, while the EOS satellites will support interdisciplinary research and model development. This is important for understanding the processes that control the global environment and for improving the prediction of events. In this paper we illustrate the potential for powerful artificial intelligence (AI) techniques when used in the analysis of the formidable problems that exist in the NASA Earth Science programs and of those to be encountered in the future MTPE and EOS programs. These techniques, based on the logical and probabilistic reasoning aspects of plausible inference, strongly emphasize the synergetic relation between data and information. As such, they are ideally suited for the analysis of the massive data streams to be provided by both MTPE and EOS. To demonstrate this, we address both the satellite imagery and model enhancement issues for the problem of ozone profile retrieval through a method based on plausible scientific inferencing. Since in the retrieval problem, the atmospheric ozone profile that is consistent with a given set of measured radiances may not be unique, an optimum statistical method is used to estimate a 'best' profile solution from the radiances and from additional a priori information.

  15. A Very Large Area Network (VLAN) knowledge-base applied to space communication problems

    NASA Technical Reports Server (NTRS)

    Zander, Carol S.

    1988-01-01

    This paper first describes a hierarchical model for very large area networks (VLAN). Space communication problems whose solution could profit from the model are discussed, and then an enhanced version of this model incorporating the knowledge needed for the missile detection-destruction problem is presented. A satellite network, or VLAN, is a network which includes at least one satellite. Due to the complexity, a compromise between fully centralized and fully distributed network management has been adopted. Network nodes are assigned to a physically localized group, called a partition. Partitions consist of groups of cell nodes with one cell node acting as the organizer or master, called the Group Master (GM). Coordinating the group masters is a Partition Master (PM). Knowledge is also distributed hierarchically, existing in at least two nodes. Each satellite node has a back-up earth node. Knowledge must be distributed in such a way as to minimize information loss when a node fails. Thus the model is hierarchical both physically and informationally.

  16. Book Review: Book review

    NASA Astrophysics Data System (ADS)

    Clevers, Jan G. P. W.

    2016-09-01

    For many years a good introductory book for undergraduate and postgraduate students on remote sensing of the Earth's land surface, one not starting with an emphasis on traditional photographic techniques, was missing. In 2010 the first edition of the book Fundamentals of Satellite Remote Sensing by Emilio Chuvieco and Alfredo Huete was published by CRC Press, filling this gap. Now the second edition, by Emilio Chuvieco, has been published by CRC Press. This second edition is made more attractive by the use of colour illustrations instead of the black-and-white ones in the first edition.

  17. Stennis supports LCROSS sessions

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Stennis Space Center personnel supported a pair of events marking NASA's Lunar CRater Observation and Sensing Satellite (LCROSS) mission Oct. 9. Stennis participated in daylong activities at the Russell C. Davis Planetarium in Jackson, Miss., and the Kenner (La.) Planetarium Megadome Cinema, providing exhibits, videos and educational activities for students at both sites. The LCROSS mission involved crashing two objects into the moon in order to search for evidence of lunar water ice, an important resource for future sustainable exploration. Pictured at the Kenner planetarium are students from Emily C. Watkins and St. John the Baptist Parish schools at Kenner.

  18. Satellite Contributions to Global Change Studies

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L.

    2009-01-01

    By providing a global view with a level playing field (no region missed because of unfavorable surface conditions or political boundaries), satellites have made major contributions to improved monitoring and understanding of our constantly changing planet. The global view has allowed surprising realizations like the relative sparsity of lightning strikes over oceans and the large-scale undulations on the massive Antarctic ice sheet. It has allowed the tracking of all sorts of phenomena, including aerosols, both natural and anthropogenic, as they move with the atmospheric circulation and impact weather and human health. But probably nothing that the global view allows is more important in the long term than its provision of unbiased data sets to address the issue of global change, considered by many to be among the most important issues facing humankind today. With satellites we can monitor atmospheric temperatures at all latitudes and longitudes, and obtain a global average that lessens the likelihood of becoming endlessly mired in the confusions brought about by the certainty of regional differences. With satellites we can monitor greenhouse gases such as CO2 not just above individual research stations but around the globe. With satellites we can monitor the polar sea ice covers, as we have done since the late 1970s, determining and quantifying the significant reduction in Arctic sea ice and the slight growth in Antarctic sea ice over that period. With satellites we can map the full extent of, and changes in, the Antarctic stratospheric ozone depletions that were first identified using a single ground station; and through satellite data we have witnessed from afar land surface changes brought about by humans both intentionally, as with wide-scale deforestation, and unintentionally, as with the decay of the Aral Sea. 
The satellite data are far from sufficient for all that we need in order to understand the global system and forecast its changes, as we also need sophisticated climate models, in situ process studies, and data sets that extend back well before the introduction of satellite technology. Nonetheless, the repetitive, global view provided by satellites is contributing in a major way to our improved recognition of how the Earth is changing, a recognition that is none too soon in view of the magnitude of the impacts that humans can now have.

  19. University Satellite Consortium and Space Education in Japan Centered on Micro-Nano Satellites

    NASA Astrophysics Data System (ADS)

    Nakasuka, S.; Kawashima, R.

    2002-01-01

    This paper discusses space education in Japan, especially that centered on micro- or nano-class satellites. Hands-on training using micro-nano satellites provides a unique opportunity for space education at the university level, giving students a chance to experience the whole space project cycle: mission creation, satellite design, fabrication, test, launch, and operation through to analysis of the results. Project management and teamwork are other important skills that can be trained in these projects. The merits of micro-nano satellites include 1) low cost, which allows a single university laboratory to carry out a project; 2) a short development period, such as one or two years, which enables students to obtain the results of their projects before they graduate; and 3) small size and weight, which enable fabrication and testing within the usually very limited space of university laboratories. In Japan, several projects such as CanSat, CubeSat, and the Whale Observation Satellite have been carried out, proving that micro-nano satellites provide a unique and valuable educational opportunity. A university satellite consortium has been formed with the objective of building a community of university students and staff around these micro-nano satellite activities in Japan. The consortium aims at many activities, including facilitating the exchange of information and skills and collaboration between member universities, helping students use the ground test facilities of national laboratories, consulting on policy and legal matters, coordinating joint development of equipment and projects, and bridging between these university activities and the needs and interests of the general public. This kind of outreach activity is essential, because new micro-nano satellite missions must be created if this field is to grow beyond a merely educational enterprise. 
The final objective of the consortium is to build a large community of users, mission creators, investors, and manufacturers (i.e., university students) of micro-nano satellites, and thereby make a unique contribution to the invigoration of space development. Several practical problems remain in these activities, including how to acquire frequency permission, how to obtain launch opportunities and financial support, and how to operate the launched satellites using inexpensive ground stations. In particular, the frequency problem should be solved as soon as possible, because many universities around the world are planning similar projects and the frequencies in the amateur band are already very congested. One idea is that universities should form a worldwide "university satellite community" and collaboratively ask the ITU for a kind of "educational frequency", sharing the obtained frequency within the community under the community's own management. Such a community would also be useful for collaborative satellite operation, because the universities that have ground stations are spread over the world. I hope the IAC meeting will provide a good opportunity for discussing these problems and facilitating the construction of a worldwide university community to tackle them.

  20. Determination of motion extrema in multi-satellite systems

    NASA Astrophysics Data System (ADS)

    Allgeier, Shawn E.

    Spacecraft, or satellite, formation flight has been a topic of interest dating back to the Gemini program of the 1960s. Traditionally space missions have been designed around large monolithic assets. Recent interest in low-cost, rapid call-up mission architectures structured around fractionated systems, small satellites, and constellations has spurred renewed efforts in spacecraft relative motion problems. While such fractionated, or multi-body, systems may provide benefits in terms of risk mitigation and cost savings, they introduce new technical challenges in terms of satellite coordination. Characterization of satellite formations is a vital requirement for them to have utility to industry and government entities. Satellite formations introduce challenges in the form of constellation maintenance, inter-satellite communications, and the demand for more sophisticated guidance, navigation, and control systems. At the core of these challenges are the orbital mechanics that govern the resulting motion. New applications of algebraic techniques, specifically Gröbner basis tools, are applied to the formation flight problem as a means of determining extrema of certain quantities pertaining to formation flight. Specifically, bounds are calculated for the relative position components, relative speed, relative velocity components, and range rate. The position-based metrics are relevant for planning formation geometry, particularly in constellation or Earth observation applications. The velocity metrics are relevant in the design of end-game interactions for rendezvous and proximity operations. The range rate of one satellite to another is essential in the design of radio frequency hardware for inter-satellite communications so that the Doppler shift can be calculated a priori. Range rate may also have utility in space-based surveillance and space situational awareness concerns, such as cross tagging. 
The results presented constitute a geometric perspective and have utility to mission designers, particularly for missions involving rendezvous and proximity operations.
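The range-rate/Doppler-shift relation mentioned above can be sketched numerically. This is a minimal illustration assuming a narrowband link and the first-order (non-relativistic) Doppler formula; the 2.2 GHz carrier and 50 m/s range rate are made-up example values, not figures from the dissertation.

```python
# First-order Doppler shift of an inter-satellite link from relative range
# rate, the quantity the abstract says must be known a priori to size RF
# hardware for inter-satellite communications.
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(carrier_hz: float, range_rate_ms: float) -> float:
    """First-order Doppler shift in Hz; a positive range rate (satellites
    separating) lowers the received frequency."""
    return -carrier_hz * range_rate_ms / C

# Example: S-band crosslink at 2.2 GHz with a worst-case range rate of +50 m/s.
shift = doppler_shift(2.2e9, 50.0)
print(round(shift, 1))  # -366.9 Hz
```

Bounding the range rate over the formation geometry (the extrema computed in the dissertation) then bounds the Doppler window the receiver must track.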

  1. Progress developing the JAXA next generation satellite data repository (G-Portal).

    NASA Astrophysics Data System (ADS)

    Ikehata, Y.

    2016-12-01

    JAXA has been operating "G-Portal" as a repository for searching and accessing data from JAXA-related Earth observation satellites since February 2013. G-Portal handles data from ten satellites: GPM, TRMM, Aqua, ADEOS-II, ALOS (search only), ALOS-2 (search only), MOS-1, MOS-1b, ERS-1, and JERS-1, and plans to import data from the future satellites GCOM-C and EarthCARE. Except for ALOS and ALOS-2, all of these data are open and free. G-Portal supports web search, catalogue search (CSW and OpenSearch), and direct download by SFTP. However, G-Portal has some problems with performance and usability. Regarding performance, for example, G-Portal is based on a 10 Gbps network and uses a scale-out architecture. (The conceptual design was reported at the AGU Fall Meeting 2015; IN23D-1748.) To address these problems, JAXA has been developing the next-generation repository since February 2016. This paper describes improvements to the usability problems and challenges for the next-generation system, including the following points. The current web interface uses a "step by step" design, and URLs are generated randomly; users must therefore navigate the web pages and click many times to reach the desired satellite data. The web design will therefore be changed completely from "step by step" to a single page, and URLs will be based on REST (REpresentational State Transfer). Regarding direct download, the current method (SFTP) is hard to use because of its non-standard port assignment and key-based authentication, so the FTP protocol will also be supported. Additionally, the next G-Portal will improve the catalogue service. Currently, catalogue search is available only to limited users, including NASA, ESA, and CEOS, due to performance and reliability issues, but this limitation will be removed. Furthermore, a catalogue search client function will be implemented to take in other agencies' satellite catalogues, so that users will be able to search satellite data across agencies.
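The benefit of REST-style URLs described above, deterministic addresses that can be bookmarked or scripted instead of randomly generated ones, can be sketched as follows. The host, path, and parameter names here are assumptions for illustration, not the actual G-Portal API.

```python
# Hypothetical sketch of a REST-style, single-page search URL; every piece of
# the query lives in the URL itself, so the same URL always reproduces the
# same search (unlike a randomly generated step-by-step session URL).
from urllib.parse import urlencode

def build_search_url(satellite: str, start: str, end: str, bbox: tuple) -> str:
    params = {
        "satellite": satellite,
        "startTime": start,
        "endTime": end,
        "bbox": ",".join(str(v) for v in bbox),  # lon_min,lat_min,lon_max,lat_max
    }
    return "https://gportal.example.jaxa.jp/search?" + urlencode(params)

url = build_search_url("GCOM-C", "2018-01-01", "2018-01-31", (130, 30, 140, 40))
print(url)
```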

  2. Helping a loved one with a drinking problem

    MedlinePlus

    ... school because of alcohol use Have trouble with relationships because of drinking Miss important work, school, or ... Call Your Doctor If you feel that your relationship with this person is becoming dangerous or is ...

  3. Streaming PCA with many missing entries.

    DOT National Transportation Integrated Search

    2015-12-01

    This paper considers the problem of matrix completion when some number of the columns are completely and arbitrarily corrupted, potentially by a malicious adversary. It is well-known that standard algorithms for matrix completion can return arbit...
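The matrix-completion setting in this abstract can be illustrated with the standard iterative-SVD baseline, the kind of "standard algorithm" whose failure under corrupted columns motivates robust variants. This sketch assumes an exactly low-rank matrix with entries missing at random; it is not the paper's method.

```python
# Iterative SVD imputation: alternate between projecting onto low-rank
# matrices and re-imposing the observed entries.
import numpy as np

def complete_matrix(X, mask, rank=2, n_iter=500):
    """Fill missing entries (mask == False) of X; observed entries stay fixed."""
    filled = np.where(mask, X, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # truncated SVD
        filled = np.where(mask, X, low_rank)              # keep observed entries
    return filled

rng = np.random.default_rng(0)
truth = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 15))  # exactly rank 2
mask = rng.random(truth.shape) > 0.3                         # ~30% missing
est = complete_matrix(truth, mask)
print(float(np.linalg.norm(est - truth) / np.linalg.norm(truth)))
```

With clean, randomly missing data the relative error drives toward zero; a single adversarial column can instead drag the low-rank fit arbitrarily far off, which is the failure mode the paper addresses.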

  4. Orbit-spectrum sharing between the fixed-satellite and broadcasting-satellite services with applications to 12 GHz domestic systems

    NASA Technical Reports Server (NTRS)

    Reinhart, E. E.

    1974-01-01

    A systematic, tutorial analysis of the general problem of orbit-spectrum sharing among inhomogeneous satellite systems is presented. Emphasis is placed on extrapolating and applying the available data on rain attenuation and on reconciling differences in the results of various measurements of the subjective effects of interference on television picture quality. An analytic method is presented for determining the approximate values of the intersatellite spacings required to keep mutual interference levels within prescribed limits when many dissimilar satellites share the orbit. A computer model was developed for assessing the interference compatibility of arbitrary configurations of large numbers of geostationary satellite systems. It is concluded that the band beginning at 11.7 GHz can be shared effectively by broadcasting-satellite and fixed-satellite systems. Recommendations for future study are included.

  5. User needs for propagation data

    NASA Technical Reports Server (NTRS)

    Sullivan, Thomas M.

    1993-01-01

    New and refined models of radio signal propagation phenomena are needed to support studies of evolving satellite services and systems. Taking an engineering perspective, applications for propagation measurements and models in the context of various types of analyses that are of ongoing interest are reviewed. Problems that were encountered in the signal propagation aspects of these analyses are reviewed, and potential solutions to these problems are discussed. The focus is on propagation measurements and models needed to support design and performance analyses of systems in the Mobile-Satellite Service (MSS) operating in the 1-3 GHz range. These systems may use geostationary or non-geostationary satellites and Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), or Code Division Multiple Access (CDMA) techniques. Many of the propagation issues raised in relation to MSS are also pertinent to other services such as broadcasting-satellite (sound) at 2310-2360 MHz. In particular, services involving mobile terminals or terminals with low gain antennas are of concern.

  6. Heart Monitoring By Satellite

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The ambulance antenna shown is a specially designed system that allows satellite-relayed two-way communications between a moving emergency vehicle and a hospital emergency room. It is a key component of a demonstration program aimed at showing how emergency medical service can be provided to people in remote rural areas. Satellite communication permits immediate, hospital-guided treatment of heart attacks or other emergencies by ambulance personnel, saving vital time when the scene of the emergency is remote from the hospital. If widely adopted, the system could save tens of thousands of lives annually in the U.S. alone, medical experts say. The problem in conventional communication with rural areas is the fact that radio signals travel in line of sight. They may be blocked by tall buildings, hills and mountains, or even by the curvature of the Earth, so signal range is sharply limited. Microwave relay towers could solve the problem, but a complete network of repeater towers would be extremely expensive. The satellite provides an obstruction-free relay station in space.

  7. Applications systems verification and transfer project. Volume 5: Operational applications of satellite snow-cover observations, northwest United States

    NASA Technical Reports Server (NTRS)

    Dillard, J. P.

    1981-01-01

    The study objective was to develop or modify methods in an operational framework that would allow incorporation of satellite-derived snow cover observations for prediction of snowmelt-derived runoff. Data were reviewed and verified for five basins in the Pacific Northwest. The data were analyzed for up to a 6-year period ending July 1978, and in all cases cover a low, average, and high snow cover/runoff year. Cloud cover is a major problem in these springtime runoff analyses and has hampered data collection for periods of up to 52 days. Tree cover and terrain are sufficiently dense and rugged to have caused problems. The interpretation of snowlines from satellite data was compared with conventional ground truth data and tested in operational streamflow forecasting models. When the satellite snow-covered area (SCA) data are incorporated in the SSARR (Streamflow Synthesis and Reservoir Regulation) model, there is a definite but minor improvement.

  8. Geometric model of pseudo-distance measurement in satellite location systems

    NASA Astrophysics Data System (ADS)

    Panchuk, K. L.; Lyashkov, A. A.; Lyubchinov, E. V.

    2018-04-01

    The existing mathematical model of pseudo-distance measurement in satellite location systems does not provide a precise solution of the problem, but rather an approximate one. This inaccuracy, together with bias in the measurement of distance from satellite to receiver, results in errors of several meters, which makes refinement of the current mathematical model clearly relevant. The solution of the system of quadratic equations used in the current mathematical model is based on linearization. The objective of the paper is refinement of the current mathematical model and derivation of an analytical solution of the system of equations on its basis. To attain this objective, a geometric analysis is performed and a geometric interpretation of the equations is given. As a result, an equivalent system of equations, which admits an analytical solution, is derived. An example of implementing the analytical solution is presented. Applying the analytical solution algorithm to the problem of pseudo-distance measurement in satellite location systems makes it possible to improve the accuracy of such measurements.
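The linearized approach that the paper sets out to replace can be sketched concretely: each pseudo-distance equation rho_i = ||sat_i - x|| + b is quadratic in the unknown receiver position x and clock bias b, and the standard solution iterates a linearization (Gauss-Newton). The satellite positions and receiver location below are made-up illustrative numbers.

```python
# Standard linearized pseudorange solution: iterate on the Jacobian of the
# range model until the position/clock estimate converges.
import numpy as np

def solve_pseudorange(sats, rho, n_iter=10):
    """Solve rho_i = ||sat_i - x|| + b for receiver position x (m) and clock
    bias b (expressed in meters) by iterative linearization."""
    x = np.zeros(4)
    for _ in range(n_iter):
        d = np.linalg.norm(sats - x[:3], axis=1)           # geometric ranges
        residual = rho - (d + x[3])
        # Jacobian rows: negated unit line-of-sight vectors, plus 1 for clock
        H = np.hstack([-(sats - x[:3]) / d[:, None], np.ones((len(sats), 1))])
        x += np.linalg.lstsq(H, residual, rcond=None)[0]
    return x

sats = np.array([[15e6, 0, 21e6], [-10e6, 12e6, 20e6],
                 [5e6, -14e6, 19e6], [0, 8e6, 24e6]], float)
truth = np.array([1.2e6, -0.8e6, 6.37e6, 300.0])  # receiver position + clock bias
rho = np.linalg.norm(sats - truth[:3], axis=1) + truth[3]
est = solve_pseudorange(sats, rho)
print(np.allclose(est, truth, atol=1e-2))
```

The paper's contribution is to replace this iterative linearization with an equivalent system that admits a closed-form, analytical solution.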

  9. MVIAeval: a web tool for comprehensively evaluating the performance of a new missing value imputation algorithm.

    PubMed

    Wu, Wei-Sheng; Jhou, Meng-Jhun

    2017-01-13

    Missing value imputation is important for microarray data analyses because microarray data with missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms. Also the performance of a new algorithm can be evaluated by our performance comparison framework. However, constructing our framework is not an easy task for the interested researchers. To save researchers' time and efforts, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices from three existing ones, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of their newly developed missing value imputation algorithm for microarray data or any data which can be represented as a matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite the progress in the research of missing value imputation algorithms.
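The core of such a performance-comparison framework, hide known values, impute them, and score the result, can be sketched in a few lines. The two baseline imputers and the NRMSE index here are generic stand-ins, not MVIAeval's actual 12 algorithms or 3 performance indices.

```python
# Simulate missingness on complete data, impute, and score by normalized RMSE
# on the entries that were hidden.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(loc=5.0, scale=2.0, size=(50, 8))  # complete "expression" matrix
miss = rng.random(truth.shape) < 0.1                  # simulate ~10% missing entries
data = np.where(miss, np.nan, truth)

def impute_overall_mean(X):
    return np.where(np.isnan(X), np.nanmean(X), X)

def impute_column_mean(X):
    return np.where(np.isnan(X), np.nanmean(X, axis=0), X)

def nrmse(imputed):
    err = imputed[miss] - truth[miss]
    return float(np.sqrt(np.mean(err ** 2)) / np.std(truth[miss]))

for name, fn in [("overall mean", impute_overall_mean),
                 ("column mean", impute_column_mean)]:
    print(name, round(nrmse(fn(data)), 3))
```

Repeating this over many simulation runs and datasets, as MVIAeval does, averages out the randomness of which entries were hidden.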

  10. Evaluation of methods to estimate missing days' supply within pharmacy data of the Clinical Practice Research Datalink (CPRD) and The Health Improvement Network (THIN).

    PubMed

    Lum, Kirsten J; Newcomb, Craig W; Roy, Jason A; Carbonari, Dena M; Saine, M Elle; Cardillo, Serena; Bhullar, Harshvinder; Gallagher, Arlene M; Lo Re, Vincent

    2017-01-01

    The extent to which days' supply data are missing in pharmacoepidemiologic databases and effective methods for estimation is unknown. We determined the percentage of missing days' supply on prescription and patient levels for oral anti-diabetic drugs (OADs) and evaluated three methods for estimating days' supply within the Clinical Practice Research Datalink (CPRD) and The Health Improvement Network (THIN). We estimated the percentage of OAD prescriptions and patients with missing days' supply in each database from 2009 to 2013. Within a random sample of prescriptions with known days' supply, we measured the accuracy of three methods to estimate missing days' supply by imputing the following: (1) 28 days' supply, (2) mode number of tablets/day by drug strength and number of tablets/prescription, and (3) number of tablets/day via a machine learning algorithm. We determined incidence rates (IRs) of acute myocardial infarction (AMI) using each method to evaluate the impact on ascertainment of exposure time and outcomes. Days' supply was missing for 24 % of OAD prescriptions in CPRD and 33 % in THIN (affecting 48 and 57 % of patients, respectively). Methods 2 and 3 were very accurate in estimating days' supply for OADs prescribed at a consistent number of tablets/day. Method 3 was more accurate for OADs prescribed at varying number of tablets/day. IRs of AMI were similar across methods for most OADs. Missing days' supply is a substantial problem in both databases. Method 2 is easy and very accurate for most OADs and results in IRs comparable to those from method 3.
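Method 2 as described above, imputing via the mode number of tablets/day within each (drug strength, tablets/prescription) stratum, can be sketched directly. The prescription records below are fabricated examples, not CPRD or THIN data.

```python
# Method 2 sketch: learn the modal tablets/day per stratum from prescriptions
# with known days' supply, then impute days' supply = tablets / modal rate.
from collections import Counter

# (strength_mg, tablets_dispensed, tablets_per_day) where days' supply is known
known = [
    (500, 56, 2), (500, 56, 2), (500, 56, 1), (500, 28, 1),
    (850, 84, 3), (850, 84, 3), (850, 84, 2),
]

def mode_tablets_per_day(strength, tablets):
    rates = [tpd for s, t, tpd in known if (s, t) == (strength, tablets)]
    return Counter(rates).most_common(1)[0][0]

def impute_days_supply(strength, tablets):
    return tablets / mode_tablets_per_day(strength, tablets)

print(impute_days_supply(500, 56))  # modal rate is 2/day -> 28.0 days
print(impute_days_supply(850, 84))  # modal rate is 3/day -> 28.0 days
```

This works well exactly where the abstract says it does: drugs prescribed at a consistent number of tablets/day, where the stratum mode is a reliable estimate.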

  11. Addressing the too big to fail problem with baryon physics and sterile neutrino dark matter

    NASA Astrophysics Data System (ADS)

    Lovell, Mark R.; Gonzalez-Perez, Violeta; Bose, Sownak; Boyarsky, Alexey; Cole, Shaun; Frenk, Carlos S.; Ruchayskiy, Oleg

    2017-07-01

    N-body dark matter simulations of structure formation in the Λ cold dark matter (ΛCDM) model predict a population of subhaloes within Galactic haloes that have higher central densities than inferred for the Milky Way satellites, a tension known as the 'too big to fail' problem. Proposed solutions include baryonic effects, a smaller mass for the Milky Way halo and warm dark matter (WDM). We test these possibilities using a semi-analytic model of galaxy formation to generate luminosity functions for Milky Way halo-analogue satellite populations, the results of which are then coupled to the Jiang & van den Bosch model of subhalo stripping to predict the subhalo Vmax functions for the 10 brightest satellites. We find that selecting the brightest satellites (as opposed to the most massive) and modelling the expulsion of gas by supernovae at early times increases the likelihood of generating the observed Milky Way satellite Vmax function. The preferred halo mass is 6 × 1011 M⊙, which has a 14 per cent probability to host a Vmax function like that of the Milky Way satellites. We conclude that the Milky Way satellite Vmax function is compatible with a CDM cosmology, as previously found by Sawala et al. using hydrodynamic simulations. Sterile neutrino-WDM models achieve a higher degree of agreement with the observations, with a maximum 50 per cent chance of generating the observed Milky Way satellite Vmax function. However, more work is required to check that the semi-analytic stripping model is calibrated correctly for each sterile neutrino cosmology.
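The comparison statistic used above, the satellite Vmax function, is simply the cumulative count of satellites whose maximum circular velocity exceeds a threshold. A minimal sketch follows; the velocity values are illustrative, not the paper's data.

```python
# Cumulative Vmax function N(>Vmax): number of satellites with maximum
# circular velocity above each threshold on a grid.
import numpy as np

vmax_kms = np.array([18., 20., 25., 33., 37., 42., 51., 60., 76., 85.])

def cumulative_vmax(vmax, grid):
    return np.array([(vmax > v).sum() for v in grid])

grid = np.array([15., 30., 45., 60.])
print(cumulative_vmax(vmax_kms, grid))  # [10  7  4  2]
```

The 'too big to fail' tension is that simulated subhalo Vmax functions lie systematically above the one observed for the Milky Way satellites at fixed N.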

  12. Refinement of ground reference data with segmented image data

    NASA Technical Reports Server (NTRS)

    Robinson, Jon W.; Tilton, James C.

    1991-01-01

    One of the ways to determine ground reference data (GRD) for satellite remote sensing data is to photo-interpret low-altitude aerial photographs, digitize the cover types on a digitizing tablet, and register them to 7.5-minute U.S.G.S. maps (themselves digitized). The resulting GRD can be registered to the satellite image, or vice versa. Unfortunately, there are many opportunities for error when using a digitizing tablet, and the resolution of the edges in the GRD depends on the spacing of the points selected on the tablet. One consequence is that when the GRD is overlaid on the image, errors and missed detail in the GRD become evident. An approach is discussed for correcting these errors and adding detail to the GRD through a highly interactive, visually oriented process. This process involves overlaid visual displays of the satellite image data, the GRD, and a segmentation of the satellite image data. Several prototype programs were implemented which provide a means of taking a segmented image and using the edges from the reference data to mask out those segment edges that are beyond a certain distance from the reference data edges. Then, using the reference data edges as a guide, the remaining segment edges that are judged not to be image versions of the reference edges are manually marked and removed. The prototype programs that were developed and the algorithmic refinements that facilitate execution of this task are described.

  13. Characterizing and Managing Missing Structured Data in Electronic Health Records: Data Analysis.

    PubMed

    Beaulieu-Jones, Brett K; Lavage, Daniel R; Snyder, John W; Moore, Jason H; Pendergrass, Sarah A; Bauer, Christopher R

    2018-02-23

    Missing data is a challenge for all studies; however, this is especially true for electronic health record (EHR)-based analyses. Failure to appropriately consider missing data can lead to biased results. While there has been extensive theoretical work on imputation, and many sophisticated methods are now available, it remains quite challenging for researchers to implement these methods appropriately. Here, we provide detailed procedures for when and how to conduct imputation of EHR laboratory results. The objective of this study was to demonstrate how the mechanism of missingness can be assessed, evaluate the performance of a variety of imputation methods, and describe some of the most frequent problems that can be encountered. We analyzed clinical laboratory measures from 602,366 patients in the EHR of Geisinger Health System in Pennsylvania, USA. Using these data, we constructed a representative set of complete cases and assessed the performance of 12 different imputation methods for missing data that was simulated based on 4 mechanisms of missingness (missing completely at random, missing not at random, missing at random, and real data modelling). Our results showed that several methods, including variations of Multivariate Imputation by Chained Equations (MICE) and softImpute, consistently imputed missing values with low error; however, only a subset of the MICE methods was suitable for multiple imputation. The analyses we describe provide an outline of considerations for dealing with missing EHR data, steps that researchers can perform to characterize missingness within their own data, and an evaluation of methods that can be applied to impute clinical data. While the performance of methods may vary between datasets, the process we describe can be generalized to the majority of structured data types that exist in EHRs, and all of our methods and code are publicly available. 
©Brett K Beaulieu-Jones, Daniel R Lavage, John W Snyder, Jason H Moore, Sarah A Pendergrass, Christopher R Bauer. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 23.02.2018.
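The missingness mechanisms the study simulates (MCAR, MAR, MNAR) have concretely different consequences, which can be shown on synthetic data: under MCAR the observed mean stays unbiased, while under MNAR, here, higher lab values are more likely to be missing, the observed mean is biased low. The lab-value distribution and missingness model are made-up illustrations, not Geisinger data.

```python
# MCAR vs MNAR on a synthetic lab measure: compare the mean of the observed
# (non-missing) values against the true mean.
import numpy as np

rng = np.random.default_rng(7)
lab = rng.normal(loc=100.0, scale=15.0, size=100_000)

mcar = rng.random(lab.size) < 0.3                                    # value-independent
mnar = rng.random(lab.size) < np.clip((lab - 100) / 60 + 0.3, 0, 1)  # value-dependent

print(round(lab.mean(), 1))         # true mean, ~100
print(round(lab[~mcar].mean(), 1))  # MCAR: observed mean stays ~100
print(round(lab[~mnar].mean(), 1))  # MNAR: observed mean is biased low
```

This is why the abstract stresses assessing the mechanism of missingness before choosing an imputation method: no amount of imputation skill rescues an analysis that treats MNAR data as MCAR.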

  14. Push-Broom-Type Very High-Resolution Satellite Sensor Data Correction Using Combined Wavelet-Fourier and Multiscale Non-Local Means Filtering.

    PubMed

    Kang, Wonseok; Yu, Soohwan; Seo, Doochun; Jeong, Jaeheon; Paik, Joonki

    2015-09-10

    In very high-resolution (VHR) push-broom-type satellite sensor data, both destriping and denoising methods have become chronic problems and attracted major research advances in the remote sensing fields. Since the estimation of the original image from a noisy input is an ill-posed problem, a simple noise removal algorithm cannot preserve the radiometric integrity of satellite data. To solve these problems, we present a novel method to correct VHR data acquired by a push-broom-type sensor by combining wavelet-Fourier and multiscale non-local means (NLM) filters. After the wavelet-Fourier filter separates the stripe noise from the mixed noise in the wavelet low- and selected high-frequency sub-bands, random noise is removed using the multiscale NLM filter in both low- and high-frequency sub-bands without loss of image detail. The performance of the proposed method is compared to various existing methods on a set of push-broom-type sensor data acquired by Korean Multi-Purpose Satellite 3 (KOMPSAT-3) with severe stripe and random noise, and the results of the proposed method show significantly improved enhancement results over existing state-of-the-art methods in terms of both qualitative and quantitative assessments.
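The striping that motivates this work is column-wise: in a push-broom sensor each detector element images one column, so per-detector bias appears as vertical stripes. A toy destriping sketch (not the paper's wavelet-Fourier method) estimates each column's bias from the smoothed column means and subtracts it; the synthetic scene and noise levels are assumptions.

```python
# Toy destriping: stripes are per-column offsets, so the difference between
# raw column means and their smoothed version estimates the stripe pattern.
import numpy as np

rng = np.random.default_rng(3)
scene = np.outer(np.linspace(50, 200, 64), np.ones(64))  # smooth synthetic scene
stripe_bias = rng.normal(scale=5.0, size=64)             # per-column detector bias
noisy = scene + stripe_bias[None, :]

col_means = noisy.mean(axis=0)
kernel = np.ones(9) / 9
# Normalized convolution so the smoothing is correct at the image borders.
weights = np.convolve(np.ones_like(col_means), kernel, mode="same")
smooth = np.convolve(col_means, kernel, mode="same") / weights
stripes = col_means - smooth          # what is left after removing smooth structure
destriped = noisy - stripes[None, :]

print(np.abs(destriped - scene).mean() < np.abs(noisy - scene).mean())
```

The paper's wavelet-Fourier stage does this far more carefully, isolating the stripe energy in selected sub-bands so genuine vertical scene detail is not removed with the stripes.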

  16. Considerations for blending data from various sensors

    USGS Publications Warehouse

    Bauer, Brian P.; Barringer, Anthony R.

    1980-01-01

    A project is being proposed at the EROS Data Center to blend the information from sensors aboard various satellites. The problems of, and considerations for, blending data from several satellite-borne sensors are discussed. System descriptions of the sensors aboard the HCMM, TIROS-N, GOES-D, Landsat 3, Landsat D, Seasat, SPOT, Stereosat, and NOSS satellites, and the quantity, quality, image dimensions, and availability of these data, are summarized to define attributes of a multi-sensor satellite data base. Unique configurations of equipment, storage media, and specialized hardware to meet the data system requirement are described, as well as archival media and improved sensors that will be on-line within the next 5 years. Definitions and the rigor required for blending various sensor data are given. Problems of merging data from the same sensor (intrasensor comparison) and from different sensors (intersensor comparison), the characteristics and advantages of cross-calibration of data, and integration of data into a product matrix field are addressed. Data processing considerations as affected by format, resolution, and problems of merging large data sets, and the organization of data bases for blending data, are presented. Examples utilizing GOES and Landsat data are presented to demonstrate techniques of data blending, and recommendations for future implementation of a set of standard scenes and their characteristics necessary for optimal data blending are discussed.

  17. Space-Based Remote Sensing of Atmospheric Aerosols: The Multi-Angle Spectro-Polarimetric Frontier

    NASA Technical Reports Server (NTRS)

    Kokhanovsky, A. A.; Davis, A. B.; Cairns, B.; Dubovik, O.; Hasekamp, O. P.; Sano, I.; Mukai, S.; Rozanov, V. V.; Litvinov, P.; Lapyonok, T.; hide

    2015-01-01

    A review of optical instrumentation, forward modeling, and inverse-problem solution for polarimetric aerosol remote sensing from space is presented. Special emphasis is given to current airborne and satellite imaging polarimeters, and to modern satellite aerosol retrieval algorithms based on measurements of the Stokes vector of reflected solar light as detected on a satellite. Various underlying surface reflectance models are discussed and evaluated.

  18. The effects of 'does not apply' on measurement of temperament with the Infant Behavior Questionnaire-Revised: A cautionary tale for very young infants.

    PubMed

    Giesbrecht, Gerald F; Dewey, Deborah

    2014-10-01

    The Infant Behavior Questionnaire-Revised (IBQ-R) is a widely used parent report measure of infant temperament. Items marked 'does not apply' (NA) are treated as missing data when calculating scale scores, but the effect of this practice on assessment of infant temperament has not been reported. To determine the effect of NA responses on assessment of infant temperament and to evaluate the remedy offered by several missing data strategies. A prospective, community-based longitudinal cohort study. 401 infants who were born >37 weeks of gestation. Mothers completed the short form of the IBQ-R when infants were 3 months and 6 months of age. The rate of NA responses at the 3-month assessment was three times as high (22%) as the rate at 6 months (7%). Internal consistency was appreciably reduced and scale means were inflated in the presence of NA responses, especially at 3 months. The total number of NA items endorsed by individual parents was associated with infant age and parity. None of the missing data strategies completely eliminated problems related to NA responses, but the Expectation Maximization algorithm greatly reduced these problems. The findings suggest that researchers should exercise caution when interpreting results obtained from infants at 3 months of age. Careful selection of scales, selecting a full-length version of the IBQ-R, and use of a modern missing data technique may help to maintain the quality of data obtained from very young infants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. An introduction to the global positioning system and some geological applications

    NASA Technical Reports Server (NTRS)

    Dixon, T. H.

    1991-01-01

    The fundamental principles of the global positioning system (GPS) are reviewed, with consideration given to geological and geophysical applications and related accuracy requirements. Recent improvements are emphasized which relate to areas such as equipment cost, limitations in the GPS satellite constellation, data analysis, uncertainties in satellite orbits and propagation delays, and problems in resolving carrier phase cycle ambiguities. Earthquake processes and near-fault crustal deformation monitoring have been facilitated by advances in GPS data acquisition and analysis. Horizontal positioning capability has been improved by the new satellite constellation, better models, and global tracking networks. New classes of tectonic problems may now be studied through GPS, such as kinematic descriptions of crustal deformation and the measurement of relative plate motion at convergent boundaries. Continued improvements in the GPS are foreseen.

  20. A Comparison of Techniques for Scheduling Fleets of Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2003-01-01

    Earth observing satellite (EOS) scheduling is a complex real-world domain representative of a broad class of over-subscription scheduling problems. Over-subscription problems are those where requests for a facility exceed its capacity. These problems arise in a wide variety of NASA and terrestrial domains and are an important class of scheduling problems because such facilities often represent large capital investments. We have run experiments comparing multiple variants of the genetic algorithm, hill climbing, simulated annealing, squeaky wheel optimization, and iterated sampling on two variants of a realistically sized model of the EOS scheduling problem. These are implemented as permutation-based methods: methods that search in the space of priority orderings of observation requests and evaluate each permutation by using it to drive a greedy scheduler. Simulated annealing performs best, and random mutation operators outperform our squeaky wheel (more intelligent) operator. Furthermore, taking smaller steps towards the end of the search improves performance.
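The permutation-based approach described above can be sketched end to end: candidate solutions are priority orderings of requests, a greedy scheduler turns each ordering into a schedule, and simulated annealing with a random-swap mutation searches the ordering space. The toy model (a single sensor, each request a fixed time window, objective = number of satisfied requests) is an assumption for illustration, not the paper's realistically sized EOS model.

```python
# Permutation search with a greedy decoder: simulated annealing over priority
# orderings of observation requests.
import math
import random

random.seed(42)
requests = []
for _ in range(40):                       # 40 requests competing for one sensor
    s = random.randint(0, 90)
    requests.append((s, s + random.randint(5, 15)))  # (start, end) windows

def greedy_schedule(order):
    """Accept requests in priority order, skipping any that overlap one
    already accepted; return the number of satisfied requests."""
    taken = []
    for i in order:
        s, e = requests[i]
        if all(e <= ts or s >= te for ts, te in taken):
            taken.append((s, e))
    return len(taken)

def anneal(n_steps=5000, t0=2.0):
    order = list(range(len(requests)))
    best = cur = greedy_schedule(order)
    best_order = order[:]
    for step in range(n_steps):
        t = t0 * (1 - step / n_steps) + 1e-9          # linear cooling
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]       # random swap mutation
        new = greedy_schedule(order)
        if new >= cur or random.random() < math.exp((new - cur) / t):
            cur = new
            if cur > best:
                best, best_order = cur, order[:]
        else:
            order[i], order[j] = order[j], order[i]   # undo the swap
    return best, best_order

best, best_order = anneal()
print(best, "of", len(requests), "requests scheduled")
```

Searching over orderings rather than schedules keeps every candidate feasible by construction, since the greedy decoder never double-books the sensor; the optimizer only decides who gets priority.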

Top