Sample records for minimum time needed

  1. Computationally-Efficient Minimum-Time Aircraft Routes in the Presence of Winds

    NASA Technical Reports Server (NTRS)

    Jardin, Matthew R.

    2004-01-01

    A computationally efficient algorithm for minimizing the flight time of an aircraft in a variable wind field has been invented. The algorithm, referred to as Neighboring Optimal Wind Routing (NOWR), is based upon neighboring-optimal-control (NOC) concepts and achieves minimum-time paths by adjusting aircraft heading according to wind conditions at an arbitrary number of wind measurement points along the flight route. The NOWR algorithm may either be used in a fast-time mode to compute minimum-time routes prior to flight, or in a feedback mode to adjust aircraft heading in real time. By traveling minimum-time routes instead of great-circle (direct) routes, flights across the United States can save an average of about 7 minutes, and as much as one hour of flight time during periods of strong jet-stream winds. The neighboring optimal routes computed via the NOWR technique have been shown to be within 1.5 percent of the absolute minimum-time routes for flights across the continental United States. On a typical 450-MHz Sun Ultra workstation, the NOWR algorithm produces complete minimum-time routes in less than 40 milliseconds, a rate of 25 optimal routes per second. The closest comparable optimization technique runs approximately 10 times slower. Airlines currently use various trial-and-error search techniques to determine which of a set of commonly traveled routes will minimize flight time. These algorithms are too computationally expensive for use in real-time systems, or in systems where many optimal routes must be computed in a short amount of time. Instead of operating in real time, airlines typically plan a trajectory several hours in advance using wind forecasts. If winds change significantly from the forecasts, the resulting flights are no longer minimum-time. The need for a computationally efficient wind-optimal routing algorithm is even greater for new air-traffic-control automation concepts, in which thousands of wind-optimal routes may need to be computed and checked for conflicts in just a few minutes. These factors motivated the development of a more efficient wind-optimal routing algorithm.
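
    The NOWR scheme itself is not given in this record, but the underlying minimum-time-heading idea is the classical Zermelo navigation problem. The sketch below integrates the textbook Zermelo steering ODE (Bryson and Ho) for one candidate initial heading; the wind model, airspeed, and heading value are assumptions, not the paper's.

    ```python
    import numpy as np

    # Hedged sketch: minimum-time steering in a wind field via the classical
    # Zermelo steering ODE (Bryson & Ho), the idea underlying NOC methods like
    # NOWR. Wind model, airspeed and initial heading below are assumptions.

    V = 1.0  # airspeed (nondimensional)

    def wind(x, y):
        """Illustrative linear wind shear: eastward component grows with y."""
        return 0.3 * y, 0.0

    def wind_jacobian(x, y, h=1e-5):
        """Central-difference partials (du/dx, du/dy, dv/dx, dv/dy)."""
        (uxp, vxp), (uxm, vxm) = wind(x + h, y), wind(x - h, y)
        (uyp, vyp), (uym, vym) = wind(x, y + h), wind(x, y - h)
        return ((uxp - uxm) / (2*h), (uyp - uym) / (2*h),
                (vxp - vxm) / (2*h), (vyp - vym) / (2*h))

    def fly(x, y, theta, dt=0.01, t_end=5.0):
        """Integrate position and the Zermelo optimal-heading ODE (Euler)."""
        path = [(x, y)]
        for _ in range(int(t_end / dt)):
            u, v = wind(x, y)
            ux, uy, vx, vy = wind_jacobian(x, y)
            s, c = np.sin(theta), np.cos(theta)
            # theta_dot = sin^2(th)*dv/dx + sin*cos*(du/dx - dv/dy) - cos^2(th)*du/dy
            theta += dt * (s*s*vx + s*c*(ux - vy) - c*c*uy)
            x += dt * (V*c + u)
            y += dt * (V*s + v)
            path.append((x, y))
        return np.array(path)

    print(fly(0.0, 0.0, theta=0.4)[-1])  # endpoint for one candidate heading
    ```

    In a full planner the initial heading would be found by shooting so the path hits the destination; NOWR's contribution, per the abstract, is avoiding that expensive search by perturbing a stored nominal solution.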

  2. A simple device for measuring the minimum current velocity to maintain semi-buoyant fish eggs in suspension

    USGS Publications Warehouse

    Mueller, Julia S.; Cheek, Brandon D.; Chen, Qingman; Groeschel, Jillian R.; Brewer, Shannon K.; Grabowski, Timothy B.

    2013-01-01

    Pelagic broadcast spawning cyprinids are common in Great Plains rivers and streams. This reproductive guild produces non-adhesive, semi-buoyant eggs that require sufficient current velocity to remain in suspension during development. Although studies have suggested that there is a minimum velocity needed to keep the eggs in suspension, this velocity has not been estimated directly, nor has the influence of physicochemical factors on egg buoyancy been determined. We developed a simple, inexpensive flow chamber that allows evaluation of the minimum current velocity needed to keep semi-buoyant eggs in suspension at any point during egg development and over a wide range of physicochemical conditions. We used gellan beads soaked in freshwater for 0, 24, and 48 hrs as egg surrogates and evaluated the minimum current velocities necessary to keep them in suspension at different combinations of temperature (20.0 ± 1.0 °C, 25.0 ± 1.0 °C, and 28.0 ± 1.0 °C) and total dissolved solids (TDS; 1,000 mg/L, 3,000 mg/L, and 6,000 mg/L). Our methodology generated consistent, repeatable results within treatment groups. The minimum current velocities needed to keep the gellan beads in suspension ranged from 0.001 to 0.026 and were negatively correlated with soak time and TDS and positively correlated with temperature. The flow chamber is a viable approach for evaluating the minimum current velocities needed to keep the eggs of pelagic broadcast spawning cyprinids in suspension during development.

  3. Fire behavior simulation in Mediterranean forests using the minimum travel time algorithm

    Treesearch

    Kostas Kalabokidis; Palaiologos Palaiologou; Mark A. Finney

    2014-01-01

    Recent large wildfires in Greece exemplify the need for pre-fire burn probability assessment and possible landscape fire flow estimation to enhance fire planning and resource allocation. The Minimum Travel Time (MTT) algorithm, incorporated as a module in version five of FlamMap, provides valuable fire behavior functions while enabling multi-core utilization for the...

  4. Trade-offs between driving nodes and time-to-control in complex networks

    PubMed Central

    Pequito, Sérgio; Preciado, Victor M.; Barabási, Albert-László; Pappas, George J.

    2017-01-01

    Recent advances in control theory provide efficient tools to determine the minimum number of driving (or driven) nodes needed to steer a complex network towards a desired state. Because this often must be done within a given time window, it is of practical importance to understand the trade-offs between the minimum number of driving/driven nodes and the minimum time required to reach a desired state. We therefore introduce the notion of an actuation spectrum to capture such trade-offs, and use it to find that in many complex networks only a small fraction of driving (or driven) nodes is required to steer the network to a desired state within a relatively small time window. Furthermore, our empirical studies reveal that, even though synthetic network models are designed to present structural properties similar to those observed in real networks, their actuation spectra can be dramatically different. This supports the need to develop new synthetic network models able to replicate the controllability properties of real-world networks. PMID:28054597
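
    The authors' actuation-spectrum code is not included in this record; as a hedged, related illustration, the sketch below computes the minimum number of driver nodes of a directed network via the maximum-matching result of structural controllability (Liu, Slotine and Barabási, 2011), i.e., the unconstrained-time end of the trade-off discussed above. The example graph is made up; networkx is assumed available.

    ```python
    import networkx as nx
    from networkx.algorithms import bipartite

    # Hedged illustration: minimum number of driver nodes of a directed
    # network via the maximum-matching result of structural controllability
    # (Liu, Slotine & Barabasi 2011). The example graph is made up.

    def minimum_driver_nodes(G: nx.DiGraph) -> int:
        B = nx.Graph()
        out_nodes = [("out", u) for u in G]
        B.add_nodes_from(out_nodes, bipartite=0)
        B.add_nodes_from((("in", u) for u in G), bipartite=1)
        B.add_edges_from((("out", u), ("in", v)) for u, v in G.edges)
        matching = bipartite.hopcroft_karp_matching(B, top_nodes=out_nodes)
        matched_edges = len(matching) // 2      # dict lists both directions
        return max(len(G) - matched_edges, 1)   # unmatched nodes need drivers

    G = nx.DiGraph([(1, 2), (2, 3), (2, 4), (4, 5)])
    print(minimum_driver_nodes(G))  # -> 2 (e.g. drive nodes 1 and 4)
    ```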

  5. Trade-offs between driving nodes and time-to-control in complex networks

    NASA Astrophysics Data System (ADS)

    Pequito, Sérgio; Preciado, Victor M.; Barabási, Albert-László; Pappas, George J.

    2017-01-01

    Recent advances in control theory provide efficient tools to determine the minimum number of driving (or driven) nodes needed to steer a complex network towards a desired state. Because this often must be done within a given time window, it is of practical importance to understand the trade-offs between the minimum number of driving/driven nodes and the minimum time required to reach a desired state. We therefore introduce the notion of an actuation spectrum to capture such trade-offs, and use it to find that in many complex networks only a small fraction of driving (or driven) nodes is required to steer the network to a desired state within a relatively small time window. Furthermore, our empirical studies reveal that, even though synthetic network models are designed to present structural properties similar to those observed in real networks, their actuation spectra can be dramatically different. This supports the need to develop new synthetic network models able to replicate the controllability properties of real-world networks.

  6. Coding for Communication Channels with Dead-Time Constraints

    NASA Technical Reports Server (NTRS)

    Moision, Bruce; Hamkins, Jon

    2004-01-01

    Coding schemes have been designed and investigated specifically for optical and electronic data-communication channels in which information is conveyed via pulse-position modulation (PPM) subject to dead-time constraints. These schemes involve the use of error-correcting codes concatenated with codes denoted constrained codes, and the combination is decoded using an iterative method. In pulse-position modulation, time is partitioned into frames of M slots of equal duration. Each frame contains one pulsed slot (all others are non-pulsed). For a given channel, the dead-time constraints are defined as a maximum and a minimum on the allowable time between pulses. For example, if a Q-switched laser is used to transmit the pulses, then the minimum allowable dead time is the time needed to recharge the laser for the next pulse. In the case of bits recorded on a magnetic medium, the minimum allowable time between pulses depends on the recording/playback speed and the minimum distance between pulses needed to prevent interference between adjacent bits during readout. The maximum allowable dead time for a given channel is the maximum time for which it is possible to satisfy the requirement to synchronize slots. In mathematical shorthand, the dead-time constraints for a given channel are represented by the pair of integers (d, k), where d is the minimum allowable number of zeroes between ones and k is the maximum allowable number of zeroes between ones. A system of the type to which the present schemes apply is represented by a binary-input, real-valued-output channel model illustrated in a figure in the original report. At the transmitting end, information bits are first encoded by use of an error-correcting code, then further encoded by use of a constrained code. Several constrained codes for channels subject to constraints of (d, ∞) have been investigated theoretically and computationally. The baseline codes chosen for purposes of comparison were simple PPM codes characterized by M-slot PPM frames separated by d-slot dead times.
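
    A minimal sketch of the (d, k) run-length constraint as defined above; the validator and its example sequences are illustrative, not the paper's codes.

    ```python
    # Check the (d, k) dead-time constraint described above: every run of
    # zeros between consecutive ones must have length in [d, k]. Leading and
    # trailing zero runs are ignored here for simplicity.

    def satisfies_dk(bits, d, k=float("inf")):
        ones = [i for i, b in enumerate(bits) if b == 1]
        for a, b in zip(ones, ones[1:]):
            gap = b - a - 1              # zeros between adjacent pulses
            if gap < d or gap > k:
                return False
        return True

    # E.g. a Q-switched laser needing >= 2 recharge slots, sync limit of 7:
    print(satisfies_dk([1, 0, 0, 1, 0, 0, 0, 1], d=2, k=7))  # True
    print(satisfies_dk([1, 1, 0, 0, 1], d=2, k=7))           # False: gap of 0
    print(satisfies_dk([1, 0, 0, 0, 1], d=3))                # (d, inf): True
    ```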

  7. Efficiency and large deviations in time-asymmetric stochastic heat engines

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Vaikuntanathan, Suriyanarayanan; ...

    2014-10-24

    In a stochastic heat engine driven by a cyclic non-equilibrium protocol, fluctuations in work and heat give rise to a fluctuating efficiency. Using computer simulations and tools from large deviation theory, we have examined these fluctuations in detail for a model two-state engine. We find in general that the form of efficiency probability distributions is similar to those described by Verley et al (2014 Nat. Commun. 5 4721), in particular featuring a local minimum in the long-time limit. In contrast to the time-symmetric engine protocols studied previously, however, this minimum need not occur at the value characteristic of a reversible Carnot engine. Furthermore, while the local minimum may reside at the global minimum of a large deviation rate function, it does not generally correspond to the least likely efficiency measured over finite time. Lastly, we introduce a general approximation for the finite-time efficiency distribution, $P(\eta)$, based on large deviation statistics of work and heat, that remains very accurate even when $P(\eta)$ deviates significantly from its large deviation form.

  8. How Does Definition of Minimum Break Length Affect Objective Measures of Sitting Outcomes Among Office Workers?

    PubMed

    Kloster, Stine; Danquah, Ida Høgstedt; Holtermann, Andreas; Aadahl, Mette; Tolstrup, Janne Schurmann

    2017-01-01

    Harmful health effects associated with sedentary behavior may be attenuated by breaking up long periods of sitting with standing or walking. However, studies assess interruptions in sitting time differently, making comparisons between studies difficult. How the definition of minimum break duration affects sitting outcomes has not previously been described. Therefore, the aim was to address how definitions of break length affect total sitting time, number of sit-to-stand transitions, number of prolonged sitting periods, and time accumulated in prolonged sitting periods among office workers. Data were collected from 317 office workers. Thigh position was assessed with an ActiGraph GT3X+ fixed on the right thigh. Data were exported with varying minimum break (bout) lengths, and sitting outcomes were then calculated for each break-length definition. The number of sit-to-stand transitions decreased, and the number of prolonged sitting periods and total time accumulated in prolonged sitting periods increased, with increasing minimum break length. Total sitting time was not influenced by varying break length. The definition of minimum break length thus influenced all sitting outcomes except total sitting time. A standard definition of break length is needed for comparison and interpretation of studies in the evolving research field of sedentary behavior.
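
    A hedged sketch of the bout-merging step this abstract describes: non-sitting runs shorter than the chosen minimum break length are folded back into the surrounding sitting bout before outcomes are scored. The per-second boolean input, the thresholds, and the 30-minute "prolonged" cutoff are common conventions, not the study's actual code.

    ```python
    import numpy as np

    # Breaks shorter than min_break_s do not count as interruptions; they are
    # merged back into the surrounding sitting bout before scoring outcomes.

    def sitting_outcomes(raw_sitting, min_break_s=60, prolonged_s=30 * 60):
        raw = np.asarray(raw_sitting, dtype=bool)   # one sample per second
        merged = raw.copy()
        cuts = np.flatnonzero(np.diff(merged.astype(int))) + 1
        bounds = np.concatenate(([0], cuts, [merged.size]))
        for a, b in zip(bounds[:-1], bounds[1:]):
            if not merged[a] and (b - a) < min_break_s:
                merged[a:b] = True              # too short to count as a break
        s = merged.astype(int)
        cuts = np.flatnonzero(np.diff(s)) + 1
        bounds = np.concatenate(([0], cuts, [s.size]))
        bouts = [b - a for a, b in zip(bounds[:-1], bounds[1:]) if merged[a]]
        return {
            "total_sitting_s": int(raw.sum()),  # unaffected by the definition
            "sit_to_stand": int(np.sum((s[:-1] == 1) & (s[1:] == 0))),
            "prolonged_periods": sum(n >= prolonged_s for n in bouts),
            "prolonged_time_s": sum(n for n in bouts if n >= prolonged_s),
        }

    day = np.random.default_rng(0).random(8 * 3600) < 0.7   # fake 8 h workday
    for b in (10, 60, 300):                                  # break definitions
        print(b, sitting_outcomes(day, min_break_s=b))
    ```

    Running the toy shows the pattern the study reports: transitions fall and prolonged periods grow as the minimum break length increases, while total sitting time stays fixed.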

  9. Application-oriented offloading in heterogeneous networks for mobile cloud computing

    NASA Astrophysics Data System (ADS)

    Tseng, Fan-Hsun; Cho, Hsin-Hung; Chang, Kai-Di; Li, Jheng-Cong; Shih, Timothy K.

    2018-04-01

    Internet applications have become so complicated that mobile devices need more computing resources to achieve shorter execution times, but they are restricted by limited battery capacity. Mobile cloud computing (MCC) has emerged to tackle the finite-resource problem of mobile devices: it offloads the tasks and jobs of mobile devices to cloud and fog environments using an offloading scheme. It is vital to MCC to decide which tasks should be offloaded and how to offload them efficiently. In this paper, we formulate the offloading problem between mobile device and cloud data center and propose two application-oriented algorithms for minimum execution time: the Minimum Offloading Time for Mobile device (MOTM) algorithm and the Minimum Execution Time for Cloud data center (METC) algorithm. The MOTM algorithm minimizes offloading time by selecting appropriate offloading links based on application categories. The METC algorithm minimizes execution time in the cloud data center by selecting virtual and physical machines that match the resource requirements of applications. Simulation results show that the proposed mechanism not only minimizes total execution time for mobile devices but also decreases their energy consumption.
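
    MOTM and METC are not specified in this record; the sketch below is a generic, textbook offloading decision (local execution versus transfer over the best link plus remote execution) that illustrates the kind of trade-off the paper formulates. All device, link, and power numbers are made up.

    ```python
    # Hedged toy of the offloading trade-off: run locally, or ship the task
    # over the best available link and run it in the cloud.

    def best_choice(cycles, data_bits, f_local, f_cloud, links,
                    p_tx=1.3, p_cpu=0.9):
        """Pick the execution option with minimum time.

        cycles: CPU cycles of the task; data_bits: input to ship to the cloud;
        f_*: CPU speeds (cycles/s); links: name -> bandwidth (bit/s);
        p_*: made-up watts. Returns (option, time_s, device_energy_J).
        """
        local = ("local", cycles / f_local, p_cpu * cycles / f_local)
        remote = min((("offload:" + name,
                       data_bits / bw + cycles / f_cloud,  # transfer + cloud run
                       p_tx * data_bits / bw)              # device pays radio only
                      for name, bw in links.items()),
                     key=lambda opt: opt[1])
        return min(local, remote, key=lambda opt: opt[1])

    print(best_choice(cycles=2e9, data_bits=8e6, f_local=1e9, f_cloud=8e9,
                      links={"wifi": 5e7, "lte": 1e7}))
    # -> ('offload:wifi', 0.41, 0.208): offloading beats 2.0 s local execution
    ```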

  10. Proton Fluxes Measured by the PAMELA Experiment from the Minimum to the Maximum Solar Activity for Solar Cycle 24

    NASA Astrophysics Data System (ADS)

    Martucci, M.; Munini, R.; Boezio, M.; Di Felice, V.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bruno, A.; Cafagna, F.; Campana, D.; Carlson, P.; Casolino, M.; Castellini, G.; De Santis, C.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Leonov, A.; Malakhov, V.; Marcelli, L.; Marcelli, N.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Osteria, G.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Potgieter, M. S.; Raath, J. L.

    2018-02-01

    Precise measurements of the time-dependent intensity of low-energy (<50 GeV) galactic cosmic rays (GCRs) are fundamental to test and improve the models that describe their propagation inside the heliosphere. In particular, data spanning different solar activity periods, i.e., from minimum to maximum, are needed to achieve a comprehensive understanding of such physical phenomena. The minimum phase between solar cycles 23 and 24 was peculiarly long, extending up to the beginning of 2010 and followed by the maximum phase, reached during early 2014. In this Letter, we present proton differential spectra measured from January 2010 to February 2014 by the PAMELA experiment. For the first time, the GCR proton intensity was studied over a wide energy range (0.08–50 GeV) by a single apparatus from a minimum to a maximum period of solar activity. The large statistics allowed the time variation to be investigated on a nearly monthly basis. Data were compared with, and interpreted in the context of, a state-of-the-art three-dimensional model describing GCR propagation through the heliosphere.

  11. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... meets a minimum response time. You may use the results of this test to determine transformation time, t... you use any analog or real-time digital filters during emission testing, you must operate those... the rise time and fall time as needed. You may also configure analog or digital filters before...

  12. White-crowned sparrow males show immediate flexibility in song amplitude but not in song minimum frequency in response to changes in noise levels in the field.

    PubMed

    Derryberry, Elizabeth P; Gentry, Katherine; Derryberry, Graham E; Phillips, Jennifer N; Danner, Raymond M; Danner, Julie E; Luther, David A

    2017-07-01

    The soundscape acts as a selective agent on organisms that use acoustic signals to communicate. A number of studies document variation in structure, amplitude, or timing of signal production in correspondence with environmental noise levels, thus supporting the hypothesis that organisms are changing their signaling behaviors to avoid masking. The time scale at which organisms respond is of particular interest. Signal structure may evolve across generations through processes such as cultural or genetic transmission. Individuals may also change their behavior during development (ontogenetic change) or in real time (i.e., immediate flexibility). These are not mutually exclusive mechanisms, and all must be investigated to understand how organisms respond to selection pressures from the soundscape. Previous work on white-crowned sparrows (Zonotrichia leucophrys) found that males holding territories in louder areas tend to sing higher frequency songs and that both noise levels and song frequency have increased over time (30 years) in urban areas. These previous findings suggest that songs are changing across generations; however, it is not known if this species also exhibits immediate flexibility. Here, we conducted an exploratory, observational study to ask whether males change the minimum frequency of their song in response to immediate changes in noise levels. We also ask whether males sing louder, as increased minimum frequency may be physiologically linked to producing sound at higher amplitudes, in response to immediate changes in environmental noise. We found that territorial males adjust song amplitude but not minimum frequency in response to changes in environmental noise levels. Our results suggest that males do not show immediate flexibility in song minimum frequency, although experimental manipulations are needed to test this hypothesis further. Our work highlights the need to investigate multiple mechanisms of adaptive response to soundscapes.

  13. Minimizing the area required for time constants in integrated circuits

    NASA Technical Reports Server (NTRS)

    Lyons, J. C.

    1972-01-01

    When a medium- or large-scale integrated circuit is designed, efforts are usually made to avoid resistor-capacitor time-constant generators, because the capacitor needed for such a circuit usually takes up more surface area on the chip than several resistors and transistors. When the use of this network is unavoidable, the designer usually chooses the resistor and capacitor combination that consumes the minimum surface area. The optimum ratio of resistance to capacitance that results in this minimum area is equal to the ratio of resistance to capacitance obtainable from a unit of surface area for the particular process being used. The minimum area required is proportional to the square root of the reciprocal of the product of the resistance and capacitance per unit area, and this minimum occurs when the area required by the resistor is equal to the area required by the capacitor.
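
    A hedged reconstruction of the optimization behind these statements, writing r and c for the resistance and capacitance obtainable per unit of chip area and tau for the required time constant:

    ```latex
    A(R) = \frac{R}{r} + \frac{C}{c}, \qquad RC = \tau
    \;\Longrightarrow\; A(R) = \frac{R}{r} + \frac{\tau}{R\,c},
    \qquad
    \frac{dA}{dR} = \frac{1}{r} - \frac{\tau}{R^{2}c} = 0
    \;\Longrightarrow\;
    R^{*} = \sqrt{\frac{\tau r}{c}},\quad
    C^{*} = \sqrt{\frac{\tau c}{r}},\quad
    \frac{R^{*}}{C^{*}} = \frac{r}{c},\quad
    A_{\min} = 2\sqrt{\frac{\tau}{r c}}.
    ```

    So the optimal R/C ratio equals the per-unit-area ratio r/c, the minimum area scales as the square root of 1/(rc), and the resistor and capacitor each occupy half of it, matching the three claims above.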

  14. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    NASA Astrophysics Data System (ADS)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms are proposed for computing this time-optimal consensus point, the control law to be used by each agent, and the time taken for consensus to occur. Two types of multi-agent systems are considered: (1) coupled single-integrator agents on a plane and (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm using the convexity of attainable sets and Helly's theorem is proposed to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelized among the agents such that each agent performs computations of O(N²) run-time complexity. Finally, local feedback time-optimal control laws are synthesized to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
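
    The paper's consensus-point computation is not reproduced in this record; as a hedged building block, the sketch below applies the textbook minimum-time (bang-bang) feedback law for a single double-integrator agent with bounded input driving to a target at rest, the kind of local law synthesized for the line case above. Parameters are illustrative.

    ```python
    # Textbook minimum-time feedback for one double integrator, |u| <= u_max:
    # bang-bang control switched on the curve e + v|v|/(2 u_max) = 0.

    def time_optimal_u(x, v, x_t, u_max=1.0):
        s = (x - x_t) + v * abs(v) / (2 * u_max)   # switching-curve function
        if abs(s) < 1e-9 and abs(v) < 1e-9:
            return 0.0                             # already at the target
        return -u_max if s > 0 else u_max

    x, v, dt = 0.0, 0.0, 1e-3                      # start at rest at the origin
    for step in range(5000):
        u = time_optimal_u(x, v, x_t=1.0)
        x += v * dt
        v += u * dt
        if abs(x - 1.0) < 5e-3 and abs(v) < 1e-2:
            break
    print(f"reached target in ~{step * dt:.2f} s "
          "(theory: 2*sqrt(d/u_max) = 2.00 s for unit distance)")
    ```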

  15. 29 CFR 785.34 - Effect of section 4 of the Portal-to-Portal Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... failure to pay the minimum wage or overtime compensation for time spent in “walking, riding, or traveling... employee is employed to perform either prior to the time on any particular workday at which such employee... from time clock to work-bench) need not be counted as working time unless it is compensable by contract...

  16. Leisure time physical activity and mortality: a detailed pooled analysis of the dose-response relationship.

    PubMed

    Arem, Hannah; Moore, Steven C; Patel, Alpa; Hartge, Patricia; Berrington de Gonzalez, Amy; Visvanathan, Kala; Campbell, Peter T; Freedman, Michal; Weiderpass, Elisabete; Adami, Hans Olov; Linet, Martha S; Lee, I-Min; Matthews, Charles E

    2015-06-01

    The 2008 Physical Activity Guidelines for Americans recommended a minimum of 75 vigorous-intensity or 150 moderate-intensity minutes per week (7.5 metabolic-equivalent hours per week) of aerobic activity for substantial health benefit and suggested additional benefits by doing more than double this amount. However, the upper limit of longevity benefit or possible harm with more physical activity is unclear. To quantify the dose-response association between leisure time physical activity and mortality and define the upper limit of benefit or harm associated with increased levels of physical activity. We pooled data from 6 studies in the National Cancer Institute Cohort Consortium (baseline 1992-2003). Population-based prospective cohorts in the United States and Europe with self-reported physical activity were analyzed in 2014. A total of 661,137 men and women (median age, 62 years; range, 21-98 years) and 116,686 deaths were included. We used Cox proportional hazards regression with cohort stratification to generate multivariable-adjusted hazard ratios (HRs) and 95% CIs. Median follow-up time was 14.2 years. Exposure: leisure time moderate- to vigorous-intensity physical activity. Main outcome: the upper limit of mortality benefit from high levels of leisure time physical activity. Compared with individuals reporting no leisure time physical activity, we observed a 20% lower mortality risk among those performing less than the recommended minimum of 7.5 metabolic-equivalent hours per week (HR, 0.80 [95% CI, 0.78-0.82]), a 31% lower risk at 1 to 2 times the recommended minimum (HR, 0.69 [95% CI, 0.67-0.70]), and a 37% lower risk at 2 to 3 times the minimum (HR, 0.63 [95% CI, 0.62-0.65]). An upper threshold for mortality benefit occurred at 3 to 5 times the physical activity recommendation (HR, 0.61 [95% CI, 0.59-0.62]); however, compared with the recommended minimum, the additional benefit was modest (31% vs 39%). There was no evidence of harm at 10 or more times the recommended minimum (HR, 0.69 [95% CI, 0.59-0.78]). A similar dose-response relationship was observed for mortality due to cardiovascular disease and to cancer. Meeting the 2008 Physical Activity Guidelines for Americans minimum by either moderate- or vigorous-intensity activities was associated with nearly the maximum longevity benefit. We observed a benefit threshold at approximately 3 to 5 times the recommended leisure time physical activity minimum and no excess risk at 10 or more times the minimum. In regard to mortality, health care professionals should encourage inactive adults to perform leisure time physical activity and do not need to discourage adults who already participate in high-activity levels.

  17. Determining the minimum ripening time of artisanal Minas cheese, a traditional Brazilian cheese

    PubMed Central

    Martins, José M.; Galinari, Éder; Pimentel-Filho, Natan J.; Ribeiro, José I.; Furtado, Mauro M.; Ferreira, Célia L.L.F.

    2015-01-01

    Physical, physicochemical, and microbiological changes were monitored in 256 samples of artisanal Minas cheese from eight producers in the Serro region (Minas Gerais, Brazil) over 64 days of ripening, to determine the minimum ripening time for the cheese to reach the safe microbiological limits established by Brazilian legislation. The cheeses were produced in the dry season (April–September) and the rainy season (October–March); 128 cheeses were ripened at room temperature (25 ± 4 °C), and 128 were ripened under refrigeration (8 ± 1 °C) as a control. No Listeria monocytogenes was found, but one cheese under refrigeration had Salmonella during the first 15 days of ripening; after 22 days, the pathogen was no longer detected. Seventeen days was the minimum ripening time at room temperature to reduce counts of total coliforms (safe limit 1,000 cfu/g) and of Escherichia coli and Staphylococcus aureus (safe limit 100 cfu/g) in both periods of manufacture. Under refrigeration, as expected, the minimum ripening time was longer: 33 days in the dry season and 63 days in the rainy season. We therefore suggest that artisanal Minas cheese be ripened at room temperature, since this condition shortens the time needed to reach the microbiological quality required by Brazilian law while maintaining the appearance and flavor characteristics of this traditional cheese. PMID:26221111

  18. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    PubMed

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large tasks. Optimal scheduling of tasks in a cloud computing environment is an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task-assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
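
    As a hedged illustration of one of the six heuristics compared above, the sketch below implements Min-min on a made-up expected-time-to-compute (ETC) matrix; the study's own testbed and its cost and imbalance metrics are not reproduced here.

    ```python
    # Min-min: repeatedly schedule the task whose minimum completion time
    # (over all machines) is smallest, then update that machine's ready time.

    def min_min(etc):
        """etc[t][m] = expected execution time of task t on machine m."""
        n_machines = len(etc[0])
        ready = [0.0] * n_machines               # machine available times
        unscheduled = set(range(len(etc)))
        schedule = {}
        while unscheduled:
            # Each task's best machine by completion time...
            best = {t: min(range(n_machines),
                           key=lambda m: ready[m] + etc[t][m])
                    for t in unscheduled}
            # ...then commit the task whose minimum completion time is smallest.
            t = min(unscheduled, key=lambda t: ready[best[t]] + etc[t][best[t]])
            m = best[t]
            ready[m] += etc[t][m]
            schedule[t] = m
            unscheduled.remove(t)
        return schedule, max(ready)              # assignment and makespan

    etc = [[14, 5], [11, 8], [6, 16], [3, 12]]
    print(min_min(etc))  # -> ({3: 0, 0: 1, 2: 0, 1: 1}, 13.0)
    ```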

  19. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    PubMed Central

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large tasks. Optimal scheduling of tasks in a cloud computing environment is an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task-assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505

  20. 5 CFR 300.503 - Conditions for using private sector temporaries.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... including vacations or other circumstances which are not shown to be compelling in the judgment of the... through the direct appointment of temporary employees within the time available by the date, and for the duration of time, help is needed. At minimum, this should include an agency determination that there are no...

  1. 75 FR 48370 - Biweekly Notice Applications and Amendments to Facility Operating Licenses Involving No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-10

    ... revise the minimum Emergency Diesel Generator (EDG) output voltage acceptance criterion in Surveillance... ensures the timely transfer of plant safety system loads to the Emergency Diesel Generators in the event a... from the emergency diesel generators in a timely manner. This change is needed to bring Fermi 2 into...

  2. Application of QUAL2K Model to Assess Ecological Purification Technology for a Polluted River

    PubMed Central

    Zhu, Wenting; Niu, Qian; Zhang, Ruibin; Ye, Rui; Qian, Xin; Qian, Yu

    2015-01-01

    Industrialization and urbanization have caused water pollution and ecosystem degradation, especially in urban canals and rivers in China; accordingly, effective water quality improvement programs are needed. In this study, the Tianlai River in Jiangsu, China was taken as a research site, and a combination of ecological purification technologies consisting of biological rope, phytoremediation, and activated carbon was applied in a laboratory-scale study to examine degradation coefficients under dynamic water conditions. Coefficients were then input into the QUAL2K model to simulate various hypothetical scenarios and determine the minimum density of the ecological purification combination and the hydraulic retention time (HRT) needed to meet Grade V or IV of the Chinese standard for surface water. The minimum densities for Grades V and IV were 1.6 times and 2 times the experimental density, while the minimum HRTs were 2.4 days and 3 days, respectively. The results of this study should provide a practical and efficient design method for ecological purification programs. PMID:25689997

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H; Guerrero, M; Prado, K

    Purpose: Building up a TG-71-based electron monitor-unit (MU) calculation protocol usually involves massive measurements. This work investigates a minimum data set of measurements, its calculation accuracy, and its measurement time. Methods: For the 6, 9, 12, 16, and 20 MeV beams of our Varian Clinac-series linear accelerators, complete measurements were performed at different depths using 5 square applicators (6, 10, 15, 20 and 25 cm) with different cutouts (2, 3, 4, 6, 10, 15 and 20 cm, up to applicator size) for 5 different SSDs. For each energy, there were 8 PDD scans and 150 point measurements for applicator factors, cutout factors, and effective SSDs, which were then converted to air-gap factors for SSDs of 99-110 cm. The dependence of each dosimetric quantity on field size and SSD was examined to determine the minimum data set of measurements as a subset of the complete measurements. The "missing" data excluded from the minimum data set were approximated by linear or polynomial fitting functions based on the included data. The total measurement time and the calculated electron MUs using the minimum and the complete data sets were compared. Results: The minimum data set includes 4 or 5 PDDs and 51 to 66 point measurements for each electron energy; more PDDs and fewer point measurements are generally needed as energy increases. Using less than 50% of the complete measurement time, the minimum data set generates acceptable MU calculation results compared to those with the complete data set: the PDD difference is within 1 mm and the calculated MU difference is less than 1.5%. Conclusion: Data set measurement for TG-71 electron MU calculations can be minimized based on knowledge of how each dosimetric quantity depends on the various setup parameters. The suggested minimum data set allows acceptable MU calculation accuracy and shortens measurement time by a few hours.
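
    A minimal sketch of the fitting step in the Methods: measure a dosimetric quantity (here, cutout factors) at only a few sizes and approximate the excluded sizes with a low-order polynomial. All values are illustrative, not clinical data; numpy is assumed.

    ```python
    import numpy as np

    # Fit a least-squares quadratic to the measured subset, then estimate
    # the "missing" sizes excluded from the minimum data set.

    sizes   = np.array([3.0, 4.0, 6.0, 10.0])         # measured cutouts (cm)
    factors = np.array([0.962, 0.984, 0.996, 1.000])  # measured cutout factors
    coeffs  = np.polyfit(sizes, factors, deg=2)       # 2nd-order fit
    missing = np.array([5.0, 8.0])                    # sizes left unmeasured
    print(dict(zip(missing, np.round(np.polyval(coeffs, missing), 3))))
    ```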

  4. The impact of minimum wages on population health: evidence from 24 OECD countries.

    PubMed

    Lenhart, Otto

    2017-11-01

    This study examines the relationship between minimum wages and several measures of population health by analyzing data from 24 OECD countries over a period of 31 years. Specifically, I test for health effects of within-country variations in the generosity of minimum wages, measured by the Kaitz index. The paper finds that higher minimum wages are associated with significant reductions in overall mortality rates as well as in the number of deaths from causes that are more prevalent among individuals with low socioeconomic status (e.g., diabetes, diseases of the circulatory system, stroke). A 10-percentage-point increase in the Kaitz index is associated with significant declines in death rates and an increase in life expectancy of 0.44 years. Furthermore, I provide evidence for potential channels through which minimum wages affect population health by showing that more generous minimum wages influence outcomes such as poverty, the share of the population with unmet medical needs, the number of doctor consultations, tobacco consumption, calorie intake, and the likelihood of being overweight.

  5. Development of Gis Tool for the Solution of Minimum Spanning Tree Problem using Prim's Algorithm

    NASA Astrophysics Data System (ADS)

    Dutta, S.; Patra, D.; Shankar, H.; Alok Verma, P.

    2014-11-01

    A minimum spanning tree (MST) of a connected, undirected and weighted network is a tree of that network consisting of all its nodes, in which the sum of the edge weights is minimum among all possible spanning trees of the same network. In this study, we have developed a new GIS tool that uses the well-known Prim's algorithm to construct the minimum spanning tree of a connected, undirected and weighted road network. The algorithm is based on the weight (adjacency) matrix of a weighted network and helps solve complex network MST problems easily, efficiently and effectively. Selecting an appropriate algorithm is essential, since otherwise it is hard to obtain an optimal result. In the case of a road transportation network, it is essential to find optimal results by considering all the necessary junction points based on a cost factor (time or distance). This paper addresses the minimum spanning tree (MST) problem of a road network, finding its minimum span while considering all the important network junction points. GIS technology is usually used to solve network-related problems such as the optimal path problem, the travelling salesman problem, vehicle routing problems, and location-allocation problems. Therefore, in this study we have developed a customized GIS tool, using a Python script in ArcGIS software, for the solution of the MST problem for the road transportation network of Dehradun city, considering distance and time as the impedance (cost) factors. The tool is user-friendly, requires no deep knowledge of the subject, and gives users access to varied information adapted to their needs. This GIS tool for MST can be applied to a nationwide plan in India called the Prime Minister Gram Sadak Yojana to provide optimal all-weather road connectivity to unconnected villages (points). The tool is also useful for constructing highways or railways spanning several cities optimally, or for connecting all cities with minimum total road length.
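
    For reference, a minimal Python version of Prim's algorithm on a weighted, undirected road graph, with edge weights standing for distance or travel time; the ArcGIS tool itself is not reproduced here, and the example graph is made up.

    ```python
    import heapq

    # Prim's algorithm: grow the tree from a start node, always adding the
    # cheapest edge that reaches a node not yet in the tree.

    def prim_mst(adj, start=0):
        """adj: {node: [(weight, neighbor), ...]} for an undirected graph.
        Returns the (weight, node-added) MST edges and the total cost."""
        visited = {start}
        heap = list(adj[start])
        heapq.heapify(heap)
        mst, total = [], 0.0
        while heap and len(visited) < len(adj):
            w, v = heapq.heappop(heap)
            if v in visited:
                continue                 # stale entry; node already reached
            visited.add(v)
            mst.append((w, v))
            total += w
            for edge in adj[v]:
                if edge[1] not in visited:
                    heapq.heappush(heap, edge)
        return mst, total

    roads = {0: [(4, 1), (1, 2)], 1: [(4, 0), (2, 2), (5, 3)],
             2: [(1, 0), (2, 1), (8, 3)], 3: [(5, 1), (8, 2)]}
    print(prim_mst(roads))   # -> ([(1, 2), (2, 1), (5, 3)], 8.0)
    ```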

  6. A minimum propellant solution to an orbit-to-orbit transfer using a low thrust propulsion system

    NASA Technical Reports Server (NTRS)

    Cobb, Shannon S.

    1991-01-01

    The Space Exploration Initiative is considering the use of low thrust (nuclear electric, solar electric) and intermediate thrust (nuclear thermal) propulsion systems for transfer to Mars and back. Due to the duration of such a mission, a low thrust minimum-fuel solution is of interest; a savings of fuel can be substantial if the propulsion system is allowed to be turned off and back on. This switching of the propulsion system helps distinguish the minimal-fuel problem from the well-known minimum-time problem. Optimal orbit transfers are also of interest to the development of a guidance system for orbital maneuvering vehicles which will be needed, for example, to deliver cargoes to the Space Station Freedom. The problem of optimizing trajectories for an orbit-to-orbit transfer with minimum-fuel expenditure using a low thrust propulsion system is addressed.

  7. Optimization of memory use of fragment extension-based protein-ligand docking with an original fast minimum cost flow algorithm.

    PubMed

    Yanagisawa, Keisuke; Komine, Shunta; Kubota, Rikuto; Ohue, Masahito; Akiyama, Yutaka

    2018-06-01

    The need to accelerate large-scale protein-ligand docking in virtual screening against huge compound databases led researchers to propose a strategy that memorizes the evaluation result of a partial structure of a compound and reuses it to evaluate other compounds. However, the previous method required frequent disk accesses, resulting in insufficient acceleration. More efficient memory usage can therefore be expected to lead to further acceleration, and optimal memory usage can be achieved by solving the minimum cost flow problem. In this research, we propose a fast algorithm for the minimum cost flow problem that exploits the characteristics of the graph generated for this problem as constraints. The proposed algorithm, which optimizes memory usage, was approximately seven times faster than existing minimum cost flow algorithms. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
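
    The authors' specialized algorithm is not reproduced in this record; the hedged sketch below just sets up the generic minimum cost flow problem it accelerates and solves it with networkx's stock solver on a made-up graph.

    ```python
    import networkx as nx

    # Generic minimum cost flow: ship 4 units from source to sink at least
    # total cost, respecting edge capacities. Graph and numbers are made up.

    G = nx.DiGraph()
    G.add_node("s", demand=-4)   # 4 units of flow leave the source
    G.add_node("t", demand=4)    # ...and must arrive at the sink
    G.add_edge("s", "a", capacity=3, weight=2)
    G.add_edge("s", "b", capacity=2, weight=5)
    G.add_edge("a", "t", capacity=3, weight=1)
    G.add_edge("b", "t", capacity=2, weight=1)

    flow = nx.min_cost_flow(G)               # {'s': {'a': 3, 'b': 1}, ...}
    print(flow, nx.cost_of_flow(G, flow))    # total cost 15
    ```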

  8. 22 CFR 226.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Post-award Requirements Financial and Program Management § 226.22 Payment. (a) Payment methods... recipient organization shall be limited to the minimum amounts needed and be timed to be in accordance with...

  9. 22 CFR 518.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... and disbursement by the recipient, and (ii) Financial management systems that meet the standards for... organization shall be limited to the minimum amounts needed and be timed to be in accordance with the actual...

  10. 38 CFR 49.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 49.22 Payment. (a... elapsing between the transfer of funds and disbursement by the recipient, and financial management systems... a recipient organization shall be limited to the minimum amounts needed and be timed to be in...

  11. 28 CFR 70.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (INCLUDING SUBAWARDS) WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS AND OTHER NON-PROFIT ORGANIZATIONS... transfer of funds and disbursement by the recipient, and financial management systems that meet the... organization will be limited to the minimum amounts needed and be timed to be in accordance with the actual...

  12. 28 CFR 70.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (INCLUDING SUBAWARDS) WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS AND OTHER NON-PROFIT ORGANIZATIONS... transfer of funds and disbursement by the recipient, and financial management systems that meet the... organization will be limited to the minimum amounts needed and be timed to be in accordance with the actual...

  13. 22 CFR 518.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... and disbursement by the recipient, and (ii) Financial management systems that meet the standards for... organization shall be limited to the minimum amounts needed and be timed to be in accordance with the actual...

  14. 7 CFR 3019.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program... established in § 3019.21. Cash advances to a recipient organization shall be limited to the minimum amounts needed and be timed to be in accordance with the actual, immediate cash requirements of the recipient...

  15. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time

    PubMed Central

    Avellar, Gustavo S. C.; Pereira, Guilherme A. S.; Pimenta, Luciano C. A.; Iscold, Paulo

    2015-01-01

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem’s (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles’ maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs. PMID:26540055

  16. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time.

    PubMed

    Avellar, Gustavo S C; Pereira, Guilherme A S; Pimenta, Luciano C A; Iscold, Paulo

    2015-11-02

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem's (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles' maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs.

  17. Dynamo magnetic field-induced angular momentum transport in protostellar nebulae - The 'minimum mass' protosolar nebula

    NASA Technical Reports Server (NTRS)

    Stepinski, T. F.; Levy, E. H.

    1990-01-01

    Magnetic torques can produce angular momentum redistribution in protostellar nebulas. Dynamo magnetic fields can be generated in differentially rotating and turbulent nebulas and can be the source of magnetic torques that transfer angular momentum from a protostar to a disk, as well as redistribute angular momentum within a disk. A magnetic field strength of 100-1000 G is needed to transport the major part of a protostar's angular momentum into a surrounding disk in a time characteristic of star formation, thus allowing formation of a solar-system size protoplanetary nebula in the usual 'minimum-mass' model of the protosolar nebula. This paper examines the possibility that a dynamo magnetic field could have induced the needed angular momentum transport from the proto-Sun to the protoplanetary nebula.

  18. Development of minimum standards for event-based data collection loggers and performance measure definitions for signalized intersections [summary].

    DOT National Transportation Integrated Search

    2017-01-01

    New traffic signal controllers, which have advanced data collection abilities, offer better information about the response of traffic signal timings to traffic flows. However, traffic engineers need more than raw data. The controllers must be set up ...

  19. 15 CFR 14.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ORGANIZATIONS Post-Award Requirements Financial and Program Management § 14.22 Payment. (a) Payment methods... transfer of funds and disbursement by the recipient, and financial management systems that meet the... organization shall be limited to the minimum amounts needed and be timed to be in accordance with the actual...

  20. 24 CFR 84.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 84.22 Payment. (a... management systems that meet the standards for fund control and accountability as established in § 84.21. Cash advances to a recipient organization shall be limited to the minimum amounts needed and be timed...

  1. 24 CFR 84.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 84.22 Payment. (a... management systems that meet the standards for fund control and accountability as established in § 84.21. Cash advances to a recipient organization shall be limited to the minimum amounts needed and be timed...

  2. 40 CFR 30.22 - Payment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 30.22 Payment. (a... systems that meet the standards for fund control and accountability as established in § 30.21. Cash advances to a recipient organization shall be limited to the minimum amounts needed and be timed to be in...

  3. 40 CFR 30.22 - Payment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 30.22 Payment. (a... systems that meet the standards for fund control and accountability as established in § 30.21. Cash advances to a recipient organization shall be limited to the minimum amounts needed and be timed to be in...

  4. Library Skills and Resources for Business Research.

    ERIC Educational Resources Information Center

    Lyle, Stanley P.; Ashbaugh, Donald L.

    This independent study module is intended to introduce business administration students and managers to business information available in books, periodicals, and other library sources, and to teach library search strategies for the acquisition of needed information with a minimum expenditure of time. The module consists of six parts. The first…

  5. 33 CFR 151.2036 - Extension of compliance date.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Extension of compliance date. 151.2036 Section 151.2036 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY... posted on the Internet. Extensions will be for no longer than the minimum time needed, as determined by...

  6. 33 CFR 151.2036 - Extension of compliance date.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Extension of compliance date. 151.2036 Section 151.2036 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY... posted on the Internet. Extensions will be for no longer than the minimum time needed, as determined by...

  7. 33 CFR 151.2036 - Extension of compliance date.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Extension of compliance date. 151.2036 Section 151.2036 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY... posted on the Internet. Extensions will be for no longer than the minimum time needed, as determined by...

  8. A method for calculating minimum biodiversity offset multipliers accounting for time discounting, additionality and permanence

    PubMed Central

    Laitila, Jussi; Moilanen, Atte; Pouzols, Federico M

    2014-01-01

    Biodiversity offsetting, which means compensation for ecological and environmental damage caused by development activity, has recently been gaining strong political support around the world. One common criticism levelled at offsets is that they exchange certain and almost immediate losses for uncertain future gains. In the case of restoration offsets, gains may be realized after a time delay of decades, and with considerable uncertainty. Here we focus on offset multipliers, the ratios between the damaged and compensated amounts (areas) of biodiversity. Multipliers have the attraction of being an easily understandable way of deciding how much offsetting is needed; on the other hand, exact values of multipliers are very difficult, if not impossible, to compute in practice. We introduce a mathematical method for deriving minimum levels for offset multipliers under the assumption that offsetting gains must compensate for the losses (no-net-loss offsetting). We calculate the absolute minimum multipliers that arise from time discounting and the delayed emergence of offsetting gains for a one-dimensional measure of biodiversity. Despite the highly simplified model, we show that even the absolute minimum multipliers may easily be quite large, on the order of dozens, and theoretically arbitrarily large, contradicting the relatively low multipliers found in the literature and in practice. While our results inform policy makers about realistic minimal offsetting requirements, they also challenge many current policies and show the importance of rigorous models for computing (minimum) offset multipliers. The strength of the presented method is that it requires minimal underlying information. We include a supplementary spreadsheet tool for calculating multipliers to facilitate application. PMID:25821578
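
    A hedged formalization of the time-discounting argument, under simplifying assumptions not taken from the paper: a one-dimensional biodiversity measure, a certain per-area offsetting gain g (as a fraction of the per-area loss), gains starting after a delay tau, and a continuous discount rate rho. No net loss then requires

    ```latex
    m \, g \, e^{-\rho\tau} \;\ge\; 1
    \qquad\Longrightarrow\qquad
    m_{\min} = \frac{e^{\rho\tau}}{g}.
    ```

    Even modest values give large multipliers: rho = 0.05/yr, a 40-year restoration delay, and g = 0.8 yield m_min = e^2/0.8 ≈ 9.2, and a 60-year delay yields ≈ 25, consistent with minimum multipliers on the order of dozens that grow without bound as the delay increases.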

  9. Determination of the Volume of Water for Suppressing the Thermal Decomposition of Forest Combustibles

    NASA Astrophysics Data System (ADS)

    Volkov, R. S.; Zhdanova, A. O.; Kuznetsov, G. V.; Strizhak, P. A.

    2017-07-01

    From the results of experimental studies of suppressing the thermal decomposition of typical forest combustibles (birch leaves, fir needles, aspen twigs, and a mixture of these three materials) with water aerosol, the minimum volumes of fire-extinguishing liquid have been determined, varying the volume of the forest-combustible samples from 0.00002 m³ to 0.0003 m³ and the area of their open surface from 0.0001 m² to 0.018 m². Dependences of the minimum volume of water on the area of the open surface of the forest combustible have been established, and approximation expressions for these dependences have been obtained. A forecast is made of the minimum volume of water needed to suppress the thermal decomposition of forest combustibles over areas from 1 cm² to 1 km², as well as of the characteristic quenching times as the water supplied per unit time is varied. It is shown that the amount of water needed for effective suppression of the thermal decomposition of forest combustibles is several times less than is customarily assumed.

  10. Laboratory Ventilation and Safety.

    ERIC Educational Resources Information Center

    Steere, Norman V.

    1965-01-01

    In order to meet the needs of both safety and economy, laboratory ventilation systems must effectively remove airborne toxic and flammable materials while exhausting a minimum volume of air. Laboratory hoods are the most commonly used means of removing gases, dusts, mists, vapors, and fumes from laboratory operations. To be effective,…

  11. Methods, metrics and research gaps around minimum data sets for nursing practice and fundamental care: A scoping literature review.

    PubMed

    Muntlin Athlin, Åsa

    2018-06-01

    To examine and map research on minimum data sets linked to nursing practice and the fundamentals of care. Another aim was to identify gaps in the evidence to suggest future research questions to highlight the need for standardisation of terminology around nursing practice and fundamental care. Addressing fundamental care has been highlighted internationally as a response to missed nursing care. Systematic performance measurements are needed to capture nursing practice outcomes. Overview of the literature framed by the scoping study methodology. PubMed and CINAHL were searched using the following inclusion criteria: peer-reviewed empirical quantitative and qualitative studies related to minimum data sets and nursing practice published in English. No time restrictions were set. Exclusion criteria were as follows: no available full text, reviews and methodological and discursive studies. Data were categorised into one of the fundamentals of care elements. The review included 20 studies published in 1999-2016. Settings were mainly nursing homes or hospitals. Of 14 elements of the fundamentals of care, 11 were identified as measures in the included studies, but their frequency varied. The most commonly identified elements concerned safety, prevention and medication (n = 11), comfort (n = 6) and eating and drinking (n = 5). Studies have used minimum data sets and included variables linked to nursing practices and fundamentals of care. However, the relations of these variables to nursing practice were not always clearly described and the main purpose of the studies was seldom to measure the outcomes of nursing interventions. More robust studies focusing on nursing practice and patient outcomes are warranted. Using minimum data sets can highlight the nurses' work and what impact it has on direct patient care. Appropriate models, systems and standardised terminology are needed to facilitate the documentation of nursing activities. © 2017 John Wiley & Sons Ltd.

  12. Attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm.

    PubMed

    Zhang, Jie; Wang, Yuping; Feng, Junhong

    2013-01-01

    In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent of a rule, its consequent, and the whole rule. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute. All metrics used to evaluate an association rule then no longer need to scan the database, but obtain their data solely by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. To make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute-index- and uniform-design-based multiobjective association rule mining with an evolutionary algorithm, abbreviated IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed.
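
    The core of the attribute-index strategy (one database scan builds an inverted index, after which rule metrics are computed from index lookups alone) can be sketched as follows; the toy transactions and helper names are illustrative, not the authors' implementation.

```python
# Minimal sketch of an attribute index: scan the transaction database once
# to record, for every attribute (item), the set of transaction ids that
# contain it; support counts then become set intersections, with no
# further database scans.
from functools import reduce

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
    {"milk"},
]

# Single pass: attribute -> set of transaction ids (an inverted index)
index = {}
for tid, items in enumerate(transactions):
    for item in items:
        index.setdefault(item, set()).add(tid)

def support(itemset):
    """Fraction of transactions containing every item, via index lookups."""
    tids = reduce(set.intersection, (index[i] for i in itemset))
    return len(tids) / len(transactions)

# Evaluate the rule {bread} -> {milk} without rescanning the database
antecedent, consequent = {"bread"}, {"milk"}
conf = support(antecedent | consequent) / support(antecedent)
print(f"support={support(antecedent | consequent):.2f}, confidence={conf:.2f}")
```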

  13. Attribute Index and Uniform Design Based Multiobjective Association Rule Mining with Evolutionary Algorithm

    PubMed Central

    Wang, Yuping; Feng, Junhong

    2013-01-01

    In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent of a rule, its consequent, and the whole rule. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute. All metrics used to evaluate an association rule then no longer need to scan the database, but obtain their data solely by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. To make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute-index- and uniform-design-based multiobjective association rule mining with an evolutionary algorithm, abbreviated IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed. PMID:23766683

  14. A decision-making tool to determine economic feasibility and break-even prices for artisan cheese operations.

    PubMed

    Durham, Catherine A; Bouma, Andrea; Meunier-Goddik, Lisbeth

    2015-12-01

    Artisan cheese makers lack access to valid economic data to help them evaluate business opportunities and make important business decisions such as determining cheese pricing structure. The objective of this study was to use an economic model to evaluate the net present value (NPV), internal rate of return, and payback period for artisan cheese production at different annual production volumes. The model was also used to determine the minimum retail price necessary to ensure a positive NPV for 5 different cheese types produced at 4 different production volumes. Milk type, cheese yield, and aging time all affected variable costs. However, aged cheeses required additional investment for aging space (which needs to be larger for longer aging times), as did lower-yield cheeses (by requiring larger-volume equipment for pasteurization and milk handling). As the volume of milk required increased, switching from vat pasteurization to high-temperature, short-time pasteurization became necessary for low-yield cheeses before it was required for high-yield cheeses, causing an additional increase in investment costs. Because of these differences, high-moisture, fresh cow milk cheeses can be sold for about half the price of hard, aged goat milk cheeses at the largest production volume, or for about two-thirds the price at the lowest production volume examined. For example, for the given model assumptions, at an annual production of 13,608 kg of cheese (30,000 lb), a fresh cow milk mozzarella should be sold at a minimum retail price of $27.29/kg ($12.38/lb), whereas a goat milk Gouda needs a minimum retail price of $49.54/kg ($22.47/lb). Artisan cheese makers should carefully evaluate annual production volumes. Although larger production volumes decrease average fixed cost and improve production efficiency, production can reach volumes where it becomes necessary to sell through distributors. Because distributors might pay as little as 35% of retail price, the retail price needs to be higher to compensate. An artisan cheese company that has not achieved the recognition needed to command a premium price may not find selling through distributors profitable. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
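
    The distributor arithmetic in that last point can be made concrete with a small sketch; the cost figure and the break-even function below are placeholders, not the study's cost model.

```python
# Hedged sketch: if a distributor pays only a fraction of retail (e.g. 35%),
# the minimum retail price must rise so the maker's revenue still covers
# annual costs. All figures are hypothetical.
def min_retail_price(annual_cost: float, annual_kg: float,
                     fraction_of_retail_received: float = 1.0) -> float:
    """Minimum retail price per kg so revenue covers annual cost."""
    return annual_cost / (annual_kg * fraction_of_retail_received)

annual_cost = 200_000.0   # hypothetical total annual cost, dollars
annual_kg = 13_608.0      # annual production, kg (30,000 lb)

direct = min_retail_price(annual_cost, annual_kg)              # direct sales
via_distributor = min_retail_price(annual_cost, annual_kg, 0.35)

print(f"direct sales:    ${direct:.2f}/kg")
print(f"via distributor: ${via_distributor:.2f}/kg")   # ~2.9x higher
```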

  15. Simulation of hydrodynamics, temperature, and dissolved oxygen in Beaver Lake, Arkansas, 1994-1995

    USGS Publications Warehouse

    Haggard, Brian; Green, W. Reed

    2002-01-01

    The tailwaters of Beaver Lake and other White River reservoirs support a cold-water trout fishery of significant economic yield in northwestern Arkansas. The Arkansas Game and Fish Commission has requested an increase in existing minimum flows through the Beaver Lake dam to increase the amount of fishable waters downstream. Information is needed to assess the impact of additional minimum flows on the temperature and dissolved-oxygen quality of the reservoir water above the dam and of the release water. A two-dimensional, laterally averaged hydrodynamic, thermal, and dissolved-oxygen model was developed and calibrated for Beaver Lake, Arkansas. The model simulates surface-water elevation, currents, heat transport, and dissolved-oxygen dynamics. The model was developed to assess the impacts of a proposed increase in minimum flows from 1.76 cubic meters per second (the existing minimum flow) to 3.85 cubic meters per second (the additional minimum flow). Simulations included assessing (1) the impact of additional minimum flows on tailwater temperature and dissolved-oxygen quality and (2) the impact of additional minimum flows after increasing the initial water-surface elevation by 0.5 meter. The additional minimum flow simulation (without increasing initial pool elevation) appeared to increase the water temperature (<0.9 degrees Celsius) and decrease the dissolved-oxygen concentration (<2.2 milligrams per liter) in the outflow discharge. Conversely, the additional minimum flow plus an initial increase in pool elevation (0.5 meter) appeared to decrease outflow water temperature (0.5 degrees Celsius) and increase dissolved-oxygen concentration (<1.2 milligrams per liter) through time. However, for both minimum-flow scenarios, the changes in water temperature and dissolved-oxygen concentration were within, or similar to, the error between measured and simulated water-column profile values.

  16. Property Appraisal Provides Control, Insurance Basis, and Value Estimate.

    ERIC Educational Resources Information Center

    Thomson, Jack

    A complete property appraisal serves as a basis for control, insurance, and value estimate. A professional appraisal firm should perform this function because (1) it is familiar with proper methods, (2) it can prepare the report with minimum confusion and interruption of the college operation, (3) use of its pricing library reduces time needed and…

  17. Antidepressant treatment of depression in rural nursing home residents.

    PubMed

    Kerber, Cindy Sullivan; Dyck, Mary J; Culp, Kennith R; Buckwalter, Kathleen

    2008-09-01

    Under-diagnosis and under-treatment of depression are major problems in nursing home residents. The purpose of this study was to determine antidepressant use among nursing home residents who were diagnosed with depression using three different methods: (1) the Geriatric Depression Scale, (2) Minimum Data Set, and (3) primary care provider assessments. As one would expect, the odds of being treated with an antidepressant were about eight times higher for those diagnosed as depressed by the primary care provider compared to the Geriatric Depression Scale or the Minimum Data Set. Men were less likely to be diagnosed and treated with antidepressants by their primary care provider than women. Depression detected by nurses through the Minimum Data Set was treated at a lower rate with antidepressants, which generates issues related to interprofessional communication, nursing staff communication, and the need for geropsychiatric role models in nursing homes.

  18. Support Minimized Inversion of Acoustic and Elastic Wave Scattering

    NASA Astrophysics Data System (ADS)

    Safaeinili, Ali

    Inversion of limited data is common in many areas of NDE, such as X-ray computed tomography (CT), ultrasonic and eddy-current flaw characterization, and imaging. In many applications it is common to bias the solution toward a minimum squared-L2 norm without any physical justification. When it is known a priori that the objects are compact, as with cracks and voids, choosing a "minimum support" functional instead of the minimum squared-L2 norm yields an image that is equally in agreement with the available data while being more consistent with what is most probably seen in the real world. We have utilized a minimum support functional to find a solution with the smallest volume. This inversion algorithm is most successful in reconstructing objects that are compact, like voids and cracks. To verify this idea, we first performed a variational nonlinear inversion of acoustic backscatter data using a minimum support objective function. A full nonlinear forward model was used to accurately study the effectiveness of the minimized-support inversion without error due to the linear (Born) approximation. After successful inversions using a full nonlinear forward model, a linearized acoustic inversion was developed to increase the speed and efficiency of the imaging process. The results indicate that by using a minimum support functional, we can accurately size and characterize voids and/or cracks that otherwise might be uncharacterizable. An extremely important feature of support-minimized inversion is its ability to compensate for unknown absolute phase (zero-of-time). Zero-of-time ambiguity is a serious problem in the inversion of pulse-echo data. The minimum support inversion was successfully used for the inversion of acoustic backscatter data due to compact scatterers without knowledge of the zero-of-time. The main drawback of this type of inversion is its computational cost. In order to make this type of constrained inversion available for common use, work needs to be performed in three areas: (1) exploitation of state-of-the-art parallel computation, (2) improvement of the theoretical formulation of the scattering process for better computational efficiency, and (3) development of better methods for guiding the nonlinear inversion. (Abstract shortened by UMI.)
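
    As a rough, hedged sketch of the idea (assuming a linear forward model, unlike the thesis's full nonlinear one), a minimum support stabilizer of the form sum(x^2/(x^2 + beta^2)) penalizes the volume of the support of the solution rather than its amplitude, which favors compact scatterers; the matrix, data, and tuning constants below are invented.

```python
# Toy minimum-support regularized inversion: minimize
#   ||A x - b||^2 + lam * sum(x^2 / (x^2 + beta^2))
# by plain gradient descent. Not the thesis's algorithm; an illustrative
# linear stand-in for the compact-object bias it describes.
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.normal(size=(30, n))                        # underdetermined model
x_true = np.zeros(n)
x_true[[10, 11, 12]] = 1.0                          # compact "crack"
b = A @ x_true

lam, beta, step = 0.5, 0.05, 1e-3
x = np.zeros(n)
for _ in range(5000):
    grad_data = 2 * A.T @ (A @ x - b)                       # data misfit
    grad_supp = 2 * beta**2 * x / (x**2 + beta**2) ** 2     # support term
    x -= step * (grad_data + lam * grad_supp)

print("recovered support:", np.where(np.abs(x) > 0.1)[0])
```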

  19. An Analysis of Minimum System Requirements to Support Computerized Adaptive Testing.

    DTIC Science & Technology

    1986-09-01

    Keywords: adaptive test (CAT); adaptive testing. This paper discusses the minimum system requirements needed to develop a computerized adaptive test (CAT). It lists some of the benefits of adaptive testing, establishes a set of…

  20. Associations Between Minimum Wage Policy and Access to Health Care: Evidence From the Behavioral Risk Factor Surveillance System, 1996–2007

    PubMed Central

    Zimmerman, Frederick J.; Ralston, James D.; Martin, Diane P.

    2011-01-01

    Objectives. We examined whether minimum wage policy is associated with access to medical care among low-skilled workers in the United States. Methods. We used multilevel logistic regression to analyze a data set consisting of individual-level indicators of uninsurance and unmet medical need from the Behavioral Risk Factor Surveillance System and state-level ecological controls from the US Census, Bureau of Labor Statistics, and several other sources in all 50 states and the District of Columbia between 1996 and 2007. Results. Higher state-level minimum wage rates were associated with significantly reduced odds of reporting unmet medical need after control for the ecological covariates, substate region fixed effects, and individual demographic and health characteristics (odds ratio = 0.853; 95% confidence interval = 0.750, 0.971). Minimum wage rates were not significantly associated with being uninsured. Conclusions. Higher minimum wages may be associated with a reduced likelihood of experiencing unmet medical need among low-skilled workers, and do not appear to be associated with uninsurance. These findings appear to refute the suggestion that minimum wage laws have detrimental effects on access to health care, as opponents of the policies have suggested. PMID:21164102

  1. Associations between minimum wage policy and access to health care: evidence from the Behavioral Risk Factor Surveillance System, 1996-2007.

    PubMed

    McCarrier, Kelly P; Zimmerman, Frederick J; Ralston, James D; Martin, Diane P

    2011-02-01

    We examined whether minimum wage policy is associated with access to medical care among low-skilled workers in the United States. We used multilevel logistic regression to analyze a data set consisting of individual-level indicators of uninsurance and unmet medical need from the Behavioral Risk Factor Surveillance System and state-level ecological controls from the US Census, Bureau of Labor Statistics, and several other sources in all 50 states and the District of Columbia between 1996 and 2007. Higher state-level minimum wage rates were associated with significantly reduced odds of reporting unmet medical need after control for the ecological covariates, substate region fixed effects, and individual demographic and health characteristics (odds ratio = 0.853; 95% confidence interval = 0.750, 0.971). Minimum wage rates were not significantly associated with being uninsured. Higher minimum wages may be associated with a reduced likelihood of experiencing unmet medical need among low-skilled workers, and do not appear to be associated with uninsurance. These findings appear to refute the suggestion that minimum wage laws have detrimental effects on access to health care, as opponents of the policies have suggested.

  2. APSIDAL MOTION AND A LIGHT CURVE SOLUTION FOR 13 LMC ECCENTRIC ECLIPSING BINARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zasche, P.; Wolf, M.; Vraštil, J.

    2015-12-15

    New CCD observations for 13 eccentric eclipsing binaries from the Large Magellanic Cloud were carried out using the Danish 1.54 m telescope located at the La Silla Observatory in Chile. These systems were observed for their times of minimum and 56 new minima were obtained. These are needed for accurate determination of the apsidal motion. Besides that, in total 436 times of minimum were derived from the photometric databases OGLE and MACHO. The O – C diagrams of minimum timings for these B-type binaries were analyzed and the parameters of the apsidal motion were computed. The light curves of these systems were fitted using the program PHOEBE, giving the light curve parameters. We derived for the first time relatively short periods of the apsidal motion ranging from 21 to 107 years. The system OGLE-LMC-ECL-07902 was also analyzed using the spectra and radial velocities, resulting in masses of 6.8 and 4.4 M⊙ for the eclipsing components. For one system (OGLE-LMC-ECL-20112), the third-body hypothesis was also used to describe the residuals after subtraction of the apsidal motion, resulting in a period of about 22 years. For several systems an additional third light was also detected, which makes these systems suspect for triplicity.

  3. First Precision Photometric Observations and Analyses of the Totally Eclipsing, Solar Type Binary V573 Pegasi

    NASA Astrophysics Data System (ADS)

    Samec, R. G.; Caton, D. B.; Faulkner, D. R.

    2018-06-01

    CCD VRcIc light curves of V573 Peg were taken on 26 and 27 September and 2, 4, and 6 October 2017, at the Dark Sky Observatory in North Carolina with the 0.81-m reflector of Appalachian State University. Five times of minimum light, two primary and three secondary eclipses, were calculated from our present observations. The following quadratic ephemeris was determined from all available times of minimum light: JD Hel Min I = 2456876.4958(2) d + 0.41744860(8) × E - 2.74(12) × 10^-10 × E^2, where the parentheses hold the ± error in the last two digits of the preceding value. A 14-year period study (covered by 24 times of minimum light) reveals a decreasing orbital period with high confidence, possibly due to magnetic braking. The mass ratio is found to be somewhat extreme, M2/M1 = 0.2629 ± 0.0006 (M1/M2 = 3.8). Its Roche lobe fill-out is ~25%. The solution had no need of spots. The component temperature difference is about 130 K, with the less massive component the hotter one, so it is a W-type W UMa binary. The inclination is 80.4 ± 0.1°. Our secondary eclipse shows a time of constant light with an eclipse duration of 24 minutes. More information is given in the following report.
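
    Evaluating such a quadratic ephemeris to predict future times of minimum light is a direct substitution; the sketch below copies the quoted coefficients, and the cycle numbers are arbitrary examples.

```python
# Predicted heliocentric JD of primary minimum at integer epoch (cycle) E,
# using the quadratic ephemeris quoted above. The negative quadratic term
# encodes the reported period decrease.
T0 = 2456876.4958      # JD (Hel), reference minimum
P = 0.41744860         # orbital period, days
Q = -2.74e-10          # quadratic coefficient, days/cycle^2

def predicted_minimum(E: int) -> float:
    """JD Hel Min I = T0 + P*E + Q*E^2."""
    return T0 + P * E + Q * E * E

for E in (0, 1000, 10000):
    print(E, f"{predicted_minimum(E):.4f}")
```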

  4. Minimum number of days required for a reliable estimate of daily step count and energy expenditure, in people with MS who walk unaided.

    PubMed

    Norris, Michelle; Anderson, Ross; Motl, Robert W; Hayes, Sara; Coote, Susan

    2017-03-01

    The purpose of this study was to examine the minimum number of days needed to reliably estimate daily step count and energy expenditure (EE) in people with multiple sclerosis (MS) who walked unaided. Seven days of activity monitor data were collected for 26 participants with MS (age = 44.5 ± 11.9 years; time since diagnosis = 6.5 ± 6.2 years; Patient Determined Disease Steps ≤ 3). Mean daily step count and mean daily EE (kcal) were calculated for all combinations of days (127 combinations) and compared to the respective 7-day mean daily step count or mean daily EE using intra-class correlations (ICC), Generalizability Theory and Bland-Altman analysis. For step count, ICC values of 0.94-0.98 and a G-coefficient of 0.81 indicate that a minimum of any random 2-day combination is required to reliably calculate mean daily step count. For EE, ICC values of 0.96-0.99 and a G-coefficient of 0.83 indicate that a minimum of any random 4-day combination is required to reliably calculate mean daily EE. For the Bland-Altman analyses, all combinations of days except single-day combinations resulted in a mean bias within ±10% when expressed as a percentage of the 7-day mean daily step count or mean daily EE. A minimum of 2 days for step count and 4 days for EE, regardless of day type, is needed to reliably estimate daily step count and daily EE in people with MS who walk unaided. Copyright © 2017 Elsevier B.V. All rights reserved.
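
    The combinations-of-days idea can be re-created in a few lines; the sketch below enumerates all 127 non-empty subsets of a week and reports the worst percentage bias per subset size, echoing the Bland-Altman comparison. The daily step counts are invented, and the ICC and Generalizability Theory analyses are not reproduced here.

```python
# For each subset size k (1..7), compare every k-day subset mean against
# the 7-day mean daily step count, as a percentage bias.
from itertools import combinations
from statistics import mean

week = [6200, 7100, 5800, 6900, 6400, 7300, 6000]   # hypothetical daily steps
ref = mean(week)

for k in range(1, 8):
    biases = [abs(mean(c) - ref) / ref * 100 for c in combinations(week, k)]
    print(f"{k}-day subsets: worst bias {max(biases):.1f}% of 7-day mean")
```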

  5. Establishing Minimum Flow Requirements Based on Benthic Vegetation: What are Some Issues Related to Identifying Quantity of Inflow and Tools Used to Quantify Ecosystem Response?

    NASA Astrophysics Data System (ADS)

    Hunt, M. J.; Nuttle, W. K.; Cosby, B. J.; Marshall, F. E.

    2005-05-01

    Establishing minimum flow requirements in aquatic ecosystems is one way to stipulate controls on water withdrawals in a watershed. The basis of the determination is to identify the amount of flow needed to sustain a threshold ecological function. To develop minimum flow criteria, an understanding of ecological response in relation to flow is essential. Several steps are needed, including: (1) identification of important resources and ecological functions, (2) compilation of available information, (3) determination of historical conditions, (4) establishment of technical relationships between inflow and resources, and (5) identification of numeric criteria that reflect the threshold at which resources are harmed. The process is interdisciplinary, requiring the integration of hydrologic and ecologic principles with quantitative assessments. The tools used to quantify the ecological response, and key questions related to how the quantity of flow influences the ecosystem, are examined by comparing minimum flow determinations in two different aquatic systems in South Florida. Each system is characterized by substantial hydrologic alteration. The first, the Caloosahatchee River, is a riverine system located on the southwest coast of Florida. The second, the Everglades-Florida Bay ecotone, is a wetland mangrove ecosystem located on the southern tip of the Florida peninsula. In both cases freshwater submerged aquatic vegetation (Vallisneria americana or Ruppia maritima), located in areas of the saltwater-freshwater interface, has been identified as a basis for minimum flow criteria. The integration of field studies, laboratory studies, and literature review was required. From this information we developed ecological modeling tools to quantify and predict plant growth in response to varying environmental variables. Coupled with hydrologic modeling tools, questions relating to the quantity and timing of flow, and the ecological consequences in relation to normal variability, are addressed.

  6. Need for Tolerances and Tolerance Exemptions for Minimum Risk Pesticides

    EPA Pesticide Factsheets

    The ingredients used in minimum risk products used on food, food crops, food contact surfaces, or animal feed commodities generally have a tolerance or tolerance exemption. Learn about tolerances and tolerance exemptions for minimum risk ingredients.

  7. [Minimum Standards for the Spatial Accessibility of Primary Care: A Systematic Review].

    PubMed

    Voigtländer, S; Deiters, T

    2015-12-01

    Regional disparities in access to primary care are substantial in Germany, especially in terms of spatial accessibility. However, there is no legally or generally binding minimum standard for the spatial accessibility effort that is still acceptable. Our objective is to analyse existing minimum standards, the methods used, as well as their empirical basis. A systematic literature review was undertaken of publications regarding minimum standards for the spatial accessibility of primary care, based on a title word and keyword search using PubMed, SSCI/Web of Science, EMBASE and Cochrane Library. 8 minimum standards from the USA, Germany and Austria could be identified. All of them specify the acceptable spatial accessibility effort in terms of travel time; almost half also include distance(s). The maximum acceptable travel time is 30 min, and it tends to be lower in urban areas. Primary care is, according to the identified minimum standards, part of the local area (Nahbereich) of so-called central places (Zentrale Orte) providing basic goods and services. The consideration of means of transport, e.g. public transport, is heterogeneous. The standards are based on empirical studies, consultation with service providers, practical experiences, and regional planning/central place theory, as well as on legal or political regulations. The identified minimum standards provide important insights into the effort that is still acceptable regarding spatial accessibility, i.e. travel time, distance and means of transport. It seems reasonable to complement the current planning system for outpatient care, which is based on provider-to-population ratios, with a gravity-model method to identify places as well as populations with insufficient spatial accessibility. Due to the lack of a common minimum standard we propose - subject to further discussion - to begin with a threshold based on the spatial accessibility limit of the local area, i.e. 30 min to the next primary care provider for at least 90% of the regional population. Exceeding this threshold would necessitate discussion of a health care deficit and, in line with this, a potential need for intervention, e.g. in terms of alternative forms of health care provision. © Georg Thieme Verlag KG Stuttgart · New York.
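
    The proposed 30-minutes-for-90%-of-the-population threshold reduces to a simple coverage check; the sketch below uses invented travel times and populations, and it deliberately sidesteps the authors' more elaborate gravity-model method.

```python
# Check whether at least 90% of the regional population lives within the
# 30-minute travel-time limit of its nearest primary care provider.
def share_within(travel_minutes, population, limit=30.0):
    covered = sum(p for t, p in zip(travel_minutes, population) if t <= limit)
    return covered / sum(population)

travel = [5, 12, 28, 33, 45, 8]            # minutes to nearest provider
pop = [12000, 8000, 3000, 1500, 400, 9000]  # population at each point

share = share_within(travel, pop)
print(f"{share:.1%} of population within 30 min -> "
      f"{'meets' if share >= 0.90 else 'fails'} the proposed standard")
```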

  8. ILP-based co-optimization of cut mask layout, dummy fill, and timing for sub-14nm BEOL technology

    NASA Astrophysics Data System (ADS)

    Han, Kwangsoo; Kahng, Andrew B.; Lee, Hyein; Wang, Lutong

    2015-10-01

    Self-aligned multiple patterning (SAMP), due to its low overlay error, has emerged as the leading option for 1D gridded back-end-of-line (BEOL) in sub-14nm nodes. To form actual routing patterns from a uniform "sea of wires", a cut mask is needed for line-end cutting or realization of space between routing segments. Constraints on cut shapes and minimum cut spacing result in end-of-line (EOL) extensions and non-functional (i.e. dummy fill) patterns; the resulting capacitance and timing changes must be consistent with signoff performance analyses and their impacts should be minimized. In this work, we address the co-optimization of cut mask layout, dummy fill, and design timing for sub-14nm BEOL design. Our central contribution is an optimizer based on integer linear programming (ILP) to minimize the timing impact due to EOL extensions, considering (i) minimum cut spacing arising in sub-14nm nodes; (ii) cut assignment to different cut masks (color assignment); and (iii) the eligibility to merge two unit-size cuts into a bigger cut. We also propose a heuristic approach to remove dummy fills after the ILP-based optimization by extending the usage of cut masks. Our heuristic can improve critical path performance under minimum metal density and mask density constraints. In our experiments, we study the impact of number of cut masks, minimum cut spacing and metal density under various constraints. Our studies of optimized cut mask solutions in these varying contexts give new insight into the tradeoff of performance and cost that is afforded by cut mask patterning technology options.
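
    A much-simplified stand-in for the ILP can illustrate the core constraint structure: cuts within the minimum spacing may not share a cut mask, and unavoidable conflicts incur a penalty standing in for the EOL-extension timing cost. The 1D geometry, costs, and two-mask setup below are invented, and the real formulation also handles cut merging and dummy fill; the sketch uses the PuLP modeling library (pip install pulp).

```python
# Simplified cut-mask assignment ILP: binary mask assignment per cut,
# minimum same-mask spacing constraints, and a relaxation variable per
# conflicting pair whose cost stands in for an EOL-extension penalty.
import itertools
import pulp

cut_x = [0.0, 20.0, 35.0, 48.0, 90.0]    # 1D cut positions, nm
S_MIN = 30.0                             # minimum same-mask cut spacing, nm
EXT_COST = [3.0, 1.0, 2.0, 1.0, 4.0]     # per-cut penalty weight
MASKS = range(2)

conflicts = [(i, j) for i, j in itertools.combinations(range(len(cut_x)), 2)
             if abs(cut_x[i] - cut_x[j]) < S_MIN]

prob = pulp.LpProblem("cut_mask_assignment", pulp.LpMinimize)
assign = pulp.LpVariable.dicts("assign", (range(len(cut_x)), MASKS), cat="Binary")
viol = pulp.LpVariable.dicts("viol", range(len(conflicts)), cat="Binary")

for c in range(len(cut_x)):               # every cut goes on exactly one mask
    prob += pulp.lpSum(assign[c][m] for m in MASKS) == 1
for k, (i, j) in enumerate(conflicts):    # same mask + too close -> violation
    for m in MASKS:
        prob += assign[i][m] + assign[j][m] <= 1 + viol[k]

prob += pulp.lpSum(viol[k] * (EXT_COST[i] + EXT_COST[j])
                   for k, (i, j) in enumerate(conflicts))
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([[m for m in MASKS if assign[c][m].value() == 1][0]
       for c in range(len(cut_x))])
```

    On this toy instance the cuts at 20, 35, and 48 nm form a mutual-conflict triangle, so no two-mask assignment is violation-free and the solver pays the cheapest conflicting pair.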

  9. Minimum time and fuel flight profiles for an F-15 airplane with a Highly Integrated Digital Electronic Control (HIDEC) system

    NASA Technical Reports Server (NTRS)

    Haering, E. A., Jr.; Burcham, F. W., Jr.

    1984-01-01

    A simulation study was conducted to optimize minimum-time and minimum-fuel paths for an F-15 airplane powered by two F100 Engine Model Derivative (EMD) engines. The benefits of using variable stall margin (uptrim) to increase performance were also determined. This study supports the NASA Highly Integrated Digital Electronic Control (HIDEC) program. The basis for this comparison was the minimum time and fuel used to reach Mach 2 at 13,716 m (45,000 ft) from the initial conditions of Mach 0.15 at 1524 m (5000 ft). Results were also compared to a pilot's estimated minimum-time and minimum-fuel trajectory determined from the F-15 flight manual and previous experience. The minimum-time trajectory took 15 percent less time than the pilot's estimate for the standard EMD engines, while the minimum-fuel trajectory used 1 percent less fuel than the pilot's estimate for the minimum-fuel trajectory. The F-15 airplane, with EMD engines and uptrim, was 23 percent faster than the pilot's estimate. The minimum fuel used was 5 percent less than the estimate.

  10. Satellite scheduling considering maximum observation coverage time and minimum orbital transfer fuel cost

    NASA Astrophysics Data System (ADS)

    Zhu, Kai-Jian; Li, Jun-Feng; Baoyin, He-Xi

    2010-01-01

    In case of an emergency like the Wenchuan earthquake, it is impossible to observe a given target on Earth by immediately launching new satellites. There is an urgent need for efficient satellite scheduling within a limited time period, so we must find a way to reasonably utilize the existing satellites to rapidly image the affected area during a short time period. Generally, the main consideration in orbit design is satellite coverage, with the subsatellite nadir point as the standard of reference. Two factors must be considered simultaneously in orbit design: the maximum observation coverage time and the minimum orbital transfer fuel cost. The local time of visiting the given observation sites must satisfy the solar-radiation requirement. Because the operational orbit for observing the disaster area with impulse maneuvers is considered in this paper, when calculating the operational orbit elements as the optimal parameters to be evaluated, we obtain the minimum of the objective function by comparing the results derived from primer vector theory with those derived from the Hohmann transfer. Primer vector theory is utilized to optimize the transfer trajectory with three impulses, and the Hohmann transfer is utilized for the coplanar and small-inclination non-coplanar cases. Finally, we applied this method in a simulation of the rescue mission at Wenchuan city. The results of optimizing the orbit design with a hybrid PSO and DE algorithm show that primer vector and Hohmann transfer theory are effective methods for multi-object orbit optimization.
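
    The Hohmann transfer used as the coplanar fuel benchmark has a standard closed form; the sketch below applies the two-impulse delta-v formulas to an arbitrary example pair of circular orbits (the radii are illustrative, not the paper's case study).

```python
# Total delta-v for a circular-to-circular coplanar Hohmann transfer.
from math import sqrt

MU_EARTH = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2

def hohmann_delta_v(r1: float, r2: float) -> float:
    """Total delta-v (m/s) between circular orbits of radii r1 and r2 (m)."""
    dv1 = sqrt(MU_EARTH / r1) * (sqrt(2 * r2 / (r1 + r2)) - 1)
    dv2 = sqrt(MU_EARTH / r2) * (1 - sqrt(2 * r1 / (r1 + r2)))
    return abs(dv1) + abs(dv2)

r_low = 6378e3 + 500e3             # 500 km circular orbit
r_high = 6378e3 + 800e3            # 800 km circular orbit
print(f"delta-v: {hohmann_delta_v(r_low, r_high):.1f} m/s")
```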

  11. Updates to research on recommended minimum levels for pavement marking retroreflectivity to meet driver night visibility needs

    DOT National Transportation Integrated Search

    2007-10-01

    This study was aimed at completing the research to develop and scrutinize minimum levels for pavement marking retroreflectivity to meet nighttime driving needs. A previous study carried out in the 1990s was based on the CARVE model developed at Ohio ...

  12. A Dichotomous Key for the Identification of Common British Wild Flower Families

    ERIC Educational Resources Information Center

    Wood, Piers

    2004-01-01

    This article argues the need for, and provides, a dichotomous single access key for the identification of common British wild flower families. A minimum of technical vocabulary is used while at the same time retaining most of the recent botanical names of families. The key provides a user-friendly opportunity for school pupils to become familiar…
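
    For readers unfamiliar with the structure, a dichotomous single-access key is essentially a binary decision tree walked one couplet at a time; the toy sketch below uses simplified placeholder couplets and family names, not the key published in the article.

```python
# Each node is (question, yes_branch, no_branch); a leaf is a family name.
key = ("Petals joined into a tube?",
       ("Flowers in tight heads?", "Asteraceae", "Lamiaceae"),
       ("Flowers pea-like?", "Fabaceae", "Ranunculaceae"))

def identify(node):
    """Walk the key interactively until a family name is reached."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        answer = input(question + " [y/n] ").strip().lower()
        node = yes_branch if answer == "y" else no_branch
    return node

# print("Family:", identify(key))   # uncomment for interactive use
```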

  13. Temperature fine-tunes Mediterranean Arabidopsis thaliana life-cycle phenology geographically.

    PubMed

    Marcer, A; Vidigal, D S; James, P M A; Fortin, M-J; Méndez-Vigo, B; Hilhorst, H W M; Bentsink, L; Alonso-Blanco, C; Picó, F X

    2018-01-01

    To understand how adaptive evolution in life-cycle phenology operates in plants, we need to unravel the effects of geographic variation in putative agents of natural selection on life-cycle phenology by considering all key developmental transitions and their co-variation patterns. We address this goal by quantifying the temperature-driven and geographically varying relationship between seed dormancy and flowering time in the annual Arabidopsis thaliana across the Iberian Peninsula. We used data on genetic variation in two major life-cycle traits, seed dormancy (DSDS50) and flowering time (FT), in a collection of 300 A. thaliana accessions from the Iberian Peninsula. The geographically varying relationship between life-cycle traits and minimum temperature, a major driver of variation in DSDS50 and FT, was explored with geographically weighted regressions (GWR). The environmentally varying correlation between DSDS50 and FT was analysed by means of sliding window analysis across a minimum temperature gradient. Maximum local adjustments between minimum temperature and life-cycle traits were obtained in the southwest Iberian Peninsula, an area with the highest minimum temperatures. In contrast, in off-southwest locations, the effects of minimum temperature on DSDS50 were rather constant across the region, whereas those of minimum temperature on FT were more variable, with peaks of strong local adjustments of GWR models in central and northwest Spain. Sliding window analysis identified a minimum temperature turning point in the relationship between DSDS50 and FT around a minimum temperature of 7.2 °C. Above this minimum temperature turning point, the variation in the FT/DSDS50 ratio became rapidly constrained and the negative correlation between FT and DSDS50 did not increase any further with increasing minimum temperatures. The southwest Iberian Peninsula emerges as an area where variation in life-cycle phenology appears to be restricted by the duration and severity of the hot summer drought. The temperature-driven varying relationship between DSDS50 and FT detected environmental boundaries for the co-evolution between FT and DSDS50 in A. thaliana. In the context of global warming, we conclude that A. thaliana phenology from the southwest Iberian Peninsula, determined by early flowering and deep seed dormancy, might become the most common life-cycle phenotype for this annual plant in the region. © 2017 German Botanical Society and The Royal Botanical Society of the Netherlands.
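
    The sliding-window correlation analysis can be illustrated with synthetic data: sort accessions by minimum temperature, then compute the FT-DSDS50 correlation within a moving window to see where the relationship changes. The data-generating assumptions and window size below are arbitrary, not the study's values.

```python
# Moving-window Pearson correlation between flowering time (FT) and seed
# dormancy (DSDS50) along a minimum-temperature gradient.
import numpy as np

rng = np.random.default_rng(1)
n = 300
tmin = np.sort(rng.uniform(-2, 12, n))            # minimum temperature, C
dsds50 = 40 - 2.0 * tmin + rng.normal(0, 5, n)    # seed dormancy proxy
ft = 60 - 1.5 * dsds50 + rng.normal(0, 8, n)      # flowering time proxy

window = 60
for start in range(0, n - window + 1, 40):
    sl = slice(start, start + window)
    r = np.corrcoef(ft[sl], dsds50[sl])[0, 1]
    print(f"Tmin {tmin[sl].mean():5.1f} C: r(FT, DSDS50) = {r:+.2f}")
```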

  14. Just-in-Time Training for High-Risk Low-Volume Therapies: An Approach to Ensure Patient Safety.

    PubMed

    Helman, Stephanie; Lisanti, Amy Jo; Adams, Ann; Field, Cynthia; Davis, Katherine Finn

    2016-01-01

    High-risk low-volume therapies are those therapies that are practiced infrequently and yet carry an increased risk to patients because of their complexity. Staff nurses are required to competently manage these therapies to treat patients' unique needs and optimize outcomes; however, maintaining competence is challenging. This article describes implementation of Just-in-Time Training, which requires validation of minimum competency of bedside nurses managing high-risk low-volume therapies through direct observation of a return-demonstration competency checklist.

  15. Scientist/AMPS equipment interface study

    NASA Technical Reports Server (NTRS)

    Anderson, H. R.

    1977-01-01

    The principal objective was to determine, for each experiment, how the operating procedures and modes of equipment onboard the shuttle can be managed in real-time or near-real-time to enhance the quality of results. As part of this determination, the data and display devices that a man will need for real-time management are defined. The secondary objectives, as listed in the RFQ and technical proposal, were to: (1) determine what quantities are to be measured, (2) determine permissible background levels, (3) decide in what portions of space measurements are to be made, (4) estimate bit rates, (5) establish time-lines for operating the experiments on a mission or set of missions, and (6) determine the minimum set of hardware needed for real-time display. Experiment descriptions and requirements were written. The requirements of the various experiments are combined and a minimal set of joint requirements are defined.

  16. What could a strengthened right to health bring to the post-2015 health development agenda?: interrogating the role of the minimum core concept in advancing essential global health needs.

    PubMed

    Forman, Lisa; Ooms, Gorik; Chapman, Audrey; Friedman, Eric; Waris, Attiya; Lamprea, Everaldo; Mulumba, Moses

    2013-12-01

    Global health institutions increasingly recognize that the right to health should guide the formulation of replacement goals for the Millennium Development Goals, which expire in 2015. However, the right to health's contribution is undercut by the principle of progressive realization, which links provision of health services to available resources, permitting states to deny even basic levels of health coverage domestically and allowing international assistance for health to remain entirely discretionary. To prevent progressive realization from undermining both domestic and international responsibilities towards health, international human rights law institutions developed the idea of non-derogable "minimum core" obligations to provide essential health services. While minimum core obligations have enjoyed some uptake in human rights practice and scholarship, their definition in international law fails to specify which health services should fall within their scope, or to specify wealthy country obligations to assist poorer countries. These definitional gaps undercut the capacity of minimum core obligations to protect essential health needs against inaction, austerity and illegitimate trade-offs in both domestic and global action. If the right to health is to effectively advance essential global health needs in these contexts, weaknesses within the minimum core concept must be resolved through innovative research on social, political and legal conceptualizations of essential health needs. We believe that if the minimum core concept is strengthened in these ways, it will produce a more feasible and grounded conception of legally prioritized health needs that could assist in advancing health equity, including by providing a framework rooted in legal obligations to guide the formulation of new health development goals, providing a baseline of essential health services to be protected as a matter of right against governmental claims of scarcity and inadequate international assistance, and empowering civil society to claim fulfillment of their essential health needs from domestic and global decision-makers.

  17. What could a strengthened right to health bring to the post-2015 health development agenda?: interrogating the role of the minimum core concept in advancing essential global health needs

    PubMed Central

    2013-01-01

    Background Global health institutions increasingly recognize that the right to health should guide the formulation of replacement goals for the Millennium Development Goals, which expire in 2015. However, the right to health’s contribution is undercut by the principle of progressive realization, which links provision of health services to available resources, permitting states to deny even basic levels of health coverage domestically and allowing international assistance for health to remain entirely discretionary. Discussion To prevent progressive realization from undermining both domestic and international responsibilities towards health, international human rights law institutions developed the idea of non-derogable “minimum core” obligations to provide essential health services. While minimum core obligations have enjoyed some uptake in human rights practice and scholarship, their definition in international law fails to specify which health services should fall within their scope, or to specify wealthy country obligations to assist poorer countries. These definitional gaps undercut the capacity of minimum core obligations to protect essential health needs against inaction, austerity and illegitimate trade-offs in both domestic and global action. If the right to health is to effectively advance essential global health needs in these contexts, weaknesses within the minimum core concept must be resolved through innovative research on social, political and legal conceptualizations of essential health needs. Summary We believe that if the minimum core concept is strengthened in these ways, it will produce a more feasible and grounded conception of legally prioritized health needs that could assist in advancing health equity, including by providing a framework rooted in legal obligations to guide the formulation of new health development goals, providing a baseline of essential health services to be protected as a matter of right against governmental claims of scarcity and inadequate international assistance, and empowering civil society to claim fulfillment of their essential health needs from domestic and global decision-makers. PMID:24289096

  18. Learning about the Human Body. Superific Science Book IV. A Good Apple Science Activity Book for Grades 5-8+.

    ERIC Educational Resources Information Center

    Conway, Lorraine

    Designed to supplement a basic life science or biology program, this document provides teachers with experiential learning activities dealing with the human body. The learning activities vary in the length of time needed for their completion, and require a minimum of equipment and materials. The activities focus on: (1) the human skeleton; (2)…

  19. Airplane Mesh Development with Grid Density Studies

    NASA Technical Reports Server (NTRS)

    Cliff, Susan E.; Baker, Timothy J.; Thomas, Scott D.; Lawrence, Scott L.; Rimlinger, Mark J.

    1999-01-01

    Automatic grid generation wish list: geometry handling, including CAD clean-up and mesh generation, remains a major bottleneck in the application of CFD methods. There is a pressing need for greater automation in several aspects of geometry preparation in order to reduce set-up time and eliminate user intervention as much as possible. Starting from the CAD representation of a configuration, there may be holes or overlapping surfaces which require an intensive effort to establish cleanly abutting surface patches, and collections of many patches may need to be combined for more efficient use of the geometrical representation. Obtaining an accurate and suitable body-conforming grid, with an adequate distribution of points throughout the flow field for the flow conditions of interest, is often the most time-consuming task for complex CFD applications. There is a need for a clean, unambiguous definition of the CAD geometry. Ideally this would be carried out automatically by smart CAD clean-up software. One could also define a standard piece-wise smooth surface representation suitable for use by computational methods and then create software to translate between the various CAD descriptions and the standard representation. Surface meshing remains a time-consuming, user-intensive procedure. There is a need for automated surface meshing, requiring only minimal user intervention to define the overall density of mesh points. The surface mesher should produce well-shaped elements (triangles or quadrilaterals) whose size is determined initially according to the surface curvature, with a minimum size for flat pieces, and later refined by the user in other regions if necessary. Present techniques for volume meshing all require some degree of user intervention. There is a need for fully automated and reliable volume mesh generation. In addition, it should be possible to create both surface and volume meshes that meet guaranteed measures of mesh quality (e.g. minimum and maximum angle, stretching ratios, etc.).
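
    The angle-based quality guarantees mentioned at the end can be checked per element with the law of cosines; the sketch below computes the interior angles of one surface triangle so that slivers violating an angle bound can be flagged for refinement (the bound and the sample triangle are arbitrary).

```python
# Minimum and maximum interior angles of a triangle in 2D or 3D.
from math import acos, degrees, dist

def triangle_angles(p, q, r):
    """Interior angles (degrees) of triangle pqr via the law of cosines."""
    a, b, c = dist(q, r), dist(p, r), dist(p, q)   # side lengths
    A = degrees(acos((b * b + c * c - a * a) / (2 * b * c)))
    B = degrees(acos((a * a + c * c - b * b) / (2 * a * c)))
    return A, B, 180.0 - A - B

angles = triangle_angles((0, 0, 0), (1, 0, 0), (0.9, 0.1, 0))
print(f"min angle {min(angles):.1f} deg, max angle {max(angles):.1f} deg")
```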

  20. Characteristics of Pitch Angle Distributions of 100s keV Electrons in the Slot Region and Inner Radiation Belt

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Li, X.; Blake, J. B.; Fennell, J.; Claudepierre, S. G.; Baker, D. N.; Jaynes, A. N.; Malaspina, D.

    2014-12-01

    The pitch angle distribution (PAD) of energetic electrons in the slot region and inner radiation belt received little attention in past decades due to the lack of quality measurements. Using the state-of-the-art pitch-angle-resolved data from the Magnetic Electron Ion Spectrometer (MagEIS) instrument onboard the Van Allen Probes, a detailed analysis of 100s keV electron PADs below L = 4 is performed, in which the PADs are categorized into three types: normal (flux peaking at 90°), cap (strongly peaked in a narrow range around 90°) and 90°-minimum (lower flux at 90°) PADs. By examining the characteristics of the PADs of 460 keV electrons for over a year, we find that the 90°-minimum PADs are generally present in the inner belt (L<2), while normal PADs dominate at L~3.5-4. In the region between, 90°-minimum PADs dominate during injection times and normal PADs dominate during quiet times. Cap PADs appear mostly in the decay phase of storms in the slot region and are likely caused by the pitch angle scattering of hiss waves. Fitting the normal PADs to the sin^n α form, the parameter n is much higher below L=3 than in the outer belt, and is relatively constant in the inner belt but changes significantly in the slot region (2
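
    The sin^n α fit mentioned above can be done by linearizing log(flux) = log(A) + n·log(sin α) and solving by least squares; the pitch angles and fluxes below are synthetic placeholders, not MagEIS data.

```python
# Fit flux = A * sin(alpha)**n to a sample pitch angle distribution.
import numpy as np

alpha_deg = np.array([30, 45, 60, 75, 90])
flux = np.array([120.0, 310.0, 560.0, 800.0, 900.0])   # synthetic fluxes

s = np.log(np.sin(np.radians(alpha_deg)))
n, logA = np.polyfit(s, np.log(flux), 1)
print(f"fitted n = {n:.2f}, A = {np.exp(logA):.0f}")
```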

  1. Research protocols in National Park Service wilderness

    Treesearch

    Jim Walters

    2000-01-01

    While the National Park Service encourages the use of its wilderness resource for research, management policies require that all research apply “minimum requirement” protocols to determine: 1) if the research is needed to support the purposes of wilderness and, 2) if it is appropriate, determine the minimum tool needed to accomplish the work.

  2. The dynamics and control of large flexible space structures - 12, supplement 11

    NASA Technical Reports Server (NTRS)

    Bainum, Peter M.; Reddy, A. S. S. R.; Li, Feiyue; Xu, Jianke

    1989-01-01

    The rapid 2-D slewing and vibrational control of the unsymmetrical flexible SCOLE (Spacecraft Control Laboratory Experiment) with multi-bounded controls is considered. Pontryagin's Maximum Principle is applied to the nonlinear equations of the system to derive the necessary conditions for the optimal control. The resulting two point boundary value problem is then solved by using the quasilinearization technique, and the near minimum time is obtained by sequentially shortening the slewing time until the controls are near the bang-bang type. The tradeoff between the minimum time and the minimum flexible amplitude requirements is discussed. The numerical results show that the responses of the nonlinear system are significantly different from those of the linearized system for rapid slewing. The SCOLE station-keeping closed loop dynamics are re-examined by employing a slightly different method for developing the equations of motion in which higher order terms in the expressions for the mast modal shape functions are now included. A preliminary study on the effect of actuator mass on the closed loop dynamics of large space systems is conducted. A numerical example based on a coupled two-mass two-spring system illustrates the effect of changes caused in the mass and stiffness matrices on the closed loop system eigenvalues. In certain cases the need for redesigning control laws previously synthesized, but not accounting for actuator masses, is indicated.
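
    The bang-bang character of the near-minimum-time controls has a closed form in the rigid-body special case (ignoring flexibility, unlike the SCOLE analysis above): for a rest-to-rest slew of a double integrator with torque bound |u| ≤ u_max, the optimal control switches sign once at T/2 and T = 2·sqrt(θ·I/u_max). The numbers below are arbitrary.

```python
# Closed-form minimum-time rest-to-rest slew for a rigid double integrator.
from math import sqrt, radians

I = 500.0             # moment of inertia, kg m^2
u_max = 10.0          # torque bound, N m
theta = radians(90)   # slew angle, rad

T = 2 * sqrt(theta * I / u_max)
print(f"minimum slew time {T:.2f} s, switch at {T/2:.2f} s "
      f"(u: +u_max then -u_max)")
```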

  3. Meeting the oral health needs of 12-year-olds in China: human resources for oral health.

    PubMed

    Sun, Xiangyu; Bernabé, Eduardo; Liu, Xuenan; Zheng, Shuguo; Gallagher, Jennifer E

    2017-06-20

    An appropriate level of human resources for oral health [HROH] is required to meet the oral health needs of a population and enable maximum improvement in health outcomes. The aim of this study was to estimate the HROH required to meet the oral health needs of the World Health Organization [WHO] reference group of 12-year-olds in China and consider the implications for education, practice, policy and HROH nationally. We estimated the HROH needed to meet the needs of 12-year-olds based on secondary analysis of the epidemiological and questionnaire data from the 3rd Chinese National Oral Health Survey, including caries experience and periodontal factors (calculus), dentally-related behaviour (frequency of toothbrushing and sugar intake), and social factors (parental education). Children's risk for dental caries was classified in four levels from low (level 1) to high (level 4). We built maximum and minimum intervention models of dental care for each risk level, informed by contemporary evidence-based practice. The needs-led HROH model used in the present study incorporated need for treatment and risk-based prevention, using timings verified by experts in China. These findings were used to estimate HROH for the survey sample, extrapolated to 12-year-olds nationally and to the total population, taking account of urban and rural coverage, based on different levels of clinical commitment (60-90%). We found that between 40,139 and 51,906 dental professionals were required to deliver care for 12-year-olds nationally, based on 80% clinical commitment. We demonstrated that the majority of the need for HROH was in the rural population (72.5%). Over 93% of HROH time was dedicated to prevention within the model. Extrapolating the results to the total population, the estimate for HROH nationally was 3.16-4.09 million to achieve national coverage; however, the current HROH are only able to serve an estimated 5% of the population with minimum intervention, based on a HROH spending 90% of their time providing clinical care. The findings highlight the gap between dental workforce needs and workforce capacity in China. Significant implications for health policy and human resources for oral health in this country with a developing health system are discussed, including the need for public health action.
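
    The shape of such a needs-led workforce estimate is simple arithmetic: minutes of risk-based care per child per year, scaled to the population and divided by each professional's clinical minutes. Every number in the sketch below is a placeholder, not the study's verified timings or population counts.

```python
# Back-of-envelope needs-led workforce estimate.
minutes_per_child = {1: 30, 2: 60, 3: 120, 4: 180}   # risk level -> min/year
children_by_level = {1: 6.0e6, 2: 4.0e6, 3: 3.0e6, 4: 2.0e6}

clinical_commitment = 0.80          # share of work time spent on clinical care
annual_work_minutes = 1760 * 60     # working hours per year * 60

need_minutes = sum(minutes_per_child[k] * children_by_level[k]
                   for k in minutes_per_child)
workforce = need_minutes / (annual_work_minutes * clinical_commitment)
print(f"estimated professionals required: {workforce:,.0f}")
```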

  4. Yet one more dwell time algorithm

    NASA Astrophysics Data System (ADS)

    Haberl, Alexander; Rascher, Rolf

    2017-06-01

    The current demand for ever more powerful and efficient microprocessors, e.g. for deep learning, has led to an ongoing trend of reducing the feature size of integrated circuits. These processors are patterned with EUV lithography, which enables 7 nm chips [1]. Producing mirrors which satisfy the needed requirements is a challenging task. Not only increasing requirements on the imaging properties, but also new lens shapes, such as aspheres or lenses with free-form surfaces, require innovative production processes. These lenses need the new deterministic sub-aperture polishing methods that have been established in the past few years. Such polishing methods are characterized by an empirically determined TIF (tool influence function) and local stock removal. One such deterministic polishing method is ion-beam figuring (IBF). The beam profile of an ion beam is adjusted to a nearly ideal Gaussian shape by various parameters. With the known removal function, a dwell time profile can be generated for each measured error profile. Such a profile is always generated pixel-accurately against the predetermined error profile, with the aim of minimizing the existing surface structures up to the cut-off frequency of the tool used [2]. The processing success of a correction-polishing run depends decisively on the accuracy of the previously computed dwell-time profile, so the algorithm used to calculate the dwell time has to reflect reality accurately. Furthermore, the machine operator should have no influence on the dwell-time calculation; consequently, there must not be any parameters that influence the calculation result. Lastly, it should take a minimum of machining time to reach a minimum of remaining error structures. Unfortunately, current dwell-time calculations are divergent, user-dependent, tend to produce long processing times, and need several parameters to be set. This paper describes a realistic, convergent and user-independent dwell-time algorithm. The typical processing times are reduced to about 80% and in some cases 50% of those of conventional algorithms (Lucy-Richardson, Van Cittert, …) as used in established machines. To verify its effectiveness, a plane surface was machined with IBF.
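
    To make the dwell-time problem concrete, the sketch below runs the classic Van Cittert iteration (one of the conventional algorithms named above, not the paper's new method): the dwell map is repeatedly corrected by the difference between the target error profile and the removal predicted by convolving dwell with the TIF. The 1D profiles, Gaussian TIF, and relaxation factor are invented.

```python
# Toy Van Cittert dwell-time iteration: d <- d + relax * (error - tif * d),
# with (*) denoting convolution, clipped so dwell times stay non-negative.
import numpy as np

x = np.linspace(-1, 1, 201)
error = np.exp(-x**2 / 0.1) * 50e-9      # target removal depth, m
tif = np.exp(-x**2 / 0.005)              # Gaussian tool influence function
tif /= tif.sum()                         # normalize removal per unit dwell

dwell = np.zeros_like(error)
relax = 0.8
for _ in range(200):
    removal = np.convolve(dwell, tif, mode="same")
    dwell += relax * (error - removal)
    dwell = np.clip(dwell, 0, None)      # dwell cannot be negative

residual = error - np.convolve(dwell, tif, mode="same")
print(f"peak residual: {residual.max():.2e} m")
```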

  5. Can households earning minimum wage in Nova Scotia afford a nutritious diet?

    PubMed

    Williams, Patricia L; Johnson, Christine P; Kratzmann, Meredith L V; Johnson, C Shanthi Jacob; Anderson, Barbara J; Chenhall, Cathy

    2006-01-01

    To assess the affordability of a nutritious diet for households earning minimum wage in Nova Scotia. Food costing data were collected in 43 randomly selected grocery stores throughout NS in 2002 using the National Nutritious Food Basket (NNFB). To estimate the affordability of a nutritious diet for households earning minimum wage, average monthly costs for essential expenses were subtracted from overall income to see if enough money remained for the cost of the NNFB. This was calculated for three types of household: 1) two parents and two children; 2) a lone parent and two children; and 3) a single male. Calculations were also made for the proposed 2006 minimum wage increase, with expenses adjusted using the Consumer Price Index (CPI). The monthly cost of the NNFB priced in 2002 for the three types of household was $572.90, $351.68, and $198.73, respectively. Put into the context of basic living, these data showed that Nova Scotians relying on minimum wage could not afford to purchase a nutritious diet and meet their basic needs, placing their health at risk. These basic expenses do not include other routine costs, such as personal hygiene products, household and laundry cleaners, and prescriptions, or costs associated with physical activity, education or savings for unexpected expenses. People working at minimum wage in Nova Scotia have not had adequate income to meet basic needs, including a nutritious diet. The 2006 increase in minimum wage to $7.15/hr is inadequate to ensure that Nova Scotians working at minimum wage are able to meet these basic needs. Wage increases and supplements, along with supports for expenses such as childcare and transportation, are indicated to address this public health problem.
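
    The affordability test itself is a subtraction: net monthly income minus essential expenses, compared against the priced food basket. The sketch below uses the reported NNFB cost for the two-parent household, but the income and expense figures are placeholders, not the study's 2002 data.

```python
# Does the money left after essential expenses cover the food basket?
def remaining_for_food(net_income: float, expenses: dict) -> float:
    return net_income - sum(expenses.values())

net_income = 2100.00                  # hypothetical monthly minimum-wage income
expenses = {"rent": 750.0, "utilities": 180.0, "transport": 220.0,
            "childcare": 400.0, "clothing_personal": 120.0}
nnfb_cost = 572.90                    # NNFB, two parents and two children

left = remaining_for_food(net_income, expenses)
short = nnfb_cost - left
print(f"left for food: ${left:.2f}; "
      f"{'shortfall of $%.2f' % short if short > 0 else 'diet affordable'}")
```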

  6. Trajectory of social isolation following hip fracture: an analysis of the English Longitudinal Study of Ageing (ELSA) cohort.

    PubMed

    Smith, Toby O; Dainty, Jack R; MacGregor, Alex

    2018-01-01

    Social isolation is defined as a lack of meaningful and sustained communication or interactions with social networks. There is limited understanding of the prevalence of social isolation and loneliness in people following hip fracture, and no previous understanding of how this changes over time. The aim was to determine the prevalence and trajectory of social isolation and loneliness before a hip fracture, during the recovery phase, and a minimum of 2 years post-hip fracture in an English population. Data were from the English Longitudinal Study of Ageing (ELSA) cohort (2004/5-2014/15). The sample comprised 215 participants who had sustained a hip fracture. Measures of social isolation and loneliness were analysed through multilevel modelling to determine their trajectories during three time intervals (pre-fracture; interval at hip fracture and recovery; minimum 2 years post-fracture). The prevalence of social isolation and loneliness was determined pre- and post-fracture. The prevalence of social isolation was 19% post-hip fracture and of loneliness 13% post-hip fracture. There was no statistically significant change in social isolation pre-fracture compared to a minimum of 2 years post-fracture (P = 0.78). Similarly, there was no statistically significant change in loneliness pre-fracture compared to a minimum of 2 years post-fracture (P = 0.12). This analysis has determined that whilst social isolation and loneliness do not change over time following hip fracture, they remain a significant problem for this population. Interventions are required to address these physical and psychological health needs. This is important as they may have short and longer term health benefits for people post-hip fracture. © The Author 2017. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For permissions, please email: journals.permissions@oup.com

  7. Minimally invasive transforaminal lumbar interbody fusion for spondylolisthesis and degenerative spondylosis: 5-year results.

    PubMed

    Park, Yung; Ha, Joong Won; Lee, Yun Tae; Sung, Na Young

    2014-06-01

    Multiple studies have reported favorable short-term results after treatment of spondylolisthesis and other degenerative lumbar diseases with minimally invasive transforaminal lumbar interbody fusion. However, to our knowledge, results at a minimum of 5 years have not been reported. We determined (1) changes to the Oswestry Disability Index, (2) frequency of radiographic fusion, (3) complications and reoperations, and (4) the learning curve associated with minimally invasive transforaminal lumbar interbody fusion at minimum 5-year followup. We reviewed our first 124 patients who underwent minimally invasive transforaminal lumbar interbody fusion to treat low-grade spondylolisthesis and degenerative lumbar diseases and did not need a major deformity correction. This represented 63% (124 of 198) of the transforaminal lumbar interbody fusion procedures we performed for those indications during the study period (2003-2007). Eighty-three (67%) patients had complete 5-year followup. Plain radiographs and CT scans were evaluated by two reviewers. Trends of surgical time, blood loss, and hospital stay over time were examined by logarithmic curve fit-regression analysis to evaluate the learning curve. At 5 years, mean Oswestry Disability Index improved from 60 points preoperatively to 24 points and 79 of 83 patients (95%) had improvement of greater than 10 points. At 5 years, 67 of 83 (81%) achieved radiographic fusion, including 64 of 72 patients (89%) who had single-level surgery. Perioperative complications occurred in 11 of 124 patients (9%), and another surgical procedure was performed in eight of 124 patients (6.5%) involving the index level and seven of 124 patients (5.6%) at adjacent levels. There were slowly decreasing trends of surgical time and hospital stay only in single-level surgery and almost no change in intraoperative blood loss over time, suggesting a challenging learning curve. Oswestry Disability Index scores improved for patients with spondylolisthesis and degenerative lumbar diseases treated with minimally invasive transforaminal lumbar interbody fusion at minimum 5-year followup. We suggest this procedure is reasonable for properly selected patients with these indications; however, traditional approaches should still be performed for patients with high-grade spondylolisthesis, patients with a severely collapsed disc space and no motion seen on the dynamic radiographs, patients who need multilevel decompression and arthrodesis, and patients with kyphoscoliosis needing correction. Level IV, therapeutic study. See the Instructions for Authors for a complete description of levels of evidence.
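
    The learning-curve analysis named above (a logarithmic curve fit of operative metrics against case number) can be sketched in a few lines; the case times below are invented for illustration, not the study's data.

```python
# Fit surgical time = a + b * ln(case number) by least squares; a negative
# slope b indicates times falling along the learning curve.
import numpy as np

case = np.arange(1, 21)                          # consecutive case number
minutes = np.array([260, 245, 250, 230, 225, 220, 218, 210, 215, 205,
                    208, 200, 198, 202, 195, 192, 196, 190, 188, 191])

b, a = np.polyfit(np.log(case), minutes, 1)
print(f"time ~ {a:.1f} {b:+.1f} * ln(case) minutes")
```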

  8. Time-dependent rheological behavior of natural polysaccharide xanthan gum solutions in interrupted shear and step-incremental/reductional shear flow fields

    NASA Astrophysics Data System (ADS)

    Lee, Ji-Seok; Song, Ki-Won

    2015-11-01

    The objective of the present study is to systematically elucidate the time-dependent rheological behavior of concentrated xanthan gum systems in complicated step-shear flow fields. Using a strain-controlled rheometer (ARES), step-shear flow behaviors of a concentrated xanthan gum model solution have been experimentally investigated in interrupted shear flow fields with various combinations of shear rates, shearing times and rest times, and in step-incremental and step-reductional shear flow fields with various shearing times. The main findings obtained from this study are summarized as follows. (i) In interrupted shear flow fields, the shear stress increases sharply until reaching the maximum stress at an early stage of shearing, and then decays towards a steady state as the shearing time is increased in both start-up shear flow fields. The shear stress drops suddenly immediately after the imposed shear rate is stopped, and then slowly decays during the rest time. (ii) As the rest time is increased, the difference in the maximum stress values between the two start-up shear flow fields decreases, whereas the shearing time exerts only a slight influence on this behavior. (iii) In step-incremental shear flow fields, after passing through the maximum stress, structural destruction causes a stress decay towards a steady state as the shearing time is increased in each step shear flow region. The time needed to reach the maximum stress value is shortened as the step-increased shear rate becomes larger. (iv) In step-reductional shear flow fields, after passing through the minimum stress, structural recovery induces a stress growth towards an equilibrium state as the shearing time is increased in each step shear flow region. The time needed to reach the minimum stress value is lengthened as the step-decreased shear rate becomes smaller.

  9. 75 FR 68814 - Notice of Submission of Proposed Information Collection to OMB Minimum Property Standards for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... Proposed Information Collection to OMB Minimum Property Standards for Multifamily and Care-Type Occupancy... Lists the Following Information Title of Proposal: Minimum Property Standards for Multifamily and Care-Type Occupancy Housing. OMB Approval Number: 2502-0321. Form Numbers: None. Description of the Need for...

  10. 42 CFR 84.83 - Timers; elapsed time indicators; remaining service life indicators; minimum requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Timers; elapsed time indicators; remaining service life indicators; minimum requirements. 84.83 Section 84.83 Public Health PUBLIC HEALTH SERVICE... indicators; remaining service life indicators; minimum requirements. (a) Elapsed time indicators shall be...

  11. 42 CFR 84.83 - Timers; elapsed time indicators; remaining service life indicators; minimum requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Timers; elapsed time indicators; remaining service life indicators; minimum requirements. 84.83 Section 84.83 Public Health PUBLIC HEALTH SERVICE... indicators; remaining service life indicators; minimum requirements. (a) Elapsed time indicators shall be...

  12. Motor Controller

    NASA Technical Reports Server (NTRS)

    1988-01-01

    M.H. Marks Enterprises' Power Factor Controller (PFC) matches voltage to a motor's actual need. Plugged into a motor, the PFC continuously determines motor load by sensing shifts between voltage and current flow. When it senses a light load, it cuts voltage to the minimum needed. It offers potential energy savings ranging from eight percent up to 65 percent depending on the application. Myles Marks started out with the notion of writing an article for Popular Electronics magazine while offering to furnish kits to readers interested in assembling PFCs. Within two weeks of publication he had orders for 500 kits, and orders were still coming three years later.

  13. Runtime Speculative Software-Only Fault Tolerance

    DTIC Science & Technology

    2012-06-01

    reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing

  14. Temporal modulation transfer functions in auditory receptor fibres of the locust (Locusta migratoria L.).

    PubMed

    Prinz, P; Ronacher, B

    2002-08-01

    The temporal resolution of auditory receptors of locusts was investigated by applying noise stimuli with sinusoidal amplitude modulations and by computing temporal modulation transfer functions. These transfer functions showed mostly bandpass characteristics, which are rarely found in other species at the level of receptors. From the upper cut-off frequencies of the modulation transfer functions the minimum integration times were calculated. Minimum integration times showed no significant correlation with the receptor spike rates but depended strongly on the body temperature. At 20 degrees C the average minimum integration time was 1.7 ms, dropping to 0.95 ms at 30 degrees C. The values found in this study correspond well to the range of minimum integration times found in birds and mammals. Gap detection is another standard paradigm for investigating temporal resolution. In locusts and other grasshoppers, application of this paradigm yielded values of the minimum detectable gap widths that are approximately twice as large as the minimum integration times reported here.

  15. 75 FR 19541 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-15

    .... The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and the need... DP, Amdt 2 Alexandria, MN, Chandler Field, RNAV (GPS) RWY 22, Orig Bemidji, MN, Bemidji Rgnl, RNAV (GPS) RWY 25, Orig Granite Falls, MN, Granite Falls Muni/Lenzen-Roe Meml Fld, Takeoff Minimums and...

  16. Modification of Prim’s algorithm on complete broadcasting graph

    NASA Astrophysics Data System (ADS)

    Dairina; Arif, Salmawaty; Munzir, Said; Halfiani, Vera; Ramli, Marwan

    2017-09-01

    Broadcasting is the dissemination of information from one object to another through communication between pairs of objects in a network. Broadcasting among n objects can be accomplished with n - 1 communications and a minimum number of time units of ⌈log₂ n⌉. In this paper, broadcasting on weighted graphs is considered, and the minimum weight of a complete broadcasting graph is determined. A broadcasting graph is said to be complete if every pair of vertices is connected, so determining the minimum weight of a complete broadcasting graph is equivalent to determining the minimum spanning tree of a complete graph. Kruskal's and Prim's algorithms are used to determine the minimum weight of a complete broadcasting graph when the minimum time unit ⌈log₂ n⌉ is disregarded, and a modified Prim's algorithm is developed for the problem subject to the minimum time unit ⌈log₂ n⌉. As an example case, the training-of-trainers problem is solved using these algorithms; a sketch of the spanning-tree step follows below.
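
    As a concrete illustration of the minimum-spanning-tree step (a sketch, not the paper's own implementation), here is a minimal Prim's algorithm over a complete weighted graph in Python; the 4-vertex weight matrix is invented for the example:

        import heapq

        def prim_mst(weights):
            """Prim's algorithm on a complete graph given as an n x n weight matrix.
            Returns the total weight and the list of tree edges (parent, child)."""
            n = len(weights)
            visited = [False] * n
            edges, total = [], 0
            heap = [(0, 0, -1)]          # (edge weight, vertex, parent)
            while heap:
                w, u, parent = heapq.heappop(heap)
                if visited[u]:
                    continue
                visited[u] = True
                if parent >= 0:
                    total += w
                    edges.append((parent, u))
                for v in range(n):
                    if not visited[v] and v != u:
                        heapq.heappush(heap, (weights[u][v], v, u))
            return total, edges

        # Example: 4 fully connected "trainers"; entries are communication costs.
        W = [[0, 2, 3, 4],
             [2, 0, 1, 5],
             [3, 1, 0, 6],
             [4, 5, 6, 0]]
        print(prim_mst(W))   # -> (7, [(0, 1), (1, 2), (0, 3)])

    Note that this ignores the ⌈log₂ n⌉ time-unit constraint, which is what the paper's modified Prim's algorithm addresses.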

  17. [The history of optical signals for traffic regulation].

    PubMed

    Draeger, J; Harsch, V

    2008-04-01

    For signal transmission in traffic today, different optical, acoustic, and other physical or technical means are used to convey information. The different kinds of traffic (water navigation, road and rail, and, later, air transport) made traffic regulation necessary early on. This regulation, from its very beginning in ancient times, relied on optical signals; nowadays, this remains the most important method. From the very start, minimum requirements were imposed on the navigator's vision, color discrimination, dark adaptation, and even visual field. For historical reasons, it was in seafaring medicine that these requirements first developed. Besides the development of the different signals, methods for checking the requirements were soon developed. National and international requirements have been very different. Only within the last 50 years has international cooperation led to the acceptance of general standards for the different traffic modes. This article discusses the technical development of optical signals for the different kinds of traffic, from ancient times to the present, and explains the development of minimum requirements for the different visual functions.

  18. Laser propulsion to earth orbit. Has its time come?

    NASA Technical Reports Server (NTRS)

    Kantrowitz, Arthur

    1989-01-01

    Recent developments in high energy lasers, adaptive optics, and atmospheric transmission bring laser propulsion much closer to realization. Proposed here is a reference vehicle for study which consists of payload and solid propellant (e.g. ice). A suitable laser pulse is proposed for using a Laser Supported Detonation wave to produce thrust efficiently. It seems likely that a minimum system (10 MW CO2 laser and 10 m dia. mirror) could be constructed for about $150 M. This minimum system could launch payloads of about 13 kg to a 400 km orbit every 10 minutes. The annual launch capability would be about 683 tons times the duty factor. Laser propulsion would be an order of magnitude cheaper than chemical rockets if the duty factor were 20 percent (10,000 launches/yr). Launches beyond that would be even cheaper. The chief problem which needs to be addressed before these possibilities could be realized is the design of a propellant to turn laser energy into thrust efficiently and to withstand the launch environment.
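
    The launch-rate arithmetic in this abstract is easy to verify; the few lines below (Python, using only the abstract's stated numbers) reproduce the quoted figures:

        payload_kg = 13                           # payload per launch
        launches_per_year = 365 * 24 * 60 / 10    # one launch every 10 minutes
        annual_tons = payload_kg * launches_per_year / 1000
        print(launches_per_year, annual_tons)     # 52560.0 launches, ~683 t
        # At a 20% duty factor: ~10,512 launches/yr (the "10,000 launches/yr"
        # figure) and ~137 t of payload delivered to orbit per year.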

  19. VizieR Online Data Catalog: Evolution of solar irradiance during Holocene (Vieira+, 2011)

    NASA Astrophysics Data System (ADS)

    Vieira, L. E. A.; Solanki, S. K.; Krivova, N. A.; Usoskin, I.

    2011-05-01

    This is a composite total solar irradiance (TSI) time series for 9495BC to 2007AD constructed as described in Sect. 3.3 of the paper. Since the TSI is the main external heat input into the Earth's climate system, a consistent record covering as long a period as possible is needed for climate models. This was our main motivation for constructing this composite TSI time series. In order to produce a representative time series, we divided the Holocene into four periods according to the available data for each period. Table 4 (see below) summarizes the periods considered and the models available for each period. After the end of the Maunder Minimum we compute daily values, while prior to the end of the Maunder Minimum we compute 10-year averages. For the period for which both solar disk magnetograms and continuum images are available (period 1) we employ the SATIRE-S reconstruction (Krivova et al. 2003A&A...399L...1K; Wenzler et al. 2006A&A...460..583W). The SATIRE-T reconstruction (Krivova et al. 2010JGRA..11512112K) is used from the beginning of the Maunder Minimum (approximately 1640AD) to 1977AD. Prior to 1640AD reconstructions are based on cosmogenic isotopes (this paper). Different models of the Earth's geomagnetic field are available before and after approximately 5000BC. Therefore we treat periods 3 and 4 (before and after 5000BC) separately. Further details can be found in the paper. We emphasize that the reconstructions based on different proxies have different time resolutions. (1 data file).

  20. Usefulness of cardiovascular magnetic resonance imaging to predict the need for intervention in patients with coarctation of the aorta.

    PubMed

    Muzzarelli, Stefano; Meadows, Alison Knauth; Ordovas, Karen Gomes; Higgins, Charles Bernard; Meadows, Jeffery Joshua

    2012-03-15

    Cardiovascular magnetic resonance (CMR) imaging can predict hemodynamically significant coarctation of the aorta (CoA) with a high degree of discrimination. However, the ability of CMR to predict important clinical outcomes in this patient population is unknown. Therefore, we sought to define the ability of CMR to predict the need for surgical or transcatheter intervention in patients with CoA. We retrospectively reviewed the data from 133 consecutive patients who had undergone CMR for the evaluation of known or suspected CoA. The characteristics of the CMR-derived variables predicting the need for surgical or transcatheter intervention for CoA within 1 year were determined through logistic regression analysis. Therapeutic aortic intervention was performed in 41 (31%) of the 133 patients during the study period. The indexed minimum aortic cross-sectional area was the strongest predictor of subsequent intervention (area under the receiver operating characteristic curve 0.975) followed by heart rate-corrected deceleration time in the descending aorta (area under the receiver operating characteristic curve 0.951), and the percentage of flow increase (area under the receiver operating characteristic curve 0.867). The combination of the indexed minimum aortic cross-sectional area and rate-corrected deceleration time in the descending aorta provided the best predictive model (area under the receiver operating characteristic curve 0.986). In conclusion, CMR findings can predict the need for subsequent intervention in CoA. These findings reinforce the "gate-keeper role" of CMR to cardiac catheterization by providing valuable diagnostic and powerful prognostic information and could guide additional treatment of patients with CoA with the final intent of reducing the number of diagnostic catheterizations in such patients.

  1. Human equivalent power: towards an optimum energy level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafner, E.

    1979-01-01

    How much energy would be needed to support the average individual in an efficient technological culture? Present knowledge provides information about minimum dietary power needs; but so far we have not been able to find ways of analyzing other human needs which, in a civilized society, rise far above the power of metabolism. Thus we understand the level at its minimum but not at its optimum. This paper attempts to quantify an optimum power level for civilized society. The author describes a method he uses in seminars to quantify how many servants, in units of human equivalent power (HEP, roughly the 100 watts of continuous metabolic power one person can supply), are needed to support a person in an upper-middle-class lifestyle. Typical seminar participants determine that a per-capita power budget of 15 HEPs (perfect servants) would be required. Each human being on earth today is, according to the author, the master of forty slaves; in the U.S., he says, the number is close to 200. He concludes that a highly civilized standard of living may be closely associated with an optimum per capita power budget of 1500 watts; and since the average individual in the U.S. participates in energy turnover at almost ten times the rate he knows intuitively to be reasonable, reformation of American power habits will require reconstruction that shakes the house from top to bottom.

  2. Theoretical analysis of the cost of antagonistic activity for aquatic bacteria in oligotrophic environments.

    PubMed

    Aguirre-von-Wobeser, Eneas; Eguiarte, Luis E; Souza, Valeria; Soberón-Chávez, Gloria

    2015-01-01

    Many strains of bacteria produce antagonistic substances that restrain the growth of others, and potentially give them a competitive advantage. These substances are commonly released to the surrounding environment, involving metabolic costs in terms of energy and nutrients. The rate at which these molecules need to be produced to maintain a certain amount of them close to the producing cell before they are diluted into the environment has not been explored so far. To understand the potential cost of production of antagonistic substances in water environments, we used two different theoretical approaches. Using a probabilistic model, we determined the rate at which a cell needs to produce individual molecules in order to keep on average a single molecule in its vicinity at all times. For this minimum protection, a cell would need to invest 3.92 × 10^-22 kg s^-1 of organic matter, which is 9 orders of magnitude lower than the estimated expense for growth. Next, we used a continuous model, based on Fick's laws, to explore the production rate needed to sustain minimum inhibitory concentrations around a cell, which would provide much more protection from competitors. In this scenario, cells would need to invest 1.20 × 10^-11 kg s^-1, which is 2 orders of magnitude higher than the estimated expense for growth, and thus not sustainable. We hypothesize that the production of antimicrobial compounds by bacteria in aquatic environments lies between these two extremes.
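
    For a sense of how the continuous model works, the steady-state rate at which a sphere of radius R must release a solute to hold a surface concentration c against diffusive loss is the classical result Q = 4*pi*D*R*c. The Python sketch below uses this formula with illustrative parameter values that are assumptions, not the paper's inputs, so the resulting rate differs from the figures quoted above:

        import math

        # Illustrative parameters (assumed, not taken from the paper)
        D = 5e-10    # diffusivity of a small molecule in water, m^2/s
        R = 0.5e-6   # bacterial cell radius, m
        c = 1e-3     # target surface concentration, kg/m^3 (~1 mg/L)

        # Production rate balancing steady-state diffusive loss from a sphere
        Q = 4 * math.pi * D * R * c
        print(f"{Q:.2e} kg/s")   # ~3.1e-18 kg/s with these numbers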

  3. Dynamic Positron Emission Tomography [PET] in Man Using Small Bismuth Germanate Crystals

    DOE R&D Accomplishments Database

    Derenzo, S. E.; Budinger, T. F.; Huesman, R. H.; Cahoon, J. L.

    1982-04-01

    Primary considerations for the design of positron emission tomographs for medical studies in humans are the need for high imaging sensitivity, whole organ coverage, good spatial resolution, high maximum data rates, adequate spatial sampling with minimum mechanical motion, shielding against out of plane activity, pulse height discrimination against scattered photons, and timing discrimination against accidental coincidences. We discuss the choice of detectors, sampling motion, shielding, and electronics to meet these objectives.

  4. Thermal and mass implications of magmatic evolution in the Lassen volcanic region, California, and minimum constraints on basalt influx to the lower crust

    USGS Publications Warehouse

    Guffanti, M.; Clynne, M.A.; Muffler, L.J.P.

    1996-01-01

    We have analyzed the heat and mass demands of a petrologic model of basalt-driven magmatic evolution in which variously fractionated mafic magmas mix with silicic partial melts of the lower crust. We have formulated steady state heat budgets for two volcanically distinct areas in the Lassen region: the large, late Quaternary, intermediate to silicic Lassen volcanic center and the nearby, coeval, less evolved Caribou volcanic field. At Caribou volcanic field, heat provided by cooling and fractional crystallization of 52 km^3 of basalt is more than sufficient to produce 10 km^3 of rhyolitic melt by partial melting of lower crust. Net heat added by basalt intrusion at Caribou volcanic field is equivalent to an increase in lower crustal heat flow of ~7 mW m^-2, indicating that the field is not a major crustal thermal anomaly. Addition of cumulates from fractionation is offset by removal of erupted partial melts. A minimum basalt influx of 0.3 km^3 (km^2 Ma)^-1 is needed to supply Caribou volcanic field. Our methodology does not fully account for an influx of basalt that remains in the crust as derivative intrusives. On the basis of comparison to deep heat flow, the input of basalt could be ~3 to 7 times the amount we calculate. At Lassen volcanic center, at least 203 km^3 of mantle-derived basalt is needed to produce 141 km^3 of partial melt and drive the volcanic system. Partial melting mobilizes lower crustal material, augmenting the magmatic volume available for eruption at Lassen volcanic center; thus the erupted volume of 215 km^3 exceeds the calculated basalt input of 203 km^3. The minimum basalt input of 1.6 km^3 (km^2 Ma)^-1 is >5 times the minimum influx to the Caribou volcanic field. Basalt influx high enough to sustain considerable partial melting, coupled with a locally high extension rate, is a crucial factor in the development of Lassen volcanic center; in contrast, Caribou volcanic field has failed to develop into a large silicic center primarily because the basalt supply there has been insufficient.
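
    A back-of-the-envelope version of such a heat budget compares the heat released by cooling and crystallizing basalt with the heat required to partially melt crust. The Python sketch below uses generic rock properties (illustrative assumptions, not the paper's inputs) together with the two volumes quoted for Caribou volcanic field:

        # Generic rock properties (illustrative assumptions)
        rho = 2800.0        # density, kg/m^3
        cp = 1.0e3          # specific heat, J/(kg K)
        L = 4.0e5           # latent heat of crystallization/fusion, J/kg
        dT_basalt = 400.0   # cooling interval of intruded basalt, K
        dT_crust = 300.0    # heating of crust to its solidus, K
        km3 = 1.0e9         # cubic meters per km^3

        heat_released = 52 * km3 * rho * (cp * dT_basalt + L)   # from basalt, J
        heat_required = 10 * km3 * rho * (cp * dT_crust + L)    # to melt rhyolite, J
        print(f"ratio ~ {heat_released / heat_required:.1f}")   # ~5.9, i.e. more than sufficient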

  5. Study on statistical breakdown delay time in argon gas using a W-band millimeter-wave gyrotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dongsung; Yu, Dongho; Choe, MunSeok

    2016-04-15

    In this study, we investigated plasma initiation delay times for argon volume breakdown at the W-band frequency regime. The threshold electric field is defined as the minimum electric field amplitude needed for plasma breakdown at various pressures. The measured statistical delay time showed an excellent agreement with the theoretical Gaussian distribution and the theoretically estimated formative delay time. Also, we demonstrated that the normalized effective electric field as a function of the product of pressure and formative time shows an outstanding agreement with that of a 1D particle-in-cell simulation coupled with a Monte Carlo collision model [H. C. Kim and J. P. Verboncoeur, Phys. Plasmas 13, 123506 (2006)].

  6. Influence of different types of pulp treatment during isolation in the obtention of human dental pulp stem cells

    PubMed Central

    Viña-Almunia, Jose; Borras, Consuelo; Gambini, Juan; El Alamy, Marya; Viña, Jose

    2016-01-01

    Background: Different methods have been used to isolate dental pulp stem cells. The aim of this study was to investigate the effect of different types of pulp treatment during isolation, under 3% O2 conditions, on the time needed and the efficacy of obtaining dental pulp stem cells. Material and Methods: One hundred and twenty dental pulps were used to isolate dental pulp stem cells, treating the pulp tissue during isolation with 9 different methods using digestive, disaggregation, or mechanical agents, or combinations of them. The cells were positive for the CD133, Oct4, Nestin, Stro-1 and CD34 markers, and negative for the hematopoietic cell marker CD45, thus confirming the presence of mesenchymal stem cells. The efficacy of obtaining dental pulp stem cells and the minimum time needed to obtain them were analyzed across the 9 different methods. Results: Dental pulp stem cells were obtained from 97 of the 120 pulps used in the study, i.e. 80.8% of the cases. They were obtained with all the methods used except mechanical fragmentation of the pulp, where no enzymatic digestion was performed. The minimum time needed to isolate dental pulp stem cells was 8 hours, digesting with 2 mg/ml EDTA for 10 minutes, 4 mg/ml of type I collagenase and 4 mg/ml of type II dispase for 40 minutes, 13 ng/ml of thermolysin for 40 minutes, and sonicating the culture for one minute. Conclusions: Dental pulp stem cells were obtained in 97 cases from a series of 120 pulps. The time for obtaining dental pulp stem cells was reduced maximally, without compromising the yield of cells, by combining digestive, disaggregation, and mechanical agents. Key words: Dental pulp stem cells, mesenchymal stem cells, isolation method. PMID:26946201

  7. Road map to adaptive optimal control. [jet engine control

    NASA Technical Reports Server (NTRS)

    Boyer, R.

    1980-01-01

    A building block control structure leading toward adaptive, optimal control for jet engines is developed. This approach simplifies the addition of new features and allows for easier checkout of the control by providing a baseline system for comparison. It also makes it possible to eliminate features that do not pay off, by being selective about which new building blocks are added to the baseline system. The minimum risk approach specifically addresses the need for active identification of the plant to be controlled in real time and for real-time optimization of the control for the identified plant.

  8. Minimum-Time Consensus-Based Approach for Power System Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Tao; Wu, Di; Sun, Yannan

    2016-02-01

    This paper presents minimum-time consensus based distributed algorithms for power system applications, such as load shedding and economic dispatch. The proposed algorithms are capable of solving these problems in a minimum number of time steps instead of asymptotically, as in most existing studies. Moreover, these algorithms are applicable to both undirected and directed communication networks. Simulation results are used to validate the proposed algorithms.
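
    For contrast with the asymptotic behavior the abstract refers to, the Python sketch below shows standard linear average consensus; the paper's minimum-time algorithms instead let each node use its local observation history to compute the exact consensus value after finitely many steps, machinery not reproduced here. The 4-node weight matrix is an invented example:

        import numpy as np

        # Doubly stochastic averaging matrix for a 4-node path graph
        W = np.array([[0.50, 0.50, 0.00, 0.00],
                      [0.50, 0.25, 0.25, 0.00],
                      [0.00, 0.25, 0.25, 0.50],
                      [0.00, 0.00, 0.50, 0.50]])

        x = np.array([10.0, 0.0, 5.0, 1.0])   # e.g. local load measurements
        for _ in range(50):
            x = W @ x                          # each node averages with its neighbors
        print(x)   # entries approach the mean, 4.0, only asymptotically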

  9. 50 CFR 259.34 - Minimum and maximum deposits; maximum time to deposit.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... B objective. A time longer than 10 years, either by original scheduling or by subsequent extension... OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE AID TO FISHERIES CAPITAL CONSTRUCTION FUND...) Minimum annual deposit. The minimum annual (based on each party's taxable year) deposit required by the...

  10. 78 FR 56829 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Minimums and ODP copies may be obtained from: 1. FAA Public Inquiry Center (APA-200), FAA Headquarters..., and the need for a special format make their verbatim publication in the Federal Register expensive...

  11. 77 FR 56762 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-14

    ... Minimums and ODP copies may be obtained from: 1. FAA Public Inquiry Center (APA-200), FAA Headquarters..., and the need for a special format make their verbatim publication in the Federal Register expensive...

  12. Practical Algorithms for the Longest Common Extension Problem

    NASA Astrophysics Data System (ADS)

    Ilie, Lucian; Tinta, Liviu

    The Longest Common Extension problem considers a string s and computes, for each of a number of pairs (i,j), the longest substring of s that starts at both i and j. It appears as a subproblem in many fundamental string problems and can be solved by linear-time preprocessing of the string that allows (worst-case) constant-time computation for each pair. The two known approaches use powerful algorithms: either constant-time computation of the Lowest Common Ancestor in trees or constant-time computation of Range Minimum Queries (RMQ) in arrays. We show here that, from a practical point of view, such complicated approaches are not needed. We give two very simple algorithms for this problem that require no preprocessing. The first needs only the string and is significantly faster than all previous algorithms on the average. The second combines the first with a direct RMQ computation on the Longest Common Prefix array. It takes advantage of the superior speed of the cache memory and is the fastest on virtually all inputs.
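
    A minimal sketch of the first, preprocessing-free algorithm (direct character comparison, which is fast on average because mismatches occur quickly at random positions):

        def lce(s, i, j):
            """Length of the longest substring of s starting at both i and j."""
            k, n = 0, len(s)
            while i + k < n and j + k < n and s[i + k] == s[j + k]:
                k += 1
            return k

        s = "abracadabra"
        print(lce(s, 0, 7))   # 4 -> "abra" starts at positions 0 and 7
        print(lce(s, 1, 3))   # 0 -> 'b' != 'a'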

  13. Predictive minimum description length principle approach to inferring gene regulatory networks.

    PubMed

    Chaitankar, Vijender; Zhang, Chaoyang; Ghosh, Preetam; Gong, Ping; Perkins, Edward J; Deng, Youping

    2011-01-01

    Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is determining the threshold that defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of the model length and the data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we propose a new inference algorithm that incorporates mutual information (MI), conditional mutual information (CMI), and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes, and the PMDL principle method attempts to determine the best MI threshold without the need for a user-specified fine-tuning parameter. The performance of the proposed algorithm is evaluated using both synthetic time series data sets and a biological time series data set (Saccharomyces cerevisiae). The results show that the proposed algorithm produced fewer false edges and significantly improved precision when compared to the existing MDL algorithm.
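
    To make the information-theoretic step concrete, here is a minimal Python sketch (not the authors' implementation) that estimates pairwise mutual information from discretized expression profiles and keeps edges above a fixed threshold; the CMI filtering and the PMDL machinery for choosing the threshold are omitted:

        import numpy as np
        from collections import Counter

        def mutual_information(x, y):
            """MI (in bits) between two equal-length discrete sequences."""
            n = len(x)
            px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
            mi = 0.0
            for (a, b), c in pxy.items():
                p_ab = c / n
                mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
            return mi

        # Toy data: 3 genes x 8 time points, binarized expression levels
        expr = [(0, 1, 1, 0, 1, 0, 1, 1),
                (0, 1, 1, 0, 1, 0, 1, 1),   # copy of gene 0 -> high MI
                (1, 0, 0, 1, 1, 1, 0, 0)]

        threshold = 0.8   # fixed here; PMDL would select this from the data
        for i in range(len(expr)):
            for j in range(i + 1, len(expr)):
                mi = mutual_information(expr[i], expr[j])
                if mi > threshold:
                    print(f"edge {i} -- {j} (MI = {mi:.2f})")   # prints edge 0 -- 1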

  14. 77 FR 37799 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-25

    ..., individual SIAP and Takeoff Minimums and ODP copies may be obtained from: 1. FAA Public Inquiry Center (APA... number of SIAPs, their complex nature, and the need for a special format make their verbatim publication...

  15. 77 FR 5694 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... Minimums and ODP copies may be obtained from: 1. FAA Public Inquiry Center (APA-200), FAA Headquarters..., their complex nature, and the need for a special format make their verbatim publication in the Federal...

  16. Protocol for the "Michigan Awareness Control Study": A prospective, randomized, controlled trial comparing electronic alerts based on bispectral index monitoring or minimum alveolar concentration for the prevention of intraoperative awareness.

    PubMed

    Mashour, George A; Tremper, Kevin K; Avidan, Michael S

    2009-11-05

    The incidence of intraoperative awareness with explicit recall is 1-2/1000 cases in the United States. The Bispectral Index monitor is an electroencephalographic method of assessing anesthetic depth that has been shown in one prospective study to reduce the incidence of awareness in the high-risk population. In the B-Aware trial, the number needed to treat in order to prevent one case of awareness in the high-risk population was 138. Since the number needed to treat and the associated cost of treatment would be much higher in the general population, the efficacy of the Bispectral Index monitor in preventing awareness in all anesthetized patients needs to be clearly established. This is especially true given the findings of the B-Unaware trial, which demonstrated no significant difference between protocols based on the Bispectral Index monitor or minimum alveolar concentration for the reduction of awareness in high risk patients. To evaluate efficacy in the general population, we are conducting a prospective, randomized, controlled trial comparing the Bispectral Index monitor to a non-electroencephalographic gauge of anesthetic depth. The total recruitment for the study is targeted for 30,000 patients at both low and high risk for awareness. We have developed a novel algorithm that is capable of real-time analysis of our electronic perioperative information system. In one arm of the study, anesthesia providers will receive an electronic page if the Bispectral Index value is >60. In the other arm of the study, anesthesia providers will receive a page if the age-adjusted minimum alveolar concentration is <0.5. Our minimum alveolar concentration algorithm is sensitive to both inhalational anesthetics and intravenous sedative-hypnotic agents. Awareness during general anesthesia is a persistent problem and the role of the Bispectral Index monitor in its prevention is still unclear. The Michigan Awareness Control Study is the largest prospective trial of awareness prevention ever conducted. Clinical Trial NCT00689091.
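
    The two alerting rules are simple thresholds; the Python sketch below shows their logic. The age adjustment shown is the commonly used Mapleson relation (MAC falls about 6% per decade from its age-40 value), which is an assumption about the study's exact formula:

        def age_adjusted_mac_fraction(fraction_of_mac40, age):
            """Convert an end-tidal MAC fraction referenced to age 40 into a fraction
            of the age-adjusted MAC, via MAC(age) = MAC40 * 10**(-0.00269*(age-40))."""
            return fraction_of_mac40 / 10 ** (-0.00269 * (age - 40))

        def page_bis_arm(bis_value):
            return bis_value > 60                # BIS arm: page the provider if BIS > 60

        def page_mac_arm(adjusted_mac_fraction):
            return adjusted_mac_fraction < 0.5   # MAC arm: page if below 0.5 MAC

        print(page_bis_arm(72))                                    # True -> page
        print(page_mac_arm(age_adjusted_mac_fraction(0.45, 70)))   # ~0.54 -> False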

  17. Simultaneous CCD Photometry of Two Eclipsing Binary Stars in Pegasus - Part2: BX Peg

    NASA Astrophysics Data System (ADS)

    Alton, K. B.

    2013-05-01

    BX Peg is an overcontact W UMa binary system (P = 0.280416 d) which has been rather well studied, but is not fully understood due to complex changes in eclipse timings and light curve variations attributed to star spots. Photometric data collected in three bandpasses (B, V, and Ic) produced nineteen new times of minimum for BX Peg. These were used to update the linear ephemeris and to further analyze potential changes in orbital periodicity by examining long-term changes in eclipse timings. In addition, synthetic fitting of light curves by Roche modeling was accomplished with the assistance of three different programs, two of which employ the Wilson-Devinney code. Different spotted solutions were necessary to achieve the best Roche model fits for the BX Peg light curves collected in 2008 and 2011. Overall, the long-term decrease (9.66 × 10^-3 s yr^-1) in orbital period defined by the parabolic fit of the eclipse timing data could arise from mass transfer or angular momentum loss. The remaining residuals from observed minus predicted eclipse timings for BX Peg exhibit complex but non-random behavior. These may be related to magnetic activity cycles and/or the presence of an unseen mass influencing the times of minimum; however, additional minima need to be collected over a much longer timescale to resolve the nature of these complex changes.
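
    The period-change analysis rests on a quadratic ("parabolic") fit to eclipse-timing residuals. The Python sketch below runs the standard O-C computation on synthetic numbers (the real BX Peg timings are not reproduced here):

        import numpy as np

        P0, T0 = 0.280416, 0.0     # linear ephemeris: period (d) and epoch
        dPdE = -1.5e-10            # synthetic period change per cycle, d/cycle

        E = np.arange(0, 40001, 2000)                # cycle numbers
        t_obs = T0 + P0 * E + 0.5 * dPdE * E**2      # synthetic times of minimum

        o_minus_c = t_obs - (T0 + P0 * E)            # residuals vs linear ephemeris
        a2, a1, a0 = np.polyfit(E, o_minus_c, 2)     # parabolic fit

        dP_dt = 2 * a2 * (365.25 / P0) * 86400       # period change, s/yr
        print(f"dP/dt ~ {dP_dt:.2e} s/yr")           # recovers ~ -1.7e-2 s/yr here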

  18. Return to work after spinal cord injury: factors related to time to first job.

    PubMed

    Ramakrishnan, K; Mazlan, M; Julia, P E; Abdul Latif, L

    2011-08-01

    Cross-sectional survey. To investigate factors related to the length of time between spinal cord injury (SCI) onset and the start of first post-injury employment. Persons living with SCI in the community who are members of a disability support organization. Participants were randomly selected from the membership list of a non-governmental voluntary organization. They met the following four criteria: traumatic SCI, minimum of 15 years of age at the time of survey, a minimum of 2 years after SCI, and had been employed for some time since SCI. The main outcome measure was time (in years) from injury onset to beginning the first post-injury job. Participants averaged 4.9 years (s.d. 5.1) from the time of SCI to their first post-injury job, with a range of 3 months to 20 years. Fifty percent of the participants who eventually returned to work had done so by 4 years. Return to the pre-injury employer and employment were associated with early return, whereas having fewer years in education and being older at the time of injury were associated with a longer time to return to work. Rehabilitation teams need to consider return to employment as a realistic goal even many years after SCI. A focus on returning more people to their pre-injury employer and employment, with added input from the rehabilitation team for those with lower education status and older age at the time of injury, might expedite the process of reintegration.

  19. On pressure measurement and seasonal pressure variations during the Phoenix mission

    NASA Astrophysics Data System (ADS)

    Taylor, Peter A.; Kahanpää, Henrik; Weng, Wensong; Akingunola, Ayodeji; Cook, Clive; Daly, Mike; Dickinson, Cameron; Harri, Ari-Matti; Hill, Darren; Hipkin, Victoria; Polkko, Jouni; Whiteway, Jim

    2010-03-01

    In situ surface pressures measured at 2 s intervals during the 150 sol Phoenix mission are presented and seasonal variations discussed. The lightweight Barocap®/Thermocap® pressure sensor system performed moderately well. However, the original data processing routine had problems because the thermal environment of the sensor was subject to more rapid variations than had been expected. Hence, the data processing routine was updated after Phoenix landed. Further evaluation and the development of a correction are needed since the temperature dependences of the Barocap sensor heads have drifted after the calibration of the sensor. The inaccuracy caused by this appears when the temperature of the unit rises above 0°C. This frequently affects data in the afternoons and precludes a full study of diurnal pressure variations at this time. Short-term fluctuations, on time scales of order 20 s, are unaffected and are reported in a separate paper in this issue. Seasonal variations are not significantly affected by this problem and show general agreement with previous measurements from Mars. During the 151 sol mission the surface pressure dropped from around 860 Pa to a minimum (daily average) of 724 Pa on sol 140 (Ls 143). This local minimum occurred several sols earlier than expected based on GCM studies and Viking data. Since battery power was lost on sol 151, we are not sure if the timing of the minimum that we saw could have been advanced by a low-pressure meteorological event. On sol 95 (Ls 122), we also saw a relatively low-pressure feature. This was accompanied by a large number of vertical vortex events, characterized by short, localized (in time), low-pressure perturbations.

  20. Identification of registered nursing care of residents in English nursing homes using the Minimum Data Set Resident Assessment Instrument (MDS/RAI) and Resource Utilisation Groups version III (RUG-III).

    PubMed

    Carpenter, Iain; Perry, Michelle; Challis, David; Hope, Kevin

    2003-05-01

    To determine if a combination of Minimum Data Set/Resident Assessment Instrument (MDS/RAI) assessment variables and the Resource Utilisation Groups version III (RUG-III) case-mix system could be used as a method of identifying and reimbursing registered nursing care needs in long-term care. The sample included 193 nursing home residents from four nursing homes in three different locations and care providers in England. The study included assessments of residents' care needs using either the MDS/RAI assessments or RUG stand-alone questionnaires, and a time study that recorded the amount of nursing time received by residents over a 24-h period. The validity of RUG-III for explaining the distribution of care time between residents in different RUG-III groups was tested. The difference in direct and indirect care provided by registered general nurses (RGN) and care assistants (CA) to residents in RUG-III clinical groups was compared. The RUG-III system explained 56% of the variance in care time (η², P=0.0001). Residents in RUG-III groups associated with particular medical and nursing needs (enhanced RGN care) received more than twice as much indirect RGN care time (t-test, P<0.001) and 1.4 times as much direct RGN and direct CA time (t-test, P<0.01) as residents with primarily cognitive impairment or physical problems only (standard RGN care). Residents with enhanced RGN care received an average of 48.1 min of RGN care in 24 h (95% CI 4.1-55.2) compared with an average of 31.1 min (95% CI 26.8-35.5) for residents in the standard RGN care group. A third low RGN care group was created following publication of the Department of Health guidance on NHS Funded Nursing Care. With three levels, the enhanced care group receives about 38% more than the standard group, and the low group receives about 50% of the standard group. The RUG-III system effectively differentiated between nursing home residents who are receiving 'low', 'standard' and 'enhanced' RGN care time. The findings could provide the basis of a reimbursement system for registered nursing time in long-term care facilities in the UK.

  1. Quasi-elastic light scattering: Signal storage, correlation, and spectrum analysis under control of an 8-bit microprocessor

    NASA Astrophysics Data System (ADS)

    Glatter, Otto; Fuchs, Heribert; Jorde, Christian; Eigner, Wolf-Dieter

    1987-03-01

    The microprocessor of an 8-bit PC system is used as a central control unit for the acquisition and evaluation of data from quasi-elastic light scattering experiments. Data are sampled with a width of 8 bits under control of the CPU. This limits the minimum sample time to 20 μs; shorter sample times would need a direct memory access channel. The 8-bit CPU can address a 64-kbyte RAM without additional paging. Up to 49 000 sample points can be measured without interruption. After storage, a correlation function or a power spectrum can be calculated from such a primary data set. Furthermore, access is provided to the primary data for stability control, statistical tests, and comparison of different evaluation methods for the same experiment. A detailed analysis of the signal (histogram) and of the effect of overflows is possible and shows that the number of pulses, but not the number of overflows, determines the error in the result. The correlation function can be computed with reasonable accuracy from data with a mean pulse rate greater than one; the power spectrum needs a three times higher pulse rate for convergence. The statistical accuracy of the results from 49 000 sample points is of the order of a few percent. Additional averages are necessary to improve their quality. The hardware extensions for the PC system are inexpensive. The main disadvantage of the present system is the high minimum sampling time of 20 μs and the fact that the correlogram or the power spectrum cannot be computed on-line as it can be with hardware correlators or spectrum analyzers. These shortcomings and the storage size restrictions can be removed with a faster 16/32-bit CPU.
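
    The off-line correlogram computation described here amounts to an autocorrelation over the stored counts; below is a minimal numpy sketch of that step (illustrative, not the original 8-bit code). With uncorrelated Poisson counts the normalized correlogram is flat near 1.0:

        import numpy as np

        def correlogram(counts, max_lag):
            """Normalized intensity autocorrelation g2(lag) from photon counts."""
            c = np.asarray(counts, dtype=float)
            n, mean_sq = len(c), c.mean() ** 2
            return np.array([np.mean(c[:n - lag] * c[lag:]) / mean_sq
                             for lag in range(1, max_lag + 1)])

        rng = np.random.default_rng(1)
        counts = rng.poisson(lam=3.0, size=49000)   # 49 000 samples, as in the text
        print(correlogram(counts, 5))               # ~[1.0, 1.0, 1.0, 1.0, 1.0]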

  2. Bacterioplankton Populations within the Oxygen Minimum Zone of the Sargasso Sea

    NASA Astrophysics Data System (ADS)

    Schuler, G.; Parsons, R. J.; Johnson, R. J.

    2016-02-01

    Oxygen minimum zones are present throughout the world's oceans and occur at depths between 200 and 1000 m. Heterotrophic bacteria reduce the dissolved oxygen within this layer through respiration while metabolizing falling particles. This report studied the bacterioplankton in the oxygen minimum zone at the BATS (Bermuda Atlantic Time-series Study) site from July 2014 until November 2014. Total bacterioplankton populations were enumerated through direct counts. In the transitional zone (400m-800m) of the oxygen minimum zone, a secondary bacterioplankton peak formed. This study used FISH (fluorescent in situ hybridization) and CARD-FISH (catalyzed reporter deposition fluorescent in situ hybridization) to enumerate specific bacterial and archaeal taxa. Crenarchaeota (including Thaumarchaeota) increased in abundance within the upper oxycline. Thaumarchaeota carry the ammonia monooxygenase gene, which oxidizes ammonium into nitrite under low oxygen conditions. Amplification of the amoA gene confirmed that ammonia oxidizing archaea (AOA) were present within the OMZ. Using Terminal Restriction Fragment Length Polymorphism (T-RFLP), the bacterial community structure showed high similarity within depth-based zones (0-80m, 160-600m, and 800-4500m). Niskin experiments determined that water collected at 800m had an exponential increase in bacterioplankton over time. While the experimental design did not allow for oxygen levels to be maintained, the bacterioplankton community was predominantly bacterial, with eubacteria-positive cells making up 89.3% of the total bacterioplankton community by day 34. Improvements to the experimental design are required to determine which specific bacterial taxa caused this increase at 800m. This study suggests that there are factors other than oxygen influencing bacterioplankton populations at the BATS site, and more analysis is needed once the BATS data are available to determine the key drivers of bacterioplankton dynamics within the BATS OMZ.

  3. A new approach for minimum phase output definition

    NASA Astrophysics Data System (ADS)

    Jahangiri, Fatemeh; Talebi, Heidar Ali; Menhaj, Mohammad Bagher; Ebenbauer, Christian

    2017-01-01

    This paper presents a novel method for output redefinition for linear systems. The approach also determines the possible relative degrees for the system corresponding to any new output vector. To guarantee the minimum phase property with a prescribed relative degree, a set of new conditions is introduced. A key feature of these conditions is that no form of transformation is needed, which makes the scheme suitable for optimisation problems in control that must ensure the minimum phase property. Moreover, the results are useful for sensor placement problems and for obtaining minimum phase approximations of non-minimum phase systems. Numerical examples, including an example of unmanned aerial vehicle systems, are given to demonstrate the effectiveness of the methodology.

  4. Preparing routine health information systems for immediate health responses to disasters

    PubMed Central

    Aung, Eindra; Whittaker, Maxine

    2013-01-01

    During disaster times, we need specific information to rapidly plan a disaster response, especially in sudden-onset disasters. Due to the inadequate capacity of Routine Health Information Systems (RHIS), many developing countries face a lack of quality pre-disaster health-related data and efficient post-disaster data processes in the immediate aftermath of a disaster. Considering the significance of local capacity during the early stages of disaster response, RHIS at local, provincial/state and national levels need to be strengthened so that they provide relief personnel up-to-date information to plan, organize and monitor immediate relief activities. RHIS professionals should be aware of specific information needs in disaster response (according to the Sphere Project’s Humanitarian Minimum Standards) and requirements in data processes to fulfil those information needs. Preparing RHIS for disasters can be guided by key RHIS-strengthening frameworks; and disaster preparedness must be incorporated into countries’ RHIS. Mechanisms must be established in non-disaster times and maintained between RHIS and information systems of non-health sectors for exchanging disaster-related information and sharing technologies and cost. PMID:23002249

  5. Will Increasing Alcohol Availability By Lowering the Minimum Legal Drinking Age Decrease Drinking and Related Consequences Among Youths?

    PubMed Central

    Wechsler, Henry

    2010-01-01

    The health consequences of alcohol use are considerable; prevention efforts are needed, particularly for adolescents and college students. The national minimum legal drinking age of 21 years is a primary alcohol-control policy in the United States. An advocacy group supported by some college presidents seeks public debate on the minimum legal drinking age and proposes reducing it to 18 years. We reviewed recent trends in drinking and related consequences, evidence on the effectiveness of the minimum legal drinking age of 21 years, research on drinking among college students related to the minimum legal drinking age, and the case to lower the minimum legal drinking age. Evidence supporting the minimum legal drinking age of 21 years is strong and growing. A wide range of empirically supported interventions is available to reduce underage drinking. Public health professionals can play a role in advocating these interventions. PMID:20395573

  6. 40 CFR 63.1257 - Test methods and compliance procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...)(2), or 63.1256(h)(2)(i)(C) with a minimum residence time of 0.5 seconds and a minimum temperature of... temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B) For a...

  7. 40 CFR 63.1257 - Test methods and compliance procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...)(2), or 63.1256(h)(2)(i)(C) with a minimum residence time of 0.5 seconds and a minimum temperature of... temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B) For a...

  8. 40 CFR 63.1257 - Test methods and compliance procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...)(2), or 63.1256(h)(2)(i)(C) with a minimum residence time of 0.5 seconds and a minimum temperature of... temperature of the organic HAP, must consider the vent stream flow rate, and must establish the design minimum and average temperature in the combustion zone and the combustion zone residence time. (B) For a...

  9. LDPC Codes with Minimum Distance Proportional to Block Size

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Samuel; Thorpe, Jeremy

    2009-01-01

    Low-density parity-check (LDPC) codes characterized by minimum Hamming distances proportional to block sizes have been demonstrated. Like the codes mentioned in the immediately preceding article, the present codes are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. The previously mentioned codes have low decoding thresholds and reasonably low error floors. However, the minimum Hamming distances of those codes do not grow linearly with code-block sizes. Codes that have this minimum-distance property exhibit very low error floors. Examples of such codes include regular LDPC codes with variable degrees of at least 3. Unfortunately, the decoding thresholds of regular LDPC codes are high. Hence, there is a need for LDPC codes characterized by both low decoding thresholds and, in order to obtain acceptably low error floors, minimum Hamming distances that are proportional to code-block sizes. The present codes were developed to satisfy this need. The minimum Hamming distances of the present codes have been shown, through consideration of ensemble-average weight enumerators, to be proportional to code block sizes. As in the cases of irregular ensembles, the properties of these codes are sensitive to the proportion of degree-2 variable nodes. A code having too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code having too many such nodes tends not to exhibit a minimum distance that is proportional to block size. Results of computational simulations have shown that the decoding thresholds of codes of the present type are lower than those of regular LDPC codes. Included in the simulations were a few examples from a family of codes characterized by rates ranging from low to high and by thresholds that adhere closely to their respective channel capacity thresholds; the simulation results from these examples showed that the codes in question have low error floors as well as low decoding thresholds. As an example, the illustration shows the protograph (which represents the blueprint for overall construction) of one proposed code family for code rates greater than or equal to 1/2. Any size LDPC code can be obtained by copying the protograph structure N times, then permuting the edges. The illustration also provides Field Programmable Gate Array (FPGA) hardware performance simulations for this code family. In addition, the illustration provides minimum signal-to-noise ratios (Eb/No) in decibels (decoding thresholds) needed to achieve zero error rates as the code block size goes to infinity, for various code rates. In comparison with the codes mentioned in the preceding article, these codes have slightly higher decoding thresholds.
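
    The copy-and-permute construction mentioned above can be sketched in a few lines: every 1 in a binary base (protograph) matrix becomes an N x N permutation block, every 0 a zero block. The toy base matrix below is invented for illustration, not the article's actual protograph:

        import numpy as np

        def lift_protograph(base, N, seed=0):
            """Expand a binary protograph base matrix by a factor N (copy-and-permute)."""
            rng = np.random.default_rng(seed)
            rows, cols = base.shape
            H = np.zeros((rows * N, cols * N), dtype=int)
            I = np.eye(N, dtype=int)
            for r in range(rows):
                for c in range(cols):
                    if base[r, c]:
                        H[r*N:(r+1)*N, c*N:(c+1)*N] = I[rng.permutation(N)]
            return H

        base = np.array([[1, 1, 1, 0],      # toy rate-1/2 protograph
                         [0, 1, 1, 1]])     # (2 checks x 4 variable nodes)
        H = lift_protograph(base, 4)
        print(H.shape)   # (8, 16): each protograph edge became a 4x4 permutation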

  10. Improving Data for Behavioral Health Workforce Planning: Development of a Minimum Data Set.

    PubMed

    Beck, Angela J; Singer, Phillip M; Buche, Jessica; Manderscheid, Ronald W; Buerhaus, Peter

    2018-06-01

    The behavioral health workforce, which encompasses a broad range of professions providing prevention, treatment, and rehabilitation services for mental health conditions and substance use disorders, is in the midst of what is considered by many to be a workforce crisis. The workforce shortage can be attributed to both insufficient numbers and maldistribution of workers, leaving some communities with no behavioral health providers. In addition, demand for behavioral health services has increased more rapidly as a result of federal legislation over the past decade supporting mental health and substance use parity and by healthcare reform. In order to address workforce capacity issues that impact access to care, the field must engage in extensive planning; however, these efforts are limited by the lack of timely and useable data on the behavioral health workforce. One method for standardizing data collection efforts is the adoption of a Minimum Data Set. This article describes workforce data limitations, the need for standardizing data collection, and the development of a behavioral health workforce Minimum Data Set intended to address these gaps. The Minimum Data Set includes five categorical data themes to describe worker characteristics: demographics, licensure and certification, education and training, occupation and area of practice, and practice characteristics and settings. Some data sources align with Minimum Data Set themes, although deficiencies in the breadth and quality of data exist. Development of a Minimum Data Set is a foundational step for standardizing the collection of behavioral health workforce data. Key challenges for dissemination and implementation of the Minimum Data Set are also addressed. This article is part of a supplement entitled The Behavioral Health Workforce: Planning, Practice, and Preparation, which is sponsored by the Substance Abuse and Mental Health Services Administration and the Health Resources and Services Administration of the U.S. Department of Health and Human Services.

  11. A Method for Modeling Household Occupant Behavior to Simulate Residential Energy Consumption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brandon J; Starke, Michael R; Abdelaziz, Omar

    2014-01-01

    This paper presents a statistical method for modeling the behavior of household occupants to estimate residential energy consumption. Using data gathered by the U.S. Census Bureau in the American Time Use Survey (ATUS), actions carried out by survey respondents are categorized into ten distinct activities. These activities are defined to correspond to the major energy consuming loads commonly found within the residential sector. Next, time-varying, minute-resolution Markov chain based statistical models of different occupant types are developed. Using these behavioral models, individual occupants are simulated to show how an occupant interacts with the major residential energy consuming loads throughout the day. From these simulations, the minimum number of occupants, and consequently the minimum number of multiple-occupant households, needing to be simulated to produce a statistically accurate representation of aggregate residential behavior can be determined. Finally, future work will involve the use of these occupant models alongside residential load models to produce a high-resolution energy consumption profile and estimate the potential for demand response from residential loads.
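
    A minimal sketch of the core simulation step, a time-varying Markov chain over activity states, is shown below in Python; the states and transition matrices are invented stand-ins for the ATUS-fitted, minute-resolution models the paper describes:

        import numpy as np

        STATES = ["away", "sleeping", "cooking"]   # toy subset of the 10 activities

        def transition_matrix(minute_of_day):
            """Time-varying transition probabilities (illustrative, not ATUS-fitted)."""
            if minute_of_day < 6 * 60 or minute_of_day > 22 * 60:   # night hours
                return np.array([[0.90, 0.10, 0.00],
                                 [0.02, 0.97, 0.01],
                                 [0.10, 0.60, 0.30]])
            return np.array([[0.95, 0.01, 0.04],                    # daytime hours
                             [0.10, 0.85, 0.05],
                             [0.20, 0.05, 0.75]])

        def simulate_day(rng, start_state=1):
            """One occupant's activity sequence at minute resolution."""
            state, seq = start_state, []
            for minute in range(24 * 60):
                state = rng.choice(len(STATES), p=transition_matrix(minute)[state])
                seq.append(state)
            return seq

        rng = np.random.default_rng(42)
        day = simulate_day(rng)
        print([STATES[s] for s in day[:10]])   # first 10 minutes of the day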

  12. Air cushioning in drop impact

    NASA Astrophysics Data System (ADS)

    de Ruiter, Jolet; Oh, Jung; van den Ende, Dirk; Mugele, Frieder

    2011-11-01

    Liquid drops impacting on solid surfaces deform under the influence of the ambient gas that needs to be squeezed out before true solid-liquid contact can be established. We demonstrate experimentally the existence of this theoretically predicted air layer and follow its evolution with time for moderate impact speeds (We ~ 1 ... 10) using reflection interference microscopy with a thickness resolution of approximately 10 nm. For a wide range of fluid properties (ρ, γ, η) we find a very robust generic behavior that includes the predicted formation of a dimple in the center of the drop with a local minimum of the air film thickness at its boundary. Depending on We as well as the fluid properties, a skating layer of more or less constant thickness, and a second local minimum of the air film thickness farther away from the drop center, develop in time. Eventually, solid-liquid contact is generated via a random nucleation event. The nucleation spot spreads across the drop-substrate interface within a few milliseconds. This process can lead to the entrapment of an air bubble.

  13. Feedback laws for fuel minimization for transport aircraft

    NASA Technical Reports Server (NTRS)

    Price, D. B.; Gracey, C.

    1984-01-01

    The Theoretical Mechanics Branch has as one of its long-range goals to work toward solving real-time trajectory optimization problems on board an aircraft. This is a generic problem that has application to all aspects of aviation from general aviation through commercial to military. Overall interest is in the generic problem, but specific problems to achieve concrete results are examined. The problem is to develop control laws that generate approximately optimal trajectories with respect to some criteria such as minimum time, minimum fuel, or some combination of the two. These laws must be simple enough to be implemented on a computer that is flown on board an aircraft, which implies a major simplification from the two point boundary value problem generated by a standard trajectory optimization problem. In addition, the control laws allow for changes in end conditions during the flight, and changes in weather along a planned flight path. Therefore, a feedback control law that generates commands based on the current state rather than a precomputed open-loop control law is desired. This requirement, along with the need for order reduction, argues for the application of singular perturbation techniques.

  14. Ion Exchange Method - Diffusion Barrier Investigations

    NASA Astrophysics Data System (ADS)

    Pielak, G.; Szustakowski, M.; Kiezun, A.

    1990-01-01

    The ion exchange method is used to manufacture GRIN-rod lenses. In this process, ion exchange occurs between the bulk glass (rod) and a molten salt. A diffusion barrier was found to exist at the boundary between the glass surface and the molten salt. Investigations of this barrier show that its value varies with ion exchange time and process temperature. When a thallium glass rod was treated in a KNO3 bath, the minimum of the potential occurred at 407°C after 24 h, at 422°C after 48 h, at 438°C after 72 h, and so on. It is therefore possible to keep the diffusion barrier at its minimum by changing the process temperature, making the ion exchange process most effective. The time needed to obtain a suitable refractive index distribution in a process whose temperature was ramped linearly from 400°C to 460°C was about 30% shorter than in a process at a constant temperature of 450°C.

  15. The Fate of Meniscus Tears Left in situ at the time of Anterior Cruciate Ligament Reconstruction: A 6-year Follow-up Study from the MOON Cohort

    PubMed Central

    Duchman, Kyle R.; Westermann, Robert W.; Spindler, Kurt P.; Reinke, Emily K.; Huston, Laura J.; Amendola, Annunziato; Wolf, Brian R.

    2016-01-01

    Background: The management of meniscus tears identified at the time of primary ACL reconstruction is highly variable and includes repair, meniscectomy, and non-treatment. Hypothesis/Purpose: The purpose of this study is to determine the reoperation rate for meniscus tears left untreated at the time of ACL reconstruction with minimum follow-up of 6 years. We hypothesize that small, peripheral tears identified at the time of ACL reconstruction managed with “no treatment” will have successful clinical outcomes. Study Design: Retrospective study of a prospective cohort; Level of Evidence, 3. Methods: Patients with meniscus tears left untreated at the time of primary ACL reconstruction were identified from a multicenter study group with minimum 6-year follow-up. Patient, tear, and reoperation data were obtained for analysis. Need for reoperation was used as the primary endpoint, with analysis performed to determine patient and tear characteristics associated with reoperation. Results: There were 194 patients with 208 meniscus tears (71 medial; 137 lateral) left in situ without treatment with complete follow-up for analysis. Of these, 97.8% of lateral and 94.4% of medial untreated tears required no reoperation. Sixteen tears (7.7%) left in situ without treatment underwent subsequent reoperation: 9 tears (4.3%) underwent reoperation in the setting of revision ACL reconstruction and 7 tears (3.4%) underwent reoperation for isolated meniscus pathology. Patient age was significantly lower in patients requiring reoperation, while tears measuring ≥ 10 mm more frequently required reoperation. Conclusions: Lateral and medial meniscus tears left in situ at the time of ACL reconstruction did not require reoperation at minimum 6-year follow-up for 97.8% and 94.4% of tears, respectively. These findings reemphasize the low reoperation rate following non-treatment of small, peripheral lateral meniscus tears while noting less predictable results for medial meniscus tears left without treatment. PMID:26430058

  16. Minimum Contradictions Physics and Propulsion via Superconducting Magnetic Field Trapping

    NASA Astrophysics Data System (ADS)

    Nassikas, A. A.

    2010-01-01

    All theories are based on axioms, which are obviously arbitrary; e.g., the axioms of SRT, GRT, and QM. Instead of manipulating experience through a new set of arbitrary axioms, it would be useful to search, through a basic tool that we have at our disposal, i.e., logical analysis, for a set of privileged axioms. Physics theories, beyond their particular axioms, can be restated through the basic communication system as consisting of classical logic, the sufficient reason principle, and the anterior-posterior axiom. By means of a theorem, this system can be proven to be contradictory. Persisting with logic is the way to find a set of privileged axioms. This can be achieved on the basis of the claim for minimum contradictions: further axioms beyond those of basic communication imply further contradictions. Thus, minimum contradictions can be achieved when things are described through anterior-posterior terms; the contradictions that do exist act through stochastic space-time, which is matter itself, described through a Ψ wave function and distributed, in a Hypothetical Measuring Field (HMF), through the probability density function P(r, t). On this basis, a space-time QM is obtained, and this QM is a unified theory satisfying the requirements of quantum gravity. There is both a mass-gravitational space-time (g), regarded as real, and a charge-electromagnetic (em) space-time that could be regarded as imaginary. In a closed system, energy conversion-conservation and momentum action take place through photons, which can be regarded as either a (g) or an (em) space-time formation whose rest mass is equal to zero. Universe evolution is described through the interaction of the gravitational (g) with the electromagnetic (em) space-time-matter field and not through any other entities. This methodology implies that there is no need for dark matter. An experiment relative to the (g)+(em) interaction, based on superconducting magnetic field trapping, is proposed to validate this approach.

  17. The Fate of Meniscus Tears Left In Situ at the Time of Anterior Cruciate Ligament Reconstruction: A 6-Year Follow-up Study From the MOON Cohort.

    PubMed

    Duchman, Kyle R; Westermann, Robert W; Spindler, Kurt P; Reinke, Emily K; Huston, Laura J; Amendola, Annunziato; Wolf, Brian R

    2015-11-01

    The management of meniscus tears identified at the time of primary anterior cruciate ligament (ACL) reconstruction is highly variable and includes repair, meniscectomy, and nontreatment. The purpose of this study was to determine the reoperation rate for meniscus tears left untreated at the time of ACL reconstruction with a minimum follow-up of 6 years. The hypothesis was that small peripheral tears identified at the time of ACL reconstruction managed with "no treatment" would have successful clinical outcomes. Cohort study; Level of evidence, 3. Patients with meniscus tears left untreated at the time of primary ACL reconstruction were identified from a multicenter study group with a minimum 6-year follow-up. Patient, tear, and reoperation data were obtained for analysis. The need for reoperation was used as the primary endpoint, with analysis performed to determine patient and tear characteristics associated with reoperation. There were 194 patients with 208 meniscus tears (71 medial, 137 lateral) left in situ without treatment with a complete follow-up for analysis. Of these, 97.8% of lateral and 94.4% of medial untreated tears required no reoperation. Sixteen tears (7.7%) left in situ without treatment underwent subsequent reoperation: 9 tears (4.3%) underwent reoperation in the setting of revision ACL reconstruction, and 7 tears (3.4%) underwent reoperation for an isolated meniscus injury. The patient age was significantly lower in patients requiring reoperation, while tears measuring ≥10 mm more frequently required reoperation. Lateral and medial meniscus tears left in situ at the time of ACL reconstruction did not require reoperation at a minimum 6-year follow-up for 97.8% and 94.4% of tears, respectively. These findings re-emphasize the low reoperation rate after the nontreatment of small, peripheral lateral meniscus tears while noting less predictable results for medial meniscus tears left without treatment. © 2015 The Author(s).

  18. Hearing on H.R. 770, The Family and Medical Leave Act of 1989. Hearing before the Subcommittee on Labor-Management Relations of the Committee on Education and Labor. House of Representatives, One Hundred First Congress, First Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Education and Labor.

    This report presents testimony concerning the Family and Medical Leave Act of 1989. The bill establishes a basic, minimum labor standard, ensuring job protection to workers who need time off to care for themselves or their family members. The testimony covered the following topics: personal experiences of people whose employment was affected after…

  19. A helicopter handling-qualities study of the effects of engine response characteristics, height-control dynamics, and excess power on nap-of-the-Earth operations

    NASA Technical Reports Server (NTRS)

    Corliss, L. D.

    1982-01-01

    The helicopter configuration with an rpm-governed gas-turbine engine was examined. A wide range of engine response time, vehicle damping and sensitivity, and excess power levels was studied. The data are compared with the existing handling-qualities specifications, MIL-F-83300 and AGARD 577, and in general show a need for higher minimums when performing such NOE maneuvers as a dolphin and bob-up task.

  20. Minimum number of measurements for evaluating Bertholletia excelsa.

    PubMed

    Baldoni, A B; Tonini, H; Tardin, F D; Botelho, S C C; Teodoro, P E

    2017-09-27

    Repeatability studies on fruit species are of great importance to identify the minimum number of measurements necessary to accurately select superior genotypes. This study aimed to identify the most efficient method to estimate the repeatability coefficient (r) and predict the minimum number of measurements needed for a more accurate evaluation of Brazil nut tree (Bertholletia excelsa) genotypes based on fruit yield. For this, we assessed the number of fruits and dry mass of seeds of 75 Brazil nut genotypes, from native forest, located in the municipality of Itaúba, MT, for 5 years. To better estimate r, four procedures were used: analysis of variance (ANOVA), principal component analysis based on the correlation matrix (CPCOR), principal component analysis based on the phenotypic variance and covariance matrix (CPCOV), and structural analysis based on the correlation matrix (mean r - AECOR). There was a significant effect of genotypes and measurements, which reveals the need to study the minimum number of measurements for selecting superior Brazil nut genotypes for a production increase. Estimates of r by ANOVA were lower than those observed with the principal component methodology and close to AECOR. The CPCOV methodology provided the highest estimate of r, which resulted in a lower number of measurements needed to identify superior Brazil nut genotypes for the number of fruits and dry mass of seeds. Based on this methodology, three measurements are necessary to predict the true value of the Brazil nut genotypes with a minimum accuracy of 85%.
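
    The ANOVA route to the repeatability coefficient, and the minimum number of measurements it implies, can be written down compactly. The Python sketch below uses synthetic genotype-by-year data (not the Itaúba measurements) together with the standard formulas r = σ²g / (σ²g + σ²e) and n = R²(1 − r) / ((1 − R²) r); the 85% target accuracy mirrors the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    g, m = 75, 5                                   # genotypes, yearly measurements
    true_effect = rng.normal(0, 30, size=(g, 1))   # synthetic genotype effects
    Y = 100 + true_effect + rng.normal(0, 20, size=(g, m))

    grand = Y.mean()
    msg = m * ((Y.mean(axis=1) - grand) ** 2).sum() / (g - 1)               # between-genotype MS
    mse = ((Y - Y.mean(axis=1, keepdims=True)) ** 2).sum() / (g * (m - 1))  # residual MS

    var_g = (msg - mse) / m                        # genotypic variance component
    r = var_g / (var_g + mse)                      # repeatability coefficient

    R2 = 0.85                                      # desired accuracy (85%, as in the study)
    n_min = int(np.ceil(R2 * (1 - r) / ((1 - R2) * r)))
    print(f"r = {r:.3f}, minimum measurements for {R2:.0%} accuracy: {n_min}")
    ```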

  1. [Home health resource utilization measures using a case-mix adjustor model].

    PubMed

    You, Sun-Ju; Chang, Hyun-Sook

    2005-08-01

    The purpose of this study was to measure home health resource utilization using a Case-Mix Adjustor Model developed in the U.S. The subjects of this study were 484 patients who had received home health care on more than 4 visits during a 60-day episode at 31 home health care institutions. Data on the 484 patients had to be merged onto a 60-day payment segment. Based on the results, the researcher classified home health resource groups (HHRGs). The subjects were classified into 34 HHRGs in Korea. Home health resource utilization according to clinical severity was in the order Minimum (C0) < Low (C1) < Moderate (C2) < High (C3), and according to dependency in daily activities in the order Minimum (F0) < High (F3) < Medium (F2) < Low (F1) < Maximum (F4). Resource utilization by HHRG was highest, at 564,735 won, in group C0F0S2 (clinical severity minimum, dependency in daily activity minimum, service utilization moderate), and lowest, at 97,000 won, in group C2F3S1, so the former was 5.82 times higher than the latter. Resource utilization in home health care has become an issue of concern due to rising costs for home health care. The results suggest the need for more analytical attention to the utilization of and expenditures for home care using a Case-Mix Adjustor Model.

  2. Evaluation of AQUI-S(TM) (efficacy and minimum toxic concentration) as a fish anaesthetic/sedative for public aquaculture in the United States

    USGS Publications Warehouse

    Stehly, G.R.; Gingerich, W.H.

    1999-01-01

    A preliminary evaluation of efficacy and minimum toxic concentration of AQUI-S(TM), a fish anaesthetic/sedative, was determined in two size classes of six species of fish important to US public aquaculture (bluegill, channel catfish, lake trout, rainbow trout, walleye and yellow perch). In addition, efficacy and minimum toxic concentration were determined in juvenile-young adult (fish aged 1 year or older) rainbow trout acclimated to water at 7 °C, 12 °C and 17 °C. Testing concentrations were based on determinations made with range-finding studies for both efficacy and minimum toxic concentration. Most of the tested juvenile-young adult fish species were induced in 3 min or less at a nominal AQUI-S(TM) concentration of 20 mg L-1. In juvenile-young adult fish, the minimum toxic concentration was at least 2.5 times the selected efficacious concentration. Three out of five species of fry-fingerlings (1.25-12.5 cm in length and < 1 year old) were induced in ≤ 4.1 min at a nominal concentration of 20 mg L-1 AQUI-S(TM), with the other two species requiring nominal concentrations of 25 and 35 mg L-1 for similar times of induction. Recovery times were ≤ 7.3 min for all species in the two size classes. In fry-fingerlings, the minimum toxic concentration was at least 1.4 times the selected efficacious concentration. There appeared to be little relationship between size of fish and concentrations or times to induction, recovery times and minimum toxic concentration. The times required for induction and for recovery were increased in rainbow trout as the acclimation temperature was reduced.

  3. 20 CFR 416.2055 - Mandatory minimum supplementation reduced.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    .... 416.2055 Section 416.2055 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY... Mandatory minimum supplementation reduced. If for any month after December 1973 there is a change with respect to any special need or special circumstance which, if such change had existed in December 1973...

  4. 20 CFR 416.2055 - Mandatory minimum supplementation reduced.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... 416.2055 Section 416.2055 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY... Mandatory minimum supplementation reduced. If for any month after December 1973 there is a change with respect to any special need or special circumstance which, if such change had existed in December 1973...

  5. The Need for Higher Minimum Staffing Standards in U.S. Nursing Homes

    PubMed Central

    Harrington, Charlene; Schnelle, John F.; McGregor, Margaret; Simmons, Sandra F.

    2016-01-01

    Many U.S. nursing homes have serious quality problems, in part, because of inadequate levels of nurse staffing. This commentary focuses on two issues. First, there is a need for higher minimum nurse staffing standards for U.S. nursing homes based on multiple research studies showing a positive relationship between nursing home quality and staffing and the benefits of implementing higher minimum staffing standards. Studies have identified the minimum staffing levels necessary to provide care consistent with the federal regulations, but many U.S. facilities have dangerously low staffing. Second, the barriers to staffing reform are discussed. These include economic concerns about costs and a focus on financial incentives. The enforcement of existing staffing standards has been weak, and strong nursing home industry political opposition has limited efforts to establish higher standards. Researchers should study the ways to improve staffing standards and new payment, regulatory, and political strategies to improve nursing home staffing and quality. PMID:27103819

  6. Associations between state minimum wage policy and health care access: a multi-level analysis of the 2004 Behavioral Risk Factor survey.

    PubMed

    McCarrier, Kelly P; Martin, Diane P; Ralston, James D; Zimmerman, Frederick J

    2010-05-01

    Minimum wage policies have been advanced as mechanisms to improve the economic conditions of the working poor. Both positive and negative effects of such policies on health care access have been hypothesized, but associations have yet to be thoroughly tested. To examine whether the presence of minimum wage policies in excess of the federal standard of $5.15 per hour was associated with health care access indicators among low-skilled adults of working age, a cross-sectional analysis of 2004 Behavioral Risk Factor Surveillance System data was conducted. Self-reported health insurance status and experience with cost-related barriers to needed medical care were adjusted in multi-level logistic regression models to control for potential confounding at the state, county, and individual levels. State-level wage policy was not found to be associated with insurance status or unmet medical need in the models, providing early evidence that increased minimum wage rates may neither strengthen nor weaken access to care as previously predicted.

  7. Nonlinear Control Theory for Missile Autopilot Design.

    DTIC Science & Technology

    1987-04-24

    A minimum-time controller which includes constraints on both controls and angle-of-attack is developed and an example is given. … In this case, some ideas from robotics on minimum-time trajectory planning under path constraints (see, e.g., Rajan (1985), Sahar and …) … Auto Cont., Vol. AC-29, No. 4, p. 361. Rajan, V.T. (1985), "Minimum-Time Trajectory Planning", Proc. IEEE Robotics and Automation Conf., St. Louis. Reed …

  8. An Open-Source Arduino-based Controller for Mechanical Rain Simulators

    NASA Astrophysics Data System (ADS)

    Cantilina, K. K.

    2017-12-01

    Many commercial rain simulators currently used in hydrology rely on inflexible and outdated controller designs. These analog controllers typically only allow a handful of discrete parameter options, and do not support internal timing functions or continuously-changing parameters. A desire for finer control of rain simulation events necessitated the design and construction of a microcontroller-based controller, using widely available off-the-shelf components. A menu driven interface allows users to fine-tune simulation parameters without the need for training or experience with microcontrollers, and the accessibility of the Arduino IDE allows users with a minimum of programming and hardware experience to modify the controller program to suit the needs of individual experiments.

  9. A novel gene network inference algorithm using predictive minimum description length approach.

    PubMed

    Chaitankar, Vijender; Ghosh, Preetam; Perkins, Edward J; Gong, Ping; Deng, Youping; Zhang, Chaoyang

    2010-05-28

    Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is to determine the threshold which defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of model length and data encoding length. A user-specified fine tuning parameter is used as control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we proposed a new inference algorithm which incorporated mutual information (MI), conditional mutual information (CMI) and predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes and the PMDL principle method attempts to determine the best MI threshold without the need of a user-specified fine tuning parameter. The performance of the proposed algorithm was evaluated using both synthetic time series data sets and a biological time series data set for the yeast Saccharomyces cerevisiae. The benchmark quantities precision and recall were used as performance measures. The results show that the proposed algorithm produced less false edges and significantly improved the precision, as compared to the existing algorithm. For further analysis the performance of the algorithms was observed over different sizes of data. We have proposed a new algorithm that implements the PMDL principle for inferring gene regulatory networks from time series DNA microarray data that eliminates the need of a fine tuning parameter. The evaluation results obtained from both synthetic and actual biological data sets show that the PMDL principle is effective in determining the MI threshold and the developed algorithm improves precision of gene regulatory network inference. Based on the sensitivity analysis of all tested cases, an optimal CMI threshold value has been identified. Finally it was observed that the performance of the algorithms saturates at a certain threshold of data size.
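
    A minimal sketch of the information-theoretic core, assuming discretized expression data: pairwise mutual information is computed and edges above a cutoff are kept. The fixed threshold below is only a placeholder for the value the PMDL principle would select, and the data are synthetic; the helper names are hypothetical.

    ```python
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def discretize(x, bins=3):
        """Equal-frequency binning of one gene's expression time series."""
        cuts = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
        return np.searchsorted(cuts, x)

    def mi_network(expr, threshold=0.2):
        """expr: (genes x timepoints) array. Returns a boolean adjacency matrix."""
        n = expr.shape[0]
        disc = np.array([discretize(row) for row in expr])
        adj = np.zeros((n, n), dtype=bool)
        for i in range(n):
            for j in range(i + 1, n):
                mi = mutual_info_score(disc[i], disc[j])
                adj[i, j] = adj[j, i] = mi > threshold  # PMDL would set this cutoff
        return adj

    expr = np.random.default_rng(2).normal(size=(10, 50))  # toy data: 10 genes, 50 timepoints
    print(mi_network(expr).sum() // 2, "edges")
    ```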

  10. 26 CFR 5c.168(f)(8)-4 - Minimum investment of lessor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 14 2010-04-01 2010-04-01 false Minimum investment of lessor. 5c.168(f)(8)-4....168(f)(8)-4 Minimum investment of lessor. (a) Minimum investment. Under section 168(f)(8)(B)(ii), an... has a minimum at risk investment which, at the time the property is placed in service under the lease...

  11. 26 CFR 5c.168(f)(8)-4 - Minimum investment of lessor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 14 2011-04-01 2010-04-01 true Minimum investment of lessor. 5c.168(f)(8)-4....168(f)(8)-4 Minimum investment of lessor. (a) Minimum investment. Under section 168(f)(8)(B)(ii), an... has a minimum at risk investment which, at the time the property is placed in service under the lease...

  12. Laser ignition of liquid petroleum gas at elevated pressures

    NASA Astrophysics Data System (ADS)

    Loktionov, E.; Pasechnikov, N.; Telekh, V.

    2017-11-01

    Recent development of laser spark plugs for internal combustion engines has revealed a lack of data on laser ignition of fuel mixtures at multi-bar pressures, data needed for optimizing laser pulse energy and focusing. Methane- and hydrogen-based mixtures are comparatively well investigated, but propane- and butane-based mixtures (LPG), which are widely used in vehicles, remain almost unstudied. Optical breakdown thresholds in gases decrease with increasing pressure up to ca. 100 bar, but breakdown is not a sufficient condition for combustion ignition. The minimum ignition energy (MIE) therefore becomes more important for combustion core onset, and its dependence on mixture composition and pressure has several important features; for example, unlike the breakdown threshold, it depends only weakly on laser pulse length, at least in the pico- to microsecond range. We have experimentally determined the minimum picosecond laser pulse energies (an MIE-related value) needed for ignition of LPG-based mixtures at equivalence ratios of 1.0 to 1.6 and pressures of 1.0 to 3.5 bar. In addition to the expected decrease in these values, a broadening of the low-energy flammability range was found as pressure increased. Laser ignition of LPG in a Wankel rotary engine is reported for the first time.

  13. Climate Change and Its Impact on the Yield of Major Food Crops: Evidence from Pakistan

    PubMed Central

    Ali, Sajjad; Liu, Ying; Ishaq, Muhammad; Shah, Tariq; Abdullah; Ilyas, Aasir; Din, Izhar Ud

    2017-01-01

    Pakistan is vulnerable to climate change, and extreme climatic conditions are threatening food security. This study examines the effects of climate change (e.g., maximum temperature, minimum temperature, rainfall, relative humidity, and the sunshine) on the major crops of Pakistan (e.g., wheat, rice, maize, and sugarcane). The methods of feasible generalized least square (FGLS) and heteroscedasticity and autocorrelation (HAC) consistent standard error were employed using time series data for the period 1989 to 2015. The results of the study reveal that maximum temperature adversely affects wheat production, while the effect of minimum temperature is positive and significant for all crops. Rainfall effect towards the yield of a selected crop is negative, except for wheat. To cope with and mitigate the adverse effects of climate change, there is a need for the development of heat- and drought-resistant high-yielding varieties to ensure food security in the country. PMID:28538704

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loef, P.A.; Smed, T.; Andersson, G.

    The minimum singular value of the power flow Jacobian matrix has been used as a static voltage stability index, indicating the distance between the studied operating point and the steady state voltage stability limit. In this paper a fast method to calculate the minimum singular value and the corresponding (left and right) singular vectors is presented. The main advantages of the developed algorithm are the small amount of computation time needed, and that it only requires information available from an ordinary program for power flow calculations. Furthermore, the proposed method fully utilizes the sparsity of the power flow Jacobian matrix and hence the memory requirements for the computation are low. These advantages are preserved when applied to various submatrices of the Jacobian matrix, which can be useful in constructing special voltage stability indices. The developed algorithm was applied to small test systems as well as to a large (real size) system with over 1000 nodes, with satisfactory results.
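
    One sparsity-exploiting way to obtain the minimum singular value and its singular vectors is inverse power iteration reusing a single sparse LU factorization, as in the Python sketch below. This follows the spirit of the method described (low computation time and memory); the paper's exact algorithm may differ, and the test matrix here is random rather than a power-flow Jacobian.

    ```python
    import numpy as np
    from scipy.sparse import random as sprandom, identity
    from scipy.sparse.linalg import splu

    def min_singular_triplet(J, iters=200, tol=1e-10, seed=0):
        lu = splu(J.tocsc())                      # factorize once, reuse every iteration
        v = np.random.default_rng(seed).normal(size=J.shape[0])
        v /= np.linalg.norm(v)
        sigma = None
        for _ in range(iters):
            # u = (J^T J)^{-1} v, computed as J^{-1} (J^{-T} v)
            u = lu.solve(lu.solve(v, trans='T'))
            new_sigma = 1.0 / np.sqrt(np.linalg.norm(u))
            v = u / np.linalg.norm(u)             # converges to the right singular vector
            if sigma is not None and abs(new_sigma - sigma) < tol:
                break
            sigma = new_sigma
        left = J @ v / sigma                      # corresponding left singular vector
        return sigma, left, v

    J = (sprandom(500, 500, density=0.01, random_state=3) + identity(500)).tocsc()
    sigma_min, u, v = min_singular_triplet(J)
    print("sigma_min ~", sigma_min)
    ```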

  15. Climate Change and Its Impact on the Yield of Major Food Crops: Evidence from Pakistan.

    PubMed

    Ali, Sajjad; Liu, Ying; Ishaq, Muhammad; Shah, Tariq; Abdullah; Ilyas, Aasir; Din, Izhar Ud

    2017-05-24

    Pakistan is vulnerable to climate change, and extreme climatic conditions are threatening food security. This study examines the effects of climate change (e.g., maximum temperature, minimum temperature, rainfall, relative humidity, and the sunshine) on the major crops of Pakistan (e.g., wheat, rice, maize, and sugarcane). The methods of feasible generalized least square (FGLS) and heteroscedasticity and autocorrelation (HAC) consistent standard error were employed using time series data for the period 1989 to 2015. The results of the study reveal that maximum temperature adversely affects wheat production, while the effect of minimum temperature is positive and significant for all crops. Rainfall effect towards the yield of a selected crop is negative, except for wheat. To cope with and mitigate the adverse effects of climate change, there is a need for the development of heat- and drought-resistant high-yielding varieties to ensure food security in the country.

  16. Necessity to review the Brazilian regulation about fluoride toothpastes

    PubMed Central

    Cury, Jaime Aparecido; Caldarelli, Pablo Guilherme; Tenuta, Livia Maria Andaló

    2015-01-01

    The aim of this study was to evaluate the adequacy of the Brazilian legislation on fluoride toothpastes. A search was conducted in the LILACS, Medline and SciELO databases for studies on the fluoride concentration found in Brazilian toothpastes, using health descriptors. Publications since 1981 have shown that some Brazilian toothpastes are not able to maintain, throughout their shelf life, a minimum of 1,000 ppm F of soluble fluoride in the formulation. However, the Brazilian regulation (ANVISA, Resolution 79, August 28, 2000) only sets the maximum total fluoride (0.15%; 1,500 ppm F) that a toothpaste may contain, not the minimum concentration of soluble fluoride that it should contain to have anticaries potential, which according to systematic reviews should be 1,000 ppm F. Therefore, the Brazilian regulation on fluoride toothpastes needs to be revised to assure the efficacy of those products for caries control. PMID:26487295

  17. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs with respect to perceived color and homogeneity in commercial ray tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method that creates spectral rayfiles from a minimum of time-consuming goniophotometric near-field measurements with optical filters. Our discussion focuses on selecting the ideal optical filter combination for any arbitrary spectrum out of a given filter set by considering measurement uncertainties with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on factorial design.

  18. Plasma plume expansion dynamics in nanosecond Nd:YAG laserosteotome

    NASA Astrophysics Data System (ADS)

    Abbasi, Hamed; Rauter, Georg; Guzman, Raphael; Cattin, Philippe C.; Zam, Azhar

    2018-02-01

    In minimally invasive laser osteotomy, precise information about the ablation process can be obtained with LIBS in order to avoid carbonization or cutting the wrong type of tissue. The collecting fiber for LIBS therefore needs to be optimally placed in the narrow cavities of the endoscope. To determine this optimal placement, the plasma plume expansion dynamics in the ablation of bone tissue by the second harmonic of a nanosecond Nd:YAG laser at 532 nm have been studied. The laser-induced plasma plume was monitored at different time delays, from one nanosecond up to one hundred microseconds. Measurements were performed using high-speed gated illumination imaging. The expansion features were studied by imaging the overall visible emission with a gated intensified charge-coupled device (ICCD). The camera was capable of a minimum gate width (optical FWHM) of 3 ns and a timing resolution (minimum temporal shift of the gate) of 10 ps. The imaging data were used to generate position-time data of the luminous plasma front. Moreover, the velocity of the plasma plume expansion was studied based on the time-resolved intensity data. By knowing the plasma plume profile over time, the optimum position (axial distance from the laser spot) of the collecting fiber and the optimal time delay (for the best signal-to-noise ratio) in spatially and temporally resolved laser-induced breakdown spectroscopy (LIBS) can be determined. Additionally, the function of plasma plume expansion could be used to study the shock wave of the plasma plume.

  19. Minimum Competency Testing (MCT). Some Remarks.

    ERIC Educational Resources Information Center

    Howell, John F.

    The effort to institute minimum competency testing (MCT) is nearly universal despite the need to debate its basic definitions, implications, and consequences beforehand. There are seven distinct reasons for the MCT movement: (1) legislative zeal; (2) unfavorable allegations by local and national press; (3) economic accountability; (4) the…

  20. 23 CFR 970.210 - Federal lands bridge management system (BMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Section 970.210 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS... needs using, as a minimum, the following components: (1) A database and an ongoing program for the... BMS. The minimum BMS database shall include: (i) Data described by the inventory section of the...

  1. Expedite random structure searching using objects from Wyckoff positions

    NASA Astrophysics Data System (ADS)

    Wang, Shu-Wei; Hsing, Cheng-Rong; Wei, Ching-Ming

    2018-02-01

    Random structure searching has proved to be a powerful approach for finding the global minimum and metastable structures. A truly random sampling is in principle needed, yet it would be highly time-consuming or practically impossible to find the global minimum for complicated systems in their high-dimensional configuration space. Thus the implementation of reasonable constraints, such as adopting system symmetries to reduce the independent dimensions of structural space and/or imposing chemical information to reach and relax into low-energy regions, is the most essential issue in the approach. In this paper, we propose the concept of an "object", which is either an atom or a set of atoms (such as a molecule or a carbonate group) carrying a symmetry defined by one of the Wyckoff positions of the space group; this allows the search for the global minimum of a complicated system to be confined to a greatly reduced structural space and become accessible in practice. We examined several representative materials, including the Cd3As2 crystal, solid methanol, high-pressure carbonates (FeCO3), and the Si(111)-7 × 7 reconstructed surface, to demonstrate the power and the advantages of using the "object" concept in random structure searching.

  2. Minimum tailwater flows in relation to habitat suitability and sport-fish harvest

    USGS Publications Warehouse

    Jacobs, K.E.; Swink, W.D.; Novotny, J.F.

    1987-01-01

    The instream flow needs of four sport fishes (rainbow trout Salmo gairdneri, channel catfish Ictalurus punctatus, smallmouth bass Micropterus dolomieui, and white crappie Pomoxis annularis) were evaluated in the tailwater below Green River Lake, Kentucky. The Newcombe method, a simple procedure developed in British Columbia that is based on the distribution of water depths and velocities at various flows, was used to predict usable habitat at seven flows. Predicted usable habitat was two to six times greater for rainbow trout than for any of the other species at all flows. Angler harvest corresponded to the predicted abundance for rainbow trout and smallmouth bass, but the catch of channel catfish and white crappies was seasonally greater than expected. The presence of the dam and reservoir apparently disrupted the normal movement and feeding patterns of these species and periodically overrode the relation between usable habitat and abundance assumed in the Newcombe method. The year-round minimum flow of 4.6 m3/s recommended for the tailwater would generally increase the amount of habitat available in the tailwater from April through October, and the minimum flow of 2.4 m3/s recommended for periods of drought would allow the maintenance of a trout fishery.

  3. The minimum test battery to screen for binocular vision anomalies: report 3 of the BAND study.

    PubMed

    Hussaindeen, Jameel Rizwana; Rakshit, Archayeeta; Singh, Neeraj Kumar; Swaminathan, Meenakshi; George, Ronnie; Kapur, Suman; Scheiman, Mitchell; Ramani, Krishna Kumar

    2018-03-01

    This study aims to report the minimum test battery needed to screen non-strabismic binocular vision anomalies (NSBVAs) in a community set-up. When large numbers are to be screened we aim to identify the most useful test battery when there is no opportunity for a more comprehensive and time-consuming clinical examination. The prevalence estimates and normative data for binocular vision parameters were estimated from the Binocular Vision Anomalies and Normative Data (BAND) study, following which cut-off estimates and receiver operating characteristic curves to identify the minimum test battery have been plotted. In the receiver operating characteristic phase of the study, children between nine and 17 years of age were screened in two schools in the rural arm using the minimum test battery, and the prevalence estimates with the minimum test battery were found. Receiver operating characteristic analyses revealed that near point of convergence with penlight and red filter (> 7.5 cm), monocular accommodative facility (< 10 cycles per minute), and the difference between near and distance phoria (> 1.25 prism dioptres) were significant factors with cut-off values for best sensitivity and specificity. This minimum test battery was applied to a cohort of 305 children. The mean (standard deviation) age of the subjects was 12.7 (two) years with 121 males and 184 females. Using the minimum battery of tests obtained through the receiver operating characteristic analyses, the prevalence of NSBVAs was found to be 26 per cent. Near point of convergence with penlight and red filter > 10 cm was found to have the highest sensitivity (80 per cent) and specificity (73 per cent) for the diagnosis of convergence insufficiency. For the diagnosis of accommodative infacility, monocular accommodative facility with a cut-off of less than seven cycles per minute was the best predictor for screening (92 per cent sensitivity and 90 per cent specificity). The minimum test battery of near point of convergence with penlight and red filter, difference between distance and near phoria, and monocular accommodative facility yield good sensitivity and specificity for diagnosis of NSBVAs in a community set-up. © 2017 Optometry Australia.
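
    The reported cut-offs translate directly into a screening rule. The Python sketch below assumes a child is referred if any one of the three tests exceeds its cut-off (whether the battery combines the tests this way is an assumption); the function and argument names are hypothetical, while the thresholds are the ones quoted in the abstract.

    ```python
    def screen_nsbva(npc_cm, maf_cpm, phoria_diff_pd):
        """Flag a child for referral using the minimum test battery cut-offs:
        near point of convergence (penlight + red filter) > 7.5 cm,
        monocular accommodative facility < 10 cycles per minute,
        near-distance phoria difference > 1.25 prism dioptres."""
        return (npc_cm > 7.5) or (maf_cpm < 10) or (phoria_diff_pd > 1.25)

    # Example: an NPC of 9 cm alone is enough to refer under this rule.
    print(screen_nsbva(npc_cm=9.0, maf_cpm=11, phoria_diff_pd=1.0))  # True
    ```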

  4. Financial Implications of Half- and Full-Time Employment for Persons with Disabilities.

    ERIC Educational Resources Information Center

    Schloss, Patrick J.; And Others

    1987-01-01

    Balance sheets comparing yearly income and expenses were developed for three disabled worker situations: no earned income, half-time minimum-wage job, and full-time minimum-wage job. Net disposable income was comparable for part-time and full-time disabled workers, since eligibility for Medicaid, Food Stamps, and Supplemental Security Income was…

  5. Literature review of the benefits and obstacle of horizontal directional drilling

    NASA Astrophysics Data System (ADS)

    Norizam, M. S. Mohd; Nuzul Azam, H.; Helmi Zulhaidi, S.; Aziz, A. Abdul; Nadzrol Fadzilah, A.

    2017-11-01

    In this new era, the construction industry must not only complete projects within budget, on time, and at acceptable quality and safety; stakeholders, especially local authorities and the public, also recognise the important need for sustainable construction methods, so that our younger generations inherit, if not a better, at least a safer world in which to live and raise their children. Horizontal Directional Drilling (HDD) is the most widely recognised trenchless utility method and a preferred construction method in this age. Among its advantages, HDD causes less disturbance to traffic, the public, business activities and neighbourhoods, lowers restoration cost, produces less noise and dust, and requires minimal import/export of construction materials. In addition, HDD can drill through congested utility areas with minimal cutting and in a shorter time. This paper aims to appraise the benefits and obstacles of the HDD method in the construction industry. It is an endeavour to answer local authorities' call for an alternative method that causes less damage to roads and road furniture and generates fewer public complaints compared with the conventional open-cut method. HDD also appears to be in line with sustainable development requirements, e.g., reduce, reuse, recycle. Hence, it is important to determine the benefit and obstacle factors of HDD implementation. The factors are based on a literature review conducted by the authors on the subject matter, gathered from previous studies, journals, textbooks, guidelines, magazine articles, newspaper cuttings, etc.

  6. Barriers and dispersal surfaces in minimum-time interception. [for optimizing aircraft flight paths

    NASA Technical Reports Server (NTRS)

    Rajan, N.; Ardema, M. D.

    1984-01-01

    A method is proposed for mapping the barrier, dispersal, and control-level surfaces for a class of minimum-time interception and pursuit-evasion problems. Minimum-time interception of a target moving in a horizontal plane is formulated in a coordinate system whose origin is at the interceptor's terminal position and whose x-axis is along the terminal line of sight. This approach makes it possible to discuss the nature of the interceptor's extremals, using its extremal trajectory maps (ETMs), independently of target motion. The game surfaces are constructed by drawing sections of the isochrones, or constant minimum-time loci, from the interceptor and target ETMs. In this way, feedback solutions for the optimal controls are obtained. An example involving the interception of a target moving in a straight line at constant speed is presented.

  7. 34 CFR 668.3 - Academic year.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...— (1)(i) For a program offered in credit hours, a minimum of 30 weeks of instructional time; or (ii) For a program offered in clock hours, a minimum of 26 weeks of instructional time; and (2) For an undergraduate educational program, an amount of instructional time whereby a full-time student is expected to...

  8. Considering the dynamic refueling behavior in locating electric vehicle charging stations

    NASA Astrophysics Data System (ADS)

    Liu, K.; Sun, X. H.

    2014-11-01

    Electric vehicles (EVs) will certainly play an important role in addressing current energy and environmental challenges. However, the location of EV charging stations is recognized as one of the key issues of an EV launching strategy. When locating EV charging stations, more influencing factors and constraints need to be considered, since EVs have some special attributes. The minimum requested charging time for EVs is usually more than 30 minutes; therefore, the possible delay due to waiting for or searching for an available station is one of the most important influencing factors. In addition, the intention to purchase and use EVs, which also affects the location of charging stations, is distributed unevenly among regions and should be considered when modelling. Unfortunately, these kinds of time-spatial constraints have often been ignored in previous models. Based on related research on refuelling behaviour and refuelling demand, this paper develops a new concept with the dual objectives of minimum waiting time and maximum service accessibility for locating EV charging stations, named the Time-Spatial Location Model (TSLM). The proposed model and the traditional flow-capturing location model are each applied to an example network and the results are compared. The results demonstrate that the time constraint has great effects on the location of EV charging stations. The proposed model has some obvious advantages and will help energy providers make a viable plan for a network of EV charging stations.
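
    As a generic illustration of siting under an access-time constraint (a greedy maximal-covering heuristic, not the paper's TSLM formulation), the Python sketch below picks station sites that cover the most charging demand within an acceptable travel time; all travel times, demand weights, and thresholds are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_demand, n_sites, k = 60, 15, 4        # demand points, candidate sites, stations to build
    travel_min = rng.uniform(2, 40, size=(n_demand, n_sites))
    demand = rng.uniform(1, 10, size=n_demand)   # e.g., expected EVs needing a charge
    MAX_ACCESS = 15.0                            # minutes considered acceptable

    covered = np.zeros(n_demand, dtype=bool)
    chosen = []
    for _ in range(k):
        # marginal demand newly covered by each unchosen candidate site
        gain = [demand[(travel_min[:, s] <= MAX_ACCESS) & ~covered].sum()
                if s not in chosen else -1.0 for s in range(n_sites)]
        best = int(np.argmax(gain))
        chosen.append(best)
        covered |= travel_min[:, best] <= MAX_ACCESS

    print("sites:", chosen, "fraction of demand covered:", demand[covered].sum() / demand.sum())
    ```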

  9. Performance evaluation of the inverse dynamics method for optimal spacecraft reorientation

    NASA Astrophysics Data System (ADS)

    Ventura, Jacopo; Romano, Marcello; Walter, Ulrich

    2015-05-01

    This paper investigates the application of the inverse dynamics in the virtual domain method to Euler angles, quaternions, and modified Rodrigues parameters for rapid optimal attitude trajectory generation for spacecraft reorientation maneuvers. The impact of the virtual domain and attitude representation is numerically investigated for both minimum time and minimum energy problems. Owing to the nature of the inverse dynamics method, it yields sub-optimal solutions for minimum time problems. Furthermore, the virtual domain improves the optimality of the solution, but at the cost of more computational time. The attitude representation also affects solution quality and computational speed. For minimum energy problems, the optimal solution can be obtained without the virtual domain with any considered attitude representation.

  10. Sunspot variation and selected associated phenomena: A look at solar cycle 21 and beyond

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.

    1982-01-01

    Solar sunspot cycles 8 through 21 are reviewed. Mean time intervals are calculated for maximum to maximum, minimum to minimum, minimum to maximum, and maximum to minimum phases for cycles 8 through 20 and 8 through 21. Simple cosine functions with a period of 132 years are compared to, and found to be representative of, the variation of smoothed sunspot numbers at solar maximum and minimum. A comparison of cycles 20 and 21 is given, leading to a projection for activity levels during the Spacelab 2 era (tentatively, November 1984). A prediction is made for cycle 22. Major flares are observed to peak several months subsequent to the solar maximum during cycle 21 and to be at minimum level several months after the solar minimum. Additional remarks are given for flares, gradual rise and fall radio events and 2800 MHz radio emission. Certain solar activity parameters, especially as they relate to the near term Spacelab 2 time frame are estimated.

  11. Optimal Reservoir Operation using Stochastic Model Predictive Control

    NASA Astrophysics Data System (ADS)

    Sahu, R.; McLaughlin, D.

    2016-12-01

    Hydropower operations are typically designed to fulfill contracts negotiated with consumers who need reliable energy supplies, despite uncertainties in reservoir inflows. In addition to providing reliable power the reservoir operator needs to take into account environmental factors such as downstream flooding or compliance with minimum flow requirements. From a dynamical systems perspective, the reservoir operating strategy must cope with conflicting objectives in the presence of random disturbances. In order to achieve optimal performance, the reservoir system needs to continually adapt to disturbances in real time. Model Predictive Control (MPC) is a real-time control technique that adapts by deriving the reservoir release at each decision time from the current state of the system. Here an ensemble-based version of MPC (SMPC) is applied to a generic reservoir to determine both the optimal power contract, considering future inflow uncertainty, and a real-time operating strategy that attempts to satisfy the contract. Contract selection and real-time operation are coupled in an optimization framework that also defines a Pareto trade off between the revenue generated from energy production and the environmental damage resulting from uncontrolled reservoir spills. Further insight is provided by a sensitivity analysis of key parameters specified in the SMPC technique. The results demonstrate that SMPC is suitable for multi-objective planning and associated real-time operation of a wide range of hydropower reservoir systems.
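
    The receding-horizon logic of ensemble-based SMPC can be illustrated compactly: at each step, optimize releases over a short horizon against a sampled inflow ensemble, apply only the first release, and repeat. The Python sketch below is a toy single-reservoir stand-in, not the paper's formulation; dynamics, bounds, prices, penalty weights, and the gamma inflow model are all illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    H, S_MAX, R_MAX = 6, 100.0, 15.0          # horizon, storage cap, release cap
    PRICE, SPILL_PEN = 1.0, 5.0               # revenue per unit release, spill penalty
    rng = np.random.default_rng(7)

    def expected_cost(releases, s0, inflow_scenarios):
        """Spill penalty minus revenue, averaged over the inflow ensemble."""
        cost = 0.0
        for inflows in inflow_scenarios:
            s = s0
            for r, q in zip(releases, inflows):
                s = s + q - r
                spill = max(s - S_MAX, 0.0)    # uncontrolled spill above capacity
                s = min(max(s, 0.0), S_MAX)
                cost += SPILL_PEN * spill - PRICE * r
        return cost / len(inflow_scenarios)

    storage, trace = 60.0, []
    for t in range(20):                        # receding-horizon loop
        scen = rng.gamma(4.0, 2.0, size=(30, H))   # 30 sampled inflow sequences
        res = minimize(expected_cost, x0=np.full(H, 5.0), args=(storage, scen),
                       bounds=[(0.0, R_MAX)] * H, method='L-BFGS-B')
        release = res.x[0]                     # apply only the first decision
        storage = min(max(storage + rng.gamma(4.0, 2.0) - release, 0.0), S_MAX)
        trace.append(release)
    print(np.round(trace, 2))
    ```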

  12. 40 CFR 63.1365 - Test methods and initial compliance procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... design minimum and average temperature in the combustion zone and the combustion zone residence time. (B... establish the design minimum and average flame zone temperatures and combustion zone residence time, and... carbon bed temperature after regeneration, design carbon bed regeneration time, and design service life...

  13. Long-term trends in daily temperature extremes in Iraq

    NASA Astrophysics Data System (ADS)

    Salman, Saleem A.; Shahid, Shamsuddin; Ismail, Tarmizi; Chung, Eun-Sung; Al-Abadi, Alaa M.

    2017-12-01

    The existence of long-term persistence (LTP) in hydro-climatic time series can lead to considerable change in the significance of trends. Therefore, past findings of climatic trend studies that did not consider LTP have become disputable. A study was conducted to assess the trends in temperature and temperature extremes in Iraq in recent years (1965-2015) using both the ordinary Mann-Kendall (MK) test and the modified Mann-Kendall (m-MK) test, which can differentiate multi-decadal oscillatory variations from secular trends. Trends in annual and seasonal minimum and maximum temperatures, diurnal temperature range (DTR), and 14 temperature-related extremes were assessed. The MK test detected significant increases in minimum and maximum temperature at all stations, whereas the m-MK test detected them at 86% and 80% of the stations, respectively. The temperature in Iraq is increasing 2 to 7 times faster than the global temperature. The minimum temperature is increasing more (0.48-1.17 °C/decade) than the maximum temperature (0.25-1.01 °C/decade). The temperature rise is higher in northern Iraq and in summer. Hot extremes, particularly warm nights, are increasing all over Iraq at a rate of 2.92-10.69 days/decade. On the other hand, the number of cold days is decreasing at some stations at a rate of -2.65 to -8.40 days/decade. The use of the m-MK test along with the MK test confirms the significant increase in temperature and in some of the temperature extremes in Iraq. This study suggests that the trends in many temperature extremes in the region estimated in previous studies using the MK test may be due to natural climate variability, which emphasizes the need to validate trends by considering LTP in time series.
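
    For reference, the ordinary MK test reduces to a few lines; the Python sketch below implements the no-ties form of the statistic. The m-MK correction for long-term persistence (rescaling the variance of S for autocorrelation) is deliberately omitted, and the temperature series is synthetic.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Ordinary Mann-Kendall trend test (no-ties variance). Returns (Z, p)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
        p = 2 * (1 - norm.cdf(abs(z)))                            # two-sided p-value
        return z, p

    years = np.arange(1965, 2016)
    tmin = 20 + 0.06 * (years - 1965) + np.random.default_rng(5).normal(0, 0.5, years.size)
    print(mann_kendall(tmin))  # a trend of this size should come out significant
    ```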

  14. Time and frequency constrained sonar signal design for optimal detection of elastic objects.

    PubMed

    Hamschin, Brandon; Loughlin, Patrick J

    2013-04-01

    In this paper, the task of model-based transmit signal design for optimizing detection is considered. Building on past work that designs the spectral magnitude for optimizing detection, two methods for synthesizing minimum duration signals with this spectral magnitude are developed. The methods are applied to the design of signals that are optimal for detecting elastic objects in the presence of additive noise and self-noise. Elastic objects are modeled as linear time-invariant systems with known impulse responses, while additive noise (e.g., ocean noise or receiver noise) and acoustic self-noise (e.g., reverberation or clutter) are modeled as stationary Gaussian random processes with known power spectral densities. The first approach finds the waveform that preserves the optimal spectral magnitude while achieving the minimum temporal duration. The second approach yields a finite-length time-domain sequence by maximizing temporal energy concentration, subject to the constraint that the spectral magnitude is close (in a least-squares sense) to the optimal spectral magnitude. The two approaches are then connected analytically, showing the former is a limiting case of the latter. Simulation examples that illustrate the theory are accompanied by discussions that address practical applicability and how one might satisfy the need for target and environmental models in the real-world.
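
    A toy illustration of the first design goal, shortening duration while preserving a prescribed spectral magnitude, is an alternating-projection (phase-retrieval-style) loop. This heuristic is not the synthesis method derived in the paper, and the target magnitude below is an arbitrary band shape rather than a detection-optimal one.

    ```python
    import numpy as np

    N = 512
    # Assumed "optimal" |X(f)|: a symmetric Gaussian band (placeholder target).
    target_mag = np.exp(-0.5 * ((np.abs(np.fft.fftfreq(N)) - 0.1) / 0.02) ** 2)
    support = np.zeros(N, dtype=bool)
    support[:64] = True                              # enforce a 64-sample duration

    x = np.random.default_rng(6).normal(size=N)
    for _ in range(300):
        X = np.fft.fft(x)
        X = target_mag * np.exp(1j * np.angle(X))    # project onto the magnitude constraint
        x = np.real(np.fft.ifft(X))
        x[~support] = 0.0                            # project onto the duration constraint

    err = np.linalg.norm(np.abs(np.fft.fft(x)) - target_mag) / np.linalg.norm(target_mag)
    print(f"relative spectral-magnitude error after projection: {err:.2%}")
    ```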

  15. Operative needs in HIV+ populations: An estimation for sub-Saharan Africa.

    PubMed

    Cherewick, Megan L; Cherewick, Steven D; Kushner, Adam L

    2017-05-01

    In 2015, it was estimated that approximately 36.7 million people were living with HIV globally and approximately 25.5 million of those people were living in sub-Saharan Africa. Limitations in the availability and access to adequate operative care require policy and planning to enhance operative capacity. Data estimating the total number of persons living with HIV by country, sex, and age group were obtained from the Joint United Nations Programme on HIV/AIDS (UNAIDS) in 2015. Using minimum proposed surgical rates per 100,000 for 4, defined, sub-Saharan regions of Africa, country-specific and regional estimates were calculated. The total need and unmet need for operative procedures were estimated. A minimum of 1,539,138 operative procedures were needed in 2015 for the 25.5 million persons living with HIV in sub-Saharan Africa. In 2015, there was an unmet need of 908,513 operative cases in sub-Saharan Africa with the greatest unmet need in eastern sub-Saharan Africa (427,820) and western sub-Saharan Africa (325,026). Approximately 55.6% of the total need for operative cases is adult women, 38.4% are adult men, and 6.0% are among children under the age of 15. A minimum of 1.5 million operative procedures annually are required to meet the needs of persons living with HIV in sub-Saharan Africa. The unmet need for operative care is greatest in eastern and western sub-Saharan Africa and will require investments in personnel, infrastructure, facilities, supplies, and equipment. We highlight the need for global planning and investment in resources to meet targets of operative capacity. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Reference respiratory waveforms by minimum jerk model analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anetai, Yusuke, E-mail: anetai@radonc.med.osaka-u.ac.jp; Sumida, Iori; Takahashi, Yutaka

    Purpose: The CyberKnife® robotic surgery system has the ability to deliver radiation to a tumor subject to respiratory movements using Synchrony® mode with less than 2 mm tracking accuracy. However, rapid and rough motion tracking causes mechanical tracking errors and puts mechanical stress on the robotic joints, leading to unexpected radiation delivery errors. During clinical treatment, patient respiratory motions are much more complicated, suggesting the need for patient-specific modeling of respiratory motion. The purpose of this study was to propose a novel method that provides a reference respiratory wave to enable smooth tracking for each patient. Methods: The minimum jerk model, which mathematically derives smoothness by means of jerk (the third derivative of position, i.e., the derivative of acceleration with respect to time, proportional to the time rate of change of force), was introduced to model a patient-specific respiratory motion wave that provides smooth motion tracking using CyberKnife®. To verify that patient-specific minimum jerk respiratory waves were tracked smoothly by Synchrony® mode, a tracking laser projection from CyberKnife® was optically analyzed every 0.1 s using a webcam and a calibrated grid on a motion phantom whose motion followed three wave patterns (cosine, typical free-breathing, and minimum jerk theoretical wave models) in the clinically relevant superior-inferior direction from six volunteers, assessed on the same node of the same isocentric plan. Results: Tracking discrepancy from the center of the grid to the beam projection was evaluated. The minimum jerk theoretical wave reduced the maximum peak amplitude of radial tracking discrepancy compared with the waveforms modeled by the cosine and typical free-breathing models by 22% and 35%, respectively, and provided smooth tracking in the radial direction. Motion tracking constancy, as indicated by radial tracking discrepancy affected by respiratory phase, was improved with the minimum jerk theoretical model by 7.0% and 13% compared with the cosine and free-breathing waveforms, respectively. Conclusions: The minimum jerk theoretical respiratory wave can achieve smooth tracking by CyberKnife® and may provide patient-specific respiratory modeling, which may be useful for respiratory training and coaching, as well as quality assurance of the mechanical CyberKnife® robotic trajectory.
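
    The minimum jerk segment has a classic closed form, x(τ) = x0 + (xf − x0)(10τ³ − 15τ⁴ + 6τ⁵) for τ ∈ [0, 1], with zero velocity and acceleration at both ends. The Python sketch below chains two such segments into a breathing-like reference wave; the amplitude and period are illustrative, not patient-specific fits.

    ```python
    import numpy as np

    def min_jerk_segment(x0, xf, n):
        """Classic minimum-jerk trajectory with zero boundary velocity/acceleration."""
        tau = np.linspace(0.0, 1.0, n)
        return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

    def breathing_wave(amplitude_mm=10.0, period_s=4.0, cycles=3, fs=10.0):
        """One cycle = an inhale segment followed by an exhale segment, tiled."""
        half = int(period_s * fs / 2)
        cycle = np.concatenate([min_jerk_segment(0.0, amplitude_mm, half),
                                min_jerk_segment(amplitude_mm, 0.0, half)])
        return np.tile(cycle, cycles)

    wave = breathing_wave()
    # Velocity and acceleration are continuous at the turning points, which is
    # what keeps the tracking demand on the robotic arm smooth.
    print(wave.shape, wave.max())
    ```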

  17. Wait, are you sad or angry? Large exposure time differences required for the categorization of facial expressions of emotion

    PubMed Central

    Du, Shichuan; Martinez, Aleix M.

    2013-01-01

    Facial expressions of emotion are essential components of human behavior, yet little is known about the hierarchical organization of their cognitive analysis. We study the minimum exposure time needed to successfully classify the six classical facial expressions of emotion (joy, surprise, sadness, anger, disgust, fear) plus neutral as seen at different image resolutions (240 × 160 to 15 × 10 pixels). Our results suggest a consistent hierarchical analysis of these facial expressions regardless of the resolution of the stimuli. Happiness and surprise can be recognized after very short exposure times (10–20 ms), even at low resolutions. Fear and anger are recognized the slowest (100–250 ms), even in high-resolution images, suggesting a later computation. Sadness and disgust are recognized in between (70–200 ms). The minimum exposure time required for successful classification of each facial expression correlates with the ability of a human subject to identify it correctly at low resolutions. These results suggest a fast, early computation of expressions represented mostly by low spatial frequencies or global configural cues and a later, slower process for those categories requiring a more fine-grained analysis of the image. We also demonstrate that those expressions that are mostly visible in higher-resolution images are not recognized as accurately. We summarize implications for current computational models. PMID:23509409

  18. Does the Minimum Wage Affect Welfare Caseloads?

    ERIC Educational Resources Information Center

    Page, Marianne E.; Spetz, Joanne; Millar, Jane

    2005-01-01

    Although minimum wages are advocated as a policy that will help the poor, few studies have examined their effect on poor families. This paper uses variation in minimum wages across states and over time to estimate the impact of minimum wage legislation on welfare caseloads. We find that the elasticity of the welfare caseload with respect to the…

  19. Minimum Standards for Tribal Child Care Centers.

    ERIC Educational Resources Information Center

    Administration on Children, Youth, and Families (DHHS), Washington, DC. Child Care Bureau.

    These minimum standards for tribal child care centers are being issued as guidance. An interim period of at least 1 year will allow tribal agencies to identify implementation issues, ensure that the standards reflect tribal needs, and guarantee that the standards provide adequate protection for children. The standards will be issued as regulations…

  20. Minimum viable populations: Is there a 'magic number' for conservation practitioners?

    Treesearch

    Curtis H. Flather; Gregory D. Hayward; Steven R. Beissinger; Philip A. Stephens

    2011-01-01

    Establishing species conservation priorities and recovery goals is often enhanced by extinction risk estimates. The need to set goals, even in data-deficient situations, has prompted researchers to ask whether general guidelines could replace individual estimates of extinction risk. To inform conservation policy, recent studies have revived the concept of the minimum...

  1. 77 FR 40320 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... irradiation treatment of imported fruits and vegetables including a minimum generic dose for the fruit fly family, the minimum dose of irradiation for some specific fruit fly species, and provides for the use of irradiation as a treatment for cut flowers and foliage. Need and Use of the Information: Certain fruits and...

  2. Solving constrained minimum-time robot problems using the sequential gradient restoration algorithm

    NASA Technical Reports Server (NTRS)

    Lee, Allan Y.

    1991-01-01

    Three constrained minimum-time control problems of a two-link manipulator are solved using the Sequential Gradient and Restoration Algorithm (SGRA). The inequality constraints considered are reduced via Valentine-type transformations to nondifferential path equality constraints. The SGRA is then used to solve these transformed problems with equality constraints. The results obtained indicate that at least one of the two controls is at its limits at any instant in time. The remaining control then adjusts itself so that none of the system constraints is violated. Hence, the minimum-time control is either a pure bang-bang control or a combined bang-bang/singular control.
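
    The bang-bang structure is easiest to see in a one-dimensional toy problem rather than the two-link manipulator of the paper: for a double integrator moving rest-to-rest over a distance d with |u| ≤ u_max, the minimum-time control saturates at +u_max and then at -u_max with a single switch at mid-course. A minimal sketch, with all symbols and values hypothetical:

```python
import numpy as np

def bang_bang_rest_to_rest(d, u_max):
    """Minimum-time rest-to-rest control of a double integrator x'' = u, |u| <= u_max.

    Accelerate at +u_max for the first half, decelerate at -u_max for the
    second half; the total time follows from d = u_max * (T/2)**2.
    """
    T = 2.0 * np.sqrt(d / u_max)    # minimum transfer time
    t_switch = T / 2.0              # single switching instant

    def u(t):
        return u_max if t < t_switch else -u_max

    return T, t_switch, u

T, t_s, u = bang_bang_rest_to_rest(d=1.0, u_max=2.0)   # T = sqrt(2)
```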

  3. Theoretical assessment of whole body counting performances using numerical phantoms of different gender and sizes.

    PubMed

    Marzocchi, O; Breustedt, B; Mostacci, D; Zankl, M; Urban, M

    2011-03-01

    A goal of whole body counting (WBC) is the estimation of the total body burden of radionuclides disregarding the actual position within the body. To achieve the goal, the detectors need to be placed in regions where the photon flux is as independent as possible from the distribution of the source. At the same time, the detectors need high photon fluxes in order to achieve better efficiency and lower minimum detectable activities. This work presents a method able to define the layout of new WBC systems and to study the behaviour of existing ones using both detection efficiency and its dependence on the position of the source within the body of computational phantoms.

  4. 40 CFR Table 6 to Subpart Dddd of... - Model Rule-Emission Limitations That Apply to Incinerators on and After [Date to be specified in...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... per million dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10... (Reapproved 2008) c. Oxides of nitrogen 53 parts per million dry volume 3-run average (1 hour minimum sample... average (1 hour minimum sample time per run) Performance test (Method 6 or 6c at 40 CFR part 60, appendix...

  5. 40 CFR Table 6 to Subpart Dddd of... - Model Rule-Emission Limitations That Apply to Incinerators on and After [Date to be specified in...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... per million dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10... (Reapproved 2008) c. Oxides of nitrogen 53 parts per million dry volume 3-run average (1 hour minimum sample... average (1 hour minimum sample time per run) Performance test (Method 6 or 6c at 40 CFR part 60, appendix...

  6. 40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations That Apply to Incinerators Before [Date to be specified in state...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test..., appendix A-4). Oxides of nitrogen 388 parts per million by dry volume 3-run average (1 hour minimum sample... (1 hour minimum sample time per run) Performance test (Method 6 or 6c of appendix A of this part) a...

  7. 40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations That Apply to Incinerators Before [Date to be specified in state...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test..., appendix A-4). Oxides of nitrogen 388 parts per million by dry volume 3-run average (1 hour minimum sample... (1 hour minimum sample time per run) Performance test (Method 6 or 6c of appendix A of this part) a...

  8. The fastest spreader in SIS epidemics on networks

    NASA Astrophysics Data System (ADS)

    He, Zhidong; Van Mieghem, Piet

    2018-05-01

    Identifying the fastest spreaders in epidemics on a network helps to ensure efficient spreading. By ranking the average spreading time for different spreaders, we show that the fastest spreader may change with the effective infection rate of an SIS epidemic process, which means that the time-dependent influence of a node is usually strongly coupled to both the dynamic process and the underlying network. With increasing effective infection rate, we illustrate that the fastest spreader changes from the node with the largest degree to the node with the shortest flooding time. (The flooding time is the minimum time needed to reach all other nodes if the process is reduced to a flooding process.) Furthermore, by taking the local topology around the spreader and the average flooding time into account, we propose the spreading efficiency as a metric to quantify the efficiency of a spreader and to identify the fastest spreader, which is adaptive to different infection rates in general networks.
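
    As defined above, the flooding time of a node in an unweighted graph is just its eccentricity: the largest hop distance to any other node. A minimal sketch (using networkx, with the karate-club graph standing in for an arbitrary network):

```python
import networkx as nx

def flooding_time(G, spreader):
    """Hops a flooding process started at `spreader` needs to reach all nodes."""
    lengths = nx.single_source_shortest_path_length(G, spreader)
    if len(lengths) < G.number_of_nodes():
        raise ValueError("graph is not connected from this spreader")
    return max(lengths.values())

G = nx.karate_club_graph()
fastest_flooder = min(G.nodes, key=lambda v: flooding_time(G, v))
hub = max(G.nodes, key=G.degree)   # candidate spreader at low infection rates
```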

  9. How long is enough to detect terrestrial animals? Estimating the minimum trapping effort on camera traps

    PubMed Central

    Si, Xingfeng; Kays, Roland

    2014-01-01

    Camera trapping is an important wildlife inventory tool for estimating species diversity at a site. Knowing the minimum trapping effort needed to detect target species is also important for designing efficient studies, considering both the number of camera locations and the survey length. Here, we take advantage of a two-year camera-trapping dataset from a small (24-ha) study plot in Gutianshan National Nature Reserve, eastern China, to estimate the minimum trapping effort actually needed to sample the wildlife community. We also evaluated the relative value of adding new camera sites versus running cameras for a longer period at one site. The full dataset includes 1727 independent photographs captured during 13,824 camera-days, documenting 10 resident terrestrial species of birds and mammals. Our rarefaction analysis shows that a minimum of 931 camera-days would be needed to detect the resident species sufficiently in the plot, and c. 8700 camera-days to detect all 10 resident species. In terms of detecting a diversity of species, the optimal sampling period for one camera site was c. 40 days, or long enough to record about 20 independent photographs. Our analysis of adding camera sites shows that rotating cameras to new sites would be more efficient for measuring species richness than leaving cameras at fewer sites for a longer period. PMID:24868493
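
    The rarefaction logic can be sketched as repeated subsampling of the pooled detections: draw m photographs at random, count the distinct species, and average over replicates. This individual-based version is a simplification of the sample-based (camera-day) rarefaction the authors used; the species labels and effort levels below are placeholders.

```python
import numpy as np

def rarefaction_curve(species_per_photo, efforts, n_rep=200, seed=0):
    """Mean species richness as a function of the number of photographs drawn."""
    rng = np.random.default_rng(seed)
    pool = np.asarray(species_per_photo)
    curve = []
    for m in efforts:
        richness = [np.unique(rng.choice(pool, size=m, replace=False)).size
                    for _ in range(n_rep)]
        curve.append(np.mean(richness))
    return np.array(curve)

# hypothetical species labels; 1727 photographs as in the abstract
photos = ["muntjac"]*900 + ["boar"]*400 + ["pheasant"]*300 + ["civet"]*127
curve = rarefaction_curve(photos, efforts=[50, 200, 800, 1600])
```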

  10. Ascent trajectory optimization for stratospheric airship with thermal effects

    NASA Astrophysics Data System (ADS)

    Guo, Xiao; Zhu, Ming

    2013-09-01

    Ascent trajectory optimization with thermal effects is addressed for a stratospheric airship. Basic thermal characteristics of the stratospheric airship are introduced. Besides, the airship’s equations of motion are constructed by including the factors about aerodynamic force, added mass and wind profiles which are developed based on horizontal-wind model. For both minimum-time and minimum-energy flights during ascent, the trajectory optimization problem is described with the path and terminal constraints in different scenarios and then, is converted into a parameter optimization problem by a direct collocation method. Sparse Nonlinear OPTimizer(SNOPT) is employed as a nonlinear programming solver and two scenarios are adopted. The solutions obtained illustrate that the trajectories are greatly affected by the thermal behaviors which prolong the daytime minimum-time flights of about 20.8% compared with that of nighttime in scenario 1 and of about 10.5% in scenario 2. And there is the same trend for minimum-energy flights. For the energy consumption of minimum-time flights, 6% decrease is abstained in scenario 1 and 5% decrease in scenario 2. However, a few energy consumption reduction is achieved for minimum-energy flights. Solar radiation is the principal component and the natural wind also affects the thermal behaviors of stratospheric airship during ascent. The relationship between take-off time and performance of airship during ascent is discussed. it is found that the take-off time at dusk is best choice for stratospheric airship. And in addition, for saving energy, airship prefers to fly downwind.

  11. Psychosocial assessment of nursing home residents via MDS 3.0: recommendations for social service training, staffing, and roles in interdisciplinary care.

    PubMed

    Simons, Kelsey; Connolly, Robert P; Bonifas, Robin; Allen, Priscilla D; Bailey, Kathleen; Downes, Deirdre; Galambos, Colleen

    2012-02-01

    The Minimum Data Set 3.0 has introduced a higher set of expectations for assessment of residents' psychosocial needs, including new interviewing requirements, new measures of depression and resident choice, and new discharge screening procedures. Social service staff are primary providers of psychosocial assessment and care in nursing homes; yet, research demonstrates that many do not possess the minimum qualifications, as specified in federal regulations, to effectively provide these services given the clinical complexity of this client population. Likewise, social service caseloads generally exceed manageable levels. This article addresses the need for enhanced training and support of social service and interdisciplinary staff in long term care facilities in light of the new Minimum Data Set 3.0 assessment procedures as well as new survey and certification guidelines emphasizing quality of life. A set of recommendations will be made with regard to training, appropriate role functions within the context of interdisciplinary care, and needs for more realistic staffing ratios. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  12. An Examination of Sunspot Number Rates of Growth and Decay in Relation to the Sunspot Cycle

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2006-01-01

    On the basis of annual sunspot number averages, sunspot number rates of growth and decay are examined relative to both minimum and maximum amplitudes and the time of their occurrences using cycles 12 through present, the most reliably determined sunspot cycles. Indeed, strong correlations are found for predicting the minimum and maximum amplitudes and the time of their occurrences years in advance. As applied to predicting sunspot minimum for cycle 24, the next cycle, its minimum appears likely to occur in 2006, especially if it is a robust cycle similar in nature to cycles 17-23.

  13. A Time Series Analysis: Weather Factors, Human Migration and Malaria Cases in Endemic Area of Purworejo, Indonesia, 2005–2014

    PubMed Central

    REJEKI, Dwi Sarwani Sri; NURHAYATI, Nunung; AJI, Budi; MURHANDARWATI, E. Elsa Herdiana; KUSNANTO, Hari

    2018-01-01

    Background: Climatic and weather factors are important determinants of the transmission of vector-borne diseases such as malaria. This study aimed to establish relationships between weather factors and malaria cases, taking human migration and previous case findings into account, in the endemic area of Purworejo during 2005–2014. Methods: This study employed ecological time series analysis using monthly data. The independent variables were maximum temperature, minimum temperature, maximum humidity, minimum humidity, precipitation, human migration, and previous malaria cases, while the dependent variable was positive malaria cases. Three count-data regression models, i.e. the Poisson model, the quasi-Poisson model, and the negative binomial model, were applied to measure the relationships, and the smallest Akaike Information Criterion (AIC) value was used to select the best model. Negative binomial regression was identified as the best model. Results: The model showed that humidity (lag 2), precipitation (lag 3), precipitation (lag 12), migration (lag 1), and previous malaria cases (lag 12) had a significant relationship with malaria cases. Conclusion: Weather, migration, and previous malaria case factors need to be considered as prominent indicators for projecting increases in malaria cases. PMID:29900134
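
    The model-selection step described here (Poisson versus negative binomial count regression with lagged covariates, compared by AIC) can be sketched with statsmodels; the file and column names are hypothetical, and the lag set mirrors the significant terms reported above.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("purworejo_monthly.csv")             # hypothetical monthly data
df["humidity_lag2"]  = df["min_humidity"].shift(2)
df["precip_lag3"]    = df["precipitation"].shift(3)
df["precip_lag12"]   = df["precipitation"].shift(12)
df["migration_lag1"] = df["migration"].shift(1)
df["cases_lag12"]    = df["cases"].shift(12)
df = df.dropna()

formula = ("cases ~ humidity_lag2 + precip_lag3 + precip_lag12 "
           "+ migration_lag1 + cases_lag12")
poisson = smf.glm(formula, df, family=sm.families.Poisson()).fit()
negbin  = smf.glm(formula, df, family=sm.families.NegativeBinomial()).fit()
best = min([poisson, negbin], key=lambda m: m.aic)    # smallest AIC wins
```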

  14. Levels of physical activity and sedentary time among 10- to 12-year-old boys and girls across 5 European countries using accelerometers: an observational study within the ENERGY-project.

    PubMed

    Verloigne, Maïté; Van Lippevelde, Wendy; Maes, Lea; Yıldırım, Mine; Chinapaw, Mai; Manios, Yannis; Androutsos, Odysseas; Kovács, Eva; Bringolf-Isler, Bettina; Brug, Johannes; De Bourdeaudhuij, Ilse

    2012-03-31

    The study aim was to objectively assess levels of sedentary time, light, moderate and vigorous physical activity (PA) among 10-12 year olds across five European countries and to examine differences in sedentary time and PA according to gender and country. 686 children (mean age = 11.6 ± 0.8 years, 53% girls, mean BMI = 19.0 ± 3.4 kg/m(2)) from Belgium, Greece, Hungary, the Netherlands and Switzerland wore Actigraph accelerometers and had at least 2 weekdays with minimum 10 h-wearing time and 1 weekend day with minimum 8 h-wearing time. Data were analyzed using multivariate analyses of covariance. Girls spent significantly more time sedentary (500 minutes/day) than boys (474 minutes/day) and significantly less time in light (267 minutes/day) and moderate-to-vigorous PA (32 minutes/day) than boys (284 minutes/day; 43 minutes/day respectively; p < 0.001). 4.6% of the girls and 16.8% of the boys met moderate-to-vigorous PA recommendations of at least 60 minutes/day. Greek boys were more sedentary (510 minutes/day; all at p < 0.05) than other boys. Dutch girls were less sedentary (457 minutes/day; all at p < 0.05) than other girls. Swiss girls displayed more moderate-to-vigorous PA (43 minutes/day; at p < 0.05) than other girls. Large proportions of children across different European countries did not meet PA recommendations and spent a lot of time sedentary. Mean time spent in moderate-to-vigorous PA was significantly lower than the recommended 60 minutes. Obesity prevention programmes focusing on both decreasing sedentary time and increasing light, moderate and vigorous PA are needed for European children, particularly girls.

  15. CERA: Clerkships Need National Curricula on Care Delivery, Awareness of Their NCC Gaps.

    PubMed

    Cochella, Susan; Liaw, Winston; Binienda, Juliann; Hustedde, Carol

    2016-06-01

    The Society of Teachers of Family Medicine's (STFM) National Clerkship Curriculum (NCC) was created to standardize and improve teaching of a minimum core curriculum in family medicine clerkships, promoting the Triple Aim of better care and population health at lower cost. It includes competencies all clerkships should teach and tools to support clerkship directors (CDs). This 2014 CERA survey of clerkship directors is one of several needs assessments that guide STFM's NCC Editorial Board in targeting improvements and peer-review processes. CERA's 2014 survey of CDs was sent to all 137 CDs at US and Canadian allopathic medical schools. Primary aims included: (1) identify curricular topics of greatest need, (2) inventory the percentage of family medicine clerkships teaching each NCC topic, and (3) determine whether longitudinal or blended clerkships have unique needs. This survey also assessed use of the NCC to advocate for teaching resources and to collaborate with colleagues at other institutions. Ninety-one percent of CDs completed the survey. Sixty-four percent reported their clerkship covers all of the NCC minimum core, but on detailed analysis, only 1% teach all topics. CDs need curricula on care delivery topics (cost-effective approach to acute care, role of family medicine in the health care system, quality/safety, and comorbid substance abuse). Single-question assessments overestimate the percentage of clerkships teaching all of the NCC minimum core. Clerkships need national curricula on care delivery topics and tools to help them find their curricular gaps.

  16. Optimization of fixed-range trajectories for supersonic transport aircraft

    NASA Astrophysics Data System (ADS)

    Windhorst, Robert Dennis

    1999-11-01

    This thesis develops near-optimal guidance laws that generate minimum-fuel, minimum-time, or minimum-direct-operating-cost fixed-range trajectories for supersonic transport aircraft. The approach uses singular perturbation techniques to decouple the equations of motion by time scale into three sets of dynamics, two of which are analyzed in the main body of this thesis and one in the Appendix. The two-point boundary-value problems obtained by applying the maximum principle to the dynamic systems are solved using the method of matched asymptotic expansions. Finally, the two solutions are combined using the matching principle and an additive composition rule to form a uniformly valid approximation of the full fixed-range trajectory. The approach is applied to two different time-scale formulations. The first holds weight constant, while the second allows weight and range dynamics to propagate on the same time scale. Solutions for the first formulation are carried out only to zero order in the small parameter, while solutions for the second formulation are carried out to first order. Calculations for an HSCT design were made to illustrate the method. Results show that the minimum-fuel trajectory consists of three segments: a minimum-fuel energy-climb, a cruise-climb, and a minimum-drag glide. The minimum-time trajectory also has three segments: a maximum-dynamic-pressure ascent, a constant-altitude cruise, and a maximum-dynamic-pressure glide. The minimum-direct-operating-cost trajectory is an optimal combination of the two; for realistic costs of fuel and flight time, it is very similar to the minimum-fuel trajectory. Moreover, the HSCT has three locally optimal cruise speeds, with the globally optimal cruise point at the highest allowable speed if range is sufficiently long; the final range of the trajectory determines which locally optimal speed is best. Ranges of 500 to 6,000 nautical miles, mixed subsonic and supersonic flight, and varying fuel-efficiency cases are analyzed. Finally, the payload-range curve of the HSCT design is determined.

  17. Is the National Guideline Clearinghouse a Trustworthy Source of Practice Guidelines for Child and Youth Anxiety and Depression?

    PubMed

    Duda, Stephanie; Fahim, Christine; Szatmari, Peter; Bennett, Kathryn

    2017-07-01

    Innovative strategies that facilitate the use of high quality practice guidelines (PG) are needed. Accordingly, repositories designed to simplify access to PGs have been proposed as a critical component of the network of linked interventions needed to drive increased PG implementation. The National Guideline Clearinghouse (NGC) is a free, international online repository. We investigated whether it is a trustworthy source of child and youth anxiety and depression PGs. English language PGs published between January 2009 and February 2016 relevant to anxiety or depression in children and adolescents (≤ 18 years of age) were eligible. Two trained raters assessed PG quality using Appraisal of Guidelines for Research and Evaluation (AGREE II). Scores on at least three AGREE II domains (stakeholder involvement, rigor of development, and editorial independence) were used to designate PGs as: i) minimum quality (≥ 50%); and ii) high quality (≥ 70%). Eight eligible PGs were identified (depression, n=6; anxiety and depression, n=1; social anxiety disorder, n=1). Four of eight PGs met minimum quality criteria; three of four met high quality criteria. At present, NGC users without the time and special skills required to evaluate PG quality may unknowingly choose flawed PGs to guide decisions about child and youth anxiety and depression. The recent NGC decision to explore the inclusion of PG quality profiles based on Institute of Medicine standards provides needed leadership that can strengthen PG repositories, prevent harm and wasted resources, and build PG developer capacity.

  18. On the minimum orbital intersection distance computation: a new effective method

    NASA Astrophysics Data System (ADS)

    Hedo, José M.; Ruíz, Manuel; Peláez, Jesús

    2018-06-01

    The computation of the Minimum Orbital Intersection Distance (MOID) is an old, but increasingly relevant problem. Fast and precise methods for MOID computation are needed to select potentially hazardous asteroids from a large catalogue. The same applies to debris with respect to spacecraft. An iterative method that strictly meets these two premises is presented.
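
    The abstract does not describe the new method itself, but the brute-force baseline it improves on is easy to state: parameterize each ellipse by its true anomaly, rotate into a common frame by (i, Ω, ω), and minimise the pairwise distance from many starting points. A rough sketch of that baseline (not the authors' algorithm), with illustrative orbital elements:

```python
import numpy as np
from scipy.optimize import minimize

def orbit_point(a, e, i, Omega, omega, nu):
    """Position on a Keplerian ellipse at true anomaly nu (angles in radians)."""
    r = a * (1 - e**2) / (1 + e * np.cos(nu))   # conic equation
    u = nu + omega                              # argument of latitude
    x, y = r * np.cos(u), r * np.sin(u)
    return np.array([x*np.cos(Omega) - y*np.sin(Omega)*np.cos(i),
                     x*np.sin(Omega) + y*np.cos(Omega)*np.cos(i),
                     y*np.sin(i)])

def moid(el1, el2, n_starts=8):
    """Multistart local minimisation of the inter-orbit distance."""
    dist = lambda v: np.linalg.norm(orbit_point(*el1, v[0]) - orbit_point(*el2, v[1]))
    grid = np.linspace(0, 2*np.pi, n_starts, endpoint=False)
    return min(minimize(dist, [n1, n2], method="Nelder-Mead").fun
               for n1 in grid for n2 in grid)

# Earth-like orbit vs. a hypothetical eccentric asteroid orbit (AU, radians)
d = moid((1.0, 0.017, 0.0, 0.0, 1.8), (1.3, 0.4, 0.2, 1.0, 0.5))
```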

  19. Scientific Terminology and Minimum Terms in Speech Communication: Some Philosophical Ramblings.

    ERIC Educational Resources Information Center

    Krivonos, Paul D.; Sussman, Lyle.

    Philosophers of science have emphasized the need for primitive terms, or "givens," in the construction of theory for any discipline. While there are inherent dangers regarding the use of primitive terms, they can have great value in serving as the basis for minimum terms, which are primitive terms unique to a discipline. (Borrowed terms are those…

  20. Nursing Minimum Data Set for School Nursing Practice. Position Statement. Revised

    ERIC Educational Resources Information Center

    Denehy, Janice

    2012-01-01

    It is the position of the National Association of School Nurses (NASN) to support the collection of essential nursing data as listed in the Nursing Minimum Data Set (NMDS). The NMDS provides a basic structure to identify the data needed to delineate nursing care delivered to clients as well as relevant characteristics of those clients. Structure…

  1. Slicing cluster mass functions with a Bayesian razor

    NASA Astrophysics Data System (ADS)

    Sealfon, C. D.

    2010-08-01

    We apply a Bayesian "razor" to forecast Bayes factors between different parameterizations of the galaxy cluster mass function. To demonstrate this approach, we calculate the minimum size of N-body simulation needed for strong evidence favoring a two-parameter mass function over one-parameter mass functions, and vice versa, as a function of the minimum cluster mass.
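
    On toy data, a Bayes factor of this kind can be approximated from BIC values via exp(ΔBIC/2); the power-law "mass function", bin grid, and parameter values below are all invented for illustration and are not the parameterizations compared in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

M = np.logspace(14, 15, 10)                         # toy mass bins
counts = np.random.default_rng(1).poisson(50 * (M / 1e14) ** -1.9)

def nll(params, fixed_alpha=None):
    """Negative log-likelihood of Poisson counts under lam = A*(M/1e14)**-alpha."""
    A = params[0]
    if A <= 0:
        return np.inf
    alpha = fixed_alpha if fixed_alpha is not None else params[1]
    return -poisson.logpmf(counts, A * (M / 1e14) ** -alpha).sum()

fit1 = minimize(nll, [40.0], args=(2.0,), method="Nelder-Mead")  # 1 free param
fit2 = minimize(nll, [40.0, 2.0], method="Nelder-Mead")          # 2 free params
bic = lambda f, k: 2 * f.fun + k * np.log(M.size)
bayes_factor_21 = np.exp(0.5 * (bic(fit1, 1) - bic(fit2, 2)))    # >~10: strong
```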

  2. 78 FR 34559 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... airports. These regulatory actions are needed because of the adoption of new or revised criteria, or... Minimums and ODPS contained in this amendment are based on the criteria contained in the U.S. Standard for... criteria were applied to the conditions existing or anticipated at the affected airports. Because of the...

  3. 77 FR 31180 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-25

    ... airports. These regulatory actions are needed because of the adoption of new or revised criteria, or... Minimums and ODPS contained in this amendment are based on the criteria contained in the U.S. Standard for... criteria were applied to the conditions existing or anticipated at the affected airports. Because of the...

  4. The Formation Environment of Jupiter's Moons

    NASA Technical Reports Server (NTRS)

    Turner, Neal; Lee, Man Hoi; Sano, Takayoshi

    2012-01-01

    Do circumjovian disk models have conductivities consistent with the assumed accretion stresses? Broadly, YES, for both minimum-mass and gas-starved models: magnetic stresses are weak in the MM models, as needed to keep the material in place. Stresses are stronger in the gas-starved models, as assumed in deriving the flow to the planet. However, future minimum-mass modeling may need to consider the loss of dust-depleted gas from the surface layers to the planet. The gas-starved models should have stress varying in radius. Dust evolution is a key process for further study, since the recombination occurs on the grains.

  5. The 2014 X-Ray Minimum of Eta Carinae as Seen by Swift

    NASA Technical Reports Server (NTRS)

    Corcoran, M. F.; Liburd, J.; Morris, D.; Russell, C. M. P.; Hamaguchi, K.; Gull, T. R.; Madura, T. I.; Teodoro, M.; Moffat, A. F. J.; Richardson, N. D.

    2017-01-01

    We report on Swift X-ray Telescope observations of Eta Carinae (η Car), an extremely massive, long-period, highly eccentric binary, obtained during the 2014.6 X-ray minimum/periastron passage. These observations show that η Car may have been particularly bright in X-rays going into the X-ray minimum state, while the duration of the 2014 X-ray minimum was intermediate between the extended minima seen in 1998.0 and 2003.5 by the Rossi X-Ray Timing Explorer (RXTE) and the shorter minimum in 2009.0. The hardness ratios derived from the Swift observations showed a relatively smooth increase to a peak value occurring 40.5 days after the start of the X-ray minimum, though these observations cannot reliably measure the X-ray hardness during the deepest part of the X-ray minimum, when contamination by the central constant emission component is significant. By comparing the timings of the RXTE and Swift observations near the X-ray minima, we derive an updated X-ray period of P_X = 2023.7 ± 0.7 days, in good agreement with periods derived from observations at other wavelengths, and we compare the X-ray changes with variations in the He II λ4686 emission. The middle of the Deep Minimum interval, as defined by the Swift column-density variations, is in good agreement with the time of periastron passage derived from the He II λ4686 line variations.

  6. First Precision Photometric Observations and Analyses of The Totally Eclipsing, Solar Type Binary, V573 Pegasi

    NASA Astrophysics Data System (ADS)

    Samec, Ronald G.; Caton, Daniel B.; Faulkner, Danny R.

    2018-06-01

    CCD VRI light curves of V573 Peg were taken on 26 and 27 September and 2, 4, and 6 October 2017 at the Dark Sky Observatory in North Carolina with the 0.81-m reflector of Appalachian State University by D. Caton. V573 Peg was discovered by the SAVS survey, which classified it as a V = 0.51 amplitude EW variable and included a rough spectrum identifying the binary as about type G, although the period would indicate it is an F-type contact binary. Five times of minimum light, two primary and three secondary, were calculated from our present observations:

    HJD I = 2458023.6420 ± 0.0012, 2458028.6522 ± 0.0021;
    HJD II = 2458022.5991 ± 0.0011, 2458023.8510 ± 0.0010, 2458028.8608 ± 0.0005.

    The following linear and quadratic ephemerides were determined from all available times of minimum light:

    JD Hel MinI = 2456876.49437 ± 0.00078 d + (0.41745021 ± 0.00000017) d × E,
    JD Hel MinI = 2456876.49580 ± 0.00023 d + (0.417448601 ± 0.000000083) d × E - (0.000000000274 ± 0.000000000012) d × E².

    A 14-year period study (24 times of minimum light) revealed that the orbital period is decreasing with a high level of confidence, possibly due to magnetic braking. The mass ratio is found to be somewhat extreme, M2/M1 = 0.2629 ± 0.0006, and the Roche-lobe fill-out is 25%. The solution had no need of spots. The temperature difference of the components is about 130 K, with the secondary as the hotter star, so it is a W-type W UMa binary. The inclination is 80.4 ± 0.1°. The secondary eclipse shows a time of constant light with an eclipse duration of 24 minutes. More details of our results will be given at the meeting.
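
    A linear and a quadratic ephemeris like those above are ordinary least-squares fits of the times of minimum against cycle number E. The sketch below fits only the five minima reported here, with cycle numbers back-computed from the linear ephemeris; the published study used all 24 archival timings, so these five points alone cannot meaningfully constrain the quadratic term.

```python
import numpy as np

# times of minimum light (HJD) from this study and their cycle numbers E
t_min = np.array([2458022.5991, 2458023.6420, 2458023.8510,
                  2458028.6522, 2458028.8608])
E = np.array([2745.5, 2748.0, 2748.5, 2760.0, 2760.5])

P_lin, T0_lin = np.polyfit(E, t_min, 1)        # T = T0 + P*E
Q, P_quad, T0_quad = np.polyfit(E, t_min, 2)   # T = T0 + P*E + Q*E**2
dP_dE = 2.0 * Q                                # period change per cycle;
                                               # negative => period decreasing
```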

  7. A channel estimation scheme for MIMO-OFDM systems

    NASA Astrophysics Data System (ADS)

    He, Chunlong; Tian, Chu; Li, Xingquan; Zhang, Ce; Zhang, Shiqi; Liu, Chaowen

    2017-08-01

    To address the trade-off between the performance of time-domain least-squares (LS) channel estimation and its practical implementation complexity, a reduced-complexity pilot-based channel estimation method for multiple-input multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) is derived. The approach transforms the MIMO-OFDM channel estimation problem into a simple single-input single-output OFDM (SISO-OFDM) channel estimation problem, so no large matrix pseudo-inverse is needed, which greatly reduces the complexity of the algorithm. Simulation results show that the bit error rate (BER) performance of the obtained method with time-orthogonal training sequences and the linear minimum mean square error (LMMSE) criterion is better than that of the time-domain LS estimator and nearly optimal.
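
    For orientation, the per-subcarrier LS estimate and its LMMSE refinement take the familiar forms H_LS = Y/X and H_LMMSE = R_hh (R_hh + SNR⁻¹ I)⁻¹ H_LS. The sketch below is the generic textbook version with an assumed exponential frequency-correlation matrix, not the paper's reduced-complexity MIMO-to-SISO transformation.

```python
import numpy as np

def ls_estimate(Y, X):
    """Per-subcarrier least squares: divide received pilots by transmitted ones."""
    return Y / X

def lmmse_estimate(H_ls, R_hh, snr):
    """LMMSE smoothing: H = R_hh (R_hh + (1/snr) I)^-1 H_ls."""
    N = R_hh.shape[0]
    return R_hh @ np.linalg.solve(R_hh + np.eye(N) / snr, H_ls)

N, snr = 64, 100.0                        # subcarriers, linear SNR (assumed)
k = np.arange(N)
R_hh = 0.9 ** np.abs(k[:, None] - k)      # assumed exponential correlation
rng = np.random.default_rng(0)
L = np.linalg.cholesky(R_hh + 1e-9 * np.eye(N))
H = L @ (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
X = np.ones(N)                            # unit pilots
Y = X * H + (rng.normal(size=N) + 1j*rng.normal(size=N)) / np.sqrt(2*snr)
H_hat = lmmse_estimate(ls_estimate(Y, X), R_hh, snr)
```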

  8. Survey of Occupational Noise Exposure in CF Personnel in Selected High-Risk Trades

    DTIC Science & Technology

    2003-11-01

    Peak, maximum level, minimum level, average sound level, time-weighted average, dose, projected 8-hour dose, and upper limit time were measured for...

  9. Classification and recognition of dynamical models: the role of phase, independent components, kernels and optimal transport.

    PubMed

    Bissacco, Alessandro; Chiuso, Alessandro; Soatto, Stefano

    2007-11-01

    We address the problem of performing decision tasks, and in particular classification and recognition, in the space of dynamical models in order to compare time series of data. Motivated by the application of recognition of human motion in image sequences, we consider a class of models that include linear dynamics, both stable and marginally stable (periodic), both minimum and non-minimum phase, driven by non-Gaussian processes. This requires extending existing learning and system identification algorithms to handle periodic modes and nonminimum phase behavior, while taking into account higher-order statistics of the data. Once a model is identified, we define a kernel-based cord distance between models that includes their dynamics, their initial conditions as well as input distribution. This is made possible by a novel kernel defined between two arbitrary (non-Gaussian) distributions, which is computed by efficiently solving an optimal transport problem. We validate our choice of models, inference algorithm, and distance on the tasks of human motion synthesis (sample paths of the learned models), and recognition (nearest-neighbor classification in the computed distance). However, our work can be applied more broadly where one needs to compare historical data while taking into account periodic trends, non-minimum phase behavior, and non-Gaussian input distributions.

  10. Impact of school health management committees on health services delivery in Ghana: A national level assessment.

    PubMed

    Bowman, Angela S; Owusu, Andrew; Trueblood, Amber B; Bosumtwi-Sam, Cynthia

    2018-05-07

    To examine the prevalence, determinants, and impact of local school health management committees on implementation of minimum-recommended school health services delivery among basic and secondary schools in Ghana. National-level cross-sectional data from the first-ever assessment of the Ghana Global School Health Policies and Practices Survey were utilized. Complex sample analyses were used to quantify school-level implementation of the recommended minimum package for health services delivery. Of 307 schools, 98% were basic and government run, and 33% offered at least half of the recommended health service delivery areas measured. Schools with a school health management committee (53%) were 4.8 (95% CI = 3.23-5.18) times as likely to offer at least 50% of the minimum health services package as schools that did not. There is a significant deficit in the delivery of school health services in schools across Ghana; however, school health management committees positively impact implementation of health service delivery. Because school health management committees have a significant impact on delivery of school health services, it is recommended that policy makers and programmers place greater emphasis on the value of and need for these advisory boards in all Ghanaian schools. Copyright © 2018 John Wiley & Sons, Ltd.

  11. Modelling Ecosystem Dynamics of the Oxygen Minimum Zones in the Angola Gyre and the Northern Benguela Upwelling System.

    NASA Astrophysics Data System (ADS)

    Schmidt, M.; Eggert, A.

    2016-02-01

    The Angola Gyre and the Northern Benguela Upwelling System are two major oxygen minimum zones (OMZs) of different kinds, connected by the system of African Eastern Boundary Currents. We discuss results from a 3-dimensional coupled biogeochemical model covering both oxygen-deficient systems. The biogeochemical model component comprises trophic levels up to zooplankton. Physiological properties of organisms are parameterized from field data gained mainly in the course of the project "Geochemistry and Ecology of the Namibian Upwelling System" (GENUS). The challenge of the modelling effort is the different nature of the two systems. The Angola Gyre, located in a "shadow zone" of the tropical Atlantic, has low productivity and little ventilation, hence a long residence time of water masses. In the Northern Benguela Upwelling System, trade winds drive an intermittent but permanent nutrient supply into the euphotic zone, which fuels high coastal productivity, large particle export, and high oxygen consumption from dissimilatory processes. In addition to the local processes, oxygen-deficient water formed in the Angola Gyre is one of the source water masses of the poleward undercurrent, which feeds oxygen-depleted water into the Benguela system. To simulate the oxygen distribution in the Benguela system, both physical transport and local biological processes need to be carefully adjusted in the model. The focus of the analysis is on the time scales and the relative contributions of the different oxygen-related processes to the oxygen budgets in the two oxygen minimum zones. Although these differ greatly between the OMZs, the model proves suitable for producing oxygen minimum zones comparable with observations in both the Benguela system and the Angola Gyre. Variability of the oxygen concentration in the Angola Gyre depends strongly on organismic oxygen consumption, whereas the variability of the oxygen concentration on the Namibian shelf is governed mostly by poleward advection of tropical water masses.

  12. On the estimation of intracluster correlation for time-to-event outcomes in cluster randomized trials.

    PubMed

    Kalia, Sumeet; Klar, Neil; Donner, Allan

    2016-12-30

    Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred for estimating the ICC when administrative censoring is infrequent. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.
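
    The negative bias from administrative censoring is easy to reproduce in a toy simulation: generate clustered log-normal event times with a known ICC via a shared cluster effect, apply a common censoring cutoff, and compare the one-way ANOVA ICC estimate before and after. This is a deliberately simplified stand-in for the paper's simulation design, with all parameter values invented.

```python
import numpy as np

def anova_icc(y, cluster):
    """One-way ANOVA estimator of the intracluster correlation coefficient."""
    y, cluster = np.asarray(y, float), np.asarray(cluster)
    ids = np.unique(cluster)
    k, N = ids.size, y.size
    n_i = np.array([(cluster == c).sum() for c in ids])
    means = np.array([y[cluster == c].mean() for c in ids])
    msb = np.sum(n_i * (means - y.mean())**2) / (k - 1)
    msw = sum(((y[cluster == c] - m)**2).sum()
              for c, m in zip(ids, means)) / (N - k)
    n0 = (N - (n_i**2).sum() / N) / (k - 1)   # adjusted mean cluster size
    return (msb - msw) / (msb + (n0 - 1) * msw)

rng = np.random.default_rng(0)
k, m, icc = 100, 20, 0.2
log_t = (rng.normal(0, np.sqrt(icc), (k, 1))          # shared cluster effect
         + rng.normal(0, np.sqrt(1 - icc), (k, m)))   # within-cluster noise
cluster = np.repeat(np.arange(k), m)
print(anova_icc(log_t.ravel(), cluster))                       # close to 0.20
cutoff = np.quantile(log_t, 0.7)                               # 30% censored
print(anova_icc(np.minimum(log_t, cutoff).ravel(), cluster))   # biased low
```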

  13. Turbulence Generation Using Localized Sources of Energy: Direct Numerical Simulations and the Effects of Thermal Non-Equilibrium

    NASA Astrophysics Data System (ADS)

    Maqui, Agustin Francisco

    Turbulence in high-speed flows is an important problem in aerospace applications, yet extremely difficult from theoretical, computational, and experimental perspectives. A main reason for the lack of complete understanding is the difficulty of generating turbulence in the lab at a range of speeds, including regimes with hypersonic effects such as thermal non-equilibrium. This work studies the feasibility of a new approach to generating turbulence based on laser-induced photo-excitation/dissociation of seeded molecules. A large database of incompressible and compressible direct numerical simulations (DNS) has been generated to systematically study the development and evolution of the flow towards realistic turbulence. Governing parameters and the conditions necessary for the establishment of turbulence, as well as the length and time scales associated with the process, are identified. For both the compressible and incompressible experiments, a minimum Reynolds number is found to be needed for the flow to evolve towards fully developed turbulence. Additionally, for incompressible cases a minimum time scale is required, while for compressible cases a minimum distance from the grid and a limit on the maximum temperature introduced are required. Through an extensive analysis of single- and two-point statistics, as well as spectral dynamics, the primary mechanisms leading to turbulence are shown. As is commonly done in compressible turbulence, dilatational and solenoidal components are separated to understand the effect of acoustics on the development of turbulence. Finally, a large database of forced isotropic turbulence has been generated to study the effect of internal degrees of freedom on the evolution of turbulence.

  14. Monitoring of V380 Oph requested in support of HST observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2012-08-01

    On behalf of a large Hubble Space Telescope consortium of which they are members, Dr. Joseph Patterson (Columbia University, Center for Backyard Astrophysics) and Dr. Arne Henden (AAVSO) requested observations from the amateur astronomer community in support of upcoming HST observations of the novalike VY Scl-type cataclysmic variable V380 Oph. The HST observations will likely take place in September, but nightly visual observations are needed beginning immediately and continuing through at least October 2012. The astronomers plan to observe V380 Oph while it is in its current low state. Observations beginning now are needed to determine the behavior of this system at minimum and to ensure that the system is not in its high state at the time of the HST observations. V380 Oph is very faint in its low state: magnitude 17 to 19 and perhaps even fainter. Nightly snapshot observations, not time series, are requested, using whatever technique (adding frames, lengthening exposures, etc.) is necessary to measure the magnitude. It is not known whether V380 Oph is relatively inactive at minimum or has flares of one to two magnitudes; it is this behavior that is essential to learn in order to safely execute the HST observations. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See the full Alert Notice for more details. NOTE: This campaign was subsequently cancelled when it was learned V380 Oph was not truly in its low state. See AAVSO Alert Notice 468 for details.

  15. Evolution of canine parvovirus--a need for new vaccines?

    PubMed

    Truyen, Uwe

    2006-10-05

    Canine parvovirus (CPV) is a new virus, which is continuing to evolve, giving rise to new antigenic types and virus mutants that spread through the dog population. The most successful mutants, from an evolutionary perspective, appear to be selected by improved binding to the CPV receptor, the canine transferrin receptor, and by an extended host range, which for the newer antigenic types now includes both the dog and the cat. The new viruses also show antigenic differences that can be defined by binding of certain monoclonal antibodies; they also differ in their reactivity in virus neutralisation tests using immune sera raised against the various antigenic types. These differences may influence the susceptibility of young animals to infection at the time when the level of maternally derived antibody decreases to the minimum protective titre. This minimum protective titre may vary depending on the infecting virus type. There is, however, a high degree of cross-protection between the virus types, and the true relevance of the differences in neutralisation titre is currently not known.

  16. What Is the 'Minimum Inhibitory Concentration' (MIC) of Pexiganan Acting on Escherichia coli?-A Cautionary Case Study.

    PubMed

    Jepson, Alys K; Schwarz-Linek, Jana; Ryan, Lloyd; Ryadnov, Maxim G; Poon, Wilson C K

    2016-01-01

    We measured the minimum inhibitory concentration (MIC) of the antimicrobial peptide pexiganan acting on Escherichia coli, and found an intrinsic variability in such measurements. These results led to a detailed study of the effect of pexiganan on the growth curve of E. coli, using a plate reader and manual plating (i.e. time-kill curves). The measured growth curves, together with single-cell observations and peptide depletion assays, suggested that addition of a sub-MIC concentration of pexiganan to a population of this bacterium killed a fraction of the cells, reducing peptide activity during the process, while leaving the remaining cells unaffected. This pharmacodynamic hypothesis suggests a considerable inoculum effect, which we quantified. Our results cast doubt on the use of the MIC as 'a measure of the concentration needed for peptide action' and show how 'coarse-grained' studies at the population level give vital information for the correct planning and interpretation of MIC measurements.

  17. Multi-bottle, no compressor, mean pressure control system for a Stirling engine

    DOEpatents

    Corey, John A.

    1990-01-01

    The invention relates to an apparatus for mean-pressure control of a Stirling engine without the need for a compressor. The invention includes a multi-tank system with at least one high-pressure tank and one low-pressure tank. Gas flows from the engine through the maximum-pressure supply line to the high-pressure tank when a first valve is opened, until the maximum pressure of the engine drops below that of the high-pressure tank, opening an inlet regulator that permits gas flow from the engine to the low-pressure tank. When gas flows toward the engine, it flows through the minimum-pressure supply line when a second valve is opened, from the low-pressure tank until that tank reaches the engine's minimum pressure level, at which time the outlet regulator opens, permitting gas to be supplied from the high-pressure tank to the engine. Check valves between the two tanks prevent any backflow of gas.

  18. The association of minimum wage change on child nutritional status in LMICs: A quasi-experimental multi-country study.

    PubMed

    Ponce, Ninez; Shimkhada, Riti; Raub, Amy; Daoud, Adel; Nandi, Arijit; Richter, Linda; Heymann, Jody

    2017-08-02

    There is recognition that social protection policies such as raising the minimum wage can favourably impact health, but little evidence links minimum wage increases to child health outcomes. We used multi-year data (2003-2012) on national minimum wages linked to individual-level data from the Demographic and Health Surveys (DHS) from 23 low- and middle-income countries (LMICs) that had at least two DHS surveys to establish pre- and post-observation periods. Over a pre-to-post interval ranging from 4 to 8 years, we examined minimum wage growth and four nutritional status outcomes among children under 5 years: stunting, wasting, underweight, and anthropometric failure. Using a differences-in-differences framework with country and time fixed effects, a 10% increase in minimum wage growth over time was associated with a 0.5 percentage point decline in stunting (-0.054, 95% CI (-0.084, -0.025)) and a 0.3 percentage point decline in anthropometric failure (-0.031, 95% CI (-0.057, -0.005)). We did not observe statistically significant associations between minimum wage growth and underweight or wasting. We found similar results for the poorest households working in non-agricultural and non-professional jobs, where minimum wage growth may have the most leverage. Modest increases in the minimum wage over a 4- to 8-year period might be effective in reducing child undernutrition in LMICs.
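
    The two-way fixed-effects estimator described here is, in regression form, an OLS of the child-level outcome on minimum-wage growth plus country and period dummies, with errors clustered by country. A sketch with hypothetical file and column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dhs_pooled.csv")   # hypothetical pooled child-level file
model = smf.ols("stunted ~ mw_growth + C(country) + C(period)", data=df)
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
# A coefficient of -0.054 means a 10% (0.10) rise in minimum-wage growth
# is associated with roughly a 0.5 percentage-point drop in stunting.
print(res.params["mw_growth"], res.conf_int().loc["mw_growth"])
```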

  19. From Lawson to Burning Plasmas: a Multi-Fluid Approach

    NASA Astrophysics Data System (ADS)

    Guazzotto, Luca; Betti, Riccardo

    2017-10-01

    The Lawson criterion, easily compared with experimental parameters, gives the value of the triple product of plasma density, temperature, and energy confinement time needed for the plasma to ignite. Lawson's inaccurate assumptions of 0D geometry and a single-fluid plasma model were improved in recent work, where 1D geometry and multi-fluid (ions, electrons, and alphas) physics were included in the model, accounting for physical equilibration times and different energy confinement times between species. A much more meaningful analysis than Lawson's for current and future experiments is expressed in terms of the burning-plasma state (Q = 5, where Q is the ratio of fusion power to heating power). Minimum parameters for reaching Q = 5 are calculated based on experimental profiles for density and temperatures and can immediately be compared with experimental performance by defining a no-alpha pressure: the pressure that the plasma needs to reach for breakeven once the alpha heating has been subtracted from the energy balance. These calculations can be applied to current experiments and future burning-plasma devices. DE-FG02-93ER54215.
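
    In the original 0-D single-fluid setting (which the work above generalises), the power balance behind the Q-dependent triple-product requirement can be written compactly; for D-T, where the alpha particle carries about one fifth of the fusion energy, the Q = 5 state needs half the ignition triple product. A sketch of that algebra, under the usual equal-temperature, 50:50 D-T assumptions:

```latex
% 0-D steady-state power balance: external heating plus alpha heating
% balances the transport loss (single fluid, n_e = n_i = n, common T)
P_{\mathrm{ext}} + P_\alpha = \frac{3 n T}{\tau_E},
\qquad
P_\alpha = \frac{n^2}{4}\,\langle\sigma v\rangle\, E_\alpha ,
\qquad
Q \equiv \frac{P_{\mathrm{fus}}}{P_{\mathrm{ext}}},
\quad
E_\alpha \simeq \tfrac{1}{5} E_{\mathrm{fus}} .

% Eliminating P_ext via P_ext = 5 P_alpha / Q gives the triple product
% needed to sustain a given Q; Q -> infinity recovers ignition.
n\,T\,\tau_E \;=\; \frac{12\,T^{2}}{\langle\sigma v\rangle\,E_\alpha}\,
                   \frac{1}{1 + 5/Q} .
```

    Setting Q = 5 makes the last factor 1/2, which is why the burning-plasma state is a natural intermediate target on the way to ignition.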

  20. High Tensile Strength Amalgams for In-Space Fabrication and Repair

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.

    2006-01-01

    Amalgams are well known for their use in dental practice as a tooth filling material. They have a number of useful attributes that include room temperature fabrication, corrosion resistance, dimensional stability, and very good compressive strength. These properties well serve dental needs but, unfortunately, amalgams have extremely poor tensile strength, a feature that severely limits other potential applications. Improved material properties (strength and temperature) of amalgams may have application to the freeform fabrication of repairs or parts that might be necessary during an extended space mission. Advantages would include, but are not limited to: the ability to produce complex parts, a minimum number of processing steps, minimum crew interaction, high yield - minimum wasted material, reduced gravity compatibility, minimum final finishing, safety, and minimum power consumption. The work presented here shows how the properties of amalgams can be improved by changing particle geometries in conjunction with novel engineering metals.

  1. Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): Standardised Reporting for Model Reproducibility, Interoperability, and Data Sharing

    PubMed Central

    Quinn, TA; Granite, S; Allessie, MA; Antzelevitch, C; Bollensdorff, C; Bub, G; Burton, RAB; Cerbai, E; Chen, PS; Delmar, M; DiFrancesco, D; Earm, YE; Efimov, IR; Egger, M; Entcheva, E; Fink, M; Fischmeister, R; Franz, MR; Garny, A; Giles, WR; Hannes, T; Harding, SE; Hunter, PJ; Iribe, G; Jalife, J; Johnson, CR; Kass, RS; Kodama, I; Koren, G; Lord, P; Markhasin, VS; Matsuoka, S; McCulloch, AD; Mirams, GR; Morley, GE; Nattel, S; Noble, D; Olesen, SP; Panfilov, AV; Trayanova, NA; Ravens, U; Richard, S; Rosenbaum, DS; Rudy, Y; Sachs, F; Sachse, FB; Saint, DA; Schotten, U; Solovyova, O; Taggart, P; Tung, L; Varró, A; Volders, PG; Wang, K; Weiss, JN; Wettwer, E; White, E; Wilders, R; Winslow, RL; Kohl, P

    2011-01-01

    Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step toward establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work. PMID:21745496

  2. Time optimal paths for high speed maneuvering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reister, D.B.; Lenhart, S.M.

    1993-01-01

    Recent theoretical results have completely solved the problem of determining the minimum length path for a vehicle with a minimum turning radius moving from an initial configuration to a final configuration. Time optimal paths for a constant speed vehicle are a subset of the minimum length paths. This paper uses the Pontryagin maximum principle to find time optimal paths for a constant speed vehicle. The time optimal paths consist of sequences of arcs of circles and straight lines. The maximum principle introduces concepts (dual variables, bang-bang solutions, singular solutions, and transversality conditions) that provide important insight into the nature of the time optimal paths. We explore the properties of the optimal paths and present some experimental results for a mobile robot following an optimal path.

  3. A recent time of minimum and an atmospheric eclipse in the ultraviolet spectrum of the Wolf-Rayet eclipsing binary V444 Cygni

    NASA Technical Reports Server (NTRS)

    Eaton, J. E.; Cherepashchuk, A. M.; Khaliullin, K. F.

    1982-01-01

    The 1200-1900 angstrom region and optical fine error sensor (FES) observations of V444 Cyg were obtained continuously. More than half of a primary minimum and almost a complete secondary minimum were observed. It is found that the time of minimum for the secondary eclipse is consistent with that for the primary eclipse, and the ultraviolet times of minimum are consistent with the optical ones. The spectrum shows a considerable amount of phase dependence. The general shapes and depths of the light curves for the FES signal and the 1565-1900 angstrom continuum are similar to those for the blue continuum. The FES, however, detected an atmospheric eclipse in line absorption at about the phase at which the N IV absorption was strongest. It is suggested that there is a source of continuum absorption shortward of 1460 angstrom which exists throughout a large part of the extended atmosphere and which, by implication, must considerably redden the ultraviolet continua of WN stars. A fairly high degree of ionization in the inner part of the WN star's atmosphere is implied.

  4. 24 CFR 1000.328 - What is the minimum amount that an Indian tribe may receive under the need component of the formula?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false What is the minimum amount that an... Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT NATIVE AMERICAN HOUSING...

  5. 24 CFR 1000.328 - What is the minimum amount that an Indian tribe may receive under the need component of the formula?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false What is the minimum amount that an... Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT NATIVE AMERICAN HOUSING...

  6. 24 CFR 1000.328 - What is the minimum amount that an Indian tribe may receive under the need component of the formula?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false What is the minimum amount that an... Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT NATIVE AMERICAN HOUSING...

  7. 24 CFR 1000.328 - What is the minimum amount that an Indian tribe may receive under the need component of the formula?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false What is the minimum amount that an... Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT NATIVE AMERICAN HOUSING...

  8. 75 FR 32096 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-07

    .... The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and the need... Rivers, MI, Three Rivers Muni Dr. Haines, NDB RWY 27, Amdt 7A, CANCELLED Brainerd, MN, Brainerd Lakes Rgnl, RNAV (GPS) RWY 5, Amdt 1 Brainerd, MN, Brainerd Lakes Rgnl, RNAV (GPS) RWY 12, Amdt 1 Brainerd...

  9. 77 FR 71497 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    .... The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and the need.../Springfield, MA, Barnes Muni, RNAV (GPS) RWY 20, Amdt 1 Moose Lake, MN, Moose Lake Carlton County, GPS RWY 4, Orig, CANCELED Moose Lake, MN, Moose Lake Carlton County, RNAV (GPS) RWY 4, Orig Indianola, MS...

  10. 20 CFR 229.47 - Child's benefit.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Child's benefit. 229.47 Section 229.47... OVERALL MINIMUM GUARANTEE Computation of the Overall Minimum Rate § 229.47 Child's benefit. If a child is included in the computation of the overall minimum, a child's benefit of 50 percent times the Overall...

  11. 20 CFR 229.47 - Child's benefit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Child's benefit. 229.47 Section 229.47... OVERALL MINIMUM GUARANTEE Computation of the Overall Minimum Rate § 229.47 Child's benefit. If a child is included in the computation of the overall minimum, a child's benefit of 50 percent times the Overall...

  12. 20 CFR 229.47 - Child's benefit.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Child's benefit. 229.47 Section 229.47... OVERALL MINIMUM GUARANTEE Computation of the Overall Minimum Rate § 229.47 Child's benefit. If a child is included in the computation of the overall minimum, a child's benefit of 50 percent times the Overall...

  13. 20 CFR 229.47 - Child's benefit.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Child's benefit. 229.47 Section 229.47... OVERALL MINIMUM GUARANTEE Computation of the Overall Minimum Rate § 229.47 Child's benefit. If a child is included in the computation of the overall minimum, a child's benefit of 50 percent times the Overall...

  14. 20 CFR 229.47 - Child's benefit.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Child's benefit. 229.47 Section 229.47... OVERALL MINIMUM GUARANTEE Computation of the Overall Minimum Rate § 229.47 Child's benefit. If a child is included in the computation of the overall minimum, a child's benefit of 50 percent times the Overall...

  15. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime.

    PubMed

    Stockwell, Tim; Zhao, Jinhui; Sherk, Adam; Callaghan, Russell C; Macdonald, Scott; Gatley, Jodi

    2017-07-01

    Saskatchewan's introduction in April 2010 of minimum prices graded by alcohol strength led to an average minimum price increase of 9.1% per Canadian standard drink (=13.45 g ethanol). This increase was shown to be associated with reduced consumption and switching to lower alcohol content beverages. Police also informally reported marked reductions in night-time alcohol-related crime. This study aims to assess the impacts of changes to Saskatchewan's minimum alcohol-pricing regulations between 2008 and 2012 on selected crime events often related to alcohol use. Data were obtained from Canada's Uniform Crime Reporting Survey. Auto-regressive integrated moving average time series models were used to test immediate and lagged associations between minimum price increases and rates of night-time and police identified alcohol-related crimes. Controls were included for simultaneous crime rates in the neighbouring province of Alberta, economic variables, linear trend, seasonality and autoregressive and/or moving-average effects. The introduction of increased minimum-alcohol prices was associated with an abrupt decrease in night-time alcohol-related traffic offences for men (-8.0%, P < 0.001), but not women. No significant immediate changes were observed for non-alcohol-related driving offences, disorderly conduct or violence. Significant monthly lagged effects were observed for violent offences (-19.7% at month 4 to -18.2% at month 6), which broadly corresponded to lagged effects in on-premise alcohol sales. Increased minimum alcohol prices may contribute to reductions in alcohol-related traffic-related and violent crimes perpetrated by men. Observed lagged effects for violent incidents may be due to a delay in bars passing on increased prices to their customers, perhaps because of inventory stockpiling. [Stockwell T, Zhao J, Sherk A, Callaghan RC, Macdonald S, Gatley J. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime. Drug Alcohol Rev 2017;36:492-501]. © 2016 Australasian Professional Society on Alcohol and other Drugs.
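
    The interrupted-time-series design described above can be sketched in a few lines. This is a generic illustration, not the authors' model: the monthly series is synthetic, the only regressor is a step dummy for the April 2010 price change, and the seasonal terms and the Alberta and economic controls used in the paper are omitted.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic monthly crime counts, 2008-2012, with a built-in -8 step.
    idx = pd.date_range("2008-01-01", "2012-12-01", freq="MS")
    rng = np.random.default_rng(0)
    crime = pd.Series(100 + rng.normal(0, 5, len(idx)), index=idx)
    crime[idx >= "2010-04-01"] -= 8

    step = (idx >= "2010-04-01").astype(float)   # intervention dummy
    res = ARIMA(crime, exog=step, order=(1, 0, 0)).fit()
    print(res.params["x1"])                      # estimated step effect (~ -8)
    ```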

  16. 77 FR 39774 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... orders; Chapter VI, Section 1(e)(3) to provide that Minimum Quantity Orders are treated as having a time... Intermarket Sweep Orders (``ISOs'') may have any time-in-force designation except WAIT; Chapter VI, Section 2... Chapter VI, Section 1(e)(3), to provide that Minimum Quantity Orders are treated as having a time-in...

  17. Minimum time acceleration of aircraft turbofan engines by using an algorithm based on nonlinear programming

    NASA Technical Reports Server (NTRS)

    Teren, F.

    1977-01-01

    Minimum time accelerations of aircraft turbofan engines are presented. The accelerations were calculated using a piecewise linear engine model and an algorithm based on nonlinear programming. Use of this model and algorithm allows such trajectories to be readily calculated on a digital computer with a minimal expenditure of computer time.
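
    To make the idea concrete: once the acceleration limit is expressed as a (piecewise-linear) function of spool speed, the minimum acceleration time is the integral of dN / a_max(N). The toy sketch below is not the paper's nonlinear-programming formulation; the schedule values are invented for illustration.

    ```python
    import numpy as np

    def a_max(n):
        # Hypothetical piecewise-linear acceleration limit (%/s) versus fan
        # speed (fraction of rated), standing in for a surge-margin schedule.
        return np.interp(n, [0.5, 0.7, 0.9, 1.0], [8.0, 10.0, 6.0, 3.0])

    # Minimum time from 55% to 98% speed: integrate dN / a_max(N) (trapezoid rule).
    n = np.linspace(0.55, 0.98, 500)
    inv_a = 1.0 / a_max(n)
    t_min = np.sum(0.5 * (inv_a[1:] + inv_a[:-1]) * np.diff(n)) * 100.0  # % units
    print(f"minimum spool-up time ~ {t_min:.1f} s")
    ```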

  18. Determining size and dispersion of minimum viable populations for land management planning and species conservation

    NASA Astrophysics Data System (ADS)

    Lehmkuhl, John F.

    1984-03-01

    The concept of minimum populations of wildlife and plants has only recently been discussed in the literature. Population genetics has emerged as a basic underlying criterion for determining minimum population size. This paper presents a genetic framework and procedure for determining minimum viable population size and dispersion strategies in the context of multiple-use land management planning. A procedure is presented for determining minimum population size based on maintenance of genetic heterozygosity and reduction of inbreeding. A minimum effective population size (Ne) of 50 breeding animals is taken from the literature as the minimum short-term size to keep inbreeding below 1% per generation. Steps in the procedure adjust Ne to account for variance in progeny number, unequal sex ratios, overlapping generations, population fluctuations, and the period of habitat/population constraint. The result is an approximate census number that falls within a range of effective population sizes of 50-500 individuals. This population range spans short- to long-term population fitness and evolutionary potential, where the length of the term is a relative function of the species' generation time. Two population dispersion strategies are proposed: core population and dispersed population.
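
    The adjustments described above build on standard effective-population-size (Ne) formulas. The sketch below shows two textbook corrections and the 1%-per-generation inbreeding rule of thumb; it is a generic illustration, not the paper's full procedure.

    ```python
    def ne_unequal_sex_ratio(n_males, n_females):
        """Ne under an unequal sex ratio: Ne = 4*Nm*Nf / (Nm + Nf)."""
        return 4.0 * n_males * n_females / (n_males + n_females)

    def ne_fluctuating(sizes):
        """Ne across fluctuating generations is the harmonic mean of the sizes."""
        return len(sizes) / sum(1.0 / s for s in sizes)

    def inbreeding_rate(ne):
        """Inbreeding accumulates at dF = 1 / (2*Ne) per generation."""
        return 1.0 / (2.0 * ne)

    print(inbreeding_rate(50))              # 0.01 -- the 1%-per-generation bound
    print(ne_unequal_sex_ratio(10, 40))     # 32.0 -- a skewed sex ratio shrinks Ne
    print(ne_fluctuating([100, 20, 100]))   # ~42.9 -- bottleneck generations dominate
    ```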

  19. The 2014 X-Ray Minimum of η Carinae as Seen by Swift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corcoran, M. F.; Hamaguchi, K.; Liburd, J.

    We report on Swift X-ray Telescope observations of Eta Carinae (η Car), an extremely massive, long-period, highly eccentric binary, obtained during the 2014.6 X-ray minimum/periastron passage. These observations show that η Car may have been particularly bright in X-rays going into the X-ray minimum state, while the duration of the 2014 X-ray minimum was intermediate between the extended minima seen in 1998.0 and 2003.5 by the Rossi X-Ray Timing Explorer (RXTE) and the shorter minimum in 2009.0. The hardness ratios derived from the Swift observations showed a relatively smooth increase to a peak value occurring 40.5 days after the start of the X-ray minimum, though these observations cannot reliably measure the X-ray hardness during the deepest part of the X-ray minimum, when contamination by the "central constant emission" component is significant. By comparing the timings of the RXTE and Swift observations near the X-ray minima, we derive an updated X-ray period of P_X = 2023.7 ± 0.7 days, in good agreement with periods derived from observations at other wavelengths, and we compare the X-ray changes with variations in the He II λ4686 emission. The middle of the "Deep Minimum" interval, as defined by the Swift column density variations, is in good agreement with the time of periastron passage derived from the He II λ4686 line variations.

  1. Estimating occupancy and predicting numbers of gray wolf packs in Montana using hunter surveys

    USGS Publications Warehouse

    Rich, Lindsey N.; Russell, Robin E.; Glenn, Elizabeth M.; Mitchell, Michael S.; Gude, Justin A.; Podruzny, Kevin M.; Sime, Carolyn A.; Laudon, Kent; Ausband, David E.; Nichols, James D.

    2013-01-01

    Reliable knowledge of the status and trend of carnivore populations is critical to their conservation and management. Methods for monitoring carnivores, however, are challenging to conduct across large spatial scales. In the Northern Rocky Mountains, wildlife managers need a time- and cost-efficient method for monitoring gray wolf (Canis lupus) populations. Montana Fish, Wildlife and Parks (MFWP) conducts annual telephone surveys of >50,000 deer and elk hunters. We explored how survey data on hunters' sightings of wolves could be used to estimate the occupancy and distribution of wolf packs and predict their abundance in Montana for 2007–2009. We assessed model utility by comparing our predictions to MFWP minimum known number of wolf packs. We minimized false positive detections by identifying a patch as occupied if 2–25 wolves were detected by ≥3 hunters. Overall, estimates of the occupancy and distribution of wolf packs were generally consistent with known distributions. Our predictions of the total area occupied increased from 2007 to 2009 and predicted numbers of wolf packs were approximately 1.34–1.46 times the MFWP minimum counts for each year of the survey. Our results indicate that multi-season occupancy models based on public sightings can be used to monitor populations and changes in the spatial distribution of territorial carnivores across large areas where alternative methods may be limited by personnel, time, accessibility, and budget constraints.

  2. Biomarker selection and classification of "-omics" data using a two-step bayes classification framework.

    PubMed

    Assawamakin, Anunchai; Prueksaaroon, Supakit; Kulawonganunchai, Supasak; Shaw, Philip James; Varavithya, Vara; Ruangrajitpakorn, Taneth; Tongsima, Sissades

    2013-01-01

    Identification of suitable biomarkers for accurate prediction of phenotypic outcomes is a goal for personalized medicine. However, current machine learning approaches are either too complex or perform poorly. Here, a novel two-step machine-learning framework is presented to address this need. First, a Naïve Bayes estimator is used to rank features; the top-ranked features are those most likely to carry the information needed to predict the underlying biological classes. The top-ranked features are then used in a Hidden Naïve Bayes classifier to construct a classification prediction model from these filtered attributes. To obtain the minimum set of the most informative biomarkers, the bottom-ranked features are successively removed from the Naïve Bayes-filtered feature list one at a time, and the classification accuracy of the Hidden Naïve Bayes classifier is checked for each pruned feature set. The performance of the proposed two-step Bayes classification framework was tested on different types of "-omics" datasets, including gene expression microarray, single nucleotide polymorphism microarray (SNP array), and surface-enhanced laser desorption/ionization time-of-flight (SELDI-TOF) proteomic data. The proposed two-step Bayes classification framework equaled and, in some cases, outperformed other classification methods in terms of prediction accuracy, minimum number of classification markers, and computational time.
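
    A rough outline of the two-step idea, for orientation only: scikit-learn has no Hidden Naive Bayes, so GaussianNB stands in here for both the ranking estimator and the final classifier, and the data are synthetic.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB

    # Synthetic stand-in for an "-omics" matrix: 200 samples x 30 features.
    X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                               random_state=0)

    # Step 1: rank features by single-feature Naive Bayes CV accuracy.
    scores = [cross_val_score(GaussianNB(), X[:, [j]], y, cv=5).mean()
              for j in range(X.shape[1])]
    ranked = np.argsort(scores)[::-1]          # most informative first

    # Step 2: successively drop bottom-ranked features and keep the smallest
    # set whose CV accuracy stays within 1% of the best pruned set.
    accs = {m: cross_val_score(GaussianNB(), X[:, ranked[:m]], y, cv=5).mean()
            for m in range(1, len(ranked) + 1)}
    m_best = min(m for m, a in accs.items() if a >= max(accs.values()) - 0.01)
    print(f"minimum marker set: {m_best} features, accuracy {accs[m_best]:.3f}")
    ```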

  3. A study on effects of and stance over tuition fees.

    PubMed

    Karay, Yassin; Matthes, Jan

    2016-01-01

    Regarding tuition fees (which in Germany have since been abrogated), putative drawbacks such as prolonged study duration have been suspected, while benefits are not clearly proven. We investigated whether tuition fees (500 Euro per semester) affected the course of studies of Cologne medical students and asked for students' stance on tuition fees. Of 1,324 students, we analyzed the rate of those passing their first medical exam ("Physikum") within the minimum time and the students' discontinuation rate, respectively. Regression analysis tested for putative influences of tuition fees and demographic factors. In an additional online survey, 400 students answered questions regarding the burden imposed by, and their stance on, tuition fees. We find that fees affected neither the rate of Cologne students passing their first medical exam within the minimum time nor the students' discontinuation rate. According to the online survey, at times of tuition fees significantly more students did not attend courses as scheduled. Time spent on earning money was significantly increased. 51% of students who had to pay tuition fees, and 71% of those who never had to, stated that tuition fees were not justified. More than two thirds of students did not recognize any lasting benefit from tuition fees. Tuition fees did not affect the discontinuation rate or study duration of Cologne medical students. However, they evidently influenced the course of studies through an increased need to pursue a sideline. Cologne medical students largely rejected tuition fees and did not recognize any advantages in terms of enhanced quality of studies.

  4. A study on effects of and stance over tuition fees

    PubMed Central

    Karay, Yassin; Matthes, Jan

    2016-01-01

    Aim: Regarding tuition fees (which in Germany have since been abrogated), putative drawbacks such as prolonged study duration have been suspected, while benefits are not clearly proven. We investigated whether tuition fees (500 Euro per semester) affected the course of studies of Cologne medical students and asked for students' stance on tuition fees. Methods: Of 1,324 students, we analyzed the rate of those passing their first medical exam ("Physikum") within the minimum time and the students' discontinuation rate, respectively. Regression analysis tested for putative influences of tuition fees and demographic factors. In an additional online survey, 400 students answered questions regarding the burden imposed by, and their stance on, tuition fees. Results: We find that fees affected neither the rate of Cologne students passing their first medical exam within the minimum time nor the students' discontinuation rate. According to the online survey, at times of tuition fees significantly more students did not attend courses as scheduled. Time spent on earning money was significantly increased. 51% of students who had to pay tuition fees, and 71% of those who never had to, stated that tuition fees were not justified. More than two thirds of students did not recognize any lasting benefit from tuition fees. Conclusion: Tuition fees did not affect the discontinuation rate or study duration of Cologne medical students. However, they evidently influenced the course of studies through an increased need to pursue a sideline. Cologne medical students largely rejected tuition fees and did not recognize any advantages in terms of enhanced quality of studies. PMID:26958654

  5. Magnetization Switching of a Co /Pt Multilayered Perpendicular Nanomagnet Assisted by a Microwave Field with Time-Varying Frequency

    NASA Astrophysics Data System (ADS)

    Suto, Hirofumi; Kanao, Taro; Nagasawa, Tazumi; Mizushima, Koichi; Sato, Rie

    2018-05-01

    Microwave-assisted magnetization switching (MAS) is attracting attention as a method for reversing nanomagnets with a high magnetic anisotropy by using a small-amplitude magnetic field. We experimentally study MAS of a perpendicularly magnetized nanomagnet by applying a microwave magnetic field with a time-varying frequency. Because the microwave field frequency can follow the nonlinear decrease of the resonance frequency, larger magnetization excitation than that in a constant-frequency microwave field is induced, which enhances the MAS effect. The switching field decreases almost linearly as the start value of the time-varying microwave field frequency increases, and it becomes smaller than the minimum switching field in a constant-frequency microwave field. To obtain this enhancement of the MAS effect, the end value of the time-varying microwave field frequency needs to be almost the same as or lower than the critical frequency for MAS in a constant-frequency microwave field. In addition, the frequency change typically needs to take 1 ns or longer to make the rate of change slow enough for the magnetization to follow the frequency change. This switching behavior is qualitatively explained by the theory based on the macrospin model.

  6. EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards

    EPA Pesticide Factsheets

    EPA's air monitoring regulations require the use of Protocol Gases to set air pollution monitors. This protocol balances the government's need for accuracy with the producers' need for flexibility, low cost, and minimum external oversight.

  7. Quantifying groundwater travel time near managed recharge operations using 35S as an intrinsic tracer

    DOE PAGES

    Urióstegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; ...

    2016-04-23

    Identifying groundwater retention times near managed aquifer recharge (MAR) facilities is a high priority for managing water quality, especially for operations that incorporate recycled wastewater. In order to protect public health, California guidelines for Groundwater Replenishment Reuse Projects require a minimum 2-6 month subsurface retention time for recycled water, depending on the level of disinfection, which highlights the importance of quantifying groundwater travel times on short time scales. This study developed and evaluated a new intrinsic tracer method using the naturally occurring radioisotope sulfur-35 (35S). The 87.5 day half-life of 35S is ideal for investigating groundwater travel times on the <1 year timescale of interest to MAR managers. Natural concentrations of 35S found in water as dissolved sulfate (35SO4) were measured in source waters and groundwater at the Rio Hondo Spreading Grounds in Los Angeles County, CA, and the Orange County Groundwater Recharge Facilities in Orange County, CA. 35SO4 travel times are comparable to travel times determined by well-established deliberate tracer studies. The study also revealed that 35SO4 in MAR source water can vary seasonally, and therefore careful characterization of 35SO4 is needed to accurately quantify groundwater travel time. More data is needed to fully assess whether or not this tracer could become a valuable tool for managers.

  8. Quantifying groundwater travel time near managed recharge operations using 35S as an intrinsic tracer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urióstegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.

    Identifying groundwater retention times near managed aquifer recharge (MAR) facilities is a high priority for managing water quality, especially for operations that incorporate recycled wastewater. In order to protect public health, California guidelines for Groundwater Replenishment Reuse Projects require a minimum 2-6 month subsurface retention time for recycled water, depending on the level of disinfection, which highlights the importance of quantifying groundwater travel times on short time scales. This study developed and evaluated a new intrinsic tracer method using the naturally occurring radioisotope sulfur-35 (35S). The 87.5 day half-life of 35S is ideal for investigating groundwater travel times on the <1 year timescale of interest to MAR managers. Natural concentrations of 35S found in water as dissolved sulfate (35SO4) were measured in source waters and groundwater at the Rio Hondo Spreading Grounds in Los Angeles County, CA, and the Orange County Groundwater Recharge Facilities in Orange County, CA. 35SO4 travel times are comparable to travel times determined by well-established deliberate tracer studies. The study also revealed that 35SO4 in MAR source water can vary seasonally, and therefore careful characterization of 35SO4 is needed to accurately quantify groundwater travel time. More data is needed to fully assess whether or not this tracer could become a valuable tool for managers.

  9. Quantifying groundwater travel time near managed recharge operations using 35S as an intrinsic tracer

    NASA Astrophysics Data System (ADS)

    Urióstegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; Clark, Jordan F.

    2016-12-01

    Identifying groundwater retention times near managed aquifer recharge (MAR) facilities is a high priority for managing water quality, especially for operations that incorporate recycled wastewater. To protect public health, California guidelines for Groundwater Replenishment Reuse Projects require a minimum 2-6 month subsurface retention time for recycled water depending on the level of disinfection, which highlights the importance of quantifying groundwater travel times on short time scales. This study developed and evaluated a new intrinsic tracer method using the naturally occurring radioisotope sulfur-35 (35S). The 87.5 day half-life of 35S is ideal for investigating groundwater travel times on the <1 year timescale of interest to MAR managers. Natural concentrations of 35S found in water as dissolved sulfate (35SO4) were measured in source waters and groundwater at the Rio Hondo Spreading Grounds in Los Angeles County, CA, and Orange County Groundwater Recharge Facilities in Orange County, CA. 35SO4 travel times are comparable to travel times determined by well-established deliberate tracer studies. The study also revealed that 35SO4 in MAR source water can vary seasonally and therefore careful characterization of 35SO4 is needed to accurately quantify groundwater travel time. More data is needed to fully assess whether or not this tracer could become a valuable tool for managers.
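
    The arithmetic behind all three records above is simple first-order decay: with a half-life of 87.5 days, the apparent travel time follows from the ratio of source to well activity. The sketch below is illustrative only; the activity values are hypothetical, and real applications must also correct for the seasonal variability in source-water 35SO4 noted above.

    ```python
    import math

    T_HALF_DAYS = 87.5   # half-life of 35S, as cited in the records above

    def travel_time_days(activity_source, activity_well):
        """Apparent travel time from radioactive decay:
        A(t) = A0 * exp(-lam * t), with lam = ln(2) / t_half."""
        lam = math.log(2) / T_HALF_DAYS
        return math.log(activity_source / activity_well) / lam

    # Hypothetical activities (arbitrary units): well water at 40% of source.
    print(f"apparent travel time: {travel_time_days(1.0, 0.4):.0f} days")  # ~116
    ```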

  10. Is outdoor vector control needed for malaria elimination? An individual-based modelling study.

    PubMed

    Zhu, Lin; Müller, Günter C; Marshall, John M; Arheart, Kristopher L; Qualls, Whitney A; Hlaing, WayWay M; Schlein, Yosef; Traore, Sekou F; Doumbia, Seydou; Beier, John C

    2017-07-03

    Residual malaria transmission has been reported in many areas even with adequate indoor vector control coverage, such as long-lasting insecticidal nets (LLINs). Increased insecticide resistance in Anopheles mosquitoes has reduced the efficacy of the widely used indoor tools and has been linked with an increase in outdoor malaria transmission. There are considerations of incorporating outdoor interventions into integrated vector management (IVM) to achieve malaria elimination; however, more information on which combinations of tools control transmission effectively is needed. A spatial individual-based model was modified to simulate the environment and malaria transmission activities in a hypothetical, isolated African village setting. LLINs and outdoor attractive toxic sugar bait (ATSB) stations were used as examples of indoor and outdoor interventions, respectively. Different interventions and lengths of efficacy periods were tested. Simulations continued for 420 days, and each simulation scenario was repeated 50 times. Mosquito populations, entomologic inoculation rates (EIRs), probabilities of local mosquito extinction, and the proportion of time when the annual EIR was reduced below one were compared between intervention types and efficacy periods. In the village setting with clustered houses, the combined intervention of 50% LLINs plus outdoor ATSBs significantly reduced the mosquito population and EIR in the short term, increased the probability of local mosquito extinction, and increased the time when the annual EIR was less than one per person, compared to 50% LLINs alone. Outdoor ATSBs alone significantly reduced the mosquito population in the short term, increased the probability of mosquito extinction, and increased the time when the annual EIR was less than one, compared to 50% LLINs alone, but there was no significant short-term difference in EIR between 50% LLINs and outdoor ATSBs. In the village setting with dispersed houses, the combined intervention of 50% LLINs plus outdoor ATSBs significantly reduced the mosquito population in the short term, increased the probability of mosquito extinction, and increased the time when the annual EIR was less than one per person, compared to 50% LLINs alone. Outdoor ATSBs alone significantly reduced the mosquito population in the short term, but there was no significant difference in the probability of mosquito extinction or in the time when the annual EIR was less than one between 50% LLINs and outdoor ATSBs, and there was no significant difference in EIR between the three interventions. A minimum efficacy period of 2 months is needed to bring out the best possible effect of the vector control tools, and to achieve long-term mosquito reduction a minimum efficacy period of 3 months is needed. The results highlight the value of incorporating outdoor vector control into IVM as a supplement to traditional indoor practices for malaria elimination in Africa, especially in village settings of clustered houses where LLINs alone are far from sufficient.

  11. 12 CFR 1750.4 - Minimum capital requirement computation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... amounts: (1) 2.50 percent times the aggregate on-balance sheet assets of the Enterprise; (2) 0.45 percent times the unpaid principal balance of mortgage-backed securities and substantially equivalent... last day of the quarter just ended (or the date for which the minimum capital report is filed, if...

  12. Use of satellite data in volcano monitoring

    NASA Technical Reports Server (NTRS)

    Mcclelland, Lindsay

    1987-01-01

    It is argued that Total Ozone Mapping Spectrometer (TOMS) data, especially data on sulfur dioxide detection in volcanic clouds, and weather satellite data complement each other. TOMS data are most useful for discovering previously unknown eruptions and indicating a minimum volume of SO2 produced by a given eruption. Once an eruption has been reported, weather satellite data can be used to accurately monitor its progress. To be used effectively, these data need to be analyzed jointly and in real time. Toward this end, it is hoped that full and timely use can be made of existing TOMS data, that a polar-orbiting TOMS can be launched in the near future, and that TOMS-type instruments can be included on future geostationary satellites.

  13. Insights on correlation dimension from dynamics mapping of three experimental nonlinear laser systems.

    PubMed

    McMahon, Christopher J; Toomey, Joshua P; Kane, Deb M

    2017-01-01

    We have analysed large data sets consisting of tens of thousands of time series from three Type B laser systems: a semiconductor laser in a photonic integrated chip, a semiconductor laser subject to optical feedback from a long free-space external cavity, and a solid-state laser subject to optical injection from a master laser. The lasers can deliver either constant, periodic, pulsed, or chaotic outputs when parameters such as the injection current and the level of external perturbation are varied. The systems represent examples of experimental nonlinear systems more generally and cover a broad range of complexity, including systematically varying complexity in some regions. In this work we have introduced a new procedure for semi-automatically interrogating experimental laser system output power time series to calculate the correlation dimension (CD) using the commonly adopted Grassberger-Procaccia algorithm. The new CD procedure is called the 'minimum gradient detection algorithm'. A value of minimum gradient is returned for all time series in a data set. In some cases this can be identified as a CD, with uncertainty. Applying the new 'minimum gradient detection algorithm' CD procedure, we obtained robust measurements of the correlation dimension for many of the time series measured from each laser system. By mapping the results across an extended parameter space for operation of each laser system, we were able to confidently identify regions of low CD (CD < 3) and assign these robust values for the correlation dimension. However, in all three laser systems, we were not able to measure the correlation dimension at all parts of the parameter space. Nevertheless, by mapping the staged progress of the algorithm, we were able to broadly classify the dynamical output of the lasers at all parts of their respective parameter spaces. For two of the laser systems this included displaying regions of high-complexity chaos and dynamic noise. These high-complexity regions are differentiated from regions where the time series are dominated by technical noise. This is the first time such differentiation has been achieved using a CD analysis approach. More can be known of the CD for a system when it is interrogated in a mapping context than from calculations using isolated time series. This has been shown for three laser systems, and the approach is expected to be useful in other areas of nonlinear science where large data sets are available and need to be semi-automatically analysed to provide real dimensional information about the complex dynamics. The CD/minimum gradient algorithm measure provides additional information that complements other measures of complexity and relative complexity, such as the permutation entropy, and conventional physical measurements.

  14. Insights on correlation dimension from dynamics mapping of three experimental nonlinear laser systems

    PubMed Central

    McMahon, Christopher J.; Toomey, Joshua P.

    2017-01-01

    Background: We have analysed large data sets consisting of tens of thousands of time series from three Type B laser systems: a semiconductor laser in a photonic integrated chip, a semiconductor laser subject to optical feedback from a long free-space external cavity, and a solid-state laser subject to optical injection from a master laser. The lasers can deliver either constant, periodic, pulsed, or chaotic outputs when parameters such as the injection current and the level of external perturbation are varied. The systems represent examples of experimental nonlinear systems more generally and cover a broad range of complexity, including systematically varying complexity in some regions. Methods: In this work we have introduced a new procedure for semi-automatically interrogating experimental laser system output power time series to calculate the correlation dimension (CD) using the commonly adopted Grassberger-Procaccia algorithm. The new CD procedure is called the 'minimum gradient detection algorithm'. A value of minimum gradient is returned for all time series in a data set. In some cases this can be identified as a CD, with uncertainty. Findings: Applying the new 'minimum gradient detection algorithm' CD procedure, we obtained robust measurements of the correlation dimension for many of the time series measured from each laser system. By mapping the results across an extended parameter space for operation of each laser system, we were able to confidently identify regions of low CD (CD < 3) and assign these robust values for the correlation dimension. However, in all three laser systems, we were not able to measure the correlation dimension at all parts of the parameter space. Nevertheless, by mapping the staged progress of the algorithm, we were able to broadly classify the dynamical output of the lasers at all parts of their respective parameter spaces. For two of the laser systems this included displaying regions of high-complexity chaos and dynamic noise. These high-complexity regions are differentiated from regions where the time series are dominated by technical noise. This is the first time such differentiation has been achieved using a CD analysis approach. Conclusions: More can be known of the CD for a system when it is interrogated in a mapping context than from calculations using isolated time series. This has been shown for three laser systems, and the approach is expected to be useful in other areas of nonlinear science where large data sets are available and need to be semi-automatically analysed to provide real dimensional information about the complex dynamics. The CD/minimum gradient algorithm measure provides additional information that complements other measures of complexity and relative complexity, such as the permutation entropy, and conventional physical measurements. PMID:28837602
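
    For orientation, the classical Grassberger-Procaccia calculation that underlies both records above is sketched below. This is not the authors' 'minimum gradient detection algorithm'; the time series is a random placeholder and the embedding parameters are arbitrary.

    ```python
    import numpy as np

    def correlation_integral(x, dim=3, delay=1, radii=None):
        """Delay-embed x and return (radii, C(r)), the fraction of distinct
        pairs of embedded points closer than each radius r."""
        n = len(x) - (dim - 1) * delay
        emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        d = d[np.triu_indices(n, k=1)]               # distinct pairs only
        if radii is None:
            radii = np.logspace(-2, 0, 20) * d.max()
        return radii, np.array([(d < r).mean() for r in radii])

    # The CD estimate is the slope of log C(r) versus log r in the scaling
    # region; for white noise it approaches the embedding dimension.
    x = np.random.default_rng(0).standard_normal(500)
    r, c = correlation_integral(x)
    mask = c > 0
    slope = np.polyfit(np.log(r[mask]), np.log(c[mask]), 1)[0]
    print(f"estimated correlation dimension ~ {slope:.2f}")
    ```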

  15. Non-invasive imaging methods applied to neo- and paleontological cephalopod research

    NASA Astrophysics Data System (ADS)

    Hoffmann, R.; Schultz, J. A.; Schellhorn, R.; Rybacki, E.; Keupp, H.; Gerden, S. R.; Lemanis, R.; Zachow, S.

    2013-11-01

    Several non-invasive methods are common practice in the natural sciences today. Here we present how they can be applied to, and contribute to, current topics in cephalopod (paleo-)biology. The different methods are compared in terms of the time necessary to acquire the data, the amount of data, accuracy/resolution, the minimum and maximum size of objects that can be studied, the degree of post-processing needed, and availability. The main application of the methods lies in morphometry and volumetry of cephalopod shells, in order to improve our understanding of the diversity and disparity, functional morphology, and biology of extinct and extant cephalopods.

  16. 76 FR 64005 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-17

    .... The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and the need.... Part 97 is amended to read as follows: Effective 20 OCT 2011 Albert Lea, MN, Albert Lea Muni, RNAV (GPS) RWY 17, Amdt 2 Albert Lea, MN, Albert Lea Muni, RNAV (GPS) RWY 35, Amdt 1 Albert Lea, MN, Albert Lea...

  17. 26 CFR 1.58-9 - Application of the tax benefit rule to the minimum tax for taxable years beginning prior to 1987.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... no current tax benefit is derived because available credits would have reduced or eliminated the.... However, any credits that, because of such preference items, are not needed for use against regular tax.... The freed-up credits are then reduced by an amount equal to such portion of the minimum tax. (2...

  18. A sense of urgency: Evaluating the link between clinical trial development time and the accrual performance of cancer therapy evaluation program (NCI-CTEP) sponsored studies.

    PubMed

    Cheng, Steven K; Dietrich, Mary S; Dilts, David M

    2010-11-15

    Postactivation barriers to oncology clinical trial accrual are well documented; however, potential barriers prior to trial opening are not. We investigate one such barrier: trial development time. National Cancer Institute Cancer Therapy Evaluation Program (CTEP)-sponsored trials for all therapeutic, nonpediatric phase I, I/II, II, and III studies activated between 2000 and 2004 were investigated over an 8-year period (n = 419). Successful trials were those achieving 100% of the minimum accrual goal. Time to open a study was the calendar time from initial CTEP submission to trial activation. Multivariate logistic regression analysis was used to calculate unadjusted and adjusted odds ratios (ORs), controlling for study phase and size of expected accruals. Among the CTEP-approved oncology trials, 37.9% (n = 221) failed to attain the minimum accrual goals, with 70.8% (n = 14) of phase III trials showing poor accrual. A total of 16,474 patients (42.5% of accruals) were enrolled in studies that failed to achieve the projected minimum accrual goal. Trials requiring less than 12 months of development were significantly more likely to achieve accrual goals (OR, 2.15; 95% confidence interval, 1.29-3.57; P = 0.003) than trials with the median development time of 12 to 18 months. Trials requiring a development time of greater than 24 months were significantly less likely to achieve accrual goals (OR, 0.40; 95% confidence interval, 0.20-0.78; P = 0.011) than trials with the median development time. A large percentage of oncology clinical trials do not achieve minimum projected accruals. Trial development time appears to be one important predictor of the likelihood of successfully achieving the minimum accrual goals. ©2010 AACR.
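
    As a reminder of the arithmetic behind the reported odds ratios: the paper's ORs were adjusted by multivariate logistic regression, whereas the unadjusted 2x2 version below, with invented counts, is only meant to show the mechanics.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Unadjusted OR and 95% CI from a 2x2 table:
        a/b = fast-development trials meeting/missing accrual goals,
        c/d = slow-development trials meeting/missing accrual goals."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts for illustration only.
    print(odds_ratio_ci(50, 25, 35, 40))
    ```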

  19. 12 CFR Appendix M1 to Part 1026 - Repayment Disclosures

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... terms of a cardholder's account that will expire in a fixed period of time, as set forth by the card... estimates. (1) Minimum payment formulas. When calculating the minimum payment repayment estimate, card... calculate the minimum payment amount for special purchases, such as a “club plan purchase.” Also, assume...

  20. 12 CFR Appendix M1 to Part 1026 - Repayment Disclosures

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... terms of a cardholder's account that will expire in a fixed period of time, as set forth by the card... estimates. (1) Minimum payment formulas. When calculating the minimum payment repayment estimate, card... calculate the minimum payment amount for special purchases, such as a “club plan purchase.” Also, assume...

  1. 5 CFR 875.212 - Is there a minimum application age?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Is there a minimum application age? 875.212 Section 875.212 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE... application age? Yes, there is a minimum application age. You must be at least 18 years old at the time you...

  2. 5 CFR 875.212 - Is there a minimum application age?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Is there a minimum application age? 875.212 Section 875.212 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE... application age? Yes, there is a minimum application age. You must be at least 18 years old at the time you...

  3. 5 CFR 875.212 - Is there a minimum application age?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Is there a minimum application age? 875.212 Section 875.212 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE... application age? Yes, there is a minimum application age. You must be at least 18 years old at the time you...

  4. 5 CFR 875.212 - Is there a minimum application age?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Is there a minimum application age? 875.212 Section 875.212 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE... application age? Yes, there is a minimum application age. You must be at least 18 years old at the time you...

  5. 5 CFR 875.212 - Is there a minimum application age?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Is there a minimum application age? 875.212 Section 875.212 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE... application age? Yes, there is a minimum application age. You must be at least 18 years old at the time you...

  6. The Effect of Minimum Wage Rates on High School Completion

    ERIC Educational Resources Information Center

    Warren, John Robert; Hamrock, Caitlin

    2010-01-01

    Does increasing the minimum wage reduce the high school completion rate? Previous research has suffered from (1) narrow time horizons, (2) potentially inadequate measures of states' high school completion rates, and (3) potentially inadequate measures of minimum wage rates. Overcoming each of these limitations, we analyze the impact of changes in…

  7. 29 CFR 780.301 - Other pertinent statutory provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Employment in Agriculture That Is Exempted From the Minimum Wage and Overtime Pay Requirements Under Section... minimum wage protection (section 6(a)(5)) for agriculture workers for the first time sought to provide a minimum wage floor for the farmworkers on large farms or agri-business enterprises. The section 13(a)(6)(A...

  8. Implications of Responsive Space on the Flight Software Architecture

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but within the development infrastructure and software life-cycle process elements as well. The run-time element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight-quality software. Very rapid response times go even further and imply little or no new software development, requiring instead only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually, with significant benefits, but it is when they are combined that they can have the greatest impact on Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the run-time, infrastructure, and process elements needed for rapid integration with the Core Flight Software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.

  9. A sampling plan for conduit-flow karst springs: Minimizing sampling cost and maximizing statistical utility

    USGS Publications Warehouse

    Currens, J.C.

    1999-01-01

    Analytical data for nitrate and triazines from 566 samples collected over a 3-year period at Pleasant Grove Spring, Logan County, KY, were statistically analyzed to determine the minimum data set needed to calculate meaningful yearly averages for a conduit-flow karst spring. Results indicate that a biweekly sampling schedule augmented with bihourly samples from high-flow events will provide meaningful suspended-constituent and dissolved-constituent statistics. Unless collected over an extensive period of time, daily samples may not be representative and may also be autocorrelated. All high-flow events resulting in a significant deflection of a constituent from base-line concentrations should be sampled. Either the geometric mean or the flow-weighted average of the suspended constituents should be used. If automatic samplers are used, then they may be programmed to collect storm samples as frequently as every few minutes to provide details on the arrival time of constituents of interest. However, only samples collected bihourly should be used to calculate averages. By adopting a biweekly sampling schedule augmented with high-flow samples, the need to continuously monitor discharge, or to search for and analyze existing data to develop a statistically valid monitoring plan, is lessened.
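
    The two averages recommended above are easy to state precisely. The sketch below (with made-up sample values) shows both, and why the flow-weighted mean gives high-flow events their proper influence.

    ```python
    import math

    def geometric_mean(concs):
        """Geometric mean: exp of the mean of the log concentrations."""
        return math.exp(sum(math.log(c) for c in concs) / len(concs))

    def flow_weighted_mean(concs, flows):
        """Flow-weighted average: sum(C_i * Q_i) / sum(Q_i)."""
        return sum(c * q for c, q in zip(concs, flows)) / sum(flows)

    # Hypothetical biweekly nitrate samples (mg/L) with discharges (L/s);
    # the third sample is a high-flow event.
    concs = [2.1, 2.4, 8.0, 3.0]
    flows = [40.0, 45.0, 300.0, 60.0]
    print(f"geometric mean:     {geometric_mean(concs):.2f} mg/L")
    print(f"flow-weighted mean: {flow_weighted_mean(concs, flows):.2f} mg/L")
    ```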

  10. 49 CFR 238.230 - Safety appliances-new equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... a minimum weld strength, based on yield, of three times the strength of the number of SAE grade 2, 1...; (v) The weld is designed for infinite fatigue life in the application that it will be placed; (vi... upon request. At a minimum, this record shall include the date, time, location, identification of the...

  11. Recent Studies of the Behavior of the Sun's White-Light Corona Over Time

    NASA Technical Reports Server (NTRS)

    SaintCyr, O. C.; Young, D. E.; Pesnell, W. D.; Lecinski, A.; Eddy, J.

    2008-01-01

    Predictions of upcoming solar cycles are often related to the nature and dynamics of the Sun's polar magnetic field and its influence on the corona. For the past 30 years we have a more-or-less continuous record of the Sun's white-light corona from ground-based and space-based coronagraphs. Over that interval, the large-scale features of the corona have varied in what we now consider a "predictable" fashion: complex, showing multiple streamers at all latitudes, during solar activity maximum; and a simple dipolar shape aligned with the rotational pole during solar minimum. Over the past three decades the white-light corona appears to be a better indicator of "true" solar minimum than sunspot number, since sunspots disappear for months (even years) at solar minimum. Since almost all predictions of the timing of the next solar maximum depend on the timing of solar minimum, the white-light corona is a potentially important observational discriminator for future predictors. In this contribution we describe recent work quantifying the large-scale appearance of the Sun's corona in order to correlate it with the sunspot record, especially around solar minimum. These three decades can be extended with the HAO archive of eclipse photographs, which, although sparse compared to the coronagraphic coverage, reaches back to 1869. A more extensive understanding of this proxy would give researchers confidence in using the white-light corona as an indicator of solar-minimum conditions.

  12. Architecting the Communication and Navigation Networks for NASA's Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Bhassin, Kul B.; Putt, Chuck; Hayden, Jeffrey; Tseng, Shirley; Biswas, Abi; Kennedy, Brian; Jennings, Esther H.; Miller, Ron A.; Hudiburg, John; Miller, Dave

    2007-01-01

    NASA is planning a series of short- and long-duration human and robotic missions to explore the Moon and then Mars. A key objective of the missions is to grow, through a series of launches, a system-of-systems communication, navigation, and timing infrastructure at minimum cost while providing a network-centric infrastructure that maximizes the exploration capabilities and science return. There is a strong need to use architecting processes in the mission pre-formulation stage to describe the systems, interfaces, and interoperability needed to implement multiple space communication systems that are deployed over time, yet support interoperability with each deployment phase and with 20 years of legacy systems. In this paper we present a process for defining the architecture of the communications, navigation, and networks needed to support future space explorers with the best adaptable and evolvable network-centric space exploration infrastructure. The process steps presented are: 1) architecture decomposition, 2) defining mission systems and their interfaces, 3) developing the communication, navigation, and networking architecture, and 4) integrating systems, operational and technical views and viewpoints. We demonstrate the process through the architecture development of the communication network for upcoming NASA space exploration missions.

  13. Analysis of Trajectory Parameters for Probe and Round-Trip Missions to Venus

    NASA Technical Reports Server (NTRS)

    Dugan, James F., Jr.; Simsic, Carl R.

    1960-01-01

    For one-way transfers between Earth and Venus, charts are obtained that show velocity, time, and angle parameters as functions of the eccentricity and semilatus rectum of the Sun-focused vehicle conic. From these curves, others are obtained that are useful in planning one-way and round-trip missions to Venus. The analysis is characterized by circular coplanar planetary orbits, successive two-body approximations, impulsive velocity changes, and circular parking orbits at 1.1 planet radii. For round trips the mission time considered ranges from 65 to 788 days, while wait time spent in the parking orbit at Venus ranges from 0 to 467 days. Individual velocity increments, one-way travel times, and departure dates are presented for round trips requiring the minimum total velocity increment. For both single-pass and orbiting Venusian probes, the time span available for launch becomes appreciable with only a small increase in velocity-increment capability above the minimum requirement. Velocity-increment increases are much more effective in reducing travel time for single-pass probes than they are for orbiting probes. Round trips composed of a direct route along an ellipse tangent to Earth's orbit and an aphelion route result in the minimum total velocity increment for wait times less than 100 days and mission times ranging from 145 to 612 days. Minimum-total-velocity-increment trips may be taken along perihelion-perihelion routes for wait times ranging from 300 to 467 days. These wait times occur during missions lasting from 640 to 759 days.
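
    Under the same circular-coplanar, impulsive assumptions, the tangent-ellipse ("Hohmann") transfer underlying the direct routes above can be reproduced in a few lines. The sketch below gives heliocentric velocity increments only and ignores planetary escape, capture, and the 1.1-radii parking orbits used in the report, so the numbers are illustrative rather than comparable to the report's totals. The roughly 146-day one-way time it prints is consistent with the shortest direct-route missions described above.

    ```python
    import math

    MU_SUN = 1.32712440018e20        # Sun's GM, m^3/s^2
    AU = 1.495978707e11              # astronomical unit, m
    R_EARTH, R_VENUS = 1.0 * AU, 0.723 * AU

    def hohmann(r1, r2, mu=MU_SUN):
        """Departure/arrival delta-v and travel time for a two-impulse
        transfer ellipse tangent to both circular, coplanar orbits."""
        a = (r1 + r2) / 2.0                          # transfer semi-major axis
        dv1 = abs(math.sqrt(mu / r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1))
        dv2 = abs(math.sqrt(mu / r2) * (1 - math.sqrt(2 * r1 / (r1 + r2))))
        t = math.pi * math.sqrt(a ** 3 / mu)         # half the ellipse period
        return dv1, dv2, t / 86400.0                 # days

    dv1, dv2, days = hohmann(R_EARTH, R_VENUS)
    print(f"dv1 = {dv1:.0f} m/s, dv2 = {dv2:.0f} m/s, one-way time = {days:.0f} days")
    ```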

  14. Numerical investigation of implementation of air-earth boundary by acoustic-elastic boundary approach

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2007-01-01

    The need for incorporating the traction-free condition at the air-earth boundary for finite-difference modeling of seismic wave propagation has been discussed widely. A new implementation has been developed for simulating elastic wave propagation in which the free-surface condition is replaced by an explicit acoustic-elastic boundary. Detailed comparisons of seismograms with different implementations for the air-earth boundary were undertaken using the (2,2) (the finite-difference operators are second order in time and space) and the (2,6) (second order in time and sixth order in space) standard staggered-grid (SSG) schemes. Methods used in these comparisons to define the air-earth boundary included the stress image method (SIM), the heterogeneous approach, the scheme of modifying material properties based on a transversely isotropic medium approach, the acoustic-elastic boundary approach, and an analytical approach. The method proposed achieves the same or higher accuracy of modeled body waves relative to the SIM. Rayleigh waves calculated using the explicit acoustic-elastic boundary approach differ slightly from those calculated using the SIM. Numerical results indicate that when using the (2,2) SSG scheme for the SIM and our new method, a spatial sampling of 16 points per minimum wavelength is sufficient to achieve 90% accuracy, and 32 points per minimum wavelength achieves 95% accuracy in modeled Rayleigh waves. When using the (2,6) SSG scheme for the two methods, a spatial sampling of eight points per minimum wavelength achieves 95% accuracy in modeled Rayleigh waves. Our proposed method is physically reasonable and, based on dispersion analysis of simulated seismograms from a layered half-space model, is highly accurate. As a bonus, our proposed method is easy to program and slightly faster than the SIM. © 2007 Society of Exploration Geophysicists.
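
    The points-per-minimum-wavelength rules quoted above translate directly into a maximum grid spacing, since the minimum wavelength is v_min / f_max. A small illustration (model values invented):

    ```python
    def max_grid_spacing(v_min, f_max, points_per_wavelength):
        """Largest spatial step that still samples the minimum wavelength
        (v_min / f_max) with the required number of grid points."""
        return v_min / (f_max * points_per_wavelength)

    # Hypothetical Rayleigh-wave model: v_min = 200 m/s, f_max = 50 Hz.
    for n in (8, 16, 32):    # accuracy tiers discussed in the record above
        dx = max_grid_spacing(200.0, 50.0, n)
        print(f"{n:2d} points/wavelength -> dx <= {dx:.3f} m")
    ```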

  15. Model System to Define Pharmacokinetic Requirements for Antimalarial Drug Efficacy

    PubMed Central

    Bakshi, Rahul P.; Nenortas, Elizabeth; Tripathi, Abhai K.; Sullivan, David J.; Shapiro, Theresa A.

    2013-01-01

    Malaria presents a tremendous public health burden and new therapies are needed. Massive compound libraries screened against Plasmodium falciparum have yielded thousands of lead compounds, resulting in an acute need for rational criteria to select the best candidates for development. We reasoned that, akin to antibacterials, antimalarials might have an essential pharmacokinetic requirement for efficacy: action governed either by total exposure or peak concentration (AUC/CMAX), or by duration above a defined minimum concentration (time above minimum inhibitory concentration, TMIC). We devised an in vitro system for P. falciparum capable of mimicking the dynamic fluctuations of a drug in vivo. Using this apparatus, we find that chloroquine is TMIC-dependent while the efficacy of artemisinin is driven by CMAX. The latter was confirmed in a mouse model of malaria. These characteristics can explain the clinical success of two antimalarial drugs with widely different kinetics in humans. Chloroquine, which persists for weeks, is ideally suited to its TMIC mechanism, whereas great efficacy despite short exposure (t1/2 in blood 3 h or less) is attained by CMAX-driven artemisinins. This validated preclinical model system can be used to select those antimalarial lead compounds whose CMAX or TMIC requirement for efficacy matches the pharmacokinetics obtained in vivo. The apparatus can also be used to explore the kinetic dependence of other pharmacodynamic endpoints in parasites. PMID:24089407
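
    The CMAX-versus-TMIC distinction is easy to see in a one-compartment, single-dose model, where C(t) = C0 * exp(-k*t). In the sketch below (all parameter values invented), the two hypothetical drugs have the same total exposure (AUC) but very different peak concentrations and times above MIC, mirroring the artemisinin/chloroquine contrast described above.

    ```python
    import math

    def one_compartment(c0, k, mic):
        """Single-dose, one-compartment kinetics C(t) = c0 * exp(-k*t).
        Returns (Cmax, AUC from 0 to infinity, time above MIC)."""
        cmax = c0
        auc = c0 / k
        t_mic = math.log(c0 / mic) / k if c0 > mic else 0.0
        return cmax, auc, t_mic

    # Two invented drugs with equal AUC: a short-lived, high-peak drug
    # (t1/2 = 3 h) and a persistent, low-peak drug (t1/2 = 10 days).
    fast = one_compartment(c0=8.0, k=math.log(2) / 3.0, mic=1.0)
    slow = one_compartment(c0=0.1, k=math.log(2) / 240.0, mic=0.05)
    print(f"fast: Cmax={fast[0]:.1f}, AUC={fast[1]:.1f}, T>MIC={fast[2]:.1f} h")
    print(f"slow: Cmax={slow[0]:.2f}, AUC={slow[1]:.1f}, T>MIC={slow[2]:.0f} h")
    ```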

  16. Biochemical methane potential (BMP) tests: Reducing test time by early parameter estimation.

    PubMed

    Da Silva, C; Astals, S; Peces, M; Campos, J L; Guerrero, L

    2018-01-01

    The biochemical methane potential (BMP) test is a key analytical technique for assessing the implementation and optimisation of anaerobic biotechnologies. However, the technique is characterised by long testing times (from 20 to >100 days), which is not suitable for waste utilities, consulting companies or plant operators whose decision-making processes cannot be held up for such long periods. This study develops a statistically robust mathematical strategy using sensitivity functions for early prediction of BMP first-order model parameters, i.e., the methane yield (B0) and the first-order kinetic constant (k). The minimum testing time for early parameter estimation showed a potential correlation with the k value, where (i) slowly biodegradable substrates (k ≤ 0.1 d-1) have minimum testing times of ≥15 days, (ii) moderately biodegradable substrates (0.1
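
    The first-order model named above is B(t) = B0 * (1 - exp(-k*t)); early parameter estimation amounts to fitting B0 and k on a truncated time series. A minimal sketch using nonlinear least squares (the data and 10-day cutoff are illustrative, not the study's):

      # Early BMP parameter estimation: fit B(t) = B0*(1 - exp(-k*t)) to a
      # truncated methane-yield series. Data points below are invented.
      import numpy as np
      from scipy.optimize import curve_fit

      def first_order(t, B0, k):
          return B0 * (1.0 - np.exp(-k * t))

      days = np.array([0, 1, 2, 4, 6, 8, 10], dtype=float)
      yield_ml_g = np.array([0, 110, 190, 300, 360, 395, 415], dtype=float)

      # Fit on early data only; bounds keep B0 and k physically meaningful.
      (B0, k), _ = curve_fit(first_order, days, yield_ml_g,
                             p0=(400.0, 0.2), bounds=(0, np.inf))
      print(f"B0 ~ {B0:.0f} mL CH4/g, k ~ {k:.2f} 1/d")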

  17. PAVENET OS: A Compact Hard Real-Time Operating System for Precise Sampling in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Saruwatari, Shunsuke; Suzuki, Makoto; Morikawa, Hiroyuki

    The paper presents a compact hard real-time operating system for wireless sensor nodes called PAVENET OS. PAVENET OS provides hybrid multithreading: preemptive multithreading and cooperative multithreading. Both forms of multithreading are optimized for the two kinds of tasks found in wireless sensor networks: real-time tasks and best-effort tasks. PAVENET OS can efficiently perform hard real-time tasks that cannot be performed by TinyOS. Through quantitative evaluation, the paper demonstrates that hybrid multithreading achieves compactness and low overheads comparable to those of TinyOS. The evaluation results show that PAVENET OS performs 100 Hz sensor sampling with 0.01% jitter while performing wireless communication tasks, whereas optimized TinyOS has 0.62% jitter. In addition, PAVENET OS has a small footprint and low overheads (minimum RAM size: 29 bytes, minimum ROM size: 490 bytes, minimum task switch time: 23 cycles).

  18. Preoperative fasting for elective surgery in a regional hospital in Oman.

    PubMed

    Abdullah Al Maqbali, Mohammed

    2016-07-28

    A fasting period before anesthesia is necessary to avoid the aspiration of stomach contents, which can be life-threatening. Guidelines from professional societies in the USA and UK recommend that healthy patients fast for 6 hours from solid food and 2 hours from liquids. Despite this, many institutions still practice nil-by-mouth after midnight. This can affect the patient's recovery after surgery and increase the length of stay in hospital. The aim of this study was to assess the duration of preoperative fasting among adult patients undergoing elective surgery. A prospective study was conducted to identify fasting times and complications among surgical patients undergoing elective surgery over a 4-month period. The patients were asked about their preoperative fasting times and any complications; demographic data were taken from the patients' files. A total of 169 patients were included in the study, 88 male and 81 female. The minimum and maximum fasting times for food were 7 hours and 19 hours, respectively; all the patients fasted from food for longer than the recommended time. The minimum and maximum fasting times for fluids were 4 hours and 19 hours, respectively; all the patients fasted from fluids for longer than the recommended time. Most of the patients fasted from food and fluids for more than the time recommended by the American Society of Anesthesiologists, the Royal College of Nursing, the Association of Anaesthetists of Great Britain and Ireland and the Royal College of Anaesthetists. Excessive fasting can lead to discomfort and possible morbidity in surgical patients. The surgical team needs to collaborate to reduce fasting times by revising the operative list.

  19. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE PAGES

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...

    2016-10-18

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] samples run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (with 58 and 291 replicates, respectively), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset, and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of the training and good analytical procedures needed to generate these data. When combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.

  20. Addressing the minimum fleet problem in on-demand urban mobility.

    PubMed

    Vazifeh, M M; Santi, P; Resta, G; Strogatz, S H; Ratti, C

    2018-05-01

    Information and communication technologies have opened the way to new solutions for urban mobility that provide better ways to match individuals with on-demand vehicles. However, a fundamental unsolved problem is how best to size and operate a fleet of vehicles, given a certain demand for personal mobility. Previous studies [1-5] either do not provide a scalable solution or require changes in human attitudes towards mobility. Here we provide a network-based solution to the following 'minimum fleet problem': given a collection of trips (specified by origin, destination and start time), determine the minimum number of vehicles needed to serve all the trips without incurring any delay to the passengers. By introducing the notion of a 'vehicle-sharing network', we present an optimal, computationally efficient solution to the problem, as well as a nearly optimal solution amenable to real-time implementation. We test both solutions on a dataset of 150 million taxi trips taken in the city of New York over one year [6]. The real-time implementation of the method with near-optimal service levels allows a 30 per cent reduction in fleet size compared to current taxi operation. Although constraints on driver availability and the existence of abnormal trip demands may lead to a relatively larger optimal fleet size than predicted here, the fleet size remains robust for a wide range of variations in historical trip demand. These predicted reductions in fleet size follow directly from a reorganization of taxi dispatching that could be implemented with a simple urban app; they do not assume ride sharing [7-9], nor require changes to regulations, business models, or human attitudes towards mobility to become effective. Our results could become even more relevant in the years ahead as fleets of networked, self-driving cars become commonplace [10-14].
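
    The 'minimum fleet problem' described above reduces to a minimum path cover of a trip-chaining DAG, solvable via maximum bipartite matching: the minimum fleet equals the number of trips minus the size of the maximum matching. A toy sketch (trips and repositioning times are invented; this is not the authors' New York pipeline):

      # Minimum fleet as minimum path cover: edge i -> j if one vehicle can
      # finish trip i and reach trip j before it starts. Toy data only.
      import networkx as nx

      # (origin, destination, start_time, end_time); reposition time is 1 unit
      trips = [(0, 1, 0, 2), (1, 2, 3, 5), (2, 0, 6, 8), (0, 2, 1, 4)]

      def can_chain(i, j, reposition=1):
          """Can one vehicle serve trip j right after finishing trip i?"""
          same_place = trips[i][1] == trips[j][0]
          in_time = trips[i][3] + reposition <= trips[j][2]
          return same_place and in_time

      G = nx.Graph()
      for i in range(len(trips)):
          for j in range(len(trips)):
              if i != j and can_chain(i, j):
                  G.add_edge(f"out{i}", f"in{j}")

      left = {n for n in G if n.startswith("out")}
      matching = nx.bipartite.hopcroft_karp_matching(G, top_nodes=left)
      min_fleet = len(trips) - len(matching) // 2
      print(min_fleet)   # -> 2: one vehicle chains trips 0->1->2, another serves trip 3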

  1. Tracking of climatic niche boundaries under recent climate change.

    PubMed

    La Sorte, Frank A; Jetz, Walter

    2012-07-01

    1. Global climate has changed significantly during the past 30 years, especially in northern temperate regions, which have experienced poleward shifts in temperature regimes. While there is evidence that some species have responded by moving their distributions to higher latitudes, the efficiency of this response in tracking species' climatic niche boundaries over time has yet to be addressed. 2. Here, we provide a continental assessment of the temporal structure of species responses to recent spatial shifts in climatic conditions. We examined geographic associations with minimum winter temperature for 59 species of winter avifauna at 476 Christmas Bird Count circles in North America from 1975 to 2009 under three sampling schemes that account for spatial and temporal sampling effects. 3. Minimum winter temperature associated with species occurrences showed an overall increase, with a weakening trend after 1998. Species displayed highly variable responses that, on average and across sampling schemes, contained a strong lag effect that weakened in strength over time. In general, the conservation of minimum winter temperature was relevant when all species were considered together, but only after an initial lag period (c. 35 years) was overcome. The delayed niche tracking observed at the combined species level was likely supported by the post-1998 lull in the warming trend. 4. There are limited geographic and ecological explanations for the observed variability, suggesting that the efficiency of species' responses under climate change is likely to be highly idiosyncratic and difficult to predict. This outcome is likely to be even more pronounced, and time lags more persistent, for less vagile taxa, particularly during periods of consistent or accelerating warming. Current modelling efforts and conservation strategies need to better appreciate the variation, strength and duration of lag effects and their association with climatic variability. Conservation strategies in particular will benefit from identifying and maintaining dispersal corridors that accommodate diverging dispersal strategies and timetables. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.

  2. Long-term variability in sugarcane bagasse feedstock compositional methods: Sources and magnitude of analytical variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] samples run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (with 58 and 291 replicates, respectively), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset, and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of the training and good analytical procedures needed to generate these data. When combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.

  3. Human rights assessment in Parc Jean Marie Vincent, Port-au-Prince, Haiti.

    PubMed

    Cullen, Kimberly A; Ivers, Louise C

    2010-12-15

    Months after a 7.0 magnitude earthquake hit Port-au-Prince, Haiti, over one million remain homeless and living in spontaneous internally displaced person (IDP) camps. Billions of dollars from aid organizations and government agencies have been pledged toward the relief effort, yet many basic human needs, including food, shelter, and sanitation, continue to be unmet. The Sphere Project, "Humanitarian Charter and Minimum Standards in Disaster Response," identifies the minimum standards to be attained in disaster response. From a human rights perspective and utilizing key indicators from the Sphere Project as benchmarks, this article reports on an assessment of the living conditions approximately 12 weeks after the earthquake in Parc Jean Marie Vincent, a spontaneous IDP camp in Port-au-Prince. A stratified random sample of households in the camp, proportionate to the number of families living in each sector, was selected. Interview questions were designed to serve as "key indicators" for the Sphere Project minimum standards. A total of 486 interviews were completed, representing approximately 5% of households in each of the five sectors of the camp. Our assessment identified the relative achievements and shortcomings in the provision of relief services in Parc Jean Marie Vincent. At the time of this survey, the Sphere Project minimum standards for access to health care and quantity of water per person per day were being met. Food, shelter, sanitation, and security were below minimum accepted standard and of major concern. The formal assessment reported here was completed by September 2010, and is necessarily limited to conditions in Haiti before the cholera outbreak in October. Copyright © 2010 Cullen and Ivers. This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.

  4. Solar Drivers of 11-yr and Long-Term Cosmic Ray Modulation

    NASA Technical Reports Server (NTRS)

    Cliver, E. W.; Richardson, I. G.; Ling, A. G.

    2011-01-01

    In the current paradigm for the modulation of galactic cosmic rays (GCRs), diffusion is taken to be the dominant process during solar maxima while drift dominates at minima. Observations during the recent solar minimum challenge the pre-eminence of drift at such times. In 2009, the approx. 2 GV GCR intensity measured by the Newark neutron monitor increased by approx. 5% relative to its maximum value two cycles earlier, even though the average tilt angle in 2009 was slightly larger than that in 1986 (approx. 20 deg vs. approx. 14 deg), while solar wind B was significantly lower (approx. 3.9 nT vs. approx. 5.4 nT). A decomposition of the solar wind into high-speed streams, slow solar wind, and coronal mass ejections (CMEs; including postshock flows) reveals that the Sun transmits its message of changing magnetic field (diffusion coefficient) to the heliosphere primarily through CMEs at solar maximum and high-speed streams at solar minimum. Long-term reconstructions of solar wind B are in general agreement for the approx. 1900-present interval and can be used to reliably estimate GCR intensity over this period. For earlier epochs, however, a recent Be-10-based reconstruction covering the past approx. 10^4 years shows nine abrupt and relatively short-lived drops of B to approximately 0 nT or below, with the first of these corresponding to the Spörer minimum. Such dips are at variance with the recent suggestion that B has a minimum or floor value of approx. 2.8 nT. A floor in solar wind B implies a ceiling in the GCR intensity (a permanent modulation of the local interstellar spectrum) at a given energy/rigidity. The 30-40% increase in the intensity of 2.5 GV electrons observed by Ulysses during the recent solar minimum raises an interesting paradox that will need to be resolved.

  5. Magazines as wilderness information sources: assessing users' general wilderness knowledge and specific leave no trace knowledge

    Treesearch

    John J. Confer; Andrew J. Mowen; Alan K. Graefe; James D. Absher

    2000-01-01

    The Leave No Trace (LNT) educational program has the potential to provide wilderness users with useful minimum impact information. For LNT to be effective, managers need to understand who is most/least aware of minimum impact practices and how to expose users to LNT messages. This study examined LNT knowledge among various user groups at an Eastern wilderness area and...

  6. An Analysis of Minimum Service Standards (MSS) in Basic Education: A Case Study at Magelang Municipality, Central Java, Indonesia

    ERIC Educational Resources Information Center

    Haryati, Sri

    2014-01-01

    The study aims at analyzing the achievement of Minimum Service Standards (MSS) in Basic Education through a case study at Magelang Municipality. The findings shall be used as a starting point to predict the needs to meet MSS by 2015 and to provide strategies for achievement. Both primary and secondary data were used in the study investigating the…

  7. Minimum number of measurements for evaluating soursop (Annona muricata L.) yield.

    PubMed

    Sánchez, C F B; Teodoro, P E; Londoño, S; Silva, L A; Peixoto, L A; Bhering, L L

    2017-05-31

    Repeatability studies on fruit species are of great importance to identify the minimum number of measurements necessary to accurately select superior genotypes. This study aimed to identify the most efficient method to estimate the repeatability coefficient (r) and predict the minimum number of measurements needed for a more accurate evaluation of soursop (Annona muricata L.) genotypes based on fruit yield. Sixteen measurements of fruit yield from 71 soursop genotypes were carried out between 2000 and 2016. In order to estimate r with the best accuracy, four procedures were used: analysis of variance, principal component analysis based on the correlation matrix, principal component analysis based on the phenotypic variance and covariance matrix, and structural analysis based on the correlation matrix. The minimum number of measurements needed to predict the actual value of individuals was estimated. Principal component analysis using the phenotypic variance and covariance matrix provided the most accurate estimates of both r and the number of measurements required for accurate evaluation of fruit yield in soursop. Our results indicate that selection of soursop genotypes with high fruit yield can be performed based on the third and fourth measurements in the early years and/or based on the eighth and ninth measurements at more advanced stages.
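
    A standard formula underlying such repeatability studies (a sketch of the general approach, not the authors' code): the reliability of a genotype mean over m measurements is R2 = m*r / (1 + (m-1)*r), so the minimum number of measurements for a target reliability R2 is m = R2*(1-r) / (r*(1-R2)):

      # Minimum number of measurements to reach a target reliability, given a
      # repeatability coefficient r. The r values below are illustrative.
      import math

      def min_measurements(r, target_r2):
          """Minimum measurements for a genotype mean to reach target reliability."""
          return math.ceil(target_r2 * (1.0 - r) / (r * (1.0 - target_r2)))

      for r in (0.3, 0.5, 0.7):
          print(r, min_measurements(r, target_r2=0.90))   # -> 21, 9, 4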

  8. State-Level Community Benefit Regulation and Nonprofit Hospitals' Provision of Community Benefits.

    PubMed

    Singh, Simone R; Young, Gary J; Loomer, Lacey; Madison, Kristin

    2018-04-01

    Do nonprofit hospitals provide enough community benefits to justify their tax exemptions? States have sought to enhance nonprofit hospitals' accountability and oversight through regulation, including requirements to report community benefits, conduct community health needs assessments, provide minimum levels of community benefits, and adhere to minimum income eligibility standards for charity care. However, little research has assessed these regulations' impact on community benefits. Using 2009-11 Internal Revenue Service data on community benefit spending for more than eighteen hundred hospitals and the Hilltop Institute's data on community benefit regulation, we investigated the relationship between these four types of regulation and the level and types of hospital-provided community benefits. Our multivariate regression analyses showed that only community health needs assessments were consistently associated with greater community benefit spending. The results for reporting and minimum spending requirements were mixed, while minimum income eligibility standards for charity care were unrelated to community benefit spending. State adoption of multiple types of regulation was consistently associated with higher levels of hospital-provided community benefits, possibly because regulatory intensity conveys a strong signal to the hospital community that more spending is expected. This study can inform efforts to design regulations that will encourage hospitals to provide community benefits consistent with policy makers' goals. Copyright © 2018 by Duke University Press.

  9. Sampling frequency for water quality variables in streams: Systems analysis to quantify minimum monitoring rates.

    PubMed

    Chappell, Nick A; Jones, Timothy D; Tych, Wlodek

    2017-10-15

    Insufficient temporal monitoring of water quality in streams or engineered drains alters the apparent shape of storm chemographs, resulting in shifted model parameterisations and changed interpretations of solute sources that have produced episodes of poor water quality. This so-called 'aliasing' phenomenon is poorly recognised in water research. Using advances in in-situ sensor technology it is now possible to monitor sufficiently frequently to avoid the onset of aliasing. A systems modelling procedure is presented allowing objective identification of sampling rates needed to avoid aliasing within strongly rainfall-driven chemical dynamics. In this study aliasing of storm chemograph shapes was quantified by changes in the time constant parameter (TC) of transfer functions. As a proportion of the original TC, the onset of aliasing varied between watersheds, ranging from 3.9-7.7 to 54-79 %TC (or 110-160 to 300-600 min). However, a minimum monitoring rate could be identified for all datasets if the modelling results were presented in the form of a new statistic, ΔTC. For the eight H+, DOC and NO3-N datasets examined from a range of watershed settings, an empirically-derived threshold of 1.3(ΔTC) could be used to quantify minimum monitoring rates within sampling protocols to avoid artefacts in subsequent data analysis. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
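
    The aliasing effect can be reproduced with a toy simulation: drive a first-order system with a short rainfall pulse, subsample the response at coarser intervals, refit a discrete first-order model, and watch the apparent time constant drift from the true value. A hedged sketch with illustrative values (not the paper's transfer-function identification method):

      # Toy demonstration of sampling-rate aliasing of a chemograph time constant.
      # System, input, and intervals are invented for illustration.
      import numpy as np

      TC = 120.0                          # true time constant, minutes
      n = 2000
      u = np.full(n, 0.1)                 # constant baseflow input
      u[100:130] += 1.0                   # 30-min rainfall pulse
      a = np.exp(-1.0 / TC)               # 1-min simulation step
      y = np.empty(n)
      y[0] = u[0]                         # start at steady state
      for i in range(1, n):
          y[i] = a * y[i - 1] + (1 - a) * u[i]

      for step in (5, 60, 240):           # sampling interval, minutes
          ys, us = y[::step], u[::step]
          # least-squares fit of the sampled model y_t = alpha*y_{t-1} + beta*u_t
          A = np.column_stack([ys[:-1], us[1:]])
          alpha, beta = np.linalg.lstsq(A, ys[1:], rcond=None)[0]
          print(f"{step:4d}-min sampling -> apparent TC = {-step / np.log(alpha):.1f} min")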

  10. Hemodialysis Adequacy Monitoring Information System: Minimum Data Set and Capabilities Required.

    PubMed

    Jebraeily, Mohamad; Ghazisaeidi, Marjan; Safdari, Reza; Makhdoomi, Khadijeh; Rahimi, Bahlol

    2015-08-01

    In dialysis centers, both nephrologists and nurses face the challenge of ensuring reliable and efficient care in accordance with clinical guidelines. A hemodialysis adequacy monitoring information system therefore enables the automation of tasks, ultimately allowing doctors and nursing staff more time to dedicate to the individual treatment of patients. The development of information systems in healthcare has made the use of a minimum data set (MDS) inevitable. The purpose of this study was to determine the MDS and capabilities required in a hemodialysis adequacy monitoring information system. This is a cross-sectional survey conducted with the participation of 320 nephrology specialists in 2015. Data were collected using an electronic questionnaire which was assessed as both reliable and valid, and were analyzed in SPSS using descriptive and analytical statistics. Overall, 42 data elements were determined as the final set, in 4 major categories (patient demographics, medical history, treatment plan and hemodialysis adequacy). The most required capabilities of a hemodialysis information system were calculation of the dialysis adequacy index (4.80), advice on the optimal dose of dialysis for each patient (4.63), easy access to the information system without restrictions of time and place (4.61), alerts when the dialysis adequacy index falls below the standard (4.55), and interchange with other information systems in hospitals (4.46). In the design and implementation of information systems, focusing on the MDS and identifying system capabilities based on users' needs promotes wide user participation and the success of the information system. It is therefore necessary that the MDS be evaluated carefully with regard to the intended uses of the data, and that information systems have the capabilities to meet the needs of their users.

  11. Research Findings on Xylitol and the Development of Xylitol Vehicles to Address Public Health Needs

    PubMed Central

    Milgrom, P.; Ly, K.A.; Rothen, M.

    2013-01-01

    Xylitol has been demonstrated to be a safe and effective tooth-decay-preventive agent when used habitually. Nevertheless, its application has been limited by the absence of formulations that demand minimal adherence and are acceptable and safe in settings where chewing gum may not be allowed. A substantial literature suggests that a minimum of five to six grams and three exposures per day from chewing gum or candies is needed for a clinical effect. At the same time, there is conflicting evidence in the literature from toothpaste studies suggesting that lower doses and less frequent exposures might be effective. The growing use of xylitol as a sweetener in low amounts in foods and other consumables is, simultaneously, increasing the overall exposure of the public to xylitol and may have additive benefits. PMID:19710081

  12. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    NASA Astrophysics Data System (ADS)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximization of KSE is analytical and in general easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum-mixing-time dynamics. This could be of interest in computer science and statistical physics, for computations that use random walks on graphs representable as Markov chains.
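
    Both quantities are easy to compute for a concrete chain: the KSE is h = -sum_i pi_i sum_j P_ij log P_ij, and mixing speed can be proxied by the relaxation time 1/(1 - |lambda_2|) from the second-largest eigenvalue modulus. A minimal sketch with an illustrative 3-state chain (the matrix is invented, not from the paper):

      # KSE and a spectral mixing-speed proxy for a small Markov chain.
      import numpy as np

      P = np.array([[0.8, 0.1, 0.1],
                    [0.2, 0.6, 0.2],
                    [0.1, 0.3, 0.6]])

      # Stationary distribution: left eigenvector of P for eigenvalue 1.
      vals, vecs = np.linalg.eig(P.T)
      pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
      pi = pi / pi.sum()

      with np.errstate(divide="ignore"):
          logP = np.where(P > 0, np.log(P), 0.0)
      kse = -np.sum(pi[:, None] * P * logP)

      slem = sorted(np.abs(vals), reverse=True)[1]   # second-largest eigenvalue modulus
      t_relax = 1.0 / (1.0 - slem)                   # relaxation-time proxy for mixing
      print(f"KSE = {kse:.3f} nats, relaxation time ~ {t_relax:.2f} steps")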

  13. 40 CFR Table 1 to Subpart III of... - Emission Limitations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determining compliance using this method. Cadmium: 0.004 milligrams per dry standard cubic meter, 3-run average (1 hour minimum sample time per run), Performance test (Method 29 of appendix A of part 60). Carbon monoxide: 157 parts per million by dry volume, 3-run average (1 hour minimum sample time per run), Performance...

  14. 40 CFR Table 1 to Subpart Eeee of... - Emission Limitations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determining compliance using this method. 1. Cadmium: 18 micrograms per dry standard cubic meter, 3-run average (1 hour minimum sample time per run), Method 29 of appendix A of this part. 2. Carbon monoxide: 40 parts per million by dry volume, 3-run average (1 hour minimum sample time per run during performance test), and 12-hour...

  15. 40 CFR Table 1 to Subpart III of... - Emission Limitations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... determining compliance using this method. Cadmium: 0.004 milligrams per dry standard cubic meter, 3-run average (1 hour minimum sample time per run), Performance test (Method 29 of appendix A of part 60). Carbon monoxide: 157 parts per million by dry volume, 3-run average (1 hour minimum sample time per run), Performance...

  16. 40 CFR Table 1 to Subpart Eeee of... - Emission Limitations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... determining compliance using this method. 1. Cadmium: 18 micrograms per dry standard cubic meter, 3-run average (1 hour minimum sample time per run), Method 29 of appendix A of this part. 2. Carbon monoxide: 40 parts per million by dry volume, 3-run average (1 hour minimum sample time per run during performance test), and 12-hour...

  17. Optimal impulsive time-fixed orbital rendezvous and interception with path constraints

    NASA Technical Reports Server (NTRS)

    Taur, D.-R.; Prussing, J. E.; Coverstone-Carroll, V.

    1990-01-01

    Minimum-fuel, impulsive, time-fixed solutions are obtained for the problem of orbital rendezvous and interception with interior path constraints. Transfers between coplanar circular orbits in an inverse-square gravitational field are considered, subject to a circular path constraint representing a minimum or maximum permissible orbital radius. Primer vector theory is extended to incorporate path constraints. The optimal number of impulses, their times and positions, and the presence of initial or final coasting arcs are determined. The existence of constraint boundary arcs and boundary points is investigated as well as the optimality of a class of singular arc solutions. To illustrate the complexities introduced by path constraints, an analysis is made of optimal rendezvous in field-free space subject to a minimum radius constraint.

  18. Electrofishing distance needed to estimate consistent Index of Biotic Integrity (IBI) scores in raftable Oregon rivers

    EPA Science Inventory

    An important issue surrounding assessment of riverine fish assemblages is the minimum amount of sampling distance needed to adequately determine biotic condition. Determining adequate sampling distance is important because sampling distance affects estimates of fish assemblage c...

  19. Size, weight, and power reduction of mercury cadmium telluride infrared detection modules

    NASA Astrophysics Data System (ADS)

    Breiter, Rainer; Ihle, Tobias; Wendler, Joachim C.; Lutz, Holger; Rutzinger, Stefan; Schallenberg, Timo; Hofmann, Karl C.; Ziegler, Johann

    2011-06-01

    Application requirements driving present IR technology development activities are improved capability to detect and identify a threat, as well as the need to reduce the size, weight and power consumption (SWaP) of thermal sights. In addition to the development of 3rd Gen IR modules providing dual-band or dual-color capability, AIM is focused on IR FPAs with reduced pitch and high operating temperature for SWaP reduction. State-of-the-art MCT technology allows AIM to produce mid-wave infrared (MWIR) detectors operating at temperatures exceeding 120 K without any need to sacrifice the 5-μm cut-off wavelength. These FPAs allow manufacturing of low-cost IR modules with minimum size, weight, and power for state-of-the-art high-performance IR systems. AIM has realized full-TV-format MCT 640×512 mid-wave and long-wave IR detection modules with a 15-μm pitch to meet the requirements of critical military applications like thermal weapon sights or thermal imagers in unmanned aerial vehicle applications. In typical configurations, such as an F/4.6 cold shield for the 640×512 MWIR module, a noise equivalent temperature difference (NETD) <25 mK @ 5 ms integration time is achieved, while the long-wavelength infrared (LWIR) modules achieve an NETD <38 mK @ F/2 and 180 μs integration time. For the LWIR modules, FPAs with a cut-off up to 10 μm have been realized. The modules are available either with different integral rotary cooler configurations for portable applications that require minimum cooling power, or with a new split linear cooler providing long lifetime, with a mean time to failure (MTTF) > 20000, e.g., for warning sensors in 24/7 operation. The modules are available with optional image processing electronics providing nonuniformity correction and further image processing for a complete IR imaging solution. The latest results and performance of these modules and their applications are presented.

  20. 29 CFR 780.620 - Minimum wage for livestock auction work.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... minimum rate required by section 6(a)(1) of the Act for the time spent in livestock auction work. The exemption does not apply unless there is payment for all hours spent in livestock auction work at not less...

  1. The Age 21 Minimum Legal Drinking Age Law. Prevention Update

    ERIC Educational Resources Information Center

    Higher Education Center for Alcohol, Drug Abuse, and Violence Prevention, 2011

    2011-01-01

    Currently, all 50 states limit alcohol purchases to people aged 21 and over. But that hasn't always been the case. In fact, it was July 17, 1984, when President Ronald Reagan signed the national 21 minimum legal drinking age (MLDA) legislation into law. At that time, only 23 states had minimum alcohol purchasing ages of 21 years old. The…

  2. Scaling-up the minimum requirements analysis for big wilderness issues

    Treesearch

    David N. Cole

    2007-01-01

    The concept of applying a "minimum requirements" analysis to decisions about administrative actions in wilderness in the United States has been around for a long time. It comes from Section 4(c) of the Wilderness Act of 1964, which states that "except as necessary to meet minimum requirements for the administration of the area for the purposes of this...

  3. Mechanical properties of direct core build-up materials.

    PubMed

    Combe, E C; Shaglouf, A M; Watts, D C; Wilson, N H

    1999-05-01

    This work was undertaken to measure the mechanical properties of a diverse group of materials used for direct core build-ups, including a high-copper amalgam, a silver cermet cement, a VLC resin composite and two composites specifically developed for this application. Compressive strength, elastic modulus, diametral tensile strength, and flexural strength and modulus were measured for each material as a function of time up to 3 months, using standard specification tests designed for the materials. All the materials were found to meet the minimum specification requirements except in flexural strength, for the amalgam after 1 h and for the silver cermet at all time intervals. No material proved obviously superior in all respects for core build-ups, and a specification specifically for this application needs to be established.

  4. Celeris: A GPU-accelerated open source software with a Boussinesq-type wave solver for real-time interactive simulation and visualization

    NASA Astrophysics Data System (ADS)

    Tavakkol, Sasan; Lynett, Patrick

    2017-08-01

    In this paper, we introduce an interactive coastal wave simulation and visualization software package called Celeris. Celeris is open source software that requires minimal preparation to run on a Windows machine. The software solves the extended Boussinesq equations using a hybrid finite volume-finite difference method and supports moving shoreline boundaries. The simulation and visualization are performed on the GPU using Direct3D libraries, which enables the software to run faster than real time. Celeris provides a first-of-its-kind interactive modeling platform for coastal wave applications, and it supports simultaneous visualization with both photorealistic and colormapped rendering capabilities. We validate our software through comparison with three standard benchmarks for non-breaking and breaking waves.

  5. Period Study and Analyses of 2017 Observations of the Totally Eclipsing, Solar Type Binary, MT Camelopardalis

    NASA Astrophysics Data System (ADS)

    Faulkner, Danny R.; Samec, Ronald G.; Caton, Daniel B.

    2018-06-01

    We report here on a period study and the analysis of BVRcIc light curves (taken in 2017) of MT Cam (GSC03737-01085), a solar-type (T ~ 5500 K) eclipsing binary. D. Caton observed MT Cam on 5, 14, 15, 16, and 17 December 2017 with the 0.81-m reflector at Dark Sky Observatory. Six times of minimum light were calculated from four primary eclipses and two secondary eclipses: HJD I = 2458092.4937±0.0002, 2458102.74600±0.0021, 2458104.5769±0.0002, 2458104.9434±0.0029; HJD II = 2458103.6610±0.0001, 2458104.7607±0.0020. Six times of minimum light were also calculated from data taken by Terrell, Gross, and Cooney in their 2016 and 2004 observations (reported in IBVS #6166; TGC hereafter). In addition, six more times of minimum light were taken from the literature. From all 18 times of minimum light, we determined the following light elements: JD Hel Min I = 2458102.7460(4) + 0.36613937(5) × E. We found the orbital period was constant over the 14 years spanning all observations. We note that TGC found a slightly increasing period; however, our results were obtained from a period study rather than a comparison of observations from only two epochs by the Wilson-Devinney (W-D) program. A BVRcIc Johnson-Cousins filtered simultaneous W-D solution gives a mass ratio (0.3385±0.0014) very nearly the same as TGC's (0.347±0.003), and a component temperature difference of only ~40 K. As with TGC, no spot was needed in the modeling. Our modeling (beginning with Binary Maker 3.0 fits) was done without prior knowledge of TGC's, which shows the agreement achieved when independent analyses are done with the W-D code. The present observations were taken 1.8 years later than the last curves by TGC, so some variation is expected. The Roche lobe fill-out of the binary is ~13% and the inclination is ~83.5 degrees. The system is a shallow-contact W-type W UMa binary, although the amplitudes of the primary and secondary eclipses are very nearly identical. An eclipse duration of ~21 minutes was determined for the secondary eclipse from the light curve solution. Additional and more detailed information is given in the poster paper.

  6. Optimal heliocentric trajectories for solar sail with minimum area

    NASA Astrophysics Data System (ADS)

    Petukhov, Vyacheslav G.

    2018-05-01

    The fixed-time heliocentric trajectory optimization problem is considered for a planar solar sail of minimum area. Necessary optimality conditions are derived, a numerical method for solving the problem is developed, and numerical examples of optimal trajectories to Mars, Venus and Mercury are presented. The dependences of the minimum solar sail area on the date of departure from the Earth, the time of flight and the departure hyperbolic excess velocity are analyzed. In particular, for the rendezvous problem (approaching a target planet with zero relative velocity) with zero departure hyperbolic excess velocity and a flight duration of 1200 days, it was found that the minimum area-to-mass ratio should be about 12 m2/kg for the trajectory to Venus, 23.5 m2/kg for the trajectory to Mercury and 25 m2/kg for the trajectory to Mars.
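
    To put these area-to-mass ratios in perspective, they can be converted into a characteristic acceleration at 1 AU. Assuming an ideal flat, perfectly reflecting sail at normal incidence (an assumption of this sketch, not the paper's sail model), a_c = 2 * P * (A/m) with solar radiation pressure P of roughly 4.56 μN/m²:

      # Characteristic acceleration of an idealized reflecting sail at 1 AU.
      # The ideal-sail model and pressure constant are assumptions of this sketch.
      P_SRP = 4.56e-6                      # solar radiation pressure at 1 AU, N/m^2

      def characteristic_acceleration(area_to_mass):
          """a_c in m/s^2 for an area-to-mass ratio given in m^2/kg."""
          return 2.0 * P_SRP * area_to_mass

      for am in (12.0, 23.5, 25.0):        # values quoted in the abstract
          print(f"A/m = {am:5.1f} m^2/kg -> a_c = {characteristic_acceleration(am):.2e} m/s^2")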

  7. Time-to-space mapping of a continuous light wave with picosecond time resolution based on an electrooptic beam deflection.

    PubMed

    Hisatake, S; Kobayashi, T

    2006-12-25

    We demonstrate a time-to-space mapping of an optical signal with picosecond time resolution based on an electrooptic beam deflection. The time axis of the optical signal is mapped into a spatial replica by the deflection. We theoretically derive the minimum time resolution of the time-to-space mapping and confirm it experimentally on the basis of the pulse width of the optical pulses picked out from the deflected beam through a narrow slit, which acts as a temporal window. We have achieved a minimum time resolution of 1.6±0.2 ps.

  8. Artificial intelligence and its impact on combat aircraft

    NASA Technical Reports Server (NTRS)

    Ott, Lawrence M.; Abbot, Kathy; Kleider, Alfred; Moon, D.; Retelle, John

    1987-01-01

    As the threat becomes more sophisticated and weapon systems more complex to meet the threat, the need for machines to assist the pilot in the assessment of information becomes paramount. This is particularly true in real-time, high-stress situations. The advent of artificial intelligence (AI) technology offers the opportunity to make quantum advances in the application of machine technology. However, if AI systems are to find their way into combat aircraft, they must meet certain criteria: the systems must be responsive, reliable, easy to use, flexible, and understandable. These criteria are compared with the current status of AI technology for combat airborne applications. Current AI systems deal with non-real-time applications and require significant user interaction; aircraft applications, on the other hand, require real-time systems with minimal human interaction. In order to fill the gap between where the technology is now and where it must be for aircraft applications, considerable government research is ongoing at NASA, DARPA, and the three services. This ongoing research is briefly summarized. Finally, recognizing that AI technology is in its embryonic stage while the aircraft needs are very demanding, a number of issues arise. These issues are delineated and findings are provided where appropriate.

  9. A model for estimating pathogen variability in shellfish and predicting minimum depuration times.

    PubMed

    McMenemy, Paul; Kleczkowski, Adam; Lees, David N; Lowther, James; Taylor, Nick

    2018-01-01

    Norovirus is a major cause of viral gastroenteritis, with shellfish consumption being identified as one potential norovirus entry point into the human population. Minimising shellfish norovirus levels is therefore important for both the consumer's protection and the shellfish industry's reputation. One method used to reduce microbiological risks in shellfish is depuration; however, this process also presents additional costs to industry. Providing a mechanism to estimate norovirus levels during depuration would therefore be useful to stakeholders. This paper presents a mathematical model of the depuration process and its impact on norovirus levels found in shellfish. Two fundamental stages of norovirus depuration are considered: (i) the initial distribution of norovirus loads within a shellfish population and (ii) the way in which the initial norovirus loads evolve during depuration. Realistic assumptions are made about the dynamics of norovirus during depuration, and mathematical descriptions of both stages are derived and combined into a single model. Parameters to describe the depuration effect and norovirus load values are derived from existing norovirus data obtained from U.K. harvest sites. However, obtaining population estimates of norovirus variability is time-consuming and expensive; this model addresses the issue by assuming a 'worst case scenario' for variability of pathogens, which is independent of mean pathogen levels. The model is then used to predict minimum depuration times required to achieve norovirus levels which fall within possible risk management levels, as well as predictions of minimum depuration times for other water-borne pathogens found in shellfish. Times for Escherichia coli predicted by the model all fall within the minimum 42 hours required for class B harvest sites, whereas minimum depuration times for norovirus and FRNA+ bacteriophage are substantially longer. Thus this study provides relevant information and tools to assist norovirus risk managers with future control strategies.
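
    For a pathogen cleared by first-order decay, N(t) = N0 * exp(-k*t), the minimum depuration time in the sense used above is t_min = ln(N0/N_target)/k. A minimal sketch with illustrative rates and loads (not the paper's fitted values):

      # Minimum depuration time under first-order pathogen decay.
      # Initial loads, targets, and rates below are illustrative only.
      import math

      def min_depuration_hours(n0, n_target, k_per_hour):
          """Hours needed to reduce load n0 to n_target at decay rate k."""
          return math.log(n0 / n_target) / k_per_hour

      # Fast-depurating pathogen versus a slowly depurating, virus-like one.
      print(min_depuration_hours(n0=1e4, n_target=2.3e2, k_per_hour=0.12))  # ~31 h
      print(min_depuration_hours(n0=1e4, n_target=2.3e2, k_per_hour=0.02))  # ~189 h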

  10. Kentucky's Automotive Certification Program.

    ERIC Educational Resources Information Center

    Kentucky State Dept. of Education, Frankfort. Office of Vocational Education.

    The state of Kentucky recognized a need to standardize automotive mechanics training throughout the state and to establish minimum guidelines for the quality of instruction in such programs. To meet these needs, the Office of Vocational Education selected the National Institute for Automotive Service Excellence (ASE) and began the certification…

  11. Information management for aged care provision in Australia: development of an aged care minimum dataset and strategies to improve quality and continuity of care.

    PubMed

    Davis, Jenny; Morgans, Amee; Burgess, Stephen

    2016-04-01

    Efficient information systems support the provision of multi-disciplinary aged care and a variety of organisational purposes, including quality, funding, communication and continuity of care. Agreed minimum data sets enable accurate communication across multiple care settings. However, in aged care, multiple and poorly integrated data collection frameworks are commonly used for client assessment, government reporting and funding purposes. The aim of this study was to determine key information needs in aged care settings to improve information quality, information transfer, safety, quality and continuity of care to meet the complex needs of aged care clients. Modified Delphi methods involving five stages were employed by one aged care provider in Victoria, Australia, to establish stakeholder consensus for a derived minimum data set and address barriers to data quality. Eleven different aged care programs were identified, with five related data dictionaries, three minimum data sets, and five program standards or quality frameworks. The remaining data collection frameworks related to disease classification, funding, service activity reporting, and statistical standards and classifications. A total of 170 different data items collected across seven internal information systems were consolidated to a derived set of 60 core data items and aligned with nationally consistent data collection frameworks. Barriers to data quality related to inconsistencies in data items, staff knowledge, workflow, and system access and configuration. The development of an internal aged care minimum data set highlighted the critical role of primary data quality in the upstream and downstream use of client information, and presents a platform to build national consistency across the sector.

  12. A real-time surface inspection system for precision steel balls based on machine vision

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Ji; Tsai, Jhy-Cherng; Hsu, Ya-Chen

    2016-07-01

    Precision steel balls are among the most fundamental components for motion and power transmission parts, and they are widely used in industrial machinery and the automotive industry. As precision balls are crucial for the quality of these products, there is an urgent need to develop a fast and robust system for inspecting surface defects of precision steel balls. In this paper, a real-time system for inspecting surface defects of precision steel balls is developed based on machine vision. The developed system integrates a dual-lighting system, an unfolding mechanism and inspection algorithms for real-time signal processing and defect detection. The developed system is tested at a feeding speed of 4 pcs/s with a detection rate of 99.94% and an error rate of 0.10%. The minimum detectable surface flaw area is 0.01 mm2, which meets the requirement for inspecting ISO grade 100 precision steel balls.
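
    One building block such a system needs is converting the minimum detectable flaw area into pixel units and filtering candidate defect blobs accordingly. A hedged sketch using OpenCV (version 4 API), with an assumed optical resolution that is not reported in the abstract:

      # Flaw-area gating for a vision inspection pipeline (OpenCV 4 API).
      # MM_PER_PIXEL is an assumed optical resolution, not the paper's value.
      import cv2
      import numpy as np

      MM_PER_PIXEL = 0.005
      MIN_FLAW_MM2 = 0.01
      min_flaw_px = MIN_FLAW_MM2 / (MM_PER_PIXEL ** 2)   # 0.01 / 0.000025 = 400 px

      def find_defects(gray):
          """Return contours of candidate defects at or above the minimum area."""
          _, binary = cv2.threshold(gray, 0, 255,
                                    cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
          contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          return [c for c in contours if cv2.contourArea(c) >= min_flaw_px]

      gray = np.full((512, 512), 200, np.uint8)
      cv2.circle(gray, (256, 256), 15, 40, -1)    # synthetic dark flaw
      print(len(find_defects(gray)))              # -> 1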

  13. [Application of elastic registration based on Demons algorithm in cone beam CT].

    PubMed

    Pang, Haowen; Sun, Xiaoyang

    2014-02-01

    We applied the Demons and accelerated Demons elastic registration algorithms to radiotherapy cone beam CT (CBCT) images, providing software support for real-time monitoring of organ changes during radiotherapy. We wrote a 3D CBCT elastic registration program in Matlab and verified it on 3D CBCT images of two patients with cervical cancer. With the classic Demons algorithm, the minimum mean square error (MSE) decreased by 59.7% and the correlation coefficient (CC) increased by 11.0%; with the accelerated Demons algorithm, the MSE decreased by 40.1% and the CC increased by 7.2%. Both variants of the Demons algorithm produced the desired results in experimental verification, but small residual differences point to limited precision, and the total registration time was rather long; both accuracy and run time need further improvement.
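
    For reference, the classic Thirion Demons update drives a displacement field with the force (m - f) * grad(f) / (|grad(f)|^2 + (m - f)^2), followed by Gaussian smoothing. A minimal 2D sketch in Python (the study used Matlab; this re-implementation is an assumption, not the authors' code):

      # Minimal 2D Thirion-style Demons registration sketch (NumPy/SciPy).
      import numpy as np
      from scipy.ndimage import gaussian_filter, map_coordinates

      def demons_2d(fixed, moving, n_iter=50, sigma=1.5):
          """Estimate a dense displacement field warping `moving` onto `fixed`."""
          fixed = fixed.astype(float)
          moving = moving.astype(float)
          gy, gx = np.gradient(fixed)             # static-image gradient (Thirion)
          uy = np.zeros_like(fixed)               # displacement field, y and x parts
          ux = np.zeros_like(fixed)
          yy, xx = np.mgrid[0:fixed.shape[0], 0:fixed.shape[1]].astype(float)
          for _ in range(n_iter):
              warped = map_coordinates(moving, [yy + uy, xx + ux], order=1)
              diff = warped - fixed
              denom = gx**2 + gy**2 + diff**2     # demons force denominator
              denom[denom == 0] = np.inf          # guard against division by zero
              uy -= diff * gy / denom             # Thirion-style update step
              ux -= diff * gx / denom
              uy = gaussian_filter(uy, sigma)     # Gaussian regularization
              ux = gaussian_filter(ux, sigma)
          return uy, ux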

  14. Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing

    NASA Astrophysics Data System (ADS)

    Srivastava, Praveen Ranjan; Pareek, Deepak

    Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Deciding when to end software testing is a crucial aspect of any software development project. A premature release involves risks such as undetected bugs, the cost of fixing faults later, and dissatisfied customers. Any software organization wants to achieve the maximum possible benefit from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.

  15. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

    As information & communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data from heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7 based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271
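
    HL7 v2.x messages are pipe-delimited text with one segment per line, so a minimal parser needs only string splitting. A sketch with a hypothetical message (not one from the paper's system):

      # Minimal HL7 v2.x parsing sketch; the message content is hypothetical.
      RAW = "\r".join([
          "MSH|^~\\&|LAB|HOSP|EMR|HOSP|200301011200||ORU^R01|123|P|2.3",
          "PID|1||000123||DOE^JOHN",
          "OBX|1|NM|GLU^Glucose||98|mg/dL|70-110|N",
      ])

      def parse_hl7(raw):
          """Split an HL7 message into {segment_id: [field lists]}."""
          msg = {}
          for segment in raw.split("\r"):      # segments end with a carriage return
              fields = segment.split("|")      # '|' is the default field separator
              msg.setdefault(fields[0], []).append(fields[1:])
          return msg

      msg = parse_hl7(RAW)
      print(msg["OBX"][0][4])                  # observation value -> '98'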

  16. Real-time stereo matching using orthogonal reliability-based dynamic programming.

    PubMed

    Gong, Minglun; Yang, Yee-Hong

    2007-03-01

    A novel algorithm is presented in this paper for estimating reliable stereo matches in real time. Based on the dynamic programming-based technique we previously proposed, the new algorithm can generate semi-dense disparity maps using as few as two dynamic programming passes. The iterative best-path tracing process used in traditional dynamic programming is replaced by a local minimum searching process, making the algorithm suitable for parallel execution. Most computations are implemented on programmable graphics hardware, which improves the processing speed and makes real-time estimation possible. Experiments on the four new Middlebury stereo datasets show that, on an ATI Radeon X800 card, the presented algorithm can produce reliable matches for approximately 60-80% of pixels at 10-20 frames per second. If needed, the algorithm can be configured to generate full-density disparity maps.

  17. STFM's National Clerkship Curriculum: CERA reveals impact, clerkship director needs.

    PubMed

    Cochella, Susan; Steiner, Beat D; Clinch, C Randall; WinklerPrins, Vince

    2014-06-01

    Consistency is needed in family medicine clerkships nationwide. The Society of Teachers of Family Medicine's (STFM) National Clerkship Curriculum (NCC) and supporting NCC website have been developed to address this need. A survey was used to measure these tools' effect and guide future improvements. The Council of Academic Family Medicine's (CAFM) Educational Research Alliance (CERA) 2012 survey of clerkship directors (CD) was used to answer two research questions: (1) To what extent are clerkships teaching the minimum core curriculum? and (2) What resources do clerkship directors identify as important in their role? The survey response rate was 66% (88/134). Ninety-two percent of these CDs are aware of the NCC, 74% report having visited the NCC website, and 71% plan to visit it more than once per year in the future. A total of 21.6% strongly agree that their clerkship content matches the NCC. CDs rate the quality of materials on the website as high and place greatest value on materials that can be downloaded and adapted to their clerkships. STFM's NCC website and materials are familiar to CDs although only one in five state their clerkship curriculum matches the NCC minimum core curriculum. The NCC editorial board needs to better understand why so few teach curriculum that closely matches the minimum core. Continued outreach to CDs can answer this question and improve our ability to support CDs as they incorporate the NCC into family medicine clerkships.

  18. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including: setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.

  19. Evaluation of the stability and antimicrobial activity of an ethanolic extract of Libidibia ferrea

    PubMed Central

    de Oliveira Marreiro, Raquel; Bandeira, Maria Fulgência Costa Lima; de Souza, Tatiane Pereira; de Almeida, Mailza Costa; Bendaham, Katiana; Venâncio, Gisely Naura; Rodrigues, Isis Costa; Coelho, Cristiane Nagai; Milério, Patrícia Sâmea Lêdo Lima; de Oliveira, Glauber Palma; de Oliveira Conde, Nikeila Chacon

    2014-01-01

    Biofilm is a dense, whitish, noncalcified aggregate of bacteria, desquamated epithelial cells, and food debris that creates conditions for an imbalance of the resident oral microflora and favors the destruction of hard and soft tissues through the development of caries and gingivitis. The aim of this study was to obtain and characterize an extract of Libidibia ferrea (ex Caesalpinia ferrea L.) and to evaluate its feasibility for formulation as a mouthwash, according to current legislation. For this purpose, pH, sedimentation, density, and stability were evaluated, along with microbiological testing of the extract. The microbiological tests were used to verify the presence of Staphylococcus aureus, Pseudomonas aeruginosa, fungi, yeasts, and coliforms, and to determine minimum inhibitory concentrations against Streptococcus mutans and Streptococcus oralis strains. Characterization, microbiological evaluation, and minimum inhibitory concentration results were tabulated and described using descriptive statistics. The L. ferrea extract showed stable characteristics, product quality, and antibacterial activity against the microorganisms tested, irrespective of experimental time intervals. According to these results, it can be concluded that formulation of a mouthwash containing L. ferrea extract to control biofilm is feasible, but further studies are needed. PMID:24501546

  20. Polyhexamethylene guanidine hydrochloride shows bactericidal advantages over chlorhexidine digluconate against ESKAPE bacteria.

    PubMed

    Zhou, Zhongxin; Wei, Dafu; Lu, Yanhua

    2015-01-01

    More information regarding the bactericidal properties of polyhexamethylene guanidine hydrochloride (PHMG) against clinically important antibiotic-resistant ESKAPE (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species) pathogens is needed to support its use in infection control. The bactericidal properties of PHMG and chlorhexidine digluconate (CHG) were compared based on their minimum inhibitory concentrations (MICs), minimum bactericidal concentrations, and time-course killing curves against clinically important antibiotic-susceptible and antibiotic-resistant ESKAPE pathogens. Results showed that PHMG exhibited significantly higher bactericidal activities against methicillin-resistant Staphylococcus aureus, carbapenem-resistant Klebsiella pneumoniae, and ceftazidime-resistant Enterobacter spp. than CHG. A slight bactericidal advantage over CHG was observed against vancomycin-resistant Enterococcus faecium, ciprofloxacin- and levofloxacin-resistant Acinetobacter spp., and multidrug-resistant Pseudomonas aeruginosa. In previous reports based on MIC testing, PHMG had higher antimicrobial activity than CHG against almost all tested Gram-negative bacteria and several Gram-positive bacteria. These studies support the further development of covalently bound PHMG in sterile-surface materials and the incorporation of PHMG in novel disinfectant formulas. © 2014 International Union of Biochemistry and Molecular Biology, Inc.

  1. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  2. A new photocatalytic reactor for trace contaminant control: a water polishing system.

    PubMed

    Gonzalez-Martin, A; Kim, J; Van Hyfte, J; Rutherford, L A; Andrews, C

    2001-01-01

    In spacecraft water recovery systems there is a need to develop a postprocessor water polishing system to remove organic impurities to levels below 250 micrograms/L (ppb) with a minimum use of expendables. This article addresses the development of a photocatalytic process as a postprocessor water polishing system that is microgravity compatible, operates at room temperature, and requires only a minimal use of both oxygen gas (or air) and electrical power for low energy UV-A (315-400 nm) lamps. In the photocatalytic process, organic contaminants are degraded to benign end products on semiconductor surfaces, usually TiO2. Some challenging issues related to the use of TiO2 for the degradation of organic contaminants have been addressed. These include: i) efficient and stable catalytic material; ii) immobilization of the catalyst to produce a high surface area material that can be used in packed-bed reactors; iii) effective light penetration; iv) effective, microgravity-compatible oxidant delivery; v) reduced pressure drop; and vi) minimum retention time. The research and development performed on this photocatalytic process is presented in detail. Grant numbers: NAS9-97182.

  3. Serious game versus online course for pretraining medical students before a simulation-based mastery learning course on cardiopulmonary resuscitation: A randomised controlled study.

    PubMed

    Drummond, David; Delval, Paul; Abdenouri, Sonia; Truchot, Jennifer; Ceccaldi, Pierre-François; Plaisance, Patrick; Hadchouel, Alice; Tesnière, Antoine

    2017-12-01

    Although both recorded lectures and serious games have been used to pretrain health professionals before simulation training on cardiopulmonary resuscitation, they have never been compared. The aim of this study was to compare an online course and a serious game for pretraining medical students before simulation-based mastery learning on the management of sudden cardiac arrest. The study was a randomised controlled trial conducted in the Department of Simulation in Healthcare of a French medical faculty. Participants were pretrained using the online course or the serious game on day 1 and day 7. On day 8, each participant was evaluated repeatedly on a scenario of cardiac arrest until reaching a minimum passing score. Eighty-two volunteer second-year medical students participated between June and October 2016, and 79 were assessed for the primary outcome. The serious game used was Staying Alive, which involved a 3D realistic environment, and the online course involved a PowerPoint lecture. The primary outcome was the median total training time needed for students to reach the minimum passing score on day 8; this same outcome was also assessed 4 months later. The median training time (interquartile range) necessary for students to reach the minimum passing score was similar between the two groups: 20.5 (15.8 to 30.3) minutes in the serious game group versus 23 (15 to 32) minutes in the online course group, P = 0.51. Achieving an appropriate degree of chest compression was the most difficult requirement to fulfil for students in both groups. Four months later, the median training time decreased significantly in both groups, but no correlation was found at an individual level with the training times observed on day 8. The serious game used in this study was not superior to an online course for pretraining medical students in the management of a cardiac arrest. The absence of any correlation between the performances of students evaluated during two training sessions separated by 4 months suggests that some elements in the management of cardiac arrest, such as compression depth, can only be partially learned and retained after simulation-based training. ClinicalTrials.gov-NCT02758119.

  4. Ground Reaction Forces of the Lead and Trail Limbs when Stepping Over an Obstacle

    PubMed Central

    Bovonsunthonchai, Sunee; Khobkhun, Fuengfa; Vachalathiti, Roongtiwa

    2015-01-01

    Background Precise force generation and absorption during stepping over different obstacles need to be quantified for task accomplishment. This study aimed to quantify how the lead limb (LL) and trail limb (TL) generate and absorb forces while stepping over obstacles of various heights. Material/Methods Thirteen healthy young women participated in the study. Force data were collected from 2 force plates when participants stepped over obstacles. Two limbs (right LL and left TL) and 4 conditions of stepping (no obstacle, stepping over 5 cm, 20 cm, and 30 cm obstacle heights) were tested for main effect and interaction effect by 2-way ANOVA. Paired t-test and 1-way repeated-measure ANOVA were used to compare differences of variables between limbs and among stepping conditions, respectively. Results Main effects of limb were found in first peak vertical force, minimum vertical force, propulsive peak force, and propulsive impulse. Significant main effects of condition were found in time to minimum force, time to the second peak force, time to propulsive peak force, first peak vertical force, braking peak force, propulsive peak force, vertical impulse, braking impulse, and propulsive impulse. Interaction effects of limb and condition were found in first peak vertical force, propulsive peak force, braking impulse, and propulsive impulse. Conclusions Adaptations of force generation in the LL and TL were found to involve adaptability to an altered external environment during stepping in healthy young adults. PMID:26169293

  5. Comparing effects of fire modeling methods on simulated fire patterns and succession: a case study in the Missouri Ozarks

    Treesearch

    Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson

    2008-01-01

    We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
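
    The abstract is truncated here, but the minimum travel time idea it names is essentially a shortest-path computation of fire arrival times; a generic raster sketch (ours, not the paper's implementation) looks like this:

    ```python
    # Minimal sketch of a minimum-travel-time fire spread computation on a raster,
    # assuming Dijkstra-style propagation from ignition cells; the cell traversal
    # times and the 4-neighbour stencil are illustrative, not taken from the paper.
    import heapq

    def fire_arrival_times(traversal_time, ignitions):
        """traversal_time: 2D list of minutes needed to cross each cell;
        ignitions: list of (row, col) ignition points. Returns arrival-time grid."""
        rows, cols = len(traversal_time), len(traversal_time[0])
        arrival = [[float("inf")] * cols for _ in range(rows)]
        heap = []
        for r, c in ignitions:
            arrival[r][c] = 0.0
            heapq.heappush(heap, (0.0, r, c))
        while heap:
            t, r, c = heapq.heappop(heap)
            if t > arrival[r][c]:
                continue  # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # travel cost: average of the two cells' traversal times
                    nt = t + 0.5 * (traversal_time[r][c] + traversal_time[nr][nc])
                    if nt < arrival[nr][nc]:
                        arrival[nr][nc] = nt
                        heapq.heappush(heap, (nt, nr, nc))
        return arrival
    ```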

  6. 40 CFR Table 2 to Subpart Ffff of... - Model Rule-Emission Limitations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... micrograms per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Method 29 of appendix A of this part. 2. Carbon monoxide 40 parts per million by dry volume 3-run average (1 hour minimum sample time per run during performance test), and 12-hour rolling averages measured using CEMS b...

  7. 40 CFR Table 1 to Subpart Cccc of... - Emission Limitations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B of appendix A of this...

  8. 40 CFR Table 2 to Subpart Ffff of... - Model Rule-Emission Limitations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... micrograms per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Method 29 of appendix A of this part. 2. Carbon monoxide 40 parts per million by dry volume 3-run average (1 hour minimum sample time per run during performance test), and 12-hour rolling averages measured using CEMS b...

  9. 40 CFR Table 1 to Subpart Cccc of... - Emission Limitations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B of appendix A of this...

  10. 40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part) Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B, of appendix A of this part) Dioxins/furans...

  11. 40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... part) Hydrogen chloride 62 parts per million by dry volume 3-run average (1 hour minimum sample time...) Sulfur dioxide 20 parts per million by dry volume 3-run average (1 hour minimum sample time per run...-8) or ASTM D6784-02 (Reapproved 2008).c Opacity 10 percent Three 1-hour blocks consisting of ten 6...

  12. Comparison of methods to detect the in vitro activity of silver nanoparticles (AgNP) against multidrug resistant bacteria.

    PubMed

    Cavassin, Emerson Danguy; de Figueiredo, Luiz Francisco Poli; Otoch, José Pinhata; Seckler, Marcelo Martins; de Oliveira, Roberto Angelo; Franco, Fabiane Fantinelli; Marangoni, Valeria Spolon; Zucolotto, Valtencir; Levin, Anna Sara Shafferman; Costa, Silvia Figueiredo

    2015-10-05

    Multidrug resistant microorganisms are a growing challenge, and new substances that can be useful to treat infections due to these microorganisms are needed. Silver nanoparticles may be a future option for treatment of these infections; however, the in vitro methods described to evaluate their inhibitory effect are controversial. This study evaluated the in vitro activity of silver nanoparticles against 36 susceptible and 54 multidrug resistant Gram-positive and Gram-negative bacteria from clinical sources. The multidrug resistant bacteria were oxacillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus spp., carbapenem- and polymyxin B-resistant A. baumannii, carbapenem-resistant P. aeruginosa and carbapenem-resistant Enterobacteriaceae. We analyzed silver nanoparticles stabilized with citrate, chitosan and polyvinyl alcohol, as well as a commercial silver nanoparticle. Silver sulfadiazine and silver nitrate were used as controls. Different methods were used: agar diffusion, minimum inhibitory concentration, minimum bactericidal concentration and time-kill. The activity of AgNPs measured by diffusion in solid media and by MIC showed a similar effect against MDR and antimicrobial-susceptible isolates, with a higher effect against Gram-negative isolates. The best results were achieved with the citrate and chitosan silver nanoparticles, both with an MIC90 of 6.75 μg mL(-1), which may be due to the lower stability of these particles and, consequently, the release of Ag(+) ions, as revealed by X-ray diffraction (XRD). The bactericidal effect was higher against antimicrobial-susceptible bacteria. The agar diffusion method can be used as a screening test, while minimum inhibitory concentration/minimum bactericidal concentration and time-kill proved to be useful methods. The activity of the commercial silver nanoparticle and the silver controls did not exceed that of the citrate and chitosan silver nanoparticles. The in vitro inhibitory effect was stronger against Gram-negative than Gram-positive bacteria, and similar against multidrug resistant and susceptible bacteria, with the best results achieved using citrate and chitosan silver nanoparticles. The bactericidal effect of silver nanoparticles may, in the future, be translated into important therapeutic and clinical options, especially considering the shortage of new antimicrobials against emerging antimicrobial resistant microorganisms, in particular against Gram-negative bacteria.
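
    For readers unfamiliar with the summary statistic quoted above, an MIC90 is the lowest tested concentration that inhibits at least 90% of isolates; a small sketch (ours, with a made-up isolate panel apart from the 6.75 μg/mL figure) illustrates the computation:

    ```python
    # Hedged illustration (not the study's code): computing an MIC90, i.e. the
    # lowest tested concentration that inhibits at least 90% of isolates.
    def mic90(mic_values):
        ordered = sorted(mic_values)
        index = -(-len(ordered) * 9 // 10) - 1  # ceil(0.9 * n) - 1, zero-based
        return ordered[index]

    # Hypothetical panel of isolate MICs (ug/mL) for one nanoparticle preparation
    isolate_mics = [1.69, 3.38, 3.38, 6.75, 6.75, 6.75, 6.75, 6.75, 13.5, 6.75]
    print(mic90(isolate_mics), "ug/mL")  # -> 6.75 for this hypothetical panel
    ```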

  13. Bose-Einstein condensation of photons from the thermodynamic limit to small photon numbers

    NASA Astrophysics Data System (ADS)

    Nyman, Robert A.; Walker, Benjamin T.

    2018-03-01

    Photons can come to thermal equilibrium at room temperature by scattering multiple times from a fluorescent dye. By confining the light and dye in a microcavity, a minimum energy is set and the photons can then show Bose-Einstein condensation. We present here the physical principles underlying photon thermalization and condensation, and review the literature on the subject. We then explore the 'small' regime where very few photons are needed for condensation. We compare thermal equilibrium results to a rate-equation model of microlasers, which includes spontaneous emission into the cavity, and we note that small systems result in ambiguity in the definition of threshold.
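
    For orientation, a textbook beta-factor microlaser rate-equation pair of the kind compared against reads as follows (our notation, not necessarily the authors'; the (n+1) term carries the spontaneous emission into the cavity mode):

    ```latex
    % n = photon number in the cavity mode, N = number of excited molecules,
    % kappa = cavity loss rate, gamma = total decay rate of the excitation,
    % P = pump rate, beta = fraction of spontaneous emission into the cavity mode.
    \frac{dn}{dt} = \beta\,\gamma\,N\,(n+1) - \kappa\,n
    \qquad
    \frac{dN}{dt} = P - \gamma\,N - \beta\,\gamma\,N\,n
    ```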

  14. A multilevel analysis of aggressive behaviors among nursing home residents.

    PubMed

    Cassie, Kimberly M

    2012-01-01

    Individual and organizational characteristics associated with aggressive behavior among nursing home residents were examined among a sample of 5,494 residents from 23 facilities using the Minimum Data Set 2.0 and the Organizational Social Context scale. On admission, some individual level variables (age, sex, depression, activities of daily life [ADL] impairments, and cognitive impairments) and no organizational level variables were associated with aggressive behaviors. Over time, aggressive behaviors were linked with some individual characteristics (age, sex, and ADL impairments) and several organizational level variables (stressful climates, less rigid cultures, more resistant cultures, geographic location, facility size and staffing patterns). Findings suggest multi-faceted change strategies are needed.

  15. Distributed Optimization for a Class of Nonlinear Multiagent Systems With Disturbance Rejection.

    PubMed

    Wang, Xinghu; Hong, Yiguang; Ji, Haibo

    2016-07-01

    The paper studies the distributed optimization problem for a class of nonlinear multiagent systems in the presence of external disturbances. To solve the problem, we need to achieve the optimal multiagent consensus based on local cost function information and neighboring information and meanwhile to reject local disturbance signals modeled by an exogenous system. With convex analysis and the internal model approach, we propose a distributed optimization controller for heterogeneous and nonlinear agents in the form of continuous-time minimum-phase systems with unity relative degree. We prove that the proposed design can solve the exact optimization problem with rejecting disturbances.
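
    The internal-model controller itself is beyond a short snippet, but the consensus-plus-local-gradient idea underlying such designs can be sketched in a few lines (a basic discrete-time first-order scheme of our own, not the paper's controller; all numbers are hypothetical):

    ```python
    # Three agents with local quadratic costs f_i(x) = 0.5*(x - a_i)^2 seek the
    # minimizer of sum_i f_i, which is mean(a) = 4.0, using only neighbour mixing
    # and local gradients.
    import numpy as np

    a = np.array([1.0, 4.0, 7.0])         # local cost parameters (hypothetical)
    x = np.zeros(3)                        # each agent's current estimate

    # Metropolis mixing weights for a 3-agent line graph 1-2-3 (doubly stochastic)
    W = np.array([[2/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3],
                  [0.0, 1/3, 2/3]])
    step = 0.02                            # small fixed step size

    for _ in range(2000):
        x = W @ x - step * (x - a)         # average with neighbours, then descend
    print(x)  # entries end near 4.0, up to a small fixed-step consensus bias
    ```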

  16. Spiral biasing adaptor for use in Si drift detectors and Si drift detector arrays

    DOEpatents

    Li, Zheng; Chen, Wei

    2016-07-05

    A drift detector array, preferably a silicon drift detector (SDD) array, that uses a low-current biasing adaptor is disclosed. The biasing adaptor can be customized for any desired single-cell geometry of the drift detector while minimizing the drift time of carriers. The biasing adaptor has spiral-shaped ion implants that generate the desired voltage profile. The biasing adaptor can be processed on the same wafer as the drift detector array, and only one biasing-adaptor chip per side is needed for a drift detector array to generate the voltage profiles on the front side and back side of the detector array.

  17. Dynamics of ultralight aircraft: Dive recovery of hang gliders

    NASA Technical Reports Server (NTRS)

    Jones, R. T.

    1977-01-01

    Longitudinal control of a hang glider by weight shift is not always adequate for recovery from a vertical dive. According to Lanchester's phugoid theory, recovery from rest to horizontal flight ought to be possible within a distance equal to three times the height of fall needed to acquire level flight velocity. A hang glider, having a wing loading of 5 kg/sq m and capable of developing a lift coefficient of 1.0, should recover to horizontal flight within a vertical distance of about 12 m. The minimum recovery distance can be closely approached if the glider is equipped with a small all-moveable tail surface having sufficient upward deflection.
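
    As a quick sanity check on the 12 m figure, a short calculation (our arithmetic, assuming standard sea-level air density) reproduces it:

    ```python
    # Phugoid recovery distance = 3 x height of fall needed to reach level-flight
    # speed; speed follows from the lift balance L = W.
    g, rho = 9.81, 1.225            # m/s^2, kg/m^3 (sea-level density assumed)
    wing_loading = 5.0              # kg/m^2 (mass per unit wing area)
    CL = 1.0                        # lift coefficient

    v_sq = 2 * wing_loading * g / (rho * CL)   # level-flight speed squared
    h_fall = v_sq / (2 * g)                    # height of fall to reach that speed
    print(3 * h_fall)                          # recovery distance: ~12.2 m
    ```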

  18. 24 CFR 3280.301 - Scope.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... This subpart covers the minimum requirements for materials, products, equipment and workmanship needed...) Protection against corrosion, decay, insects and other similar destructive forces, (c) Protection against...

  19. Parental leave for residents and pediatric training programs.

    PubMed

    2013-02-01

    The American Academy of Pediatrics (AAP) is committed to the development of rational, equitable, and effective parental leave policies that are sensitive to the needs of pediatric residents, families, and developing infants and that enable parents to spend adequate and good-quality time with their young children. It is important for each residency program to have a policy for parental leave that is written, that is accessible to residents, and that clearly delineates program practices regarding parental leave. At a minimum, a parental leave policy for residents and fellows should conform legally with the Family Medical Leave Act as well as with respective state laws and should meet institutional requirements of the Accreditation Council for Graduate Medical Education for accredited programs. Policies should be well formulated and communicated in a culturally sensitive manner. The AAP advocates for extension of benefits consistent with the Family Medical Leave Act to all residents and interns beginning at the time that pediatric residency training begins. The AAP recommends that regardless of gender, residents who become parents should be guaranteed 6 to 8 weeks, at a minimum, of parental leave with pay after the infant's birth. In addition, in conformance with federal law, the resident should be allowed to extend the leave time when necessary by using paid vacation time or leave without pay. Coparenting, adopting, or fostering of a child should entitle the resident, regardless of gender, to the same amount of paid leave (6-8 weeks) as a person who takes maternity/paternity leave. Flexibility, creativity, and advanced planning are necessary to arrange schedules that optimize resident education and experience, cultivate equity in sharing workloads, and protect pregnant residents from overly strenuous work experiences at critical times of their pregnancies.

  20. Measuring and managing radiologist workload: a method for quantifying radiologist activities and calculating the full-time equivalents required to operate a service.

    PubMed

    MacDonald, Sharyn L S; Cowan, Ian A; Floyd, Richard A; Graham, Rob

    2013-10-01

    Accurate and transparent measurement and monitoring of radiologist workload is highly desirable for management of daily workflow in a radiology department, and for informing decisions on department staffing needs. It offers the potential for benchmarking between departments and assessing future national workforce and training requirements. We describe a technique for quantifying, with minimum subjectivity, all the work carried out by radiologists in a tertiary department. Six broad categories of clinical activities contributing to radiologist workload were identified: reporting, procedures, trainee supervision, clinical conferences and teaching, informal case discussions, and administration related to referral forms. Time required for reporting was measured using data from the radiology information system. Other activities were measured by observation and timing by observers, and based on these results and extensive consultation, the time requirements and frequency of each activity was agreed on. An activity list was created to record this information and to calculate the total clinical hours required to meet the demand for radiologist services. Diagnostic reporting accounted for approximately 35% of radiologist clinical time; procedures, 23%; trainee supervision, 15%; conferences and tutorials, 14%; informal case discussions, 10%; and referral-related administration, 3%. The derived data have been proven reliable for workload planning over the past 3 years. A transparent and robust method of measuring radiologists' workload has been developed, with subjective assessments kept to a minimum. The technique has value for daily workload and longer term planning. It could be adapted for widespread use. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
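
    The FTE arithmetic at the heart of the method is simple; here is a minimal sketch (ours) in which the category shares match the percentages reported above, but the weekly demand hours and the clinical hours one FTE provides are hypothetical:

    ```python
    # Activity-list style FTE calculation: total clinical hours demanded per week
    # divided by the clinical hours one full-time radiologist can supply.
    weekly_demand_hours = {
        "reporting": 140.0,                 # ~35% of clinical time
        "procedures": 92.0,                 # ~23%
        "trainee_supervision": 60.0,        # ~15%
        "conferences_teaching": 56.0,       # ~14%
        "informal_case_discussions": 40.0,  # ~10%
        "referral_administration": 12.0,    # ~3%
    }
    clinical_hours_per_fte = 32.0  # hypothetical clinical hours per FTE per week

    total = sum(weekly_demand_hours.values())
    print(f"total demand: {total:.0f} h/week -> "
          f"{total / clinical_hours_per_fte:.1f} FTE required")
    ```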

  1. Regaining Lost Separation in a Piloted Simulation of Autonomous Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Eischeid, Todd M.; Palmer, Michael T.; Wing, David J.

    2002-01-01

    NASA is currently investigating a new concept of operations for the National Airspace System, designed to improve capacity while maintaining or improving current levels of safety. This concept, known as Distributed Air/Ground Traffic Management (DAG-TM), allows appropriately equipped autonomous aircraft to maneuver freely for flight optimization while resolving conflicts with other traffic and staying out of special use airspace and hazardous weather. While Airborne Separation Assurance System (ASAS) tools would normally allow pilots to resolve conflicts before they become hazardous, evaluation of system performance in sudden, near-term conflicts is needed in order to determine concept feasibility. If an acceptable safety level can be demonstrated in these situations, then operations may be conducted with lower separation minimums. An experiment was conducted in NASA Langley's Air Traffic Operations Lab to address issues associated with resolving near-term conflicts and the potential use of lower separation minimums. Sixteen commercial airline pilots flew a total of 32 traffic scenarios that required them to use prototype ASAS tools to resolve close range pop-up conflicts. Required separation standards were set at either 3 or 5 NM lateral spacing, with 1000 ft vertical separation being used for both cases. Reducing the lateral separation from 5 to 3 NM did not appear to increase operational risk, as indicated by the proximity to the intruder aircraft. Pilots performed better when they followed tactical guidance cues provided by ASAS than when they didn't follow the guidance. As air-air separation concepts evolve, further studies will consider integration issues between ASAS and existing Airborne Collision Avoidance Systems (ACAS). These types of non-normal events will require the ASAS to provide effective alerts and resolutions prior to the time that an Airborne Collision Avoidance System (ACAS) would give a Resolution Advisory (RA). When an RA is issued, a pilot must take immediate action in order to avoid a potential near miss. The Traffic Alert and Collision Avoidance System (TCAS) II currently functions as an ACAS aboard commercial aircraft. Depending on the own aircraft's altitude, TCAS only issues RAs 15-35 seconds prior to the Closest Point of Approach (CPA). Prior to an RA, DAG-TM pilots operating autonomous aircraft must rely solely on ASAS for resolution guidance. An additional area of DAG-TM concept feasibility relates to a potential reduction in separation standards. Lower separation standards are likely needed in order to improve NAS efficiency and capacity. Current separation minimums are based in large part on the capabilities of older radar systems. Safety assessments are needed to determine the feasibility of reduced separation minimums. They will give strong consideration to surveillance system performance, including accuracy, integrity, and availability. Candidate surveillance systems include Automatic Dependent Surveillance-Broadcast (ADS-B) and multi-lateration systems. Considering studies done for Reduced Vertical Separation Minimums (RVSM) operations, it is likely that flight technical errors will also be considered. In addition to a thorough evaluation of surveillance system performance, a potential decision to lower the separation standards should also take operational considerations into account. An ASAS Safety Assessment study identified improper maneuvering in response to a conflict (due to ambiguous or improper resolution commands or a pilot's failure to comply with the resolution) as a potential safety risk. If near-term conflicts with lower separation minimums were determined to be more challenging for pilots, the severity of these risks could be even greater.

  2. The Effect of Minimum Wages on Adolescent Fertility: A Nationwide Analysis.

    PubMed

    Bullinger, Lindsey Rose

    2017-03-01

    To investigate the effect of minimum wage laws on adolescent birth rates in the United States. I used a difference-in-differences approach and vital statistics data measured quarterly at the state level from 2003 to 2014. All models included state covariates, state and quarter-year fixed effects, and state-specific quarter-year nonlinear time trends, which provided plausibly causal estimates of the effect of minimum wage on adolescent birth rates. A $1 increase in minimum wage reduces adolescent birth rates by about 2%. The effects are driven by non-Hispanic White and Hispanic adolescents. Nationwide, increasing minimum wages by $1 would likely result in roughly 5000 fewer adolescent births annually.
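
    The two-way fixed-effects specification described above can be sketched on synthetic data (a hedged illustration; the variable names, data-generating process, and effect size are ours, not the article's):

    ```python
    # Difference-in-differences via state and quarter fixed effects, with errors
    # clustered by state; the true (synthetic) effect of a $1 increase is -0.8.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    rows = []
    for s in range(10):                     # 10 hypothetical states
        for q in range(48):                 # 48 quarters (2003-2014)
            mw = 5.15 + 0.05 * q + rng.uniform(0, 2)  # state-quarter variation
            rows.append({"state": s, "quarter": q, "min_wage": mw,
                         "birth_rate": 40.0 - 0.8 * mw      # true effect
                                       + 0.5 * s + 0.1 * q  # state/quarter levels
                                       + rng.normal(0, 1)})
    df = pd.DataFrame(rows)

    # State and quarter dummies absorb the fixed effects
    fit = smf.ols("birth_rate ~ min_wage + C(state) + C(quarter)", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["state"]})
    print(fit.params["min_wage"])           # recovers roughly -0.8
    ```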

  3. 24 CFR 570.486 - Local government requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... such groups; (5) Provide for a minimum of two public hearings, each at a different stage of the program... review of program performance. The public hearings to cover community development and housing needs must... accommodations for the handicapped. Public hearings shall be conducted in a manner to meet the needs of non...

  4. 24 CFR 570.486 - Local government requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... such groups; (5) Provide for a minimum of two public hearings, each at a different stage of the program... review of program performance. The public hearings to cover community development and housing needs must... accommodations for the handicapped. Public hearings shall be conducted in a manner to meet the needs of non...

  5. [Access by hearing-disabled individuals to health services in a southern Brazilian city].

    PubMed

    Freire, Daniela Buchrieser; Gigante, Luciana Petrucci; Béria, Jorge Umberto; Palazzo, Lílian dos Santos; Figueiredo, Andréia Cristina Leal; Raymann, Beatriz Carmen Warth

    2009-04-01

    This cross-sectional study aimed to compare access to health services and preventive measures by persons with hearing disability and those with normal hearing in Canoas, Rio Grande do Sul State, Brazil. The sample included 1,842 individuals aged 15 years or older (52.9% of whom were female). The most frequent income bracket was twice the minimum wage or more, approximately US$360/month (42.7%). Individuals with hearing disability were more likely to have visited a physician in the previous two months (PR = 1.3, 95%CI: 1.10-1.51) and to have been hospitalized in the previous 12 months (PR = 2.1, 95%CI: 1.42-3.14). Regarding mental health, individuals with hearing disability showed 1.5 times greater probability of health care due to mental disorders and 4.2 times greater probability of psychiatric hospitalization compared to those with normal hearing. Consistent with other studies, women with hearing disability performed less breast self-examination and had fewer Pap smears. The data indicate the need to invest in specific campaigns for this group of individuals with special needs.

  6. General guidelines for biomedical software development

    PubMed Central

    Silva, Luis Bastiao; Jimenez, Rafael C.; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people who wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, making it hard to identify the utility and quality of each. At the same time, this situation has impeded regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic. PMID:28443186

  7. Setting a minimum age for juvenile justice jurisdiction in California

    PubMed Central

    Barnert, Elizabeth S.; Abrams, Laura S.; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2018-01-01

    Purpose Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one. Paper type Conceptual paper PMID:28299968

  8. Setting a minimum age for juvenile justice jurisdiction in California.

    PubMed

    S Barnert, Elizabeth; S Abrams, Laura; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2017-03-13

    Purpose Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one.

  9. Lithology-dependent minimum horizontal stress and in-situ stress estimate

    NASA Astrophysics Data System (ADS)

    Zhang, Yushuai; Zhang, Jincai

    2017-04-01

    Based on the generalized Hooke's law with coupled stresses and pore pressure, the minimum horizontal stress is solved with the assumption that the vertical, minimum and maximum horizontal stresses are in equilibrium in the subsurface formations. From this derivation, we find that the uniaxial strain method gives the minimum value, or lower bound, of the minimum stress. Using Anderson's faulting theory and this lower bound of the minimum horizontal stress, the coefficient of friction of the fault is derived. It shows that the coefficient of friction may have a much smaller value than is commonly assumed (e.g., μf = 0.6-0.7) for in-situ stress estimates. Using the derived coefficient of friction, an improved stress polygon is drawn, which can reduce the uncertainty of in-situ stress calculation by narrowing the area of the conventional stress polygon. It also shows that the coefficient of friction of the fault depends on lithology. For example, if the formation in the fault is composed of weak shales, the coefficient of friction of the fault may be small (as low as μf = 0.2). This implies that such a fault is weaker and more likely to experience shear failure than a fault composed of sandstones. To keep a weak fault from sliding in shear, the formation needs a higher minimum stress and a lower shear stress. That is, the critically stressed weak fault maintains a higher minimum stress, which explains why a low shear stress appears in the frictionally weak fault.
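
    For reference, the textbook forms consistent with the derivation described above are as follows (our notation; the paper's exact expressions may differ):

    ```latex
    % Uniaxial-strain lower bound on the minimum horizontal stress
    % (nu = Poisson's ratio, sigma_V = vertical stress, p_p = pore pressure,
    %  alpha = Biot coefficient):
    \sigma_{h,\min} = \frac{\nu}{1-\nu}\,\bigl(\sigma_V - \alpha p_p\bigr) + \alpha p_p

    % Anderson normal-faulting limit relating effective stresses to the
    % friction coefficient mu_f of a critically stressed fault:
    \frac{\sigma_V - p_p}{\sigma_h - p_p} \le \Bigl(\sqrt{\mu_f^{2}+1} + \mu_f\Bigr)^{2}
    ```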

  10. Using regression methods to estimate stream phosphorus loads at the Illinois River, Arkansas

    USGS Publications Warehouse

    Haggard, B.E.; Soerens, T.S.; Green, W.R.; Richards, R.P.

    2003-01-01

    The development of total maximum daily loads (TMDLs) requires evaluating existing constituent loads in streams. Accurate estimates of constituent loads are needed to calibrate watershed and reservoir models for TMDL development. The best approach to estimate constituent loads is high frequency sampling, particularly during storm events, and mass integration of constituents passing a point in a stream. Most often, resources are limited and discrete water quality samples are collected on fixed intervals and sometimes supplemented with directed sampling during storm events. When resources are limited, mass integration is not an accurate means to determine constituent loads and other load estimation techniques such as regression models are used. The objective of this work was to determine the minimum number of water-quality samples needed to provide constituent concentration data adequate to estimate constituent loads at a large stream. Twenty sets of water quality samples with and without supplemental storm samples were randomly selected at various fixed intervals from a database at the Illinois River, northwest Arkansas. The random sets were used to estimate total phosphorus (TP) loads using regression models. The regression-based annual TP loads were compared to the integrated annual TP load estimated using all the data. At a minimum, monthly sampling plus supplemental storm samples (six samples per year) was needed to produce a root mean square error of less than 15%. Water quality samples should be collected at least semi-monthly (every 15 days) in studies lasting less than two years if seasonal time factors are to be used in the regression models. Annual TP loads estimated from independently collected discrete water quality samples further demonstrated the utility of using regression models to estimate annual TP loads in this stream system.
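
    A minimal load-regression sketch of the rating-curve kind conveys the approach (our illustration with hypothetical numbers; the article's actual models may include seasonal terms, and the log-retransformation bias correction is omitted for brevity):

    ```python
    # Fit ln(load) = b0 + b1*ln(Q) from discrete samples, then sum predicted
    # daily loads over a year. Q in m^3/s, TP in mg/L; Q*C*86.4 converts to kg/day.
    import numpy as np

    def fit_load_model(q_samples, tp_conc):
        x = np.log(q_samples)
        y = np.log(q_samples * tp_conc * 86.4)   # instantaneous load, kg/day
        b1, b0 = np.polyfit(x, y, 1)
        return b0, b1

    def annual_load(b0, b1, daily_q):
        """Sum regression-predicted daily loads (kg) over a year of discharge."""
        return float(np.sum(np.exp(b0 + b1 * np.log(daily_q))))

    # Hypothetical numbers: six samples/year (monthly plus storm supplements)
    q = np.array([2.0, 3.5, 12.0, 40.0, 5.0, 1.5])      # discharge, m^3/s
    c = np.array([0.05, 0.06, 0.15, 0.40, 0.08, 0.04])  # TP, mg/L
    b0, b1 = fit_load_model(q, c)
    print(annual_load(b0, b1, daily_q=np.full(365, 6.0)))  # kg/year
    ```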

  11. The minimum follow-up required for radial head arthroplasty: a meta-analysis.

    PubMed

    Laumonerie, P; Reina, N; Kerezoudis, P; Declaux, S; Tibbo, M E; Bonnevialle, N; Mansat, P

    2017-12-01

    The primary aim of this study was to define the standard minimum follow-up required to produce a reliable estimate of the rate of re-operation after radial head arthroplasty (RHA). The secondary objective was to define the leading reasons for re-operation. Four electronic databases were searched between January 2000 and March 2017. Articles reporting reasons for re-operation (Group I) and results (Group II) after RHA were included. In Group I, a meta-analysis was performed to obtain the standard minimum follow-up, the mean time to re-operation and the reason for failure. In Group II, the minimum follow-up for each study was compared with the standard minimum follow-up. A total of 40 studies were analysed: three were Group I and included 80 implants, and 37 were Group II and included 1192 implants. In Group I, the mean time to re-operation was 1.37 years (0 to 11.25), the standard minimum follow-up was 3.25 years, and painful loosening was the main indication for re-operation. In Group II, 33 articles (89.2%) reported a minimum follow-up of < 3.25 years. The literature therefore does not provide a reliable estimate of the rate of re-operation after RHA. The reproducibility of results would be improved by using a minimum follow-up of three years combined with a consensus definition of the reasons for failure after RHA. Cite this article: Bone Joint J 2017;99-B:1561-70. ©2017 The British Editorial Society of Bone & Joint Surgery.

  12. Energy Minimization of Discrete Protein Titration State Models Using Graph Theory.

    PubMed

    Purvine, Emilie; Monson, Kyle; Jurrus, Elizabeth; Star, Keith; Baker, Nathan A

    2016-08-25

    There are several applications in computational biophysics that require the optimization of discrete interacting states, for example, amino acid titration states, ligand oxidation states, or discrete rotamer angles. Such optimization can be very time-consuming as it scales exponentially in the number of sites to be optimized. In this paper, we describe a new polynomial time algorithm for optimization of discrete states in macromolecular systems. This algorithm was adapted from image processing and uses techniques from discrete mathematics and graph theory to restate the optimization problem in terms of "maximum flow-minimum cut" graph analysis. The interaction energy graph, a graph in which vertices (amino acids) and edges (interactions) are weighted with their respective energies, is transformed into a flow network in which the value of the minimum cut in the network equals the minimum free energy of the protein and the cut itself encodes the state that achieves the minimum free energy. Because of its deterministic nature and polynomial time performance, this algorithm has the potential to allow for the ionization state of larger proteins to be discovered.
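
    The max-flow/min-cut restatement can be demonstrated on a toy two-site network (our sketch; the graph construction for real titration models is more involved, and all capacities here are made up):

    ```python
    # Toy illustration of the max-flow/min-cut reformulation using networkx.
    import networkx as nx

    G = nx.DiGraph()
    # Source/sink arcs carry each site's two-state energies; the inter-site arcs
    # penalize disagreeing state assignments (a pairwise interaction term).
    G.add_edge("s", "site1", capacity=1.0)   # energy of site1 in state B
    G.add_edge("site1", "t", capacity=2.0)   # energy of site1 in state A
    G.add_edge("s", "site2", capacity=3.0)
    G.add_edge("site2", "t", capacity=1.0)
    G.add_edge("site1", "site2", capacity=0.5)
    G.add_edge("site2", "site1", capacity=0.5)

    cut_value, (side_s, side_t) = nx.minimum_cut(G, "s", "t")
    print(cut_value)        # minimum total energy of this toy two-site model
    print(side_s, side_t)   # the partition encodes the optimal state assignment
    ```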

  13. Energy Minimization of Discrete Protein Titration State Models Using Graph Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purvine, Emilie AH; Monson, Kyle E.; Jurrus, Elizabeth R.

    There are several applications in computational biophysics which require the optimization of discrete interacting states; e.g., amino acid titration states, ligand oxidation states, or discrete rotamer angles. Such optimization can be very time-consuming as it scales exponentially in the number of sites to be optimized. In this paper, we describe a new polynomial-time algorithm for optimization of discrete states in macromolecular systems. This algorithm was adapted from image processing and uses techniques from discrete mathematics and graph theory to restate the optimization problem in terms of maximum flow-minimum cut graph analysis. The interaction energy graph, a graph in which vertices (amino acids) and edges (interactions) are weighted with their respective energies, is transformed into a flow network in which the value of the minimum cut in the network equals the minimum free energy of the protein, and the cut itself encodes the state that achieves the minimum free energy. Because of its deterministic nature and polynomial-time performance, this algorithm has the potential to allow for the ionization state of larger proteins to be discovered.

  14. Energy Minimization of Discrete Protein Titration State Models Using Graph Theory

    PubMed Central

    Purvine, Emilie; Monson, Kyle; Jurrus, Elizabeth; Star, Keith; Baker, Nathan A.

    2016-01-01

    There are several applications in computational biophysics which require the optimization of discrete interacting states; e.g., amino acid titration states, ligand oxidation states, or discrete rotamer angles. Such optimization can be very time-consuming as it scales exponentially in the number of sites to be optimized. In this paper, we describe a new polynomial-time algorithm for optimization of discrete states in macromolecular systems. This algorithm was adapted from image processing and uses techniques from discrete mathematics and graph theory to restate the optimization problem in terms of “maximum flow-minimum cut” graph analysis. The interaction energy graph, a graph in which vertices (amino acids) and edges (interactions) are weighted with their respective energies, is transformed into a flow network in which the value of the minimum cut in the network equals the minimum free energy of the protein, and the cut itself encodes the state that achieves the minimum free energy. Because of its deterministic nature and polynomial-time performance, this algorithm has the potential to allow for the ionization state of larger proteins to be discovered. PMID:27089174

  15. virtX - evaluation of a computer-based training system for mobile C-arm systems in trauma and orthopedic surgery.

    PubMed

    Bott, O J; Teistler, M; Duwenkamp, C; Wagner, M; Marschollek, M; Plischke, M; Raab, B W; Stürmer, K M; Pretschner, D P; Dresing, K

    2008-01-01

    Operating room personnel (ORP) operating mobile image intensifier systems (C-arms) need training to produce high quality radiographs with a minimum of time and X-ray exposure. Our study aims at evaluating acceptance, usability and learning effect of the CBT system virtX, which simulates C-arm based X-ray imaging in the context of surgical case scenarios. Prospective, interventional study conducted during an ORP course with three groups: intervention group 1 (training on a PC using virtX), intervention group 2 (virtX with a C-arm as input device), and a control group (training without virtX) -- IV1, IV2 and CG. All participants finished training with the same exercise. Time needed to produce an image of sufficient quality was recorded and analyzed using one-way ANOVA and Dunnett post hoc test (alpha = .05). Acceptance and usability of virtX were evaluated using a questionnaire. CG members (n = 21) needed more time for the exercise than those of IV2 (n = 20): 133 +/- 55 vs. 101 +/- 37 sec. (p = .03). IV1 (n = 12) also performed better than CG (128 +/- 48 sec.), but this was not statistically significant. Seventy-nine participants returned a questionnaire (81% female, age 34 +/- 9 years, professional experience 8.3 +/- 7.6 years; 77% regularly used a C-arm). 83% considered virtX a useful addition to conventional C-arm training. 91% assessed virtual radiography as helpful for understanding C-arm operation. Trainees experienced virtX as a substantial enhancement of C-arm training. Training with virtX can reduce the time needed to perform an imaging task.

  16. Finding minimum-quotient cuts in planar graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, J.K.; Phillips, C.A.

    Given a graph G = (V, E) where each vertex v ∈ V is assigned a weight w(v) and each edge e ∈ E is assigned a cost c(e), the quotient of a cut partitioning the vertices of V into sets S and S̄ is c(S, S̄)/min{w(S), w(S̄)}, where c(S, S̄) is the sum of the costs of the edges crossing the cut and w(S) and w(S̄) are the sums of the weights of the vertices in S and S̄, respectively. The problem of finding a cut whose quotient is minimum for a graph has in recent years attracted considerable attention, due in large part to the work of Rao and of Leighton and Rao. They have shown that an algorithm (exact or approximation) for the minimum-quotient-cut problem can be used to obtain an approximation algorithm for the more famous minimum-b-balanced-cut problem, which requires finding a cut (S, S̄) minimizing c(S, S̄) subject to the constraint bW ≤ w(S) ≤ (1 − b)W, where W is the total vertex weight and b is some fixed balance in the range 0 < b ≤ 1/2. Unfortunately, the minimum-quotient-cut problem is strongly NP-hard for general graphs, and the best polynomial-time approximation algorithm known for the general problem guarantees only a cut whose quotient is at most O(lg n) times optimal, where n is the size of the graph. However, for planar graphs, the minimum-quotient-cut problem appears more tractable, as Rao has developed several efficient approximation algorithms for the planar version of the problem capable of finding a cut whose quotient is at most some constant times optimal. In this paper, we improve Rao's algorithms, both in terms of accuracy and speed. As our first result, we present two pseudopolynomial-time exact algorithms for the planar minimum-quotient-cut problem. As Rao's most accurate approximation algorithm for the problem -- also a pseudopolynomial-time algorithm -- guarantees only a 1.5-times-optimal cut, our algorithms represent a significant advance.

  17. Two-way communication and analysis program on LANDSAT

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Community workshops, field visits, telephone surveys, and other research reveal that professionals at the substate level are interested in and open to consideration of LANDSAT as a planning and resource management tool, but are at the same time skeptical about some of the inherent problems with LANDSAT such as cost, resolution, frequency of coverage, and data continuity. The principal requirements for increasing the utilization of LANDSAT by potential substate users were identified and documented. Without a commitment from the Federal Government for increased substate utilization and the availability of trained professionals to meet the needs of a largely new user community, substate activity is likely to remain at a minimum. Well conceived and well executed demonstration projects could play a critical role in shaping the technology's ability to be more sensitive to substate user needs and interests as well as validating the effectiveness of this data to a skeptical audience.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrisochoides, N.; Sukup, F.

    In this paper we present a parallel implementation of the Bowyer-Watson (BW) algorithm using the task-parallel programming model. The BW algorithm constitutes an ideal mesh refinement strategy for implementing a large class of unstructured mesh generation techniques on both sequential and parallel computers, by avoiding the need for global mesh refinement. Its implementation on distributed memory multicomputers using the traditional data-parallel model has proven very inefficient due to excessive synchronization needed among processors. In this paper we demonstrate that with the task-parallel model we can tolerate the synchronization costs inherent to data-parallel methods by exploiting concurrency at the processor level. Our preliminary performance data indicate that the task-parallel approach: (i) is almost four times faster than the existing data-parallel methods, (ii) scales linearly, and (iii) introduces minimum overheads compared to the "best" sequential implementation of the BW algorithm.

  19. Measuring a critical stress for continuous prevention of marine biofouling accumulation with aeration.

    PubMed

    Menesses, Mark; Belden, Jesse; Dickenson, Natasha; Bird, James

    2017-10-01

    When cleaning the hull of a ship, significant shear stresses are needed to remove established biofouling organisms. Given that there exists a link between the amount of time that fouling accumulates and the stress required to remove it, it is not surprising that more frequent grooming requires less shear stress. Yet, it is unclear if there is a minimum stress needed to prevent the growth of macrofouling in the limit of continuous grooming. This manuscript shows that single bubble stream aeration provides continuous grooming and prevents biofouling accumulation in regions where the average wall stress exceeds ~0.01 Pa. This value was found by comparing observations of biofouling growth from field studies with complementary laboratory measurements that probe the associated flow fields. These results suggest that aeration and other continuous grooming systems must exceed a wall stress of 0.01 Pa to prevent macrofouling accumulation.

  20. Pell Grants as Performance-Based Scholarships? An Examination of Satisfactory Academic Progress Requirements in the Nation's Largest Need-Based Aid Program

    ERIC Educational Resources Information Center

    Schudde, Lauren; Scott-Clayton, Judith

    2016-01-01

    The Federal Pell Grant Program is the nation's largest need-based grant program. While students' initial eligibility for the Pell is based on financial need, renewal is contingent on meeting minimum academic standards similar to those in models of performance-based scholarships, including a grade point average (GPA) requirement and ratio of…

  1. Jobs Taken by Mothers Moving from Welfare to Work and the Effects of Minimum Wages on This Transition.

    ERIC Educational Resources Information Center

    Brandon, Peter D.

    The potential effects of raising the minimum wage on the earnings of mothers moving from welfare to work were examined by analyzing the differences that existed in the late 1980s in the various states' minimum wage rates and data from three waves of the Survey of Income and Program Participation for the years 1985-1990 (during which time 13 states…

  2. Quality improvement in neurology residency programs. Report of the Quality Improvement Committee of the Association of University Professors of Neurology.

    PubMed

    Bradley, W G; Daube, J; Mendell, J R; Posner, J; Richman, D; Troost, B T; Swift, T R

    1997-11-01

    The neurology residency programs in the United States are facing a crisis of quality. The Association of University Professors of Neurology (AUPN) approved the Quality Improvement Committee to examine this situation and make recommendations, which have been accepted by the AUPN. The recommendations are (1) that the educational goals of neurology residency training be dissociated from patient-care needs in academic medical centers and (2) that minimum levels of quality be applied to residents in neurology residency programs and to these programs themselves. These minimum criteria should include minimum educational criteria for entry into the program, minimum criteria for advancement from one year to the next in the program, and minimum criteria for performance of the graduates of neurology residency programs for program accreditation. The implementation of these recommendations will require a shift of funding of the care of indigent patients from the graduate medical education budget to direct patient-care sources. These recommendations will significantly improve the quality of neurologists and neurologic care in the United States.

  3. The Minimum Impulse Thruster

    NASA Technical Reports Server (NTRS)

    Parker, J. Morgan; Wilson, Michael J.

    2005-01-01

    The Minimum Impulse Thruster (MIT) was developed to improve the state-of-the-art minimum impulse capability of hydrazine monopropellant thrusters. Specifically, a new fast response solenoid valve was developed, capable of responding to a much shorter electrical pulse width, thereby reducing the propellant flow time and the minimum impulse bit. The new valve was combined with the Aerojet MR-103, 0.2 lbf (0.9 N) thruster and put through an extensive Delta-qualification test program, resulting in a factor of 5 reduction in the minimum impulse bit, from roughly 1.1 milli-lbf-seconds (5 milliNewton-seconds) to approximately 0.22 milli-lbf-seconds (1 mN-s). To maintain its extensive heritage, the thruster itself was left unchanged. The Minimum Impulse Thruster provides mission and spacecraft designers with new design options for precision pointing and precision translation of spacecraft.

  4. How well do WHO complementary feeding indicators relate to nutritional status of children aged 6-23 months in rural Northern Ghana?

    PubMed

    Saaka, Mahama; Wemakor, Anthony; Abizari, Abdul-Razak; Aryee, Paul

    2015-11-23

    Though the World Health Organization (WHO) recommended Infant and Young Child Feeding (IYCF) indicators have been in use, little is known about their association with child nutritional status. The objective of this study was to explore the relationship between IYCF indicators (timing of complementary feeding, minimum dietary diversity, minimum meal frequency and minimum acceptable diet) and child growth indicators. A community-based cross-sectional survey was carried out in November 2013. The study population comprised mothers/primary caregivers and their children selected using a two-stage cluster sampling procedure. Of the 1984 children aged 6-23 months, 58.2 % met the minimum meal frequency, 34.8 % received minimum dietary diversity (≥4 food groups), 27.8 % had received a minimum acceptable diet and only 15.7 % received appropriate complementary feeding. With respect to nutritional status, 20.5 %, 11.5 % and 21.1 % of the study population were stunted, wasted and underweight, respectively. Multiple logistic regression analysis revealed that compared to children who were introduced to complementary feeding either late or early, children who started complementary feeding at six months of age were 25 % protected from chronic malnutrition (AOR = 0.75, CI = 0.50 - 0.95, P = 0.02). It was found that children whose mothers attended antenatal care (ANC) at least 4 times were 34 % protected [AOR 0.66; 95 % CI (0.50 - 0.88)] against stunted growth compared to children born to mothers who attended ANC less than 4 times. Children from households with a high household wealth index were 51 % protected [AOR 0.49; 95 % CI (0.26 - 0.94)] against chronic malnutrition compared to children from households with a low household wealth index. After adjusting for potential confounders, there was a significant positive association between the appropriate complementary feeding index and mean WLZ (β = 0.10, p = 0.005), but not with mean LAZ. The WHO IYCF indicators better explain weight-for-length Z-scores than length-for-age Z-scores of young children in rural Northern Ghana. Furthermore, a composite indicator comprising timely introduction of solid, semi-solid or soft foods at 6 months, minimum meal frequency, and minimum dietary diversity better explains weight-for-length Z-scores than each of the single indicators.

  5. Droplet squeezing through a narrow constriction: Minimum impulse and critical velocity

    NASA Astrophysics Data System (ADS)

    Zhang, Zhifeng; Drapaca, Corina; Chen, Xiaolin; Xu, Jie

    2017-07-01

    Models of a droplet passing through narrow constrictions have wide applications in science and engineering. In this paper, we report our findings on the minimum impulse (momentum change) of pushing a droplet through a narrow circular constriction. The existence of this minimum impulse is mathematically derived and numerically verified. The minimum impulse happens at a critical velocity when the time-averaged Young-Laplace pressure balances the total minor pressure loss in the constriction. Finally, numerical simulations are conducted to verify these concepts. These results could be relevant to problems of energy optimization and studies of chemical and biomedical systems.
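
    The stated pressure balance can be turned into a back-of-the-envelope estimate of the critical velocity. The sketch below is our own reading of the abstract, not the paper's model: it equates a Young-Laplace pressure difference between the constriction radius r_c and the droplet radius R with a minor loss K·ρ·v²/2, and every parameter value is hypothetical:

        import math

        # All values are assumed for illustration; the paper derives the exact balance.
        gamma, r_c, R = 0.05, 5e-6, 20e-6   # surface tension (N/m), radii (m)
        K, rho = 1.5, 1000.0                # minor-loss coefficient, density (kg/m^3)
        dp_laplace = 2 * gamma * (1 / r_c - 1 / R)
        v_crit = math.sqrt(2 * dp_laplace / (K * rho))
        print(f"critical velocity ~ {v_crit:.2f} m/s")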

  6. The migratory impact of minimum wage legislation: Puerto Rico, 1970-1987.

    PubMed

    Santiago, C E

    1993-01-01

    "This study examines the impact of minimum wage setting on labor migration. A multiple time series framework is applied to monthly data for Puerto Rico from 1970-1987. The results show that net emigration from Puerto Rico to the United States fell in response to significant changes in the manner in which minimum wage policy was conducted, particularly after 1974. The extent of commuter type labor migration between Puerto Rico and the United States is influenced by minimum wage policy, with potentially important consequences for human capital investment and long-term standards of living." excerpt

  7. Improved Gravitation Field Algorithm and Its Application in Hierarchical Clustering

    PubMed Central

    Zheng, Ming; Sun, Ying; Liu, Gui-xia; Zhou, You; Zhou, Chun-guang

    2012-01-01

    Background: The gravitation field algorithm (GFA) is a new optimization algorithm based on an imitation of natural phenomena. GFA performs well in searching both for a global minimum and for multiple minima in computational biology, but it needs to be improved for greater efficiency and modified to apply to some discrete data problems in systems biology. Method: An improved GFA, called IGFA, is proposed in this paper. Two parts were improved in IGFA. The first is the rule of random division, a reasonable strategy that shortens running time. The other is the rotation factor, which improves the accuracy of IGFA. In addition, to apply IGFA to hierarchical clustering, the initialization step and the movement operator were modified. Results: Two kinds of experiments were used to test IGFA, and IGFA was applied to hierarchical clustering. The global-minimum experiment compared IGFA with GFA, GA (genetic algorithm) and SA (simulated annealing); the multi-minima experiment compared IGFA with GFA. The results of the two experiments prove the efficiency of IGFA: it is better than GFA in both accuracy and running time. For hierarchical clustering, IGFA was used to optimize the smallest distance of gene pairs, and the results were compared with GA and SA, single-linkage clustering, and UPGMA, confirming the efficiency of IGFA. PMID:23173043

  8. Automated Glacier Surface Velocity using Multi-Image/Multi-Chip (MIMC) Feature Tracking

    NASA Astrophysics Data System (ADS)

    Ahn, Y.; Howat, I. M.

    2009-12-01

    Remote sensing from space has enabled effective monitoring of remote and inhospitable polar regions. Glacier velocity, and its variation in time, is one of the most important parameters needed to understand glacier dynamics, glacier mass balance, and contribution to sea level rise. Regular measurements of ice velocity are possible from large and accessible satellite data set archives, such as ASTER and LANDSAT-7. Among satellite imagery, optical imagery (i.e., passive, visible to near-infrared band sensors) provides abundant data with optimal spatial resolution and repeat interval for tracking glacier motion at high temporal resolution. Due to the massive amounts of data, computation of ice velocity from feature tracking requires (1) a user-friendly interface, (2) minimal local/user parameter input, and (3) results that need minimum editing. We focus on robust feature tracking applicable to all currently available optical satellite imagery, i.e., ASTER, SPOT, LANDSAT, etc. We introduce the MIMC (multiple images/multiple chip sizes) matching approach, which does not involve any user-defined local/empirical parameters except the approximate average glacier speed. We also introduce a method for extracting velocity from LANDSAT-7 SLC-off data, which has 22 percent of scene data missing in slanted strips due to failure of the scan line corrector. We apply our approach to major outlet glaciers in west/east Greenland and assess our MIMC feature tracking technique by comparison with conventional correlation matching and other methods (e.g., InSAR).

  9. The application of disease management to clinical trial designs.

    PubMed

    Puterman, Jared; Alter, David A

    2009-08-01

    The utilization of disease management (DM) as a minimum standard of care is believed to facilitate pronounced benefits in overall patient outcome and cost management. Randomized clinical trials remain the gold standard evaluative tool in clinical medicine. However, the extent to which contemporary cardiovascular clinical trials incorporate DM components into their treatment or control arms is unknown. Our study is the first to evaluate the extent to which clinical trials incorporate DM as a minimum standard of care for both the intervention and control groups. In total, 386 clinical trials published in 3 leading medical journals between 2003 and 2006 were evaluated. For each study, elements related to DM care, as defined using the American Heart Association Taxonomy, were abstracted and characterized. Our results demonstrate that while the application of DM has increased over time, only 3.4% of the clinical trials examined incorporated all 8 DM elements (and only 11% of such trials incorporated 4 DM elements). A significant association was found between study year and the inclusion of more than 3 elements of DM (χ²(3) = 10.10; p = 0.018). In addition, associations were found between study objective and DM criteria, as well as between cohort type and domains described. Our study serves as a baseline reference for the tracking of DM within, and its application to, randomized clinical trials. Moreover, our results underscore the need for broader implementation and evaluation of DM as a minimum care standard within clinical trial research.

  10. Minimum intravenous thrombolysis utilization rates in acute ischemic stroke to achieve population effects on disability: A discrete-event simulation model.

    PubMed

    Hoffmeister, Lorena; Lavados, Pablo M; Mar, Javier; Comas, Merce; Arrospide, Arantzazu; Castells, Xavier

    2016-06-15

    The only pharmacological treatment with proven cost-effectiveness in reducing acute ischemic stroke (AIS)-associated disability is intravenous thrombolysis with recombinant tissue plasminogen activator, but its utilization rate is still low in most of the world. We estimated the minimum thrombolysis utilization rate needed to decrease the prevalence of stroke-related disability at a population level by using a discrete-event simulation model. The model included efficacy according to time to treatment up to 4.5 h, and four scenarios for the utilization of intravenous thrombolysis in eligible patients with AIS: (a) 2%; (b) 12%; (c) 25%; and (d) 40%. We calculated the prevalence of AIS-related disability in each scenario, using population-based data. The simulation was performed from 2002 to 2017 using the ARENA software. A 2% utilization rate yielded a prevalence of disability of 359.1 per 100,000. Increasing thrombolysis to 12% avoided 779 disabled patients. If the utilization rate was increased to 25%, 1783 disabled patients would be avoided. The maximum scenario of 40% decreased disability to 335.7 per 100,000, avoiding 17% of AIS-related disability. The current utilization rate of intravenous thrombolysis of 2% has minimal population impact. Increasing the utilization rate to more than 12% is the minimum needed to have a significant population effect on disability and should be a public policy aim.
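
    The scenario structure can be illustrated with a toy cohort simulation. This is not the authors' ARENA discrete-event model; the baseline disability probability and the treatment relative risk below are hypothetical and serve only to show how a utilization rate maps to a disability prevalence:

        import random

        def prevalence(utilization, n=100_000, p_disabled=0.36, rel_risk=0.8, seed=1):
            # Each eligible AIS patient is thrombolysed with probability `utilization`;
            # treatment lowers the disability probability by an assumed relative risk.
            rng = random.Random(seed)
            disabled = 0
            for _ in range(n):
                treated = rng.random() < utilization
                disabled += rng.random() < p_disabled * (rel_risk if treated else 1.0)
            return disabled / n

        for u in (0.02, 0.12, 0.25, 0.40):   # the four utilization scenarios
            print(f"{u:.0%} utilization -> disability prevalence {prevalence(u):.4f}")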

  11. A soil water based index as a suitable agricultural drought indicator

    NASA Astrophysics Data System (ADS)

    Martínez-Fernández, J.; González-Zamora, A.; Sánchez, N.; Gumuzzio, A.

    2015-03-01

    Currently, the availability of soil water databases is increasing worldwide. The presence of a growing number of long-term soil moisture networks around the world and the impressive progress of remote sensing in recent years have allowed the scientific community and, in the near future, a diverse group of users to obtain precise and frequent soil water measurements. Therefore, it is reasonable to consider soil water observations as a potential approach for monitoring agricultural drought. In the present work, a new approach to defining the soil water deficit index (SWDI) is analyzed in order to use a soil water series for drought monitoring. In addition, simple and accurate methods that use a soil moisture series alone to obtain the soil water parameters (field capacity and wilting point) needed for calculating the index are evaluated. The application of the SWDI in an agricultural area of Spain presented good results at both daily and weekly time scales when compared to two climatic water deficit indicators (average correlation coefficient, R, 0.6) and to agricultural production. The long-term minimum, the growing season minimum and the 5th percentile of the soil moisture series are good estimators (coefficient of determination, R2, 0.81) for the wilting point. The minimum of the maximum values of the growing season is the best estimator (R2, 0.91) for field capacity. The use of these types of tools for drought monitoring can aid the better management of agricultural lands and water resources, mainly under the current scenario of climate uncertainty.
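
    The SWDI is commonly computed as (θ - θFC)/(θFC - θWP), where θ is the soil water content, θFC the field capacity and θWP the wilting point; a minimal sketch assuming that definition, with the parameter estimators suggested in the abstract (5th percentile of the series for the wilting point, minimum of the growing-season maxima for field capacity) applied to a synthetic series:

        import numpy as np

        def swdi(theta, theta_fc, theta_wp):
            # 0 at field capacity, -1 at the wilting point, positive when wetter than FC.
            return (theta - theta_fc) / (theta_fc - theta_wp)

        rng = np.random.default_rng(0)
        series = rng.uniform(0.05, 0.35, 365)               # synthetic volumetric soil moisture
        theta_wp = np.percentile(series, 5)                 # wilting-point estimator
        theta_fc = series.reshape(5, 73).max(axis=1).min()  # crude seasonal-maxima estimator
        print(swdi(series[:3], theta_fc, theta_wp))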

  12. Effectiveness of Teacher Training: Voices of Teachers Serving High-Needs Populations of Students

    ERIC Educational Resources Information Center

    Varela, Daniella G.; Maxwell, Gerri M.

    2015-01-01

    This study explores the effectiveness of educator preparation programs from the perspective of three female Hispanic veteran teachers serving high-needs populations of students. The study strives to contribute to research on minimum proposed standards for teacher preparation programs in Texas. Through a process of coding data from the informant…

  13. Autonomous Aerobraking: A Design, Development, and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Prince, Jill L. H.; Powell, Richard W.; Murri, Dan

    2011-01-01

    Aerobraking has been used four times to decrease the apoapsis of a spacecraft in a captured orbit around a planetary body with a significant atmosphere, using atmospheric drag to decelerate the spacecraft. While aerobraking requires minimal fuel, its long duration demands both a large operations staff and substantial Deep Space Network (DSN) resources. A study to automate aerobraking has been sponsored by the NASA Engineering and Safety Center to determine the initial feasibility of equipping a spacecraft with the onboard capability for autonomous aerobraking, thus saving the millions of dollars incurred by a large aerobraking operations workforce and continuous DSN coverage. This paper describes the need for autonomous aerobraking; the development of the Autonomous Aerobraking Development Software, which includes an ephemeris estimator, an atmospheric density estimator, and maneuver calculation; and the plan for continuation of this study.

  14. Warning Alert HITL Experiment Results

    NASA Technical Reports Server (NTRS)

    Monk, Kevin J.; Ferm, Lisa; Roberts, Zach

    2018-01-01

    Minimum Operational Performance Standards (MOPS) are being developed to support the integration of Unmanned Aircraft Systems (UAS) in the National Airspace (NAS). Input from subject matter experts and multiple research studies have informed display requirements for Detect-and-Avoid (DAA) systems aimed at supporting timely and appropriate pilot responses to collision hazards. Phase 1 DAA MOPS alerting is designed to inform pilots if an avoidance maneuver is necessary; the two highest alert levels - caution and warning - indicate how soon pilot action is required and whether there is adequate time to coordinate with the air traffic controller (ATC). Additional empirical support is needed to clarify the extent to which warning-level alerting impacts DAA task performance. The present study explores the differential effects of the auditory and visual cues provided by the DAA Warning alert, and performance implications compared to caution-only alerting are discussed.

  15. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. There are various methods: time-domain based, Minimum Variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties and that such an analysis may extend the standard approach to further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process. It helps to discover internal dependencies and human factors, which are otherwise hardly detectable.

  16. Application of independent component analysis for speech-music separation using an efficient score function estimation

    NASA Astrophysics Data System (ADS)

    Pishravian, Arash; Aghabozorgi Sahaf, Masoud Reza

    2012-12-01

    In this paper speech-music separation using Blind Source Separation is discussed. The separating algorithm is based on mutual information minimization, where the natural gradient algorithm is used for the minimization. In order to do that, estimation of the score function from samples of the observation signals (a combination of speech and music) is needed. The accuracy and speed of this estimation affect the quality of the separated signals and the processing time of the algorithm. The score function estimation in the presented algorithm is based on a Gaussian mixture based kernel density estimation method. Experimental results of the presented algorithm on speech-music separation, compared to a separating algorithm based on the Minimum Mean Square Error estimator, indicate that it achieves better performance and less processing time.
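
    The mutual-information minimization described here is commonly implemented with the natural-gradient update W ← W + μ(I − φ(y)yᵀ/T)W, where φ is the score function. The sketch below uses tanh as a stand-in score function; the paper instead estimates the score with a Gaussian-mixture kernel density estimator, so this illustrates the update rule rather than the authors' estimator:

        import numpy as np

        def natural_gradient_ica(x, lr=0.02, iters=1000, seed=0):
            # x: (n_sources, n_samples) mixtures; returns an unmixing matrix W.
            rng = np.random.default_rng(seed)
            n, T = x.shape
            W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
            for _ in range(iters):
                y = W @ x
                W += lr * (np.eye(n) - np.tanh(y) @ y.T / T) @ W  # tanh = stand-in score
            return W

        rng = np.random.default_rng(1)
        s = rng.laplace(size=(2, 4000))              # super-Gaussian stand-in sources
        x = np.array([[0.7, 0.3], [0.4, 0.6]]) @ s   # instantaneous mixing
        y = natural_gradient_ica(x) @ x              # recovered up to scale/permutation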

  17. Optimal control of M/M/1 two-phase queueing system with state-dependent arrival rate, server breakdowns, delayed repair, and N-policy

    NASA Astrophysics Data System (ADS)

    Rao, Hanumantha; Kumar, Vasanta; Srinivasa Rao, T.; Srinivasa Kumar, B.

    2018-04-01

    In this paper, we examine a two-stage queueing system where the arrivals are Poisson with a rate that depends on the state of the server, specifically: vacation, pre-service, operational, or breakdown. The service station is liable to breakdowns and delays in repair because of non-accessibility of the repair facility. The service is in two basic stages, the first being bulk service to all of the customers waiting in the queue and the second being individual service to each of them. The server works under an N-policy and needs preliminary time (startup time) to begin batch service after a vacation period. Startup times, uninterrupted service times, the length of each vacation period, delay times, and service times follow an exponential distribution. Closed-form expressions for the mean system size in the different server states are determined. Numerical investigations are conducted to study the impact of the system parameters on the optimal threshold N and the minimum expected unit cost.
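
    The trade-off behind the optimal threshold N can be seen in the classical M/M/1 N-policy cost decomposition, a much simpler model than the paper's two-phase, state-dependent system; the holding and setup costs below are hypothetical:

        def cost_rate(N, lam=2.0, mu=5.0, hold=1.0, setup=20.0):
            # Classical M/M/1 N-policy (not the paper's model): mean number in
            # system is rho/(1-rho) + (N-1)/2, and the setup cost is amortized
            # over a cycle of expected length N/lam.
            rho = lam / mu
            return hold * (rho / (1 - rho) + (N - 1) / 2) + setup * lam / N

        best_N = min(range(1, 50), key=cost_rate)
        print(best_N, round(cost_rate(best_N), 3))   # ~9 for these assumed costs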

  18. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    NASA Astrophysics Data System (ADS)

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study intra- and inter-site assay variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability but allows also the following of changes due to assay adjustments and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio to support projects to understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3 month periods and followed over time for each assay. Again assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.

  19. [68% of Brazilians want abortion prohibited].

    PubMed

    1991-09-25

    According to a survey, 68% of the Brazilian population want the continuation of the law banning abortion. Only 24% favor liberalization. The penal code stipulates a jail term of 2-8 years for abortion. The survey was carried out in 1991 involving 7018 persons aged 16 or over in 15 municipalities. 71% of those who approved the ban lived in the northeast, north, and central-east regions; 68% in the south and 65% in the southeast were in favor of the prohibition, and 74% in the small towns endorsed this law. 73% of those earning up to 5 times the minimum monthly salary were against abortion, while 65% of those with incomes between 5 and 10 times the minimum salary and 57% of those earning more than 10 times the minimum salary condemned abortion. 72% of women and 64% of men were against it. 73% of young people aged 16-25 wanted the continuation of the ban, compared to 66% of those aged 26-40 and 65% of people 41 or over. 72% of those with up to primary school education, 65% with secondary school education, and 48% with higher education approved the ban. Among those who favored liberalization, 27% lived in the southwest region, 31% were inhabitants of large cities, 36% earned more than 10 times the minimum income monthly, and 39% had obtained higher education.

  20. 25 CFR 39.214 - What is the minimum number of instructional hours required in order to be considered a full-time...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION PROGRAM Administrative Procedures, Student Counts, and Verifications § 39.214 What is the minimum number of instructional...

  1. Diffusion-weighted imaging of breast lesions: Region-of-interest placement and different ADC parameters influence apparent diffusion coefficient values.

    PubMed

    Bickel, Hubert; Pinker, Katja; Polanec, Stephan; Magometschnigg, Heinrich; Wengert, Georg; Spick, Claudio; Bogner, Wolfgang; Bago-Horvath, Zsuzsanna; Helbich, Thomas H; Baltzer, Pascal

    2017-05-01

    To investigate the influence of region-of-interest (ROI) placement and different apparent diffusion coefficient (ADC) parameters on ADC values, diagnostic performance, reproducibility, and measurement time in breast tumours. In this IRB-approved, retrospective study, 149 histopathologically proven breast tumours (109 malignant, 40 benign) in 147 women (mean age 53.2 years) were investigated. Three radiologists independently measured minimum, mean, and maximum ADC, each using three ROI placement approaches: (1) a small 2D ROI, (2) a large 2D ROI, and (3) a 3D ROI covering the whole lesion. One reader performed all measurements twice. Median ADC values, diagnostic performance, reproducibility, and measurement time were calculated and compared between all combinations of ROI placement approaches and ADC parameters. Median ADC values differed significantly between the ROI placement approaches (p < .001). Minimum ADC showed the best diagnostic performance (AUC .928-.956), followed by mean ADC obtained from 2D ROIs (.926-.94). Minimum and mean ADC showed high intra- (ICC .85-.94) and inter-reader reproducibility (ICC .74-.94). Median measurement time was significantly shorter for the 2D ROIs (p < .001). ROI placement significantly influences ADC values measured in breast tumours. Minimum and mean ADC acquired from 2D ROIs are useful for the differentiation of benign and malignant breast lesions, and are highly reproducible, with rapid measurement. • Region of interest placement significantly influences the apparent diffusion coefficient of breast tumours. • Minimum and mean apparent diffusion coefficient perform best and are reproducible. • 2D regions of interest perform best and provide rapid measurement times.

  2. Simulation of hydrodynamics, temperature, and dissolved oxygen in Table Rock Lake, Missouri, 1996-1997

    USGS Publications Warehouse

    Green, W. Reed; Galloway, Joel M.; Richards, Joseph M.; Wesolowski, Edwin A.

    2003-01-01

    Outflow from Table Rock Lake and other White River reservoirs supports a cold-water trout fishery of substantial economic yield in south-central Missouri and north-central Arkansas. The Missouri Department of Conservation has requested an increase in existing minimum flows through the Table Rock Lake Dam from the U.S. Army Corps of Engineers to increase the quality of fishable waters downstream in Lake Taneycomo. Information is needed to assess the effect of increased minimum flows on temperature and dissolved-oxygen concentrations of reservoir water and the outflow. A two-dimensional, laterally averaged, hydrodynamic, temperature, and dissolved-oxygen model, CE-QUAL-W2, was developed and calibrated for Table Rock Lake, located in Missouri, north of the Arkansas-Missouri State line. The model simulates water-surface elevation, heat transport, and dissolved-oxygen dynamics. The model was developed to assess the effects of proposed increases in minimum flow from about 4.4 cubic meters per second (the existing minimum flow) to 11.3 cubic meters per second (the increased minimum flow). Simulations included assessing the effect of (1) increased minimum flows and (2) increased minimum flows with increased water-surface elevations in Table Rock Lake, on outflow temperatures and dissolved-oxygen concentrations. In both minimum flow scenarios, water temperature appeared to stay the same or increase slightly (less than 0.37 °C) and dissolved oxygen appeared to decrease slightly (less than 0.78 mg/L) in the outflow during the thermal stratification season. However, differences between the minimum flow scenarios for water temperature and dissolved-oxygen concentration and the calibrated model were similar to the differences between measured and simulated water-column profile values.

  3. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
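
    A loose sketch of the idea (the paper's entropy-density definition differs in detail): coarse-grain the series at each candidate scale, estimate the Shannon entropy from a fixed number of samples so the estimates are comparable across scales, and pick the scale where the entropy is minimal:

        import numpy as np

        def entropy_at_scale(series, scale, bins=8, samples=200):
            # Coarse-grain, keep a fixed sample count, and return Shannon entropy.
            n = (len(series) // scale) * scale
            coarse = series[:n].reshape(-1, scale).mean(axis=1)[:samples]
            hist, _ = np.histogram(coarse, bins=bins)
            p = hist[hist > 0] / hist.sum()
            return float(-(p * np.log2(p)).sum())

        prices = np.cumsum(np.random.default_rng(2).standard_normal(20_000))  # stand-in index
        returns = np.diff(prices)
        structure_scale = min(range(1, 50), key=lambda s: entropy_at_scale(returns, s))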

  4. Fuzzy α-minimum spanning tree problem: definition and solutions

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Chen, Lu; Wang, Ke; Yang, Fan

    2016-04-01

    In this paper, the minimum spanning tree problem is investigated on graphs with fuzzy edge weights. The notion of the fuzzy α-minimum spanning tree is presented based on the credibility measure, and the solutions of the fuzzy α-minimum spanning tree problem are then discussed under different assumptions. First, we assume that all the edge weights are triangular fuzzy numbers or trapezoidal fuzzy numbers, respectively, and prove that in these two cases the fuzzy α-minimum spanning tree problem can be transformed into a classical problem on a crisp graph, which can be solved by classical algorithms such as the Kruskal algorithm and the Prim algorithm in polynomial time. Subsequently, for the case that the edge weights are general fuzzy numbers, a fuzzy simulation-based genetic algorithm using Prüfer number representation is designed for solving the fuzzy α-minimum spanning tree problem. Some numerical examples are also provided to illustrate the effectiveness of the proposed solutions.
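
    For the triangular case, the transformation amounts to replacing each fuzzy weight by a crisp critical value and running a classical MST algorithm. A minimal sketch assuming that reading: the interpolation below is the standard inverse of the credibility distribution of a triangular fuzzy number (a, b, c), fed into Kruskal's algorithm with union-find:

        def alpha_weight(tri, alpha):
            # Credibilistic critical value of a triangular fuzzy number at level alpha.
            a, b, c = tri
            return a + 2 * alpha * (b - a) if alpha <= 0.5 else b + (2 * alpha - 1) * (c - b)

        def kruskal(n, edges):
            # edges: (weight, u, v); returns the MST edge list via union-find.
            parent = list(range(n))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            mst = []
            for w, u, v in sorted(edges):
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    mst.append((u, v, w))
            return mst

        fuzzy_edges = [(0, 1, (1, 2, 3)), (1, 2, (2, 3, 5)), (0, 2, (4, 5, 6))]
        crisp = [(alpha_weight(t, 0.9), u, v) for u, v, t in fuzzy_edges]
        print(kruskal(3, crisp))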

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanchon, L; Russell, J; Dogan, S

    Purpose: Genetic profiling of biopsied tissue is the basis for personalized cancer therapy. However biopsied materials may not contain sufficient amounts of DNA needed for analysis. We propose a method to determine the adequacy of specimens for performing genetic profiling by quantifying metabolic activity. Methods: We measured the response of two radiation detectors to the activity contained in the minimum amount of tumor cells needed for genetic profiling in biopsy specimens obtained under 2-deoxy-2-(¹⁸F)fluoro-D-glucose (¹⁸F-FDG) PET/CT guidance. The expected tumor cell concentration in biopsy specimens was evaluated from the amount of DNA needed (∼100 µg) and the number of pathology sections typically used for the analysis. The average ¹⁸F-FDG uptake per cell was measured by incubating KPC-4662 pancreatic tumor cells and HT-29 colorectal adenocarcinoma tumor cells in ¹⁸F-FDG containing solution (activity concentrations between 0.0122 and 1.51 MBq/mL and glucose concentrations of 3.1 and 1 g/L) for 1 to 1.75 hours and then measuring the activity of a known number of cells. Measurements of surrogate specimens obtained using 18G needle biopsies of gels containing these cells in expected concentrations (∼10⁴ µL⁻¹) were performed using an autoradiography CCD based device (up to 20 min exposure) and a scintillation well counter (∼1 min measurements) about 3 and 5 hours after the end of incubation respectively. Results: At the start of autoradiography there were between 0.16 and 1.5 ¹⁸F-FDG molecules/cell and between 1.14 and 5.43×10⁷ ¹⁸F-FDG molecules/mL. For the scintillation well counter, sample to minimum-detectable-count rate ratios were greater than 7 and the counting error was less than 25% for ≤80 s measurement times. Images of the samples were identifiable on the autoradiograph for ∼10 min and longer exposure times. Conclusion: Scintillation well counter measurements and CCD based autoradiography have adequate sensitivity to detect the tumor burden needed for genetic profiling in 18G core needle biopsies. Supported in part through the NIH/NCI Cancer Center Support Grant P30 CA008748 and by a sponsored research agreement with Biospace Lab S.A.

  6. Pell Grants as Performance-Based Aid? An Examination of Satisfactory Academic Progress Requirements in the Nation's Largest Need-Based Aid Program. A CAPSEE Working Paper

    ERIC Educational Resources Information Center

    Schudde, Lauren; Scott-Clayton, Judith

    2014-01-01

    The Federal Pell Grant Program is the nation's largest need-based grant program. While students' initial eligibility for the Pell is based on financial need, renewal of the award is contingent on their making satisfactory academic progress (SAP)--meeting minimum academic standards similar to those proposed in models of performance-based…

  7. A case at last for age-phased reduction in equity.

    PubMed Central

    Samuelson, P A

    1989-01-01

    Maximizing expected utility over a lifetime leads one who has constant relative risk aversion and faces random-walk securities returns to be "myopic" and hold the same fraction of portfolio in equities early and late in life--a defiance of folk wisdom and casual introspection. By assuming one needs to assure at retirement a minimum ("subsistence") level of wealth, the present analysis deduces a pattern of greater risk-taking when young than when old. When a subsistence minimum is needed at every period of life, the rentier paradoxically is least risk tolerant in youth--the Robert C. Merton paradox that traces to the decline with age of the present discounted value of the subsistence-consumption requirements. Conversely, the decline with age of capitalized human capital reverses the Merton effect. PMID:2813438

  8. Highway runoff quality models for the protection of environmentally sensitive areas

    NASA Astrophysics Data System (ADS)

    Trenouth, William R.; Gharabaghi, Bahram

    2016-11-01

    This paper presents novel highway runoff quality models using artificial neural networks (ANN) which take into account site-specific highway traffic and seasonal storm event meteorological factors to predict the event mean concentration (EMC) statistics and mean daily unit area load (MDUAL) statistics of common highway pollutants, for the design of roadside ditch treatment systems (RDTS) to protect sensitive receiving environs. A dataset of 940 monitored highway runoff events from fourteen sites located in five countries (Canada, USA, Australia, New Zealand, and China) was compiled and used to develop ANN models for the prediction of highway runoff total suspended solids (TSS) seasonal EMC statistical distribution parameters, as well as the MDUAL statistics for four different heavy metal species (Cu, Zn, Cr and Pb). TSS EMCs are needed to estimate the minimum removal efficiency the RDTS must achieve for highway runoff quality to meet applicable standards, and MDUALs are needed to calculate the minimum required capacity of the RDTS to ensure performance longevity.

  9. Alzheimer Classification Using a Minimum Spanning Tree of High-Order Functional Network on fMRI Dataset

    PubMed Central

    Guo, Hao; Liu, Lei; Chen, Junjie; Xu, Yong; Jie, Xiang

    2017-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most useful methods to generate functional connectivity networks of the brain. However, conventional network generation methods ignore dynamic changes of functional connectivity between brain regions. Previous studies proposed constructing high-order functional connectivity networks that consider the time-varying characteristics of functional connectivity, and a clustering method was performed to decrease computational cost. However, random selection of the initial clustering centers and the number of clusters negatively affected classification accuracy, and the network lost neurological interpretability. Here we propose a novel method that introduces the minimum spanning tree method to high-order functional connectivity networks. As an unbiased method, the minimum spanning tree simplifies high-order network structure while preserving its core framework. The dynamic characteristics of time series are not lost with this approach, and the neurological interpretation of the network is guaranteed. Simultaneously, we propose a multi-parameter optimization framework that involves extracting discriminative features from the minimum spanning tree high-order functional connectivity networks. Compared with the conventional methods, our resting-state fMRI classification method based on minimum spanning tree high-order functional connectivity networks greatly improved the diagnostic accuracy for Alzheimer's disease. PMID:29249926

  10. Locating helicopter emergency medical service bases to optimise population coverage versus average response time.

    PubMed

    Garner, Alan A; van den Berg, Pieter L

    2017-10-16

    New South Wales (NSW), Australia has a network of multirole retrieval physician staffed helicopter emergency medical services (HEMS) with seven bases servicing a jurisdiction with population concentrated along the eastern seaboard. The aim of this study was to estimate optimal HEMS base locations within NSW using advanced mathematical modelling techniques. We used high resolution census population data for NSW from 2011, which divides the state into areas containing 200-800 people. Optimal HEMS base locations were estimated using the maximal covering location problem facility location optimization model and the average response time model, exploring the number of bases needed to cover various fractions of the population for a 45 min response time threshold or to minimize the overall average response time to all persons, both in green field scenarios and conditioning on the current base structure. We also developed a hybrid mathematical model in which average response time was optimised subject to minimum population coverage thresholds. Seven bases could cover 98% of the population within 45 min when optimised for coverage, or reach the entire population of the state within an average of 21 min if optimised for response time. Given the existing bases, adding two bases could either increase the 45 min coverage from 91% to 97% or decrease the average response time from 21 min to 19 min. Adding a single specialist prehospital rapid response HEMS to the area of greatest population concentration decreased the average statewide response time by 4 min. The optimum seven-base hybrid model, which was able to cover 97.75% of the population within 45 min and all of the population with an average response time of 18 min, included the rapid response HEMS model. HEMS base locations can be optimised based on either the percentage of the population covered or the average response time to the entire population. We have also demonstrated a hybrid technique that optimizes response time for a given number of bases and a minimum defined threshold of population coverage. Addition of specialized rapid response HEMS services to a system of multirole retrieval HEMS may reduce overall average response times by improving access in large urban areas.
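
    The coverage objective here is the maximal covering location problem, which the study solves exactly; for illustration, the greedy heuristic often used for such coverage problems is sketched below on hypothetical demand and coverage data:

        def greedy_max_cover(demand, coverage, k):
            # demand: {area: population}; coverage: {base: set of areas reachable
            # within the 45-min threshold}. Greedy approximation, not the exact MCLP.
            chosen, covered = [], set()
            for _ in range(k):
                base = max(coverage,
                           key=lambda b: sum(demand[a] for a in coverage[b] - covered))
                chosen.append(base)
                covered |= coverage[base]
            return chosen, sum(demand[a] for a in covered) / sum(demand.values())

        demand = {"A": 500, "B": 300, "C": 200, "D": 100}   # hypothetical populations
        coverage = {"b1": {"A", "B"}, "b2": {"B", "C"}, "b3": {"C", "D"}}
        print(greedy_max_cover(demand, coverage, k=2))      # -> (['b1', 'b3'], 1.0)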

  11. Spatially valid data of atmospheric deposition of heavy metals and nitrogen derived by moss surveys for pollution risk assessments of ecosystems.

    PubMed

    Schröder, Winfried; Nickel, Stefan; Schönrock, Simon; Meyer, Michaela; Wosniok, Werner; Harmens, Harry; Frontasyeva, Marina V; Alber, Renate; Aleksiayenak, Julia; Barandovski, Lambe; Carballeira, Alejo; Danielsson, Helena; de Temmermann, Ludwig; Godzik, Barbara; Jeran, Zvonka; Karlsson, Gunilla Pihl; Lazo, Pranvera; Leblond, Sebastien; Lindroos, Antti-Jussi; Liiv, Siiri; Magnússon, Sigurður H; Mankovska, Blanka; Martínez-Abaigar, Javier; Piispanen, Juha; Poikolainen, Jarmo; Popescu, Ion V; Qarri, Flora; Santamaria, Jesus Miguel; Skudnik, Mitja; Špirić, Zdravko; Stafilov, Trajce; Steinnes, Eiliv; Stihi, Claudia; Thöni, Lotti; Uggerud, Hilde Thelle; Zechmeister, Harald G

    2016-06-01

    For analysing element input into ecosystems and associated risks due to atmospheric deposition, element concentrations in moss provide complementary and time-integrated data at high spatial resolution every 5 years since 1990. The paper reviews (1) minimum sample sizes needed for reliable statistical estimation of mean values at four different spatial scales (the European and national levels as well as landscape-specific levels covering Europe and single countries); (2) trends of heavy metal (HM) and nitrogen (N) concentrations in moss in Europe (1990-2010); (3) correlations between concentrations of HM in moss and soil specimens collected across Norway (1990-2010); and (4) canopy drip-induced site-specific variation of N concentration in moss sampled in seven European countries (1990-2013). While the minimum sample sizes on the European and national levels were achieved without exception, for some ecological land classes and elements the coverage with sampling sites should be improved. The decline in emissions and subsequent atmospheric deposition of HM across Europe has resulted in decreasing HM concentrations in moss between 1990 and 2010. In contrast, hardly any changes were observed for N in moss between 2005, when N was included in the survey for the first time, and 2010. In Norway, both the moss and the soil survey data sets were correlated, indicating a decrease of HM concentrations in moss and soil. At the site level, the average N deposition inside forests was almost three times higher than the average N deposition outside of forests.

  12. General-Purpose Front End for Real-Time Data Processing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    FRONTIER is a computer program that functions as a front end for any of a variety of other software of both the artificial intelligence (AI) and conventional data-processing types. As used here, front end signifies interface software needed for acquiring and preprocessing data and making the data available for analysis by the other software. FRONTIER is reusable in that it can be rapidly tailored to any such other software with minimum effort. Each component of FRONTIER is programmable and is executed in an embedded virtual machine, and each component can be reconfigured during execution. The virtual-machine implementation makes FRONTIER independent of the type of computing hardware on which it is executed.

  13. Communication: Introducing prescribed biases in out-of-equilibrium Markov models

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.

    2018-03-01

    Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems, yet their predictions often do not agree with experiments. We need a systematic framework to update existing Markov models to make them consistent with constraints derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
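
    For stationary Markov chains, the relative path entropy being minimized reduces per step to the Kullback-Leibler divergence rate between the updated and prior transition matrices; a minimal sketch of that quantity, with hypothetical matrices and strictly positive entries assumed for simplicity:

        import numpy as np

        def kl_divergence_rate(Q, P, pi):
            # sum_i pi_i * sum_j Q_ij * log(Q_ij / P_ij), pi = stationary dist of Q.
            return float((pi[:, None] * Q * np.log(Q / P)).sum())

        P = np.array([[0.9, 0.1], [0.2, 0.8]])   # prior model (hypothetical)
        Q = np.array([[0.8, 0.2], [0.3, 0.7]])   # candidate updated model (hypothetical)
        evals, evecs = np.linalg.eig(Q.T)        # stationary distribution of Q
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        pi /= pi.sum()
        print(kl_divergence_rate(Q, P, pi))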

  14. Oxygen enhanced switching to combustion of lower rank fuels

    DOEpatents

    Kobayashi, Hisashi; Bool, III, Lawrence E.; Wu, Kuang Tsai

    2004-03-02

    A furnace that combusts fuel, such as coal, of a given minimum energy content to obtain a stated minimum amount of energy per unit of time is enabled to combust fuel having a lower energy content, while still obtaining at least the stated minimum energy generation rate, by replacing a small amount of the combustion air fed to the furnace by oxygen. The replacement of oxygen for combustion air also provides reduction in the generation of NOx.

  15. Solar activity as driver for the Dark Age Grand Solar Minimum

    NASA Astrophysics Data System (ADS)

    Neuhäuser, Ralph; Neuhäuser, Dagmar

    2017-04-01

    We will discuss the role of solar activity in the temperature variability from AD 550 to 840, roughly the last three centuries of the Dark Ages. This time range includes the so-called Dark Age Grand Solar Minimum, whose deep part is dated to about AD 650 to 700 and which is seen in increased radiocarbon but decreased aurora observations (and a lack of naked-eye sunspot sightings). We present historical reports on aurorae from all cultures with written records, including East Asia, the Near East (Arabia), and Europe. To classify such reports correctly, clear criteria are needed, which are also discussed. We compare our catalog of historical aurorae (and sunspots) as well as C-14 data, i.e. solar activity proxies, with temperature reconstructions (PAGES). After increased solar activity until around AD 600, we see a dearth of aurorae and increased radiocarbon production, in particular in the second half of the 7th century, i.e. a typical Grand Solar Minimum. Then, after about AD 690 (the maximum in radiocarbon, the end of the Dark Age Grand Minimum), we see increased auroral activity, decreasing radiocarbon, and increasing temperature until about AD 775. At around AD 775, we see the well-known strong C-14 variability (a solar activity drop), then immediately another dearth of aurorae plus high C-14, indicating another solar activity minimum. This is consistent with a temperature depression from about AD 775 into the beginning of the 9th century. Very high solar activity is then seen in the first four decades of the 9th century, with four aurora clusters, three simultaneous sunspot clusters, low C-14, and again increasing temperature. The period of increasing solar activity marks the end of the so-called Dark Ages: while auroral activity increases from about AD 793, temperature starts to increase quite exactly at AD 800. We can reconstruct the Schwabe cycles with aurorae and C-14 data. In summary, we see a clear correspondence between the variability of solar activity proxies and surface temperature reconstructions, indicating that solar activity is an important climate driver.

  16. Barriers and dispersal surfaces in minimum-time interception

    NASA Technical Reports Server (NTRS)

    Rajan, N.; Ardema, M. D.

    1982-01-01

    Minimum time interception of a target moving in a horizontal plane is analyzed as a one-player differential game. Dispersal points and points on the barrier are located for a class of pursuit evasion and interception problems. These points are determined by constructing cross sections of the isochrones and hence obtaining the barrier, dispersal, and control level surfaces. The game solution maps the controls as a function of the state within the capture region.

  17. Theoretical study of network design methodologies for the aerial relay system. [energy consumption and air traffic control

    NASA Technical Reports Server (NTRS)

    Rivera, J. M.; Simpson, R. W.

    1980-01-01

    The aerial relay system network design problem is discussed. A generalized branch-and-bound algorithm is developed which can consider a variety of optimization criteria, such as minimum passenger travel time and minimum liner and feeder operating costs. The algorithm, although efficient, is practical mainly for small networks, because its computation time increases exponentially with the number of variables.

  18. Analysis of temperature trends in Northern Serbia

    NASA Astrophysics Data System (ADS)

    Tosic, Ivana; Gavrilov, Milivoj; Unkašević, Miroslava; Marković, Slobodan; Petrović, Predrag

    2017-04-01

    An analysis of air temperature trends in Northern Serbia for the annual and seasonal time series is performed for two periods: 1949-2013 and 1979-2013. Three data sets of surface air temperatures (monthly mean, monthly maximum, and monthly minimum temperatures) are analyzed at 9 stations with altitudes varying between 75 m and 102 m. Monthly mean temperatures are obtained as the average of the daily mean temperatures, while monthly maximum (minimum) temperatures are the maximum (minimum) values of daily temperatures in the corresponding month. Positive trends were found in 29 out of 30 time series; the only negative trend was found in winter during the period 1979-2013. Applying the Mann-Kendall test, significant positive trends were found in 15 series, 7 in the period 1949-2013 and 8 in the period 1979-2013, and no significant trend was found in the other 15 series. Significant positive trends dominate the annual, spring, and summer series, where they were found in 14 out of 18 cases. Significant positive trends were found 7, 5, and 3 times in mean, maximum, and minimum temperatures, respectively. It was found that positive temperature trends are dominant in Northern Serbia.

  19. Results obtained by the application of two different methods for the calculation of optimal coplanar orbital maneuvers with time limit

    NASA Astrophysics Data System (ADS)

    Rocco, E. M. R.; Prado, A. F. B. A.; Souza, M. L. O. S.

    In this work, the problem of bi-impulsive orbital transfers between coplanar elliptical orbits with minimum fuel consumption, but with a time limit for the transfer, is studied. As a first method, the equations presented by Lawden (1993) were used. Those equations furnish the optimal transfer orbit with fixed transfer time between two elliptical coplanar orbits, considering fixed terminal points. The method was adapted to cases with free terminal points, and the equations were solved to develop a software tool for orbital maneuvers. As a second method, the equations presented by Eckel and Vinh (1984) were used; those equations provide the transfer orbit between non-coplanar elliptical orbits with minimum fuel and fixed transfer time, or with minimum transfer time for a prescribed fuel consumption, considering free terminal points. In this work only the fixed-transfer-time problem was considered; the case of minimum time for a prescribed fuel consumption was already studied in Rocco et al. (2000). The method was then modified to consider cases of coplanar orbital transfer, and a second software tool for orbital maneuvers was developed. Therefore, two programs that solve the same problem using different methods were developed. The first method, presented by Lawden, uses primer vector theory. The second method, presented by Eckel and Vinh, uses the ordinary theory of maxima and minima. To test the methods we chose the same terminal orbits and the same time as input, and verified that the two methods do not produce exactly the same result. In this work, which is an extension of Rocco et al. (2002), these differences in the results are explored with the objective of determining the reason for their occurrence and which modifications should be made to eliminate them.

  20. The minimum record time for PIV measurement in a vessel agitated by a Rushton turbine

    NASA Astrophysics Data System (ADS)

    Šulc, Radek; Ditl, Pavel; Fořt, Ivan; Jašíkova, Darina; Kotek, Michal; Kopecký, Václav; Kysela, Bohuš

    In PIV studies published in the literature focusing on the investigation of the flow field in an agitated vessel, the record time ranges from tenths of a second to several seconds. The aim of this work was to determine the minimum record time for PIV measurement in a vessel agitated by a Rushton turbine that is necessary to obtain relevant velocity-field results. The velocity fields were measured in a fully baffled cylindrical flat-bottom vessel 400 mm in inner diameter, agitated by a Rushton turbine 133 mm in diameter, using 2-D Time Resolved Particle Image Velocimetry in the impeller Reynolds number range from 50 000 to 189 000. This Re range ensures fully developed turbulent flow of the agitated liquid. Three liquids of different viscosities were used as the agitated liquid. On the basis of the analysis of the radial and axial components of the mean and fluctuation velocities measured outside the impeller region, it was found that the dimensionless minimum record time is independent of the impeller Reynolds number and equals N·tRmin = 103 ± 19.
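
    Since N·tRmin is reported as a dimensionless constant, the minimum record time for any impeller speed N follows directly; a small worked example (the impeller speeds are hypothetical):

        # N * t_Rmin = 103 +/- 19, hence t_Rmin = (103 +/- 19) / N.
        for n_rps in (5.0, 10.0, 15.0):          # impeller speeds in rev/s (assumed)
            lo, mid, hi = 84 / n_rps, 103 / n_rps, 122 / n_rps
            print(f"N = {n_rps:4.1f} 1/s -> t_Rmin = {mid:.1f} s (range {lo:.1f}-{hi:.1f} s)")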

  1. How to Extend the Capabilities of Space Systems for Long Duration Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Waterman, Robert D.; KrishnaKumar, Kalmanje; Waterman, Susan J.

    2005-01-01

    For sustainable Exploration Missions the need exists to assemble systems-of-systems in space, on the Moon, or on other planetary surfaces. To fulfill this need, a new and innovative system architecture is needed that can be realized with the present lift capability of existing rocket technology, without the added cost of developing a new heavy-lift vehicle. To enable ultra-long-life missions with minimum redundancy and lighter mass, the need exists to develop system software and hardware reconfigurability, which enables increasing functionality and multiple use of launched assets while at the same time overcoming any component failures. Also, the need exists to develop the ability to dynamically demate and reassemble individual system elements during a mission in order to work around failed hardware or changed mission requirements. Therefore, to meet the goals of Space Exploration Missions in Interoperability and Reconfigurability, many challenges must be addressed to transform the traditional static avionics architecture into an architecture with dynamic capabilities. The objective of this paper is to introduce concepts associated with reconfigurable computer systems; review the various needs and challenges associated with reconfigurable avionics space systems; provide an operational example that illustrates the needs applicable to either the Crew Exploration Vehicle or a collection of "Habot-like" mobile surface elements; and summarize the approaches that address key challenges to acceptance of a Flexible, Intelligent, Modular and Affordable reconfigurable avionics space system.

  2. High Tensile Strength Amalgams for In-Space Repair and Fabrication

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.

    2005-01-01

    Amalgams are defined as an alloy of mercury with one or more other metals. These, along with those based on gallium (also liquid at near room temperature), are widely used in dental practice as a tooth-filling material. Amalgams have a number of useful attributes that include room-temperature compounding, corrosion resistance, dimensional stability, and good compressive strength. These properties serve dental needs well but, unfortunately, amalgams have extremely poor tensile strength, a feature that severely limits their applications. The work presented here demonstrates how, by modifying particle geometry, the tensile strength of amalgams can be increased, thus extending the range of potential applications. This is relevant to, for example, the freeform fabrication of replacement parts that might be necessary during an extended space mission. Advantages, i.e., Figures-of-Merit, include the ability to produce complex parts, minimum crew interaction, high yield with minimum wasted material, reduced-gravity compatibility, minimum final finishing, safety, and minimum power consumption.

  3. Political and technical issues of coal fire extinction in the Kyoto framework

    NASA Astrophysics Data System (ADS)

    Meyer, U.; Chen-Brauchler, D.; Rüter, H.; Fischer, C.; Bing, K.

    2009-04-01

    It is highly desirable to extinguish as many coal fires as possible in a short time, to prevent large losses of energy resources and to minimise CO2 and other exhaust gas releases from such sources. Unfortunately, extinguishing coal fires requires massive financial investment, skilled manpower, suitable technology, and a long time. Even small to mid-scale coal fires need several months of extinguishing measures, plus monitoring time after extinction, resulting in expenditures of at least several hundred thousand Euros. Large companies might be willing to spend money on coal fire extinction measures, but smaller holdings or regional governments might not have the monetary resources for it. Since there is no law in China that demands coal fire extinction, measures under the Kyoto framework may be applied to sell CO2 certificates for prevented emissions from extinguished coal fires, and thus used as a financial stimulus for coal fire extinction activities. The set-up of methodologies and project designs is especially complex for coal fire extinction measures, and thus for the necessary exploration, evaluation and monitoring using geophysical and remote sensing methods. A brief overview of the most important formal and technical aspects is given to outline the conditions for a potentially successful CDM application on coal fires based on geophysical observations and numerical modelling.

  4. Inventory slack routing application in emergency logistics and relief distributions.

    PubMed

    Yang, Xianfeng; Hao, Wei; Lu, Yang

    2018-01-01

    Various natural and manmade disasters during recent decades have highlighted the need to further improve governmental preparedness for emergency events, and a relief supplies distribution problem named the Inventory Slack Routing Problem (ISRP) has received increasing attention. In an ISRP, inventory slack is defined as the duration between the relief arrival time and the estimated inventory stock-out time. Hence, a larger inventory slack grants more response time in the face of various factors (e.g., traffic congestion) that may lead to delivery lateness. In this study, the relief distribution problem is formulated as an optimization model that maximizes the minimum slack among all dispensing sites. To efficiently solve this problem, we propose a two-stage approach to tackle the vehicle routing and relief allocation sub-problems. By analyzing the inter-relations between these two sub-problems, a new objective function considering both delivery durations and dispensing rates of demand sites is applied in the first stage to design the vehicle routes. A hierarchical routing approach and a sweep approach are also proposed in this stage. Given the vehicle routing plan, the relief allocation can be easily solved in the second stage. A numerical experiment with a comparison to the multi-vehicle Traveling Salesman Problem (TSP) has demonstrated the need for ISRP and the capability of the proposed solution approaches.

  5. Inventory slack routing application in emergency logistics and relief distributions

    PubMed Central

    Yang, Xianfeng; Lu, Yang

    2018-01-01

    Various natural and manmade disasters during recent decades have highlighted the need to further improve governmental preparedness for emergency events, and a relief supplies distribution problem named the Inventory Slack Routing Problem (ISRP) has received increasing attention. In an ISRP, inventory slack is defined as the duration between the relief arrival time and the estimated inventory stock-out time. Hence, a larger inventory slack grants more response time in the face of various factors (e.g., traffic congestion) that may lead to delivery lateness. In this study, the relief distribution problem is formulated as an optimization model that maximizes the minimum slack among all dispensing sites. To efficiently solve this problem, we propose a two-stage approach to tackle the vehicle routing and relief allocation sub-problems. By analyzing the inter-relations between these two sub-problems, a new objective function considering both delivery durations and dispensing rates of demand sites is applied in the first stage to design the vehicle routes. A hierarchical routing approach and a sweep approach are also proposed in this stage. Given the vehicle routing plan, the relief allocation can be easily solved in the second stage. A numerical experiment with a comparison against the multi-vehicle Traveling Salesman Problem (TSP) has demonstrated the need for the ISRP and the capability of the proposed solution approaches. PMID:29902196

  6. Analysis of Users' Searches of CD-ROM Databases in the National and University Library in Zagreb.

    ERIC Educational Resources Information Center

    Jokic, Maja

    1997-01-01

    Investigates the search behavior of CD-ROM database users in Zagreb (Croatia) libraries: one group needed a minimum of technical assistance, and the other was completely independent. Highlights include the use of questionnaires and transaction log analysis and the need for end-user education. The questionnaire and definitions of search process…

  7. NREL Evaluates National Charging Infrastructure Needs for Growing Fleet of

    Science.gov Websites

    PEV charging requirements within urban and rural communities and along interstate corridors; station spacing set to enhance station utility and economics; stations needed to provide a minimum level of urban and rural coverage nationwide.

  8. 25 CFR 36.70 - What terms do I need to know?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false What terms do I need to know? 36.70 Section 36.70 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION MINIMUM ACADEMIC STANDARDS FOR THE BASIC EDUCATION OF INDIAN CHILDREN AND NATIONAL CRITERIA FOR DORMITORY SITUATIONS Homeliving Programs § 36.70 What...

  9. Is the minimum enough? Affordability of a nutritious diet for minimum wage earners in Nova Scotia (2002-2012).

    PubMed

    Newell, Felicia D; Williams, Patricia L; Watt, Cynthia G

    2014-05-09

    This paper aims to assess the affordability of a nutritious diet for households earning minimum wage in Nova Scotia (NS) from 2002 to 2012 using an economic simulation that includes food costing and secondary data. The cost of the National Nutritious Food Basket (NNFB) was assessed with a stratified, random sample of grocery stores in NS during six time periods: 2002, 2004/2005, 2007, 2008, 2010 and 2012. The NNFB's cost was factored into affordability scenarios for three different household types relying on minimum wage earnings: a household of four; a lone mother with three children; and a lone man. Essential monthly living expenses were deducted from monthly net incomes using methods that were standardized from 2002 to 2012 to determine whether adequate funds remained to purchase a basic nutritious diet across the six time periods. A 79% increase to the minimum wage in NS has resulted in a decrease in the potential deficit faced by each household scenario in the period examined. However, the household of four and the lone mother with three children would still face monthly deficits ($44.89 and $496.77, respectively, in 2012) if they were to purchase a nutritiously sufficient diet. As a social determinant of health, risk of food insecurity is a critical public health issue for low wage earners. While it is essential to increase the minimum wage in the short term, adequately addressing income adequacy in NS and elsewhere requires a shift in thinking from a focus on minimum wage towards more comprehensive policies ensuring an adequate livable income for everyone.

  10. Studying the start of the Maunder Minimum to understand the current situation

    NASA Astrophysics Data System (ADS)

    Neuhäuser, Ralph; Neuhäuser, Dagmar L.

    2016-04-01

    To investigate whether we are now entering a Maunder-like grand minimum, we have to compare the current situation with the time around the start of the Maunder Minimum. Sunspot observations in the 1610s are of particular importance and relevance, because they fall shortly before the start of the Maunder Grand Minimum. While the Maunder Minimum is usually dated from 1645 to 1715, Vaquero & Trigo (2015) argue that what they call the "Extended Maunder Minimum" started in 1618, during or around a Schwabe cycle minimum at that time. We have therefore studied the sunspot record of that time in detail. Hoyt & Schatten (1998) compiled a list of observations for all known telescopic observers; recent solar activity studies for the past four centuries are based on their compilation. In addition to the 12 observers listed by Hoyt & Schatten (1998) for the 1610s, we list six more observers with datable spot observations. Furthermore, while Hoyt & Schatten (1998) state that Simon Marius observed almost every day from mid 1617 to the end of 1618 but never saw a spot, we can show with the original reports by Marius that he observed from Aug 1611 to spring 1619 with numerous sunspot detections. Similarly, while Hoyt & Schatten (1998) state that Giovanni Riccioli observed on almost every day in 1618 but never saw a spot, he did not report any observations of his own at all that year, but quoted Argoli as saying that there were no spots during the periods with comets in 1618. The database by Hoyt & Schatten (1998) has several more errors in the 1610s, as we show also for the observations by Harriot, Scheiner, Malapert, Saxonius, and Tarde. We also compare drawings from Jungius with the observations by Harriot, Galilei, and Marius. In contrast to what is specified in Hoyt & Schatten (1998), after Harriot, the two Fabricius (father and son), Scheiner and Cysat, Marius and Schmidnerus are among the earliest datable telescopic sunspot observers (1611 Aug 3, Julian). It is very important to go back to the original drawings and observational reports (often written in Latin or German). The active day fraction was high from 1611 to 1616 (1.0 to 0.9), but then dropped to much lower values from 1617 to 1620. Sunspot records by Malapert from 1618 to 1621 show that the last low-latitude spot was seen in Dec 1620, while the first high-latitude spots were noticed in June and Oct 1620 (we show his drawings), so that the turnover from one Schwabe cycle to the next (minimum) took place around that time, as also seen in longer periods without naked-eye or telescopic spots or any likely true aurorae. Did the Maunder Minimum start with or right after this Schwabe cycle minimum in the second half of 1620, or one or two cycles later? We then compare the start of the Maunder Minimum with the current situation.

  11. Minimum dietary diversity and associated factors among children aged 6-23 months in Addis Ababa, Ethiopia.

    PubMed

    Solomon, Dagmawit; Aderaw, Zewdie; Tegegne, Teketo Kassaw

    2017-10-12

    Dietary diversity has long been recognized as a key element of high quality diets. Minimum Dietary Diversity (MDD) is the consumption of four or more food groups from the seven food groups. Globally, only a few children receive nutritionally adequate and diversified foods. More than two-thirds of malnutrition-related child deaths are associated with inappropriate feeding practices during the first two years of life. In Ethiopia, only 7% of children aged 6-23 months had received the minimum acceptable diet. Therefore, the main aim of this study was to determine the level of minimum dietary diversity practice and identify the associated factors among children aged 6-23 months in Addis Ababa, Ethiopia. A health facility based cross sectional study was undertaken in three sub-cities of Addis Ababa from 26th February to 28th April, 2016. A multi-stage sampling technique was used to sample the 352 study participants, mothers who had children aged 6-23 months. Data were collected by using a structured and pretested questionnaire, cleaned and entered into Epi Info 7 and analyzed using SPSS 24 software. Logistic regression was fitted, and odds ratios with 95% confidence intervals (CI) with p-values less than 0.05 were used to identify factors associated with minimum dietary diversity. In this study, the overall proportion of children meeting the minimum dietary diversity score was found to be 59.9%. Mothers' educational attainment and a higher household monthly income were positively associated with minimum dietary diversity practice. Similarly, mothers' knowledge of dietary diversity and child feeding was positively associated with minimum dietary diversity child feeding practice, with an adjusted odds ratio of 1.98 (95% CI: 1.11-3.53). In this study, the consumption of minimum dietary diversity was found to be high. In spite of this, more effort is needed to achieve the recommended minimum dietary diversity intake for all children aged between 6 and 23 months.

  12. Mesh refinement strategy for optimal control problems

    NASA Astrophysics Data System (ADS)

    Paiva, L. T.; Fontes, F. A. C. C.

    2013-10-01

    Direct methods are becoming the most used technique to solve nonlinear optimal control problems. Regular time meshes having equidistant spacing are frequently used. However, in some cases these meshes cannot cope accurately with nonlinear behavior. One way to improve the solution is to select a new mesh with a greater number of nodes. Another way involves adaptive mesh refinement, in which the mesh nodes have non-equidistant spacing, allowing non-uniform node collocation. In the method presented in this paper, a time mesh refinement strategy based on the local error is developed. After computing a solution on a coarse mesh, the local error is evaluated, which gives information about the subintervals of the time domain where refinement is needed. This procedure is repeated until the local error reaches a user-specified threshold. The technique is applied to solve the car-like vehicle problem aiming at minimum consumption. The approach developed in this paper leads to results with greater accuracy and yet with lower overall computational time as compared to using a time mesh with equidistant spacing.
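
    A schematic of the refinement loop this record describes, assuming a generic local-error estimator (the toy estimator below simply concentrates error near t = 0.5 to mimic a region of strongly nonlinear behavior; it is not the paper's error measure):

        # Illustrative sketch of local-error-driven time-mesh refinement:
        # subintervals whose estimated local error exceeds a user threshold
        # are bisected, and the process repeats until every subinterval passes.

        def refine_mesh(mesh, local_error, tol, max_iters=10):
            """mesh: sorted list of node times; local_error(a, b) -> float."""
            for _ in range(max_iters):
                new_mesh = [mesh[0]]
                refined = False
                for a, b in zip(mesh, mesh[1:]):
                    if local_error(a, b) > tol:
                        new_mesh.append(0.5 * (a + b))  # bisect offending interval
                        refined = True
                    new_mesh.append(b)
                mesh = new_mesh
                if not refined:          # all intervals meet the tolerance
                    break
            return mesh

        # Toy error estimate: error grows with interval length near t = 0.5.
        err = lambda a, b: (b - a) * (2.0 if a < 0.5 < b or abs(a - 0.5) < 0.2 else 0.1)
        print(refine_mesh([0.0, 0.25, 0.5, 0.75, 1.0], err, tol=0.05))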

  13. A novel non-uniform control vector parameterization approach with time grid refinement for flight level tracking optimal control problems.

    PubMed

    Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua

    2018-02-01

    High-quality control methods are essential for the implementation of aircraft autopilot systems. An optimal control problem model considering the safe aerodynamic envelope is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is obtained automatically. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality than the uniform-refinement CVP method, while the computational cost is lower. Two well-known flight level altitude tracking problems and one minimum time cost problem are tested as illustrations, with the uniform-refinement control vector parameterization method adopted as the comparative baseline. Numerical results show that the proposed method achieves better performance in terms of optimization accuracy and computation cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Link Performance Analysis and monitoring - A unified approach to divergent requirements

    NASA Astrophysics Data System (ADS)

    Thom, G. A.

    Link Performance Analysis and real-time monitoring are generally covered by a wide range of equipment. Bit Error Rate testers provide digital link performance measurements but are not useful during real-time data flows. Real-time performance monitors utilize the fixed overhead content but vary widely from format to format. Link quality information is also available from signal reconstruction equipment in the form of receiver AGC, bit synchronizer AGC, and bit synchronizer soft-decision level outputs, but no general approach to utilizing this information exists. This paper presents an approach to link tests, real-time data quality monitoring, and results presentation that utilizes a set of general-purpose modules in a flexible architectural environment. The system operates over a wide range of bit rates (up to 150 Mb/s) and employs several measurement techniques, including P/N code errors or fixed PCM format errors, real-time BER derived from frame sync errors, and Data Quality Analysis derived by counting significant sync status changes. The architecture performs with a minimum of elements in place, permitting a phased update of the user's unit in accordance with the user's needs.

  15. Three-Axis Time-Optimal Attitude Maneuvers of a Rigid-Body

    NASA Astrophysics Data System (ADS)

    Wang, Xijing; Li, Jisheng

    With the development trends of modern satellites towards both macro-scale and micro-scale, new demands are placed on attitude adjustment. Precise pointing control and rapid maneuvering capabilities have long been part of many space missions, and the development of computer technology enables new optimal algorithms to be applied continuously, providing a powerful tool for solving the problem. Many papers about attitude adjustment have been published; the spacecraft configurations considered are rigid bodies with flexible parts or gyrostat-type systems, and the objective function usually involves minimum time or minimum fuel. During earlier satellite missions, attitude acquisition was achieved by using momentum exchange devices, performed by a sequential single-axis slewing strategy. Recently, simultaneous three-axis minimum-time maneuver (reorientation) problems have been studied by many researchers. It is important to study the minimum-time maneuver of a rigid spacecraft within onboard power limits, both because of potential space applications such as surveying multiple targets in space and for its academic value. The minimum-time maneuver of a rigid spacecraft is a basic problem because the solutions for maneuvering flexible spacecraft are based on the solution to the rigid-body slew problem. A new method for the open-loop solution of a rigid spacecraft maneuver is presented. Neglecting all perturbation torques, the necessary conditions for moving the spacecraft from one state to another can be determined. There is a difference between the single-axis and multi-axis cases. For the single-axis case, an analytical solution is possible and the switching curve passing through the state-space origin is parabolic. For the multi-axis case, an analytical solution is impossible due to the dynamic coupling between the axes, and the problem must be solved numerically. Modern research has shown that Euler-axis rotations are, in general, quasi-time-optimal. On the basis of the minimum value principle, an approach for reorienting an inertially symmetric spacecraft with a time cost function from an initial state of rest to a final state of rest is derived, and the solution is as follows. Firstly, the necessary conditions for solving the problem are deduced with the minimum value principle. The necessary conditions for optimality yield a two-point boundary-value problem (TPBVP) which, when solved, produces the control history that minimizes the time performance index. In the nonsingular case, the solution is a bang-bang maneuver; the control profile is characterized by saturated controls for the entire maneuver. Singular control may exist, but it is singular only in a mathematical sense: according to physical principles, the larger the magnitude of the control torque, the shorter the time, so saturated controls are used in the singular case as well. Secondly, since the control parameters are always at their maximum, the key problem is to determine the switch points; the original problem thus reduces to finding the switching times. By adjusting the switch on/off times, a genetic algorithm, a robust modern method, is used to determine the switching structure without the gyroscopic coupling; this research improves upon the traditional GA. The homotopy method for solving the nonlinear algebraic equations is based on rigorous topological continuum theory. Based on the idea of homotopy, relaxation parameters are introduced, and the switch points are determined with simulated annealing. Computer simulation results using a rigid body show that the new method is feasible and efficient. A practical method of computing approximate solutions to the time-optimal control switch times for rigid-body reorientation has been developed.
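
    As a concrete instance of the single-axis analytical solution mentioned in this record: a rest-to-rest slew through angle theta under a torque bound is bang-bang with one switch at the midpoint. The numbers below are illustrative, not from the paper.

        # Worked example of the single-axis bang-bang solution: accelerate at
        # +u_max for t_s, then decelerate at -u_max for t_s. Since the total
        # angle is theta = (u_max / I) * t_s**2, the switch time follows directly.
        import math

        def bang_bang_slew(theta, inertia, u_max):
            """Return (switch_time, final_time) for a rest-to-rest single-axis slew."""
            t_s = math.sqrt(theta * inertia / u_max)
            return t_s, 2.0 * t_s

        t_s, t_f = bang_bang_slew(theta=math.radians(90), inertia=1000.0, u_max=50.0)
        print(f"switch at {t_s:.2f} s, total maneuver time {t_f:.2f} s")  # 5.60 s, 11.21 s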

  16. Automated Camouflage Pattern Generation Technology Survey.

    DTIC Science & Technology

    1985-08-07

    supported by high speed data communications? Costs: What are your rates? $/CPU hour: $/MB disk storage/day: $/connect hour: other charges: What are your... data to the workstation, tape drives are needed for backing up and archiving completed patterns, 256 megabytes of on-line hard disk space as a minimum...is needed to support multiple processes and data files, and 4 megabytes of actual or virtual memory is needed to process the largest expected single

  17. Delimitation of homogeneous regions in the UNIFESP/EPM healthcare center coverage area based on sociodemographic indicators.

    PubMed

    Harada, K Y; Silva, J G; Schenkman, S; Hayama, E T; Santos, F R; Prado, M C; Pontes, R H

    1999-01-07

    The drawing up of adequate public health action planning to address the true needs of the population would increase the chances of effectiveness and decrease unnecessary expenses. To identify homogeneous regions in the UNIFESP/EPM healthcare center (HCC) coverage area based on sociodemographic indicators and to relate them to causes of death in 1995. Secondary data analysis. HCC coverage area; primary care. Sociodemographic indicators were obtained from special tabulations of the Demographic Census of 1991: proportion of children and elderly in the population; family providers' education level (maximum: > 15 years, minimum: < 1 year) and income level (maximum: > 20 minimum wages, minimum: < 1 minimum wage); and proportional mortality distribution. The maximum income indicator permitted the construction of four homogeneous regions, according to income ranking. Although the proportions of children and of elderly did not vary significantly among the regions, minimum income and education showed a statistically significant (p < 0.05) difference between the first region (least affluent) and the others. A clear trend of increasing maximum education was observed across the regions. Mortality also differed in the first region, with deaths generated by possibly preventable infections. The inequalities observed may contribute to primary health prevention.

  18. Impact of HIPAA’s Minimum Necessary Standard on Genomic Data Sharing

    PubMed Central

    Evans, Barbara J.; Jarvik, Gail P.

    2017-01-01

    Purpose This article provides a brief introduction to the HIPAA Privacy Rule’s minimum necessary standard, which applies to sharing of genomic data, particularly clinical data, following 2013 Privacy Rule revisions. Methods This research used the Thomson Reuters Westlaw™ database and law library resources in its legal analysis of the HIPAA privacy tiers and the impact of the minimum necessary standard on genomic data-sharing. We considered relevant example cases of genomic data-sharing needs. Results In a climate of stepped-up HIPAA enforcement, this standard is of concern to laboratories that generate, use, and share genomic information. How data-sharing activities are characterized—whether for research, public health, or clinical interpretation and medical practice support—affects how the minimum necessary standard applies and its overall impact on data access and use. Conclusion There is no clear regulatory guidance on how to apply HIPAA’s minimum necessary standard when considering the sharing of information in the data-rich environment of genomic testing. Laboratories that perform genomic testing should engage with policy-makers to foster sound, well-informed policies and appropriate characterization of data-sharing activities to minimize adverse impacts on day-to-day workflows. PMID:28914268

  19. Impact of HIPAA's minimum necessary standard on genomic data sharing.

    PubMed

    Evans, Barbara J; Jarvik, Gail P

    2018-04-01

    This article provides a brief introduction to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule's minimum necessary standard, which applies to sharing of genomic data, particularly clinical data, following 2013 Privacy Rule revisions. This research used the Thomson Reuters Westlaw database and law library resources in its legal analysis of the HIPAA privacy tiers and the impact of the minimum necessary standard on genomic data sharing. We considered relevant example cases of genomic data-sharing needs. In a climate of stepped-up HIPAA enforcement, this standard is of concern to laboratories that generate, use, and share genomic information. How data-sharing activities are characterized (whether for research, public health, or clinical interpretation and medical practice support) affects how the minimum necessary standard applies and its overall impact on data access and use. There is no clear regulatory guidance on how to apply HIPAA's minimum necessary standard when considering the sharing of information in the data-rich environment of genomic testing. Laboratories that perform genomic testing should engage with policy makers to foster sound, well-informed policies and appropriate characterization of data-sharing activities to minimize adverse impacts on day-to-day workflows.

  20. Minimum airflow reset of single-duct VAV terminal boxes

    NASA Astrophysics Data System (ADS)

    Cho, Young-Hum

    Single-duct Variable Air Volume (VAV) systems are currently the most widely used type of HVAC system in the United States. When installing such a system, it is critical to determine the minimum airflow set point of the terminal box, as an optimally selected set point will improve the level of thermal comfort and indoor air quality (IAQ) while at the same time lowering overall energy costs. In principle, this minimum rate should be calculated according to the minimum ventilation requirement based on ASHRAE Standard 62.1 and the maximum heating load of the zone. Several factors must be carefully considered when calculating this minimum rate, as terminal boxes with conventional control sequences may result in occupant discomfort and energy waste. If the minimum rate of airflow is set too high, the AHUs will consume excess fan power, and the terminal boxes may cause significant simultaneous room heating and cooling. At the same time, a rate that is too low will result in poor air circulation and indoor air quality in the air-conditioned space. Currently, many researchers are investigating how to change the algorithm of the advanced VAV terminal box controller without retrofitting. Some of these controllers have been found to effectively improve thermal comfort, indoor air quality, and energy efficiency. However, minimum airflow set points have not yet been identified, nor has controller performance been verified. In this study, control algorithms were developed that automatically identify and reset terminal box minimum airflow set points, thereby improving indoor air quality and thermal comfort levels and reducing the overall rate of energy consumption. A theoretical analysis of the optimal minimum airflow and discharge air temperature was performed to identify the potential energy benefits of resetting the terminal box minimum airflow set points. Applicable control algorithms for calculating the ideal values for the minimum airflow reset were developed and applied to actual systems for performance validation. The results of the theoretical analysis, numeric simulations, and experiments show that the optimal control algorithms can automatically identify the minimum rate of heating airflow under actual working conditions. The improved control helps to stabilize room air temperatures: the vertical difference in room air temperature was lower than the comfort limit. Measurements of room CO2 levels indicate that reducing the minimum airflow set point did not adversely affect indoor air quality. According to the measured energy results, the optimal control algorithms give a lower rate of reheat energy consumption than conventional controls.
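
    A simplified sketch of the two constraints this record says should govern the terminal-box minimum airflow set point, the ventilation requirement (in the spirit of ASHRAE 62.1) and the zone design heating load. The sensible-heat factor 1.08 is the standard I-P approximation, and the zone numbers are invented, not from the dissertation.

        # The governing minimum airflow is the larger of the ventilation
        # requirement and the airflow needed to meet the design heating load.

        def min_airflow_setpoint(vent_cfm, heating_load_btuh, supply_t, room_t):
            """Return the governing minimum airflow (cfm) for one VAV box."""
            # Sensible-heat relation Q = 1.08 * cfm * dT (I-P units)
            heating_cfm = heating_load_btuh / (1.08 * (supply_t - room_t))
            return max(vent_cfm, heating_cfm)

        # Example zone: 150 cfm ventilation requirement, 6,500 Btu/h design
        # heating load, 90 F discharge air against a 70 F room.
        print(min_airflow_setpoint(150.0, 6500.0, 90.0, 70.0))  # ~300.9 cfm, heating governs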

  1. Monitoring HD 148703 during upcoming eclipses

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-06-01

    Dr. Milena Ratajczak (University of Wrocław) has requested AAVSO observers' assistance in monitoring the very bright (V = 4.23) and very unusual eclipsing binary HD 148703 (HR 6143, N Sco) during its infrequent primary and secondary eclipses scheduled for 2017 June 11 and June 14, respectively. Dr. Ratajczak writes: "[HD 148703] N Sco is a B-type detached eclipsing binary, which turned out to be an exceptionally interesting object to study when we realised its orbital period is 223 days and time between eclipses is only 3.5 days. Such configuration makes it an extremely eccentric system, probably the most eccentric from any objects of that class ever studied...Since the object is very bright, it's difficult to use professional photometric telescopes due to saturation issues. That is why we kindly invite amateur astronomers to join the campaign. Data taken during times of eclipses (photometry) and time between eclipses (radial velocities from spectroscopy) which occur next week are crucial to cover in order to determine orbital and stellar parameters of system's components. Data taken over that time will be of very high value for us." The next primary eclipse time of minimum is on 2017 June 11 (UT 00:41:45), and the secondary on June 14 (UT 09:17:34). Each eclipse lasts about 20 hours. The amplitude of the primary eclipse is 0.15 magnitude, and the secondary 0.35 mag. PEP V and DSLR V photometry is requested. (CCD V is welcome if saturation can be avoided.) Beginning immediately, one to a few snapshots each night are requested to establish an out-of-eclipse baseline for each observer; they should continue for a few nights after the secondary eclipse has occurred. Time series photometry is requested beginning 12 hours before each time of minimum and continuing until 12 hours after. Precision to 0.01 mag or better per single observation is needed. Exposures should be as long as possible without saturating; don't make very short exposures simply for the purpose of gathering more data points. B or Ic data would also be useful; B is preferred to Ic. If imaging in more than one filter, please make five V observations for each B or Ic. Visual observations are also welcome. For spectroscopy now through June 20, resolution of at least a few thousand is needed. Coordinates: RA = 16 31 22.93 Dec = -34 42 15.7 (2000.0). Finder charts may be created and data from the AAVSO International Database may be viewed, plotted, or downloaded (www.aavso.org).

  2. The Evolution of Universe as Splitting of the ``Non Existing'' and Space-Time Expansion

    NASA Astrophysics Data System (ADS)

    Nassikas, A. A.

    2010-09-01

    The purpose of this paper is to show that the creation of the Universe can be regarded as a splitting process of the "non existing", "where" there is no space-time, and that the expansion of the Universe is due to the compatibility between the stochastic-quantum space-time created and the surrounding "non existing". In this way it is not required that space-time pre-exist, in contrast, as can be shown, to the theory of Universe creation from vacuum. The present point of view can be derived on the basis of a Minimum Contradictions Physics, according to which stochastic-quantum space-time is matter itself; there are (g)-mass and (em)-charge space-times which interact-communicate through photons [(g) or (em) particles with zero rest mass]. This point of view is compatible with the present knowledge from CERN and Fermi Lab experiments as well as with the neutron synthesis according to Rutherford, experimentally verified and theoretically explained through Hadronic Mechanics by R. M. Santilli. On the basis of the Minimum Contradictions Physics a quantum gravity formula is derived which implies either positive or negative gravitational acceleration; thus, bodies can be attracted while the Universe can expand. Minimum Contradictions Physics, under certain simplifications, is compatible with Newtonian Mechanics, Relativity Theory and QM. This physics is compatible with the language through which it is stated. On this basis the physical laws are the principles of language, i.e.: Classical Logic, the Sufficient Reason Principle, the Communication Anterior-Posterior Axiom and the Claim for Minimum Contradictions; according to a theorem, contradictions cannot be made to vanish.

  3. Evaluation of Antifungal Activity and Mechanism of Action of Citral against Candida albicans.

    PubMed

    Leite, Maria Clerya Alvino; Bezerra, André Parente de Brito; de Sousa, Janiere Pereira; Guerra, Felipe Queiroga Sarmento; Lima, Edeltrudes de Oliveira

    2014-01-01

    Candida albicans is a yeast that commensally inhabits the human body and can cause opportunistic or pathogenic infections. Objective. To investigate the antifungal activity of citral against C. albicans. Methodology. The minimum inhibitory concentration (MIC) and the minimum fungicidal concentration (MFC) were determined by the broth microdilution techniques. We also investigated possible citral action on cell walls (0.8 M sorbitol), cell membranes (citral to ergosterol binding), the time-kill curve, and biological activity on the yeast's morphology. Results. The MIC and MFC of citral were, respectively, 64 µg/mL and 256 µg/mL. Involvement with the cell wall and ergosterol binding were excluded as possible mechanisms of action. In the morphological interference assay, it was observed that the product inhibited pseudohyphae and chlamydoconidia formation. The MIC and the MFC of citral required only 4 hours of exposure to effectively kill 99.9% of the inoculum. Conclusion. Citral showed in vitro antifungal potential against strains of C. albicans. Citral's mechanism of action does not involve the cell wall or ergosterol, and further study is needed to completely describe its effects before being used in the future as a component of new antifungals.
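
    For readers unfamiliar with how broth-microdilution endpoints such as the MIC reported above are read, the sketch below shows the convention (lowest concentration with no visible growth). The growth flags are hypothetical, chosen to reproduce the 64 µg/mL MIC; the MFC is read analogously from subculture plates showing a 99.9% kill.

        # Minimal sketch of reading a MIC from a broth-microdilution series.

        def read_mic(dilutions):
            """dilutions: {concentration (ug/mL): growth observed (bool)}.
            MIC = lowest concentration with no visible growth."""
            no_growth = [c for c, grew in sorted(dilutions.items()) if not grew]
            return no_growth[0] if no_growth else None

        series = {16: True, 32: True, 64: False, 128: False, 256: False}
        print(read_mic(series))  # 64, matching the MIC reported for citral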

  4. Effects of eccentricities and lateral pressure on the design of stiffened compression panels

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Anderson, M. S.

    1972-01-01

    An analysis for determining the effects of eccentricities and lateral pressure on the design of stiffened compression panels is presented. The four types of panel stiffeners considered are integral, zee, integral zee, and integral tee. Mass-strength curves, which give the mass of the panel necessary to carry a specified load, are given along with related design equations needed to calculate the cross-sectional dimensions of the minimum-mass stiffened panel. The results of the study indicate that the proportions of the panels are geometrically similar to the proportions of panels designed for no eccentricity or lateral pressure, but the cross-sectional dimensions are greater, resulting in significantly increased mass. The analytical minimum-mass designs of zee-stiffened panels are compared with designs from experimentally derived charts. An assumed eccentricity of 0.001 times the length of the panel is used to correlate the analytical and experimental data. Good correlation between the experimentally derived and the analytical curves is obtained for the range of loading where material yield governs the design. At lower loads the mass given by the analytical curve using this assumed eccentricity is greater than that given by the experimental results.

  5. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratios. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084

  6. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors.

    PubMed

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-05-25

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratios. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.
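
    A sketch of the collaborative detection probability analysed in this record, assuming independent probabilistic sensing: a target is detected if at least one of the sensors covering it succeeds. The per-sensor probabilities are made up for illustration.

        # Collaborative detection under independence: the target is missed
        # only if every covering sensor misses simultaneously.

        def collaborative_detection(probs):
            """probs: detection probabilities of the sensors covering one target."""
            p_miss = 1.0
            for p in probs:
                p_miss *= (1.0 - p)
            return 1.0 - p_miss

        # Three sensors with modest individual probabilities together exceed
        # a 0.9 coverage requirement.
        print(collaborative_detection([0.5, 0.6, 0.7]))  # 0.94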

  7. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the time series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to the significant trends of the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those in maximum temperatures for most of the studied stations, particularly at the urban stations. These increases are suspected to be linked to the urban heat island effect in addition to the El Niño event.
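
    The (non-sequential) Mann-Kendall statistic underlying the SQ-MK test used in this record can be stated compactly: S sums the signs of all pairwise later-minus-earlier differences, and a strongly positive S indicates an increasing trend. The data below are illustrative annual mean temperatures, not the study's measurements.

        # Mann-Kendall trend statistic S = sum over i<j of sign(x[j] - x[i]).

        def mann_kendall_s(x):
            s = 0
            for i in range(len(x) - 1):
                for j in range(i + 1, len(x)):
                    diff = x[j] - x[i]
                    s += (diff > 0) - (diff < 0)   # sign of each pair
            return s

        temps = [26.1, 26.3, 26.2, 26.5, 26.6, 26.9, 27.0, 27.2]
        print(mann_kendall_s(temps))  # 26 out of a possible 28 -> strong upward trend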

  8. Technique of optimization of minimum temperature driving forces in the heaters of regeneration system of a steam turbine unit

    NASA Astrophysics Data System (ADS)

    Shamarokov, A. S.; Zorin, V. M.; Dai, Fam Kuang

    2016-03-01

    At the current stage of development of nuclear power engineering, high demands are made on nuclear power plants (NPPs), including on their economy. Under these conditions, improving the quality of an NPP implies, in particular, the need to choose reasonable values for the numerous controlled parameters of the technological (heat) scheme. Furthermore, the chosen values should correspond to the economic conditions of NPP operation, which usually lie a considerable time interval after the moment when the parameters are chosen. The article presents a technique for optimizing the controlled parameters of the heat circuit of a steam turbine plant for the future. Its particularity is that the results are obtained as a function of a complex parameter combining the external economic and operating parameters, which remains relatively stable under a changing economic environment. The article presents the results of optimizing, according to this technique, the minimum temperature driving forces in the surface heaters of the heat regeneration system of a steam turbine plant of the K-1200-6.8/50 type. For the optimization, the collector-screen high- and low-pressure heaters developed at the OAO All-Russia Research and Design Institute of Nuclear Power Machine Building, which, in the authors' opinion, have certain advantages over other types of heaters, were chosen. The optimality criterion in the task was the change in annual reduced costs for the NPP compared with the version accepted as the baseline. The influence on the solution of independent variables that are not included in the complex parameter was analyzed. The optimization problem was solved using the alternating-variable descent method. The obtained values of the minimum temperature driving forces can guide the design of new nuclear plants with a heat circuit similar to that considered here.
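
    A generic sketch of the alternating-variable (coordinate) descent named in this record: each variable is adjusted in turn while the others are held fixed, keeping only improving moves. The quadratic toy cost below merely stands in for the annual reduced-cost criterion; it is not the study's model.

        # Coordinate descent: sweep over variables, trying a step in each
        # direction and keeping the move only if it lowers the cost.

        def coordinate_descent(cost, x, step=0.1, iters=50):
            for _ in range(iters):
                for k in range(len(x)):
                    for trial in (x[k] - step, x[k] + step):
                        candidate = x[:k] + [trial] + x[k + 1:]
                        if cost(candidate) < cost(x):
                            x = candidate            # keep the improving move
            return x

        # Toy criterion with optimum driving forces of (2.0 K, 3.5 K).
        cost = lambda v: (v[0] - 2.0) ** 2 + (v[1] - 3.5) ** 2
        print(coordinate_descent(cost, [5.0, 5.0]))  # approaches [2.0, 3.5]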

  9. Magnetic energy dissipation in force-free jets

    NASA Technical Reports Server (NTRS)

    Choudhuri, Arnab Rai; Konigl, Arieh

    1986-01-01

    It is shown that a magnetic pressure-dominated, supersonic jet which expands or contracts in response to variations in the confining external pressure can dissipate magnetic energy through field-line reconnection as it relaxes to a minimum-energy configuration. In order for a continuous dissipation to occur, the effective reconnection time must be a fraction of the expansion time. The dissipation rate for the axisymmetric minimum-energy field configuration is analytically derived. The results indicate that the field relaxation process could be a viable mechanism for powering the synchrotron emission in extragalactic jets if the reconnection time is substantially shorter than the nominal resistive tearing time in the jet.

  10. Launching lunar missions from Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Friedlander, Alan; Young, Archie

    1990-01-01

    The relative orbital motion of Space Station Freedom and the moon places practical constraints on the timing of launch/return transfer trajectories. This paper describes the timing characteristics as well as the Delta-V variations over a representative cycle of launch/return opportunities. On average, the minimum-Delta-V transfer opportunities occur at intervals of 9 days. However, there is a significant nonuniform variation in this timing interval, as well as the minimum stay time at the moon, over the short cycle (51 days) and the long cycle (18.6 years). The advantage of three-impulse transfers for extending the launch window is also described.

  11. Surface microhardness of a resin composite exposed to a "first-generation" LED curing lamp, in vitro.

    PubMed

    Keogh, Pauraic; Ray, Noel J; Lynch, Christopher D; Burke, Francis M; Hannigan, Ailish

    2004-12-01

    This investigation determined the minimum exposure times consistent with optimised surface microhardness parameters for a commercial resin composite cured using a "first-generation" light-emitting diode activation lamp. Disk specimens were exposed and surface microhardness numbers measured at the top and bottom surfaces for elapsed times of 1 hour and 24 hours. Bottom/top microhardness number ratios were also calculated. Most microhardness data increased significantly over the elapsed time interval but microhardness ratios (bottom/top) were dependent on exposure time only. A minimum exposure of 40 secs is appropriate to optimise microhardness parameters for the combination of resin composite and lamp investigated.

  12. Real-time trajectory optimization on parallel processors

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L.

    1993-01-01

    A parallel algorithm has been developed for rapidly solving trajectory optimization problems. The goal of the work has been to develop an algorithm suitable for real-time, on-line optimal guidance through repeated solution of a trajectory optimization problem. The algorithm has been developed on an INTEL iPSC/860 message-passing parallel processor. It uses a zero-order-hold discretization of a continuous-time problem and solves the resulting nonlinear programming problem using a custom-designed augmented Lagrangian nonlinear programming algorithm. The algorithm achieves parallelism of function, derivative, and search direction calculations through the principle of domain decomposition applied along the time axis. It has been encoded and tested on 3 example problems: the Goddard problem, the acceleration-limited planar minimum-time-to-the-origin problem, and a National Aerospace Plane minimum-fuel ascent guidance problem. Execution times as fast as 118 sec of wall clock time have been achieved for a 128-stage Goddard problem solved on 32 processors. A 32-stage minimum-time problem has been solved in 151 sec on 32 processors. A 32-stage National Aerospace Plane problem required 2 hours when solved on 32 processors. A speed-up factor of 7.2 has been achieved by using 32 nodes instead of 1 node to solve a 64-stage Goddard problem.
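
    The zero-order-hold discretization mentioned in this record simply holds the control constant across each stage while the dynamics are propagated stage by stage. The sketch below uses an Euler step and a double-integrator example for illustration; the dynamics and stage count are not the paper's problem set-up.

        # ZOH rollout: propagate x' = f(x, u) with u held constant per stage.

        def zoh_rollout(f, x0, controls, dt):
            xs = [x0]
            for u in controls:
                x = xs[-1]
                xs.append([xi + dt * dxi for xi, dxi in zip(x, f(x, u))])
            return xs

        # Double integrator: state (position, velocity), control = acceleration.
        f = lambda x, u: [x[1], u]
        traj = zoh_rollout(f, [0.0, 0.0], [1.0] * 16 + [-1.0] * 16, dt=0.1)
        print(traj[-1])  # ends near [2.56, 0.0]: accelerate, then brake to rest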

  13. 40 CFR Table 2a to Subpart Ce of... - Emissions Limits for Small HMIWI Which Meet the Criteria Under § 60.33e(b)(1)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Pollutant; units; emissions limit; averaging time 1; method for demonstrating compliance 2: Particulate matter; mg/dscm (gr/dscf); 197 (0.086); 3-run average (1-hour minimum sample time per run); EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method 26A or 29 of appendix A-8 of part 60. Carbon monoxide; ppmv; 40; 3-run average (1-hour minimum...

  14. 40 CFR Table 2b to Subpart Ce of... - Emissions Limits for Small HMIWI Which Meet the Criteria Under § 60.33e(b)(2)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Pollutant; units; emissions limit; averaging time 1; method for demonstrating compliance 2: Particulate matter; mg/dscm (gr/dscf); 87 (0.038); 3-run average (1-hour minimum sample time per run); EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method 26A or 29 of appendix A-8 of part 60. Carbon monoxide; ppmv; 20; 3-run average (1-hour minimum...

  15. Wind Observations of Anomalous Cosmic Rays from Solar Minimum to Maximum

    NASA Technical Reports Server (NTRS)

    Reames, D. V.; McDonald, F. B.

    2003-01-01

    We report the first observation near Earth of the time behavior of anomalous cosmic-ray N, O, and Ne ions through the period surrounding the maximum of the solar cycle. These observations were made by the Wind spacecraft during the 1995-2002 period spanning times from solar minimum through solar maximum. Comparison of anomalous and galactic cosmic rays provides a powerful tool for the study of the physics of solar modulation throughout the solar cycle.

  16. The Wait Calculation: The Broader Consequences of the Minimum Time from Now to Interstellar Destinations and its Significance to the Space Economy

    NASA Astrophysics Data System (ADS)

    Kennedy, A.

    This paper summarises the wait calculation [1] for interstellar voyagers, which finds the minimum time to a destination given exponential growth in the rate of travel available to a civilisation. The minimum time obliges stellar system colonisers to consider departure time a significant risk factor in their voyages, since a departure made at that time will beat a departure made at any other time before or after. Generalised conclusions are drawn about the significant impact that departures to interstellar destinations before, at, or after the minimum time will have on the economic potential of missions and on the inevitability of competition between them. There will be no international law operating in interstellar space, and an ability to escape predatory actions en route, or at the destination, can only be achieved by precise calculation of departure times. Social and economic forces affecting the factors in the growth equation are discussed with reference to the probability of accelerating growth reaching the technological Singularity and strengthening the growth incentive trap. Islamic banking practices are discussed as a credible alternative to compounding interest-bearing paper for funding the space economy in the long term and for supporting stakeholder investment in such long-term mission development. The paper argues that the essentially free productivity of the Earth's biosphere and the capital accumulations made possible by land productivity are essential components of a viable long-term space economy, and that research into re-creating the costless productivity of the biosphere at a destination will both determine the mission's ultimate success and provide a means of returns for stakeholders during the long build-up. These arguments suggest that the Icarus project should set aside a robotic interstellar mission concept and instead develop a manned colonising mission from now.
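
    A minimal version of the minimisation behind the wait calculation can be written down under an assumed, illustrative growth law v(t) = v_0 e^{kt} for travel speed and a fixed distance d (the symbols v_0, k, d are not from the paper):

        T(t) = t + \frac{d}{v_0 e^{kt}}, \qquad
        \frac{dT}{dt} = 1 - \frac{d k}{v_0}\, e^{-kt} = 0
        \quad\Longrightarrow\quad
        t^{*} = \frac{1}{k}\ln\frac{d k}{v_0}.

    This is valid when dk/v_0 > 1; a voyager departing at t* arrives earlier than one departing at any other time, before or after, which is the incentive trap the record discusses.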

  17. Forensic Science

    ERIC Educational Resources Information Center

    Cobb, P. G. W.

    1973-01-01

    Summarizes the type of work carried out by forensic chemists and the minimum qualification needed for appointment. Indicates that there are eight Home Office regional forensic science laboratories in addition to the Central Research Establishment at Aldermaston. (CC)

  18. Voltage scheduling for low power/energy

    NASA Astrophysics Data System (ADS)

    Manzak, Ali

    2001-07-01

    Power considerations have become an increasingly dominant factor in the design of both portable and desk-top systems. An effective way to reduce power consumption is to lower the supply voltage, since power is quadratically related to voltage. This dissertation considers the problem of lowering the supply voltage at (i) the system level and at (ii) the behavioral level. At the system level, the voltage of a variable-voltage processor is dynamically changed with the workload. Processors with limited-size buffers as well as those with very large buffers are considered. Given the task arrival times, deadlines, execution times, periods and switching activities, task scheduling algorithms that minimize energy or peak power are developed for processors equipped with very large buffers. A relation between the operating voltages of the tasks for minimum energy/power is determined using the Lagrange multiplier method, and an iterative algorithm that utilizes this relation is developed. Experimental results show that the voltage assignment obtained by the proposed algorithm is very close to the optimal energy assignment (0.1% error) and the optimal peak power assignment (1% error). Next, on-line and off-line minimum-energy task scheduling algorithms are developed for processors with limited-size buffers. These algorithms have polynomial time complexity and provide optimal (off-line) and close-to-optimal (on-line) solutions. A procedure to calculate the minimum buffer size, given information about the task size (maximum, minimum), execution time (best case, worst case) and deadlines, is also presented. At the behavioral level, resources operating at multiple voltages are used to minimize power while maintaining throughput. Such a scheme has the advantage of allowing modules on the critical paths to be assigned to the highest voltage levels (thus meeting the required timing constraints) while allowing modules on non-critical paths to be assigned to lower voltage levels (thus reducing the power consumption). A polynomial-time resource- and latency-constrained scheduling algorithm is developed to distribute the available slack among the nodes such that power consumption is minimized. The algorithm is iterative and utilizes the slack based on the Lagrange multiplier method.
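
    A back-of-the-envelope sketch of the energy/delay trade-off this record relies on: switching energy per task scales with V^2, while circuit delay (hence execution time) grows as the voltage drops. The constants, the threshold voltage, and the delay model below are illustrative assumptions, not the dissertation's parameters.

        # Dynamic switching energy E = C_eff * V^2 * cycles; delay grows at
        # low voltage in an alpha-power-law style, so slower tasks save energy
        # only as far as their deadlines allow.

        def task_energy(v, cycles, c_eff=1e-9):
            """Switching energy in joules for an assumed effective capacitance."""
            return c_eff * v * v * cycles

        def task_time(v, cycles, v_t=0.4, k=1e-9):
            """Illustrative delay model: slower clock at lower voltage."""
            return cycles * k * v / (v - v_t) ** 2

        for v in (1.2, 1.0, 0.8):
            e = task_energy(v, cycles=1e6)
            t = task_time(v, cycles=1e6)
            print(f"V={v:.1f} V  energy={e*1e3:.2f} mJ  time={t*1e3:.2f} ms")
        # Running at 0.8 V instead of 1.2 V cuts energy by ~56% but roughly
        # triples execution time in this toy model.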

  19. [Pharmacokinetics and pharmacodynamics of antibiotics in intensive care].

    PubMed

    Sörgel, F; Höhl, R; Glaser, R; Stelzer, C; Munz, M; Vormittag, M; Kinzig, M; Bulitta, J; Landersdorfer, C; Junger, A; Christ, M; Wilhelm, M; Holzgrabe, U

    2017-02-01

    Optimized dosage regimens of antibiotics have remained obscure since their introduction. During the last two decades, pharmacokinetic(PK)-pharmacodynamic(PD) relationships, originally established in animal experiments, have been increasingly used in patients. The action of betalactams is believed to be governed by the time the plasma concentration remains above the minimum inhibitory concentration (MIC). Aminoglycosides act as intended when the peak concentration is a multiple of the MIC, and vancomycin seems to work best when the ratio of the area under the plasma concentration vs. time curve (AUC) to the MIC reaches a certain value. Clinicians should be aware that these relationships can only indicate the direction in which dosing should go. Larger studies with sufficiently high numbers of patients, particularly severely sick patients, are needed to prove the concepts. Now that all antibiotics can be measured with new technologies, the introduction of therapeutic drug monitoring (TDM), for example through a central lab such as PEAK (Paul Ehrlich Antibiotika Konzentrationsmessung), is suggested for ICUs (Intensive Care Units).
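
    The three PK/PD indices named in this record can be computed from a sampled plasma concentration-time curve with the trapezoidal rule, as in the sketch below. The concentration profile and MIC are invented for illustration, and the time-above-MIC estimate is deliberately crude (it counts only intervals fully above the MIC).

        # T>MIC (beta-lactams), Cmax/MIC (aminoglycosides), AUC/MIC (vancomycin)
        # from a sampled concentration-time curve.

        def pkpd_indices(times, conc, mic):
            pairs = list(zip(zip(times, conc), zip(times[1:], conc[1:])))
            auc = sum(0.5 * (c1 + c2) * (t2 - t1)
                      for (t1, c1), (t2, c2) in pairs)          # trapezoids
            t_above = sum(t2 - t1 for (t1, c1), (t2, c2) in pairs
                          if min(c1, c2) > mic)                 # crude estimate
            return {"T>MIC (h)": t_above,
                    "Cmax/MIC": max(conc) / mic,
                    "AUC/MIC": auc / mic}

        times = [0, 1, 2, 4, 6, 8, 12]                # hours after dose
        conc = [0.0, 16.0, 12.0, 6.0, 3.0, 1.5, 0.4]  # mg/L
        print(pkpd_indices(times, conc, mic=1.0))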

  20. Vehicle trajectory linearisation to enable efficient optimisation of the constant speed racing line

    NASA Astrophysics Data System (ADS)

    Timings, Julian P.; Cole, David J.

    2012-06-01

    A driver model is presented capable of optimising the trajectory of a simple dynamic nonlinear vehicle, at constant forward speed, so that progression along a predefined track is maximised as a function of time. In doing so, the model is able to continually operate a vehicle at its lateral-handling limit, maximising vehicle performance. The technique used forms part of the solution to the motor racing objective of minimising lap time. A new approach to formulating the minimum lap time problem is motivated by the need for a more computationally efficient and robust tool-set for understanding on-the-limit driving behaviour. This has been achieved through set point-dependent linearisation of the vehicle model and coupling the vehicle-track system using an intrinsic coordinate description. Through this, the geometric vehicle trajectory has been linearised relative to the track reference, leading to a new path optimisation algorithm which can be posed as a computationally efficient convex quadratic programming problem.
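
    A minimal sketch of the convex-QP idea this record describes: once the trajectory is linearised about the track reference, choosing lateral offsets that minimise a quadratic cost (here, a curvature penalty traded against a reference path) reduces to one linear solve of the normal equations. The track data and weights below are made up; this is not the paper's formulation.

        # Minimise |D2 s|^2 + lam * |s - b|^2, whose stationarity condition is
        # (D2^T D2 + lam I) s = lam b, a single symmetric positive-definite solve.
        import numpy as np

        n = 12
        D2 = np.zeros((n - 2, n))                 # second-difference (curvature) operator
        for i in range(n - 2):
            D2[i, i:i + 3] = [1.0, -2.0, 1.0]

        b = np.linspace(0.0, 1.0, n)              # reference offsets through a corner
        lam = 0.1
        A = D2.T @ D2 + lam * np.eye(n)
        s = np.linalg.solve(A, lam * b)
        print(np.round(s, 3))                     # smoothed racing-line offsets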

  1. A High-Speed, Real-Time Visualization and State Estimation Platform for Monitoring and Control of Electric Distribution Systems: Implementation and Field Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Gotseff, Peter; Giraldez, Julieta

    Continued deployment of renewable and distributed energy resources is fundamentally changing the way that electric distribution systems are controlled and operated; more sophisticated active system control and greater situational awareness are needed. Real-time measurements and distribution system state estimation (DSSE) techniques enable more sophisticated system control and, when combined with visualization applications, greater situational awareness. This paper presents a novel demonstration of a high-speed, real-time DSSE platform and related control and visualization functionalities, implemented using existing open-source software and distribution system monitoring hardware. Live scrolling strip charts of meter data and intuitive annotated map visualizations of the entire state (obtained via DSSE) of a real-world distribution circuit are shown. The DSSE implementation is validated to demonstrate provision of accurate voltage data. This platform allows for enhanced control and situational awareness using only a minimum quantity of distribution system measurement units and modest data and software infrastructure.
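
    A conceptual sketch of the weighted-least-squares core common to state estimators like the DSSE demonstrated above: given measurements z = H x + noise and meter weights W, the estimate is x_hat = (H^T W H)^{-1} H^T W z. The linear model and numbers are illustrative; a real DSSE uses nonlinear power-flow equations and iterates.

        # WLS state estimation with a small made-up meter set.
        import numpy as np

        def wls_state_estimate(H, W, z):
            A = H.T @ W @ H                      # gain matrix
            return np.linalg.solve(A, H.T @ W @ z)

        H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])   # 3 meters, 2 states
        W = np.diag([1.0, 1.0, 0.5])                          # less-trusted third meter
        z = np.array([1.02, 0.98, 0.05])                      # noisy measurements
        print(wls_state_estimate(H, W, z))  # close to the true state [1.02, 0.98]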

  2. Time-Coordination Strategies and Control Laws for Multi-Agent Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Puig-Navarro, Javier; Hovakimyan, Naira; Allen, B. Danette

    2017-01-01

    Time-critical coordination tools for unmanned systems can be employed to enforce the type of temporal constraints required in terminal control areas, ensure minimum distance requirements among vehicles are satisfied, and successfully perform coordinated missions. In comparison with previous literature, this paper presents an ampler spectrum of coordination and temporal specifications for unmanned systems, and proposes a general control law that can enforce this range of constraints. The constraint classification presented considers the nature of the desired arrival window and the permissible coordination errors to define six different types of time-coordination strategies. The resulting decentralized coordination control law allows the vehicles to negotiate their speeds along their paths in response to information exchanged over the communication network. This control law organizes the different members of the fleet hierarchically per their behavior and informational needs as reference agent, leaders, and followers. Examples and simulation results for all the coordination strategies presented demonstrate the applicability and efficacy of the coordination control law for multiple unmanned systems.

  3. Phenotypic Antimicrobial Susceptibility Testing with Deep Learning Video Microscopy.

    PubMed

    Yu, Hui; Jing, Wenwen; Iriya, Rafael; Yang, Yunze; Syal, Karan; Mo, Manni; Grys, Thomas E; Haydel, Shelley E; Wang, Shaopeng; Tao, Nongjian

    2018-05-15

    Timely determination of antimicrobial susceptibility for a bacterial infection enables precision prescription, shortens treatment time, and helps minimize the spread of antibiotic-resistant infections. Current antimicrobial susceptibility testing (AST) methods often take several days and thus impede these clinical and health benefits. Here, we present an AST method that images freely moving bacterial cells in urine in real time and analyzes the videos with a deep learning algorithm. The deep learning algorithm determines whether an antibiotic inhibits a bacterial cell by learning multiple phenotypic features of the cell, without the need for defining and quantifying each feature. We apply the method to urinary tract infection, a common infection that affects millions of people, to determine the minimum inhibitory concentration of pathogens from both bacteria-spiked urine and clinically infected urine samples for different antibiotics within 30 min, and validate the results against the gold-standard broth macrodilution method. Deep learning video microscopy-based AST holds great potential to help address the rise of drug-resistant infections.
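
    The per-cell outputs of such a classifier still have to be reduced to a single MIC value. Below is a minimal sketch of that aggregation step, with hypothetical names, data, and a hypothetical 90% cutoff:

    ```python
    # calls: {concentration (ug/mL): per-cell inhibited/uninhibited labels}.
    def minimum_inhibitory_concentration(calls, cutoff=0.9):
        """Lowest concentration whose inhibited fraction reaches the cutoff."""
        for conc in sorted(calls):
            labels = calls[conc]
            if sum(labels) / len(labels) >= cutoff:
                return conc
        return None  # growth not inhibited at any tested concentration

    calls = {0.5: [False] * 9 + [True],
             1.0: [True] * 7 + [False] * 3,
             2.0: [True] * 10}
    print("MIC:", minimum_inhibitory_concentration(calls), "ug/mL")
    ```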

  4. Short communication: Determination of the ability of Thymox to kill or inhibit various species of microorganisms associated with infectious causes of bovine lameness in vitro.

    PubMed

    Kulow, Megan; Zibaee, Fahimeh; Allard, Marianne; Döpfer, Dörte

    2015-11-01

    Infectious claw diseases continue to plague cattle in intensively managed husbandry systems. Poor foot hygiene and constantly moist environments lead to the infection and spread of diseases such as digital dermatitis (hairy heel warts), interdigital dermatitis, and interdigital phlegmon (foot rot). Currently, copper sulfate and formalin are the most widely used disinfecting agents in bovine footbaths; however, the industry could benefit from more environmentally and worker-friendly substitutes. This study determined the in vitro minimum inhibitory concentrations and minimum bactericidal concentrations of Thymox (Laboratoire M2, Sherbrooke, Québec, Canada) for a selection of microorganisms related to infectious bovine foot diseases. Thymox is a broad-spectrum agricultural disinfectant that is nontoxic, noncorrosive, and readily biodegradable. The minimum inhibitory concentration and minimum bactericidal concentration values indicated that Thymox inhibited growth and killed the various species of microorganisms under study at much lower concentrations than the recommended working concentration of a 1% solution. Overall, the minimum inhibitory concentration and minimum bactericidal concentration values found in this study show the potential of Thymox as an alternative antibacterial agent for bovine footbaths; however, field trials are needed to determine its effectiveness for the control and prevention of infectious claw diseases.

  5. Clinical Outcomes of a Pneumatic Unloader Brace for Kellgren-Lawrence Grades 3 to 4 Osteoarthritis: A Minimum 1-Year Follow-Up Study.

    PubMed

    Chughtai, Morad; Bhave, Anil; Khan, Sabahat Z; Khlopas, Anton; Ali, Osman; Harwin, Steven F; Mont, Michael A

    2016-11-01

    The use of a pneumatic unloader brace has been shown in pilot studies to decrease pain and increase muscle strength in patients with knee osteoarthritis (OA). Therefore, we analyzed patients who had knee OA and either received a pneumatic unloader brace plus conventional treatment or conventional treatment alone. Specifically, we assessed: (1) use of pain-relieving injections; (2) opioid consumption; and (3) the eventual need for total knee arthroplasty (TKA) in the above-mentioned cohort. We performed an analysis of a longitudinally maintained database of patients from a prospective, randomized, single-center study. This study randomized patients who had Kellgren-Lawrence grades 3 to 4 to receive either a pneumatic unloader brace plus conventional treatment or conventional treatment alone. The brace cohort comprised 11 patients with a mean age of 55 years (range, 37-70 years). The final matched cohort comprised 25 patients with a mean age of 63 years (range, 41-86 years). The minimum follow-up was 1 year. There was a lower proportion of patients who underwent an eventual TKA in the bracing cohort as compared with the nonbracing cohort (18 vs. 36%). The mean time to TKA was longer in the bracing cohort as compared with the nonbracing cohort (482 vs. 389 days). The proportion of patients who used opioids was similar in both groups (27 vs. 22%). There was a significantly lower number of patients who received injections in the bracing cohort as compared with the nonbracing cohort (46 vs. 83%, p = 0.026). The bracing cohort received a significantly lower number of injections and had a lower rate of subsequent TKA as compared with the nonbracing cohort. The mean time to TKA was also longer in the bracing cohort. These results may demonstrate the potential of this brace to reduce the need for, and prolong the time to, TKA. Larger prospective randomized studies, with built-in compliance monitors, are warranted. This brace may be a valuable adjunct to the current knee OA treatment armamentarium pending further investigation.

  6. Imaging of Lymph Flow in Breast Cancer Patients after Microdose Administration of a Near-Infrared Fluorophore: Feasibility Study

    PubMed Central

    Sevick-Muraca, Eva M.; Sharma, Ruchi; Rasmussen, John C.; Marshall, Milton V.; Wendt, Juliet A.; Pham, Hoang Q.; Bonefas, Elizabeth; Houston, Jessica P.; Sampath, Lakshmi; Adams, Kristen E.; Blanchard, Darlene Kay; Fisher, Ronald E.; Chiang, Stephen B.; Elledge, Richard; Mawad, Michel E.

    2011-01-01

    Purpose To prospectively demonstrate the feasibility of using indocyanine green, a near-infrared (NIR) fluorophore, at the minimum dose needed for noninvasive optical imaging of lymph nodes (LNs) in breast cancer patients undergoing sentinel lymph node mapping (SLNM). Materials and Methods Informed consent was obtained from 24 women (age range, 30–85 years) who received intradermal subcutaneous injections of 0.31–100 μg indocyanine green in the breast in this IRB-approved, HIPAA-compliant, dose-escalation study to find the minimum microdose for imaging. The breast, axilla, and sternum were illuminated with NIR light, and the fluorescence generated in the tissue was collected with an NIR-sensitive intensified charge-coupled device. Lymphoscintigraphy was also performed. Resected LNs were evaluated for the presence of radioactivity, blue dye accumulation, and fluorescence. The associations between the resected LNs that were fluorescent and (a) the time elapsed between NIR fluorophore administration and resection and (b) the dosage of NIR fluorophore were tested with the Spearman rank and Pearson product moment correlation tests, respectively. Results Lymph imaging consistently failed with indocyanine green microdosages between 0.31 and 0.77 μg. When indocyanine green dosages were 10 μg or higher, lymph drainage pathways from the injection site to LNs were imaged in eight of nine women; lymph propulsion was observed in seven of those eight. When propulsion in the breast and axilla regions was present, the mean apparent velocities ranged from 0.08 to 0.32 cm/sec, and the time elapsed between “packets” of propelled fluid varied from 14 to 92 seconds. In patients who received 10 μg of indocyanine green or more, a weak negative correlation between the fluorescence status of resected LNs and the time between NIR fluorophore administration and LN resection was found. No statistical association was found between the fluorescence status of resected LNs and the dose of NIR fluorophore. Conclusion NIR fluorescence imaging of lymph function and LNs is feasible in humans at microdoses that would be needed for future molecular imaging of cancer-positive LNs. PMID:18223125

  7. Analysis of Serum Concentrations of Tranexamic Acid Given by Alternate Routes in Swine (Sus scrofa) During Controlled Hemorrhage.

    DTIC Science & Technology

    2017-08-17

    the peak serum concentration of TXA when comparing IV and IM administration, IM did reach a minimum concentration which in vitro has been shown to inhibit... research is needed to determine the efficacy of TXA given by this route.

  8. Film viewing conditions in mammography.

    PubMed

    Hill, S J; Faulkner, K; Law, J; Starritt, H C

    1997-04-01

    A requirement for a minimum viewing box brightness of 3000 cd m-2 for reading mammograms has been widely advocated. Some recent work has challenged that opinion by reporting no significant variation in visibility of low contrast and fine detail objects over a wide range of brightness levels. This paper provides further experimental evidence to support the latter conclusion, at least over the range 1340-4190 cd m-2, and suggests that the currently recommended minimum viewing box brightness levels need to be revised. The importance of reducing room lighting levels is fully confirmed.

  9. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection. In addition, the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem through modeling and analysis of the frequency synchronizer.
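
    As a generic illustration of why loop gain dominates acquisition time, a first-order AFC loop can be sketched as below; the discriminator model, gain, and offsets are assumptions, not the modem's actual design:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    f_offset, f_est, gain = 250.0, 0.0, 0.3   # Hz; hypothetical values

    for step in range(50):
        # Noisy frequency discriminator measures the residual carrier offset.
        error = (f_offset - f_est) + rng.normal(0.0, 2.0)
        f_est += gain * error                 # first-order loop update
        if abs(f_offset - f_est) < 5.0:
            print(f"locked after {step + 1} updates, f_est = {f_est:.1f} Hz")
            break
    ```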

  10. Blind identification of nonlinear models with non-Gaussian inputs

    NASA Astrophysics Data System (ADS)

    Prakriya, Shankar; Pasupathy, Subbarayan; Hatzinakos, Dimitrios

    1995-12-01

    Some methods are proposed for the blind identification of finite-order discrete-time nonlinear models with non-Gaussian circular inputs. The nonlinear models consist of two finite-memory linear time-invariant (LTI) filters separated by a zero-memory nonlinearity (ZMNL) of the polynomial type (the LTI-ZMNL-LTI models). The linear subsystems are allowed to be non-minimum phase (NMP). The methods base their estimates of the impulse responses on slices of the (N+1)th-order polyspectra of the output sequence. It is shown that the identification of LTI-ZMNL systems requires only a 1-D moment or polyspectral slice. The coefficients of the ZMNL are not estimated, and need not be known. The order of the nonlinearity can, in theory, be estimated from the received signal. These methods possess several noise and interference suppression characteristics, and have applications in modeling nonlinearly amplified QAM/QPSK signals in digital satellite and microwave communications.
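
    For orientation, the forward structure being identified can be simulated directly; the filter coefficients and the cubic nonlinearity below are hypothetical stand-ins for the LTI-ZMNL-LTI blocks:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    h1 = np.array([1.0, -0.4, 0.2])   # first FIR filter (may be non-minimum phase)
    h2 = np.array([1.0, 0.5])         # second FIR filter
    x = rng.standard_normal(256) + 1j * rng.standard_normal(256)  # circular input

    u = np.convolve(x, h1)            # first linear subsystem
    v = u + 0.1 * u * np.abs(u) ** 2  # polynomial zero-memory nonlinearity
    y = np.convolve(v, h2)            # second linear subsystem
    print("output samples:", y.shape[0])
    ```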

  11. Multi Dimensional Honey Bee Foraging Algorithm Based on Optimal Energy Consumption

    NASA Astrophysics Data System (ADS)

    Saritha, R.; Vinod Chandra, S. S.

    2017-10-01

    In this paper a new nature-inspired algorithm is proposed, based on the natural foraging behavior of multi-dimensional honey bee colonies. This method handles issues that arise when food is shared from multiple sources by multiple swarms at multiple destinations. The self-organizing nature of natural honey bee swarms in multiple colonies is based on the principle of energy consumption: swarms of multiple colonies select a food source that optimally fulfills the requirements of their colonies, based on the energy required to transport food between a source and a destination. Minimum use of energy leads to maximum profit in each colony. The mathematical model proposed here is based on this principle. It has been successfully evaluated by applying it to a multi-objective transportation problem, optimizing cost and time. The algorithm optimizes the needs at each destination in linear time.
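
    The energy principle can be illustrated with a greedy sketch on a small, balanced transportation instance (a stand-in for intuition, not the paper's algorithm): each destination draws from the source with the lowest remaining transport energy.

    ```python
    import numpy as np

    energy = np.array([[4.0, 2.0, 7.0],    # energy[s, d]: source s -> destination d
                       [3.0, 6.0, 1.0]])
    supply = np.array([50.0, 40.0])        # balanced: total supply == total demand
    demand = np.array([30.0, 30.0, 30.0])
    plan = np.zeros_like(energy)

    for d in range(demand.size):
        while demand[d] > 1e-9:
            # cheapest source with remaining capacity
            s = int(np.argmin(np.where(supply > 1e-9, energy[:, d], np.inf)))
            qty = min(supply[s], demand[d])
            plan[s, d] += qty
            supply[s] -= qty
            demand[d] -= qty

    print("total transport energy:", float((plan * energy).sum()))
    ```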

  12. A functional description of the Buffered Telemetry Demodulator (BTD)

    NASA Technical Reports Server (NTRS)

    Tsou, H.; Shah, B.; Lee, R.; Hinedi, S.

    1993-01-01

    This article gives a functional description of the buffered telemetry demodulator (BTD), which operates on recorded digital samples to extract the symbols from the received signal. The key advantages of the BTD are as follows: (1) its ability to reprocess the signal to reduce acquisition time; (2) its ability to use future information about the signal and to perform smoothing on past samples; and (3) its minimum transmission bandwidth requirement, as each subcarrier harmonic is processed individually. The first application of the BTD would be the Galileo S-band contingency mission, where the signal is so weak that reprocessing to reduce the acquisition time is crucial. Moreover, in the event of employing antenna arraying with full-spectrum combining, only the subcarrier harmonics need to be transmitted between sites, resulting in a significant reduction in data-rate transmission requirements. Software implementation of the BTD is described for various general-purpose computers.

  13. Dynamic laser piercing of thick section metals

    NASA Astrophysics Data System (ADS)

    Pocorni, Jetro; Powell, John; Frostevarg, Jan; Kaplan, Alexander F. H.

    2018-01-01

    Before a contour can be laser cut, the laser first needs to pierce the material. The time taken to achieve piercing should be minimised to optimise productivity. One important aspect of laser piercing is the reliability of the process, because industrial laser cutting machines are programmed for the minimum reliable pierce time. In this work, piercing experiments were carried out in 15 mm thick stainless steel sheets, comparing a stationary laser with a laser moving along a circular trajectory at varying processing speeds. Results show that circular piercing can decrease the pierce duration by almost half compared to stationary piercing. High-speed imaging (HSI) was employed during the piercing process to understand melt behaviour inside the pierce hole. HSI videos show that circular rotation of the laser beam forces melt to eject in the direction opposite to the beam movement, while in stationary piercing the melt ejects less efficiently, in random directions out of the hole.

  14. Managing the On-Board Data Storage, Acknowledgment and Retransmission System for Spitzer

    NASA Technical Reports Server (NTRS)

    Sarrel, Marc A.; Carrion, Carlos; Hunt, Joseph C., Jr.

    2006-01-01

    The Spitzer Space Telescope has a two-phase downlink system. Data are transmitted during one telecom session. Then commands are sent during the next session to delete those data that were received and to retransmit those data that were missed. We must build sequences that are as efficient as possible to make the best use of our finite supply of liquid helium. One way to improve efficiency is to use only the minimum time needed during telecom sessions to transmit the predicted volume of data. But we must also not fill the onboard storage and must allow enough time margin to retransmit missed data. We describe tools and procedures that allow us to build observatory sequences that are single-fault tolerant in this regard and that allow us to recover quickly and safely from anomalies that affect the receipt or acknowledgment of data.
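
    The sizing rule described here reduces to simple arithmetic; below is a hedged sketch with hypothetical numbers (not Spitzer's actual rates or margins):

    ```python
    # Minimum telecom session length for a predicted data volume, padded by a
    # retransmission margin for data missed in the previous session.
    def telecom_minutes(predicted_gbits, rate_mbps, retransmit_fraction=0.15):
        seconds = predicted_gbits * 1e9 * (1 + retransmit_fraction) / (rate_mbps * 1e6)
        return seconds / 60.0

    print(f"{telecom_minutes(4.0, 2.2):.1f} min")  # 4 Gbit at 2.2 Mbit/s
    ```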

  15. Managing the On-Board Data Storage, Acknowledgement and Retransmission System for Spitzer

    NASA Technical Reports Server (NTRS)

    Sarrel, Marc A.; Carrion, Carlos; Hunt, Joseph C., Jr.

    2006-01-01

    The Spitzer Space Telescope has a two-phase downlink system. Recorded data are transmitted during one telecom session. Then commands are sent during the next session to delete those data that were received on the ground and to retransmit those data that were missed. We must build science sequences that are as efficient as possible to make the best use of our supply of liquid helium. One way to improve efficiency is to use only the minimum time needed during telecom sessions to transmit the predicted volume of data. But, we must also not fill the on-board storage and must allow enough time margin to retransmit missed data. We describe tools and procedures that allow us to build science sequences that are single-fault tolerant in this regard and that allow us to recover quickly and safely from anomalies that affect the receipt or acknowledgment (i.e. deletion) of data.

  16. Development of a rapid 21-plex autosomal STR typing system for forensic applications.

    PubMed

    Yang, Meng; Yin, Caiyong; Lv, Yuexin; Yang, Yaran; Chen, Jing; Yu, Zailiang; Liu, Xu; Xu, Meibo; Chen, Feng; Wu, Huijuan; Yan, Jiangwei

    2016-10-01

    DNA-STR genotyping technology has been widely used in forensic investigations. Even with such success, there is a great need to reduce the analysis time. In this study, we established a new rapid 21-plex STR typing system comprising the 13 CODIS loci plus Penta D, Penta E, D12S391, D2S1338, D6S1043, D19S433, D2S441, and Amelogenin. This system shortens the amplification time to a minimum of 90 min and does not require DNA extraction from the samples. Validation of the typing system complied with the Scientific Working Group on DNA Analysis Methods (SWGDAM) and Chinese National Standard (GA/T815-2009) guidelines. The results demonstrated that this 21-plex STR typing system is a valuable tool for rapid criminal investigation.

  17. The providers' profile of the disability support workforce in New Zealand.

    PubMed

    Jorgensen, Diane; Parsons, Matthew; Reid, Michelle Gundersen; Weidenbohm, Kate; Parsons, John; Jacobs, Stephen

    2009-07-01

    To understand one of the predominant groups supporting people with disabilities and illness, this study examined the profile of New Zealand paid caregivers, including their training needs. Paid caregivers, also known as healthcare assistants, caregivers and home health aides, work across several long-term care settings, such as residential homes, continuing-care hospitals and private homes. Their roles include assisting with personal care and household management. New Zealand, like other countries, is facing a health workforce shortage. A three-phase design was used: phase I, a survey of all home-based and residential care providers (N = 942, response rate = 45%); phase II, a targeted survey of training needs (n = 107, response = 100%); phase III, four focus groups and 14 interviews with 36 providers, exploring themes arising from phases I and II. Findings on 17,910 paid caregivers revealed a workforce predominantly female (94%), aged between 40 and 50, with 6% over the age of 60. Mean hourly pay was NZ$10.90 (minimum wage NZ$10.00, approximately UK£3.00 at the time of the study), for a mean of 24 hours of work per week. National paid caregiver turnover was 29% in residential care and 39% in the community. Most providers recognised the importance of training but felt their paid caregivers were not adequately trained. Training was poorly attended; the reasons cited were funding, family, secondary employment, staff turnover, low pay and few incentives. The paid caregiver profile described reflects trends also observed in other countries. There is a clear policy direction in New Zealand and other countries to support people with a disability at home, and yet the workforce facilitating this vision is itself highly vulnerable. Paid caregivers are on minimum pay, are female, work part-time, and although it is recognised that training is important for them, they do not attend, and consequently remain untrained.

  18. An Evaluation of Detect and Avoid (DAA) Displays for Unmanned Aircraft Systems: The Effect of Information Level and Display Location on Pilot Performance

    NASA Technical Reports Server (NTRS)

    Fern, Lisa; Rorie, R. Conrad; Pack, Jessica S.; Shively, R. Jay; Draper, Mark H.

    2015-01-01

    A consortium of government, industry and academia is currently working to establish minimum operational performance standards for Detect and Avoid (DAA) and Control and Communications (C2) systems in order to enable broader integration of Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). One subset of these performance standards will need to address the DAA display requirements that support an acceptable level of pilot performance. From a pilot's perspective, the DAA task is the maintenance of self-separation and collision avoidance from other aircraft, utilizing the available information and controls within the Ground Control Station (GCS), including the DAA display. The pilot-in-the-loop DAA task requires the pilot to carry out three major functions: (1) detect a potential threat, (2) determine an appropriate resolution maneuver, and (3) execute that resolution maneuver via the GCS control and navigation interface(s). The purpose of the present study was to examine two main questions with respect to DAA display considerations that could impact pilots' ability to maintain well clear of other aircraft. First, what is the effect of a minimum (or basic) information display compared to an advanced information display on pilot performance? Second, what is the effect of display location on UAS pilot performance? Two information levels (basic, advanced) were compared across two display locations (standalone, integrated), for a total of four displays. The authors propose an eight-stage pilot-DAA interaction timeline from which several pilot response time metrics can be extracted. These metrics were compared across the four display conditions. The results indicate that the advanced displays had faster overall response times than the basic displays; however, there were no significant differences between the standalone and integrated displays. Implications of the findings for understanding pilot performance on the DAA task and for the development of DAA display performance standards, as well as the need for future research, are discussed.

  19. Changes in temperature and precipitation extremes observed in Modena, Italy

    NASA Astrophysics Data System (ADS)

    Boccolari, M.; Malmusi, S.

    2013-03-01

    Climate change has become one of the most analysed subjects in the research community, mainly because of the numerous extreme events that have hit the globe. To gain a better view of climate change and trends, long observational time series are needed. During the last decade many Italian time series, covering several surface meteorological variables, have been analysed and published. None of them includes one of the longest records in Italy, the time series of the Geophysical Observatory of the University of Modena and Reggio Emilia. Measurements, collected since the early 19th century, always in the same position except for some months during the Second World War, embrace daily temperature, precipitation amount, relative humidity, pressure, cloudiness and other variables. In this work we concentrate on the analysis of yearly and seasonal trends and climate extremes of temperature, both minimum and maximum, and of precipitation, for the periods 1861-2010 and 1831-2010 respectively, in which continuous measurements are available. In general, our results confirm quite well those reported by the IPCC and in many other studies of the Mediterranean area. In particular, we found that minimum temperature has a non-significant positive trend of +0.1 °C per decade over the whole period; the value increases to +0.9 °C per decade for 1981-2010. For maximum temperature we observed a non-significant +0.1 °C per decade trend over the whole period, and +0.8 °C per decade for the last thirty years. Precipitation, on the other hand, is decreasing by 6.3 mm per decade over the whole analysed period, while the last thirty years are characterised by a marked increase of 74.8 mm per decade. For both variables several climate indices have been analysed, and they confirm what was found for minimum and maximum temperatures and precipitation. In particular, during the last 30 years frost days and ice days are decreasing, whereas summer days are increasing. Over the same period, the tropical-nights and warm-spell-duration indices show a particularly strong increase compared to the entire period. Finally, a cursory comparison between winter precipitation and the NAO index was made, showing a high anti-correlation, especially since the second half of the 20th century.
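
    The decadal trends quoted above come from straight-line fits to annual series; below is a minimal sketch with synthetic data (the values are illustrative, not the Modena record):

    ```python
    import numpy as np

    years = np.arange(1981, 2011)
    rng = np.random.default_rng(3)
    t_min = 8.0 + 0.09 * (years - years[0]) + rng.normal(0.0, 0.4, years.size)

    slope_per_year = np.polyfit(years, t_min, 1)[0]   # OLS slope, degC/yr
    print(f"trend: {10 * slope_per_year:+.2f} degC per decade")
    ```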

  20. Preparing for the data revolution: identifying minimum health information competencies among the health workforce.

    PubMed

    Whittaker, Maxine; Hodge, Nicola; Mares, Renata E; Rodney, Anna

    2015-04-01

    Health information is required for a variety of purposes at all levels of a health system, and a workforce skilled in collecting, analysing, presenting, and disseminating such information is essential to fulfil these demands. While it is established that low- and middle-income countries (LMICs) are facing shortages in human resources for health (HRH), there has been little systematic attention focussed on non-clinical competencies. In response, we developed a framework that defines the minimum health information competencies required by health workers at various levels of a health system. Using the Delphi method, we consulted with leading global health information system (HIS) experts. An initial list of competencies and draft framework were developed based on results of a systematic literature review. During the second half of 2012, we sampled 38 experts with broad-based HIS knowledge and extensive development experience. Two rounds of consultation were carried out with the same group to establish validity of the framework and gain feedback on the draft competencies. Responses from consultations were analysed using Qualtrics® software and content analysis. In round one, 17 experts agreed to participate in the consultation and 11 (65%) completed the survey. In the second round, 11 experts agreed to participate and eight (73%) completed the survey. Overall, respondents agreed that there is a need for all health workers to have basic HIS competencies and that the concept of a minimum HIS competency framework is valid. Consensus was reached around the inclusion of 68 competencies across four levels of a health system. This consultation is one of the first to identify the HIS competencies required among general health workers, as opposed to specialist HIS roles. It is also one of the first attempts to develop a framework on minimum HIS competencies needed in LMICs, highlighting the skills needed at each level of the system, and identifying potential gaps in current training to allow a more systematic approach to HIS capacity-building.

  1. 12 CFR 925.20 - Stock purchase.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Stock purchase. 925.20 Section 925.20 Banks and... BANKS Stock Requirements § 925.20 Stock purchase. (a) Minimum stock purchase. Each member shall purchase... outstanding advances. (b) Timing of minimum stock purchase. (1) Within 60 calendar days after an institution...

  2. 75 FR 6151 - Minimum Capital

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... (FHFA) is issuing and seeking comment on a proposed rule to effect a provision of the Federal Housing... imposing a temporary increase, for rescinding such an increase and a time frame for review of such an... longer justify the temporary level.\\8\\ To effect the higher temporary minimum capital level, the Director...

  3. 14 CFR 91.1053 - Crewmember experience.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... and ratings: (1) Total flight time for all pilots: (i) Pilot in command—A minimum of 1,500 hours. (ii) Second in command—A minimum of 500 hours. (2) For multi-engine turbine-powered fixed-wing and powered...

  4. General Requirements and Minimum Standards.

    ERIC Educational Resources Information Center

    2003

    This publication provides the General Requirements and Minimum Standards developed by the National Court Reporters Association's Council on Approved Student Education (CASE). They are the same for all court reporter education programs, whether an institution is applying for approval for the first time or for a new grant of approval. The first…

  5. Reply to Comment on ‘The cancer Warburg effect may be a testable example of the minimum entropy production rate principle’

    NASA Astrophysics Data System (ADS)

    Sabater, Bartolomé; Marín, Dolores

    2018-03-01

    The minimum rate principle is applied to the chemical reaction in a steady-state open cell system where, under constant supply of the glucose precursor, reference to time or to glucose consumption does not affect the conclusions.

  6. Time-Series Evidence of the Effect of the Minimum Wage on Youth Employment and Unemployment.

    ERIC Educational Resources Information Center

    Brown, Charles; And Others

    1983-01-01

    The study finds that a 10 percent increase in the federal minimum wage (or the coverage rate) would reduce teenage (16-19) employment by about one percent, which is at the lower end of the range of estimates from previous studies. (Author/SSH)

  7. Strain Recovery by TiNi Element Under Fast Heating

    NASA Astrophysics Data System (ADS)

    Volkov, Aleksandr E.; Miszuris, Wiktoria; Volkova, Natalia A.

    2018-03-01

    A theoretical and experimental study of strain recovery under fast heating of a shape memory alloy (SMA) rod preliminarily stretched in the martensitic state is carried out. Two theoretical models are considered: instantaneous heating, and heating with temperature variation during a finite time. In the first case, it is supposed that the straight SMA rod experiences an instantaneous reverse martensitic transformation; in the second, the transformation is supposed to progress at a rate corresponding to the temperature rate. An analytical expression for the time dependence of the rod free-end displacement is obtained. In the experiment, a wire specimen made of titanium-nickel SMA was heated by a short impulse of electric current, and the variation of the specimen length in time was registered. It has thus been shown that the minimum operation time of an SMA actuator (the time needed for strain recovery) can be reduced to 20 µs. Comparison of the theoretical results with the experimental ones leads to the conclusion that the displacement variation in time is controlled by the rate of heating and the inertia of the specimen. The incubation time of the martensitic transformation on the microscale is estimated to be less than 1 µs.

  8. Constrained Optimization of Average Arrival Time via a Probabilistic Approach to Transport Reliability

    PubMed Central

    Namazi-Rad, Mohammad-Reza; Dunbar, Michelle; Ghaderi, Hadi; Mokhtarian, Payam

    2015-01-01

    To achieve greater transit-time reduction and improvement in reliability of transport services, there is an increasing need to assist transport planners in understanding the value of punctuality; i.e. the potential improvements, not only to service quality and the consumer but also to the actual profitability of the service. In order for this to be achieved, it is important to understand the network-specific aspects that affect both the ability to decrease transit-time, and the associated cost-benefit of doing so. In this paper, we outline a framework for evaluating the effectiveness of proposed changes to average transit-time, so as to determine the optimal choice of average arrival time subject to desired punctuality levels whilst simultaneously minimizing operational costs. We model the service transit-time variability using a truncated probability density function, and simultaneously compare the trade-off between potential gains and increased service costs, for several commonly employed cost-benefit functions of general form. We formulate this problem as a constrained optimization problem to determine the optimal choice of average transit time, so as to increase the level of service punctuality, whilst simultaneously ensuring a minimum level of cost-benefit to the service operator. PMID:25992902
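
    As a hedged sketch of the punctuality side of this trade-off, using a truncated normal for transit time and toy cost terms (all parameters below are hypothetical, and scipy is assumed available):

    ```python
    from scipy.stats import truncnorm

    mu, sigma, lo, hi = 40.0, 6.0, 25.0, 80.0          # transit time, minutes
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    T = truncnorm(a, b, loc=mu, scale=sigma)           # truncated density

    p = 0.95                                           # target punctuality
    t_sched = T.ppf(p)                                 # scheduled arrival time
    cost = 2.0 * (t_sched - mu) + 100.0 * (1.0 - p)    # toy cost-benefit terms
    print(f"schedule {t_sched:.1f} min for {p:.0%} punctuality (cost {cost:.1f})")
    ```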

  9. Strain Recovery by TiNi Element Under Fast Heating

    NASA Astrophysics Data System (ADS)

    Volkov, Aleksandr E.; Miszuris, Wiktoria; Volkova, Natalia A.

    2018-01-01

    A theoretical and experimental study of strain recovery under fast heating of a shape memory alloy (SMA) rod preliminarily stretched in the martensitic state is carried out. Two theoretical models are considered: instantaneous heating, and heating with temperature variation during a finite time. In the first case, it is supposed that the straight SMA rod experiences an instantaneous reverse martensitic transformation; in the second, the transformation is supposed to progress at a rate corresponding to the temperature rate. An analytical expression for the time dependence of the rod free-end displacement is obtained. In the experiment, a wire specimen made of titanium-nickel SMA was heated by a short impulse of electric current, and the variation of the specimen length in time was registered. It has thus been shown that the minimum operation time of an SMA actuator (the time needed for strain recovery) can be reduced to 20 µs. Comparison of the theoretical results with the experimental ones leads to the conclusion that the displacement variation in time is controlled by the rate of heating and the inertia of the specimen. The incubation time of the martensitic transformation on the microscale is estimated to be less than 1 µs.

  10. Dissociation of end systole from end ejection in patients with long-term mitral regurgitation.

    PubMed

    Brickner, M E; Starling, M R

    1990-04-01

    To determine whether left ventricular (LV) end systole and end ejection uncouple in patients with long-term mitral regurgitation, 59 patients (22 control patients with atypical chest pain, 21 patients with aortic regurgitation, and 16 patients with mitral regurgitation) were studied with micromanometer LV catheters and radionuclide angiograms. End systole was defined as the time of occurrence (Tmax) of the maximum time-varying elastance (Emax), and end ejection was defined as the time of occurrence of minimum ventricular volume (minV) and zero systolic flow as approximated by the aortic dicrotic notch (Aodi). The temporal relation between end systole and end ejection in the control patients was Tmax (331 ± 42 [SD] msec), minV (336 ± 36 msec), and then zero systolic flow (355 ± 23 msec). This temporal relation was maintained in the patients with aortic regurgitation. In contrast, in the patients with mitral regurgitation, the temporal relation was Tmax (266 ± 49 msec), zero systolic flow (310 ± 37 msec, p < 0.01 vs. Tmax), and then minV (355 ± 37 msec, p < 0.001 vs. Tmax and p < 0.01 vs. Aodi). Additionally, the average Tmax occurred earlier in the patients with mitral regurgitation than in the control patients and patients with aortic regurgitation (p < 0.01 for both), whereas the average time to minimum ventricular volume was similar in all three patient groups. Moreover, the average time to zero systolic flow also occurred earlier in the patients with mitral regurgitation than in the control patients (p < 0.01) and patients with aortic regurgitation (p < 0.05). Because of the dissociation of end systole from minimum ventricular volume in the patients with mitral regurgitation, the end-ejection pressure-volume relations calculated at minimum ventricular volume did not correlate (r = -0.09), whereas those calculated at zero systolic flow did correlate (r = 0.88) with the Emax slope values. We conclude that end ejection, defined as minimum ventricular volume, dissociates from end systole in patients with mitral regurgitation because of the shortened time to LV end systole in association with preservation of the time to LV end ejection due to the low impedance to ejection presented by the left atrium. Therefore, pressure-volume relations calculated at minimum ventricular volume might not be useful for assessing LV chamber performance in some patients with mitral regurgitation.

  11. Sanitary Survey Training. The Need-to-Know Material. Instructor's Technical Manual.

    ERIC Educational Resources Information Center

    Gardner, Anne, Ed.

    This manual, developed as an aid to state agencies who provide instruction to inspectors of water systems, is based on the minimum information that an inspector with limited experience needs to know to successfully assess a public water system. The manual is designed for use by individuals who are experts in the field of water systems and sanitary…

  12. Full and Open Access to Data in the Global Earth Observing System of Systems (GEOSS): Implementing the GEOSS Data Sharing Principles

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; Uhlir, P. F.; Gabrinowicz, J. I.

    2008-12-01

    Full and open access to data from remote sensing platforms and other sources can facilitate not only scientific research but also the more widespread and effective use of scientific data for the benefit of society. The Global Earth Observing System of Systems (GEOSS) is a major international initiative of the Group on Earth Observations (GEO) to develop "coordinated, comprehensive and sustained Earth observations and information." In 2005, GEO adopted the GEOSS Data Sharing Principles, which call for the "full and open exchange of data, metadata, and products shared within GEOSS, recognizing relevant international instruments and national policies and legislation." These Principles also note that "All shared data, metadata, and products will be made available with minimum time delay and at minimum cost" and that "All shared data, metadata, and products being free of charge or no more than cost of reproduction will be encouraged for research and education." GEOSS Task DA-06-01, aimed at developing a set of recommended implementation guidelines for the Principles, was established in 2006 under the leadership of CODATA, the Committee on Data for Science and Technology of the International Council for Science (ICSU). An international team of authors has developed a draft White Paper on the GEOSS Data Sharing Principles and a proposed set of implementation guidelines. These have been carefully reviewed by independent reviewers, various GEO Committees, and GEO National Members and Participating Organizations. It is expected that the proposed implementation guidelines will be discussed at the GEO-V Plenary in Budapest in November 2008. The current version of the proposed implementation guidelines recognizes the importance of good faith, voluntary adherence to the Principles by GEO National Members and Participating Organizations. It underscores the value of reuse and re-dissemination of GEOSS data with minimum restrictions, not only within GEOSS itself but on the part of GEOSS users. Consistency with relevant international instruments and applicable policies and legislation is essential, and therefore clarification and coordination of applicable policies and procedures are needed. Pricing of GEOSS data, metadata, and products should be based on the premise that the data and information within GEOSS is a public good for public-interest use in the nine societal benefit areas. Time delays for data access from both operational and research systems should be kept to a minimum, reflecting the norms of the relevant scientific communities or data processing centers. The proposed guidelines also emphasize the need to better define research and education uses and to develop and collect usage metrics and indicators. The draft White Paper provides a more detailed review of past and current data policies related to space-based and spatial data, assesses the implications of the Data Sharing Principles for selected case studies, and discusses a number of other important implementation issues. Successful implementation of the GEOSS Data Sharing Principles is likely to be a critical element in the future effectiveness and value of GEOSS.

  13. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different priors, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density, depending on the autocorrelation structure. Seasonal and nonseasonal low-streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low-streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
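
    For comparison, the Burg/maximum-entropy side of that benchmark amounts to fitting an autoregressive model and extrapolating; below is a minimal Yule-Walker sketch with synthetic flows (not the Colorado River data):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    x = 10.0 + np.cumsum(rng.normal(0.0, 0.2, 200))    # synthetic series
    x = x - x.mean()
    p = 4                                              # AR order

    # Yule-Walker: solve R a = r using the sample autocovariance.
    r = np.array([x[: x.size - k] @ x[k:] / x.size for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1 : p + 1])

    history = list(x)
    for _ in range(12):                                # 12-step forecast
        history.append(float(np.dot(a, history[-1 : -p - 1 : -1])))
    print("forecast:", np.round(history[-12:], 2))
    ```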

  14. Recognizing interactions among lumber grading rules, gang-ripping technology, and industry needs could increase the use of No.2 Common lumber

    Treesearch

    Charles J. Gatchell

    1989-01-01

    Recognizing the interactions among lumber grading rules, gang-ripping technology, and the parts needs of the furniture and cabinet industries could increase the use of No. 2 Common lumber as a raw material. The minimum piece size used in establishing the No.2 Common grade is 3 inches by 2 feet. Industry often needs shorter and narrower pieces than this. No.2 Common...

  15. Reconstruction of Absorbed Doses to Fibroglandular Tissue of the Breast of Women undergoing Mammography (1960 to the Present)

    PubMed Central

    Thierry-Chef, Isabelle; Simon, Steven L.; Weinstock, Robert M.; Kwon, Deukwoo; Linet, Martha S.

    2013-01-01

    The assessment of potential benefits versus harms from mammographic examinations as described in the controversial breast cancer screening recommendations of the U.S. Preventive Services Task Force included limited consideration of absorbed dose to the fibroglandular tissue of the breast (glandular tissue dose), the tissue at risk for breast cancer. Epidemiological studies on cancer risks associated with diagnostic radiological examinations often lack accurate information on glandular tissue dose, and there is a clear need for better estimates of these doses. Our objective was to develop a quantitative summary of glandular tissue doses from mammography by considering sources of variation over time in key parameters including imaging protocols, x-ray target materials, voltage, filtration, incident air kerma, compressed breast thickness, and breast composition. We estimated the minimum, maximum, and mean values for glandular tissue dose for populations of exposed women within 5-year periods from 1960 to the present, with the minimum to maximum range likely including 90% to 95% of the entirety of the dose range from mammography in North America and Europe. Glandular tissue dose from a single view in mammography is presently about 2 mGy, about one-sixth the dose in the 1960s. The ratio of our estimates of maximum to minimum glandular tissue doses for average-size breasts was about 100 in the 1960s compared to a ratio of about 5 in recent years. Findings from our analysis provide quantitative information on glandular tissue doses from mammographic examinations which can be used in epidemiologic studies of breast cancer. PMID:21988547

  16. Pulse Profiles, Accretion Column Dips and a Flare in GX 1+4 During a Faint State

    NASA Technical Reports Server (NTRS)

    Giles, A. B.; Galloway, D. K.; Greenhill, J. G.; Storey, M. C.; Wilson, C. A.

    1999-01-01

    The Rossi X-ray Timing Explorer (RXTE) spacecraft observed the X-ray source GX 1+4 for a period of 34 hours on July 19/20 1996. The source faded from an intensity of approximately 20 mcrab to a minimum of <= 0.7 mcrab and then partially recovered towards the end of the observation. This extended minimum lasted approximately 40,000 seconds. Phase-folded light curves at a barycentric rotation period of 124.36568 +/- 0.00020 seconds show that near the center of the extended minimum the source stopped pulsing in the traditional sense but retained a weak dip feature at the rotation period. Away from the extended minimum the dips are progressively narrower at higher energies and may be interpreted as obscurations or eclipses of the hot spot by the accretion column. The pulse profile changed from leading-edge bright before the extended minimum to trailing-edge bright after it. Data from the Burst and Transient Source Experiment (BATSE) show that a torque reversal occurred < 10 days after our observation. Our data indicate that the observed rotation departs from a constant period, with a Ṗ/P value of approximately -1.5% per year at 4.5σ significance. We infer that we may have serendipitously obtained data, with high sensitivity and temporal resolution, around the time of an accretion disk spin reversal. We also observed a rapid flare, with some precursor activity, close to the center of the extended minimum.

  17. Resident away rotations allow adaptive neurosurgical training.

    PubMed

    Gephart, Melanie Hayden; Derstine, Pamela; Oyesiku, Nelson M; Grady, M Sean; Burchiel, Kim; Batjer, H Hunt; Popp, A John; Barbaro, Nicholas M

    2015-04-01

    Subspecialization of physicians and regional centers concentrate the volume of certain rare cases into fewer hospitals. Consequently, the primary institution of a neurological surgery training program may not have sufficient case volume to meet the current Residency Review Committee case minimum requirements in some areas. To ensure the competency of graduating residents through a comprehensive neurosurgical education, programs may need residents to travel to outside institutions for exposure to cases that are either less common or more regionally focused. We sought to evaluate off-site rotations to better understand the changing demographics and needs of resident education. This would also allow prospective monitoring of modifications to the neurosurgery training landscape. We completed a survey of neurosurgery program directors and a query of data from the Accreditation Council for Graduate Medical Education to characterize the current use of away rotations in the neurosurgical education of residents. We found that 20% of programs have mandatory away rotations, most commonly for exposure to pediatric, functional, peripheral nerve, or trauma cases. Most of these rotations are done during postgraduate year 3 to 6, lasting 1 to 15 months. Twenty-six programs have 2 to 3 participating sites and 41 have 4 to 6 sites distinct from the host program. Programs frequently offset potential financial harm to residents rotating at a distant site by support of housing and transportation costs. As medical systems experience fluctuating treatment paradigms and demographics, over time, more residency programs may adapt to meet the Accreditation Council for Graduate Medical Education case minimum requirements through the implementation of away rotations.

  18. UNCOVERING THE INTRINSIC VARIABILITY OF GAMMA-RAY BURSTS

    NASA Astrophysics Data System (ADS)

    Golkhou, V. Zach; Butler, Nathaniel R

    2014-08-01

    We develop a robust technique to determine the minimum variability timescale for gamma-ray burst (GRB) light curves, utilizing Haar wavelets. Our approach averages over the data for a given GRB, providing an aggregate measure of signal variation while also retaining sensitivity to narrow pulses within complicated time series. In contrast to previous studies using wavelets, which simply define the minimum timescale in reference to the measurement noise floor, our approach identifies the signature of temporally smooth features in the wavelet scaleogram and then additionally identifies a break in the scaleogram on longer timescales as a signature of a true, temporally unsmooth light curve feature or features. We apply our technique to the large sample of Swift GRB gamma-ray light curves and for the first time—due to the presence of a large number of GRBs with measured redshift—determine the distribution of minimum variability timescales in the source frame. We find a median minimum timescale for long-duration GRBs in the source frame of Δtmin = 0.5 s, with the shortest timescale found being on the order of 10 ms. This short timescale suggests a compact central engine (3000 km). We discuss further implications for the GRB fireball model and present a tantalizing correlation between the minimum timescale and redshift, which may in part be due to cosmological time dilation.
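
    The Haar measure itself is simple to sketch: average the squared difference of adjacent bin means at each trial timescale and look for where that power rises above the noise floor. Everything below (bin size, pulse shape, noise level) is synthetic:

    ```python
    import numpy as np

    def haar_power(flux, tau_bins):
        m = (flux.size // (2 * tau_bins)) * 2 * tau_bins
        pairs = flux[:m].reshape(-1, 2, tau_bins).mean(axis=2)
        return np.mean((pairs[:, 1] - pairs[:, 0]) ** 2)

    rng = np.random.default_rng(5)
    t = np.arange(4096) * 0.01                        # 10 ms bins
    pulse = np.exp(-0.5 * ((t - 20.0) / 0.5) ** 2)    # 0.5 s wide pulse
    flux = pulse + rng.normal(0.0, 0.05, t.size)

    for tau_bins in (1, 4, 16, 64):
        print(f"tau = {tau_bins * 0.01:5.2f} s  power = {haar_power(flux, tau_bins):.2e}")
    ```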

  19. X-ray Monitoring of eta Carinae: Variations on a Theme

    NASA Technical Reports Server (NTRS)

    Corcoran, M. F.

    2004-01-01

    We present monitoring observations by the Rossi X-ray Timing Explorer of the 2-10 keV X-ray emission from the supermassive star eta Carinae from 1996 through late 2003. These data cover more than one of the stellar variability cycles in temporal detail and include especially detailed monitoring through two X-ray minima. We compare the current X-ray minimum, which began on June 29, 2003, to the previous X-ray minimum, which began on December 15, 1997, and refine the X-ray period to 2024 days. We examine the variations in the X-ray spectrum with phase and with time, and also refine our understanding of the X-ray peaks, which have a quasi-period of 84 days with significant variation. Cycle-to-cycle differences are seen in the level of X-ray intensity and in the detailed variations of the X-ray flux on the rise to maximum just prior to the X-ray minimum. Despite these differences, the similarities between the decline to minimum, the duration of the minimum, and correlated variations of the X-ray flux and other measures throughout the electromagnetic spectrum leave little doubt that the X-ray variation is strictly periodic and produced by orbital motion as the wind from eta Carinae collides with the wind of an otherwise unseen companion.

  20. 40 CFR Table 1a to Subpart Ce of... - Emissions Limits for Small, Medium, and Large HMIWI at Designated Facilities as Defined in § 60...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) (grains per dry standard cubic foot (gr/dscf)) 115 (0.05) 69 (0.03) 34 (0.015) 3-run average (1-hour minimum sample time per run) EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method...-run average (1-hour minimum sample time per run) EPA Reference Method 10 or 10B of appendix A-4 of...

  1. 40 CFR Table 1a to Subpart Ce of... - Emissions Limits for Small, Medium, and Large HMIWI at Designated Facilities as Defined in § 60...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) (grains per dry standard cubic foot (gr/dscf)) 115 (0.05) 69 (0.03) 34 (0.015) 3-run average (1-hour minimum sample time per run) EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method...-run average (1-hour minimum sample time per run) EPA Reference Method 10 or 10B of appendix A-4 of...

  2. State-of-the-Art High Brightness RF Photo-Injector Design

    NASA Astrophysics Data System (ADS)

    Ferrario, Massimo; Clendenin, Jym; Palmer, Dennis; Rosenzweig, James; Serafini, Luca

    2000-04-01

    The art of designing optimized high-brightness electron RF photo-injectors has moved in the last decade from a cut-and-try procedure, guided by experimental experience and time-consuming particle-tracking simulations, to fast parameter-space scanning, guided by recent analytical results and a fast-running semi-analytical code, so as to reach the optimum operating point corresponding to maximum beam brightness. Scaling laws and the theory of the invariant envelope provide designers with excellent tools for a first choice of parameters, and the code HOMDYN, based on a multi-slice envelope description of the beam dynamics, is tailored to describe the space-charge-dominated dynamics of laminar beams in the presence of time-dependent space-charge forces, giving rise to a very fast modeling capability for photo-injector design. We report in this talk the results of a recent beam-dynamics study, motivated by the need to redesign the LCLS photoinjector. During this work a new effective working point for a split RF photoinjector was discovered by means of the approach mentioned above. With a proper choice of rf-gun and solenoid parameters, the emittance evolution shows a double-minimum behavior in the drift region. If the booster is located where the relative emittance maximum and the envelope waist occur, the second emittance minimum can be shifted to the booster exit and frozen at a very low level (0.3 mm-mrad for a 1 nC flat-top bunch), provided the invariant-envelope matching conditions are satisfied.

  3. Parametric dense stereovision implementation on a system-on chip (SoC).

    PubMed

    Gardel, Alfredo; Montejo, Pablo; García, Jorge; Bravo, Ignacio; Lázaro, José L

    2012-01-01

    This paper proposes a novel hardware implementation of dense recovery of stereovision 3D measurements. Traditionally, 3D stereo systems have imposed a maximum number of stereo correspondences, introducing a large restriction on artificial vision algorithms. The proposed system-on-chip (SoC) provides great performance and efficiency, with a scalable architecture available for many different situations, addressing real-time processing of the stereo image flow. Using double-buffering techniques properly combined with pipelined processing, the use of reconfigurable hardware achieves a parametrisable SoC which gives the designer the opportunity to decide its right dimension and features. The proposed architecture does not need any external memory because the processing is done as the image flow arrives. Our SoC provides 3D data directly, without the storage of whole stereo images. Our goal is to obtain high processing speed while maintaining the accuracy of the 3D data using minimum resources. Configurable parameters may be controlled by later/parallel stages of the vision algorithm executed on an embedded processor. With a hardware FPGA clock of 100 MHz, image flows of up to 50 frames per second (fps) of dense stereo maps with more than 30,000 depth points can be obtained for 2 Mpix images, with minimal initial latency. The implementation of computer vision algorithms on reconfigurable hardware, explicitly low-level processing, opens up the prospect of their use in autonomous systems, where they can act as a coprocessor to reconstruct 3D images with high-density information in real time.

  4. Impact of performance goals on the needs of highway infrastructure maintenance.

    DOT National Transportation Integrated Search

    2011-07-01

    Although it is widely accepted that establishing suitable performance goal is critical for system : maintenance and preservation, a framework that considers the inter-relationship between conflicting : objectives of minimum maintenance and rehabilita...

  5. Association between household food security and infant feeding practices in urban informal settlements in Nairobi, Kenya.

    PubMed

    Macharia, T N; Ochola, S; Mutua, M K; Kimani-Murage, E W

    2018-02-01

    Studies in urban informal settlements show widespread inappropriate infant and young child feeding (IYCF) practices and high rates of food insecurity. This study assessed the association between household food security and IYCF practices in two urban informal settlements in Nairobi, Kenya. The study adopted a longitudinal design involving a census sample of 1110 children less than 12 months of age and their mothers, aged between 12 and 49 years. A questionnaire was used to collect information on IYCF practices and household food security. Logistic regression was used to determine the association between food insecurity and IYCF practices. The findings showed high household food insecurity: only 19.5% of the households were food secure based on the Household Food Insecurity Access Scale. Infant feeding practices were inappropriate: 76% attained minimum meal frequency, 41% of the children attained minimum dietary diversity, and 27% attained a minimum acceptable diet. With the exception of minimum meal frequency, infants living in food-secure households were significantly more likely to achieve appropriate infant feeding practices than those in food-insecure households: minimum meal frequency (adjusted odds ratio (AOR)=1.26, P=0.530), minimum dietary diversity (AOR=1.84, P=0.046) and minimum acceptable diet (AOR=2.35, P=0.008). The study adds to the existing body of knowledge by demonstrating an association between household food security and infant feeding practices in low-income settings. The findings imply that interventions aimed at improving infant feeding practices, and ultimately nutritional status, also need to focus on improving household food security.
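
    For readers unfamiliar with the analysis step, the sketch below shows how an adjusted odds ratio for a binary feeding outcome against household food security might be estimated with logistic regression. The variable names, covariates, and data are hypothetical, not the study's actual dataset.

        # Hypothetical sketch of the logistic-regression step: estimating an
        # adjusted odds ratio (AOR) for a binary IYCF outcome vs. household food
        # security. Column names and covariates are made up for illustration.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 1110
        df = pd.DataFrame({
            "food_secure": rng.integers(0, 2, n),    # 1 = food-secure household
            "maternal_age": rng.uniform(12, 49, n),  # adjustment covariate
        })
        logit = 0.85 * df["food_secure"] - 1.0       # synthetic true effect
        df["min_diet_diversity"] = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = sm.add_constant(df[["food_secure", "maternal_age"]])
        fit = sm.Logit(df["min_diet_diversity"].astype(float), X).fit(disp=0)
        print(np.exp(fit.params["food_secure"]))     # adjusted odds ratio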

  6. Development of specifications for surface and subsurface oceanic environmental data

    NASA Technical Reports Server (NTRS)

    Wolff, P. M.

    1976-01-01

    The existing need for synoptic subsurface observations was demonstrated, with special attention to the requirements of meteorology. The current state of synoptic oceanographic observations was assessed, and a preliminary design was presented for the Basic Observational Network needed to fulfill the minimum needs of synoptic meteorology and oceanography. There is a critical existing need for such a network in support of atmospheric modeling and operational meteorological prediction, and through use of the regional water-mass concept an adequate observational system can be designed that is realistic in terms of cost and effort.

  7. Effect of handling characteristics on minimum time cornering with torque vectoring

    NASA Astrophysics Data System (ADS)

    Smith, E. N.; Velenis, E.; Tavernini, D.; Cao, D.

    2018-02-01

    In this paper, the effect of both passive and actively modified vehicle handling characteristics on minimum-time manoeuvring is studied for vehicles with 4-wheel torque-vectoring (TV) capability. First, a baseline optimal TV strategy is sought, independent of any causal control law. An optimal control problem (OCP) is initially formulated with the 4 independent wheel torques, together with the steering angle rate, as the control variables. Using this formulation, the performance benefit of TV over an electric drivetrain with a fixed torque distribution is demonstrated. The sensitivity of the TV-controlled manoeuvre time to the passive understeer gradient of the vehicle is then studied. A second formulation of the OCP is introduced in which a closed-loop TV controller is incorporated into the system dynamics. This formulation allows the effect on minimum-time cornering performance of actively modifying a vehicle's handling characteristic via TV to be assessed. In particular, the effect of the target understeer gradient, the key tuning parameter of the literature-standard steady-state yaw-rate reference based on the linear single-track model, is analysed.
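
    The steady-state yaw-rate reference mentioned at the end has, in its literature-standard form, r_ref = V*delta/(L + K_us*V^2), where delta is the steer angle, L the wheelbase, V the speed, and K_us the (target) understeer gradient. A minimal sketch, with placeholder vehicle parameters rather than the paper's:

        # Literature-standard steady-state yaw-rate reference from the linear
        # single-track model: r_ref = V * delta / (L + K_us * V**2).
        # Parameter values below are placeholders, not from the paper.
        import math

        def yaw_rate_reference(speed: float, steer_angle: float,
                               wheelbase: float, understeer_gradient: float) -> float:
            """Target yaw rate [rad/s] for speed [m/s], steer angle [rad],
            wheelbase [m], understeer gradient [rad*s^2/m]."""
            return speed * steer_angle / (wheelbase + understeer_gradient * speed**2)

        # Neutral steer (K_us = 0) vs. mild understeer at 20 m/s, 2 deg steer:
        delta = math.radians(2.0)
        print(yaw_rate_reference(20.0, delta, 2.6, 0.0))    # neutral
        print(yaw_rate_reference(20.0, delta, 2.6, 0.002))  # understeer: lower r_ref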

  8. Effects of tidal current phase at the junction of two straits

    USGS Publications Warehouse

    Warner, J.; Schoellhamer, D.; Burau, J.; Schladow, G.

    2002-01-01

    Estuaries typically have a monotonic increase in salinity from freshwater at the head of the estuary to ocean water at the mouth, creating a consistent direction for the longitudinal baroclinic pressure gradient. However, Mare Island Strait in San Francisco Bay has a local salinity minimum created by the phasing of the currents at the junction of Mare Island and Carquinez Straits. The salinity minimum creates converging baroclinic pressure gradients in Mare Island Strait. Equipment was deployed at four stations in the straits for 6 months from September 1997 to March 1998 to measure tidal variability of velocity, conductivity, temperature, depth, and suspended sediment concentration. Analysis of the measured time series shows that on a tidal time scale in Mare Island Strait, the landward and seaward baroclinic pressure gradients in the local salinity minimum interact with the barotropic gradient, creating regions of enhanced shear in the water column during the flood and reduced shear during the ebb. On a tidally averaged time scale, baroclinic pressure gradients converge on the tidally averaged salinity minimum and drive a converging near-bed and diverging surface current circulation pattern, forming a "baroclinic convergence zone" in Mare Island Strait. Historically large sedimentation rates in this area are attributed to the convergence zone. 

  9. Positive dwell time algorithm with minimum equal extra material removal in deterministic optical surfacing technology.

    PubMed

    Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun

    2017-11-10

    In deterministic computer-controlled optical surfacing, accurate dwell-time execution by computer-numeric-control machines is crucial to guaranteeing a high convergence ratio for the optical surface error. It is necessary to consider machine dynamics limitations in numerical dwell-time algorithms. In this paper, these constraints on the dwell-time distribution are analyzed, and a model of equal extra material removal is established. A positive dwell-time algorithm with minimum equal extra material removal is developed. Simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. The algorithm effectively improves the determinism of sub-aperture optical surfacing processes.
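
    The core numerical problem here is a deconvolution: surface error equals the tool influence function (TIF) convolved with the dwell-time map, with dwell time constrained non-negative. The sketch below solves a 1-D toy version with non-negative least squares; it illustrates positivity-constrained dwell-time solution in general, not the paper's specific minimum-equal-extra-removal algorithm or its machine-dynamics constraints.

        # Toy 1-D positive dwell-time solve: removal = conv(TIF, dwell), dwell >= 0.
        # Illustrates the positivity constraint only; the paper's equal-extra-removal
        # model and machine-dynamics limits are not reproduced here.
        import numpy as np
        from scipy.optimize import nnls

        n = 200
        x = np.linspace(-1.0, 1.0, n)
        target_removal = 0.5 + 0.3 * np.cos(2 * np.pi * x)  # desired removal map

        tif = np.exp(-(np.linspace(-3, 3, 31))**2)          # Gaussian tool influence
        tif /= tif.sum()

        # Build the convolution matrix A so that A @ dwell approximates the removal.
        A = np.zeros((n, n))
        half = len(tif) // 2
        for j in range(n):
            lo, hi = max(0, j - half), min(n, j + half + 1)
            A[lo:hi, j] = tif[half - (j - lo): half + (hi - j)]

        dwell, residual = nnls(A, target_removal)           # dwell >= 0 enforced
        print(f"min dwell = {dwell.min():.3f}, residual = {residual:.2e}")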

  10. Empirical Model of the Location of the Main Ionospheric Trough

    NASA Astrophysics Data System (ADS)

    Deminov, M. G.; Shubin, V. N.

    2018-05-01

    An empirical model of the location of the main ionospheric trough (MIT) is developed from an analysis of CHAMP satellite data measured at altitudes of 350-450 km during 2000-2007; the model is presented as an analytical dependence of the invariant latitude of the trough minimum Φm on magnetic local time (MLT), geomagnetic activity, and geographic longitude for the Northern and Southern Hemispheres. The time-weighted average index Kp(τ), whose weighting coefficient τ = 0.6 is determined by requiring minimum deviation of the model from the experimental data, is used as the indicator of geomagnetic activity. The model has no formal limitations in either local time or geomagnetic activity; however, the initial set of MIT minima mainly contains data in the 16-08 MLT interval for Kp(τ) < 6, so outside this interval the model is only qualitative. It is also established that (a) using solar local time (SLT) instead of MLT increases the model error by no more than 5-10%, and (b) the amplitude of the longitudinal effect in the latitude of the MIT minimum in geomagnetic (invariant) coordinates is ten times smaller than in geographic coordinates.
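
    The time-weighted index is presumably of the standard exponentially weighted form Kp(τ) = (1 − τ) Σ τ^i Kp(−i), familiar from Wrenn's ap(τ); the abstract does not spell out the authors' exact definition, so the sketch below is an assumption.

        # Exponentially time-weighted geomagnetic index, assuming the standard form
        #   Kp(tau) = (1 - tau) * sum_i tau**i * Kp[-i]   (most recent value first).
        # The abstract does not give the authors' exact definition; with a finite
        # history the (1 - tau) normalization is only approximate.
        def kp_weighted(kp_history, tau=0.6):
            """kp_history[0] is the current 3-h Kp; earlier values follow."""
            return (1.0 - tau) * sum(tau**i * kp for i, kp in enumerate(kp_history))

        # Quiet conditions with one recent disturbance: the weighted index responds
        # smoothly instead of jumping with every 3-h interval.
        print(kp_weighted([5.0, 2.0, 2.0, 1.0, 1.0, 1.0]))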

  11. Investigating seismoionospheric effects on a long subionospheric path

    NASA Astrophysics Data System (ADS)

    Clilverd, Mark A.; Rodger, Craig J.; Thomson, Neil R.

    We examine the possibility of earthquake precursors influencing the subionospheric propagation of VLF transmissions. We consider the long (12 Mm) path from the northeastern United States to Faraday, Antarctica (65°S, 64°W), during 1990-1995 and investigate the subionospheric amplitude variation of signals from the NAA communication transmitter (24.0 kHz, 1 MW) in Cutler, Maine, with particular emphasis on possible changes induced by seismic events occurring in South America. We have analyzed the changes in timing of modal minima generated by the passage of the sunrise terminator over the Andes, i.e., the "VLF terminator time" (TT) method. The anomalous variations in timing throughout the year are of a size and occurrence frequency similar to those previously reported, i.e., +/-0.5-1 hour and 1-2 per month. However, we find that in these anomalous cases the time of the sunrise modal minimum does not change significantly; rather, the minimum becomes too shallow to be detected, and the time of the next nearest minimum is logged. Our analysis indicates that the occurrence rate of successful earthquake predictions using the TT method cannot be distinguished from chance. Additionally, the level of false earthquake prediction using the TT method is high.

  12. The predatory mite Phytoseiulus persimilis adjusts patch-leaving to own and progeny prey needs.

    PubMed

    Vanas, V; Enigl, M; Walzer, A; Schausberger, P

    2006-01-01

    Integration of optimal-foraging and optimal-oviposition theories suggests that predator females should adjust patch-leaving to their own and their progeny's prey needs to maximize current and future reproductive success. We tested this hypothesis in the predatory mite Phytoseiulus persimilis and its patchily distributed prey, the two-spotted spider mite Tetranychus urticae. In three separate experiments we assessed (1) the minimum number of prey needed to complete juvenile development, (2) the minimum number of prey needed to produce an egg, and (3) the ratio between eggs laid and spider mites left when a gravid P. persimilis female leaves a patch. Experiments (1) and (2) were prerequisites for assessing the fitness costs associated with staying in or leaving a prey patch. Immature P. persimilis needed at least 7 and on average 14+/-3.6 (SD) T. urticae eggs to reach adulthood. Gravid females needed at least 5 and on average 8.5+/-3.1 (SD) T. urticae eggs to produce an egg. Most females left the initial patch before spider-mite extinction, leaving prey for progeny to develop to adulthood. Females placed in a low-density patch left 5.6+/-6.1 (SD) eggs per egg laid, whereas those placed in a high-density patch left 15.8+/-13.7 (SD) eggs per egg laid. The three experiments in concert suggest that gravid P. persimilis females are able to balance the trade-off between optimal foraging and optimal oviposition and adjust patch-leaving to their own and their progeny's prey needs.

  13. A Computational Model for Predicting Gas Breakdown

    NASA Astrophysics Data System (ADS)

    Gill, Zachary

    2017-10-01

    Pulsed-inductive discharges are a common method of producing a plasma. They provide a mechanism for quickly and efficiently generating a large volume of plasma for rapid use and are seen in applications including propulsion, fusion power, and high-power lasers. However, some common designs show a delayed response because the plasma forms when the magnitude of the magnetic field in the thruster is at a minimum. New designs are difficult to evaluate because of the time needed to construct a new geometry and the high monetary cost of changing the power-generation circuit. To evaluate new designs more quickly and better understand the shortcomings of existing designs, a computational model is developed. This model uses a modified single-electron model as the basis for a Mathematica code that determines how the energy distribution in a system changes with time and location. By analyzing this energy distribution, the approximate time and location of initial plasma breakdown can be predicted. The results from this code are then compared with existing data to show its validity and shortcomings. Missouri S&T APLab.
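
    As a rough illustration of the single-electron approach (assumed here to mean integrating one electron's equation of motion in the device's time-varying field and tracking its energy; the abstract gives no equations, and the original code is in Mathematica), a Python sketch for a uniform oscillating electric field:

        # Rough single-electron sketch: integrate one electron's motion in a
        # uniform oscillating field E(t) = E0*sin(w*t) and track kinetic energy.
        # This is an assumed reading of the "modified single-electron model";
        # the abstract gives no equations and the original code is in Mathematica.
        import numpy as np
        from scipy.integrate import solve_ivp

        QE, ME = 1.602e-19, 9.109e-31     # electron charge [C], mass [kg]
        E0, W = 100.0, 2 * np.pi * 1e6    # field amplitude [V/m], ang. freq [rad/s]

        def rhs(t, y):
            x, v = y
            return [v, -QE * E0 * np.sin(W * t) / ME]

        sol = solve_ivp(rhs, (0.0, 5e-6), [0.0, 0.0], max_step=1e-9)
        energy_eV = 0.5 * ME * sol.y[1] ** 2 / QE
        print(f"peak kinetic energy: {energy_eV.max():.1f} eV")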

  14. Metastable Features of Economic Networks and Responses to Exogenous Shocks

    PubMed Central

    Hosseiny, Ali; Bahrami, Mohammad; Palestrini, Antonio; Gallegati, Mauro

    2016-01-01

    It is well known that network structure plays an important role in shaping collective behavior. In this paper we study a network of firms and corporations to address metastable features in an Ising-based model. In our model we observe that, if in a recession the government imposes a demand shock to stimulate the network, metastable features shape its response. We find that there exists a minimum bound below which any demand shock is unable to drive the market out of recession. We then investigate the impact of network characteristics on this minimum bound. Surprisingly, in a Watts-Strogatz network, although the minimum bound depends on the average degree, when translated into the language of economics the bound becomes independent of it: it is about 0.44 ΔGDP, where ΔGDP is the gap in GDP between recession and expansion. We examine our suggestions for the cases of the United States and the European Union in the recent recession and compare them with the imposed stimulations: while the stimulation in the US was above our threshold, in the EU it was far below it. Besides providing a minimum bound for a successful stimulation, our study of the metastable features suggests that in a time of crisis there is a “golden time passage” in which the minimum bound for successful stimulation can be much lower. Our study therefore strongly suggests applying stimulations within this time passage. PMID:27706166
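
    A hedged toy version of the setup: Glauber dynamics for an Ising model on a Watts-Strogatz graph, with a uniform external field standing in for the demand shock. The parameters and the shock encoding are illustrative guesses, not the paper's calibration.

        # Toy Ising-on-Watts-Strogatz sketch: a uniform external field h stands in
        # for the demand shock; parameters are illustrative, not the paper's model.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(1)
        G = nx.watts_strogatz_graph(n=500, k=6, p=0.1, seed=1)
        nbrs = [list(G.neighbors(i)) for i in G.nodes]

        spins = -np.ones(500)        # start in the "recession" (all-down) state
        beta, h = 0.8, 0.3           # inverse temperature, demand-shock field

        def glauber_sweep(spins):
            for i in rng.permutation(len(spins)):
                local = sum(spins[j] for j in nbrs[i])
                dE = 2.0 * spins[i] * (local + h)    # energy cost of flipping i
                if rng.random() < 1.0 / (1.0 + np.exp(beta * dE)):
                    spins[i] = -spins[i]

        for sweep in range(200):
            glauber_sweep(spins)
        print("final magnetization:", spins.mean())  # stays near -1 while h is
                                                     # below the escape threshold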

  15. Singular perturbation techniques for real time aircraft trajectory optimization and control

    NASA Technical Reports Server (NTRS)

    Calise, A. J.; Moerder, D. D.

    1982-01-01

    The usefulness of singular perturbation methods for developing real-time computer algorithms to control and optimize aircraft flight trajectories is examined. A minimum-time intercept problem using F-8 aerodynamic and propulsion data is used as a baseline. This provides a framework within which issues relating to problem formulation, solution methodology, and real-time implementation are examined. Theoretical questions relating to separability of dynamics are addressed. With respect to implementation, situations leading to numerical singularities are identified, and procedures for dealing with them are outlined. Particular attention is given to identifying quantities that can be precomputed and stored, thus greatly reducing the on-board computational load. Numerical results are given to illustrate the minimum-time algorithm and the resulting flight paths. An estimate is given for execution time and storage requirements.

  16. QUIKVIS- CELESTIAL TARGET AVAILABILITY INFORMATION

    NASA Technical Reports Server (NTRS)

    Petruzzo, C.

    1994-01-01

    QUIKVIS computes the times during an Earth orbit when geometric requirements for observing celestial objects are satisfied. The observed objects may be fixed (stars, etc.) or moving (sun, moon, planets). QUIKVIS is useful for preflight analysis by those needing information on the availability of celestial objects for observation. Two types of analyses are performed by QUIKVIS: one is used when specific objects are known, the other when targets are unknown and potentially useful regions of the sky must be identified. The results are useful for selecting candidate targets, examining the effects of observation requirements, and making gross assessments of the effects of the orbit's right ascension of the ascending node (RAAN). The results are not appropriate when high accuracy is needed (e.g., for scheduling actual mission operations). The observation duration is calculated as a function of date, orbit node, and geometric requirements. The orbit right ascension of the ascending node can be varied to account for the effects of an uncertain launch time of day. The orbit semimajor axis and inclination are constant throughout the run. A circular orbit is assumed, but a simple program modification will allow eccentric orbits. The geometric requirements that can be processed are: 1) minimum separation angle between the line of sight to the object and the earth's horizon; 2) minimum separation angle between the line of sight to the object and the spacecraft velocity vector; 3) maximum separation angle between the line of sight to the object and the zenith direction; and 4) presence of the spacecraft in the earth's shadow. The user must supply a date or date range, the spacecraft orbit semimajor axis and inclination, up to 700 observation targets, and any geometric requirements to be met. The primary output is the time per orbit that conditions are satisfied, with options for sky-survey maps, time since a user-specified orbit event, and bar graphs illustrating overlapping requirements. The output is printed in visually convenient lineprinter form but is also available in data files for use by postprocessors such as external XY plotters. QUIKVIS is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX 11/780 operating under VMS with a central memory requirement of approximately 500K 8-bit bytes. QUIKVIS was developed in 1986 and revised in 1987.
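
    The first geometric requirement reduces to simple spherical geometry: from orbit radius r, the Earth's disk subtends a half-angle of arcsin(Re/r) about nadir, so a distant target's line of sight clears the horizon by (its angle from nadir) minus arcsin(Re/r). The sketch below illustrates that one check under those assumptions; it is not QUIKVIS code.

        # One of QUIKVIS's geometric checks, reduced to its core spherical geometry:
        # for a distant target, the line of sight clears the Earth horizon by
        # (angle from nadir) - arcsin(Re / r). Illustrative only, not QUIKVIS code.
        import numpy as np

        RE = 6378.137  # Earth equatorial radius [km]

        def horizon_clearance_deg(sat_pos_km, target_dir):
            """Angle [deg] by which the line of sight to a distant target clears
            the Earth horizon (negative -> target blocked by the Earth)."""
            r = np.linalg.norm(sat_pos_km)
            nadir = -np.asarray(sat_pos_km) / r
            u = np.asarray(target_dir) / np.linalg.norm(target_dir)
            theta = np.degrees(np.arccos(np.clip(np.dot(u, nadir), -1.0, 1.0)))
            return theta - np.degrees(np.arcsin(RE / r))

        # 500 km circular orbit; target 30 deg above the local horizontal plane
        # (+x is zenith here, so the target is 120 deg from nadir).
        sat = np.array([RE + 500.0, 0.0, 0.0])
        target = np.array([np.sin(np.radians(30.0)), np.cos(np.radians(30.0)), 0.0])
        print(horizon_clearance_deg(sat, target))  # ~52 deg above the horizon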

  17. Optimization of HTST process parameters for production of ready-to-eat potato-soy snack.

    PubMed

    Nath, A; Chattopadhyay, P K; Majumdar, G C

    2012-08-01

    Ready-to-eat (RTE) potato-soy snacks were developed using a high-temperature short-time (HTST) air-puffing process, which proved very useful for producing a highly porous, light-textured snack. The process parameters considered, viz. puffing temperature (185-255 °C) and puffing time (20-60 s), at a constant initial moisture content of 36.74% and air velocity of 3.99 m.s(-1), were investigated for potato-soy blends with soy flour content varying from 5% to 25%, using response surface methodology following a central composite rotatable design (CCRD). The optimum product, in terms of minimum moisture content (11.03% db), maximum expansion ratio (3.71), minimum hardness (2,749.4 g), minimum ascorbic acid loss (9.24% db) and maximum overall acceptability (7.35), was obtained with a 10.0% soy flour blend in potato flour at a puffing temperature of 231.0 °C and puffing time of 25.0 s.
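
    Response surface methodology here amounts to fitting a second-order polynomial in the two factors (temperature, time) to each response and optimizing over the fitted surface. The sketch below fits such a model by ordinary least squares on made-up data, since the study's raw CCRD data are not given in the abstract.

        # RSM in miniature: fit a second-order model
        #   y = b0 + b1*T + b2*t + b3*T*t + b4*T^2 + b5*t^2
        # by least squares. The data points are synthetic stand-ins; the study's
        # raw CCRD observations are not given in the abstract.
        import numpy as np

        T = np.array([185.0, 185.0, 255.0, 255.0, 220.0, 220.0, 220.0, 170.5, 269.5])
        t = np.array([20.0, 60.0, 20.0, 60.0, 40.0, 12.0, 68.0, 40.0, 40.0])
        y = np.array([3.0, 2.6, 3.6, 2.9, 3.7, 3.1, 2.8, 2.7, 3.2])  # synthetic

        X = np.column_stack([np.ones_like(T), T, t, T * t, T**2, t**2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        def predict(Ti, ti):
            return coef @ np.array([1.0, Ti, ti, Ti * ti, Ti**2, ti**2])

        # Grid search over the experimental region for the maximum predicted value.
        grid = [(Ti, ti) for Ti in np.linspace(185, 255, 71)
                         for ti in np.linspace(20, 60, 41)]
        print("predicted optimum (T, t):", max(grid, key=lambda p: predict(*p)))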

  18. Power Controller

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The device called the Power Factor Controller (PFC) offers exceptional energy-conservation potential by virtue of its ability to sense shifts in the relationship between voltage and current flow and to match them to the motor's need. Originating from the solar heating/cooling program, the PFC, when it senses a light load, cuts the voltage level to the minimum needed, which in turn reduces current flow and heat loss. Laboratory tests showed that the PFC could reduce power use by six to eight percent under normal motor loads, and by as much as 65 percent when the motor was idling. Over 150 companies have been granted NASA licenses for commercial use of this technology. One system that utilizes this technology is the Vectrol Energy System (VES), produced by Vectrol, Inc., a subsidiary of Westinghouse. The VES is being used at Woodward & Lothrop on their escalators; energy use is regulated according to how many people are on the escalator at any time. The estimated energy savings are between 30 and 40 percent.
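
    The sensing principle the PFC relies on can be illustrated numerically: the power factor is the cosine of the phase shift between voltage and current, recoverable from sampled waveforms as real power divided by apparent power. A toy sketch of that idea (not the PFC's actual analog circuitry):

        # Toy illustration of the PFC sensing principle: estimate power factor from
        # sampled waveforms as PF = mean(v*i) / (Vrms * Irms). A lightly loaded
        # motor shows a lagging, low power factor, signalling that voltage can be
        # cut. This mimics the idea only, not the PFC's actual analog circuitry.
        import numpy as np

        fs, f = 10_000, 60.0               # sample rate [Hz], line frequency [Hz]
        t = np.arange(0, 0.5, 1 / fs)      # 30 full cycles

        def power_factor(v, i):
            real_power = np.mean(v * i)
            apparent = np.sqrt(np.mean(v**2)) * np.sqrt(np.mean(i**2))
            return real_power / apparent

        v = 120 * np.sqrt(2) * np.sin(2 * np.pi * f * t)
        i_loaded = 10 * np.sin(2 * np.pi * f * t - np.radians(15))  # near full load
        i_idle = 2 * np.sin(2 * np.pi * f * t - np.radians(70))     # idling motor

        print(power_factor(v, i_loaded))   # ~cos(15 deg) = 0.97
        print(power_factor(v, i_idle))     # ~cos(70 deg) = 0.34 -> cut voltage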

  19. Use of Electronic Journals in Astronomy and Astrophysics Libraries and Information Centres in India: A Librarians' Perspective

    NASA Astrophysics Data System (ADS)

    Pathak, S. K.; Deshpande, N. J.; Rai, V.

    2010-10-01

    The objectives of this study were to find out whether librarians are satisfied with the present infrastructure for electronic journals and whether they are taking advantage of consortia. A structured questionnaire for librarians was divided into eight parts, further sub-divided, and designed to gather information on various aspects of library infrastructure and the usage of electronic journals. The goal was to determine the basic minimum infrastructure needed to provide a community of users with access to electronic journals and to facilitate communication among all major astronomy & astrophysics organizations in India. The study highlights key insights from the responses of librarians responsible for managing astronomy & astrophysics libraries in India and identifies the information needs of their users. Each community and discipline will have its own specific legacy of journal structure, reading, publishing, and researching practices, and time will show which kinds of e-journals are most effective and useful.

  20. Management of cataract in uveitis patients.

    PubMed

    Conway, Mandi D; Stern, Ethan; Enfield, David B; Peyman, Gholam A

    2018-01-01

    This review is timely because the outcomes of surgical intervention in uveitic eyes with cataract can be optimized by adherence to strict anti-inflammatory principles. All eyes should be free of any cell/flare for a minimum of 3 months preoperatively. Another helpful maneuver is to place dexamethasone in the infusion fluid or triamcinolone intracamerally at the end of surgery. Recent reports on the choice of intraocular lens material and lens design are germane to the best surgical outcome. Integrating these findings will promote better visual outcomes and allow research to further refine surgical interventions in high-risk uveitic eyes. Control of inflammation has been shown to greatly improve postoperative outcomes in patients with uveitis. Despite better outcomes, more scientific research needs to be done on lens placement and materials, and further research needs to adhere to the standardized reporting of uveitis nomenclature. Future studies should improve postoperative outcomes in eyes with uveitis so that they approach those of eyes undergoing routine cataract procedures.
