Sample records for formal methods lfm

  1. Lfm2000: Fifth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler)

    2000-01-01

    This is the proceedings of Lfm2000: Fifth NASA Langley Formal Methods Workshop. The workshop was held June 13-15, 2000, in Williamsburg, Virginia. See the web site for complete information about the event.

  2. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  3. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1981-01-01

    The role of the pilot and crew for future aircraft is discussed. Fifteen formal experimental studies and the development of a variety of models of human behavior based on queueing theory, pattern recognition methods, control theory, fuzzy set theory, and artificial intelligence concepts are presented. L.F.M.

  4. Tracing cell lineages in videos of lens-free microscopy.

    PubMed

    Rempfler, Markus; Stierle, Valentin; Ditzel, Konstantin; Kumar, Sanjeev; Paulitschke, Philipp; Andres, Bjoern; Menze, Bjoern H

    2018-06-05

    In vitro experiments with cultured cells are essential for studying their growth and migration patterns and thus for gaining a better understanding of cancer progression and its treatment. Recent progress in lens-free microscopy (LFM) has rendered it an inexpensive tool for label-free, continuous live cell imaging, yet there is little work on analysing such time-lapse image sequences. We propose (1) a cell detector for LFM images based on fully convolutional networks and residual learning, and (2) a probabilistic model based on moral lineage tracing that explicitly handles multiple detections and temporal successor hypotheses by clustering and tracking simultaneously. (3) We benchmark our method in terms of detection and tracking scores on a dataset of three annotated sequences of several hours of LFM, where we demonstrate that our method produces high-quality lineages. (4) We evaluate its performance on a more challenging problem: estimating cell lineages from the LFM sequence as would be possible from a corresponding fluorescence microscopy sequence. We present experiments on 16 LFM sequences for which we acquired fluorescence microscopy in parallel and generated annotations from them. Finally, (5) we showcase our method's effectiveness for quantifying cell dynamics in an experiment with skin cancer cells. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Use of the Contingent Valuation Method in the assessment of a landfill mining project.

    PubMed

    Marella, Giuliano; Raga, Roberto

    2014-07-01

    A comprehensive approach to evaluating the economic feasibility of landfill mining (LFM) should take into account not only the direct costs and revenues for the private investor, but also the social benefits or costs (generally called externalities), so that projects generating major social benefits (but no significant private revenues) are not overlooked. With a view to contributing to the development of a common framework for the evaluation of LFM projects, this paper presents the results of a case study addressing the assessment of social benefits from an LFM project. In particular, the Contingent Valuation Method is applied for the monetary assessment of the community-perceived benefits from the remediation of an old uncontrolled waste deposit by means of LFM and the conversion of the area into a park. Based on the results of a survey carried out on a random sample of people living near the old landfill, the economic values of the individual willingness to pay (WTP) for LFM and the subsequent creation of a public park were calculated, and the correlations with the relevant variables (distance from the landfill site, age, income, sex, education level) were assessed. The results were then suitably extended, and the monetary value of the welfare increase of the whole population resident in the area and potentially affected both by LFM and by the creation of the park was calculated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
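
    The abstract above describes deriving fault trees by backtracking through a causal model. As a rough illustration of that idea only (not the LFM formalism itself), the following Python sketch backtracks through a toy AND/OR cause graph to enumerate the combinations of lower-level events that can lead to a chosen top event; all names and the graph encoding are hypothetical.

      # Hypothetical sketch: backtracking a cause graph to enumerate combinations
      # of basic events (cut sets) leading to a top event. Illustrative only.
      def cut_sets(graph, node):
          """`graph` maps a node to alternative cause-sets (OR); each cause-set
          is a list of parents that must jointly occur (AND). Nodes absent
          from `graph` are basic events."""
          if node not in graph:          # basic (leaf) event
              return [[node]]
          results = []
          for cause_set in graph[node]:  # OR over alternative causes
              combos = [[]]
              for parent in cause_set:   # AND over joint causes
                  combos = [c + tail for c in combos
                            for tail in cut_sets(graph, parent)]
              results.extend(combos)
          return results

      # Toy embedded-system model (hypothetical): actuator failure arises from
      # a sensor fault together with a software timeout, or from a power fault.
      model = {
          "actuator_fail": [["sensor_fault", "sw_timeout"], ["power_fault"]],
          "sw_timeout": [["missed_deadline"]],
      }
      print(cut_sets(model, "actuator_fail"))
      # [['sensor_fault', 'missed_deadline'], ['power_fault']]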

  7. Assessment of Wildfire Risk in Southern California with Live Fuel Moisture Measurement and Remotely Sensed Vegetation Water Content Proxies

    NASA Astrophysics Data System (ADS)

    Jia, S.; Kim, S. H.; Nghiem, S. V.; Kafatos, M.

    2017-12-01

    Live fuel moisture (LFM) is the water content of live herbaceous plants expressed as a percentage of the plant's oven-dry weight. It is a critical parameter for fire ignition in Mediterranean climates and is routinely measured at sites selected by fire agencies across the U.S. Vegetation growing cycle, meteorological metrics, soil type, and topography all contribute to the seasonal and inter-annual variation of LFM, and therefore to the risk of wildfire. Optical remote sensing-based vegetation indices (VIs) have been used to estimate LFM. Compared to the VIs, microwave remote sensing products have advantages such as less saturation in greenness and direct sensitivity to the water content of the vegetation cover. In this study, we established three models to evaluate the predictability of LFM in Southern California using MODIS NDVI, the vegetation temperature condition index (VTCI) from downscaled Soil Moisture Active Passive (SMAP) products, and vegetation optical depth (VOD) derived with the Land Parameter Retrieval Model. Other ancillary variables, such as topographic factors (aspect and slope) and meteorological metrics (air temperature, precipitation, and relative humidity), are also considered in the models. The model results revealed an improvement in LFM estimation from SMAP products and VOD, despite the uncertainties introduced in the downscaling and parameter retrieval. LFM estimated from remote sensing data can provide a better assessment of wildfire danger than current methods using an NDVI-based growing season index. Future work will test VOD estimation from SMAP data using the multi-temporal dual channel algorithm (MT-DCA) and extend the LFM modeling to a regional scale.
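
    The LFM definition quoted above reduces to a one-line formula; a minimal sketch, with made-up sample weights:

      # LFM (%) = 100 * (fresh weight - oven-dry weight) / oven-dry weight
      def live_fuel_moisture(fresh_g: float, dry_g: float) -> float:
          return 100.0 * (fresh_g - dry_g) / dry_g

      print(live_fuel_moisture(12.0, 6.0))  # 100.0 -> well-hydrated fuel
      print(live_fuel_moisture(9.0, 6.0))   # 50.0  -> drought-stressed fuel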

  8. A parameter estimation algorithm for LFM/BPSK hybrid modulated signal intercepted by Nyquist folding receiver

    NASA Astrophysics Data System (ADS)

    Qiu, Zhaoyang; Wang, Pei; Zhu, Jun; Tang, Bin

    2016-12-01

    The Nyquist folding receiver (NYFR) is a novel ultra-wideband receiver architecture that can realize wideband reception with a small amount of hardware. The linear frequency modulated/binary phase shift keying (LFM/BPSK) hybrid modulated signal is a novel kind of low probability of intercept signal with wide bandwidth. The NYFR is an effective architecture for intercepting the LFM/BPSK signal, and the LFM/BPSK signal intercepted by the NYFR acquires an additional local oscillator modulation. A parameter estimation algorithm for the NYFR output signal is proposed. Using the NYFR prior information, a chirp singular value ratio spectrum is proposed to estimate the chirp rate. Then, based on the output's self-characteristic, a matching component function is designed to estimate the Nyquist zone (NZ) index. Finally, a matching code and a subspace method are employed to estimate the phase change points and code length. Compared with existing methods, the proposed algorithm performs better. It also needs no multi-channel structure, which keeps the computational complexity of the NZ index estimation small. The simulation results demonstrate the efficacy of the proposed algorithm.
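
    To make the signal class concrete, here is a minimal sketch of an LFM/BPSK hybrid modulated waveform: a linear-FM chirp whose phase is additionally keyed by a binary code. All parameter values and the code are illustrative assumptions; the NYFR processing and the proposed estimators are not reproduced.

      import numpy as np

      fs, T = 100e6, 20e-6                 # sample rate (Hz), pulse duration (s)
      f0, k = 5e6, 1e12                    # start frequency (Hz), chirp rate (Hz/s)
      code = np.array([1, -1, 1, 1, -1, 1, -1, -1])   # illustrative BPSK code

      t = np.arange(int(fs * T)) / fs
      chips = code[(t / T * len(code)).astype(int)]   # hold each chip for T/len(code)
      phase = 2 * np.pi * (f0 * t + 0.5 * k * t**2)   # LFM phase term
      s = chips * np.exp(1j * phase)                  # BPSK keying on top of the chirp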

  9. Investigation of Non-linear Chirp Coding for Improved Second Harmonic Pulse Compression.

    PubMed

    Arif, Muhammad; Ali, Muhammad Asim; Shaikh, Muhammad Mujtaba; Freear, Steven

    2017-08-01

    Non-linear frequency-modulated (NLFM) chirp coding was investigated to improve the pulse compression of the second harmonic chirp signal by reducing the range side lobe level. The problem of spectral overlap between the fundamental component and second harmonic component (SHC) was also investigated. Therefore, two methods were proposed: method I for the non-overlap condition and method II with the pulse inversion technique for the overlap harmonic condition. In both methods, the performance of the NLFM chirp was compared with that of the reference LFM chirp signals. Experiments were performed using a 2.25 MHz transducer mounted coaxially at a distance of 5 cm with a 1 mm hydrophone in a water tank, and the peak negative pressure of 300 kPa was set at the receiver. Both simulations and experimental results revealed that the peak side lobe level (PSL) of the compressed SHC of the NLFM chirp was improved by at least 13 dB in method I and 5 dB in method II when compared with the PSL of LFM chirps. Similarly, the integrated side lobe level (ISL) of the compressed SHC of the NLFM chirp was improved by at least 8 dB when compared with the ISL of LFM chirps. In both methods, the axial main lobe width of the compressed NLFM chirp was comparable to that of the LFM signals. The signal-to-noise ratio of the SHC of NLFM was improved by as much as 0.8 dB, when compared with the SHC of the LFM signal having the same energy level. The results also revealed the robustness of the NLFM chirp under a frequency-dependent attenuation of 0.5 dB/cm·MHz up to a penetration depth of 5 cm and a Doppler shift up to 12 kHz. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
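
    For readers unfamiliar with the PSL metric compared above, the following sketch compresses a plain (linear) FM chirp with a matched filter and reads off the peak side lobe level. Parameters are illustrative, and the paper's NLFM designs and harmonic processing are not reproduced; an unweighted LFM gives the textbook value of about -13 dB.

      import numpy as np

      fs, T, B = 50e6, 20e-6, 2e6                       # illustrative parameters
      t = np.arange(int(fs * T)) / fs
      chirp = np.exp(1j * np.pi * (B / T) * t**2)       # baseband LFM, rate B/T

      compressed = np.abs(np.correlate(chirp, chirp, mode="full"))
      compressed /= compressed.max()

      # Walk from the peak to the first local minimum on each side (main-lobe
      # edge), then take the largest remaining sample as the peak side lobe.
      main = int(np.argmax(compressed))
      left, right = main, main
      while left > 0 and compressed[left - 1] < compressed[left]:
          left -= 1
      while right < len(compressed) - 1 and compressed[right + 1] < compressed[right]:
          right += 1
      psl = max(compressed[:left].max(), compressed[right + 1:].max())
      print(f"PSL = {20 * np.log10(psl):.1f} dB")       # ~ -13 dB for unweighted LFM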

  10. Monetizing the social benefits of landfill mining: Evidence from a Contingent Valuation survey in a rural area in Greece.

    PubMed

    Damigos, Dimitris; Menegaki, Maria; Kaliampakos, Dimitris

    2016-05-01

    Despite the emerging global attention towards promoting waste management policies that reduce environmental impacts and conserve natural resources, landfilling still remains the dominant waste management practice in many parts of the world. Owing to this situation, environmental burdens are bequeathed to future generations and large amounts of potentially valuable materials are lost to them. As a means to undo these adverse effects, a process known as landfill mining (LFM) could be implemented, provided that economic feasibility is ensured. So far, only a few studies have focused on the economic feasibility of LFM from a private point of view, and even fewer studies have attempted to economically justify the need for LFM projects from a social point of view. This paper, aiming to add to the limited literature in the field, presents the results of a survey conducted in a rural district in Greece by means of the Contingent Valuation method (CVM) in order to estimate society's willingness to pay for LFM programs. According to the empirical survey, more than 95% of the respondents recognize the need for LFM programs. Nevertheless, only one-fourth of the respondents are willing to pay through increased taxes for LFM, owing mainly to economic depression and unemployment. Those who accept the increased tax are willing to pay about €50 per household per year, on average, which results in a mean willingness to pay (WTP) for the entire population under investigation of around €12 per household per year. The findings of this research work provide useful insights about the 'dollar-based' benefits of LFM in the context of social cost-benefit analysis of LFM projects. Yet, it is evident that further research is necessary. Copyright © 2015 Elsevier Ltd. All rights reserved.
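
    The population-level figure reported above follows from simple scaling; a minimal sketch using the abstract's rounded numbers:

      # mean WTP over all households =
      #   (share willing to pay) * (mean WTP among those willing)
      share_willing = 0.25        # about one-fourth of respondents
      wtp_among_payers = 50.0     # EUR per household per year, on average
      print(share_willing * wtp_among_payers)   # 12.5 -> consistent with ~EUR 12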

  11. A novel joint timing/frequency synchronization scheme based on Radon-Wigner transform of LFM signals in CO-OFDM systems

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Wei, Ying; Zeng, Xiangye; Lu, Jia; Zhang, Shuangxi; Wang, Mengjun

    2018-03-01

    A joint timing and frequency synchronization method is proposed for coherent optical orthogonal frequency-division multiplexing (CO-OFDM) systems. The timing offset (TO), the integer frequency offset (FO), and the fractional FO can all be estimated from a single training symbol, which consists of two linear frequency modulation (LFM) signals with opposite chirp rates. By detecting the peaks of the LFM signals after a Radon-Wigner transform (RWT), the TO and the integer FO can be estimated at the same time; the fractional FO can then be acquired through the self-correlation characteristic of the same training symbol. Simulation results show that the proposed method gives a more accurate TO estimation than existing methods, especially under poor OSNR conditions. For the FO estimation, both the fractional and the integer FO can be estimated through the proposed training symbol with no extra overhead, and a more accurate estimation with a large FO estimation range of [-5 GHz, 5 GHz] can be acquired.

  12. Spectroscopic Analysis of Temporal Changes in Leaf Moisture and Dry Matter Content

    NASA Astrophysics Data System (ADS)

    Qi, Y.; Dennison, P. E.; Brewer, S.; Jolly, W. M.; Kropp, R.

    2013-12-01

    Live fuel moisture (LFM), the ratio of water content to dry matter content (DMC) in live fuel, is critical for determining fire danger and behavior. Remote sensing estimation of LFM often relies on an assumption of changing water content and stable DMC over time. In order to advance understanding of temporal variation in LFM and DMC, we collected field samples and spectroscopic data for two species, lodgepole pine (Pinus contorta) and big sagebrush (Artemisia tridentata), to explore seasonal trends and the spectral expression of these trends. New and old needles were measured separately for lodgepole pine. All samples were measured using a visible/NIR/SWIR spectrometer, and coincident samples were processed to provide LFM, DMC, water content, and chemical components including structural and non-structural carbohydrates. New needles initially exhibited higher LFM and a smaller proportion of DMC, but differences between new and old needles converged as the new needles hardened. DMC explained more variation in LFM than water content for new pine needles and sagebrush leaves. Old pine needles transported non-structural carbohydrates to new needles, which accumulated DMC during the growth season, resulting in decreasing LFM in new needles. DMC and water content co-varied with vegetation chemical components and physical structure. Spectral variation in response to changing DMC is difficult to isolate from the spectral signatures of multiple chemical components. Partial least squares regression combined with hyperspectral data may improve modeling performance in LFM estimation.

  13. Effects of protein hydrolysates supplementation in low fish meal diets on growth performance, innate immunity and disease resistance of red sea bream Pagrus major.

    PubMed

    Khosravi, Sanaz; Rahimnejad, Samad; Herault, Mikaël; Fournier, Vincent; Lee, Cho-Rong; Dio Bui, Hien Thi; Jeong, Jun-Bum; Lee, Kyeong-Jun

    2015-08-01

    This study was conducted to evaluate the supplemental effects of three different types of protein hydrolysates in a low fish meal (FM) diet on growth performance, feed utilization, intestinal morphology, innate immunity, and disease resistance of juvenile red sea bream. An FM-based diet was used as a high fish meal (HFM) diet, and a low fish meal (LFM) diet was prepared by replacing 50% of FM with soy protein concentrate. Three other diets were prepared by supplementing shrimp, tilapia, or krill hydrolysate to the LFM diet (designated SH, TH, and KH, respectively). Triplicate groups of fish (4.9 ± 0.1 g) were fed one of the test diets to apparent satiation twice daily for 13 weeks and then challenged by Edwardsiella tarda. At the end of the feeding trial, significantly (P < 0.05) higher growth performance was obtained in fish fed the HFM diet and in the hydrolysate-treated groups compared to those fed the LFM diet. Significant improvements in feed conversion and protein efficiency ratios were obtained in fish fed the hydrolysates compared to those fed the LFM diet. Significant enhancement in protein digestibility was found in fish fed the SH and KH diets, and dry matter digestibility was increased in the group fed the SH diet in comparison to the LFM group. Fish fed the LFM diet showed a significantly higher glucose level than all the other treatments. Whole-body and dorsal muscle compositions were not significantly influenced by dietary treatments. Histological analysis revealed significant reductions in goblet cell numbers and enterocyte length in the proximal intestine of fish fed the LFM diet. Superoxide dismutase activity and total immunoglobulin level were significantly increased in fish fed the diets containing protein hydrolysates compared to the LFM group. Also, significantly higher lysozyme and antiprotease activities were found in fish fed the hydrolysate and HFM diets compared to those offered the LFM diet. Fish fed the LFM diet exhibited the lowest disease resistance against E. tarda, and dietary inclusion of the hydrolysates resulted in significant enhancement of survival rate. The results of the current study indicated that the inclusion of the tested protein hydrolysates, particularly SH, in an LFM diet can improve growth performance, feed utilization, digestibility, innate immunity, and disease resistance of juvenile red sea bream. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Investigating the viscous interaction and its role in generating the ionospheric potential during the Whole Heliosphere Interval

    NASA Astrophysics Data System (ADS)

    Bruntz, R.; Lopez, R. E.; Bhattarai, S. K.; Pham, K. H.; Deng, Y.; Huang, Y.; Wiltberger, M.; Lyon, J. G.

    2012-07-01

    The Whole Heliosphere Interval (WHI), comprising March 20-April 16, 2008 (DOY 80-107), is a single Carrington Rotation (2068) designated for intense study through observations and simulations. We used solar wind data from the WHI to run the Coupled Magnetosphere-Ionosphere-Thermosphere (CMIT) and stand-alone Lyon-Fedder-Mobarry (LFM) models. The LFM model was also run with the WHI solar wind plasma parameters but with zero interplanetary magnetic field (IMF). With no IMF, we expect that the cross-polar cap potential (CPCP) is due entirely to the viscous interaction. Comparing the LFM runs with and without the IMF, we found that during strong driving with southward IMF Bz, the viscous potential could be a significant fraction of the total CPCP. During times of northward IMF Bz, the CPCP was generally lower than the CPCP value from the IMF=0 run. LFM tends to produce high polar cap potentials, but by using the Bruntz et al. (2012) viscous potential formula (Φ_V = μ n^0.439 V^1.33, where μ = 0.00431) and the IMF=0 LFM run, we calculated a scaling factor γ = 1.54, which can be used to scale the LFM CPCP during the WHI down to realistic values. The Newell et al. (2008) viscous merging term can similarly be used to predict the viscous potential using the formula Φ_V = ν n^(1/2) V^2, where the value ν = 6.39×10^-5 was also found using the zero-IMF run. Both formulas were found to perform better when V (solar wind) = V_x rather than V_total, yielding similar, accurate predictions of the LFM viscous potential, with R^2 > 0.91 for both formulas. The γ factor was also used to scale down the LFM CPCP from the full solar wind run, with most of the resultant values matching the CPCP from the Weimer05 model well, even though γ was derived independent of the Weimer05 model or the full LFM data. We interpret this to be an indication that the conductivity model in LFM is producing values that are too low, thus elevating the CPCP values.
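
    The two viscous-potential formulas quoted in the abstract can be evaluated directly. The unit conventions below (n in cm^-3, V in km/s, potential in kV) are my assumption, consistent with common solar wind practice; check the cited papers before reuse.

      def phi_viscous_bruntz(n: float, v: float, mu: float = 0.00431) -> float:
          """Bruntz et al. (2012): Phi_V = mu * n**0.439 * V**1.33."""
          return mu * n**0.439 * v**1.33

      def phi_viscous_newell(n: float, v: float, nu: float = 6.39e-5) -> float:
          """Newell et al. (2008) viscous merging term: Phi_V = nu * sqrt(n) * V**2."""
          return nu * n**0.5 * v**2

      # Illustrative solar wind values: n = 5 cm^-3, Vx = 450 km/s.
      print(phi_viscous_bruntz(5.0, 450.0))   # ~29 kV
      print(phi_viscous_newell(5.0, 450.0))   # ~29 kV -> the two formulas agree closely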

  15. [Efficacy of Lactose-free Milk in Korean Adults with Lactose Intolerance].

    PubMed

    Park, Sun Hee; Chang, Young Woon; Kim, Soo Jung; Lee, Min Hye; Nam, Ji Hyeok; Oh, Chi Hyuk; Kim, Jung-Wook; Jang, Jae-Young; Yang, Jin Oh; Yoo, Jin Ah; Chung, Jin Young

    2016-01-25

    Lactose-free milk (LFM) is available as a nutrient supply for those with lactose intolerance (LI). However, there are no consistent results on the efficacy of LFM in LI subjects. We aimed to examine the changes in gastrointestinal (GI) symptoms and hydrogen breath test (HBT) values after ingestion of lactose-containing milk (LCM) vs. LFM. From May 2015 to September 2015, thirty-five healthy adults with a history of LCM-induced GI symptoms were recruited at a tertiary hospital. For the diagnosis of LI, HBT with LCM 550 mL (lactose 25 g) was performed every 20 minutes for 3 hours. The test was defined as "positive" when the H2 peak exceeded 20 ppm above baseline values (ΔH2>20 ppm). When subjects were diagnosed with LI, a second HBT using LFM 550 mL (lactose 0 g) was performed 7 days later. Subjects were asked to complete a questionnaire about the occurrence and severity of GI symptoms. Among a total of 35 subjects, 31 were diagnosed with LI at the first visit, and their LCM-related symptoms were abdominal pain (98.6%), borborygmus (96.8%), diarrhea (90.3%), and flatus (87.1%). The ΔH2 value in subjects taking LCM (103.7±66.3 ppm) significantly decreased to 6.3±4.9 ppm after ingesting LFM (p<0.0001). There was also a significant reduction in total symptom scores and in the severity of each symptom when LCM was changed to LFM (p<0.0001). This is the first report that LFM reduces LCM-related GI symptoms and H2 production in Korean adults. LFM can be an effective alternative to LCM in adults with LI.
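
    The diagnostic rule stated above is a simple threshold; a minimal sketch (values illustrative):

      def hbt_positive(baseline_ppm: float, peak_ppm: float) -> bool:
          """Positive HBT when the H2 peak exceeds baseline by more than 20 ppm."""
          return (peak_ppm - baseline_ppm) > 20.0

      print(hbt_positive(10.0, 113.7))  # True  -> e.g., after LCM ingestion
      print(hbt_positive(10.0, 16.3))   # False -> e.g., after LFM ingestion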

  16. Local fisheries management at the Swedish coast: biological and social preconditions.

    PubMed

    Bruckmeier, Karl; Neuman, Erik

    2005-03-01

    Most of the Swedish coastal fisheries are not sustainable from a social, economic, or ecological point of view. We propose the introduction of local fisheries management (LFM) as a tool for restructuring the present large-scale management system in order to achieve sustainability. To implement LFM, two questions need to be answered: How should the resource fish be distributed among different resource user groups? How should present fisheries management be restructured to meet the criteria of sustainability? Starting from these questions we describe possible forms of LFM for the Swedish coastal fishery, supported by recent research. The biological and social preconditions for restructuring fisheries management are derived from an analysis of the ecological and managerial situation in Swedish fishery. Three types of LFM--owner-based, user-based, and community-based management--are analyzed with regard to the tasks to be carried out in LFM, the roles of management groups, and the definition and optimal size of management areas.

  17. Non-contact lateral force microscopy.

    PubMed

    Weymouth, A J

    2017-08-16

    The goal of atomic force microscopy (AFM) is to measure the short-range forces that act between the tip and the surface. The recorded signal, however, includes long-range forces that are often an unwanted background. Lateral force microscopy (LFM) is a branch of AFM in which a component of force perpendicular to the surface normal is measured. If we consider the interaction between tip and sample in terms of forces, which have both direction and magnitude, then we can make a very simple yet profound observation: over a flat surface, long-range forces that do not yield topographic contrast have no lateral component. Short-range interactions, on the other hand, do. Although contact mode is the most common LFM technique, true non-contact AFM techniques can be applied to perform LFM without the tip pressing on the sample. Non-contact lateral force microscopy (nc-LFM) is therefore ideal for studying the short-range forces of interest. One of the first applications of nc-LFM was the study of non-contact friction. A similar setup is used in magnetic resonance force microscopy to detect spin flipping. More recently, nc-LFM has been applied as a true microscopy technique to systems unsuitable for normal force microscopy.

  18. Effects of including electrojet turbulence in LFM-RCM simulations of geospace storms

    NASA Astrophysics Data System (ADS)

    Oppenheim, M. M.; Wiltberger, M. J.; Merkin, V. G.; Zhang, B.; Toffoletto, F.; Wang, W.; Lyon, J.; Liu, J.; Dimant, Y. S.

    2016-12-01

    Global geospace system simulations need to incorporate nonlinear and small-scale physical processes in order to accurately model storms and other intense events. During times of strong magnetospheric disturbances, large-amplitude electric fields penetrate from the Earth's magnetosphere to the E-region ionosphere, where they drive Farley-Buneman instabilities (FBI) that create small-scale plasma density turbulence. This induces nonlinear currents and leads to anomalous electron heating. Current global Magnetosphere-Ionosphere-Thermosphere (MIT) models disregard these effects by assuming simple laminar ionospheric currents. This paper discusses the effects of incorporating accurate turbulent conductivities into MIT models. Recently, we showed in Liu et al. (2016) that during storm time, turbulence increases the electron temperatures and conductivities more than precipitation does. In this talk, we present the results of adding these effects to the combined Lyon-Fedder-Mobarry (LFM) global MHD magnetosphere simulator and the Rice Convection Model (RCM). The LFM combines a magnetohydrodynamic (MHD) simulation of the magnetosphere with a 2D electrostatic solution of the ionosphere. The RCM uses drift physics to accurately model the inner magnetosphere, including a storm-enhanced ring current. The LFM and coupled LFM-RCM simulations have previously shown unrealistically high cross-polar-cap potentials during strong solar wind driving conditions. We have recently implemented an LFM module that modifies the ionospheric conductivity to account for FBI-driven anomalous electron heating and nonlinear cross-field current enhancements as a function of the predicted ionospheric electric field. We have also improved the LFM-RCM code by making it capable of handling dipole tilts and asymmetric ionospheric solutions. We have tested this new LFM version by simulating the March 17, 2013 geomagnetic storm. These simulations showed a significant reduction in the cross-polar-cap potential during the strongest driving conditions, significant increases in the ionospheric conductivity in the auroral oval, and better agreement with DMSP observations of sub-auroral polarization streams. We conclude that accurate MIT simulations of geospace storms require the inclusion of turbulent conductivities.

  19. Intake of milk with added micronutrients increases the effectiveness of an energy-restricted diet to reduce body weight: a randomized controlled clinical trial in Mexican women.

    PubMed

    Rosado, Jorge L; Garcia, Olga P; Ronquillo, Dolores; Hervert-Hernández, Deisy; Caamaño, Maria Del C; Martínez, Guadalupe; Gutiérrez, Jessica; García, Sandra

    2011-10-01

    Micronutrient deficiencies have been associated with an increase in fat deposition and body weight; thus, adding micronutrients to low-fat milk may facilitate weight loss when accompanied by an energy-restricted diet. The objective was to evaluate the effect of the intake of low-fat milk and low-fat milk with added micronutrients on anthropometrics, body composition, blood glucose levels, lipid profile, C-reactive protein, and blood pressure of women following an energy-restricted diet. A 16-week randomized, controlled intervention study. One hundred thirty-nine obese women (aged 34±6 years) from five rural communities in Querétaro, Mexico. Women followed an energy-restricted diet (-500 kcal) and in addition received one of the following treatments: 250 mL of low-fat milk (LFM) three times/day, 250 mL of low-fat milk with micronutrients (LFM+M) three times/day, or no milk (control group, CON). Weight, height, and hip and waist circumferences were measured at baseline and every 4 weeks. Body composition measured by dual-energy x-ray absorptiometry, blood pressure, and blood analysis were done at baseline and at the end of the 16 weeks. Changes in weight and body composition. One-factor analysis of variance, adjusted for age, baseline values, and community random effects. After the 16-week intervention, participants in the LFM+M group lost significantly more weight (-5.1 kg; 95% CI: -6.2 to -4.1) compared with LFM (-3.6 kg; 95% CI: -4.7 to -2.6) and CON (-3.2 kg; 95% CI: -4.3 to -2.2) group members (P=0.035). Body mass index change in the LFM+M group (-2.3; 95% CI: -2.7 to -1.8) was significantly greater than in LFM group members (-1.5; 95% CI: -2.0 to -1.1) and CON group members (-1.4; 95% CI: -1.9 to -0.9) (P=0.022). Change in percent body fat among LFM+M group members (-2.7%; 95% CI: -3.2 to -2.1) was significantly higher than in LFM group members (-1.8%; 95% CI: -2.3 to -1.3) and CON group members (-1.6%; 95% CI: -2.2 to -1.0) (P=0.019). Change in bone mineral content was significantly higher in LFM group members (29 mg; 95% CI: 15 to 44) and LFM+M group members (27 mg; 95% CI: 13 to 41) compared with CON group members (-2 mg; 95% CI: -17 to 14) (P=0.007). No differences were found between groups in glucose level, blood lipid profile, C-reactive protein level, or blood pressure. Intake of LFM+M increases the effectiveness of an energy-restricted diet to treat obesity, but had no effect on blood lipid levels, glucose levels, C-reactive protein, or blood pressure. Copyright © 2011 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  20. Investigation on the oscillation modes in a thermoacoustic Stirling prime mover: mode stability and mode transition

    NASA Astrophysics Data System (ADS)

    Yu, Z. B.; Li, Q.; Chen, X.; Guo, F. Z.; Xie, X. J.; Wu, J. H.

    2003-12-01

    The purpose of this paper is to investigate the stability of oscillation modes in a thermoacoustic Stirling prime mover, which is a combination of a looped tube and a resonator. Two modes, with oscillation frequencies of 76 and 528 Hz, have been observed, and their stabilities differ widely. The stability of the high frequency mode (HFM) is strongly affected by the low frequency mode (LFM). Once the LFM is excited while the HFM is present, the HFM is gradually slaved and suppressed by the LFM. The details of the transition from HFM to LFM are described. The stability curves of the two modes have been measured. The mean pressure Pm is an important control parameter influencing mode stability in the tested system.

  1. Probing fibronectin–antibody interactions using AFM force spectroscopy and lateral force microscopy

    PubMed Central

    Kulik, Andrzej J; Lee, Kyumin; Pyka-Fościak, Grazyna; Nowak, Wieslaw

    2015-01-01

    The first experiment showing the effects of specific interaction forces using lateral force microscopy (LFM) was demonstrated for lectin–carbohydrate interactions some years ago. Such measurements are possible under the assumption that specific forces strongly dominate over non-specific ones. However, obtaining quantitative results requires the complex and tedious calibration of the torsional force. Here, a new and relatively simple method for the calibration of the torsional force is presented. The proposed calibration method is validated through the measurement of the interaction forces between human fibronectin and its monoclonal antibody. The results obtained using LFM and classical AFM-based force spectroscopy showed similar unbinding forces recorded at similar loading rates. Our studies verify that the proposed lateral force calibration method can be applied to study single-molecule interactions. PMID:26114080

  2. Analysis of gait patterns pre- and post- Single Event Multilevel Surgery in children with Cerebral Palsy by means of Offset-Wise Movement Analysis Profile and Linear Fit Method.

    PubMed

    Ancillao, Andrea; van der Krogt, Marjolein M; Buizer, Annemieke I; Witbreuk, Melinda M; Cappa, Paolo; Harlaar, Jaap

    2017-10-01

    Gait analysis is used for the assessment of the walking ability of children with cerebral palsy (CP), to inform clinical decision making and to quantify changes after treatment. To simplify gait analysis interpretation and to quantify deviations from normality, quantitative synthetic descriptors have been developed over the years, such as the Movement Analysis Profile (MAP) and the Linear Fit Method (LFM), but their interpretation is not always straightforward. The aims of this work were to: (i) study gait changes, by means of synthetic descriptors, in children with CP who underwent Single Event Multilevel Surgery; (ii) compare the MAP and the LFM on these patients; and (iii) design a new index that may overcome the limitations of the previous methods, i.e., the lack of information about the direction of deviation or its source. Gait analysis exams of 10 children with CP, pre- and post-surgery, were collected, and the MAP and LFM were computed. A new index was designed as a modified version of the MAP that separates out changes in offset (named OC-MAP). The MAP documented an improvement in the gait pattern after surgery. The largest effect was observed for the knee flexion/extension angle. However, a worsening was observed as an increase in anterior pelvic tilt. An important source of gait deviation was recognized in the offset between observed tracks and the reference. OC-MAP allowed the assessment of the offset component versus the shape component of deviation. The LFM provided results similar to the OC-MAP offset analysis but could not be considered reliable due to intrinsic limitations. As offsets in gait features played an important role in gait deviation, OC-MAP synthetic analysis was proposed as a novel approach to a meaningful parameterisation of global deviations in the gait patterns of subjects with CP and of gait changes after treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
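
    The paper defines OC-MAP precisely; as a rough sketch of the underlying idea only, the following separates a gait track's deviation from a reference into an offset component (a constant shift) and a shape component (RMS deviation after removing the shift), on synthetic data with hypothetical names.

      import numpy as np

      def offset_and_shape_rms(observed, reference):
          """Split deviation into a mean offset and a residual shape RMS."""
          diff = observed - reference
          offset = diff.mean()                      # constant shift over the cycle
          shape_rms = np.sqrt(((diff - offset) ** 2).mean())
          return offset, shape_rms

      cycle = np.linspace(0, 2 * np.pi, 101)
      reference = 30 * np.sin(cycle)                # e.g., knee flexion/extension (deg)
      observed = 30 * np.sin(cycle) + 10            # same shape, 10-degree offset
      print(offset_and_shape_rms(observed, reference))   # -> (10.0, ~0.0)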

  3. The mammalian response to lunar particulates.

    NASA Technical Reports Server (NTRS)

    Holland, J. M.; Simmonds, R. C.

    1973-01-01

    The response of germfree mice to subcutaneous and intraperitoneal injection of aqueous suspensions of lunar fine material (LFM) was evaluated. Both uninjected mice and mice injected with dry-heat-sterilized LFM were included as controls. After injection, the majority of mice were subjected to serial sacrifice to assess the time course of the tissue response. A smaller group of animals was held for lifespan studies. The observations suggest that LFM is relatively insoluble in tissue and that, while acting as a low-grade irritant, it has little tendency to evoke reactive fibrosis.

  4. Urban Sprawl and Wildfire Danger along the Wildland-Urban Interface

    NASA Astrophysics Data System (ADS)

    Nghiem, S. V.; Kafatos, M.; Myoung, B.

    2015-12-01

    Urban sprawl has created an extensive wildland-urban interface (WUI) where urban areas encroach well into wilderness that is highly susceptible to wildfire danger. To monitor urbanization along the WUI, an innovative approach based on the Dense Sampling Method with the Rosette Transform (DSM-RT) enables the use of satellite scatterometer data to obtain observations without gaps in time and space at 1-km posting in the decade of the 2000s. To explain how the satellite signature processed with DSM-RT represents physical urban infrastructure, the case of the mega-city of Los Angeles is presented, with the DSM-RT satellite image overlaid on three-dimensional buildings and the road network from the commercial and industrial core of the city to the residential suburbs extending into the wild land. The rate of urban development in the 2000s, in terms of physical urban infrastructure change rather than arbitrary boundaries defined by administrative or legislative measures, is then evaluated for 14 cities along the San Gabriel Mountains in California to rank the degree of urbanization along the local WUI, which may increase the probability of fire ignitions and fire impacts. Moreover, the Enhanced Vegetation Index (EVI) from the MODIS Aqua satellite is used to estimate live fuel moisture (LFM) conditions around the WUI and to evaluate fire danger levels, consistent with the specific definition currently used by fire agencies in making proactive real-life fire-preparedness decisions before fire occurrence. As an example, a map of EVI-derived LFM for the Colby Fire in 2014, showing a complex spatial pattern of LFM reduction along an extensive WUI, illustrates the satellite advantage in monitoring LFM over the vast wild land in Southern California. Since the method is based on global satellite data, it is applicable to fire-prone regions across the world.

  5. Magnetic field experiment for Voyagers 1 and 2

    NASA Technical Reports Server (NTRS)

    Behannon, K. W.; Acuna, M. H.; Burlaga, L. F.; Lepping, R. P.; Ness, N. F.; Neubauer, F. M.

    1977-01-01

    The magnetic field experiment to be carried on the Voyager 1 and 2 missions consists of dual low field (LFM) and high field magnetometer (HFM) systems. The dual systems provide greater reliability and, in the case of the LFM's, permit the separation of spacecraft magnetic fields from the ambient fields. Additional reliability is achieved through electronics redundancy. The wide dynamic ranges of plus or minus 0.5G for the LFM's and plus or minus 20G for the HFM's, low quantization uncertainty of plus or minus 0.002 gamma in the most sensitive (plus or minus 8 gamma) LFM range, low sensor RMS noise level of 0.006 gamma, and use of data compaction schemes to optimize the experiment information rate all combine to permit the study of a broad spectrum of phenomena during the mission. Planetary fields at Jupiter, Saturn, and possibly Uranus; satellites of these planets; solar wind and satellite interactions with the planetary fields; and the large-scale structure and microscale characteristics of the interplanetary magnetic field are studied. The interstellar field may also be measured.

  6. Fire risk in California

    NASA Astrophysics Data System (ADS)

    Peterson, Seth Howard

    Fire is an integral part of ecosystems in the western United States. Decades of fire suppression have led to unnaturally large accumulations of fuel in some forest communities, such as the lower-elevation forests of the Sierra Nevada. Urban sprawl into fire-prone chaparral vegetation in southern California has put human lives at risk, and the decreased fire return intervals have put the vegetation community at risk of type conversion. This research examines the factors affecting fire risk in two of the dominant landscapes in the state of California: chaparral and inland coniferous forests. Live fuel moisture (LFM) is important for fire ignition, spread rate, and intensity in chaparral. LFM maps were generated for Los Angeles County by developing and then inverting robust cross-validated regression equations relating time series of field data to vegetation indices (VIs) and phenological metrics from MODIS data. Fire fuels, including understory fuels that are not visible to remote sensing instruments, were mapped in Yosemite National Park using the random forests decision tree algorithm with climatic, topographic, remotely sensed, and fire history variables. Combining the disparate data sources served to improve classification accuracies. The models were inverted to produce maps of fuel models and fuel amounts, and these showed that fuel amounts are highest in the low-elevation forests that have been most affected by fire suppression and its impact on the natural fire regime. Wildland fires in chaparral commonly burn in late summer or fall, when LFM is near its annual low; however, the Jesusita Fire burned in early May of 2009, when LFM was still relatively high. The HFire fire spread model was used to simulate the growth of the Jesusita Fire using LFM maps derived from imagery acquired at the time of the fire and from imagery acquired in late August, to determine how different the fire would have been had it occurred later in the year. Simulated fires were 1.5 times larger, and the fire reached the wildland-urban interface three hours earlier when using August LFM.
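
    The LFM map-making step described above (fit a regression of field-measured LFM on a vegetation index, then apply it per pixel) can be sketched as follows; the data are synthetic and the dissertation's robust, cross-validated models are not reproduced.

      import numpy as np

      # Synthetic field samples: VI values and coincident LFM measurements (%).
      vi_field = np.array([0.20, 0.30, 0.40, 0.50, 0.60])
      lfm_field = np.array([55.0, 70.0, 88.0, 101.0, 118.0])

      slope, intercept = np.polyfit(vi_field, lfm_field, 1)   # simple linear fit

      vi_image = np.random.uniform(0.2, 0.6, size=(100, 100)) # stand-in VI map
      lfm_map = slope * vi_image + intercept                  # per-pixel LFM (%)
      print(round(slope, 1), round(intercept, 1), round(lfm_map.mean(), 1))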

  7. Betty Petersen Memorial Library - NCWCP Publications - NWS

    Science.gov Websites

    Fragmentary entries from the NCWCP publications index (recoverable items only): 254 (1982), Smith, W., "1-2 Day Comparative BWB and LFM Threat Scores and Bias, 1971-1982" (.PDF file); Stackpole, J. and Tracton, M. S., "Comparative Evaluation of ECMWF and NMC Spectral Forecasts, February-July 1982" (.PDF file); 269 (1983), Smith, W., "1-2 Day Comparative BWB and LFM Threat Scores, No Precipitation Threat Scores".

  8. Microbial identification and automated antibiotic susceptibility testing directly from positive blood cultures using MALDI-TOF MS and VITEK 2.

    PubMed

    Wattal, C; Oberoi, J K

    2016-01-01

    The study addresses the utility of Matrix-Assisted Laser Desorption/Ionisation Time-Of-Flight mass spectrometry (MALDI-TOF MS) using VITEK MS and the VITEK 2 antimicrobial susceptibility testing (AST) system for direct identification (ID) and timely AST from positive blood culture bottles using a lysis-filtration method (LFM). Between July and December 2014, a total of 140 non-duplicate mono-microbial blood cultures were processed. An aliquot of positive blood culture broth was incubated with lysis buffer before the bacteria were filtered and washed. Micro-organisms recovered from the filter were first identified using VITEK MS, and a suspension of the recovered organisms was used for direct AST by VITEK 2 once the ID was known. Direct ID and AST results were compared with classical methods using solid growth. Of the 140 bottles tested, VITEK MS gave correct identification to the genus and/or species level in 70.7%. For the 103 bottles where identification was possible, there was agreement with classical culture in 97 samples (94.17%). Compared to the routine method, the direct AST resulted in category agreement in 860 (96.5%) of 891 bacteria-antimicrobial agent combinations tested. The results of direct ID and AST were available, on average, 16.1 hours before those of the standard approach. The combined use of VITEK MS and VITEK 2 directly on samples from positive blood culture bottles using an LFM technique can give rapid and reliable ID and AST results in bloodstream infections, enabling early institution of targeted treatment. The combination of LFM and AST using VITEK 2 was found to reliably expedite AST.

  9. Draft Genome Sequence of Pseudomonas sp. Strain LFM046, a Producer of Medium-Chain-Length Polyhydroxyalkanoate

    PubMed Central

    Cardinali-Rezende, Juliana; Alexandrino, Paulo Moises Raduan; Nahat, Rafael Augusto Theodoro Pereira de Souza; Sant’Ana, Débora Parrine Vieira; Silva, Luiziana Ferreira; Gomez, José Gregório Cabrera

    2015-01-01

    Pseudomonas sp. LFM046 is a medium-chain-length polyhydroxyalkanoate (PHAMCL) producer capable of using various carbon sources (carbohydrates, organic acids, and vegetable oils) and was first isolated from sugarcane cultivation soil in Brazil. The genome sequence was found to be 5.97 Mb long with a G+C content of 66%. PMID:26294616

  10. Estimating the delay-Doppler of target echo in a high clutter underwater environment using wideband linear chirp signals: Evaluation of performance with experimental data.

    PubMed

    Yu, Ge; Yang, T C; Piao, Shengchun

    2017-10-01

    A chirp signal is a signal with linearly varying instantaneous frequency over the signal bandwidth, also known as a linear frequency modulated (LFM) signal. It is widely used in communication, radar, active sonar, and other applications due to its Doppler tolerance in signal detection using matched filter (MF) processing. Modern sonar uses high-gain, wideband signals to improve the signal-to-reverberation ratio. High gain implies a high product of the signal bandwidth and duration. However, wideband and/or long-duration LFM signals are no longer Doppler tolerant. The shortcomings of standard MF processing are loss of performance and bias in range estimation. This paper uses the wideband ambiguity function and the fractional Fourier transform method to estimate the target velocity and restore the performance. Target velocity, or Doppler, provides a clue for differentiating the target from the background reverberation and clutter. The methods are applied to simulated and experimental data.
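
    One way to see why wideband LFM loses Doppler tolerance, and how velocity can be recovered, is the wideband ambiguity picture: Doppler acts as a time scaling eta = 1 + 2v/c, so the echo can be correlated against a bank of time-scaled replicas. The sketch below does exactly that on a noiseless synthetic echo; the paper's fractional Fourier transform estimator is not reproduced, and all parameters are illustrative.

      import numpy as np

      c, fs, T, B = 1500.0, 48e3, 0.5, 2e3      # sound speed (m/s), fs, duration, bandwidth
      t = np.arange(int(fs * T)) / fs
      replica = lambda eta: np.exp(1j * np.pi * (B / T) * (eta * t) ** 2)

      v_true = 5.0                              # target radial speed (m/s)
      echo = replica(1 + 2 * v_true / c)        # Doppler modeled as time scaling

      vs = np.linspace(-10, 10, 81)             # candidate speeds (m/s)
      scores = [abs(np.vdot(replica(1 + 2 * v / c), echo)) for v in vs]
      print(vs[int(np.argmax(scores))])         # -> 5.0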

  11. Modeling and Analysis of Target Echo and Clutter in Range-Dependent Bistatic Environments: FY14 Annual Report for ONR

    DTIC Science & Technology

    2014-09-30

    with energy source level ESL of 198.4 dB; the omnidirectional results were reduced by the effective reverberation response [EP09] of 19.7 dB. The...analysis and improved environmental inputs. Similar graphs (not shown) were obtained for the 1900–2000 Hz LFM and 2700–2800 Hz LFM, using ESLs of

  12. Signal recognition and parameter estimation of BPSK-LFM combined modulation

    NASA Astrophysics Data System (ADS)

    Long, Chao; Zhang, Lin; Liu, Yu

    2015-07-01

    Intra-pulse analysis plays an important role in electronic warfare. Intra-pulse feature extraction focuses on primary parameters such as instantaneous frequency, modulation, and symbol rate. In this paper, automatic modulation recognition and feature extraction for combined BPSK-LFM modulation signals based on a decision-theoretic approach are studied. The simulation results show good recognition performance and high estimation precision, and the system is easy to implement.

  13. Draft Genome Sequence of Pseudomonas sp. Strain LFM046, a Producer of Medium-Chain-Length Polyhydroxyalkanoate.

    PubMed

    Cardinali-Rezende, Juliana; Alexandrino, Paulo Moises Raduan; Nahat, Rafael Augusto Theodoro Pereira de Souza; Sant'Ana, Débora Parrine Vieira; Silva, Luiziana Ferreira; Gomez, José Gregório Cabrera; Taciro, Marilda Keico

    2015-08-20

    Pseudomonas sp. LFM046 is a medium-chain-length polyhydroxyalkanoate (PHAMCL) producer capable of using various carbon sources (carbohydrates, organic acids, and vegetable oils) and was first isolated from sugarcane cultivation soil in Brazil. The genome sequence was found to be 5.97 Mb long with a G+C content of 66%. Copyright © 2015 Cardinali-Rezende et al.

  14. Evaluation of Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Moderate Resolution Imaging Spectrometer (MODIS) measures of live fuel moisture and fuel condition in a shrubland ecosystem in southern California

    Treesearch

    D. A. Roberts; P.E. Dennison; S. Peterson; S. Sweeney; J. Rechel

    2006-01-01

    Dynamic changes in live fuel moisture (LFM) and fuel condition modify fire danger in shrublands. We investigated the empirical relationship between field-measured LFM and remotely sensed greenness and moisture measures from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Moderate Resolution Imaging Spectrometer (MODIS). Key goals were to assess the...

  15. Liquid Film Migration in Warm Formed Aluminum Brazing Sheet

    NASA Astrophysics Data System (ADS)

    Benoit, M. J.; Whitney, M. A.; Wells, M. A.; Jin, H.; Winkler, S.

    2017-10-01

    Warm forming has previously proven to be a promising manufacturing route to improve the formability of Al brazing sheets used in automotive heat exchanger production; however, the impact of warm forming on subsequent brazing has not previously been studied. In particular, the interaction between liquid clad and solid core alloys during brazing through the process of liquid film migration (LFM) requires further understanding. Al brazing sheet comprising an AA3003 core and an AA4045 clad alloy, supplied in O and H24 tempers, was stretched between 0 and 12 pct strain, at room temperature and at 523 K (250 °C), to simulate warm forming. Brazeability was predicted through thermal and microstructure analysis. The rate of solid-liquid interactions was quantified using thermal analysis, while microstructure analysis was used to investigate the opposing processes of LFM and core alloy recrystallization during brazing. In general, liquid clad was consumed relatively rapidly and LFM occurred in forming conditions where the core alloy did not recrystallize during brazing. The results showed that warm forming could potentially impair the brazeability of O temper sheet by extending the regime over which LFM occurs during brazing. No change in microstructure or thermal data was found for H24 sheet when the forming temperature was increased, and thus warm forming was not predicted to adversely affect the brazing performance of H24 sheet.

  16. Wildfire Danger Potential in California

    NASA Astrophysics Data System (ADS)

    Kafatos, M.; Myoung, B.; Kim, S. H.; Fujioka, F. M.; Kim, J.

    2015-12-01

    Wildfires are an important concern in California (CA), which is characterized by semi-arid to arid climate and vegetation types. Highly variable winter precipitation and an extended hot and dry warm season in the region challenge effective strategic fire management. Climatologically, the fire season, defined by live fuel moisture (LFM) generally below 80% in Los Angeles County, spans four months from mid-July to mid-November, but it has lasted over seven months in the past several years. This behavior is primarily due to the ongoing drought in CA during the last decade, which is responsible for frequent outbreaks of severe wildfires in the region. Despite their importance, scientific understanding of the recent changes in wildfire risk and effective assessments of that risk are lacking. In the present study, we show the impacts of large-scale atmospheric circulations on an early start and extended length of fire seasons. For example, the strong relationships of the North Atlantic Oscillation (NAO) with springtime temperature and precipitation in the SWUS that were recently revealed by our team members led us to examine the possible impact of the NAO on wildfire danger in the spring. Our results show that the abnormally warm and dry spring conditions associated with positive NAO phases can cause an early start of the fire season and high fire risks throughout the summer and fall. For an effective fire danger assessment, we have tested the capability of satellite vegetation indices (VIs) to replicate in situ LFM of Southern CA chaparral ecosystems by 1) comparing seasonal/interannual characteristics of in situ LFM with VIs and 2) developing an empirical model function of LFM. Unlike previous studies attempting a point-to-point comparison, we examine the LFM relationship with VIs averaged over different areal coverages with chamise-dominant grids (i.e., 0.5 km to 25 km radius circles). Lastly, we discuss implications of the results for fire danger assessment and prediction.

  17. The Role of Membrane-Derived Second Messengers and Bmx/Etk in Response to Radiation Treatment of Prostate Cancer

    DTIC Science & Technology

    2009-01-01

    promising pharmacologic target for radiation enhancement. Although LFM-A13 is clinically used as a Btk inhibitor, many groups have used LFM-A13 as a Bmx... inhibitor due to the high homology between Bmx and Btk. Because Btk is only found in bone marrow-derived cells, we felt that LFM-A13 could be used...discovered a number of selective irreversible Btk inhibitors aimed at treating rheumatoid arthritis. Moreover, CGI Pharmaceuticals, Inc. has been developing

  18. On the local field method with the account of spatial dispersion. Application to the optical activity theory

    NASA Astrophysics Data System (ADS)

    Tyu, N. S.; Ekhilevsky, S. G.

    1992-07-01

    For perfect molecular crystals, the equations of the local field method (LFM) with spatial dispersion taken into account are formulated. They are used to derive an expression for the crystal polarizability tensor. For the first time within the framework of this method, a formula for the gyrotropy tensor of an arbitrary optically active molecular crystal is obtained. This formula is an analog of the well-known Lorentz-Lorenz relations.

  19. Simulating Sources of Superstorm Plasmas

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching

    2008-01-01

    We evaluated the contributions to magnetospheric pressure (ring current) of the solar wind, polar wind, auroral wind, and plasmaspheric wind, with the surprising result that the main phase pressure is dominated by plasmaspheric protons. We used global simulation fields from the LFM single-fluid ideal MHD model. We embedded the Comprehensive Ring Current Model (CRCM) within it, driven by the LFM transpolar potential, and supplied with plasmas at its boundary including solar wind protons, polar wind protons, auroral wind O+, and plasmaspheric protons. We included auroral outflows and acceleration driven by the LFM ionospheric boundary condition, including parallel ion acceleration driven by upward currents. Our plasmasphere model runs within the CRCM and is driven by it. Ionospheric sources were treated using our Global Ion Kinetics code based on the full equations of motion. This treatment neglects inertial loading and pressure exerted by the ionospheric plasmas, and will be superseded by multifluid simulations that include those effects. However, these simulations provide new insights into the respective roles of ionospheric sources in storm-time magnetospheric dynamics.

  20. Health-Related Quality-of-Life Outcomes: A Reflexology Trial With Patients With Advanced-Stage Breast Cancer

    PubMed Central

    Wyatt, Gwen; Sikorskii, Alla; Rahbar, Mohammad Hossein; Victorson, David; You, Mei

    2013-01-01

    Purpose/Objectives To evaluate the safety and efficacy of reflexology, a complementary therapy that applies pressure to specific areas of the feet. Design Longitudinal, randomized clinical trial. Setting Thirteen community-based medical oncology clinics across the midwestern United States. Sample A convenience sample of 385 predominantly Caucasian women with advanced-stage breast cancer receiving chemotherapy and/or hormonal therapy. Methods Following the baseline interview, women were randomized into three primary groups: reflexology (n = 95), lay foot manipulation (LFM) (n = 95), or conventional care (n = 96). Two preliminary reflexology (n = 51) and LFM (n = 48) test groups were used to establish the protocols. Participants were interviewed again postintervention at study weeks 5 and 11. Main Research Variables Breast cancer–specific health-related quality of life (HRQOL), physical functioning, and symptoms. Findings No adverse events were reported. A longitudinal comparison revealed significant improvements in physical functioning for the reflexology group compared to the control group (p = 0.04). Severity of dyspnea was reduced in the reflexology group compared to the control group (p < 0.01) and the LFM group (p = 0.02). No differences were found on breast cancer–specific HRQOL, depressive symptomatology, state anxiety, pain, and nausea. Conclusions Reflexology may be added to existing evidence-based supportive care to improve HRQOL for patients with advanced-stage breast cancer during chemotherapy and/or hormonal therapy. Implications for Nursing Reflexology can be recommended for safety and usefulness in relieving dyspnea and enhancing functional status among women with advanced-stage breast cancer. PMID:23107851

  1. Nutrient Intake, Diet Quality, and Weight Measures in Breakfast Patterns Consumed by Children Compared with Breakfast Skippers: NHANES 2001-2008.

    PubMed

    O'Neil, Carol E; Nicklas, Theresa A; Fulgoni, Victor L

    2015-01-01

    Most studies showing that children who consume breakfast have better nutrient intakes, better diet quality, and lower weight than breakfast skippers rest on the incorrect premise that breakfast meals are homogeneous. The purpose of this study was to classify breakfast meals into patterns and determine the association of the breakfast patterns with daily and breakfast nutrient intakes, diet quality, and weight. Data from children (2-18 years of age; N = 14,200) participating in the National Health and Nutrition Examination Survey 2001-2008 were used. Intake was determined from one-day 24-hour dietary recalls. Diet quality was measured using the Healthy Eating Index-2005 (HEI-2005). Body mass index (BMI) z-scores were determined. Twelve patterns (including No Breakfast [∼19% of population]), explaining 63% of the variance in energy from breakfast, were examined. Covariate-adjusted general linear models were used to compare outcome variables of consumers of different patterns with breakfast skippers. The p value was Bonferroni corrected (0.05/12 ≈ 0.0042). Consumers of the Eggs/Grain/Meat, Poultry, Fish (MPF)/Fruit Juice (FJ) and MPF/Grain/FJ patterns showed higher daily intakes of saturated fats, solid fats, and sodium and lower daily intakes of added sugars than breakfast skippers. Consumers of most breakfast patterns showed higher daily intakes of some nutrients of public health concern (dietary fiber, vitamin D, calcium, and potassium); however, those consuming the Grain or MPF/Grain/FJ pattern did not. Consumers of the Grain/Lower Fat Milk (LFM)/Sweets/FJ, Presweetened (PS) Ready-to-eat Cereal (RTEC)/LFM, RTEC/LFM, Cooked Cereal/Milk/FJ, and Whole Fruit patterns had higher total HEI-2005 scores than breakfast skippers; those consuming the MPF/Grain/FJ pattern had lower diet quality than breakfast skippers. Consumption of the Grain/LFM/Sweets/FJ, PSRTEC/whole milk, Soft Drinks/FJ/Grain/Potatoes, RTEC/whole milk, and Cooked Cereal/Milk/FJ patterns was associated with lower BMI z-scores than seen in breakfast skippers. There are dietary and weight advantages to consuming breakfast, especially breakfasts that include grains, cereals, LFM, and fruit/FJ, in contrast to the potential adverse effects of skipping breakfast.
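
    As a rough illustration of the covariate-adjusted comparison described above, the sketch below fits a general linear model of diet quality on breakfast pattern with a Bonferroni-corrected threshold. The file name, column names, and covariates are hypothetical, and NHANES survey weights are omitted for brevity.

        # Hedged sketch: covariate-adjusted GLM of HEI-2005 score on breakfast pattern.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("nhanes_breakfast.csv")   # hypothetical extract of the analysis file
        # Breakfast skippers ('No Breakfast') as the reference category.
        model = smf.ols("hei2005 ~ C(pattern, Treatment('No Breakfast')) + age + C(sex)",
                        data=df).fit()
        alpha = 0.05 / 12                          # Bonferroni correction over 12 patterns
        print(model.pvalues[model.pvalues < alpha])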

  2. Global MHD modeling of resonant ULF waves: Simulations with and without a plasmasphere.

    PubMed

    Claudepierre, S G; Toffoletto, F R; Wiltberger, M

    2016-01-01

    We investigate the plasmaspheric influence on the resonant mode coupling of magnetospheric ultralow frequency (ULF) waves using the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamic (MHD) model. We present results from two different versions of the model, both driven by the same solar wind conditions: one version that contains a plasmasphere (the LFM coupled to the Rice Convection Model, where the Gallagher plasmasphere model is also included) and another that does not (the stand-alone LFM). We find that the inclusion of a cold, dense plasmasphere has a significant impact on the nature of the simulated ULF waves. For example, the inclusion of a plasmasphere leads to a deeper (more earthward) penetration of the compressional (azimuthal) electric field fluctuations, due to a shift in the location of the wave turning points. Consequently, the locations where the compressional electric field oscillations resonantly couple their energy into local toroidal mode field line resonances also shift earthward. We also find, in both simulations, that higher-frequency compressional (azimuthal) electric field oscillations penetrate deeper than lower frequency oscillations. In addition, the compressional wave mode structure in the simulations is consistent with a radial standing wave oscillation pattern, characteristic of a resonant waveguide. The incorporation of a plasmasphere into the LFM global MHD model represents an advance in the state of the art in regard to ULF wave modeling with such simulations. We offer a brief discussion of the implications for radiation belt modeling techniques that use the electric and magnetic field outputs from global MHD simulations to drive particle dynamics.

  3. COUPLING OF CORONAL AND HELIOSPHERIC MAGNETOHYDRODYNAMIC MODELS: SOLUTION COMPARISONS AND VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkin, V. G.; Lionello, R.; Linker, J.

    2016-11-01

    Two well-established magnetohydrodynamic (MHD) codes are coupled to model the solar corona and the inner heliosphere. The corona is simulated using the MHD algorithm outside a sphere (MAS) model. The Lyon-Fedder-Mobarry (LFM) model is used in the heliosphere. The interface between the models is placed in a spherical shell above the critical point and allows both models to work in either a rotating or an inertial frame. Numerical tests are presented examining the coupled model solutions from 20 to 50 solar radii. The heliospheric simulations are run with both LFM and the MAS extension into the heliosphere, and both use the same polytropic coronal MAS solutions as the inner boundary condition. The coronal simulations are performed for idealized magnetic configurations, with an out-of-equilibrium flux rope inserted into an axisymmetric background, with and without solar rotation. The temporal evolution at the inner boundary of the LFM and MAS solutions is shown to be nearly identical, as are the steady-state background solutions prior to the insertion of the flux rope. However, after the coronal mass ejection has propagated through a significant portion of the simulation domain, the heliospheric solutions diverge. Additional simulations with different resolutions are then performed and show that the MAS heliospheric solutions approach those of LFM when run with progressively higher resolution. Following these detailed tests, a more realistic simulation driven by the thermodynamic coronal MAS is presented, which includes solar rotation and an azimuthally asymmetric background and extends to the Earth's orbit.

  4. Radar wideband digital beamforming based on time delay and phase compensation

    NASA Astrophysics Data System (ADS)

    Fu, Wei; Jiang, Defu

    2018-07-01

    In conventional phased array radars, analogue time delay devices and phase shifters have been used for wideband beamforming. These methods suffer from insertion losses, gain mismatches and delay variations, and they occupy a large chip area. To solve these problems, a compact architecture of digital array antennas based on subarrays was considered. In this study, the receiving beam patterns of wideband linear frequency modulation (LFM) signals were constructed by applying analogue stretch processing via mixing with delayed reference signals at the subarray level. Subsequently, narrowband digital time delaying and phase compensation of the tone signals were implemented with reduced arithmetic complexity. Due to the differences in amplitudes, phases and time delays between channels, severe performance degradation of the beam patterns occurred without corrections. To achieve good beamforming performance, array calibration was performed in each channel to adjust the amplitude, frequency and phase of the tone signal. Using a field-programmable gate array, wideband LFM signals and finite impulse response filters with continuously adjustable time delays were implemented in a polyphase structure. Simulations and experiments verified the feasibility and effectiveness of the proposed digital beamformer.
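
    A minimal sketch of the stretch (de-chirp) step described above: mixing the delayed wideband LFM against a reference chirp collapses the delay into a narrowband beat tone that can then be processed at a low rate. All parameter values here are illustrative, not the paper's.

        # Stretch processing of a delayed LFM: delay maps to a beat frequency f = -k*tau.
        import numpy as np

        fs = 2e9                     # sample rate (Hz)
        B, T = 500e6, 10e-6          # chirp bandwidth and duration
        k = B / T                    # chirp rate (Hz/s)
        t = np.arange(0, T, 1 / fs)

        tau = 50e-9                  # unknown channel delay (idealized echo model)
        rx = np.exp(1j * np.pi * k * (t - tau) ** 2)
        ref = np.exp(1j * np.pi * k * t ** 2)

        beat = rx * np.conj(ref)     # narrowband tone at f = -k*tau
        f = np.fft.fftfreq(t.size, 1 / fs)
        tone = f[np.argmax(np.abs(np.fft.fft(beat)))]
        print(f"beat tone {tone/1e6:.1f} MHz -> delay {-tone/k*1e9:.1f} ns")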

  5. Role of Bruton’s tyrosine kinase in myeloma cell migration and induction of bone disease

    PubMed Central

    Bam, Rakesh; Ling, Wen; Khan, Sharmin; Pennisi, Angela; Venkateshaiah, Sathisha Upparahalli; Li, Xin; van Rhee, Frits; Usmani, Saad; Barlogie, Bart; Shaughnessy, John; Epstein, Joshua; Yaccoby, Shmuel

    2014-01-01

    Myeloma cells typically grow in bone, recruiting osteoclast precursors and inducing their differentiation and activity in areas adjacent to tumor foci. Bruton's tyrosine kinase (BTK), of the TEC family, is expressed in hematopoietic cells and is particularly involved in B-lymphocyte function and osteoclastogenesis. We demonstrated BTK expression in clinical myeloma plasma cells, interleukin (IL)-6- or stroma-dependent cell lines, and osteoclasts. Stromal cell-derived factor-1 (SDF-1) induced BTK activation in myeloma cells, and BTK inhibition by small hairpin RNA or the small-molecule inhibitor LFM-A13 reduced their migration toward SDF-1. Pretreatment with LFM-A13 also reduced in vivo homing of myeloma cells to bone, as shown by bioluminescence imaging in the SCID-rab model. Enforced expression of BTK in a myeloma cell line enhanced cell migration toward SDF-1 but had no effect on short-term growth. BTK expression was correlated with cell-surface CXCR4 expression in myeloma cells (n = 33, r = 0.81, P < 0.0001), and BTK gene and protein expression was more profound in cell-surface CXCR4-expressing myeloma cells. BTK was not upregulated by IL-6, and its inhibition had no effect on IL-6 signaling in myeloma cells. Human osteoclast precursors also expressed BTK and cell-surface CXCR4 and migrated toward SDF-1. LFM-A13 suppressed the migration and differentiation of osteoclast precursors as well as the bone-resorbing activity of mature osteoclasts. In primary myeloma-bearing SCID-rab mice, LFM-A13 inhibited osteoclast activity, prevented myeloma-induced bone resorption, and moderately suppressed myeloma growth. These data demonstrate the association of BTK and cell-surface CXCR4 in myeloma cells and that BTK plays a role in myeloma cell homing to bone and in myeloma-induced bone disease. PMID:23456977

  6. Nutrient Intake, Diet Quality, and Weight Measures in Breakfast Patterns Consumed by Children Compared with Breakfast Skippers: NHANES 2001–2008

    PubMed Central

    O'Neil, Carol E.; Nicklas, Theresa A.; Fulgoni, Victor L.

    2015-01-01

    Most studies showing that children who consume breakfast have better nutrient intakes, better diet quality, and lower weight than breakfast skippers rest on the incorrect premise that breakfast meals are homogeneous. The purpose of this study was to classify breakfast meals into patterns and determine the association of the breakfast patterns with daily and breakfast nutrient intakes, diet quality, and weight. Data from children (2-18 years of age; N = 14,200) participating in the National Health and Nutrition Examination Survey 2001-2008 were used. Intake was determined from one-day 24-hour dietary recalls. Diet quality was measured using the Healthy Eating Index-2005 (HEI-2005). Body mass index (BMI) z-scores were determined. Twelve patterns (including No Breakfast [∼19% of population]), explaining 63% of the variance in energy from breakfast, were examined. Covariate-adjusted general linear models were used to compare outcome variables of consumers of different patterns with breakfast skippers. The p value was Bonferroni corrected (0.05/12 ≈ 0.0042). Consumers of the Eggs/Grain/Meat, Poultry, Fish (MPF)/Fruit Juice (FJ) and MPF/Grain/FJ patterns showed higher daily intakes of saturated fats, solid fats, and sodium and lower daily intakes of added sugars than breakfast skippers. Consumers of most breakfast patterns showed higher daily intakes of some nutrients of public health concern (dietary fiber, vitamin D, calcium, and potassium); however, those consuming the Grain or MPF/Grain/FJ pattern did not. Consumers of the Grain/Lower Fat Milk (LFM)/Sweets/FJ, Presweetened (PS) Ready-to-eat Cereal (RTEC)/LFM, RTEC/LFM, Cooked Cereal/Milk/FJ, and Whole Fruit patterns had higher total HEI-2005 scores than breakfast skippers; those consuming the MPF/Grain/FJ pattern had lower diet quality than breakfast skippers. Consumption of the Grain/LFM/Sweets/FJ, PSRTEC/whole milk, Soft Drinks/FJ/Grain/Potatoes, RTEC/whole milk, and Cooked Cereal/Milk/FJ patterns was associated with lower BMI z-scores than seen in breakfast skippers. There are dietary and weight advantages to consuming breakfast, especially breakfasts that include grains, cereals, LFM, and fruit/FJ, in contrast to the potential adverse effects of skipping breakfast. PMID:29546119

  7. Investigating country-specific music preferences and music recommendation algorithms with the LFM-1b dataset.

    PubMed

    Schedl, Markus

    2017-01-01

    Recently, the LFM-1b dataset has been proposed to foster research and evaluation in music retrieval and music recommender systems (Schedl, Proceedings of the ACM International Conference on Multimedia Retrieval (ICMR), New York, 2016). It contains more than one billion music listening events created by more than 120,000 users of Last.fm. Each listening event is characterized by artist, album, and track name, and further includes a timestamp. Basic demographic information and a selection of more elaborate listener-specific descriptors are included as well, for anonymized users. In this article, we detail LFM-1b's acquisition and content and compare it to existing datasets. We furthermore provide an extensive statistical analysis of the dataset, including basic properties of the item sets, demographic coverage, the distribution of listening events (e.g., over artists and users), and aspects related to music preference and consumption behavior (e.g., temporal features and the mainstreaminess of listeners). Exploiting country information of users and genre tags of artists, we also create taste profiles for populations and determine similar and dissimilar countries in terms of their populations' music preferences. Finally, we illustrate the dataset's usage in a simple artist recommendation task, whose results are intended to serve as a baseline against which more elaborate techniques can be assessed.
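
    The simple artist recommendation task mentioned above can be sketched as a user-based collaborative filter over play counts. The input file and column names below are hypothetical and do not reflect the dataset's actual schema.

        # Hedged sketch: nearest-neighbour artist recommendation from play counts.
        import numpy as np
        import pandas as pd

        plays = pd.read_csv("user_artist_playcounts.csv")  # hypothetical: user_id, artist_id, count
        mat = plays.pivot_table(index="user_id", columns="artist_id",
                                values="count", fill_value=0)
        X = mat.to_numpy(dtype=float)
        X /= np.linalg.norm(X, axis=1, keepdims=True) + 1e-12  # L2-normalize each user

        target = 0                            # row index of the seed user
        sims = X @ X[target]                  # cosine similarity to every user
        sims[target] = -1.0                   # exclude the seed user
        neighbors = np.argsort(sims)[-10:]    # ten most similar listeners
        scores = X[neighbors].sum(axis=0)
        scores[X[target] > 0] = -1.0          # drop artists the user already knows
        print(list(mat.columns[np.argsort(scores)[-5:][::-1]]))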

  8. A Dynamic Coupled Magnetosphere-Ionosphere-Ring Current Model

    NASA Astrophysics Data System (ADS)

    Pembroke, Asher

    In this thesis we describe a coupled model of Earth's magnetosphere that consists of the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) simulation, the MIX ionosphere solver, and the Rice Convection Model (RCM). We report some results of the coupled model using idealized inputs and model parameters. The algorithmic and physical components of the model are described, including the transfer of magnetic field information and plasma boundary conditions to the RCM and the return of ring current plasma properties to the LFM. Crucial aspects of the coupling include the restriction of the RCM to regions where the field-line-averaged plasma beta is ≤ 1, the use of a plasmasphere model, and the MIX ionosphere model. Compared to stand-alone MHD, the coupled model produces a substantial increase in ring current pressure and a reduction of the magnetic field near the Earth. In the ionosphere, stronger region-1 and region-2 Birkeland currents are seen in the coupled model, but with no significant change in the cross polar cap potential drop, while the region-2 currents shield the low-latitude convection potential. In addition, oscillations in the magnetic field are produced at geosynchronous orbit with the coupled code. The diagnostics of entropy and mass content indicate that these oscillations are associated with low-entropy flow channels moving in from the tail and may be related to bursty bulk flows and bubbles seen in observations. As with most complex numerical models, there is the ongoing challenge of untangling numerical artifacts and physics, and we find that while there is still much room for improvement, the results presented here are encouraging. Finally, we introduce several new methods for magnetospheric visualization and analysis, including a fluid-spatial volume for the RCM and a field-aligned analysis mesh for the LFM. The latter allows us to construct novel visualizations of flux tubes, drift surfaces, topological boundaries, and bursty bulk flows.

  9. A VARI-Based Relative Greenness from MODIS Data for Computing the Fire Potential Index

    NASA Technical Reports Server (NTRS)

    Schneider, P.; Roberts, D. A.; Kyriakidis, P. C.

    2008-01-01

    The Fire Potential Index (FPI) relies on relative greenness (RG) estimates from remote sensing data. The Normalized Difference Vegetation Index (NDVI), derived from NOAA Advanced Very High Resolution Radiometer (AVHRR) imagery, is currently used to calculate RG operationally. Here we evaluated an alternate measure of RG using the Visible Atmospherically Resistant Index (VARI) derived from Moderate Resolution Imaging Spectroradiometer (MODIS) data. VARI was chosen because it has previously been shown to have the strongest relationship with live fuel moisture (LFM) out of a wide selection of MODIS-derived indices in southern California shrublands. To compare MODIS-based NDVI-FPI and VARI-FPI, RG was calculated from a 6-year time series of MODIS composites and validated against in-situ observations of LFM as a surrogate for vegetation greenness. RG from both indices was then compared in terms of its performance for computing the FPI using historical wildfire data. Computed RG values were regressed against ground-sampled LFM at 14 sites within Los Angeles County. The results indicate that VARI-based RG consistently shows a stronger relationship with observed LFM than NDVI-based RG. With an average R2 of 0.727, compared to a value of only 0.622 for NDVI-RG, VARI-RG showed stronger relationships at 13 out of 14 sites. Based on these results, daily FPI maps were computed for the years 2001 through 2005 using both NDVI-RG and VARI-RG. These were then validated against 12,490 fire detections from the MODIS active fire product using logistic regression. Deviance of the logistic regression model was 408.8 for NDVI-FPI and 176.2 for VARI-FPI. The c-index was found to be 0.69 and 0.78, respectively. The results show that VARI-FPI outperforms NDVI-FPI in distinguishing between fire and no-fire events for historical wildfire data in southern California for the given time period.
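
    For concreteness, a sketch of the two greenness measures discussed above. VARI follows the usual band formulation of Gitelson et al.; the min-max relative greenness scaling is the standard form, assumed here since the abstract does not spell it out.

        import numpy as np

        def vari(green, red, blue):
            # Visible Atmospherically Resistant Index
            return (green - red) / (green + red - blue + 1e-12)

        def relative_greenness(index_now, index_min, index_max):
            # Position of the current index within its historical range, scaled 0-100.
            return 100.0 * (index_now - index_min) / (index_max - index_min + 1e-12)

        g, r, b = 0.12, 0.08, 0.05                    # toy surface reflectances for one pixel
        print(vari(g, r, b))                          # ~0.27
        print(relative_greenness(0.26, 0.05, 0.45))   # ~52.5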

  10. Do comorbid fibromyalgia diagnoses change after a functional restoration program in patients with chronic disabling occupational musculoskeletal disorders?

    PubMed

    Hartzell, Meredith M; Neblett, Randy; Perez, Yoheli; Brede, Emily; Mayer, Tom G; Gatchel, Robert J

    2014-08-01

    A retrospective study of prospectively collected data. To determine whether comorbid fibromyalgia, identified in patients with chronic disabling occupational musculoskeletal disorders (CDOMDs), resolves with a functional restoration program (FRP). Fibromyalgia involves widespread bodily pain and tenderness to palpation. In recent studies, 23% to 41% of patients with CDOMDs entering an FRP had comorbid fibromyalgia, compared with population averages of 2% to 5%. Few studies have examined whether fibromyalgia diagnoses resolve with any treatment, and none have investigated the diagnosis' responsiveness to an FRP. A consecutive cohort of patients with CDOMDs (82% with spinal disorders and all reporting chronic spinal pain) and comorbid fibromyalgia (N = 117) completed an FRP, which included quantitatively directed exercise progression and multimodal disability management. Diagnosis responsiveness, evaluated at discharge, created two groups: those who retained fibromyalgia and those who did not. These groups were compared with patients with chronic regional lumbar pain only (LO group, n = 87), lacking widespread pain and fibromyalgia. Of the patients with comorbid fibromyalgia, 59% (n = 69) retained the fibromyalgia diagnosis (RFM group) and 41% (n = 48) lost the fibromyalgia diagnosis (LFM group) at discharge. Although all three groups reported decreased pain intensity, disability, and depressive symptoms from admission to discharge, RFM patients reported higher symptom levels than the LFM and LO groups at discharge. The LFM and LO groups were statistically similar. At 1-year follow-up, LO patients demonstrated higher work retention than both fibromyalgia groups (p < 0.03). Despite a significant comorbid fibromyalgia prevalence in a cohort of patients with CDOMDs entering an FRP, 41% of patients with an initial fibromyalgia diagnosis no longer met diagnostic criteria for fibromyalgia at discharge and were indistinguishable from LO patients on pain, disability, and depression symptoms. However, both fibromyalgia groups (LFM and RFM) had lower work retention than LO patients 1 year later, suggesting that an FRP may suppress symptoms of fibromyalgia in a subset of patients, but that prolonged fibromyalgia-related disability may be more difficult to overcome.

  11. Agile waveforms for joint SAR-GMTI processing

    NASA Astrophysics Data System (ADS)

    Jaroszewski, Steven; Corbeil, Allan; McMurray, Stephen; Majumder, Uttam; Bell, Mark R.; Corbeil, Jeffrey; Minardi, Michael

    2016-05-01

    Wideband radar waveforms that employ spread-spectrum techniques were investigated and experimentally tested. The waveforms combine bi-phase coding with a traditional LFM chirp and are applicable to joint SAR-GMTI processing. After de-spreading, the received signals can be processed to support simultaneous GMTI and high-resolution SAR imaging missions by airborne radars. The spread-spectrum coding techniques can provide nearly orthogonal waveforms and offer enhanced operation in some environments by distributing the transmitted energy over a large instantaneous bandwidth. The LFM component offers the desired Doppler tolerance. In this paper, the waveforms are formulated and a shift-register approach for de-spreading the received signals is described. Hardware loop-back testing has shown the feasibility of using these waveforms in an experimental radar test bed.
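
    A minimal sketch of such a waveform: a bi-phase pseudorandom code overlaid on an LFM chirp, de-spread on receive by re-applying the code. The code length, bandwidth, and ideal loop-back channel are illustrative assumptions, not the paper's parameters.

        # Phase-coded LFM: chips^2 = 1, so multiplying by the code again recovers the chirp.
        import numpy as np

        rng = np.random.default_rng(0)
        fs, B, T = 1e9, 200e6, 20e-6
        t = np.arange(0, T, 1 / fs)
        chirp = np.exp(1j * np.pi * (B / T) * t ** 2)

        n_chips = 512
        code = rng.choice([-1.0, 1.0], n_chips)          # bi-phase spreading code
        chips = np.repeat(code, int(np.ceil(t.size / n_chips)))[:t.size]

        tx = chips * chirp                               # spread-spectrum transmit waveform
        rx = tx.copy()                                   # ideal loop-back channel
        despread = rx * chips                            # de-spreading recovers the pure LFM
        print(f"max reconstruction error: {np.max(np.abs(despread - chirp)):.2e}")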

  12. Photonics-based real-time ultra-high-range-resolution radar with broadband signal generation and processing.

    PubMed

    Zhang, Fangzheng; Guo, Qingshui; Pan, Shilong

    2017-10-23

    Real-time and high-resolution target detection is highly desirable in modern radar applications. Electronic techniques have encountered serious difficulties in the development of such radars, which strictly rely on a large instantaneous bandwidth. In this article, a photonics-based real-time high-range-resolution radar is proposed, with optical generation and processing of broadband linear frequency modulation (LFM) signals. A broadband LFM signal is generated in the transmitter by photonic frequency quadrupling, and the received echo is de-chirped to a low-frequency signal by photonic frequency mixing. The system can operate at a high frequency and a large bandwidth while enabling real-time processing by low-speed analog-to-digital conversion and digital signal processing. A conceptual radar is established. Real-time processing of an 8-GHz LFM signal is achieved with a sampling rate of 500 MSa/s. Accurate distance measurement is implemented with a maximum error of 4 mm within a range of ~3.5 meters. Detection of two targets is demonstrated with a range resolution as fine as 1.875 cm. We believe the proposed radar architecture is a reliable solution to overcome the limitations of current radars on operating bandwidth and processing speed, and it can hopefully be used in future radars for real-time, high-resolution target detection and imaging.
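
    The quoted resolution follows directly from the signal bandwidth. For the 8-GHz LFM:

        \Delta R = \frac{c}{2B} = \frac{3\times 10^{8}\ \mathrm{m/s}}{2\times 8\times 10^{9}\ \mathrm{Hz}} \approx 1.875\ \mathrm{cm}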

  13. Liposome encapsulated soy lecithin and cholesterol can efficiently replace chicken egg yolk in human semen cryopreservation medium.

    PubMed

    Mutalik, Srinivas; Salian, Sujith Raj; Avadhani, Kiran; Menon, Jyothsna; Joshi, Haritima; Hegde, Aswathi Raju; Kumar, Pratap; Kalthur, Guruprasad; Adiga, Satish Kumar

    2014-06-01

    Cryopreservation of spermatozoa plays a significant role in reproductive medicine and fertility preservation. Chicken egg yolk is used as an extender in the cryopreservation of human spermatozoa with glycerol egg yolk citrate (GEYC) buffered medium. Even though 50% survival of spermatozoa is generally achieved with this method, the risk of high levels of endotoxins and of pathogen transmission from chicken egg yolk is a matter of concern. In the present study we attempted to establish a chemically defined cryopreservation medium that can replace chicken egg yolk without affecting sperm survival. Ejaculates from 28 men were cryopreserved with GEYC-based freezing medium or with a liposome-encapsulated soy lecithin-cholesterol based freezing medium (LFM). The semen samples were subjected to rapid thawing after 14 days of storage in liquid nitrogen. Post-thaw analysis indicated significantly higher post-thaw motility and sperm survival in spermatozoa cryopreserved with LFM compared with the conventional GEYC freezing medium. A soy lecithin to cholesterol ratio of 80:20 with sucrose showed the highest post-thaw motility and survival of the compositions tested. In conclusion, a chemically defined cryopreservation medium with liposome-encapsulated soy lecithin and cholesterol can effectively replace chicken egg yolk in human semen cryopreservation medium without compromising post-thaw outcomes.

  14. Bile components and lecithin supplemented to plant based diets do not diminish diet related intestinal inflammation in Atlantic salmon.

    PubMed

    Kortner, Trond M; Penn, Michael H; Björkhem, Ingemar; Måsøval, Kjell; Krogdahl, Åshild

    2016-09-07

    The present study was undertaken to gain knowledge of the role of bile components and lecithin in the development of aberrations in digestive function, which seemingly have increased in Atlantic salmon in parallel with the increased use of plant ingredients in fish feed. Post-smolt Atlantic salmon were fed one of three basal diets for 77 days: a high fishmeal diet (HFM), a low fishmeal diet (LFM), or a diet with high-protein soybean meal (HPS). Five additional diets were made from the LFM diet by supplementing it with purified taurocholate (1.8%), bovine bile salt (1.8%), taurine (0.4%), lecithin (1.5%), or a mix of supplements (suppl mix) containing taurocholate (1.8%), cholesterol (1.5%) and lecithin (0.4%). Two additional diets were made from the HPS diet by supplementing it with bovine bile salt (1.8%) or the suppl mix. Body and intestinal weights were recorded, and blood, bile, intestinal tissues and digesta were sampled for evaluation of growth, nutrient metabolism, and intestinal structure and function. In comparison with fish fed the HFM diet, fish fed the LFM and HPS diets grew less and showed reduced plasma bile salt and cholesterol levels. Histological examination of the distal intestine showed signs of enteritis in both the LFM and HPS diet groups, though more pronounced in the HPS group. The HPS diet reduced digesta dry matter and the capacity of leucine aminopeptidase in the distal intestine. None of the dietary supplements improved endpoints regarding fish performance, gut function, or inflammation in the distal intestine; some endpoints rather indicated negative effects. Dietary supplementation with bile components or lecithin in general did not improve endpoints regarding performance or gut health in Atlantic salmon, in clear contrast to what has previously been reported for rainbow trout. Follow-up studies are needed to clarify whether lower levels of bile salts and cholesterol may give different, beneficial effects, or whether other supplements and combinations of supplements might prevent or ameliorate inflammation in the distal intestine.

  15. Physical properties and collapse force according to the z-position of poly-Si pattern using nano-tribology.

    PubMed

    Kim, Soo In; Lee, Chang Woo

    2011-02-01

    Nowadays, many researchers try to measure the collapse force of fine patterns. However, most studies use LFM to gauge it indirectly; LFM cannot measure the collapse force directly and is limited to horizontal forces. Thus, nano-scratch testing is suggested as a way to measure the collapse force directly. We used poly-Si patterns on a Si plate and varied the z-location of the pattern. From these experiments, the stiffness decreased as the depth from the surface increased, following a negative exponential curve; the elastic modulus also decreased. The collapse force of the poly-Si nano-patterns decreased as the depth increased beyond 30% from the surface; the maximum collapse force was 26.91 µN, and the pattern collapsed between the poly-Si and the plate.

  16. Health-related quality-of-life outcomes: a reflexology trial with patients with advanced-stage breast cancer.

    PubMed

    Wyatt, Gwen; Sikorskii, Alla; Rahbar, Mohammad Hossein; Victorson, David; You, Mei

    2012-11-01

    To evaluate the safety and efficacy of reflexology, a complementary therapy that applies pressure to specific areas of the feet. Longitudinal, randomized clinical trial. Thirteen community-based medical oncology clinics across the midwestern United States. A convenience sample of 385 predominantly Caucasian women with advanced-stage breast cancer receiving chemotherapy and/or hormonal therapy. Following the baseline interview, women were randomized into three primary groups: reflexology (n = 95), lay foot manipulation (LFM) (n = 95), or conventional care (n = 96). Two preliminary reflexology (n = 51) and LFM (n = 48) test groups were used to establish the protocols. Participants were interviewed again postintervention at study weeks 5 and 11. Breast cancer-specific health-related quality of life (HRQOL), physical functioning, and symptoms. No adverse events were reported. A longitudinal comparison revealed significant improvements in physical functioning for the reflexology group compared to the control group (p = 0.04). Severity of dyspnea was reduced in the reflexology group compared to the control group (p < 0.01) and the LFM group (p = 0.02). No differences were found on breast cancer-specific HRQOL, depressive symptomatology, state anxiety, pain, and nausea. Reflexology may be added to existing evidence-based supportive care to improve HRQOL for patients with advanced-stage breast cancer during chemotherapy and/or hormonal therapy. Reflexology can be recommended for safety and usefulness in relieving dyspnea and enhancing functional status among women with advanced-stage breast cancer.

  17. Postprandial Glycemic and Insulinemic Responses to Common Breakfast Beverages Consumed with a Standard Meal in Adults Who Are Overweight and Obese.

    PubMed

    Li, Jia; Janle, Elsa; Campbell, Wayne W

    2017-01-04

    Breakfast beverages with different nutrient compositions may affect postprandial glycemic control differently. We assessed the effects of consuming (1) common breakfast beverages (water, sugar-sweetened coffee, reduced-energy orange juice (OJ), and low-fat milk (LFM)); and (2) fat-free, low-fat, and whole milk with breakfast on postprandial plasma glucose and insulin responses in adults who were overweight/obese. Forty-six subjects (33F/13M, body mass index: 32.5 ± 0.7 kg/m², age: 50 ± 1 years, mean ± SEMs) consumed a standard sandwich with one of the six beverages on separate mornings in randomized order. The test beverages (except water) each contained 12 g digestible carbohydrate. Plasma glucose and insulin concentrations were measured from blood obtained pre- and post-meal at 30-min intervals for 4 h and incremental areas under the curve (AUC) were computed. We found (1) among different beverage types, glucose AUC was higher for coffee versus water, OJ, and LFM. Insulin AUC was higher for coffee and LFM versus OJ and water; (2) Glucose AUCs were not different among water and milks while insulin AUC was higher for milks versus water. In conclusion, consumption of water, reduced-energy OJ, or milk (irrespective of fat content) with a meal may be preferable to consuming sugar-sweetened coffee for glucose control in middle-aged adults who are overweight and obese.
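
    The incremental AUC is conventionally the trapezoidal area above the fasting baseline; the sketch below assumes that convention (the paper's exact variant is not stated here) and uses toy glucose values.

        import numpy as np

        t = np.array([0, 30, 60, 90, 120, 150, 180, 210, 240])             # minutes
        glucose = np.array([5.1, 6.8, 7.4, 6.9, 6.2, 5.8, 5.5, 5.3, 5.2])  # mmol/L

        above = np.clip(glucose - glucose[0], 0.0, None)  # ignore dips below baseline
        iauc = np.trapz(above, t)
        print(f"iAUC = {iauc:.1f} mmol/L*min")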

  18. Structure of high latitude currents in magnetosphere-ionosphere models

    NASA Astrophysics Data System (ADS)

    Wiltberger, M. J.; Lyon, J.; Merkin, V. G.; Rigler, E. J.

    2016-12-01

    Using three resolutions of the Lyon-Fedder-Mobarry (LFM) global magnetosphere-ionosphere model and the Weimer 2005 empirical model, the structure of the high-latitude field-aligned current patterns is examined. Each LFM resolution was run for the entire Whole Heliosphere Interval (WHI), which contained two high-speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for eight interplanetary magnetic field clock-angle directions are computed using data from these runs. Generally speaking, the patterns agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and confined. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths in the model also results in better shielding of the mid- and low-latitude ionosphere from the polar cap convection, also in agreement with observations. Current-voltage relationships between the R1 strength and the cross-polar-cap potential (CPCP) are quite similar at the higher resolutions, indicating the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  19. Postprandial Glycemic and Insulinemic Responses to Common Breakfast Beverages Consumed with a Standard Meal in Adults Who Are Overweight and Obese

    PubMed Central

    Li, Jia; Janle, Elsa; Campbell, Wayne W.

    2017-01-01

    Breakfast beverages with different nutrient compositions may affect postprandial glycemic control differently. We assessed the effects of consuming (1) common breakfast beverages (water, sugar-sweetened coffee, reduced-energy orange juice (OJ), and low-fat milk (LFM)); and (2) fat-free, low-fat, and whole milk with breakfast on postprandial plasma glucose and insulin responses in adults who were overweight/obese. Forty-six subjects (33F/13M, body mass index: 32.5 ± 0.7 kg/m2, age: 50 ± 1 years, mean ± SEMs) consumed a standard sandwich with one of the six beverages on separate mornings in randomized order. The test beverages (except water) each contained 12 g digestible carbohydrate. Plasma glucose and insulin concentrations were measured from blood obtained pre- and post-meal at 30-min intervals for 4 h and incremental areas under the curve (AUC) were computed. We found (1) among different beverage types, glucose AUC was higher for coffee versus water, OJ, and LFM. Insulin AUC was higher for coffee and LFM versus OJ and water; (2) Glucose AUCs were not different among water and milks while insulin AUC was higher for milks versus water. In conclusion, consumption of water, reduced-energy OJ, or milk (irrespective of fat content) with a meal may be preferable to consuming sugar-sweetened coffee for glucose control in middle-aged adults who are overweight and obese. PMID:28054966

  20. Friction imprint effect in mechanically cleaved BaTiO3 (001)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Christian J.; Ebeling, Daniel

    2014-09-28

    Adsorption, chemisorption, and reconstruction at the surfaces of ferroelectric materials can all contribute to the pinning of ferroelectric polarization, which is called the electrical imprint effect. Here, we show that the opposite is also true: freshly cleaved, atomically flat surfaces of (001)-oriented BaTiO3 exhibit a persistent change in surface chemistry that is driven by ferroelectric polarization. This surface modification is explored using lateral force microscopy (LFM), while the ferroelectric polarization is probed using piezoresponse force microscopy. We find that immediately after cleaving BaTiO3, LFM reveals friction contrast between ferroelectric domains. We also find that this surface modification remains after the ferroelectric domain distribution is modified, resulting in an imprint of the original ferroelectric domain distribution on the sample surface. This friction imprint effect has implications for surface patterning as well as ferroelectric device operation and failure.

  1. Ion Transport and Acceleration at Dipolarization Fronts: High-Resolution MHD/Test-Particle Simulations

    NASA Astrophysics Data System (ADS)

    Ukhorskiy, A. Y.; Sorathia, K.; Merkin, V. G.; Sitnov, M. I.; Mitchell, D. G.; Wiltberger, M. J.; Lyon, J.

    2017-12-01

    Much of the plasma heating and transport from the magnetotail into the inner magnetosphere occurs in the form of mesoscale discrete injections associated with sharp dipolarizations of the magnetic field (dipolarization fronts). In this study we investigate the mechanisms of ion acceleration at dipolarization fronts in a high-resolution global magnetospheric MHD model (LFM). We use large-scale three-dimensional test-particle simulations (CHIMP) to address the following science questions: (1) What are the characteristic scales of dipolarization regions that can stably trap ions? (2) What role does trapping play in ion transport and acceleration? (3) How does it depend on particle energy and distance from Earth? (4) To what extent is ion acceleration adiabatic? The high-resolution LFM was run using idealized solar wind conditions with fixed nominal values of density and velocity and a southward IMF component of -5 nT. To simulate ion interaction with dipolarization fronts, a large ensemble of test particles distributed in energy, pitch angle, and gyrophase was initialized inside one of the LFM dipolarization channels in the magnetotail. Full Lorentz ion trajectories were then computed over the course of the front's inward propagation from a distance of 17 to 6 Earth radii. A large fraction of ions with different initial energies stayed in phase with the front over the entire distance. The effect of magnetic trapping at different energies was elucidated by correlating the ion guiding center and ExB drift velocities. The role of trapping in ion energization was quantified by comparing the partial pressure of ions that exhibit trapping to the pressure of all trapped ions.
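
    Full-Lorentz test-particle work of this kind is usually built on the Boris integrator; below is a minimal single-particle sketch with uniform placeholder fields (whether CHIMP uses exactly this scheme is an assumption).

        # Boris push: half electric kick, magnetic rotation, half electric kick.
        import numpy as np

        q_m = 9.58e7                                    # proton charge-to-mass ratio (C/kg)

        def boris_step(x, v, E, B, dt):
            v_minus = v + 0.5 * q_m * E * dt            # first half electric kick
            tvec = 0.5 * q_m * B * dt                   # rotation vector
            svec = 2.0 * tvec / (1.0 + tvec @ tvec)
            v_prime = v_minus + np.cross(v_minus, tvec)
            v_plus = v_minus + np.cross(v_prime, svec)  # energy-conserving magnetic rotation
            v_new = v_plus + 0.5 * q_m * E * dt         # second half electric kick
            return x + v_new * dt, v_new

        x, v = np.zeros(3), np.array([1e5, 0.0, 0.0])                  # m, m/s
        E, B = np.array([0.0, 1e-3, 0.0]), np.array([0.0, 0.0, 2e-8])  # V/m, T
        for _ in range(1000):
            x, v = boris_step(x, v, E, B, dt=1e-3)
        print(x, np.linalg.norm(v))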

  2. Satellite aerosol retrieval using dark target algorithm by coupling BRDF effect over AERONET site

    NASA Astrophysics Data System (ADS)

    Yang, Leiku; Xue, Yong; Guang, Jie; Li, Chi

    2012-11-01

    For most satellite aerosol retrieval algorithms, even for multi-angle instruments, a simple forward model (FM) based on a Lambertian surface assumption is employed to simulate top-of-atmosphere (TOA) spectral reflectance, which does not fully consider the surface bidirectional reflectance distribution function (BRDF) effect. The approximate forward model greatly simplifies the radiative transfer model, reduces the size of the look-up tables, and yields a faster algorithm. At the same time, it introduces systematic biases into the aerosol optical depth (AOD) retrieval. The AOD product derived from Moderate Resolution Imaging Spectroradiometer (MODIS) data with the dark target algorithm is considered one of the more accurate satellite aerosol products at present. Though it performs well at the global scale, uncertainties are still found at regional scales in many studies. The Lambertian surface assumption employed in the retrieval algorithm may be one of the contributing factors. In this study, we first use radiative transfer simulations over dark targets to assess the extent of the uncertainty introduced by the Lambertian surface assumption. The results show that the uncertainty in the AOD retrieval can reach ±0.3. Then the Lambertian FM (L_FM) and the BRDF FM (BRDF_FM) are each employed in AOD retrieval using the dark target algorithm on MODARNSS (MODIS/Terra and MODIS/Aqua Atmosphere Aeronet Subsetting Product) data over the Beijing AERONET site. The validation shows that the accuracy of the AOD retrieval is improved by employing the BRDF_FM, which accounts for the surface BRDF effect: the regression slope of retrieved AOD against AERONET AOD increases from 0.7163 (L_FM) to 0.7776 (BRDF_FM), and the intercept decreases from 0.0778 (L_FM) to 0.0627 (BRDF_FM).

  3. Smeared spectrum jamming suppression based on generalized S transform and threshold segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xin; Wang, Chunyang; Tan, Ming; Fu, Xiaolong

    2018-04-01

    Smeared spectrum (SMSP) jamming is an effective technique for jamming linear frequency modulation (LFM) radar. Exploiting the difference between the time-frequency distributions of the jamming and the echo, a jamming suppression method based on the generalized S transform (GST) and threshold segmentation is proposed. The sub-pulse period is first estimated from the autocorrelation function. Next, the time-frequency image and the corresponding gray-scale image are obtained with the GST. Finally, the Tsallis cross entropy is used to compute the optimal segmentation threshold, and the jamming suppression filter is constructed from that threshold. Simulation results show that the proposed method performs well in suppressing the false targets produced by SMSP jamming.
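
    A sketch of the first step above: because SMSP jamming repeats a fast sub-chirp, the autocorrelation of the received signal peaks at multiples of the sub-pulse period. The signal model and parameters are illustrative, not the paper's.

        import numpy as np

        fs = 100e6
        Tsub, n_rep = 2e-6, 5                               # sub-pulse period and repeat count
        t = np.arange(0, Tsub, 1 / fs)
        sub = np.exp(1j * np.pi * (20e6 / Tsub) * t ** 2)   # fast sub-chirp
        sig = np.tile(sub, n_rep)                           # SMSP-style jamming signal

        ac = np.correlate(sig, sig, mode="full")
        ac = np.abs(ac[ac.size // 2:])                      # non-negative lags only
        lag = np.argmax(ac[50:]) + 50                       # first strong peak past the main lobe
        print(f"estimated sub-pulse period: {lag / fs * 1e6:.2f} us")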

  4. ECCM Scheme against Interrupted Sampling Repeater Jammer Based on Parameter-Adjusted Waveform Design

    PubMed Central

    Wei, Zhenhua; Peng, Bo; Shen, Rui

    2018-01-01

    Interrupted sampling repeater jamming (ISRJ) is an effective way of deceiving coherent radar sensors, especially linear frequency modulated (LFM) radar. In this paper, for a simplified scenario with a single jammer, we propose a dynamic electronic counter-countermeasure (ECCM) scheme based on jammer parameter estimation and transmitted signal design. First, the LFM waveform is transmitted to estimate the main jamming parameters by investigating the discontinuousness of the ISRJ's time-frequency (TF) characteristics. Then, a parameter-adjusted intra-pulse frequency coded signal, whose ISRJ signal after matched filtering forms only a single false target, is designed adaptively according to the estimated parameters, i.e., sampling interval, sampling duration, and repeater times. Ultimately, for typical jamming scenes with different jamming-to-signal ratios (JSR) and duty cycles, we propose two particular ISRJ suppression approaches. Simulation results validate the effective performance of the proposed scheme for countering the ISRJ, and the trade-off relationship between the two approaches is demonstrated. PMID:29642508

  5. The President's Day cyclone 17-19 February 1979: An analysis of jet streak interactions prior to cyclogenesis

    NASA Technical Reports Server (NTRS)

    Uccellini, L. W.; Kocin, P. J.; Walsh, C. H.

    1981-01-01

    The President's Day cyclone produced record-breaking snowfall along the East Coast of the United States in February 1979. Conventional radiosonde data, SMS GOES infrared imagery, and LFM-2 model diagnostics were used to analyze the interaction of upper and lower tropospheric jet streaks prior to cyclogenesis. The analysis reveals that a series of complex, scale-interactive processes was responsible for the development of the intense cyclone. The evolution of the subsynoptic-scale mass and momentum fields prior to and during the period of rapid development of the President's Day cyclone is documented using conventional data and SMS GOES imagery. The interaction between upper and lower tropospheric jet streaks prior to the onset of cyclogenesis is discussed, as well as the possible effects of terrain-modified airflow within the precyclogenesis environment. Possible deficiencies in the LFM-2 initial wind fields that could have been responsible, in part, for the poor numerical forecast are examined.

  6. FPGA based hardware optimized implementation of signal processing system for LFM pulsed radar

    NASA Astrophysics Data System (ADS)

    Azim, Noor ul; Jun, Wang

    2016-11-01

    Signal processing is one of the main parts of any radar system. Different signal processing algorithms are used to extract information about target parameters such as range, speed, and direction in radar systems. This paper presents LFM (linear frequency modulation) pulsed radar signal processing algorithms, which are used to improve target detection and range resolution and to estimate target speed. These algorithms were first simulated in MATLAB to verify the concept and theory. After the conceptual verification in MATLAB, the simulation was implemented in hardware on a Xilinx FPGA, a Virtex-6 (XC6VLX75T). For the hardware implementation, pipeline optimization was adopted, and other factors were considered for resource optimization. The algorithms considered here for improving target detection, range resolution, and speed estimation are hardware-optimized fast-convolution pulse compression and pulse Doppler processing.
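
    A compact sketch of the two algorithms named above: fast-convolution pulse compression (matched filtering via FFT) followed by a Doppler FFT across pulses. The waveform and target parameters are illustrative.

        import numpy as np

        fs, B, T = 100e6, 20e6, 10e-6
        t = np.arange(0, T, 1 / fs)
        chirp = np.exp(1j * np.pi * (B / T) * t ** 2)

        n_pulses, prf, n_rng = 64, 10e3, 2048
        delay, fd = 400, 2e3                          # target delay (samples), Doppler (Hz)
        echoes = np.zeros((n_pulses, n_rng), dtype=complex)
        for p in range(n_pulses):
            echoes[p, delay:delay + t.size] = chirp * np.exp(2j * np.pi * fd * p / prf)

        mf = np.conj(np.fft.fft(chirp, n_rng))        # matched filter, frequency domain
        rc = np.fft.ifft(np.fft.fft(echoes, axis=1) * mf, axis=1)   # pulse compression
        rd = np.fft.fftshift(np.fft.fft(rc, axis=0), axes=0)        # Doppler processing

        p_idx, r_idx = np.unravel_index(np.argmax(np.abs(rd)), rd.shape)
        print(f"range bin {r_idx}, Doppler {(p_idx - n_pulses // 2) * prf / n_pulses:.0f} Hz")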

  7. Waveform Retrieval and Phase Identification for Seismic Data from the CASS Experiment

    NASA Astrophysics Data System (ADS)

    Li, Zhiwei; You, Qingyu; Ni, Sidao; Hao, Tianyao; Wang, Hongti; Zhuang, Cantao

    2013-05-01

    The low impact on the deployment site and the high repeatability of the Controlled Accurate Seismic Source (CASS) show its potential for investigating seismic wave velocities in the Earth's crust. However, the difficulty of retrieving impulsive seismic waveforms from CASS data and identifying the seismic phases has substantially limited its wide application. For example, identification of the seismic phases and accurate measurement of travel times are essential for resolving the spatial distribution of seismic velocities in the crust. It remains a challenging task to estimate accurate travel times of different seismic phases from CASS data, which feature extended wave trains, unlike waveforms from impulsive events such as earthquakes or explosive sources. In this study, we introduce a time-frequency analysis method to process the CASS data, retrieve the seismic waveforms, and identify the major seismic phases traveling through the crust. We adopt the Wigner-Ville distribution (WVD), which has been used in signal detection and parameter estimation for linear frequency modulation (LFM) signals and offers excellent time-frequency concentration. The Wigner-Hough transform (WHT) is applied to retrieve the impulsive waveforms from multi-component LFM signals, which comprise seismic phases with different arrival times. We processed the seismic data of the 40-ton CASS from the field experiment around the Xinfengjiang reservoir with the WVD and WHT methods. The results demonstrate that these methods are effective for waveform retrieval and phase identification, especially for high-frequency seismic phases such as PmP and SmS with strong amplitudes at large epicentral distances of 80-120 km. Further studies are still needed to improve the accuracy of travel time estimation, so as to further promote the applicability of the CASS for imaging seismic velocity structure.
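
    A minimal discrete Wigner-Ville distribution, the time-frequency tool used above (an illustrative implementation, not the authors' code; note that the WVD frequency axis is halved because the lag step is two samples).

        import numpy as np
        from scipy.signal import hilbert

        def wvd(x, fs):
            z = hilbert(x)                            # analytic signal curbs aliasing
            N = z.size
            W = np.zeros((N, N))
            for n in range(N):
                L = min(n, N - 1 - n)                 # largest symmetric lag at time n
                m = np.arange(-L, L + 1)
                kern = np.zeros(N, dtype=complex)
                kern[m % N] = z[n + m] * np.conj(z[n - m])
                W[:, n] = np.fft.fft(kern).real       # column n: spectrum at time n
            freqs = np.fft.fftfreq(N, d=2.0 / fs)     # lag step is 2/fs
            return W, freqs

        fs = 1e3
        t = np.arange(0, 1, 1 / fs)
        x = np.cos(2 * np.pi * (50 * t + 75 * t ** 2))   # LFM sweeping 50 -> 200 Hz
        W, freqs = wvd(x, fs)
        print(W.shape)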

  8. 77 FR 50185 - LoCorr Fund Management, LLC and LoCorr Investment Trust; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-20

    ... Fund Management, LLC and LoCorr Investment Trust; Notice of Application August 14, 2012. AGENCY.... Applicants: LoCorr Fund Management, LLC (``LFM'' or the ``Adviser'') and LoCorr Investment Trust (the ``Trust... Mary Kay Frech, Branch Chief, at (202) 551-6821 (Division of Investment Management, Office of...

  9. Standoff Acoustic Shear Wave Imaging Using LFM Chirps

    DTIC Science & Technology

    2011-03-21

    is typically ignored due to the large wavelengths in biological tissue. For the test material presented in this paper (expanded polystyrene foam)...inhomogeneous sound speed, c1(x), for a 2.5×5×7 cm steel parallelepiped embedded in a 15×23×23 cm block of expanded polystyrene foam, which

  10. Laminar flamelet modeling of turbulent diffusion flames

    NASA Technical Reports Server (NTRS)

    Mell, W. E.; Kosaly, G.; Planche, O.; Poinsot, T.; Ferziger, J. H.

    1990-01-01

    In modeling turbulent combustion, decoupling the chemistry from the turbulence is of great practical significance. In cases in which the equilibrium chemistry model breaks down, laminar flamelet modeling (LFM) is a promising approach to decoupling. Here, the validity of this approach is investigated using direct numerical simulation of a simple chemical reaction in two-dimensional turbulence.
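
    The decoupling idea can be made concrete with a presumed-PDF average over a flamelet solution: chemistry enters only through a precomputed T(Z), and turbulence only through the mixture-fraction PDF. The beta-PDF closure and the toy temperature profile below are common modeling devices, not this paper's DNS setup.

        import numpy as np
        from scipy.stats import beta

        Z = np.linspace(1e-4, 1 - 1e-4, 200)                      # mixture fraction grid
        T_flm = 300 + 1700 * np.minimum(Z / 0.3, (1 - Z) / 0.7)   # toy flamelet, peak at Zst = 0.3

        Zmean, Zvar = 0.3, 0.02                  # local turbulent statistics
        g = Zmean * (1 - Zmean) / Zvar - 1       # beta-PDF shape factor
        pdf = beta.pdf(Z, Zmean * g, (1 - Zmean) * g)

        T_mean = np.trapz(T_flm * pdf, Z) / np.trapz(pdf, Z)
        print(f"mean temperature: {T_mean:.0f} K")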

  11. The War Next Door: DoD’s Role in Combating Mexican TCOs

    DTIC Science & Technology

    2013-03-01

    Zetas, Gulf Cartel, Juárez Cartel, Beltran Leyva Organization (BLO), La Familia Michoacan (LFM), and Tijuana Cartel.17 This has resulted in...entered the United States by way of the Southwest Border.26 In 2002, authorities arrested Salim Mucharrafille, a café owner in Tijuana, Mexico, for

  12. Localization and tracking of moving objects in two-dimensional space by echolocation.

    PubMed

    Matsuo, Ikuo

    2013-02-01

    Bats use frequency-modulated echolocation to identify and capture moving objects in real three-dimensional space. Experimental evidence indicates that bats are capable of locating static objects with a range accuracy of less than 1 μs. A previously introduced model estimates ranges of multiple, static objects using linear frequency modulation (LFM) sound and Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates. The delay time for a single object was estimated with an accuracy of about 1.3 μs by measuring the echo at a low signal-to-noise ratio (SNR). The range accuracy was dependent not only on the SNR but also the Doppler shift, which was dependent on the movements. However, it was unclear whether this model could estimate the moving object range at each timepoint. In this study, echoes were measured from the rotating pole at two receiving points by intermittently emitting LFM sounds. The model was shown to localize moving objects in two-dimensional space by accurately estimating the object's range at each timepoint.
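
    To make the ranging step concrete, the sketch below estimates echo delay by cross-correlating the emitted LFM call with the received echo, a simpler stand-in for the paper's Gaussian-chirplet estimator; all parameters are illustrative.

        import numpy as np

        fs = 500e3                                    # 500 kHz sampling
        T, f0, f1 = 2e-3, 80e3, 40e3                  # 2 ms bat-like downward sweep
        t = np.arange(0, T, 1 / fs)
        call = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t ** 2))

        delay_true = 1.234e-3                         # round-trip echo delay (s)
        n_d = int(round(delay_true * fs))
        echo = np.zeros(4096)
        echo[n_d:n_d + call.size] = 0.3 * call        # attenuated, delayed copy
        echo += 0.05 * np.random.default_rng(1).standard_normal(echo.size)

        n_hat = np.argmax(np.abs(np.correlate(echo, call, mode="valid")))
        print(f"delay {n_hat / fs * 1e3:.3f} ms, range {343 * n_hat / fs / 2 * 100:.1f} cm")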

  13. Effect of Carbon Black on Elastomer Blends

    NASA Astrophysics Data System (ADS)

    Si, Mayu; Koga, Tadanori; Ji, Yuan; Seo, Young-Soo; Rafailovich, Miriam; Sokolov, Jonathan; Gerspacher, M.; Dias, A. J.; Karp, Kriss R.; Satija, Sushil; Lin, Min Y.

    2003-03-01

    The effects of untreated and heat-treated carbon black N299 on the interfacial properties of PB (polybutadiene) and the terpolymer BIMS [brominated poly(isobutylene-co-methyl styrene)] were investigated by neutron reflectivity (NR) and lateral force microscopy (LFM). The NR results show that the addition of carbon black significantly slows the interfacial broadening, and that heat-treated carbon black is less effective at slowing the diffusion than untreated carbon black. These results were confirmed by the LFM data, which show that the magnitude of the lateral force loop is larger for heat-treated carbon black than for the untreated material. Ultra-small and small-angle neutron scattering (USANS and SANS) were used to probe the morphology. As the volume concentration of carbon black increases to 5%, the glass transition temperature of BIMS also decreases, as measured by differential scanning calorimetry (DSC). XRD analysis indicates that the heat treatment crystallizes the carbon black, and strong graphitic peaks are observed. The large degree of crystallization decreases the interaction with the polymer matrix and hence minimizes the effect on the internal dynamics.

  14. GAMERA - The New Magnetospheric Code

    NASA Astrophysics Data System (ADS)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project, GAMERA (Grid Agnostic MHD for Extended Research Applications), has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary but logically rectangular grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. Another improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure: multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations, ranging from the inner and outer heliosphere to the magnetospheres of Venus, Earth, Jupiter, and Saturn. We present example results for the Earth's magnetosphere including a coupled ring current model (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  15. Optimal geometry for a quartz multipurpose SPM sensor.

    PubMed

    Stirling, Julian

    2013-01-01

    We propose a geometry for a piezoelectric SPM sensor that can be used for combined AFM/LFM/STM. The sensor utilises symmetry to provide a lateral mode without the need to excite torsional modes. The symmetry allows normal and lateral motion to be completely isolated, even when introducing large tips to tune the dynamic properties to optimal values.

  16. Vegetation Monitoring with Gaussian Processes and Latent Force Models

    NASA Astrophysics Data System (ADS)

    Camps-Valls, Gustau; Svendsen, Daniel; Martino, Luca; Campos, Manuel; Luengo, David

    2017-04-01

    Monitoring vegetation by biophysical parameter retrieval from Earth observation data is a challenging problem, where machine learning is currently a key player. Neural networks, kernel methods, and Gaussian process (GP) regression have excelled in parameter retrieval tasks at both local and global scales. GP regression is based on solid Bayesian statistics, yields efficient and accurate parameter estimates, and provides advantages over competing machine learning approaches, such as confidence intervals. However, GP models are hampered by a lack of interpretability, which has prevented their widespread adoption by a larger community. In this presentation we summarize some of our latest developments to address this issue. We review the main characteristics of GPs and their advantages in standard vegetation monitoring applications. Then, three advanced GP models are introduced. First, we derive sensitivity maps for the GP predictive function, which allow us to obtain feature rankings from the model and to assess the influence of examples on the solution. Second, we introduce a joint GP (JGP) model that combines in situ measurements and simulated radiative transfer data in a single GP model. The JGP regression provides more sensible confidence intervals for the predictions, respects the physics of the underlying processes, and allows for transferability across time and space. Finally, a latent force model (LFM) for GP modeling is presented that encodes ordinary differential equations to blend data-driven modeling and physical models of the system. The LFM performs multi-output regression, adapts to the signal characteristics, is able to cope with missing data in the time series, and provides explicit latent functions that allow system analysis and evaluation. Empirical evidence of the performance of these models is presented through illustrative examples.
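
    A minimal GP regression with predictive confidence intervals, in the spirit of the retrieval applications above; the toy data and kernel choice are illustrative sketches, not the presented models.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        X = np.sort(rng.uniform(0, 10, 40))[:, None]          # e.g. day of season
        y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)   # e.g. a vegetation signal

        kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        Xs = np.linspace(0, 10, 200)[:, None]
        mean, std = gp.predict(Xs, return_std=True)
        lo, hi = mean - 1.96 * std, mean + 1.96 * std         # ~95% confidence band
        print(gp.kernel_)                                     # learned hyperparameters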

  17. Examining the Effect of Time Constraint on the Online Mastery Learning Approach towards Improving Postgraduate Students' Achievement

    ERIC Educational Resources Information Center

    Ee, Mong Shan; Yeoh, William; Boo, Yee Ling; Boulter, Terry

    2018-01-01

    Time control plays a critical role within the online mastery learning (OML) approach. This paper examines the two commonly implemented mastery learning strategies--personalised system of instructions and learning for mastery (LFM)--by focusing on what occurs when there is an instructional time constraint. Using a large data set from a postgraduate…

  18. Quantifying palpation techniques in relation to performance in a clinical prostate exam.

    PubMed

    Wang, Ninghuan; Gerling, Gregory J; Childress, Reba Moyer; Martin, Marcus L

    2010-07-01

    This paper seeks to quantify finger palpation techniques in the prostate clinical exam, determine their relationship with performance in detecting abnormalities, and differentiate the tendencies of nurse practitioner students and resident physicians. One issue with the digital rectal examination (DRE) is that performance in detecting abnormalities varies greatly and agreement between examiners is low. The utilization of particular palpation techniques may be one way to improve clinician ability. Based on past qualitative instruction, this paper algorithmically defines a set of palpation techniques for the DRE, i.e., global finger movement (GFM), local finger movement (LFM), and average intentional finger pressure, and utilizes a custom-built simulator to analyze finger movements in an experiment with two groups: 18 nurse practitioner students and 16 resident physicians. Although technique utilization varied, some elements clearly impacted performance. For example, those utilizing the LFM of vibration were significantly better at detecting abnormalities. Also, the V GFM led to greater success, but finger pressure played a lesser role. Interestingly, while the residents were clearly the superior performers, their techniques differed only subtly from the students. In summary, the quantified palpation techniques appear to account for examination ability at some level, but not entirely for differences between groups.
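
    The paper's precise algorithmic definitions are only summarized in this abstract; the sketch below shows one plausible way to score the "vibration" LFM, as the fraction of fingertip-trajectory power in a mid-frequency band. The band edges, sampling rate, and synthetic trace are placeholders, not the study's calibrated values:

      import numpy as np

      def vibration_band_power(pos, fs, band=(4.0, 12.0)):
          # fraction of (DC-removed) trajectory power inside the band
          pos = pos - pos.mean()
          spec = np.abs(np.fft.rfft(pos)) ** 2
          f = np.fft.rfftfreq(pos.size, 1.0 / fs)
          total = spec[1:].sum()
          return spec[(f >= band[0]) & (f <= band[1])].sum() / total if total > 0 else 0.0

      fs = 100.0                                   # Hz, hypothetical sensor rate
      t = np.arange(0, 5, 1 / fs)
      trace = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.sin(2 * np.pi * 8 * t)
      print(vibration_band_power(trace, fs))       # small but nonzero vibration score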

  19. The Role of Membrane-Derived Second Messengers and Bmx/Etk in Response to Radiation Treatment of Prostate Cancer

    DTIC Science & Technology

    2008-01-01

    enhanced HUVEC radiosensitization. Furthermore, pretreatment of HUVEC with a pharmacological inhibitor of Bmx, LFM-A13, produced significant... Keywords: prostate cancer, Bmx, tyrosine kinase, kinase inhibitors, angiogenesis, tumor vasculature, radiation... activation, and that a small molecule inhibitor of Bmx modulates the cellular viability of endothelial and prostate cancer cells, particularly with radiation

  20. Automatic Modulation Classification of Common Communication and Pulse Compression Radar Waveforms using Cyclic Features

    DTIC Science & Technology

    2013-03-01

    ...intermediate frequency; LFM, linear frequency modulation; MAP, maximum a posteriori; MATLAB®, matrix laboratory; ML, maximum likelihood; OFDM, orthogonal frequency... spectrum, frequency hopping, and orthogonal frequency division multiplexing (OFDM) modulations. Feature analysis would be a good research thrust to... determine feature relevance and decide if removing any features improves performance. Also, extending the system for simulations using a MIMO receiver or

  1. Reliability-Centered Maintenance

    DTIC Science & Technology

    1978-12-29

    the pack through a flow-control valve and is cooled and dehumidified by a heat exchanger and the turbine of an air-cycle refrigeration machine. The... dirt, moisture, and heat are the most susceptible to corrosion, and properly applied and maintained protective coatings are necessary to prevent... United Airlines, San Francisco International Airport, San Francisco, CA 94128; Office of Assistant

  2. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT.

    PubMed

    Pang, Cunsuo; Han, Yan; Hou, Huiling; Liu, Shengheng; Zhang, Nan

    2016-09-24

    This paper proposes a time-frequency algorithm based on the short-time fractional order Fourier transform (STFRFT) for identification of targets with complicated movements. The algorithm, consisting of an STFRFT order-changing and quick selection method, is effective in reducing the computational load. A multi-order STFRFT time-frequency algorithm is also developed that makes use of the time-frequency feature of each micro-Doppler component signal. This algorithm improves the estimation accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate STFRFT's performance in micro-Doppler time-frequency analysis. The results validated the higher estimation accuracy of the proposed algorithm. It may be applied to LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar) to improve the probability of target recognition.
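
    A full STFRFT is more than a few lines, but the idea it exploits (an LFM component collapses to a narrow peak when the transform order matches its chirp rate) can be shown by dechirping with a matched quadratic phase and taking an ordinary FFT. All signal parameters below are illustrative:

      import numpy as np

      fs, k, f0 = 1e3, 200.0, 50.0                 # sample rate, chirp rate (Hz/s), start freq
      t = np.arange(0, 1.0, 1 / fs)
      s = np.exp(1j * (np.pi * k * t**2 + 2 * np.pi * f0 * t))   # LFM component

      # Matching the chirp rate (the role played by the FRFT order) turns the
      # broadband LFM into a single tone that the FFT concentrates in one bin.
      dechirped = s * np.exp(-1j * np.pi * k * t**2)
      spec = np.abs(np.fft.fft(dechirped))
      freqs = np.fft.fftfreq(t.size, 1 / fs)
      print(freqs[np.argmax(spec)])                # ~50 Hz: the component is identified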

  3. Multi-fluid simulations of the coupled solar wind-magnetosphere-ionosphere system

    NASA Astrophysics Data System (ADS)

    Lyon, J.

    2011-12-01

    This paper will review recent work done with the multi-fluid version of the Lyon-Fedder-Mobarry (MF-LFM) global MHD simulation code. We will concentrate on O+ outflow from the ionosphere and its importance for magnetosphere-ionosphere (MI) coupling, and also on the importance of ionospheric conditions in determining the outflow. While the predominant method of coupling between the magnetosphere and ionosphere is electrodynamic, it has become apparent that mass flows from the ionosphere into the magnetosphere can have profound effects on both systems. The earliest models to attempt to incorporate this effect used very crude clouds of plasma near the Earth. The earliest MF-LFM results showed that, depending on the details of the outflow (where, how much, how fast), very different magnetospheric responses could be found. Two approaches to causally driven models for the outflow have been developed for use in global simulations: the Polar Wind Outflow Model (PWOM), started at the Univ. of Michigan, and the model used by Bill Lotko and co-workers at Dartmouth. We will give a quick review of the latter model, which is based on the empirical relation between outflow fluence and Poynting flux discovered by Strangeway. An additional factor used in this model is the precipitating flux of electrons, which is presumed to correlate with the scale height of the upwelling ions. Parameters such as outflow speed and density are constrained by the total fluence. The effects of the outflow depend on the speed: slower outflow tends to land in the inner magnetosphere, increasing the strength of the ring current, while higher-speed outflow lands farther out in the tail. Using this model, simulations have shown that solar wind dynamic pressure has a profound effect on the amount of fluence. The most striking result has been the simulation of magnetospheric sawtooth events. We will discuss future directions for this research, emphasizing the need for better physical models for the outflow process and its coupling to the ionosphere.
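
    A toy numeric version of the fluence constraint described above: given a Strangeway-type power-law scaling of outflow flux with Poynting flux, choosing an outflow speed fixes the boundary density. The coefficients below are placeholders for illustration, not the published fit:

      import numpy as np

      def outflow_flux(S, a=2e12, p=1.265):
          # assumed power law: ion flux (m^-2 s^-1) vs. Poynting flux S (mW/m^2);
          # a and p are placeholder values, not Strangeway's fitted coefficients
          return a * S**p

      S = np.array([1.0, 5.0, 20.0])       # DC Poynting flux into the ionosphere, mW/m^2
      flux = outflow_flux(S)
      v = 20e3                             # chosen outflow speed, m/s
      n = flux / v                         # density implied by the fluence constraint
      print(n / 1e6)                       # boundary density in cm^-3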

  4. Self-reported use of complementary and alternative medicine therapies in a reflexology randomized clinical trial.

    PubMed

    Wyatt, Gwen; Sikorskii, Alla; You, Mei

    2013-01-01

    According to the National Center for Complementary and Alternative Medicine (NCCAM), about one-third of American cancer patients have used complementary and alternative medicine (CAM). The objective of this secondary analysis was an assessment of the use of other CAM by women with advanced breast cancer who were undergoing chemotherapy and who participated in a randomized clinical trial (RCT) studying the safety and efficacy of reflexology. For this secondary analysis, the research team hypothesized an increased CAM use due to exposure to the reflexology trial. For this secondary analysis, the team conducted telephone interviews at baseline, wk 5, and wk 11 to assess the use of 23 common CAM therapies. The study took place at 14 medical oncology clinics across the Midwestern United States. Participants included women with advanced breast cancer who were undergoing chemotherapy and/or hormonal therapy. In the study related to this secondary analysis, the research team randomly assigned the women to one of three primary groups: (1) reflexology; (2) lay foot manipulation (LFM); and (3) control. In addition, the research team used two test groups to establish the study's protocol: (1) test reflexology and (2) test LFM. For this secondary analysis, the research team considered the two reflexology groups (test and intervention) and the two LFM groups (test and intervention) to be the active groups, comparing their use of CAM to the control group's use at the selected time points. The research team used a linear, mixed-effects model to analyze the number of therapies used at the three time points. The team performed t tests to compare therapy use at baseline for those women who completed the study vs those who dropped out. The team used the CAM-use instrument. In total, 385 women participated. The research team found no differences in CAM use for the active groups vs the control group over time or in those women who stayed in the study vs those who dropped out. The team found an increase in CAM use at wk 5 compared to baseline, followed by a decrease at wk 11; however, the time trends were the same in the active groups and the control group. In women with advanced breast cancer, researchers can rely upon one assessment of CAM use during an RCT of a CAM therapy.
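
    For readers unfamiliar with the analysis, a linear mixed-effects model with a per-participant random intercept can be specified as below (statsmodels). The column names and simulated counts are hypothetical; only the formula structure (time, group, and their interaction) mirrors the design described above:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      rows = [{"subject": s, "week": w, "active": s % 2,
               "cam_count": rng.poisson(3 + (w == 5))}     # illustrative bump at wk 5
              for s in range(60) for w in (0, 5, 11)]
      df = pd.DataFrame(rows)

      # random intercept per participant; fixed effects for time, group, interaction
      model = smf.mixedlm("cam_count ~ C(week) * active", df, groups=df["subject"])
      print(model.fit().summary())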

  5. AFHRL Annual Report FY 81.

    DTIC Science & Technology

    1981-01-01

  6. Structure of High Latitude Currents in Magnetosphere-Ionosphere Models

    NASA Astrophysics Data System (ADS)

    Wiltberger, M.; Rigler, E. J.; Merkin, V.; Lyon, J. G.

    2017-03-01

    Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model, we examine the structure of the high-latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high-speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrow. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating that the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.
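
    The binning step described above amounts to computing the IMF clock angle in the GSM y-z plane and assigning it to one of 8 sectors. A small sketch (the sector convention, with bins centered on due north, is an assumption):

      import numpy as np

      def clock_angle_bin(by, bz, nbins=8):
          # clock angle theta = atan2(By, Bz); theta = 0 is purely northward IMF
          theta = np.degrees(np.arctan2(by, bz)) % 360.0
          width = 360.0 / nbins
          return ((theta + width / 2) // width).astype(int) % nbins

      by = np.array([0.0, 5.0, 0.0, -5.0])
      bz = np.array([5.0, 0.0, -5.0, 0.0])
      print(clock_angle_bin(by, bz))   # [0 2 4 6]: north, dusk, south, dawn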

  7. Cramer-Rao Lower Bound Evaluation for Linear Frequency Modulation Based Active Radar Networks Operating in a Rice Fading Environment.

    PubMed

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-12-06

    This paper investigates the joint target parameter (delay and Doppler) estimation performance of linear frequency modulation (LFM)-based radar networks in a Rice fading environment. The active radar networks are composed of multiple radar transmitters and multichannel receivers placed on moving platforms. First, the log-likelihood function of the received signal for a Rician target is derived, where the received signal scattered off the target comprises a dominant scatterer (DS) component and weak isotropic scatterer (WIS) components. Then, analytically closed-form expressions for the Cramer-Rao lower bounds (CRLBs) on the Cartesian coordinates of target position and velocity are calculated, which can be adopted as a performance metric to assess the target parameter estimation accuracy for LFM-based radar network systems in a Rice fading environment. It is found that the cumulative Fisher information matrix (FIM) is a linear combination of both the DS component and the WIS components, and it is also demonstrated that the joint CRLB is a function of the signal-to-noise ratio (SNR), the target's radar cross section (RCS), and the transmitted waveform parameters, as well as the relative geometry between the target and the radar network architecture. Finally, numerical results are provided to indicate that the joint target parameter estimation performance of active radar networks can be significantly improved with the exploitation of the DS component.
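
    The closed-form network CRLBs of the paper are not reproduced here, but the generic recipe (differentiate the signal model, form the Fisher information matrix, invert it) can be sketched for a single Gaussian-envelope LFM pulse using finite differences. A pure rectangular LFM makes delay and Doppler nearly indistinguishable, hence the envelope; all numbers are illustrative:

      import numpy as np

      fs, T, k, snr = 2e6, 1e-4, 2e9, 100.0   # sample rate, window, chirp rate, SNR factor
      t = np.arange(0, T, 1 / fs)
      w = T / 8                               # envelope width keeps (tau, nu) identifiable

      def s(tau, nu):
          # unit-energy Gaussian-envelope LFM with delay tau and Doppler nu
          u = t - T / 2 - tau
          x = np.exp(-0.5 * (u / w) ** 2 + 1j * np.pi * k * u**2 + 2j * np.pi * nu * t)
          return x / np.linalg.norm(x)

      d_tau, d_nu = 1e-9, 1.0                 # finite-difference steps (s, Hz)
      J = np.column_stack([
          (s(d_tau, 0.0) - s(-d_tau, 0.0)) / (2 * d_tau),   # ds/dtau
          (s(0.0, d_nu) - s(0.0, -d_nu)) / (2 * d_nu),      # ds/dnu
      ])
      fim = 2 * snr * np.real(J.conj().T @ J)
      crlb = np.linalg.inv(fim)
      print(np.sqrt(np.diag(crlb)))           # lower bounds on std of delay (s), Doppler (Hz)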

  8. Local vibrations in disordered solids studied via single-molecule spectroscopy: Comparison with neutron, nuclear, Raman scattering, and photon echo data

    NASA Astrophysics Data System (ADS)

    Vainer, Yu. G.; Naumov, A. V.; Kador, L.

    2008-06-01

    The energy spectrum of low-frequency vibrational modes (LFMs) in three disordered organic solids (amorphous polyisobutylene (PIB), and toluene and deuterated toluene glasses), each weakly doped with fluorescent chromophore molecules of tetra-tert-butylterrylene (TBT), has been measured via single-molecule (SM) spectroscopy. Analysis of the individual temperature dependences of the linewidths of single TBT molecules allowed us to determine the vibrational mode frequencies and the SM-LFM coupling constants for vibrations in the local environment of the molecules. The measured LFM spectra were compared with the “Boson peak” as measured in pure PIB by inelastic neutron scattering, in pure toluene glass by low-frequency Raman scattering, and in doped toluene glass by nuclear inelastic scattering, and with photon echo data. The comparative analysis revealed close agreement between the spectra of the local vibrations measured in the present study and the literature data on the Boson peak in PIB and toluene. The analysis also yielded the important result that weak doping of the disordered matrices with nonpolar probe molecules, whose chemical composition is similar to that of the matrix molecules, does not markedly influence the observed vibrational dynamics. Experimental data displaying temporal stability of the vibrational excitation parameters in local surroundings on a time scale of a few hours were obtained for the first time for both a polymer and a molecular glass.
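
    A common way to extract an LFM frequency from a single molecule's temperature-dependent linewidth is to fit an activated broadening term on top of a residual width; the single-mode model below is a standard assumption in this area, not necessarily the paper's exact fitting function, and the data are synthetic:

      import numpy as np
      from scipy.optimize import curve_fit

      kB = 0.695   # Boltzmann constant in cm^-1 per K

      def linewidth(T, g0, b, dE):
          # assumed model: coupling to one quasi-local mode of energy dE (cm^-1)
          return g0 + b * np.exp(-dE / (kB * T))

      T = np.linspace(4, 30, 12)                        # K, illustrative range
      rng = np.random.default_rng(3)
      obs = linewidth(T, 0.05, 8.0, 17.0) * (1 + 0.05 * rng.standard_normal(T.size))
      popt, _ = curve_fit(linewidth, T, obs, p0=(0.01, 1.0, 10.0), maxfev=10000)
      print(popt)   # fitted dE estimates the LFM energy in this molecule's environment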

  9. Structure of high latitude currents in global magnetospheric-ionospheric models

    USGS Publications Warehouse

    Wiltberger, M; Rigler, E. J.; Merkin, V; Lyon, J. G

    2016-01-01

    Using three resolutions of the Lyon-Fedder-Mobarry global magnetosphere-ionosphere model (LFM) and the Weimer 2005 empirical model, we examine the structure of the high-latitude field-aligned current patterns. Each resolution was run for the entire Whole Heliosphere Interval, which contained two high-speed solar wind streams and modest interplanetary magnetic field strengths. Average states of the field-aligned current (FAC) patterns for 8 interplanetary magnetic field clock angle directions are computed using data from these runs. Generally speaking, the patterns obtained agree well with results from the Weimer 2005 model computed using the solar wind and IMF conditions that correspond to each bin. As the simulation resolution increases, the currents become more intense and narrow. A machine learning analysis of the FAC patterns shows that the ratio of Region 1 (R1) to Region 2 (R2) currents decreases as the simulation resolution increases. This brings the simulation results into better agreement with observational predictions and the Weimer 2005 model results. The increase in R2 current strengths also results in the cross polar cap potential (CPCP) pattern being concentrated at higher latitudes. Current-voltage relationships between the R1 currents and the CPCP are quite similar at the higher resolutions, indicating that the simulation is converging on a common solution. We conclude that LFM simulations are capable of reproducing the statistical features of FAC patterns.

  10. Doppler Compensation for Airborne Non-Side-Looking Phased-Array Radar

    DTIC Science & Technology

    2015-09-01

    ...Security and ISR Division. Dr Yunhan Dong received his Bachelor and Master degrees in the 1980s in China and his PhD in 1995 at UNSW, Australia, all in... Simulation parameters: wavelength λ0, 0.25 m; bandwidth of LFM, 5 MHz; sampling rate, 10 MHz; number of array elements, N, 25; number of pulses in a CPI, M, 31; antenna

  11. Performance of Passive and Active Sonars in the Philippine Sea

    DTIC Science & Technology

    2012-09-30

    demodulated to 250 Hz, so the output of the matched filter for the LFM signal is the complex envelope of the pulse-compressed signal. We want to... this cross-coherence matrix for the 90-180 Hz band on the left and the 375-525 Hz band on the right. We note that there are 150 entries in each matrix, which... Laboratory (CONOPS and vector sensor processing); 4. SSTAG (Submarine Surveillance Technical Advisory Group) for N975; 5. FCP (Future Concepts Program for

  12. PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.

    2017-12-01

    The Peta-op SupErcomputing Unconventional System (PerSEUS) project aims to explore the use of ultra-low-power mixed-signal unconventional computational elements developed by Johns Hopkins University (JHU) for High Performance Scientific Computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE) and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.

  13. High-Resolution Radar Waveforms Based on Randomized Latin Square Sequences

    DTIC Science & Technology

    2017-04-18

    familiar Costas sequence [17]. The ambiguity function, first introduced by Woodward in [13], is used to evaluate the matched filter output of a radar waveform... the zero-delay cut, where the result takes the shape of a sinc function, which shows that, even for significant Doppler shifts, the matched filter output... a bad feature, as the high ridge of the LFM waveform will still result in a large matched filter response from the target, just not at the correct delay

  14. Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly

    2009-01-01

    Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, and few developers have actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.

  15. Formal methods technology transfer: Some lessons learned

    NASA Technical Reports Server (NTRS)

    Hamilton, David

    1992-01-01

    IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.

  16. Advanced Analysis and Visualization of Space Weather Phenomena

    NASA Astrophysics Data System (ADS)

    Murphy, Joshua J.

    As the world becomes more technologically reliant, society as a whole becomes more susceptible to adverse interactions with the sun. This "space weather" can produce significant effects on modern technology, from interrupting satellite service, to causing serious damage to Earth-side power grids. These concerns have, over the past several years, prompted an out-welling of research in an attempt to understand the processes governing, and to provide a means of forecasting, space weather events. The research presented in this thesis couples to current work aimed at understanding Coronal Mass Ejections (CMEs) and their influence on the evolution of Earth's magnetic field and associated Van Allen radiation belts. To aid in the analysis of how these solar wind transients affect Earth's magnetic field, a system named Geospace/Heliosphere Observation & Simulation Tool-kit (GHOSTkit), along with its python analysis tools, GHOSTpy, has been devised to calculate the adiabatic invariants of trapped particle motion within Earth's magnetic field. These invariants aid scientists in ordering observations of the radiation belts, providing a more natural presentation of data, but can be computationally expensive to calculate. The GHOSTpy system, in the phase presented here, is aimed at providing invariant calculations based on LFM magnetic field simulation data. This research first examines an ideal dipole application to gain understanding of system performance. Following this, the challenges of applying the algorithms to gridded LFM MHD data are examined. Performance profiles are then presented, followed by a real-world application of the system.
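
    For the ideal-dipole validation case mentioned above, the quantities involved have simple closed forms; a sketch of the dipole field magnitude and the first adiabatic invariant mu = p_perp^2 / (2 m B) follows (constants rounded, illustration only):

      import numpy as np

      B0, RE = 3.11e-5, 6.371e6        # equatorial surface field (T), Earth radius (m)

      def dipole_B(r, lam):
          # dipole field magnitude at radius r (m) and magnetic latitude lam (rad)
          return B0 * (RE / r) ** 3 * np.sqrt(1 + 3 * np.sin(lam) ** 2)

      def mu(p_perp, m, B):
          # first adiabatic invariant of gyration
          return p_perp**2 / (2 * m * B)

      me, c, qe = 9.109e-31, 2.998e8, 1.602e-19
      E_tot = 1e6 * qe + me * c**2                     # 1 MeV electron, total energy
      p = np.sqrt(E_tot**2 - (me * c**2) ** 2) / c     # relativistic momentum
      print(mu(p, me, dipole_B(4 * RE, 0.0)))          # mu at the L = 4 equator, J/T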

  17. Global Response to Local Ionospheric Mass Ejection

    NASA Technical Reports Server (NTRS)

    Moore, T. E.; Fok, M.-C.; Delcourt, D. C.; Slinker, S. P.; Fedder, J. A.

    2010-01-01

    We revisit a reported "Ionospheric Mass Ejection" using prior event observations to guide a global simulation of local ionospheric outflows, global magnetospheric circulation, and plasma sheet pressurization, and comparing our results with the observed global response. Our simulation framework is based on test particle motions in the Lyon-Fedder-Mobarry (LFM) global circulation model electromagnetic fields. The inner magnetosphere is simulated with the Comprehensive Ring Current Model (CRCM) of Fok and Wolf, driven by the transpolar potential developed by the LFM magnetosphere, and includes an embedded plasmaspheric simulation. Global circulation is stimulated using the observed solar wind conditions for the period 24-25 Sept 1998. This period begins with the arrival of a Coronal Mass Ejection, initially with northward, but later with southward interplanetary magnetic field. Test particles are launched from the ionosphere with fluxes specified by local empirical relationships of outflow to electrodynamics and particle precipitation imposed by the MHD simulation. Particles are tracked until they are lost from the system downstream or into the atmosphere, using the full equations of motion. Results are compared with the observed ring current and a simulation of polar and auroral wind outflows driven globally by solar wind dynamic pressure. We find good quantitative agreement with the observed ring current, and reasonable qualitative agreement with earlier simulation results, suggesting that the solar wind driven global simulation generates realistic energy dissipation in the ionosphere and that the Strangeway relations provide a realistic local outflow description.

  18. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
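
    The proof-certificate approach above replaces qualification of a complex prover with qualification of a much simpler checker. A toy checker for propositional resolution refutations conveys the flavor; the certificate format here is invented for this sketch and is not any tool's actual format:

      def check_refutation(clauses, steps):
          # clauses: initial clauses as collections of nonzero ints (negative = negated)
          # steps: (i, j, pivot, resolvent) referencing earlier clauses by index
          derived = [frozenset(c) for c in clauses]
          for i, j, pivot, resolvent in steps:
              a, b = derived[i], derived[j]
              if pivot not in a or -pivot not in b:
                  return False                          # pivot literal missing
              expected = (a - {pivot}) | (b - {-pivot})
              if frozenset(resolvent) != expected:
                  return False                          # claimed resolvent is wrong
              derived.append(expected)
          return len(derived[-1]) == 0                  # empty clause certifies UNSAT

      # (x) and (not x or y) and (not y) is unsatisfiable:
      print(check_refutation([[1], [-1, 2], [-2]],
                             [(0, 1, 1, [2]), (3, 2, 2, [])]))   # True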

  19. Adaptive ISAR Imaging of Maneuvering Targets Based on a Modified Fourier Transform.

    PubMed

    Wang, Binbin; Xu, Shiyou; Wu, Wenzhen; Hu, Pengjiang; Chen, Zengping

    2018-04-27

    Focusing on the inverse synthetic aperture radar (ISAR) imaging of maneuvering targets, this paper presents a new imaging method which works well when the target's maneuvering is not too severe. After translational motion compensation, we describe the equivalent rotation of maneuvering targets by two variables: the relative chirp rate of the linear frequency modulated (LFM) signal and the Doppler focus shift. The first variable indicates the target's motion status, and the second one represents the possible residual error of the translational motion compensation. With them, a modified Fourier transform matrix is constructed and then used for cross-range compression. Consequently, the imaging of a maneuvering target is converted into a two-dimensional parameter optimization problem in which a stable and clear ISAR image is guaranteed. A gradient descent optimization scheme is employed to obtain the accurate relative chirp rate and Doppler focus shift. Moreover, we designed an efficient and robust initialization process for the gradient descent method; thus, well-focused ISAR images of maneuvering targets can be achieved adaptively. Human intervention is not needed, and it is quite convenient for practical ISAR imaging systems. Compared to precedent imaging methods, the new method achieves better imaging quality under reasonable computational cost. Simulation results are provided to validate the effectiveness and advantages of the proposed method.
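
    A sketch of the cross-range compression idea: a DFT matrix augmented with a quadratic phase (the relative chirp rate) and a linear shift (the Doppler focus shift) refocuses a scatterer whose Doppler drifts during the observation. The parameterization below is an assumption for illustration, and the two parameters are simply given, whereas the paper finds them by gradient descent on image quality:

      import numpy as np

      def modified_ft(N, gamma, f0):
          # DFT matrix with per-sample quadratic phase gamma and linear shift f0
          n = np.arange(N)
          phase = np.outer(n, n) / N + f0 * n[None, :] + 0.5 * gamma * n[None, :] ** 2 / N
          return np.exp(-2j * np.pi * phase) / np.sqrt(N)

      N, gamma, f0 = 256, 0.15, 0.02
      n = np.arange(N)
      # scatterer whose Doppler drifts linearly across the CPI (maneuvering target)
      echo = np.exp(2j * np.pi * (0.2 * n + f0 * n + 0.5 * gamma * n**2 / N))
      profile = modified_ft(N, gamma, f0) @ echo
      print(np.argmax(np.abs(profile)))   # ~51 = round(0.2 * N): refocused Doppler bin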

  1. Scientific and Engineering Studies, Compiled 1989. Signal Processing Studies

    DTIC Science & Technology

    1989-01-01

    From the report's list of figures and notation: W(t,f), the Wigner distribution function, and its version for a real waveform s(t); contour of the WDF at the 1/e relative level; spectral level; B, passband of filter H (figure 8); duration of weighting v (figure 8); LFM, linear frequency modulation; sgn(x), 1 for x > 0, -1 for x < 0... the area of the particular level ellipse is 1/2 in the t,f plane...

  2. Environmental Impact Analysis Process. Final Environmental Impact Statement. Air Force, Space Division Housing Project, San Pedro, California

    DTIC Science & Technology

    1986-07-24

    impact on the local community's use of these facilities. Released to the public July 24, 1986. FINAL ENVIRONMENTAL IMPACT STATEMENT, AIR FORCE, SPACE DIVISION HOUSING PROJECT... Alternative G: 9 acres in the southeast corner of WP and 21 acres at FM. Alternative H: 22 acres at BP and... it may not be considered a permanent irreversible or irretrievable use of the land; the Proposed Action and alternatives (except Alternative G, which

  3. Guidance for Using Formal Methods in a Certification Context

    NASA Technical Reports Server (NTRS)

    Brown, Duncan; Delseny, Herve; Hayhurst, Kelly; Wiels, Virginie

    2010-01-01

    This paper discusses some of the challenges to using formal methods in a certification context and describes the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to propose guidance to make the use of formal methods a recognized approach. This guidance, expected to take the form of a Formal Methods Technical Supplement to DO-178C/ED-12C, is described, including the activities that are needed when using formal methods, new or modified objectives with respect to the core DO-178C/ED-12C document, and evidence needed for meeting those objectives.

  4. Initial results from a dynamic coupled magnetosphere-ionosphere-ring current model

    NASA Astrophysics Data System (ADS)

    Pembroke, Asher; Toffoletto, Frank; Sazykin, Stanislav; Wiltberger, Michael; Lyon, John; Merkin, Viacheslav; Schmitt, Peter

    2012-02-01

    In this paper we describe a coupled model of Earth's magnetosphere that consists of the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) simulation, the MIX ionosphere solver and the Rice Convection Model (RCM) and report some results using idealized inputs and model parameters. The algorithmic and physical components of the model are described, including the transfer of magnetic field information and plasma boundary conditions to the RCM and the return of ring current plasma properties to the LFM. Crucial aspects of the coupling include the restriction of RCM to regions where field-line averaged plasma-β ≤ 1, the use of a plasmasphere model, and the MIX ionosphere model. Compared to stand-alone MHD, the coupled model produces a substantial increase in ring current pressure and reduction of the magnetic field near the Earth. In the ionosphere, stronger region-1 and region-2 Birkeland currents are seen in the coupled model but with no significant change in the cross polar cap potential drop, while the region-2 currents shielded the low-latitude convection potential. In addition, oscillations in the magnetic field are produced at geosynchronous orbit with the coupled code. The diagnostics of entropy and mass content indicate that these oscillations are associated with low-entropy flow channels moving in from the tail and may be related to bursty bulk flows and bubbles seen in observations. As with most complex numerical models, there is the ongoing challenge of untangling numerical artifacts and physics, and we find that while there is still much room for improvement, the results presented here are encouraging.
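
    The plasma-beta gate mentioned above is a one-line computation once field-line-averaged pressure and field strength are available; RCM coverage is restricted to flux tubes where it stays at or below one. A trivial sketch with illustrative numbers:

      import numpy as np

      mu0 = 4e-7 * np.pi

      def beta(p, B):
          # plasma beta: thermal pressure over magnetic pressure
          return 2 * mu0 * p / B**2

      p_avg = np.array([0.05e-9, 0.5e-9, 5e-9])   # field-line-averaged pressure, Pa
      B_avg = np.array([50e-9, 30e-9, 10e-9])     # field-line-averaged |B|, T
      print(beta(p_avg, B_avg) <= 1.0)            # [True False False]: RCM region mask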

  5. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  6. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  7. Rotor-Bearing Dynamics Technology Design Guide. Part 1. Flexible Rotor Dynamics

    DTIC Science & Technology

    1980-06-01

  8. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  9. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and to assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence to which to apply the formal methods. This paper gives partial specifications of the ANTS mission using four selected methods, then evaluates those methods and identifies the properties a formal method needs for effective specification and prediction of emergent behavior in swarm-based systems.

  10. Third NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler)

    1995-01-01

    This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web.

  11. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  12. High-Performance Anti-Retransmission Deception Jamming Utilizing Range Direction Multiple Input and Multiple Output (MIMO) Synthetic Aperture Radar (SAR).

    PubMed

    Wang, Ruijia; Chen, Jie; Wang, Xing; Sun, Bing

    2017-01-09

    Retransmission deception jamming seriously degrades Synthetic Aperture Radar (SAR) detection efficiency and can mislead SAR image interpretation by forming false targets. In order to suppress retransmission deception jamming, this paper proposes a novel multiple input and multiple output (MIMO) SAR structure, range direction MIMO SAR, whose multiple channel antennas are arranged perpendicular to the azimuth direction. First, based on the multiple channels of range direction MIMO SAR, the orthogonal frequency division multiplexing (OFDM) linear frequency modulation (LFM) signal was adopted as the transmission signal of each channel; this is defined as a sub-band signal, and each sub-band signal corresponds to a transmission channel. Then, all of the sub-band signals are modulated with random initial phases and concurrently transmitted, so the signal form is more complex and difficult to intercept. Next, the echoes of the sub-band signals are utilized to synthesize a wide band signal after preprocessing. The proposed method increases the signal-to-interference ratio and the peak amplitude ratio of the signal to resist retransmission deception jamming. Finally, well-focused SAR imagery is obtained using a conventional imaging method, while the retransmission deception jamming strength is degraded and defocused. Simulations demonstrated the effectiveness of the proposed method.
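
    A sketch of the transmit-side construction described above: per-channel LFM sub-bands on adjacent frequency slots, each with a random initial phase; echo preprocessing and wideband synthesis are idealized to a plain sum here, and all radar numbers are illustrative:

      import numpy as np

      rng = np.random.default_rng(0)
      fs, Tp, n_ch, B = 200e6, 10e-6, 4, 40e6   # sample rate, pulse, channels, sub-band width
      t = np.arange(0, Tp, 1 / fs)
      k = B / Tp                                # chirp rate within each sub-band

      subs = []
      for m in range(n_ch):
          f0 = m * B                            # frequency slot of channel m
          phi = rng.uniform(0, 2 * np.pi)       # random initial phase (hard to intercept)
          subs.append(np.exp(1j * (2 * np.pi * f0 * t + np.pi * k * t**2 + phi)))

      wideband = sum(subs)                      # idealized synthesis of the wide band
      print(t.size, np.max(np.abs(wideband)))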

  13. The Second NASA Formal Methods Workshop 1992

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)

    1992-01-01

    The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.

  14. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  15. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  16. NASA Formal Methods Workshop, 1990

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Compiler)

    1990-01-01

    The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.

  17. Modeling the superstorm in November 2003

    NASA Astrophysics Data System (ADS)

    Fok, Mei-Ching; Moore, Thomas E.; Slinker, Steve P.; Fedder, Joel A.; Delcourt, Dominique C.; Nosé, Masahito; Chen, Sheng-Hsien

    2011-01-01

    The superstorm on 20-21 November 2003 was the largest geomagnetic storm in solar cycle 23 as measured by Dst, which attained a minimum value of -422 nT. We have simulated this storm to understand how particles originating from the solar wind and ionosphere get access to the magnetosphere and how the subsequent transport and energization processes contribute to the buildup of the ring current. The global electromagnetic configuration and the solar wind H+ distribution are specified by the Lyon-Fedder-Mobarry (LFM) magnetohydrodynamics model. The outflow of H+ and O+ ions from the ionosphere are also considered. Their trajectories in the magnetosphere are followed by a test-particle code. The particle distributions at the inner plasma sheet established by the LFM model and test-particle calculations are then used as boundary conditions for a ring current model. Our simulations reproduce the rapid decrease of Dst during the storm main phase and the fast initial phase of recovery. Shielding in the inner magnetosphere is established at early main phase. This shielding field lasts several hours and then breaks down at late main phase. At the peak of the storm, strong penetration of ions earthward to L shell of 1.5 is revealed in the simulation. It is surprising that O+ is significant but not the dominant species in the ring current in our calculation for this major storm. It is very likely that substorm effects are not well represented in the models and O+ energization is underestimated. Ring current simulation with O+ energy density at the boundary set comparable to Geotail observations produces excellent agreement with the observed SYM-H. As expected in superstorms, ring current O+ is the dominant species over H+ during the main to mid-recovery phase of the storm.

  18. Nano-enabled paper humidity sensor for mobile based point-of-care lung function monitoring.

    PubMed

    Bhattacharjee, Mitradip; Nemade, Harshal B; Bandyopadhyay, Dipankar

    2017-08-15

    The frequency of breathing and peak flow rate of exhaled air are necessary parameters to detect chronic obstructive pulmonary diseases (COPDs) such as asthma, bronchitis, or pneumonia. We developed a lung function monitoring point-of-care-testing device (LFM-POCT) consisting of a mouthpiece, a paper-based humidity sensor, a micro-heater, and a real-time monitoring unit. Fabrication of a mouthpiece of optimal length ensured that the exhaled air was focused on the humidity sensor. The resistive relative humidity sensor was developed using a filter paper coated with nanoparticles, which could easily follow the frequency and peak flow rate of human breathing. Adsorption followed by condensation of the water molecules of the humid air on the paper sensor during forced exhalation reduced the electrical resistance of the sensor, which was converted to an electrical signal for sensing. A micro-heater composed of a copper coil embedded in a polymer matrix helped in maintaining an optimal temperature on the sensor surface. Thus, water condensed on the sensor surface only during forcible breathing, and the sensor recovered rapidly after the exhalation was complete through rapid desorption of water molecules from the sensor surface. Two types of real-time monitoring units were integrated into the device, based on light emitting diodes (LEDs) and smart phones. The LED based unit displayed the diseased, critical, and fit conditions of the lungs by flashing LEDs of different colors. In comparison, for the mobile based monitoring unit, an application was developed employing open source software, which established wireless connectivity with the LFM-POCT device to perform the tests. Copyright © 2017 Elsevier B.V. All rights reserved.
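
    Breathing frequency can be read off the resistance trace with ordinary peak detection once each forced exhalation appears as a resistance drop. The signal model and thresholds below are illustrative, not the device's calibration:

      import numpy as np
      from scipy.signal import find_peaks

      fs = 20.0                                 # readout rate, Hz
      t = np.arange(0, 30, 1 / fs)
      # resistance dips on each exhalation (condensation), then recovers (desorption)
      r = 1e6 - 2e5 * np.maximum(np.sin(2 * np.pi * 0.25 * t), 0.0)

      drop = r.max() - r                        # exhalation signature as positive peaks
      peaks, _ = find_peaks(drop, height=1e5, distance=fs)   # >= 1 s between breaths
      print(60.0 * peaks.size / t[-1])          # ~16 breaths/min for the 0.25 Hz drive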

  19. Test Particle Simulations of Electron Injection by Bursty Bulk Flows (BBFs) using the High Resolution Lyon-Fedder-Mobarry (LFM) Code

    NASA Astrophysics Data System (ADS)

    Eshetu, W. W.; Lyon, J.; Wiltberger, M. J.; Hudson, M. K.

    2017-12-01

    Test particle simulations of electron injection by bursty bulk flows (BBFs) have been performed using a test particle tracer code [1] and the output fields of the Lyon-Fedder-Mobarry global magnetohydrodynamics (MHD) code [2]. The MHD code was run at high resolution (oct resolution) and with specified solar wind conditions so as to reproduce the observed qualitative picture of the BBFs [3]. Test particles were injected so that they interact with earthward propagating BBFs. The result of the simulation shows that electrons are pushed ahead of the BBFs and accelerated into the inner magnetosphere. Once electrons are in the inner magnetosphere, they are further energized by drift resonance with the azimuthal electric field. In addition, pitch angle scattering of electrons resulting in violation of the conservation of the first adiabatic invariant has been observed. The violation of the first adiabatic invariant occurs as electrons cross a weak magnetic field region with a strong gradient of the field perturbed by the BBFs. References: 1. Kress, B. T., Hudson, M. K., Looper, M. D., Albert, J., Lyon, J. G., and Goodrich, C. C. (2007), Global MHD test particle simulations of >10 MeV radiation belt electrons during storm sudden commencement, J. Geophys. Res., 112, A09215, doi:10.1029/2006JA012218. 2. Lyon, J. G., Fedder, J. A., and Mobarry, C. M. (2004), The Lyon-Fedder-Mobarry (LFM) Global MHD Magnetospheric Simulation Code, J. Atmos. Solar-Terrestrial Phys., 66(15-16), 1333-1350, doi:10.1016/j.jastp. 3. Wiltberger, M., Merkin, V., Lyon, J. G., and Ohtani, S. (2015), High-resolution global magnetohydrodynamic simulation of bursty bulk flows, J. Geophys. Res. Space Physics, 120, 4555-4566, doi:10.1002/2015JA021080.
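
    Tracers of this kind typically advance particles with the Boris scheme, which preserves energy exactly in a pure magnetic field; a minimal non-relativistic sketch follows (the cited tracer's actual integrator and units may differ):

      import numpy as np

      def boris_push(x, v, E, B, q, m, dt):
          # half electric kick, magnetic rotation, half electric kick
          t_vec = (q * B / m) * (dt / 2)
          s_vec = 2 * t_vec / (1 + np.dot(t_vec, t_vec))
          v_minus = v + (q * E / m) * (dt / 2)
          v_prime = v_minus + np.cross(v_minus, t_vec)
          v_plus = v_minus + np.cross(v_prime, s_vec)
          v_new = v_plus + (q * E / m) * (dt / 2)
          return x + v_new * dt, v_new

      # proton gyrating in a uniform 50 nT field: speed must stay constant
      q, m = 1.602e-19, 1.673e-27
      B, E = np.array([0.0, 0.0, 5e-8]), np.zeros(3)
      x, v = np.zeros(3), np.array([1e5, 0.0, 0.0])
      dt = 0.01 * 2 * np.pi * m / (q * np.linalg.norm(B))   # 1% of a gyroperiod
      for _ in range(1000):
          x, v = boris_push(x, v, E, B, q, m, dt)
      print(np.linalg.norm(v))                              # 1e5 m/s, unchanged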

  1. Interaction of the geomagnetic field with northward interplanetary magnetic field

    NASA Astrophysics Data System (ADS)

    Bhattarai, Shree Krishna

    The interaction of the solar wind with Earth's magnetic field causes the transfer of momentum and energy from the solar wind to geospace. The study of this interaction is gaining significance as our society becomes more and more space-based, due to which predicting space weather has become more important. The solar wind interacts with the geomagnetic field primarily via two processes: viscous interaction and magnetic reconnection. Both of these interactions result in the generation of an electric field in Earth's ionosphere. The overall topology and dynamics of the magnetosphere, as well as the electric field imposed on the ionosphere, vary with the speed, density, and magnetic field orientation of the solar wind, as well as with the conductivity of the ionosphere. In this dissertation, I will examine the role of northward interplanetary magnetic field (IMF) and discuss the global topology of the magnetosphere and the interaction with the ionosphere using results obtained from the Lyon-Fedder-Mobarry (LFM) simulation. The electric potentials imposed on the ionosphere due to viscous interaction and magnetic reconnection are called the viscous and the reconnection potentials, respectively. A proxy to measure the overall effect of these potentials is the cross polar potential (CPP), defined as the difference between the maximum and the minimum of the potential in a given polar ionosphere. I will show results from the LFM simulation demonstrating saturation of the CPP during periods of purely northward IMF of sufficiently large magnitude. I will further show that the viscous potential, which was assumed to be independent of IMF orientation until this work, is reduced during periods of northward IMF. Furthermore, I will also discuss the implications of these results for a simulation of an entire solar rotation.
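
    The CPP defined above is simply the spread of the ionospheric potential over one polar cap; a toy sketch with an illustrative two-cell convection pattern:

      import numpy as np

      def cross_polar_potential(phi):
          # CPP: max minus min of the potential (V) over the polar-cap grid
          return phi.max() - phi.min()

      lon = np.linspace(0, 2 * np.pi, 96)
      lat_deg = np.linspace(50, 90, 32)
      LON, LAT = np.meshgrid(lon, lat_deg)
      phi = 30e3 * np.sin(LON) * np.exp(-((LAT - 75) / 8) ** 2)   # two-cell pattern
      print(cross_polar_potential(phi) / 1e3, "kV")               # ~60 kV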

  2. Ten Commandments of Formal Methods...Ten Years Later

    NASA Technical Reports Server (NTRS)

    Bowen, Jonathan P.; Hinchey, Michael G.

    2006-01-01

    More than a decade ago, in "Ten Commandments of Formal Methods," we offered practical guidelines for projects that sought to use formal methods. Over the years, the article, which was based on our knowledge of successful industrial projects, has been widely cited and has generated much positive feedback. However, despite this apparent enthusiasm, formal methods use has not greatly increased, and some of the same attitudes about the infeasibility of adopting them persist. Formal methodists believe that introducing greater rigor will improve the software development process and yield software with better structure, greater maintainability, and fewer errors.

  3. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Keywords: formal methods, software verification, model-based... Domain specific languages (DSLs) drive both implementation and formal verification

  4. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  5. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  6. Formal Methods Case Studies for DO-333

    NASA Technical Reports Server (NTRS)

    Cofer, Darren; Miller, Steven P.

    2014-01-01

    RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.

  7. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first-principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from the atmospheric and oceanic sciences that have proved useful for space physics applications, such as a computationally efficient implementation of the Kalman filter in radiation belt modeling, solar wind gap-filling by Singular Spectrum Analysis, and a low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will also be demonstrated.

  8. A brief overview of NASA Langley's research program in formal methods

    NASA Technical Reports Server (NTRS)

    1992-01-01

    An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.

  9. Formal Assurance Certifiable Tooling Strategy Final Report

    NASA Technical Reports Server (NTRS)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  10. NASA Langley Research and Technology-Transfer Program in Formal Methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.

    1995-01-01

    This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.

  11. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  12. Why are Formal Methods Not Used More Widely?

    NASA Technical Reports Server (NTRS)

    Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.

    1997-01-01

    Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.

  13. SAR System for UAV Operation with Motion Error Compensation beyond the Resolution Cell

    PubMed Central

    González-Partida, José-Tomás; Almorox-González, Pablo; Burgos-García, Mateo; Dorta-Naranjo, Blas-Pablo

    2008-01-01

    This paper presents an experimental Synthetic Aperture Radar (SAR) system that is under development in the Universidad Politécnica de Madrid. The system uses Linear Frequency Modulated Continuous Wave (LFM-CW) radar with a two antenna configuration for transmission and reception. The radar operates in the millimeter-wave band with a maximum transmitted bandwidth of 2 GHz. The proposed system is being developed for Unmanned Aerial Vehicle (UAV) operation. Motion errors in UAV operation can be critical. Therefore, this paper proposes a method for focusing SAR images with movement errors larger than the resolution cell. Typically, this problem is solved using two processing steps: first, coarse motion compensation based on the information provided by an Inertial Measuring Unit (IMU); and second, fine motion compensation for the residual errors within the resolution cell based on the received raw data. The proposed technique tries to focus the image without using data of an IMU. The method is based on a combination of the well known Phase Gradient Autofocus (PGA) for SAR imagery and typical algorithms for translational motion compensation on Inverse SAR (ISAR). This paper shows the first real experiments for obtaining high resolution SAR images using a car as a mobile platform for our radar. PMID:27879884

  14. SAR System for UAV Operation with Motion Error Compensation beyond the Resolution Cell.

    PubMed

    González-Partida, José-Tomás; Almorox-González, Pablo; Burgos-Garcia, Mateo; Dorta-Naranjo, Blas-Pablo

    2008-05-23

    This paper presents an experimental Synthetic Aperture Radar (SAR) system that is under development in the Universidad Politécnica de Madrid. The system uses Linear Frequency Modulated Continuous Wave (LFM-CW) radar with a two antenna configuration for transmission and reception. The radar operates in the millimeter-wave band with a maximum transmitted bandwidth of 2 GHz. The proposed system is being developed for Unmanned Aerial Vehicle (UAV) operation. Motion errors in UAV operation can be critical. Therefore, this paper proposes a method for focusing SAR images with movement errors larger than the resolution cell. Typically, this problem is solved using two processing steps: first, coarse motion compensation based on the information provided by an Inertial Measuring Unit (IMU); and second, fine motion compensation for the residual errors within the resolution cell based on the received raw data. The proposed technique tries to focus the image without using data of an IMU. The method is based on a combination of the well known Phase Gradient Autofocus (PGA) for SAR imagery and typical algorithms for translational motion compensation on Inverse SAR (ISAR). This paper shows the first real experiments for obtaining high resolution SAR images using a car as a mobile platform for our radar.
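
    For readers unfamiliar with the waveform named above, the following sketch generates a baseband LFM chirp and dechirps a delayed echo so that target range appears as a beat frequency. It is a generic textbook illustration with scaled-down, illustrative parameters, not the authors' processing chain:

        import numpy as np

        fs = 50e6                    # sample rate, Hz (illustrative)
        T = 1e-3                     # sweep duration, s
        bw = 2e6                     # sweep bandwidth, Hz (stand-in for the 2 GHz system)
        k = bw / T                   # chirp rate, Hz/s
        t = np.arange(0, T, 1 / fs)

        tx = np.exp(1j * np.pi * k * t**2)           # transmitted LFM chirp
        tau = 2 * 150.0 / 3e8                        # two-way delay, target at 150 m
        rx = np.exp(1j * np.pi * k * (t - tau)**2)   # noise-free delayed echo

        beat = rx * np.conj(tx)                      # dechirp: beat frequency f_b = k * tau
        f = np.fft.fftfreq(t.size, 1 / fs)
        f_b = abs(f[np.argmax(np.abs(np.fft.fft(beat)))])
        print(3e8 * f_b / (2 * k))                   # recovered range, ~150 m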

  15. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  16. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  17. Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods

    NASA Technical Reports Server (NTRS)

    Bowen, Jonathan P.; Hinchey, Michael G.

    2005-01-01

    Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and used as required reading in a number of formal methods courses. However, not all have agreed with some of our commandments, feeling that they may not be valid in the long-term. We re-examine the original commandments ten years on, and consider their validity in the light of a further decade of industrial best practice and experiences.

  18. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  19. Tethered Lubricants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Lynden

    We have performed extensive experimental and theoretical studies of interfacial friction, relaxation dynamics, and thermodynamics of polymer chains tethered to points, planes, and particles. A key result from our tribology studies using lateral force microscopy (LFM) measurements of polydisperse brushes of linear and branched chains densely grafted to planar substrates is that there are exceedingly low friction coefficients for these systems. Specific project achievements include: (1) synthesis of three-tiered lubricant films containing controlled amounts of free and pendent PDMS chains, and investigation of the effect of their molecular weight and volume fraction on interfacial friction; (2) detailed studies of a family of hairy particles termed nanoscale organic hybrid materials (NOHMs) and demonstration of their use as lubricants.

  20. Spectral reconstruction of signals from periodic nonuniform subsampling based on a Nyquist folding scheme

    NASA Astrophysics Data System (ADS)

    Jiang, Kaili; Zhu, Jun; Tang, Bin

    2017-12-01

    Periodic nonuniform sampling occurs in many applications, and the Nyquist folding receiver (NYFR) is an efficient, low-complexity, broadband spectrum sensing architecture. In this paper, we first derive that the radio frequency (RF) sample clock function of the NYFR is periodic nonuniform. Then, the classical results of periodic nonuniform sampling are applied to the NYFR. We extend the spectral reconstruction algorithm of the time-series decomposed model to the subsampling case by using the spectrum characteristics of the NYFR. The subsampling case is common in broadband spectrum surveillance. Finally, we take an LFM signal with large bandwidth as an example to verify the proposed algorithm, and compare the spectral reconstruction algorithm with the orthogonal matching pursuit (OMP) algorithm.
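
    As background for the comparison mentioned above, the following is a generic orthogonal matching pursuit (OMP) loop for recovering a sparse spectrum from subsamples. It is the standard textbook algorithm, not the authors' implementation; the dictionary A would encode the NYFR sampling model:

        import numpy as np

        def omp(A, y, k):
            """Recover a k-sparse x with y ~ A @ x (k >= 1).

            A : (m, n) dictionary whose columns model candidate spectral atoms
            y : (m,) measurements, e.g. periodic nonuniform subsamples
            """
            residual = y.astype(complex).copy()
            support = []
            x = np.zeros(A.shape[1], dtype=complex)
            for _ in range(k):
                # pick the atom most correlated with the current residual
                support.append(int(np.argmax(np.abs(A.conj().T @ residual))))
                # least-squares fit on the support, then update the residual
                coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                residual = y - A[:, support] @ coef
            x[support] = coef
            return x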

  1. ELF Communications System Ecological Monitoring Program: Electromagnetic Field Measurements and Engineering Support

    DTIC Science & Technology

    1994-04-01

    FIGURE E-6. MEASUREMENT POINT AT TRANSMISSION LINE 5C15-1 (In River).

  2. Local flow management/profile descent algorithm. Fuel-efficient, time-controlled profiles for the NASA TSRV airplane

    NASA Technical Reports Server (NTRS)

    Groce, J. L.; Izumi, K. H.; Markham, C. H.; Schwab, R. W.; Thompson, J. L.

    1986-01-01

    The Local Flow Management/Profile Descent (LFM/PD) algorithm designed for the NASA Transport System Research Vehicle program is described. The algorithm provides fuel-efficient altitude and airspeed profiles consistent with ATC restrictions in a time-based metering environment over a fixed ground track. The model design constraints include accommodation of both published profile descent procedures and unpublished profile descents, incorporation of fuel efficiency as a flight profile criterion, operation within the performance capabilities of the Boeing 737-100 airplane with JT8D-7 engines, and conformity to standard air traffic navigation and control procedures. Holding and path stretching capabilities are included for long delay situations.

  3. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  4. Two-Step Formal Advertisement: An Examination.

    DTIC Science & Technology

    1976-10-01

    The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on...Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of...formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition

  5. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  6. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  7. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction: Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  8. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
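
    To make the category-partition idea concrete, the sketch below enumerates test frames from partitioned categories and prunes combinations with a simple constraint. The categories and the rule are toy illustrations, not the test representation language described in the paper:

        from itertools import product

        # Categories and their partitioned choices (illustrative only)
        categories = {
            "file_state": ["absent", "empty", "populated"],
            "access_mode": ["read", "write"],
            "user": ["owner", "other"],
        }

        def allowed(frame):
            # Constraint: only the owner may write (illustrative rule)
            return not (frame["access_mode"] == "write" and frame["user"] == "other")

        names = list(categories)
        frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
        test_cases = [f for f in frames if allowed(f)]
        # 3 * 2 * 2 = 12 raw combinations reduced to the logical subset (9 here)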

  9. Coupled storm-time magnetosphere-ionosphere-thermosphere simulations including microscopic ionospheric turbulence

    NASA Astrophysics Data System (ADS)

    Merkin, V. G.; Wiltberger, M. J.; Zhang, B.; Liu, J.; Wang, W.; Dimant, Y. S.; Oppenheim, M. M.; Lyon, J.

    2017-12-01

    During geomagnetic storms the magnetosphere-ionosphere-thermosphere system becomes activated in ways that are unique to disturbed conditions. This leads to emergence of physical feedback loops that provide tighter coupling between the system elements, often operating across disparate spatial and temporal scales. One such process that has recently received renewed interest is the generation of microscopic ionospheric turbulence in the electrojet regions (electrojet turbulence, ET) that results from strong convective electric fields imposed by the solar wind-magnetosphere interaction. ET leads to anomalous electron heating and generation of non-linear Pedersen current - both of which result in significant increases in effective ionospheric conductances. This, in turn, provides strong non-linear feedback on the magnetosphere. Recently, our group has published two studies aiming at a comprehensive analysis of the global effects of this microscopic process on the magnetosphere-ionosphere-thermosphere system. In one study, ET physics was incorporated in the TIEGCM model of the ionosphere-thermosphere. In the other study, ad hoc corrections to the ionospheric conductances based on ET theory were incorporated in the conductance module of the Lyon-Fedder-Mobarry (LFM) global magnetosphere model. In this presentation, we make the final step toward the full coupling of the microscopic ET physics within our global coupled model including LFM, the Rice Convection Model (RCM) and TIEGCM. To this end, ET effects are incorporated in the TIEGCM model and propagate throughout the system via thus modified TIEGCM conductances. The March 17, 2013 geomagnetic storm is used as a testbed for these fully coupled simulations, and the results of the model are compared with various ionospheric and magnetospheric observatories, including DMSP, AMPERE, and Van Allen Probes. Via these comparisons, we investigate, in particular, the ET effects on the global magnetosphere indicators such as the strength of the ionospheric convection, field-aligned current densities and ring current pressure amplitude and distribution.
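
    Schematically, the ad hoc conductance correction mentioned above can be pictured as an enhancement of the Pedersen conductance wherever the convection electric field exceeds the instability threshold. This is a cartoon of the idea only; the threshold, gain, and functional form below are illustrative assumptions, and the actual ET parameterization in the cited studies is more involved:

        import numpy as np

        E_THRESHOLD = 0.02  # V/m; rough electrojet-instability onset (illustrative)

        def et_corrected_sigma_p(sigma_p, E_mag, gain=0.1):
            """Enhance Pedersen conductance where |E| exceeds the threshold.

            sigma_p : baseline Pedersen conductance array, S
            E_mag   : convection electric field magnitude array, V/m
            gain    : illustrative sensitivity to the excess field
            """
            excess = np.maximum(E_mag - E_THRESHOLD, 0.0) / E_THRESHOLD
            return sigma_p * (1.0 + gain * excess)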

  10. Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.

    2006-01-01

    NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time make it more difficult to design such systems and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.

  11. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Spacestation Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods treatment, by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Spacestation Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English-version Software Requirements Specification (SRS) is reproduced in Appendix A.

  12. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  13. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  14. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing for consistency properties of a partial model of requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  15. General Formalism of Mass Scaling Approach for Replica-Exchange Molecular Dynamics and its Application

    NASA Astrophysics Data System (ADS)

    Nagai, Tetsuro

    2017-01-01

    Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories of a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method that allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the novel formalism is the generalized rules for the velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, the refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen, J. Chem. Phys. 132, 144109 (2010)] is presented. Numerical results are provided using a pilot system, demonstrating easier and more optimized applicability of the new version of V-REMD as well as the importance of adherence to the generalized velocity scaling rules. With the new formalism, more sound and efficient simulations will be performed.
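
    One plausible minimal reading of the generalized scaling rule described above, assuming only per-particle mass scaling between ensembles, is that velocities after an accepted exchange are rescaled so each particle remains consistent with the Maxwell-Boltzmann distribution of its new temperature and mass; the paper should be consulted for the exact rules:

        import numpy as np

        def rescale_after_exchange(v, m_old, m_new, T_old, T_new):
            """Rescale velocities after an accepted replica exchange.

            Keeps v consistent with Maxwell-Boltzmann statistics when both
            the temperature and the scaled per-particle mass change
            (an assumed form, not quoted from the paper):
                v' = v * sqrt((T_new * m_old) / (T_old * m_new))

            v : (N, 3) velocities; m_old, m_new : (N,) masses; T_* : floats
            """
            factor = np.sqrt((T_new * m_old) / (T_old * m_new))
            return v * factor[:, None]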

  16. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  17. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understand the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, and a prototype tool to support it is described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  18. Bruton tyrosine kinase inhibitors: a promising novel targeted treatment for B cell lymphomas

    PubMed Central

    Aalipour, Amin; Advani, Ranjana H.

    2015-01-01

    Constitutive or aberrant signalling of the B cell receptor signalling cascade has been implicated in the propagation and maintenance of a variety of B cell malignancies. Small molecule inhibitors of Bruton tyrosine kinase (BTK), a protein early in this cascade and specifically expressed in B cells, have emerged as a new class of targeted agents. There are several BTK inhibitors, including ONO-WG-307, LFM-A13, dasatinib, CC-292, and PCI-32765 (ibrutinib), in preclinical and/or clinical development, of which ibrutinib is currently in phase III trials. Recent clinical data suggest significant activity of ibrutinib as a first-in-class oral inhibitor of BTK. This review provides an overview of ongoing clinical studies of BTK inhibitors. PMID:24111579

  19. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  20. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  1. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
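
    As an illustration of the decision-table style of specification discussed above (a generic toy example, not ORA's Tablewise notation), each rule pairs a set of conditions with an action, and a missing match exposes an incomplete table:

        # Each rule: (conditions, action). A condition value of None means
        # "don't care". First matching rule wins (toy semantics).
        RULES = [
            ({"altitude_ok": False, "autopilot": None}, "pull_up_warning"),
            ({"altitude_ok": True, "autopilot": True}, "hold_course"),
            ({"altitude_ok": True, "autopilot": False}, "manual_control"),
        ]

        def decide(inputs):
            for conditions, action in RULES:
                if all(v is None or inputs[k] == v for k, v in conditions.items()):
                    return action
            raise ValueError("incomplete table: no rule matches %r" % inputs)

        assert decide({"altitude_ok": False, "autopilot": True}) == "pull_up_warning"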

  2. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  3. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior for use in graphical model inference. Our first model, called the Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms, and protein domain data, and is flexible enough to integrate new sources, if available.
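
    The Noisy-OR combination named above has a standard closed form: an edge is supported unless every source independently fails to support it. A minimal sketch with hypothetical per-source reliability weights (not the authors' implementation):

        import numpy as np

        def noisy_or_prior(source_probs, reliabilities):
            """Combine edge-confidence matrices from several knowledge sources.

            source_probs  : list of (n, n) arrays, P(edge) per source
            reliabilities : list of scalars in (0, 1], trust per source
            Returns an (n, n) consensus prior:
                P = 1 - prod_s (1 - r_s * p_s)
            """
            no_support = np.ones_like(source_probs[0], dtype=float)
            for p, r in zip(source_probs, reliabilities):
                no_support *= 1.0 - r * p
            return 1.0 - no_support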

  4. Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.

    PubMed

    Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2016-06-01

    Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
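
    One way to picture the first method above, formal transition knowledge constraining a learned classifier, is the sketch below. The phase names, stand-in features, and masking scheme are illustrative assumptions, not the authors' exact architecture:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        PHASES = ["preparation", "resection", "closure"]
        ALLOWED = {  # formal knowledge: phases reachable from each phase
            "preparation": {"preparation", "resection"},
            "resection": {"resection", "closure"},
            "closure": {"closure"},
        }

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 5))               # stand-in sensor features
        y = rng.integers(0, len(PHASES), size=300)  # stand-in phase labels
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

        def recognize(features, current_phase):
            proba = clf.predict_proba([features])[0]
            # zero out transitions the ontology forbids, then take the best
            for i, label in enumerate(clf.classes_):
                if PHASES[label] not in ALLOWED[current_phase]:
                    proba[i] = 0.0
            return PHASES[clf.classes_[int(np.argmax(proba))]]

        print(recognize(rng.normal(size=5), "resection"))  # never "preparation"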

  5. A Natural Language Interface Concordant with a Knowledge Base.

    PubMed

    Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young

    2016-01-01

    The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time, and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high, the proposed method rejects the question and does not answer it. Multi-predicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
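
    The matching-with-rejection step described above might look like the following sketch. The expressions, query strings, similarity measure, and threshold are hypothetical stand-ins; the paper's matching model is more sophisticated:

        from difflib import SequenceMatcher

        # Pre-collected NL expressions, each in one-to-one correspondence
        # with a formal query generated from the knowledge-base graph
        EXPRESSIONS = {
            "who directed {film}": "SELECT ?d WHERE { {film} :directedBy ?d }",
            "when was {film} released": "SELECT ?y WHERE { {film} :releaseYear ?y }",
        }

        THRESHOLD = 0.75  # below this confidence the question is rejected

        def translate(question):
            best_expr, best_score = None, 0.0
            for expr in EXPRESSIONS:
                score = SequenceMatcher(None, question.lower(), expr).ratio()
                if score > best_score:
                    best_expr, best_score = expr, score
            if best_score < THRESHOLD:
                return None  # reject: not answerable from the knowledge base
            return EXPRESSIONS[best_expr]

        print(translate("who directed {film}"))  # exact match -> its formal query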

  6. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  7. State-vector formalism and the Legendre polynomial solution for modelling guided waves in anisotropic plates

    NASA Astrophysics Data System (ADS)

    Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin

    2018-01-01

    We present a numerical method to solve the phase dispersion curves of general anisotropic plates. This approach involves an exact solution to the problem in the form of Legendre polynomials of multiple integrals, which we substitute into the state-vector formalism. In order to improve the efficiency of the proposed method, we made a special effort to demonstrate the analytical methodology. Furthermore, we analyze the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method is the expansion of field quantities by Legendre polynomials. The Legendre polynomial method avoids solving the transcendental dispersion equation, which can only be solved numerically. The state-vector formalism combined with the Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We then illustrate the theoretical solutions of the dispersion curves obtained by this method for isotropic and anisotropic plates. Finally, we compare the proposed method with the global matrix method (GMM), which shows excellent agreement.
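
    The expansion underlying the method can be written, for a plate of thickness h with through-thickness coordinate x_3, in the standard form used in the Legendre-polynomial literature (notation reconstructed for illustration, not quoted from the paper):

        u_i(x_3) \approx \sum_{n=0}^{N} c_n^{(i)} \, P_n\!\left(\frac{2 x_3}{h}\right),
        \qquad -\frac{h}{2} \le x_3 \le \frac{h}{2},

    where P_n is the n-th Legendre polynomial and the coefficients c_n^{(i)} become the unknowns of the discretized state-vector system, so no transcendental dispersion equation has to be solved.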

  8. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  9. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  10. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  11. The Development of Introductory Statistics Students' Informal Inferential Reasoning and Its Relationship to Formal Inferential Reasoning

    ERIC Educational Resources Information Center

    Jacob, Bridgette L.

    2013-01-01

    The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…

  12. Why Engineers Should Consider Formal Methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1997-01-01

    This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.

  13. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into that safety case.

  14. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms are cited as showing that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  16. Simplification of the time-dependent generalized self-interaction correction method using two sets of orbitals: Application of the optimized effective potential formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messud, J.; Dinh, P. M.; Suraud, Eric

    2009-10-15

    We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent 'generalized SIC-OEP'. A straightforward approximation, using the spatial localization of one set of orbitals, leads to the 'generalized SIC-Slater' formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.

  17. Simplification of the time-dependent generalized self-interaction correction method using two sets of orbitals: Application of the optimized effective potential formalism

    NASA Astrophysics Data System (ADS)

    Messud, J.; Dinh, P. M.; Reinhard, P.-G.; Suraud, Eric

    2009-10-01

    We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent “generalized SIC-OEP.” A straightforward approximation, using the spatial localization of one set of orbitals, leads to the “generalized SIC-Slater” formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.

  18. A partially folded structure of amyloid-beta(1-40) in an aqueous environment.

    PubMed

    Vivekanandan, Subramanian; Brender, Jeffrey R; Lee, Shirley Y; Ramamoorthy, Ayyalusamy

    2011-07-29

    Aggregation of the Aβ(1-40) peptide is linked to the development of the extracellular plaques characteristic of Alzheimer's disease. While previous studies commonly show that Aβ(1-40) is largely unstructured in solution, we show that Aβ(1-40) can adopt a compact, partially folded structure. In this structure (PDB ID: 2LFM), the central hydrophobic region of the peptide forms a 3(10) helix from H13 to D23, and the N- and C-termini collapse against the helix due to the clustering of hydrophobic residues. Helical intermediates have been predicted to be crucial on-pathway intermediates in amyloid fibrillogenesis, and the structure reported here provides a new target for investigation of early events in Aβ(1-40) fibrillogenesis. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Comparison of LFM-test particle simulations and radial diffusion models of radiation belt electron injection into the slot region

    NASA Astrophysics Data System (ADS)

    Chu, F.; Hudson, M.; Kress, B.

    2008-12-01

    The physics-based Lyon-Fedder-Mobarry (LFM) code simulates Earth's magnetospheric topology and dynamics by solving the equations of ideal MHD using input solar wind parameters at the upstream boundary. Comparison with electron phase space density evolution during storms using a radial diffusion code, as well as spacecraft measurements where available, will tell us when diffusion is insufficiently accurate for radiation belt simulation, for example, during CME-shock injection events like that of March 24, 1991, which occurred on MeV electron drift time scales of minutes (Li et al., 1993). The 2004 July and 2004 November storms, comparable in depth of penetration into the slot region to the Halloween 2003 storm, have been modeled with both approaches. The November 8, 2004 storm was preceded by a Storm Sudden Commencement (SSC) produced by a CME-shock followed by minimum Dst = -373 nT, while the July 23 to July 28 storm interval had milder consecutive drops in Dst, corresponding to multiple CME shocks and southward IMF Bz turnings. We have run the November and July storms with LFM using ACE data as upstream input, running the July storm with lower temporal resolution over a longer time interval. The November storm was different because the SSC shock was unusually intense; therefore the possibility exists of drift-time-scale acceleration by the associated magnetosonic impulse produced by the shock, as in the March 1991 and Halloween 2003 events (Kress et al., 2007). It can then take a short time (minutes) for electrons to be transported to low L shells while conserving their first invariant, resulting in a peak in energy and phase space density in the slot region. Radial diffusion suffices for some storm periods, like the July 2004 sequence of three storms, while the guiding center test particle simulation in MHD fields is necessary to describe prompt injections which occur faster than diffusive time scales, for which November 2004 is a likely candidate. Earlier examples have been studied, including the Kress et al. (2007) study of the Halloween 2003 storm and the Li et al. (1993) study of the March 24, 1991 injection event, with MHD simulation carried out by Elkington et al. (2002) for this event. Radial diffusion remains the best approach for extended, relatively quiet periods like the two-month interval following the March 1991 prompt injection. Strong shocks will inject particles into lower L shells within a few minutes, violating the third adiabatic invariant, so the diffusion mechanism cannot be adopted for sudden commencements, when Dst increases then decreases drastically; however, particle tracing in time-dependent MHD fields will give an accurate estimate, so radial diffusion and particle tracing in MHD fields complement each other in radiation belt studies. Elkington, S. R., M. K. Hudson, M. J. Wiltberger, J. G. Lyon (2002), JASTP, 64, p. 607-615; Kress, B. T., M. K. Hudson, M. D. Looper, J. Albert, J. G. Lyon, C. C. Goodrich (2007), J. Geophys. Res., 112, A09215, doi:10.1029/2006JA012218; Li, X., I. Roth, M. Temerin, J. R. Wygant, M. K. Hudson, and J. B. Blake (1993), Geophys. Res. Lett., 20, p. 2423-2426.
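
    For reference, the radial diffusion side of such a comparison can be sketched in a few lines. The toy integration below is ours, not the authors'; the diffusion coefficient follows a Brautigam and Albert (2000)-style Kp parameterization, and the initial and boundary conditions are invented:

      import numpy as np

      # df/dt = L^2 d/dL ( D_LL L^-2 df/dL ), explicit finite differences
      L = np.linspace(1.5, 6.5, 101)
      dL = L[1] - L[0]
      Kp = 6.0
      Lmid = 0.5 * (L[1:] + L[:-1])                        # cell-face midpoints
      D_face = 10.0 ** (0.506 * Kp - 9.325) * Lmid ** 10   # 1/day (assumed scaling)

      f = np.exp(-((L - 5.5) / 0.4) ** 2)   # PSD initially peaked near the outer belt
      dt, nstep = 1e-5, 50000               # half a day; explicit scheme needs small dt
      for _ in range(nstep):
          flux = D_face / Lmid ** 2 * np.diff(f) / dL
          f[1:-1] += dt * L[1:-1] ** 2 * np.diff(flux) / dL   # boundaries held fixed

      print("PSD at L = 3 after 0.5 day: %.3g" % f[np.argmin(np.abs(L - 3.0))])

    A sudden-commencement injection on drift time scales has no counterpart in this picture, which is why test-particle tracing in time-dependent MHD fields is needed for events like November 2004.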

  20. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time make them more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  1. Comparison of the iterated equation of motion approach and the density matrix formalism for the quantum Rabi model

    NASA Astrophysics Data System (ADS)

    Kalthoff, Mona; Keim, Frederik; Krull, Holger; Uhrig, Götz S.

    2017-05-01

    The density matrix formalism and the equation of motion approach are two semi-analytical methods that can be used to compute the non-equilibrium dynamics of correlated systems. While for a bilinear Hamiltonian both formalisms yield the exact result, for any non-bilinear Hamiltonian a truncation is necessary. Because the commonly used truncation schemes differ for these two methods, the accuracy of the obtained results depends significantly on the chosen approach. In this paper, both formalisms are applied to the quantum Rabi model. This allows us to compare the approximate results with the exact dynamics of the system and enables us to discuss the accuracy of the approximations as well as the advantages and disadvantages of both methods. It is shown to what extent the results fulfill physical requirements for the observables and which properties of the methods lead to unphysical results.
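
    Because the Rabi Hamiltonian is small once the boson space is truncated, the exact benchmark dynamics used for such comparisons can be generated directly. A minimal sketch (ours; the cutoff, parameters, and initial state are assumed for illustration):

      import numpy as np

      # Quantum Rabi model H = w a†a + (w0/2) sigma_z + g (a + a†) sigma_x,
      # diagonalized exactly in a truncated Fock space.
      N = 40                                        # bosonic cutoff; enlarge to check convergence
      a = np.diag(np.sqrt(np.arange(1, N)), 1)      # annihilation operator
      sz = np.diag([1.0, -1.0])
      sx = np.array([[0.0, 1.0], [1.0, 0.0]])

      w, w0, g = 1.0, 1.0, 0.3                      # resonant, moderately coupled (assumed)
      H = (w * np.kron(a.T @ a, np.eye(2))
           + 0.5 * w0 * np.kron(np.eye(N), sz)
           + g * np.kron(a + a.T, sx))

      E, V = np.linalg.eigh(H)
      psi0 = np.zeros(2 * N); psi0[0] = 1.0         # vacuum photon state, qubit up
      c = V.T @ psi0                                # expansion in eigenbasis (V is real)
      Sz = np.kron(np.eye(N), sz)
      for t in np.linspace(0.0, 20.0, 5):
          psi = V @ (np.exp(-1j * E * t) * c)
          print("t = %5.1f   <sigma_z> = %+.4f" % (t, (psi.conj() @ Sz @ psi).real))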

  2. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    DTIC Science & Technology

    2017-04-17

    Keywords: cyberphysical systems, formal methods, requirements patterns, AADL, Assume Guarantee Reasoning Environment. Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed…

  3. Dominant partition method. [based on a wave function formalism]

    NASA Technical Reports Server (NTRS)

    Dixon, R. M.; Redish, E. F.

    1979-01-01

    By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM), is developed for obtaining few-body reductions of the many-body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many-body problem to a fewer-body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few-body reaction mechanism prevails.

  4. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  5. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  6. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset. This in turn can improve confidence that the system reflects the requirements, reduce system development time, and reduce the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  7. NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute for Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to bring the full power of an advanced theorem prover to bear. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.

  8. A comparison between state-specific and linear-response formalisms for the calculation of vertical electronic transition energy in solution with the CCSD-PCM method.

    PubMed

    Caricato, Marco

    2013-07-28

    The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.

  9. IDEF3 Formalization Report

    DTIC Science & Technology

    1991-10-01

    Subject terms: engineering management, information systems, method formalization, information engineering, process modeling, information systems requirements definition methods, knowledge acquisition methods, systems engineering.

  10. Assumed oxygen consumption based on calculation from dye dilution cardiac output: an improved formula.

    PubMed

    Bergstra, A; van Dijk, R B; Hillege, H L; Lie, K I; Mook, G A

    1995-05-01

    This study was performed because of observed differences between dye dilution cardiac output and the Fick cardiac output, calculated from estimated oxygen consumption according to LaFarge and Miettinen, and to find a better formula for assumed oxygen consumption. In 250 patients who underwent left and right heart catheterization, the oxygen consumption VO2 (ml.min-1) was calculated using Fick's principle. Either pulmonary or systemic flow, as measured by dye dilution, was used in combination with the concordant arteriovenous oxygen concentration difference. In 130 patients, who matched the age of the LaFarge and Miettinen population, the obtained values of oxygen consumption VO2(dd) were compared with the estimated oxygen consumption values VO2(lfm), found using the LaFarge and Miettinen formulae. The VO2(lfm) was significantly lower than VO2(dd); -21.8 +/- 29.3 ml.min-1 (mean +/- SD), P < 0.001, 95% confidence interval (95% CI) -26.9 to -16.7, limits of agreement (LA) -80.4 to 36.9. A new regression formula for the assumed oxygen consumption VO2(ass) was derived in 250 patients by stepwise multiple regression analysis. The VO2(dd) was used as the dependent variable, and body surface area BSA (m2), Sex (0 for female, 1 for male), Age (years), Heart rate (min-1), and the presence of a left-to-right shunt as independent variables. The best-fitting formula is expressed as: VO2(ass) = (157.3 x BSA + 10.0 x Sex - 10.5 x ln Age + 4.8) ml.min-1, where ln Age = the natural logarithm of the age. This formula was validated prospectively in 60 patients. A non-significant difference between VO2(ass) and VO2(dd) was found; mean 2.0 +/- 23.4 ml.min-1, P = 0.771, 95% CI -4.0 to +8.0, LA -44.7 to +48.7. In conclusion, assumed oxygen consumption values obtained using our new formula are in better agreement with the actual values than those found according to LaFarge and Miettinen's formulae.
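
    Transcribed directly into code, the new formula reads as follows (a minimal sketch; the example patient values are invented):

      import math

      def vo2_assumed(bsa, sex, age):
          """Assumed oxygen consumption VO2(ass) in ml/min.

          bsa: body surface area (m2); sex: 0 female, 1 male; age: years.
          """
          return 157.3 * bsa + 10.0 * sex - 10.5 * math.log(age) + 4.8

      # e.g. a 55-year-old man with BSA 1.9 m2: about 272 ml/min
      print("%.1f ml/min" % vo2_assumed(1.9, 1, 55))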

  11. Optimization of sparse synthetic transmit aperture imaging with coded excitation and frequency division.

    PubMed

    Behar, Vera; Adam, Dan

    2005-12-01

    An effective aperture approach is used for optimization of a sparse synthetic transmit aperture (STA) imaging system with coded excitation and frequency division. A new two-stage algorithm is proposed for optimization of both the positions of the transmit elements and the weights of the receive elements. In order to increase the signal-to-noise ratio in a synthetic aperture system, temporal encoding of the excitation signals is employed. When comparing excitation by linear frequency modulation (LFM) signals and phase shift key modulation (PSKM) signals, the analysis shows that chirps are better for excitation, since at the output of a compression filter the sidelobes generated are much smaller than those produced by the binary PSKM signals. Here, an implementation of fast STA imaging is studied, using spatial encoding with frequency division of the LFM signals. The proposed system employs a 64-element array with only four active elements used during transmit. The two-dimensional point spread function (PSF) produced by such a sparse STA system is compared to the PSF produced by an equivalent phased array system, using the Field II simulation program. The analysis demonstrates the superiority of the new sparse STA imaging system using coded excitation and frequency division. Compared to a conventional phased array imaging system, this system acquires images of equivalent quality 60 times faster, when the transmit elements are fired in pairs consecutively and the power level used during transmit is very low. The fastest acquisition time is achieved when all transmit elements are fired simultaneously, which improves detectability, but at the cost of a slight degradation of the axial resolution. In real-time implementation, however, it must be borne in mind that the frame rate of a STA imaging system depends not only on the acquisition time of the data but also on the processing time needed for image reconstruction. Compared to phased array imaging, a significant increase in the frame rate of a STA imaging system is possible if and only if an equivalently time-efficient algorithm is used for image reconstruction.
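
    The core of the coded-excitation step, chirp generation followed by matched-filter compression, can be sketched as follows (ours, not from the paper; the pulse parameters are assumed and no array geometry is modeled):

      import numpy as np

      fs, T, B, f0 = 50e6, 10e-6, 2e6, 3e6   # sample rate, duration, bandwidth, centre (assumed)
      t = np.arange(int(fs * T)) / fs
      chirp = np.exp(2j * np.pi * ((f0 - B / 2) * t + 0.5 * (B / T) * t ** 2))

      out = np.correlate(chirp, chirp, mode="full")   # matched filter = autocorrelation
      env = np.abs(out) / np.abs(out).max()
      peak = env.argmax()
      null = int(fs / B)                              # first null ~ 1/B after compression
      side = np.concatenate([env[:peak - null], env[peak + null + 1:]])
      print("peak sidelobe level: %.1f dB" % (20 * np.log10(side.max())))

    In practice the sidelobes of an unweighted chirp are further suppressed by amplitude tapering of the compression filter; the paper's comparison concerns the resulting levels relative to binary PSKM codes.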

  12. The Hierarchical Structure of Formal Operational Tasks.

    ERIC Educational Resources Information Center

    Bart, William M.; Mertens, Donna M.

    1979-01-01

    The hierarchical structure of the formal operational period of Piaget's theory of cognitive development was explored through the application of ordering theoretical methods to a set of data that systematically utilized the various formal operational schemes. Results suggested a common structure underlying task performance. (Author/BH)

  13. A Formal Semantics for the WS-BPEL Recovery Framework

    NASA Astrophysics Data System (ADS)

    Dragoni, Nicola; Mazzara, Manuel

    While current studies on Web services composition are mostly focused, from the technical viewpoint, on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL), an OASIS standard widely adopted in both academic and industrial environments, is considered as a touchstone for concrete composition languages, and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This is intended as a well-known case study providing methodological arguments for the adoption of formal methods in software specification. Verification is not the main topic of the paper, but some hints are given.

  14. Plasma Drifts in the Intermediate Magnetosphere: Simulation Results

    NASA Astrophysics Data System (ADS)

    Lyon, J.; Zhang, B.

    2016-12-01

    One of the outstanding questions about inner magnetosphere dynamics is how the ring current is populated. It is not clear how much is due to a general injection over longer time and spatial scales and how much is due to more bursty events. One of the major uncertainties is the behavior of the plasma in the intermediate magnetosphere: the region where the magnetosphere changes from being tail-like to one where the dipole field dominates. This is also the region where, physically, the plasma behavior changes from MHD-like in the tail to one dominated by particle drifts in the inner magnetosphere. None of the current simulation models self-consistently handles the region where drifts are important but not dominant. We have recently developed a version of the multi-fluid LFM code that can self-consistently handle this situation. The drifts are modeled in a fashion similar to the Rice Convection Model, in that a number of energy "channels" are explicitly simulated. However, the method is not limited to the "slow flow" region, and both diamagnetic and inertial drifts are included. We present results from a number of idealized cases of the global magnetosphere interacting with a southward turning of the IMF. We discuss the relative importance of general convection and bursty flows to the transport of particles and energy across this region.

  15. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called the Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
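
    The second model admits a particularly compact sketch. Below, edge support from three hypothetical knowledge sources is combined with a Noisy-OR (all numbers are invented; the per-source reliabilities and the leak term are assumptions of the sketch, not values from the paper):

      import numpy as np

      # rows: knowledge sources; columns: candidate edges (e.g. A->B, B->C)
      support = np.array([[0.8, 0.1],
                          [0.6, 0.2],
                          [0.0, 0.9]])
      reliability = np.array([0.9, 0.7, 0.5])   # trust in each source (assumed)
      leak = 0.01                               # baseline edge probability (assumed)

      # Noisy-OR: P(edge) = 1 - (1 - leak) * prod_i (1 - r_i * s_i)
      p_edge = 1 - (1 - leak) * np.prod(1 - reliability[:, None] * support, axis=0)
      print(p_edge)   # one strong source is enough to give an edge high support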

  16. Automatically Grading Customer Confidence in a Formal Specification.

    ERIC Educational Resources Information Center

    Shukur, Zarina; Burke, Edmund; Foxley, Eric

    1999-01-01

    Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…

  17. Formal Solutions for Polarized Radiative Transfer. II. High-order Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch

    When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
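
    The point about order of accuracy is easy to reproduce on a scalar toy problem. The sketch below (ours, not from the paper) integrates dI/ds = -chi (I - S) with chi = 1 and S(s) = exp(-s), whose exact solution is I(s) = s exp(-s) for I(0) = 0, using a first-order and a fourth-order scheme:

      import numpy as np

      chi = 1.0
      S = lambda s: np.exp(-s)                 # assumed source function
      rhs = lambda s, I: -chi * (I - S(s))
      exact = lambda s: s * np.exp(-s)         # closed form for chi = 1, I(0) = 0

      def euler(s, I, h):                      # first-order formal solver
          return I + h * rhs(s, I)

      def rk4(s, I, h):                        # fourth-order formal solver
          k1 = rhs(s, I)
          k2 = rhs(s + h / 2, I + h / 2 * k1)
          k3 = rhs(s + h / 2, I + h / 2 * k2)
          k4 = rhs(s + h, I + h * k3)
          return I + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      def error(step, n):
          s, I, h = 0.0, 0.0, 5.0 / n
          for _ in range(n):
              I = step(s, I, h); s += h
          return abs(I - exact(5.0))

      for n in (50, 100, 200):
          print(n, "euler %.2e" % error(euler, n), "rk4 %.2e" % error(rk4, n))
      # halving the step cuts the Euler error ~2x but the RK4 error ~16x

    This is the trade-off the paper quantifies: a well-behaved high-order solver reaches a given accuracy on a far coarser grid.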

  18. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  19. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    ERIC Educational Resources Information Center

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  20. Deep first formal concept search.

    PubMed

    Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu

    2014-01-01

    The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained by a visual global search for formal concepts over the topology, degenerated with fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate, and reflects the integrity of the algorithm, making it suitable for visual analysis.
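
    For orientation, formal concepts are the closed (extent, intent) pairs of a binary context; the paper's contribution is an efficient, visualizable search for them. A brute-force baseline over a toy context (invented data) looks like this:

      from itertools import combinations

      objects = ["o1", "o2", "o3", "o4"]
      attrs = ["a", "b", "c"]
      I = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "c"),
           ("o3", "a"), ("o3", "c"), ("o4", "b"), ("o4", "c")}

      def intent(ext):   # attributes shared by every object in ext
          return frozenset(m for m in attrs if all((g, m) in I for g in ext))

      def extent(inte):  # objects having every attribute in inte
          return frozenset(g for g in objects if all((g, m) in I for m in inte))

      # closing every object subset A yields the concept (A'', A')
      concepts = set()
      for r in range(len(objects) + 1):
          for combo in combinations(objects, r):
              inte = intent(frozenset(combo))
              concepts.add((extent(inte), inte))

      for ext, inte in sorted(concepts, key=lambda c: (len(c[0]), tuple(sorted(c[0])))):
          print(sorted(ext), sorted(inte))

    The exponential sweep over subsets is exactly what the attribute-topology search is designed to avoid.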

  1. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    …set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort… to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has…

  2. Formalisms for user interface specification and design

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent J.

    1989-01-01

    The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.

  3. A STUDY OF FORMALLY ADVERTISED PROCUREMENT

    DTIC Science & Technology

    As a method of procuring goods and services, formally advertised procurement offers a number of advantages. These include the prevention of fraud and… two-thirds of all contracts are let in these cases. This is done by examining over 2,300 contracts let under formal advertising procedures. A measure of…

  4. Geometry and Formal Linguistics.

    ERIC Educational Resources Information Center

    Huff, George A.

    This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…

  5. Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.

    PubMed

    Clauser, C F; Arista, N R

    2018-02-01

    We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocity and charge. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.

  6. Mathematical, Logical, and Formal Methods in Information Retrieval: An Introduction to the Special Issue.

    ERIC Educational Resources Information Center

    Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost

    2003-01-01

    Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)

  7. Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.

    2015-01-01

    As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.

  8. A Formal Methods Approach to the Analysis of Mode Confusion

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).

  9. Working the College System: Six Strategies for Building a Personal Powerbase

    ERIC Educational Resources Information Center

    Simplicio, Joseph S. C.

    2008-01-01

    Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization, functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…

  10. An Educational Development Tool Based on Principles of Formal Ontology

    ERIC Educational Resources Information Center

    Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter

    2005-01-01

    Computer science provides virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, that allows one to develop interactive educational…

  11. Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method

    ERIC Educational Resources Information Center

    Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo

    2012-01-01

    This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)

  12. On the Need for Practical Formal Methods

    DTIC Science & Technology

    1998-01-01

    …additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples… either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented…

  13. A Vector Representation for Thermodynamic Relationships

    ERIC Educational Resources Information Center

    Pogliani, Lionello

    2006-01-01

    The existing vector formalism method for thermodynamic relationships maintains tractability and uses accessible mathematics; it can be seen as a diverting and entertaining step into the mathematical formalism of thermodynamics and as an elementary application of matrix algebra. The method is based on ideas and operations apt to improve the…

  14. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  15. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  16. What can formal methods offer to digital flight control systems design

    NASA Technical Reports Server (NTRS)

    Good, Donald I.

    1990-01-01

    Formal methods research is beginning to produce methods which will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.

  17. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there does not exist a standardized and accepted means of formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  18. Location priority for non-formal early childhood education school based on promethee method and map visualization

    NASA Astrophysics Data System (ADS)

    Ayu Nurul Handayani, Hemas; Waspada, Indra

    2018-05-01

    Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years of age. In the District of Banyumas, non-formal ECE is monitored by the District Government of Banyumas with the help of Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government itself has a program for extending ECE to all villages in Indonesia. However, the locations for constructing ECE schools in the years ahead have not yet been determined. Therefore, to support that program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected with Brown's double exponential smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate the priority order. As its recommendation output, the system generates a map visualization colored according to the priority level of each sub-district and village area. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm were working properly, and the user was satisfied.
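
    The ranking step can be sketched compactly. Below is a minimal PROMETHEE II net-flow computation over invented village scores (the criteria, weights, and the use of the usual step preference function are assumptions of the sketch, not details from the paper):

      import numpy as np

      villages = ["V1", "V2", "V3", "V4"]
      # hypothetical criteria: projected child population, distance to the
      # nearest ECE school (km), share of households with young children
      scores = np.array([[120, 4.0, 0.3],
                         [ 90, 6.5, 0.1],
                         [150, 2.0, 0.6],
                         [110, 5.0, 0.2]], float)
      maximize = np.array([True, True, False])   # third criterion: lower is better
      weights = np.ones(3) / 3                   # equal weights (assumed)

      n = len(villages)
      pi = np.zeros((n, n))                      # pairwise preference indices
      for a in range(n):
          for b in range(n):
              d = scores[a] - scores[b]
              d[~maximize] *= -1                 # flip cost criteria
              pi[a, b] = weights @ (d > 0)       # usual (step) preference function

      phi_plus = pi.sum(axis=1) / (n - 1)        # positive outranking flow
      phi_minus = pi.sum(axis=0) / (n - 1)       # negative outranking flow
      net = phi_plus - phi_minus                 # PROMETHEE II net flow
      for v, f in sorted(zip(villages, net), key=lambda x: -x[1]):
          print(v, round(f, 3))

    The ordered net flows are what the map layer then colors by priority level.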

  19. Generalizability of the Ordering among Five Formal Reasoning Tasks by an Ordering-Theoretic Method.

    ERIC Educational Resources Information Center

    Bart, William M.; And Others

    1979-01-01

    Five Inhelder-Piaget formal operations tasks were analyzed to determine the extent that the formal operational skills they assess were ordered into a stable hierarchy generalizable across samples of subjects. Subjects were 34 collegiate gymnasts (19 males, 15 females), and 22 students (1 male, 21 females) from a university nursing program.…

  20. The Values-Based Infrastructure of Non-Formal Education: A Case Study of Personal Education in Israeli Schools

    ERIC Educational Resources Information Center

    Goldratt, Miri; Cohen, Eric H.

    2016-01-01

    This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…

  1. Formal Method of Description Supporting Portfolio Assessment

    ERIC Educational Resources Information Center

    Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou

    2006-01-01

    Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…

  2. Contour-time approach to the Bose-Hubbard model in the strong coupling regime: Studying two-point spatio-temporal correlations at the Hartree-Fock-Bogoliubov level

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.

    2018-05-01

    We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard Model. Specifically, we obtain a two particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out of equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong coupling methods as well as exact methods where possible. We discuss applications of this formalism to out of equilibrium situations.

  3. Measurement bias detection with Kronecker product restricted models for multivariate longitudinal data: an illustration with health-related quality of life data from thirteen measurement occasions

    PubMed Central

    Verdam, Mathilde G. E.; Oort, Frans J.

    2014-01-01

    Highlights: application of the Kronecker product to construct parsimonious structural equation models for multivariate longitudinal data; a method for the investigation of measurement bias with Kronecker product restricted models; application of these methods to health-related quality of life data from bone metastasis patients, collected at 13 consecutive measurement occasions; the use of curves to facilitate substantive interpretation of apparent measurement bias; assessment of change in common factor means, after accounting for apparent measurement bias. Longitudinal measurement invariance is usually investigated with a longitudinal factor model (LFM). However, with multiple measurement occasions, the number of parameters to be estimated increases with a multiple of the number of measurement occasions. To guard against too low ratios of numbers of subjects and numbers of parameters, we can use Kronecker product restrictions to model the multivariate longitudinal structure of the data. These restrictions can be imposed on all parameter matrices, including measurement invariance restrictions on factor loadings and intercepts. The resulting models are parsimonious and have attractive interpretation, but require different methods for the investigation of measurement bias. Specifically, additional parameter matrices are introduced to accommodate possible violations of measurement invariance. These additional matrices consist of measurement bias parameters that are either fixed at zero or free to be estimated. In cases of measurement bias, it is also possible to model the bias over time, e.g., with linear or non-linear curves. Measurement bias detection with Kronecker product restricted models will be illustrated with multivariate longitudinal data from 682 bone metastasis patients whose health-related quality of life (HRQL) was measured at 13 consecutive weeks. PMID:25295016

  4. Measurement bias detection with Kronecker product restricted models for multivariate longitudinal data: an illustration with health-related quality of life data from thirteen measurement occasions.

    PubMed

    Verdam, Mathilde G E; Oort, Frans J

    2014-01-01

    Application of the Kronecker product to construct parsimonious structural equation models for multivariate longitudinal data. A method for the investigation of measurement bias with Kronecker product restricted models. Application of these methods to health-related quality of life data from bone metastasis patients, collected at 13 consecutive measurement occasions. The use of curves to facilitate substantive interpretation of apparent measurement bias. Assessment of change in common factor means, after accounting for apparent measurement bias. Longitudinal measurement invariance is usually investigated with a longitudinal factor model (LFM). However, with multiple measurement occasions, the number of parameters to be estimated increases with a multiple of the number of measurement occasions. To guard against too low ratios of numbers of subjects and numbers of parameters, we can use Kronecker product restrictions to model the multivariate longitudinal structure of the data. These restrictions can be imposed on all parameter matrices, including measurement invariance restrictions on factor loadings and intercepts. The resulting models are parsimonious and have attractive interpretation, but require different methods for the investigation of measurement bias. Specifically, additional parameter matrices are introduced to accommodate possible violations of measurement invariance. These additional matrices consist of measurement bias parameters that are either fixed at zero or free to be estimated. In cases of measurement bias, it is also possible to model the bias over time, e.g., with linear or non-linear curves. Measurement bias detection with Kronecker product restricted models will be illustrated with multivariate longitudinal data from 682 bone metastasis patients whose health-related quality of life (HRQL) was measured at 13 consecutive weeks.

  5. Formal methods demonstration project for space applications

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.

    1995-01-01

    The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered for illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data is shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
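
    As a flavor of the state-machine modeling mentioned above (states and events below are invented, not taken from the GPS Change Request), a principal function such as receiver state processing can be rendered as a transition table:

      # Toy receiver-state machine; unknown (state, event) pairs leave the
      # state unchanged, a common total-function convention in specifications.
      TRANSITIONS = {
          ("SELF_TEST", "test_passed"): "ACQUIRING",
          ("ACQUIRING", "position_fixed"): "NAV",
          ("NAV", "signal_lost"): "ACQUIRING",
      }

      def step(state, event):
          return TRANSITIONS.get((state, event), state)

      state = "SELF_TEST"
      for event in ("test_passed", "position_fixed", "signal_lost"):
          state = step(state, event)
      print(state)  # ACQUIRING

    A PVS specification of the same function would state the transition relation declaratively and prove properties about it (e.g., invariants) rather than execute it.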

  6. Unmanned Aircraft Systems in the National Airspace System: A Formal Methods Perspective

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Dutle, Aaron; Narkawicz, Anthony; Upchurch, Jason

    2016-01-01

    As the technological and operational capabilities of unmanned aircraft systems (UAS) have grown, so too have international efforts to integrate UAS into civil airspace. However, one of the major concerns that must be addressed in realizing this integration is that of safety. For example, UAS lack an on-board pilot to comply with the legal requirement that pilots see and avoid other aircraft. This requirement has motivated the development of a detect and avoid (DAA) capability for UAS that provides situational awareness and maneuver guidance to UAS operators to aid them in avoiding and remaining well clear of other aircraft in the airspace. The NASA Langley Research Center Formal Methods group has played a fundamental role in the development of this capability. This article gives a selected survey of the formal methods work conducted in support of the development of a DAA concept for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations.

  7. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    NASA Astrophysics Data System (ADS)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
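
    The implicit reformulation of the multiple reflection series amounts to solving the fixed-point equation x = b + Mx iteratively instead of summing the series x = b + Mb + M^2 b + ... term by term. A self-contained numerical sketch of that idea with a toy operator (not the GSM itself):

      import numpy as np

      def multiple_reflection_solve(M, b, tol=1e-12, max_iter=1000):
          """Fixed-point iteration for x = b + M @ x; converges when the
          spectral radius of M is below 1."""
          x = b.copy()
          for _ in range(max_iter):
              x_new = b + M @ x
              if np.linalg.norm(x_new - x) < tol * np.linalg.norm(b):
                  return x_new
              x = x_new
          raise RuntimeError("multiple reflection series did not converge")

      rng = np.random.default_rng(1)
      M = 0.1 * rng.normal(size=(5, 5))   # weak 'scattering' coupling
      b = rng.normal(size=5)
      x = multiple_reflection_solve(M, b)
      print(np.allclose(x, np.linalg.solve(np.eye(5) - M, b)))  # True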

  8. Two-Component Noncollinear Time-Dependent Spin Density Functional Theory for Excited State Calculations.

    PubMed

    Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong

    2017-06-13

    We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.

  9. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real-world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.
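
    A stripped-down sketch of the kind of backward reasoning the algorithm performs, with the linguistic truth values and temporal qualifiers omitted (rules and facts are an invented smart-home example):

      def backward_chain(goal, rules, facts):
          # A goal holds if it is a known fact, or if some rule
          # (premises -> conclusion) concludes it and every premise is
          # provable in turn. Assumes acyclic rules for simplicity.
          if goal in facts:
              return True
          return any(conclusion == goal and
                     all(backward_chain(p, rules, facts) for p in premises)
                     for premises, conclusion in rules)

      rules = [({"motion_in_kitchen", "after_midnight"}, "occupant_awake"),
               ({"occupant_awake"}, "turn_on_light")]
      facts = {"motion_in_kitchen", "after_midnight"}
      print(backward_chain("turn_on_light", rules, facts))  # True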

  10. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  11. Evolution and Revolutions of Adult Learning: Capacity Building in Adult and Non-Formal Education in Nigeria

    ERIC Educational Resources Information Center

    Ugwu, Chinwe U.

    2015-01-01

    The National Commission for Mass Literacy, Adult and Non-Formal Education (NMEC) is the Federal Statutory Agency set up to co-ordinate all aspects of Non-Formal Education in Nigeria whether offered by government agencies or non-governmental organisations. This study looked at the existing Capacity Building Programme, the delivery methods, impact…

  12. Hydra Revisited: Substituting Formal for Self- and Informal In-Home Care among Older Adults with Disabilities

    ERIC Educational Resources Information Center

    Penning, Margaret J.

    2002-01-01

    Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…

  13. The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid

    ERIC Educational Resources Information Center

    McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom

    2004-01-01

    Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between, rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…

  14. Medium-range, objective predictions of thunderstorm location and severity for aviation

    NASA Technical Reports Server (NTRS)

    Wilson, G. S.; Turner, R. E.

    1981-01-01

    This paper presents a computerized technique for medium-range (12-48 h) prediction of both the location and severity of thunderstorms utilizing atmospheric predictions from the National Meteorological Center's limited-area fine-mesh model (LFM). A regional-scale analysis scheme is first used to examine the spatial and temporal distributions of forecasted variables associated with the structure and dynamics of mesoscale systems over an area of approximately 10^6 sq km. The final prediction of thunderstorm location and severity is based upon an objective combination of these regionally analyzed variables. Medium-range thunderstorm predictions are presented for the late afternoon period of April 10, 1979, the day of the Wichita Falls, Texas tornado. Conventional medium-range thunderstorm forecasts, made from observed data, are presented with the case study to demonstrate the possible application of this objective technique in improving 12-48 h thunderstorm forecasts for aviation.

  15. Evaluation of the synoptic and mesoscale predictive capabilities of a mesoscale atmospheric simulation system

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K.; Keyser, D. A.; Mccumber, M. C.

    1983-01-01

    The overall performance characteristics of a limited area, hydrostatic, fine (52 km) mesh, primitive equation, numerical weather prediction model are determined in anticipation of satellite data assimilations with the model. The synoptic and mesoscale predictive capabilities of version 2.0 of this model, the Mesoscale Atmospheric Simulation System (MASS 2.0), were evaluated. The two part study is based on a sample of approximately thirty 12h and 24h forecasts of atmospheric flow patterns during spring and early summer. The synoptic scale evaluation results benchmark the performance of MASS 2.0 against that of an operational, synoptic scale weather prediction model, the Limited area Fine Mesh (LFM). The large sample allows for the calculation of statistically significant measures of forecast accuracy and the determination of systematic model errors. The synoptic scale benchmark is required before unsmoothed mesoscale forecast fields can be seriously considered.

  16. Deposition And Characterization of (Ti,Zr)N Thin Films Grown Through PAPVD By The Pulsed Arc Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marulanda, D. M.; Trujillo, O.; Devia, A.

    Plasma Assisted Physical Vapor Deposition (PAPVD) by the pulsed arc technique has been used to deposit titanium zirconium nitride (Ti,Zr)N coatings, using a segmented TiZr target. The deposition was performed in a vacuum chamber with two facing electrodes (target and substrate), using nitrogen as the working gas and a power-controlled source to produce the arc discharges. Films were deposited on stainless steel 304 and characterized using the X-Ray Photoelectron Spectroscopy (XPS), X-Ray Diffraction (XRD), Energy Dispersion Spectroscopy (EDS) and Scanning Probe Microscopy (SPM) techniques. The XRD patterns show the different planes in which the film grows. Through SPM, using Atomic Force Microscopy (AFM) and Lateral Force Microscopy (LFM) modes, a nanotribologic study of the thin film was made, determining hardness and friction coefficient.

  17. A preliminary intercomparison between numerical upper wind forecasts and research aircraft measurements of jet streams

    NASA Technical Reports Server (NTRS)

    Shapiro, M. A.

    1982-01-01

    During the past several years, research on the structure of extra-tropical jet streams has been carried out with direct measurements with instrumented research aircraft from the National Center for Atmospheric Research (NCAR). These measurements have been used to describe the wind, temperature, turbulence and chemical characteristics of jet streams. A fundamental question is one of assessing the potential value of existing operational numerical forecast models for forecasting the meteorological conditions along commercial aviation flight routes so as to execute Minimum Flight Time tracks and thus obtain the maximum efficiency in aviation fuel consumption. As an initial attempt at resolving this question, the 12 hour forecast output from two models was expressed in terms of a common output format to ease their intercomparison. The chosen models were: (1) the Fine-Mesh Spectral hemispheric and (2) the Limited Area Fine Mesh (LFM) model.

  18. Intra-pulse modulation recognition using short-time ramanujan Fourier transform spectrogram

    NASA Astrophysics Data System (ADS)

    Ma, Xiurong; Liu, Dan; Shan, Yunlong

    2017-12-01

    Intra-pulse modulation recognition under negative signal-to-noise ratio (SNR) environment is a research challenge. This article presents a robust algorithm for the recognition of 5 types of radar signals with large variation range in the signal parameters in low SNR using the combination of the Short-time Ramanujan Fourier transform (ST-RFT) and pseudo-Zernike moments invariant features. The ST-RFT provides the time-frequency distribution features for 5 modulations. The pseudo-Zernike moments provide invariance properties that are able to recognize different modulation schemes on different parameter variation conditions from the ST-RFT spectrograms. Simulation results demonstrate that the proposed algorithm achieves the probability of successful recognition (PSR) of over 90% when SNR is above -5 dB with large variation range in the signal parameters: carrier frequency (CF) for all considered signals, hop size (HS) for frequency shift keying (FSK) signals, and the time-bandwidth product for Linear Frequency Modulation (LFM) signals.
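
    For concreteness, an LFM pulse has quadratic phase, so its instantaneous frequency sweeps linearly. The sketch below synthesizes one and computes a plain STFT spectrogram as a stand-in; the paper instead forms the spectrogram with the short-time Ramanujan Fourier transform and then extracts pseudo-Zernike moments (all parameter values are invented):

      import numpy as np

      fs, T = 1e6, 1e-3                # sample rate (Hz), pulse length (s)
      t = np.arange(0, T, 1 / fs)
      f0, k = 50e3, 200e6              # start frequency (Hz), chirp rate (Hz/s)
      x = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))   # LFM pulse

      win, hop = 128, 32               # short-time window and hop size
      frames = [x[i:i + win] * np.hanning(win)
                for i in range(0, len(x) - win, hop)]
      spectrogram = np.abs(np.fft.rfft(frames, axis=1)) ** 2
      print(spectrogram.shape)         # (number of frames, frequency bins)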

  19. Comparison of two underwater acoustic communications techniques for multi-user access

    NASA Astrophysics Data System (ADS)

    Hursky, Paul; Siderius, T. Martin; Kauaiex Group

    2004-05-01

    Frequency hopped frequency shift keying (FHFSK) and code division multiple access (CDMA) are two different modulation techniques for multiple users to communicate with a single receiver simultaneously. In July 2003, these two techniques were tested alongside each other in a shallow water coastal environment off the coast of Kauai. A variety of instruments were used to measure the prevailing oceanography, enabling detailed modeling of the channel. The channel was acoustically probed using LFM waveforms and m-sequences as well. We will present the results of demodulating the FHFSK and CDMA waveforms and discuss modeling the channel for the purpose of predicting multi-user communications performance. a) Michael B. Porter, Paul Hursky, Martin Siderius (SAIC), Mohsen Badiey (UD), Jerald Caruthers (USM), William S. Hodgkiss, Kaustubha Raghukumar (SIO), Dan Rouseff, Warren Fox (APL-UW), Christian de Moustier, Brian Calder, Barbara J. Kraft (UNH), Keyko McDonald (SPAWARSSC), Peter Stein, James K. Lewis, and Subramaniam Rajan (SSI).
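
    The m-sequences used as channel probes are maximal-length binary sequences generated by a linear feedback shift register; a minimal generator follows (the tap choice is illustrative, not the experiment's actual parameters):

      def lfsr_m_sequence(taps, nbits):
          # Fibonacci LFSR with 1-indexed feedback taps. For taps that
          # correspond to a primitive polynomial, the output period is
          # 2**nbits - 1, the defining property of an m-sequence.
          state = [1] * nbits
          out = []
          for _ in range(2**nbits - 1):
              out.append(state[-1])
              fb = 0
              for tap in taps:
                  fb ^= state[tap - 1]
              state = [fb] + state[:-1]
          return out

      seq = lfsr_m_sequence(taps=[4, 3], nbits=4)   # x^4 + x^3 + 1, primitive
      print(len(seq))  # period 15

    Their nearly flat periodic autocorrelation is what makes m-sequences useful for estimating the channel impulse response.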

  20. Application of Nimbus-6 microwave data to problems in precipitation prediction for the Pacific west coast

    NASA Technical Reports Server (NTRS)

    Viezee, W.; Shigeishi, H.; Chang, A. T. C.

    1979-01-01

    The preliminary results of a research study that emphasizes the analysis and interpretation of data related to total precipitable water and nonprecipitating cloud liquid water obtained from NIMBUS-6 SCAMS are reported. Sixteen cyclonic storm situations in the northeastern Pacific Ocean that resulted in significant rainfall along the west coast of the United States during the winter season October 1975 through February 1976 are analyzed in terms of their distributions and amounts of total water vapor and liquid water, as obtained from SCAMS data. The water-substance analyses for each storm case are related to the distribution and amount of coastal precipitation observed during the subsequent time period when the storm system crosses the coastline. Concomitant precipitation predictions from the LFM are also incorporated. Techniques by which satellite microwave data over the ocean can be used to improve precipitation prediction for the Pacific West Coast are emphasized.

  1. Literate Specification: Using Design Rationale To Support Formal Methods in the Development of Human-Machine Interfaces.

    ERIC Educational Resources Information Center

    Johnson, Christopher W.

    1996-01-01

    The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…

  2. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications for today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  3. Changes in formal sex education: 1995-2002.

    PubMed

    Lindberg, Laura Duberstein; Santelli, John S; Singh, Susheela

    2006-12-01

    Although comprehensive sex education is broadly supported by health professionals, funding for abstinence-only education has increased. Using data from the 1995 National Survey of Adolescent Males, the 1995 National Survey of Family Growth (NSFG) and the 2002 NSFG, changes in male and female adolescents' reports of the sex education they have received from formal sources were examined. Life-table methods were used to measure the timing of instruction, and t tests were used to assess changes over time. From 1995 to 2002, reports of formal instruction about birth control methods declined among both genders (males, from 81% to 66%; females, from 87% to 70%). This, combined with increases in reports of abstinence education among males (from 74% to 83%), resulted in a lower proportion of teenagers overall receiving formal instruction about both abstinence and birth control methods (males, 65% to 59%; females, 84% to 65%), and a higher proportion receiving instruction only about abstinence (males, 9% to 24%; females, 8% to 21%). Teenagers in 2002 had received abstinence education about two years earlier (median age, 11.4 for males, 11.8 for females) than they had received birth control instruction (median age, 13.5 for both males and females). Among sexually experienced adolescents, 62% of females and 54% of males had received instruction about birth control methods prior to first sex. A substantial retreat from formal instruction about birth control methods has left increasing proportions of adolescents receiving only abstinence education. Efforts are needed to expand teenagers' access to medically accurate and comprehensive reproductive health information.

  4. Unpacking (In)formal Learning in an Academic Development Programme: A Mixed-Method Social Network Perspective

    ERIC Educational Resources Information Center

    Rienties, Bart; Hosein, Anesa

    2015-01-01

    How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…

  5. On the simulation of indistinguishable fermions in the many-body Wigner formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.

    2015-01-01

    The simulation of quantum systems consisting of interacting, indistinguishable fermions is an extraordinarily difficult mathematical problem which poses formidable numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and it is highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange-correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi-Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.

  6. Energy/dissipation-preserving Birkhoffian multi-symplectic methods for Maxwell's equations with dissipation terms

    DOE PAGES

    Su, Hongling; Li, Shengtai

    2016-02-03

    In this study, we propose two new energy/dissipation-preserving Birkhoffian multi-symplectic methods (Birkhoffian and Birkhoffian box) for Maxwell's equations with dissipation terms. After investigating the non-autonomous and autonomous Birkhoffian formalism for Maxwell's equations with dissipation terms, we first apply a novel generating functional theory to the non-autonomous Birkhoffian formalism to propose our Birkhoffian scheme, and then implement a central box method to the autonomous Birkhoffian formalism to derive the Birkhoffian box scheme. We have obtained four formal local conservation laws and three formal energy global conservation laws. We have also proved that both of our derived schemes preserve the discrete version of the global/local conservation laws. Furthermore, the stability, dissipation and dispersion relations are also investigated for the schemes. Theoretical analysis shows that the schemes are unconditionally stable, dissipation-preserving for Maxwell's equations in a perfectly matched layer (PML) medium and have second order accuracy in both time and space. Numerical experiments for problems with exact theoretical results are given to demonstrate that the Birkhoffian multi-symplectic schemes are much more accurate in preserving energy than both the exponential finite-difference time-domain (FDTD) method and traditional Hamiltonian scheme. Finally, we also solve the electromagnetic pulse (EMP) propagation problem and the numerical results show that the Birkhoffian scheme recovers the magnitude of the current source and reaction history very well even after long time propagation.

  7. Energy/dissipation-preserving Birkhoffian multi-symplectic methods for Maxwell's equations with dissipation terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hongling; Li, Shengtai

    In this study, we propose two new energy/dissipation-preserving Birkhoffian multi-symplectic methods (Birkhoffian and Birkhoffian box) for Maxwell's equations with dissipation terms. After investigating the non-autonomous and autonomous Birkhoffian formalism for Maxwell's equations with dissipation terms, we first apply a novel generating functional theory to the non-autonomous Birkhoffian formalism to propose our Birkhoffian scheme, and then implement a central box method to the autonomous Birkhoffian formalism to derive the Birkhoffian box scheme. We have obtained four formal local conservation laws and three formal energy global conservation laws. We have also proved that both of our derived schemes preserve the discrete version of the global/local conservation laws. Furthermore, the stability, dissipation and dispersion relations are also investigated for the schemes. Theoretical analysis shows that the schemes are unconditionally stable, dissipation-preserving for Maxwell's equations in a perfectly matched layer (PML) medium and have second order accuracy in both time and space. Numerical experiments for problems with exact theoretical results are given to demonstrate that the Birkhoffian multi-symplectic schemes are much more accurate in preserving energy than both the exponential finite-difference time-domain (FDTD) method and traditional Hamiltonian scheme. Finally, we also solve the electromagnetic pulse (EMP) propagation problem and the numerical results show that the Birkhoffian scheme recovers the magnitude of the current source and reaction history very well even after long time propagation.

  8. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    DTIC Science & Technology

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also…

  9. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    NASA Technical Reports Server (NTRS)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  10. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods has been limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with fewer lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural level, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
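
    A toy rendering of FCA-Map's first step, the token-based formal context: objects are classes, attributes are the lexical tokens of their labels, and classes whose token sets coincide become lexical anchors (the ontology contents below are invented):

      # Hypothetical class labels from two overlapping ontologies.
      onto1 = {"O1:HeartValve": "heart valve", "O1:Aorta": "aorta"}
      onto2 = {"O2:ValveOfHeart": "valve of heart", "O2:Aorta": "aorta"}

      STOPWORDS = {"of", "the"}

      def tokens(label):
          return frozenset(w for w in label.lower().split()
                           if w not in STOPWORDS)

      context = {cls: tokens(lbl) for cls, lbl in {**onto1, **onto2}.items()}
      anchors = [(c1, c2) for c1 in onto1 for c2 in onto2
                 if context[c1] == context[c2]]
      print(anchors)
      # [('O1:HeartValve', 'O2:ValveOfHeart'), ('O1:Aorta', 'O2:Aorta')]

    The four remaining context types then validate and extend these anchors with structural and property evidence, as described above.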

  11. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    PubMed

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool, which supports its users in generating computable queries based on a patient data model that can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
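
    As a hypothetical example of the pipeline's end product (the indicator and all IRIs are invented, not taken from CLIF or ArchMS), a formalized quality indicator such as "patients with diabetes who have an HbA1c measurement" might compile to SPARQL along these lines:

      INDICATOR_QUERY = """
      PREFIX ex: <http://example.org/ehr#>
      SELECT (COUNT(DISTINCT ?p) AS ?numerator)
      WHERE {
        ?p a ex:Patient ;
           ex:hasDiagnosis ex:Diabetes ;
           ex:hasObservation ?obs .
        ?obs a ex:HbA1cMeasurement .
      }
      """
      print(INDICATOR_QUERY)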

  12. Formal Assurance for Cognitive Architecture Based Autonomous Agent

    NASA Technical Reports Server (NTRS)

    Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco

    2017-01-01

    Autonomous systems are designed and deployed in different modeling paradigms, each of which focuses on specific concepts in designing the system. We focus our efforts on the use of cognitive architectures to design autonomous agents that collaborate with humans to accomplish tasks in a mission. Our research introduces formal assurance methods to verify the behavior of agents designed in Soar by translating the agent to the formal verification environment Uppaal.

  13. Refinement for fault-tolerance: An aircraft hand-off protocol

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Schneider, Fred B.; Dehn, Jon

    1994-01-01

    Part of the Advanced Automation System (AAS) for air-traffic control is a protocol to permit flight hand-off from one air-traffic controller to another. The protocol must be fault-tolerant and, therefore, is subtle -- an ideal candidate for the application of formal methods. This paper describes a formal method for deriving fault-tolerant protocols that is based on refinement and proof outlines. The AAS hand-off protocol was actually derived using this method; that derivation is given.

  14. Integration Toolkit and Methods (ITKM) Corporate Data Integration Tools (CDIT). Review of the State-of-the-Art with Respect to Integration Toolkits and Methods (ITKM)

    DTIC Science & Technology

    1992-06-01

    System capabilities such as memory management and network communications are provided by a virtual machine-type operating environment. Various human ... thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical ... the form of identifying: the data entity itself; its aliases (including how the data is presented to programs or human users in the form of copy…

  15. A 3D generic inverse dynamic method using wrench notation and quaternion algebra.

    PubMed

    Dumas, R; Aissaoui, R; de Guise, J A

    2004-06-01

    In the literature, conventional 3D inverse dynamic models are limited in three aspects related to inverse dynamic notation, body segment parameters and kinematic formalism. First, conventional notation yields separate computations of the forces and moments with successive coordinate system transformations. Secondly, the way conventional body segment parameters are defined is based on the assumption that the inertia tensor is principal and the centre of mass is located between the proximal and distal ends. Thirdly, the conventional kinematic formalism uses Euler or Cardanic angles that are sequence-dependent and suffer from singularities. In order to overcome these limitations, this paper presents a new generic method for inverse dynamics. This generic method is based on wrench notation for inverse dynamics, a general definition of body segment parameters and quaternion algebra for the kinematic formalism.
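
    The kinematic advantage of quaternions noted above is that rotations compose and apply without any sequence-dependent angle convention or gimbal-lock singularity. A self-contained sketch of rotating a vector with the Hamilton product:

      import numpy as np

      def quat_mul(q, r):
          # Hamilton product of quaternions stored as (w, x, y, z).
          w1, x1, y1, z1 = q
          w2, x2, y2, z2 = r
          return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                           w1*x2 + x1*w2 + y1*z2 - z1*y2,
                           w1*y2 - x1*z2 + y1*w2 + z1*x2,
                           w1*z2 + x1*y2 - y1*x2 + z1*w2])

      def rotate(v, q):
          # Rotate v by unit quaternion q via q * (0, v) * conj(q).
          qc = q * np.array([1.0, -1.0, -1.0, -1.0])
          return quat_mul(quat_mul(q, np.concatenate(([0.0], v))), qc)[1:]

      theta = np.pi / 2                       # 90 degrees about the z axis
      q = np.array([np.cos(theta/2), 0.0, 0.0, np.sin(theta/2)])
      print(np.round(rotate(np.array([1.0, 0.0, 0.0]), q), 6))  # [0. 1. 0.]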

  16. Formalizing Evaluation Procedures for Marketing Faculty Research Performance.

    ERIC Educational Resources Information Center

    McDermott, Dennis R.; And Others

    1994-01-01

    Results of a national survey of marketing department heads (n=142) indicate that few marketing departments have formalized the development and communication of research performance standards to faculty. Guidelines and methods to accomplish those procedures most efficiently were proposed. (Author/JOW)

  17. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  18. Taxonomy of Teaching Methods and Teaching Forms for Youth in Non-Formal Education in the National Youth Council of Slovenia

    ERIC Educational Resources Information Center

    Miloševic Zupancic, Vesna

    2018-01-01

    Research from the field of non-formal education (NFE) in youth work emphasises the central role of experiential learning and learning in groups. The present paper aims to research teaching methods and teaching forms in NFE in youth work. The research sought to answer the following research questions: 'What teaching forms can be found in NFE for…

  19. Products of composite operators in the exact renormalization group formalism

    NASA Astrophysics Data System (ADS)

    Pagani, C.; Sonoda, H.

    2018-02-01

    We discuss a general method of constructing the products of composite operators using the exact renormalization group formalism. Considering mainly the Wilson action at a generic fixed point of the renormalization group, we give an argument for the validity of short-distance expansions of operator products. We show how to compute the expansion coefficients by solving differential equations, and test our method with some simple examples.
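
    The short-distance expansions in question take the standard operator product form, with c-number coefficient functions obtained from the differential equations mentioned above:

        O_i(x)\, O_j(y) \;\sim\; \sum_k C_{ij}{}^{k}(x - y)\, O_k(y) \qquad (x \to y)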

  20. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  1. The equivalence of Darmois-Israel and distributional method for thin shells in general relativity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansouri, R.; Khorrami, M.

    1996-11-01

    A distributional method to solve Einstein's field equations for thin shells is formulated. The familiar field equations and jump conditions of the Darmois-Israel formalism are derived. A careful analysis of the Bianchi identities shows that, for the cases under consideration, they make sense as distributions and lead to the jump conditions of the Darmois-Israel formalism. © 1996 American Institute of Physics.
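
    For reference, the jump conditions referred to above are the Lanczos equations of the Darmois-Israel formalism, relating the surface stress-energy S_{ab} of the shell to the discontinuity of the extrinsic curvature K_{ab} across it (h_{ab} is the induced metric and [X] denotes the jump of X):

        S_{ab} = -\frac{1}{8\pi G} \left( [K_{ab}] - h_{ab} [K] \right)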

  2. Discrete mathematics, formal methods, the Z schema and the software life cycle

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.

  3. Mapping of Primary Instructional Methods and Teaching Techniques for Regularly Scheduled, Formal Teaching Sessions in an Anesthesia Residency Program.

    PubMed

    Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro

    2016-06-01

    In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.

  4. A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)

    NASA Astrophysics Data System (ADS)

    High, Wayne

    1993-03-01

    This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction of and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model, which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard, which results in the derivation of test sequences for a SAFENET profile. The results of applying this new method to SAFENET's OSI and Lightweight profiles are presented.

  5. Acquisition of Formal Operations: The Effects of Two Training Procedures.

    ERIC Educational Resources Information Center

    Rosenthal, Doreen A.

    1979-01-01

    A study of 11- and 12-year-old girls indicates that either of two training procedures, method training or dimension training, can aid in the transition from concrete operational to formal operational thought by promoting a hypothesis-testing attitude. (BH)

  6. Towards Formal Implementation of PUS Standard

    NASA Astrophysics Data System (ADS)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand: PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete applications. Our formal models allow us to formally express and verify specific service properties, including various telecommand and telemetry packet structure validation.
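
    In plain-Python terms (field names, ranges and the size bound below are invented, not quoted from the PUS standard), the kind of packet-structure invariant such formal models state and verify looks like:

      from dataclasses import dataclass

      @dataclass
      class Telecommand:
          service_type: int   # PUS service number
          subtype: int
          data: bytes

      def valid(tc: Telecommand) -> bool:
          # Invariant: service/subtype fit one byte, payload within a bound.
          return (0 <= tc.service_type <= 255 and 0 <= tc.subtype <= 255
                  and len(tc.data) <= 1024)

      print(valid(Telecommand(17, 1, b"")))  # True

    In Event-B the same conditions appear as machine invariants, and every event that constructs or modifies a packet carries a proof obligation that the invariants are preserved.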

  7. New method of contour image processing based on the formalism of spiral light beams

    NASA Astrophysics Data System (ADS)

    Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.

    2013-07-01

    The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.

  8. Community Garden: A Bridging Program between Formal and Informal Learning

    ERIC Educational Resources Information Center

    Datta, Ranjan

    2016-01-01

    Community garden activities can play a significant role in bridging formal and informal learning, particularly in urban children's science and environmental education. It promotes relational methods of learning, discussing, and practicing that will integrate food security, social interactions, community development, environmental activism, and…

  9. Adolescent thinking à la Piaget: The formal stage.

    PubMed

    Dulit, E

    1972-12-01

    Two of the formal-stage experiments of Piaget and Inhelder, selected largely for their closeness to the concepts defining the stage, were replicated with groups of average and gifted adolescents. This report describes the relevant Piagetian concepts (formal stage, concrete stage) in context, gives the methods and findings of this study, and concludes with a section discussing implications and making some reformulations which generally support but significantly qualify some of the central themes of the Piaget-Inhelder work. Fully developed formal-stage thinking emerges as far from commonplace among normal or average adolescents (by marked contrast with the impression created by the Piaget-Inhelder text, which chooses to report no middle or older adolescents who function at less than fully formal levels). In this respect, the formal stage differs appreciably from the earlier Piagetian stages, and early adolescence emerges as the age for which a "single path" model of cognitive development becomes seriously inadequate and a more complex model becomes essential. Formal-stage thinking seems best conceptualized, like most other aspects of psychological maturity, as a potentiality only partially attained by most and fully attained only by some.

  10. A Formal Valuation Framework for Emotions and Their Control.

    PubMed

    Huys, Quentin J M; Renz, Daniel

    2017-09-15

    Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  11. An International Survey of Industrial Applications of Formal Methods. Volume 2. Case Studies

    DTIC Science & Technology

    1993-09-30

    ... impact of the product on IBM revenues. 4. Error rates were claimed to be below industrial average and errors were minimal to fix. Formal methods, as ... critical applications. These include: i) "Software failures, particularly under first use, seem ... project to add improved modelling capability. ... Design and Implementation: These products are being…

  12. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements (Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou) ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. Traditional methods and tools…

  13. Experience Using Formal Methods for Specifying a Multi-Agent System

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) is presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS the development team decided to use formal methods to check for race conditions, deadlocks and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes with describing an architecture of tools that would better support the future specification of agents and other concurrent systems.

  14. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  15. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    NASA Technical Reports Server (NTRS)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.
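
    Entropy stability here means a discrete analogue of the continuous entropy inequality: for a convex entropy function \eta(u) with compatible flux q(u) (satisfying q' = \eta' f'), solutions of the conservation law u_t + f(u)_x = 0 obey

        \frac{\partial \eta(u)}{\partial t} + \frac{\partial q(u)}{\partial x} \le 0 ,

    and the summation-by-parts discretization, including its formal boundary closures, is constructed so that a provable discrete counterpart of this inequality holds.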

  16. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  17. Influences of thickness, scanning velocity and relative humidity on the frictional properties of WS2 nanosheets

    NASA Astrophysics Data System (ADS)

    Feng, Dongdong; Peng, Jinfeng; Liu, Sisi; Zheng, Xuejun; Yan, Xinyang; He, Wenyuan

    2018-01-01

    In contrast to the traditional cantilever mechanics method, we propose an extended cantilever mechanics method that calibrates the lateral calibration factor using the normal spring constant obtained from atomic force microscopy (AFM), rather than the Young's modulus and the width of the cantilever, before the influences of thickness, scanning velocity, and humidity on the frictional properties are investigated via friction measurements performed in the lateral force mode (LFM) of AFM. Tungsten disulfide (WS2) nanosheets were prepared through a hydrothermal intercalation and exfoliation route, and AFM and Raman microscopy were used to investigate the frictional properties, thickness, and crystalline structure. The friction force and coefficient decrease monotonically with increasing nanosheet thickness, and the minimum friction coefficient is close to 0.012 when the thickness is larger than 5 nm. The dependence of the friction on the nanosheet's thickness can be explained by the puckering effect of tip-sheet adhesion, according to the thickness dependence of bending stiffness in the framework of continuum mechanics. The friction force is constant at 1.7 nN when the scanning speed is larger than the critical value of 3.10 μm s-1, while it increases logarithmically for scanning speeds below the critical value; this is readily understood through the energy dissipation model and the thermally activated effect. The friction force and friction coefficient increase with relative humidity in the range of 30%-60%, with the latter ranging from 0.010 to 0.013. The influence of relative humidity is discussed via the increasing area of the water monolayer during the water adsorption process. This research can not only enrich nanotribology theory but also promote two-dimensional materials for nanomechanical applications.

  18. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
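
    The formal generalized likelihood of Schoups and Vrugt (2010) additionally treats skewed, heavy-tailed errors; the sketch below keeps only two of its ingredients, heteroscedastic standard deviations and AR(1) residual correlation, to show how such a likelihood departs from the plain Gaussian one. It is a simplified stand-in, not the published formulation.

```python
# Simplified stand-in for a formal generalized likelihood: residual
# standard deviation grows with the simulated value (heteroscedasticity)
# and residuals follow an AR(1) process (temporal correlation).
import numpy as np

def generalized_loglik(obs, sim, sigma0, sigma1, phi):
    """Exact Gaussian AR(1) log-likelihood of residuals e = obs - sim
    with sigma_t = sigma0 + sigma1*sim_t and AR(1) coefficient phi."""
    e = np.asarray(obs, float) - np.asarray(sim, float)
    sigma = sigma0 + sigma1 * np.asarray(sim, float)
    a = e / sigma                           # standardized residuals
    innov = a[1:] - phi * a[:-1]            # decorrelated innovations
    innov = np.concatenate(([a[0] * np.sqrt(1 - phi**2)], innov))
    return (-0.5 * np.sum(innov**2)         # Gaussian core
            - np.sum(np.log(sigma))         # Jacobian of standardization
            + 0.5 * np.log(1 - phi**2)      # stationary start of the AR(1)
            - 0.5 * len(e) * np.log(2 * np.pi))

# Example with synthetic heteroscedastic, correlated residuals
rng = np.random.default_rng(0)
sim = np.linspace(1.0, 2.0, 200)
obs = sim + 0.1 * sim * rng.standard_normal(200)
print(generalized_loglik(obs, sim, sigma0=0.01, sigma1=0.1, phi=0.3))
```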

  19. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  20. A Norm Pairing in Formal Modules

    NASA Astrophysics Data System (ADS)

    Vostokov, S. V.

    1980-02-01

    A pairing of the multiplicative group of a local field (a finite extension of the field of p-adic numbers Qp) with the group of points of a Lubin-Tate formal group is defined explicitly. The values of the pairing are roots of an isogeny of the formal group. The main properties of this pairing are established: bilinearity, invariance under the choice of a local uniformizing element, and independence of the method of expanding elements into series with respect to this uniformizing element. These properties of the pairing are used to prove that it agrees with the generalized Hilbert norm residue symbol when the field over whose ring of integers the formal group is defined is totally ramified over Qp. This yields an explicit expression for the generalized Hilbert symbol on the group of points of the formal group. Bibliography: 12 titles.

  1. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids assisting physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions remains a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software; therefore, software verification of modern medical systems has received attention. Such verification can be achieved with formal languages having mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems for the present study. The system is responsible for monitoring the diabetic patient's blood sugar.
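
    The Z schemas themselves are not given in the abstract. The following sketch, with purely illustrative thresholds and dose limits (none taken from the CIIP specification), shows the kind of patient constraint a Z schema states declaratively, here checked as an executable invariant:

```python
# Minimal sketch with hypothetical thresholds: the kind of patient
# constraint a Z specification of an insulin pump states in its
# predicate part, checked here as a runtime invariant.
SAFE_MIN, SAFE_MAX = 4.0, 10.0      # blood sugar, mmol/L (illustrative only)
MAX_SINGLE_DOSE = 4.0               # insulin units (illustrative only)

def dose_decision(blood_sugar, proposed_dose):
    """Return the dose actually administered, enforcing the invariants:
    never dose below the safe range, never exceed the single-dose cap."""
    if blood_sugar < SAFE_MIN:
        return 0.0                   # hypoglycemia: dosing is forbidden
    if blood_sugar <= SAFE_MAX:
        return 0.0                   # in range: no correction needed
    return min(proposed_dose, MAX_SINGLE_DOSE)

# The invariants, asserted over a few representative states:
for bs, dose in [(3.2, 2.0), (7.0, 1.0), (15.5, 6.0)]:
    given = dose_decision(bs, dose)
    assert not (bs < SAFE_MIN and given > 0)     # no dose when hypoglycemic
    assert given <= MAX_SINGLE_DOSE              # dose cap respected
    print(bs, "->", given)
```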

  2. Formal Solutions for Polarized Radiative Transfer. III. Stiffness and Instability

    NASA Astrophysics Data System (ADS)

    Janett, Gioele; Paganini, Alberto

    2018-04-01

    Efficient numerical approximation of the polarized radiative transfer equation is challenging because this system of ordinary differential equations exhibits stiff behavior, which potentially results in numerical instability. This negatively impacts the accuracy of formal solvers, and small step-sizes are often necessary to retrieve physical solutions. This work presents stability analyses of formal solvers for the radiative transfer equation of polarized light, identifies instability issues, and suggests practical remedies. In particular, the assumptions and the limitations of the stability analysis of Runge–Kutta methods play a crucial role. On this basis, a suitable and pragmatic formal solver is outlined and tested. An insightful comparison to the scalar radiative transfer equation is also presented.
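
    A standard way to see the step-size restriction described above is through the stability function of an explicit Runge-Kutta scheme. The sketch below (classical RK4 applied to the test equation u' = λu; this is textbook material, not the paper's solver) shows how a stiff decay rate forces very small steps:

```python
# Stability function of classical explicit RK4 on u' = lambda*u:
# R(z) = 1 + z + z^2/2 + z^3/6 + z^4/24, with z = h*lambda.
# The step is stable only while |R(z)| <= 1; for a stiff (large negative)
# lambda this forces h to be very small, as described in the abstract.

def rk4_stability(z):
    return 1 + z + z**2 / 2 + z**3 / 6 + z**4 / 24

lam = -1.0e3                      # stiff decay rate
for h in [1e-2, 5e-3, 2.78e-3, 1e-3]:
    amp = abs(rk4_stability(h * lam))
    status = "stable" if amp <= 1 else "UNSTABLE"
    print(f"h = {h:.2e}: |R(h*lambda)| = {amp:.3f} ({status})")
```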

  3. Deriving Safety Cases from Machine-Generated Proofs

    NASA Technical Reports Server (NTRS)

    Basir, Nurlida; Fischer, Bernd; Denney, Ewen

    2009-01-01

    Proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because they use machine-oriented formalisms; they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction proofs and show how to construct the safety cases by covering the proof tree with corresponding safety case fragments.

  4. Identification of overlapping communities and their hierarchy by locally calculating community-changing resolution levels

    NASA Astrophysics Data System (ADS)

    Havemann, Frank; Heinz, Michael; Struck, Alexander; Gläser, Jochen

    2011-01-01

    We propose a new local, deterministic and parameter-free algorithm that detects fuzzy and crisp overlapping communities in a weighted network and simultaneously reveals their hierarchy. Using a local fitness function, the algorithm greedily expands natural communities of seeds until the whole graph is covered. The hierarchy of communities is obtained analytically by calculating resolution levels at which communities grow rather than numerically by testing different resolution levels. This analytic procedure is not only more exact than its numerical alternatives such as LFM and GCE but also much faster. Critical resolution levels can be identified by searching for intervals in which large changes of the resolution do not lead to growth of communities. We tested our algorithm on benchmark graphs and on a network of 492 papers in information science. Combined with a specific post-processing, the algorithm gives much more precise results on LFR benchmarks with high overlap compared to other algorithms and performs very similarly to GCE.
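
    The greedy, fitness-driven expansion shared by this family of local methods can be sketched as follows, using the LFM-style fitness f(C) = k_in / (k_in + k_out)^alpha. The paper's actual contribution, computing the community-changing resolution levels alpha analytically rather than rescanning them numerically, is not shown here; this is only the common skeleton.

```python
# Greedy local community expansion with an LFM-style fitness function.
# Sketch of the shared skeleton of LFM/GCE-type methods, not the
# analytic resolution-level procedure contributed by the paper.
import networkx as nx

def fitness(G, community, alpha):
    k_in = 2 * G.subgraph(community).number_of_edges()  # internal degree
    k_tot = sum(G.degree(v) for v in community)
    k_out = k_tot - k_in                                # boundary degree
    return k_in / (k_in + k_out) ** alpha if k_in + k_out else 0.0

def expand_community(G, seed, alpha=1.0):
    """Grow the seed's natural community: repeatedly add the neighboring
    node that most increases fitness, until no addition improves it."""
    community = {seed}
    improved = True
    while improved:
        improved = False
        frontier = {u for v in community for u in G[v]} - community
        best, best_f = None, fitness(G, community, alpha)
        for u in frontier:
            f = fitness(G, community | {u}, alpha)
            if f > best_f:
                best, best_f = u, f
        if best is not None:
            community.add(best)
            improved = True
    return community

G = nx.karate_club_graph()
print(sorted(expand_community(G, seed=0, alpha=1.0)))
```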

  5. Roughness-dependent tribology effects on discontinuous shear thickening

    PubMed Central

    Hsu, Chiao-Peng; Ramakrishna, Shivaprakash N.; Zanini, Michele; Spencer, Nicholas D.

    2018-01-01

    Surface roughness affects many properties of colloids, from depletion and capillary interactions to their dispersibility and use as emulsion stabilizers. It also impacts particle–particle frictional contacts, which have recently emerged as being responsible for the discontinuous shear thickening (DST) of dense suspensions. Tribological properties of these contacts have been rarely experimentally accessed, especially for nonspherical particles. Here, we systematically tackle the effect of nanoscale surface roughness by producing a library of all-silica, raspberry-like colloids and linking their rheology to their tribology. Rougher surfaces lead to a significant anticipation of DST onset, in terms of both shear rate and solid loading. Strikingly, they also eliminate continuous thickening. DST is here due to the interlocking of asperities, which we have identified as “stick–slip” frictional contacts by measuring the sliding of the same particles via lateral force microscopy (LFM). Direct measurements of particle–particle friction therefore highlight the value of an engineering-tribology approach to tuning the thickening of suspensions. PMID:29717043

  6. Simulating the Fate of an Ionospheric Mass Ejection

    NASA Astrophysics Data System (ADS)

    Moore, T. E.; Fok, M. H.; Delcourt, D. C.; Slinker, S. P.; Fedder, J. A.

    2008-12-01

    We report global ion kinetic (GIK) simulations of the 24-25 Sep 1998 storm, with all relevant ionospheric outflows including polar, auroral, and plasmaspheric winds. This storm included substantial periods of northward interplanetary magnetic field, but did develop a Dst of -200 nT at its peak. The solar disturbance resulted from a coronal mass ejection that reached a peak dynamic pressure at the magnetosphere of 6.2 nPa, and produced a substantial enhancement of auroral wind oxygen outflow from the dayside, which has been termed an "ionospheric mass ejection" in an earlier observational paper. We use the LFM global simulation model to produce electric and magnetic fields in the outer magnetosphere, the Strangeway-Zheng outflow scalings with Delcourt ion trajectories to include ionospheric outflows, and the Fok-Ober inner magnetospheric model for the plasmaspheric and ring current response to all particle populations. We assess the combined contributions of heliospheric and geospheric plasmas to the ring current for this event.

  7. Roughness-dependent tribology effects on discontinuous shear thickening.

    PubMed

    Hsu, Chiao-Peng; Ramakrishna, Shivaprakash N; Zanini, Michele; Spencer, Nicholas D; Isa, Lucio

    2018-05-15

    Surface roughness affects many properties of colloids, from depletion and capillary interactions to their dispersibility and use as emulsion stabilizers. It also impacts particle-particle frictional contacts, which have recently emerged as being responsible for the discontinuous shear thickening (DST) of dense suspensions. Tribological properties of these contacts have been rarely experimentally accessed, especially for nonspherical particles. Here, we systematically tackle the effect of nanoscale surface roughness by producing a library of all-silica, raspberry-like colloids and linking their rheology to their tribology. Rougher surfaces lead to a significant anticipation of DST onset, in terms of both shear rate and solid loading. Strikingly, they also eliminate continuous thickening. DST is here due to the interlocking of asperities, which we have identified as "stick-slip" frictional contacts by measuring the sliding of the same particles via lateral force microscopy (LFM). Direct measurements of particle-particle friction therefore highlight the value of an engineering-tribology approach to tuning the thickening of suspensions. Copyright © 2018 the Author(s). Published by PNAS.

  8. GEM-CEDAR Study of Ionospheric Energy Input and Joule Dissipation

    NASA Technical Reports Server (NTRS)

    Rastaetter, Lutz; Kuznetsova, Maria M.; Shim, Jasoon

    2012-01-01

    We are studying ionospheric model performance for six events selected for the GEM-CEDAR modeling challenge. DMSP measurements of electric and magnetic fields are converted into Poynting flux values that estimate the energy input into the ionosphere. Models generate rates of ionospheric Joule dissipation that are compared to the energy influx. Models include the ionosphere models CTIPe and Weimer and the ionospheric electrodynamic outputs of the global magnetosphere models SWMF, LFM, and OpenGGCM. This study evaluates the model performance in terms of the overall balance between energy influx and dissipation and tests the assumption that Joule dissipation occurs locally where electromagnetic energy flux enters the ionosphere. We present results in terms of skill scores now commonly used in metrics and validation studies, and we measure the agreement in the temporal and spatial distribution of dissipation (i.e., the location of auroral activity) along passes of the DMSP satellite, as a function of the passes' proximity to the magnetic pole and the solar wind activity level.

  9. Matched Bearing Processing for Airborne Source Localization by an Underwater Horizontal Line Array

    NASA Astrophysics Data System (ADS)

    Peng, Zhao-Hui; Li, Zheng-Lin; Wang, Guang-Xu

    2010-11-01

    The location of an airborne source is estimated from signals measured by a horizontal line array (HLA), based on the fact that a signal transmitted by an airborne source will reach an underwater hydrophone in different ways: via a direct refracted path, via one or more bottom and surface reflections, or via the so-called lateral wave. As a result, when an HLA near the airborne source is used for beamforming, several peaks at different bearing angles will appear. By matching the experimental beamforming outputs with the predicted outputs for all candidate source locations, the most likely location is the one which gives the minimum difference. An experiment on airborne source localization was conducted in the Yellow Sea in October 2008. An HLA was laid on the sea bottom at a depth of 30 m. A high-power loudspeaker was hung from a research ship floating near the HLA and sent out LFM pulses. The estimated location of the loudspeaker is in good agreement with the GPS measurements.
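
    The matching step can be sketched as a grid search. The substantive part of the method, the acoustic propagation model that predicts the beamformer output for a candidate source location, is stubbed out below with a placeholder; everything here is illustrative, not the paper's model.

```python
# Matched-bearing grid search: pick the candidate location whose
# predicted beam pattern differs least from the measured one.
import numpy as np

def predict_beam(candidate, angles):
    """Stub: in the real method this comes from an acoustic propagation
    model giving the HLA beamformer output vs. bearing for a candidate
    airborne source position (range in km, height in m)."""
    rng_km, height_m = candidate
    peak = np.degrees(np.arctan2(height_m / 1000.0, rng_km))
    return np.exp(-0.5 * ((angles - peak) / 5.0) ** 2)

def locate(measured, angles, candidates):
    """Least-squares match of measured vs. predicted beam outputs."""
    best, best_err = None, np.inf
    for cand in candidates:
        err = np.sum((measured - predict_beam(cand, angles)) ** 2)
        if err < best_err:
            best, best_err = cand, err
    return best

angles = np.linspace(0.0, 90.0, 181)
true_source = (2.0, 300.0)                      # 2 km range, 300 m height
measured = predict_beam(true_source, angles)
grid = [(r, h) for r in np.arange(0.5, 5.1, 0.5) for h in (100.0, 300.0, 500.0)]
print(locate(measured, angles, grid))           # -> (2.0, 300.0)
```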

  10. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  11. Combining Education and Work; Experiences in Asia and Oceania: Bangladesh.

    ERIC Educational Resources Information Center

    Dacca Univ., Bangladesh. Inst. of Education and Research.

    Bangladesh stresses the importance of education responsive to the country's development needs and capable of producing, through formal or non-formal methods, skilled, employable manpower. Although no pre-vocational training exists, new curricula have introduced practical work experience in the primary schools and have integrated agriculture,…

  12. Results of Two Tenth-Grade Biology Teaching Procedures.

    ERIC Educational Resources Information Center

    Purser, Roger K.; Renner, John W.

    1983-01-01

    Examined influence of teaching methods on content achievement of concrete and formal concepts by students differing in level of operational thought and influence of concrete/formal teaching on the intellectual development of students (N=86 grade 9-10 biology students). Methodology, results, conclusions, and implications are discussed. (Author/JN)

  13. A Survey of Formal Methods for Intelligent Swarms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single-spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the Earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA; it represents the cutting edge in system correctness and requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near-continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., that the underlying system will go from one state to another, or will never enter a specific state) and to check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for the verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft, using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.

  14. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.

  15. Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.

    PubMed

    Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin

    2013-08-01

    The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been used as the main dose calculation formalism in TPSs. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms. The purpose of this study is to design and establish a treatment planning software for the Cs-137 Selectron source based on the TG-43U1 formalism, applying the effects of the applicator and dummy spacers. In this planning system, the dosimetry parameters of each pellet at different places inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for special combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO packages with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, whereas the TG-43U1-based software performs the calculations more accurately.
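
    The superposition step can be sketched with the TG-43 point-source approximation, D(r) = S_K Λ (r0/r)² g(r) φ_an(r), summed over the active pellets. Note that the full TG-43U1 formalism uses the 2D line-source geometry function and the anisotropy function F(r, θ), and the numerical parameters below are placeholders, not Cs-137 data.

```python
# Sketch of dose superposition in the TG-43 *point-source* approximation;
# g_radial and phi_anisotropy are placeholder functions, not Cs-137 data.
import numpy as np

R0 = 1.0                                    # reference distance, cm

def g_radial(r):
    """Placeholder radial dose function, normalized so g(R0) = 1."""
    return np.exp(-0.03 * (r - R0))

def phi_anisotropy(r):
    """Placeholder 1D anisotropy function."""
    return 0.95

def dose_rate(point, pellets, air_kerma_strength, dose_rate_constant):
    """Sum point-source TG-43 contributions from each *active* pellet."""
    total = 0.0
    for pellet_pos, active in pellets:
        if not active:
            continue                        # inactive spacer contributes no dose
        r = np.linalg.norm(np.asarray(point) - np.asarray(pellet_pos))
        total += (air_kerma_strength * dose_rate_constant
                  * (R0 / r) ** 2 * g_radial(r) * phi_anisotropy(r))
    return total

# 10 active pellets spaced 2.5 mm apart along the applicator axis (z, cm)
pellets = [((0.0, 0.0, 0.25 * i), True) for i in range(10)]
print(dose_rate((0.0, 0.0, 3.25), pellets,  # point 1 cm beyond the last pellet
                air_kerma_strength=10.0, dose_rate_constant=1.1))
```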

  16. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments, and the methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Test Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  17. Gaussian-based techniques for quantum propagation from the time-dependent variational principle: Formulation in terms of trajectories of coupled classical and quantum variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shalashilin, Dmitrii V.; Burghardt, Irene

    2008-08-28

    In this article, two coherent-state based methods of quantum propagation, namely, coupled coherent states (CCS) and Gaussian-based multiconfiguration time-dependent Hartree (G-MCTDH), are put on the same formal footing, using a derivation from a variational principle in Lagrangian form. By this approach, oscillations of the classical-like Gaussian parameters and oscillations of the quantum amplitudes are formally treated in an identical fashion. We also suggest a new approach denoted here as coupled coherent states trajectories (CCST), which completes the family of Gaussian-based methods. Using the same formalism for all related techniques allows their systematization and a straightforward comparison of their mathematical structure and cost.

  18. Fellows' in intensive care medicine views on professionalism and how they learn it.

    PubMed

    van Mook, Walther N K A; de Grave, Willem S; Gorter, Simone L; Muijtjens, Arno M M; Zwaveling, Jan Harm; Schuwirth, Lambert W; van der Vleuten, Cees P M

    2010-02-01

    The emphasis on the importance of professionalism in a recent CoBaTrICE-IT paper was impressive. However, insight into the elements of professionalism perceived as relevant for intensivists from the fellows' view, and how these are taught and learned, is limited. A nationwide study was performed in 2007-2008. All ICM fellows (n = 90) were sent a questionnaire containing the following questions regarding training in professionalism (7-point Likert scale; 1 = very inadequate, 7 = very adequate): Which elements are perceived to be important in intensivists' daily practice (38 items, cat. I)? Which methods of learning and teaching are recognised (16 items, cat. II)? Which methods of teaching and learning are considered especially useful (16 items, cat. III)? Finally, the perceived quantity and quality of formal and informal learning methods, as well as the responsible organisational body, were studied. Data were analysed using SPSS 15.0. The response rate was 75.5% (n = 68), mean age 34 years. Regarding Elements, scores on virtually all items were high. The factor 'striving for excellence' explained half the variance. Two other aspects, 'teamwork' and 'dealing with ethical dilemmas', were identified. Regarding Methods, three dimensions, 'formal curriculum', 'private and academic experiences' and 'role modelling', proved important. The factor 'formal curriculum' explained most of the variance. Regarding Usefulness, the same factors emerged, with the variance now mainly explained by the factor 'private and academic experiences'. In both categories the items 'observations in daily practice' and 'watching television programmes like ER and House' were the highest- and lowest-scoring items (5.99 and 5.81, and 2.69 and 2.49, respectively). Mean scores regarding the quantity of formal and informal teaching were 4.06 and 4.58 (range 1.841 and 1.519). For the quality of teaching, the figures were 4.22 and 4.52 (range 1.659 and 1.560, respectively). Fifty-four suggestions for improvement of teaching were documented. The need for some form of formal teaching of professionalism aspects, as well as for feedback, was most frequently mentioned (n = 19 and 16). The local training centres are considered, and should remain, pivotal for teaching professionalism issues (n = 17 and 28). Almost all elements of professionalism were considered relevant to intensivists' daily practice. Although formal teaching methods regarding professionalism aspects are easily recognised in daily practice, learning by personal experiences and informal ways quantitatively plays a more important, and more valued, role. Qualitative comments nevertheless stress the need for providing and receiving (solicited and unsolicited) feedback, thereby requesting expansion of formal teaching methods. The local training centres (should continue to) play a major role in teaching professionalism, although an additional role for the (inter)national intensive care organisations remains.

  19. Analytical solution of Schrödinger equation in minimal length formalism for trigonometric potential using hypergeometry method

    NASA Astrophysics Data System (ADS)

    Nurhidayati, I.; Suparmi, A.; Cari, C.

    2018-03-01

    The Schrödinger equation has been extended by applying the minimal length formalism for a trigonometric potential. The wave function and energy spectra were used to describe the behavior of a subatomic particle. The wave function and energy spectra were obtained by using the hypergeometric method. The result showed that the energy increased with both the minimal length parameter and the potential parameter. The energy was calculated numerically using MATLAB.
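
    For orientation, the minimal length formalism referred to here is usually stated through a modified commutator of the Kempf type; the relations below are standard background, not the paper's derivation.

```latex
% Standard minimal-length (generalized uncertainty principle) relations:
% modified commutator, modified uncertainty relation, and the resulting
% minimal position uncertainty.
\begin{align*}
  [\hat{x}, \hat{p}] &= i\hbar\,\bigl(1 + \beta \hat{p}^{2}\bigr), &
  \Delta x \,\Delta p &\ge \frac{\hbar}{2}\Bigl(1 + \beta (\Delta p)^{2}\Bigr), &
  (\Delta x)_{\min} &= \hbar\sqrt{\beta}.
\end{align*}
```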

  20. Keldysh formalism for multiple parallel worlds

    NASA Astrophysics Data System (ADS)

    Ansari, M.; Nazarov, Y. V.

    2016-03-01

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  1. Formal Analysis of Extended Well-Clear Boundaries for Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony

    2016-01-01

    This paper concerns the application of formal methods to the definition of a detect and avoid concept for unmanned aircraft systems (UAS). In particular, it illustrates how formal analysis was used to explain and correct unexpected behaviors of the logic that issues alerts when two aircraft are predicted not to be well clear from one another. As a result of this analysis, a recommendation was proposed to, and subsequently adopted by, the US standards organization that defines the minimum operational requirements for the UAS detect and avoid concept.

  2. Keldysh formalism for multiple parallel worlds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansari, M.; Nazarov, Y. V., E-mail: y.v.nazarov@tudelft.nl

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  3. Access to timely formal dementia care in Europe: protocol of the Actifcare (ACcess to Timely Formal Care) study.

    PubMed

    Kerpershoek, Liselot; de Vugt, Marjolein; Wolfs, Claire; Jelley, Hannah; Orrell, Martin; Woods, Bob; Stephan, Astrid; Bieber, Anja; Meyer, Gabriele; Engedal, Knut; Selbaek, Geir; Handels, Ron; Wimo, Anders; Hopper, Louise; Irving, Kate; Marques, Maria; Gonçalves-Pereira, Manuel; Portolani, Elisa; Zanetti, Orazio; Verhey, Frans

    2016-08-23

    Previous findings indicate that people with dementia and their informal carers experience difficulties accessing and using formal care services due to a mismatch between needs and service use. This mismatch causes overall dissatisfaction and is a waste of the scarce financial care resources. This article presents the background and methods of the Actifcare (ACcess to Timely Formal Care) project. This is a European study aiming at best-practice development in finding timely access to formal care for community-dwelling people with dementia and their informal carers. There are five main objectives: 1) Explore predisposing and enabling factors associated with the use of formal care, 2) Explore the association between the use of formal care, needs and quality of life and 3) Compare these across European countries, 4) Understand the costs and consequences of formal care services utilization in people with unmet needs, 5) Determine the major costs and quality of life drivers and their relationship with formal care services across European countries. In a longitudinal cohort study conducted in eight European countries approximately 450 people with dementia and informal carers will be assessed three times in 1 year (baseline, 6 and 12 months). In this year we will closely monitor the process of finding access to formal care. Data on service use, quality of life and needs will be collected. The results of Actifcare are expected to reveal best-practices in organizing formal care. Knowledge about enabling and predisposing factors regarding access to care services, as well as its costs and consequences, can advance the state of the art in health systems research into pathways to dementia care, in order to benefit people with dementia and their informal carers.

  4. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  5. Some Aspects of Forecasting Severe Thunderstorms during Cool-Season Return-Flow Episodes.

    NASA Astrophysics Data System (ADS)

    Weiss, Steven J.

    1992-08-01

    Historically, the Gulf of Mexico has been considered a primary source of water vapor that influences the weather for much of the United States east of the Rocky Mountains. Although severe thunderstorms and tornadoes occur most frequently during the spring and summer months, the periodic transport of Gulf moisture inland ahead of traveling baroclinic waves can result in significant severe-weather episodes during the cool season. To gain insight into the short-range skill in forecasting surface synoptic patterns associated with moisture return from the Gulf, operational numerical weather prediction models from the National Meteorological Center were examined. Sea level pressure fields from the Limited-Area Fine-Mesh Model (LFM), Nested Grid Model (NGM), and the aviation (AVN) run of the Global Spectral Model, valid 48 h after initial data time, were evaluated for three cool-season cases that preceded severe local storm outbreaks. The NGM and AVN provided useful guidance in forecasting the onset of return flow along the Gulf coast, with a tendency to be slightly slow in developing the return flow. In contrast, the LFM typically overforecast the occurrence of return flow and tended to `open the Gulf' from west to east too quickly. Although the low-level synoptic pattern may be forecast correctly, the overall prediction process is hampered by a data void over the Gulf. It is hypothesized that when the return-flow moisture is located over the Gulf, model forecasts of stability and the resultant operational severe local storm forecasts are less skillful than in situations when the moisture has already spread inland. This hypothesis is tested by examining the performance of the initial second-day (day 2) severe thunderstorm outlook issued by the National Severe Storms Forecast Center during the Gulf of Mexico Experiment (GUFMEX) in early 1988. It was found that characteristically different air masses were present along the Gulf coast prior to the issuance of outlooks that accurately predicted the occurrence of severe thunderstorms versus outlooks that did not verify well. Unstable air masses with ample low-level moisture were in place along the coast prior to the issuance of the `good' day 2 outlooks, whereas relatively dry, stable air masses were present before the issuance of `false-alarm' outlooks. In the latter cases, large errors in the NGM 48-h lifted-index predictions were located north of the Gulf coast.

  6. Connecting Formal and Informal Learning Experiences

    ERIC Educational Resources Information Center

    O'Mahony, Timothy Kieran

    2010-01-01

    The learning study reports on part of a larger project being led by the author. In this dissertation I explore one goal of this project--to understand effects on student learning outcomes as a function of using different methods for connecting out-of-school experiential learning with formal school-based instruction. There is a long history of…

  7. An Evaluation of the Preceptor Model versus the Formal Teaching Model.

    ERIC Educational Resources Information Center

    Shamian, Judith; Lemieux, Suzanne

    1984-01-01

    This study evaluated the effectiveness of two teaching methods to determine which is more effective in enhancing the knowledge base of participating nurses: the preceptor model embodies decentralized instruction by a member of the nursing staff, and the formal teaching model uses centralized teaching by the inservice education department. (JOW)

  8. Non-Formal Alternatives to Schooling: A Glossary of Educational Methods.

    ERIC Educational Resources Information Center

    Massachusetts Univ., Amherst. Center for International Education.

    This document describes activities in the field of nonformal education as an aid to educators as they develop programs to meet individual student needs. Advantages of nonformal education include that it is need-oriented, less expensive than formal systems, flexible, involves peer teaching, and does not encourage elitist feelings among students.…

  9. The Personnel Effectiveness Grid (PEG): A New Tool for Estimating Personnel Department Effectiveness

    ERIC Educational Resources Information Center

    Petersen, Donald J.; Malone, Robert L.

    1975-01-01

    Examines the difficulties inherent in attempting a formal personnel evaluation system, the major formal methods currently used for evaluating personnel department accountabilities, some parameters that should be part of a valid evaluation program, and a model for conducting the evaluation. (Available from Office of Publications, Graduate School of…

  10. 77 FR 39700 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ..., and teaching hospitals and non-profit research institutes that are either owned by or formally... to determine the most appropriate and effective method of compliance with these requirements by... teaching hospital that is not owned by a college or university must keep a copy of its formal written...

  11. A Formal Construction of Term Classes. Technical Report No. TR73-18.

    ERIC Educational Resources Information Center

    Yu, Clement T.

    The computational complexity of a formal process for the construction of term classes for information retrieval is examined. While the process is proven to be difficult computationally, heuristic methods are applied. Experimental results are obtained to illustrate the maximum possible improvement in system performance of retrieval using the formal…

  12. School Policies and Practices that Improve Indoor Air Quality

    ERIC Educational Resources Information Center

    Jones, Sherry Everett; Smith, Alisa M.; Wheeler, Lani S.; McManus, Tim

    2010-01-01

    Background: To determine whether schools with a formal indoor air quality management program were more likely than schools without a formal program to have policies and practices that promote superior indoor air quality. Methods: This study analyzed school-level data from the 2006 School Health Policies and Programs Study, a national study of…

  13. Thermal Cyclotron Absorption Coefficients. II. Opacities in the Stokes Formalism

    NASA Astrophysics Data System (ADS)

    Vaeth, H. M.; Chanmugam, G.

    1995-05-01

    We extend the discussion of the calculation of the cyclotron opacities α± of the ordinary and extraordinary mode (Chanmugam et al.) to the opacities κ, q, υ in the Stokes formalism. We derive formulae with which α± can be calculated from κ, q, υ. We are hence able to compare our calculations of the opacities, which are based on the single-particle method, with results obtained with the dielectric tensor method of Tamor. Excellent agreement is achieved. We present extensive tables of the opacities in the Stokes formalism for frequencies up to 25 ωc, where ωc is the cyclotron frequency, and temperatures kT = 5, 10, 20, 30, 40, and 50 keV. Furthermore, we derive approximate formulae with which κ, q, υ can be calculated from α±, and hence use the Robinson & Melrose analytic formulae for α± to calculate the opacities in the Stokes formalism. We compare these opacities to accurate numerical opacities and find that the analytic formulae can reproduce the qualitative behavior of the opacities in the regions where the harmonic structure is unimportant.

  14. Efficient calculation of beyond RPA correlation energies in the dielectric matrix formalism

    NASA Astrophysics Data System (ADS)

    Beuerle, Matthias; Graf, Daniel; Schurkus, Henry F.; Ochsenfeld, Christian

    2018-05-01

    We present efficient methods to calculate beyond random phase approximation (RPA) correlation energies for molecular systems with up to 500 atoms. To reduce the computational cost, we employ the resolution-of-the-identity and a double-Laplace transform of the non-interacting polarization propagator in conjunction with an atomic orbital formalism. Further improvements are achieved using integral screening and the introduction of Cholesky decomposed densities. Our methods are applicable to the dielectric matrix formalism of RPA including second-order screened exchange (RPA-SOSEX), the RPA electron-hole time-dependent Hartree-Fock (RPA-eh-TDHF) approximation, and RPA renormalized perturbation theory using an approximate exchange kernel (RPA-AXK). We give an application of our methodology by presenting RPA-SOSEX benchmark results for the L7 test set of large, dispersion dominated molecules, yielding a mean absolute error below 1 kcal/mol. The present work enables calculating beyond RPA correlation energies for significantly larger molecules than possible to date, thereby extending the applicability of these methods to a wider range of chemical systems.
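
    For reference, the dielectric-matrix (adiabatic-connection fluctuation-dissipation) expression for the RPA correlation energy that underlies these methods is the standard one, stated here for orientation:

```latex
% Standard ACFD / dielectric-matrix expression for the RPA correlation
% energy; chi_0 is the non-interacting polarization propagator and v the
% Coulomb kernel, integrated along the imaginary frequency axis.
\begin{equation*}
  E_c^{\mathrm{RPA}}
    = \frac{1}{2\pi} \int_{0}^{\infty} \mathrm{d}\omega \;
      \mathrm{Tr}\!\left[ \ln\!\bigl(1 - \chi_0(i\omega)\,v\bigr)
                          + \chi_0(i\omega)\,v \right].
\end{equation*}
```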

  15. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods, different approaches for higher-level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  16. Astronomical Distance Determination in the Space Age. Secondary Distance Indicators

    NASA Astrophysics Data System (ADS)

    Czerny, Bożena; Beaton, Rachael; Bejger, Michał; Cackett, Edward; Dall'Ora, Massimo; Holanda, R. F. L.; Jensen, Joseph B.; Jha, Saurabh W.; Lusso, Elisabeta; Minezaki, Takeo; Risaliti, Guido; Salaris, Maurizio; Toonen, Silvia; Yoshii, Yuzuru

    2018-02-01

    The formal division of the distance indicators into primary and secondary leads to difficulties in describing methods that can actually be used in two ways: with and without the support of other methods for scaling. Thus, instead of concentrating on the scaling requirement, we concentrate on all methods of distance determination to extragalactic sources that are intended, at least formally, for use with individual sources. Among these, Type Ia supernovae are clearly the leader due to their enormous success in the determination of the expansion rate of the Universe. However, new methods are rapidly developing, and there is also progress in more traditional methods. We give a general overview of the methods, but we mostly concentrate on the most recent developments in each field and future expectations.

  17. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller.

  18. Formalization of the Access Control on ARM-Android Platform with the B Method

    NASA Astrophysics Data System (ADS)

    Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing

    2018-01-01

    ARM-Android is a widespread mobile platform with multi-layer access control mechanisms that are security-critical to the system. Many access control vulnerabilities still exist due to the coarse-grained policy and numerous engineering defects, and these have been widely studied. However, little research has focused on formalizing these mechanisms, including the Android permission framework, kernel process management, and hardware isolation. This paper first develops a comprehensive formal access control model of the ARM-Android platform using the B method, from the Android middleware down to the hardware layer. All the model specifications are type checked and proved to be well-defined, with 75% of the proof obligations discharged automatically. The results show that the proposed B model is feasible for specifying and verifying access control schemes in the ARM-Android system, and capable of implementing a practical control module.

  19. Unravelling the Lifelong Learning Process for Canadian Workers and Adult Learners Acquiring Higher Skills

    ERIC Educational Resources Information Center

    Taylor, Maurice; Trumpower, David; Pavic, Ivana

    2013-01-01

    This article reports on a mixed methods study that investigated aspects of formal, non-formal and informal learning for workers and adult high school learners seeking literacy and essential skills. Three key themes emerged from the qualitative data: motivations for participation in various forms of learning; seeking out informal learning…

  20. Attitudes to Formal Business Training and Learning amongst Entrepreneurs in the Cultural Industries: Situated Business Learning through "Doing with Others."

    ERIC Educational Resources Information Center

    Raffo, Carlo; O'Connor, Justin; Lovatt, Andy; Banks, Mark

    2000-01-01

    Presents arguments supporting a social model of learning linked to situated learning and cultural capital. Critiques training methods used in cultural industries (arts, publishing, broadcasting, design, fashion, restaurants). Uses case study evidence to demonstrate the inadequacies of formal training in this sector. (Contains 49 references.) (SK)

  1. Diagnostic games: from adequate formalization of clinical experience to structure discovery.

    PubMed

    Shifrin, Michael A; Kasparova, Eva I

    2008-01-01

    A method of obtaining well-founded and reproducible results in clinical decision making is presented. It is based on "diagnostic games", a procedure for the elicitation and formalization of experts' knowledge and experience. The use of this procedure allows decision rules to be formulated in terms of an adequate language, rules that are both unambiguous and clinically clear.

  2. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out by paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction-based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 phosphorylation, the pathway leading to the death of cancer stem cells, and tumor growth based on cancer stem cells, which is used for prognosis and future drug designs to treat cancer patients. PMID:28671950
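
    Informally, the reaction kinetics being formalized is mass action: a reaction's rate is proportional to the product of its reactant concentrations. The following is a minimal numerical illustration only, unrelated to the HOL Light proofs; the reaction and rate constant are hypothetical.

```python
# Mass-action kinetics for the hypothetical reaction A + B -> C with
# rate v = k*[A]*[B], integrated with a simple forward-Euler scheme.

def simulate_mass_action(k, a0, b0, dt=1e-3, t_end=5.0):
    """Return final concentrations (a, b, c) after t_end time units."""
    a, b, c = a0, b0, 0.0
    for _ in range(int(t_end / dt)):
        v = k * a * b              # mass-action rate
        a, b, c = a - v * dt, b - v * dt, c + v * dt
    return a, b, c

print(simulate_mass_action(k=1.0, a0=1.0, b0=0.5))
```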

  3. The Formalism of Quantum Mechanics Specified by Covariance Properties

    NASA Astrophysics Data System (ADS)

    Nisticò, G.

    2009-03-01

    The known methods, due for instance to G.W. Mackey and T.F. Jordan, which exploit the transformation properties with respect to the Euclidean and Galilean groups to determine the formalism of the quantum theory of a localizable particle, fail in the case that the considered transformations are not symmetries of the physical system. In the present work we show that the formalism of standard Quantum Mechanics for a particle without spin can be completely recovered by exploiting the covariance properties with respect to the group of Euclidean transformations, without requiring that these transformations be symmetries of the physical system.

  4. Can Regulatory Bodies Expect Efficient Help from Formal Methods?

    NASA Technical Reports Server (NTRS)

    Lopez Ruiz, Eduardo R.; Lemoine, Michel

    2010-01-01

    In the context of EDEMOI (a French national project that proposed the use of semi-formal and formal methods to infer the consistency and robustness of aeronautical regulations through the analysis of faithfully representative models), a methodology had been suggested and applied for different safety- and security-related aeronautical regulations. This paper summarizes the preliminary results of this experience by stating what the methodology's expected benefits were, from a scientific point of view, and what its useful benefits are, from a regulatory body's point of view.

  5. Formal Modeling and Analysis of a Preliminary Small Aircraft Transportation System (SATS) Concept

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Gottliebsen, Hanne; Butler, Ricky; Kalvala, Sara

    2004-01-01

    New concepts for automating air traffic management functions at small non-towered airports raise serious safety issues associated with the software implementations and their underlying key algorithms. The criticality of such software systems necessitates that strong guarantees of safety be developed for them. In this paper we present a formal method for modeling and verifying such systems using the PVS theorem proving system. The method is demonstrated on a preliminary concept of operation for the Small Aircraft Transportation System (SATS) project at NASA Langley.

  6. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modeling, and to make the software process model accord better with industrial standards, it is necessary to study the object-oriented modeling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modeling tool, this paper combines Petri net modeling with object-oriented modeling ideas and provides an implementation method for converting EPMM models based on Petri nets into object models based on object-oriented description.
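
    As a generic illustration of the kind of object model such a conversion produces (a minimal sketch, not the paper's actual EPMM mapping), places and transitions can be wrapped as objects whose firing rule is a method:

        # Minimal sketch (not the paper's EPMM mapping): one generic way to
        # wrap a Petri net in object-oriented form, with places and
        # transitions as objects and the firing rule as a method.

        class Place:
            def __init__(self, name, tokens=0):
                self.name = name
                self.tokens = tokens

        class Transition:
            def __init__(self, name, inputs, outputs):
                self.name = name
                self.inputs = inputs      # Place objects consumed from
                self.outputs = outputs    # Place objects produced to

            def enabled(self):
                # Enabled when every input place holds a token.
                return all(p.tokens > 0 for p in self.inputs)

            def fire(self):
                # Firing consumes one token per input, adds one per output.
                if not self.enabled():
                    raise RuntimeError(f"{self.name} is not enabled")
                for p in self.inputs:
                    p.tokens -= 1
                for p in self.outputs:
                    p.tokens += 1

        # Example: a two-step software process "design -> code".
        designed, coded = Place("designed", tokens=1), Place("coded")
        write_code = Transition("write_code", [designed], [coded])
        write_code.fire()
        print(coded.tokens)  # 1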

  7. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  8. Formal Safety Certification of Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively tied to our code generator and could, in principle, also be integrated with other code generators such as Real-Time Workshop, or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the (generated) code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligations, but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself. Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
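
    As an illustration of the approach (schematic only; the tool's actual policy syntax is not shown in this abstract), a memory-safety policy can be phrased in Hoare logic so that the verification condition generator turns an array access into a proof obligation of the form

        \{\,P\,\}\; x := a[i] \;\{\,Q\,\} \quad\leadsto\quad P \;\Rightarrow\; 0 \le i < \mathrm{length}(a),

    which an automated theorem prover can then discharge without user interaction.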

  9. Deriving Safety Cases from Automatically Constructed Proofs

    NASA Technical Reports Server (NTRS)

    Basir, Nurlida; Denney, Ewen; Fischer, Bernd

    2009-01-01

    Formal proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because the formalism in which they are constructed and encoded is usually machine-oriented, and they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction style proofs, which are closer to human reasoning than resolution proofs, and show how to construct the safety cases by covering the natural deduction proof tree with corresponding safety case fragments. We also abstract away logical book-keeping steps, which reduces the size of the constructed safety cases. We show how the approach can be applied to the proofs found by the Muscadet prover.

  10. Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric; Duraisamy, Karthik

    2017-11-01

    The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project "LES Modeling of Non-local effects using Statistical Coarse-graining" with Dr. Jean-Luc Cambier as the technical monitor.
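
    Schematically (in the generic notation of the optimal prediction literature, not this abstract), applying a projection P to a dynamical system \dot{\phi} = R(\phi) yields for the resolved variables \tilde{\phi} an equation of the generalized Langevin form

        \frac{d\tilde{\phi}}{dt} = P R(\tilde{\phi}) + \int_0^t K\big(\tilde{\phi}(t-s), s\big)\, ds + F(t),

    where the convolution integral is the non-local memory term that the sub-grid closures approximate and F(t) is the orthogonal (unresolved) dynamics.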

  11. Expert2OWL: A Methodology for Pattern-Based Ontology Development.

    PubMed

    Tahar, Kais; Xu, Jie; Herre, Heinrich

    2017-01-01

    The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as underlying technology. These include eLearning, Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese Herbology ontology (CHO). The expressivity of CHO was tested and evaluated using ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which ones are the most frequently used in CHM.
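
    To make the spreadsheet-to-ontology idea concrete, here is an illustrative sketch only (not Expert2OWL's implementation); it assumes a hypothetical sheet exported to CSV with columns "term,parent,label" and emits OWL class axioms in Turtle syntax:

        # Illustrative sketch, not Expert2OWL's code. Assumes a hypothetical
        # CSV layout "term,parent,label"; each row becomes an OWL class axiom.
        import csv

        def rows_to_turtle(path):
            lines = ["@prefix cho: <http://example.org/cho#> .",
                     "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> ."]
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    term = row["term"].strip().replace(" ", "_")
                    parent = row["parent"].strip().replace(" ", "_")
                    lines.append(f"cho:{term} a <http://www.w3.org/2002/07/owl#Class> ;")
                    lines.append(f"    rdfs:subClassOf cho:{parent} ;")
                    lines.append(f'    rdfs:label "{row["label"].strip()}" .')
            return "\n".join(lines)

        # print(rows_to_turtle("herbs.csv"))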

  12. A review of the matrix-exponential formalism in radiative transfer

    NASA Astrophysics Data System (ADS)

    Efremenko, Dmitry S.; Molina García, Víctor; Gimeno García, Sebastián; Doicu, Adrian

    2017-07-01

    This paper outlines the matrix exponential description of radiative transfer. The eigendecomposition method which serves as a basis for computing the matrix exponential and for representing the solution in a discrete ordinate setting is considered. The mathematical equivalence of the discrete ordinate method, the matrix operator method, and the matrix Riccati equations method is proved rigorously by means of the matrix exponential formalism. For optically thin layers, approximate solution methods relying on the Padé and Taylor series approximations to the matrix exponential, as well as on the matrix Riccati equations, are presented. For optically thick layers, the asymptotic theory with higher-order corrections is derived, and parameterizations of the asymptotic functions and constants for a water-cloud model with a Gamma size distribution are obtained.
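
    The formal core of the review can be stated compactly (schematic notation, up to sign and normalization conventions): within a homogeneous layer the discretized radiative transfer equation is a linear system

        \frac{d\mathbf{i}}{d\tau} = \mathbf{A}\,\mathbf{i}, \qquad \mathbf{i}(\tau) = e^{\mathbf{A}\tau}\,\mathbf{i}(0),

    so the discrete ordinate, matrix operator, and matrix Riccati methods differ mainly in how they evaluate or factor the matrix exponential; for optically thin layers e^{\mathbf{A}\tau} is approximated by truncated Taylor series or Padé approximants.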

  13. Education in the responsible conduct of research in psychology: methods and scope.

    PubMed

    DiLorenzo, Terry A; Becker-Fiegeles, Jill; Gibelman, Margaret

    2014-01-01

    In this mixed-method study of education in the responsible conduct of research (RCR) in psychology, phase one survey respondents (n = 141) reported that faculty and students were familiar with RCR standards and that the procedures used to educate them were believed to be adequate. However, educational methods varied widely. In phase two, seven survey respondents completed in-depth interviews assessing RCR training and education and research review procedures. The educational methods through which RCR content was presented included traditional (lectures), technical (web-based), and experiential (internships) approaches, but RCR was often minimally considered in the formal curriculum. Our results suggest that psychology training programs might benefit from more formal consideration of RCR education and training in the curriculum.

  14. Model Checking JAVA Programs Using Java Pathfinder

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Pressburger, Thomas

    2000-01-01

    This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze it. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
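
    To make the model-checking idea concrete, the following is a minimal sketch (in Python, not JPF's Java/PROMELA machinery) of what an explicit-state model checker does: exhaustive breadth-first search of the state graph, here detecting the classic two-lock deadlock:

        # Schematic of explicit-state model checking, not Java PathFinder
        # code: breadth-first exploration of a state graph, reporting any
        # reachable deadlock (a state with no successors).
        from collections import deque

        def find_deadlock(initial, successors):
            """initial: hashable state; successors: state -> list of states."""
            seen, frontier = {initial}, deque([(initial, [initial])])
            while frontier:
                state, path = frontier.popleft()
                succs = successors(state)
                if not succs:                      # no enabled move: deadlock
                    return path
                for nxt in succs:
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append((nxt, path + [nxt]))
            return None

        # Two processes acquiring locks a and b in opposite order.
        def successors(s):
            # State: (pc1, pc2, a_holder, b_holder).
            (p1, p2, a, b), out = s, []
            if p1 == 0 and a is None: out.append((1, p2, 1, b))  # P1 takes a
            if p1 == 1 and b is None: out.append((2, p2, a, 1))  # P1 takes b
            if p2 == 0 and b is None: out.append((p1, 1, a, 2))  # P2 takes b
            if p2 == 1 and a is None: out.append((p1, 2, 2, b))  # P2 takes a
            if p1 == 2 or p2 == 2: out.append(s)  # done: self-loop, no deadlock
            return out

        print(find_deadlock((0, 0, None, None), successors))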

  15. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phases of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  16. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  17. The Contribution of Non-Formal Learning in Higher Education to Student Teachers' Professional Competence

    ERIC Educational Resources Information Center

    Tang, Sylvia Y. F.; Wong, Angel K. Y.; Li, Dora D. Y.; Cheng, May M. H.

    2017-01-01

    This article reports a mixed methods study on the contribution of various aspects of pre-service student teachers' learning in initial teacher education (ITE) to their professional competence in a Five-year Bachelor of Education Programme in Hong Kong. Special attention is given to how student teachers' non-formal learning in higher education…

  18. Early Legal Education in the United States: Natural Law Theory and Law as a Moral Science.

    ERIC Educational Resources Information Center

    Bailey, Mark Warren

    1998-01-01

    An examination of the history of legal education covers the long period of law-office apprenticeship as the principal method of legal education in the United States and reviews trends in the period of formal education, the relationship between formal education and professional practice, the philosophical context for legal education, instruction in…

  19. The Effects of Formalism on Teacher Trainees' Algebraic and Geometric Interpretation of the Notions of Linear Dependency/Independency

    ERIC Educational Resources Information Center

    Ertekin, E.; Solak, S.; Yazici, E.

    2010-01-01

    The aim of this study is to identify the effects of formalism in teaching on primary and secondary school mathematics teacher trainees' algebraic and geometric interpretations of the notions of linear dependency/independency. Quantitative research methods are drawn in order to determine differences in success levels between algebraic and geometric…

  20. Construction and Evaluation of an Integrated Formal/Informal Learning Environment for Foreign Language Learning across Real and Virtual Spaces

    ERIC Educational Resources Information Center

    Waragai, Ikumi; Ohta, Tatsuya; Kurabayashi, Shuichi; Kiyoki, Yasushi; Sato, Yukiko; Brückner, Stefan

    2017-01-01

    This paper presents the prototype of a foreign language learning space, based on the construction of an integrated formal/informal learning environment. Before the background of the continued innovation of information technology that places conventional learning styles and educational methods into new contexts based on new value-standards,…

  1. Lexical and Grammatical Abilities in Deaf Italian Preschoolers: The Role of Duration of Formal Language Experience

    ERIC Educational Resources Information Center

    Rinaldi, Pasquale; Caselli, Cristina

    2009-01-01

    We evaluated language development in deaf Italian preschoolers with hearing parents, taking into account the duration of formal language experience (i.e., the time elapsed since wearing a hearing aid and beginning language education) and different methods of language education. Twenty deaf children were matched with 20 hearing children for age and…

  2. Developing Metrics for Effective Teaching in Extension Education: A Multi-State Factor-Analytic and Psychometric Analysis of Effective Teaching

    ERIC Educational Resources Information Center

    McKim, Billy R.; Lawver, Rebecca G.; Enns, Kellie; Smith, Amy R.; Aschenbrener, Mollie S.

    2013-01-01

    To successfully educate the public about agriculture, food, and natural resources, we must have effective educators in both formal and nonformal settings. Specifically, this study, which is a valuable part of a larger sequential mixed-method study addressing effective teaching in formal and nonformal agricultural education, provides direction for…

  3. Hard Times for HRD, Lean Times for Learning?: Workplace Participatory Practices as Enablers of Learning

    ERIC Educational Resources Information Center

    Warhurst, Russell

    2013-01-01

    Purpose: This article aims to show how in times of austerity when formal HRD activity is curtailed and yet the need for learning is greatest, non-formal learning methods such as workplace involvement and participation initiated by line managers can compensate by enabling the required learning and change. Design/methodology/approach: A qualitative…

  4. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1992-01-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  5. A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Grando, M Adela; Glasspool, David; Fox, John

    2012-01-01

    To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and on the notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous, similar but informal, comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results than a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
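
    The process-calculus notion invoked here is the standard one (stated schematically, not in the paper's notation): a relation R over states is a bisimulation if, whenever s R t, every step of one side can be matched by the other,

        s \xrightarrow{a} s' \implies \exists t'.\; t \xrightarrow{a} t' \wedge s' \,R\, t', \quad \text{and symmetrically},

    and two processes are bisimilar when some bisimulation relates them.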

  6. Don't abandon hope all ye who enter here: The protective role of formal mentoring and learning processes on burnout in correctional officers.

    PubMed

    Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T

    2017-01-01

    Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring on burnout, through the effect of increasing learning personal resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and also enhancing the sense of personal accomplishment. Confirming previous literature's suggestions, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.

  7. The Creative Power of Formal Analogies in Physics: The Case of Albert Einstein

    NASA Astrophysics Data System (ADS)

    Gingras, Yves

    2015-07-01

    In order to show how formal analogies between different physical systems play an important conceptual work in physics, this paper analyzes the evolution of Einstein's thoughts on the structure of radiation from the point of view of the formal analogies he used as "lenses" to "see" through the "black box" of Planck's blackbody radiation law. A comparison is also made with his 1925 paper on the quantum gas where he used the same formal methods. Changes of formal points of view are most of the time taken for granted or passed over in silence in studies on the mathematization of physics as if they had no special significance. Revisiting Einstein's classic papers on the nature of light and matter from the angle of the various theoretical tools he used, namely entropy and energy fluctuation calculations, helps explain why he was in a unique position to make visible the particle structure of radiation and the dual (particle and wave) nature of light and matter. Finally, this case study calls attention to the more general question of the surprising creative power of formal analogies and their frequent use in theoretical physics. This aspect of intellectual creation can be useful in the teaching of physics.
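
    The fluctuation calculation alluded to is Einstein's 1909 result (quoted here from the standard literature, not from this paper): the mean-square energy fluctuation in a sub-volume V of blackbody radiation,

        \langle \epsilon^2 \rangle = \left( h\nu\,\rho + \frac{c^3}{8\pi\nu^2}\,\rho^2 \right) V\, d\nu,

    contains a particle-like term linear in the spectral density \rho and a wave-like term quadratic in it, which is what made the dual structure of radiation visible.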

  8. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The goals are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
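
    For scale, the standard black-box testing result (a textbook bound, not this project's framework) is that demonstrating a per-demand failure probability p at confidence C requires roughly

        n \;\ge\; \frac{\ln(1-C)}{\ln(1-p)} \;\approx\; \frac{-\ln(1-C)}{p}

    failure-free tests, i.e., on the order of 3\times10^{4} runs for p = 10^{-4} at 95% confidence, which is why taking credit for other V&V evidence is so valuable.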

  9. Radiative transfer calculated from a Markov chain formalism

    NASA Technical Reports Server (NTRS)

    Esposito, L. W.; House, L. L.

    1978-01-01

    The theory of Markov chains is used to formulate the radiative transport problem in a general way by modeling the successive interactions of a photon as a stochastic process. Under the minimal requirement that the stochastic process is a Markov chain, the determination of the diffuse reflection or transmission from a scattering atmosphere is equivalent to the solution of a system of linear equations. This treatment is mathematically equivalent to, and thus has many of the advantages of, Monte Carlo methods, but can be considerably more rapid than Monte Carlo algorithms for numerical calculations in particular applications. We have verified the speed and accuracy of this formalism for the standard problem of finding the intensity of scattered light from a homogeneous plane-parallel atmosphere with an arbitrary phase function for scattering. Accurate results over a wide range of parameters were obtained with computation times comparable to those of a standard 'doubling' routine. The generality of this formalism thus allows fast, direct solutions to problems that were previously soluble only by Monte Carlo methods. Some comparisons are made with respect to integral equation methods.
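
    The reduction to a linear system can be made concrete with a toy absorbing-chain computation (a schematic sketch with invented numbers, not the authors' formulation, which works with full scattering phase functions):

        # Schematic of reducing transport to linear equations: photon
        # interactions form an absorbing Markov chain with transient
        # "scattering" states Q and absorbing exits R (reflected, absorbed,
        # transmitted). Expected visits N = (I - Q)^-1 give exit
        # probabilities B = N R.
        import numpy as np

        # Toy 2-layer atmosphere: transient states = photon in layer 1 or 2.
        Q = np.array([[0.3, 0.3],    # scatter within / between layers
                      [0.3, 0.3]])
        R = np.array([[0.3, 0.1, 0.0],   # from layer 1: reflect, absorb, transmit
                      [0.0, 0.1, 0.3]])  # from layer 2

        N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits
        B = N @ R                          # exit (absorption) probabilities

        print(B[0])          # fate of a photon entering at the top (layer 1)
        print(B[0].sum())    # rows sum to 1: every photon eventually exits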

  10. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  11. Perturbative universal state-selective correction for state-specific multi-reference coupled cluster methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brabec, Jiri; Banik, Subrata; Kowalski, Karol

    2016-10-28

    The implementation details of the universal state-selective (USS) multi-reference coupled cluster (MRCC) formalism with singles and doubles (USS(2)) are discussed on the example of several benchmark systems. We demonstrate that the USS(2) formalism is capable of improving accuracies of state specific multi-reference coupled-cluster (MRCC) methods based on the Brillouin-Wigner and Mukherjee’s sufficiency conditions. Additionally, it is shown that the USS(2) approach significantly alleviates problems associated with the lack of invariance of MRCC theories upon the rotation of active orbitals. We also discuss the perturbative USS(2) formulations that significantly reduce numerical overhead of the full USS(2) method.

  12. The substorm cycle as reproduced by global MHD models

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastaetter, L.; Raeder, J.; Tóth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.

    2017-01-01

    Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows a depressed return convection and high loading rate during the growth phase as well as enhanced return convection and high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. Two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations, which is related to postprocessing interpolation and could affect the accuracy of magnetic field tracing and of other related procedures.

  14. A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.

    PubMed

    Zhou, Weiqiang; Yan, Hong

    2010-10-15

    Protein-DNA interaction has significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complexes become available, the surface characteristics of the complex have become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and develop an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method provides good performance in discriminating the native structures from the docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with the optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict the apo version of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has a good prediction capability for protein-DNA interactions. The code and data sets are available for download at http://www.hy8.com/bioinformatics.htm. Contact: kenandzhou@hotmail.com.
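
    The z-score used for discrimination is the standard one (restated here for clarity): if E_{\mathrm{nat}} is the score of the native structure and \mu, \sigma are the mean and standard deviation of the decoy scores, then

        z = \frac{E_{\mathrm{nat}} - \mu}{\sigma},

    so a large negative z (e.g., the reported -8.17) means the native structure scores many standard deviations better than the decoys.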

  15. Gender and ergonomics: a case study on the 'non-formal' work of women nurses.

    PubMed

    Salerno, Silvana; Livigni, Lucilla; Magrini, Andrea; Talamanca, Irene Figà

    2012-01-01

    Women's work activities are often characterised by 'non-formal actions' (such as giving support). Gender differences in ergonomics may be due to this peculiarity. We applied the method of organisational congruencies (MOC) to ascertain the 'non-formal' work portion of nurses employed in three hospital units (haematology, emergency room and general medicine) during the three work shifts in a major University Hospital in Rome, Italy. We recorded a total of 802 technical actions performed by nine nurses in 72 h of work. Twenty-six percent of the actions in direct patient care were communicative actions (mainly giving psychological support) while providing physical care. These 'double actions' are often not considered to be a formal part of the job by hospital management. In our case study, the 'non-formal' work of nurses (psychological support) is mainly represented by double actions while taking physical care of the patients. The dual task paradigm in gender oriented research is discussed in terms of its implications for prevention in occupational health. The main purpose of the study was to assess all the formal and non-formal activities of women in the nursing work setting. Offering psychological support to patients is often not considered to be a formal part of the job. Our case study found that nurses receive no explicit guidelines on this activity and no time is assigned to perform it. In measuring the burden of providing psychological support to patients, we found that this is often done while nurses are performing tasks of physical care for the patients (double actions). The article discusses the significance of the non-formal psychological work load of women nurses through double actions from the ergonomic point of view.

  16. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Conti, C.; Barbero, C.; Galeão, A. P.

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of {sub Λ}{sup 5}He, {sub Λ}{sup 12}C and {sub Λ}{sup 13}C using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  17. Indicators and protocols for monitoring impacts of formal and informal trails in protected areas

    USGS Publications Warehouse

    Marion, Jeffrey L.; Leung, Yu-Fai

    2011-01-01

    Trails are a common recreation infrastructure in protected areas and their conditions affect the quality of natural resources and visitor experiences. Various trail impact indicators and assessment protocols have been developed in support of monitoring programs, which are often used for management decision-making or as part of visitor capacity management frameworks. This paper reviews common indicators and assessment protocols for three types of trails, surfaced formal trails, unsurfaced formal trails, and informal (visitor-created) trails. Monitoring methods and selected data from three U.S. National Park Service units are presented to illustrate some common trail impact indicators and assessment options.

  18. Continuum Level Density of a Coupled-Channel System in the Complex Scaling Method

    NASA Astrophysics Data System (ADS)

    Suzuki, R.; Kruppa, A. T.; Giraud, B. G.; Katō, K.

    2008-06-01

    We study the continuum level density (CLD) in the formalism of the complex scaling method (CSM) for coupled-channel systems. We apply the formalism to the ^{4}He = [^{3}H + p] + [^{3}He + n] coupled-channel cluster model where there are resonances at low energy. Numerical calculations of the CLD in the CSM with a finite number of L^{2} basis functions are consistent with the exact result calculated from the S-matrix by solving the coupled-channel equations. We also study channel densities. In this framework, the extended completeness relation (ECR) plays an important role.
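
    The quantity being computed can be written (schematically; conventions vary across the literature) as the trace of the difference between the full and asymptotic resolvents,

        \Delta(E) = -\frac{1}{\pi}\,\mathrm{Im}\,\mathrm{Tr}\!\left[\frac{1}{E - H_\theta} - \frac{1}{E - H_{0,\theta}}\right],

    where H_\theta and H_{0,\theta} are the complex-scaled full and asymptotic Hamiltonians; with a finite L^{2} basis the trace becomes a sum over the discretized resonance and rotated-continuum eigenvalues.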

  19. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time systems specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures.

  20. Professional Leadership Experiences with Formal and Informal Mentoring of College Deans of Education at 4-Year Nonprofit Private Colleges and Universities in California

    ERIC Educational Resources Information Center

    Baartman, Ingrid

    2011-01-01

    Purpose: The purpose of this study was to examine professional leadership experiences of college deans of education with formal and informal mentoring in the course of their career progression at 4-year nonprofit private colleges and universities in California. Methodology: Using a mixed-methods approach that emphasized the qualitative data…

  1. Security Modeling and Correctness Proof Using Specware and Isabelle

    DTIC Science & Technology

    2008-12-01

    Isabelle "allows mathematical formulas to be expressed in a formal language and provides tools for proving those formulas in a logical calculus" [5]. We are demonstrating in this thesis that a specification in Specware can be proved correct using such a theorem prover, although the actual proving requires substantial knowledge and experience in logical calculus. (Subject terms: formal method, theorem proving; 146 pages.)

  2. Enhancing 21st Century Skills with AR: Using the Gradual Immersion Method to Develop Collaborative Creativity

    ERIC Educational Resources Information Center

    Sanabria, Jorge C.; Arámburo-Lizárraga, Jesús

    2017-01-01

    As 21st century skills (e.g., creativity and collaboration) are informally developed by tech-savvy learners in the Digital Age, technology-based strategies to develop such skills in non-formal and formal contexts are necessary to reduce the gap between academic and business organizations on the one hand, and the revolutionary wave of self-taught…

  3. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility.

    PubMed

    Jung, Kiwook; Morris, K C; Lyons, Kevin W; Leong, Swee; Cho, Hyunbo

    2015-12-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified.
  5. Multifrequency OFDM SAR in Presence of Deception Jamming

    NASA Astrophysics Data System (ADS)

    Schuerger, Jonathan; Garmatyuk, Dmitriy

    2010-12-01

    Orthogonal frequency division multiplexing (OFDM) is considered in this paper from the perspective of usage in imaging radar scenarios with deception jamming. OFDM radar signals are inherently multifrequency waveforms, composed of a number of subbands which are orthogonal to each other. While being employed extensively in communications, OFDM has not found comparatively wide use in radar, and, particularly, in synthetic aperture radar (SAR) applications. In this paper, we aim to show the advantages of OFDM-coded radar signals with random subband composition when used in deception jamming scenarios. Two approaches to create a radar signal by the jammer are considered: an instantaneous frequency (IF) estimator and a digital-RF-memory (DRFM) based reproducer. In both cases, the jammer aims to create a copy of a valid target image via resending the radar signal at prescribed time intervals. Jammer signals are derived and used in SAR simulations with three types of signal models: OFDM, linear frequency modulated (LFM), and frequency-hopped (FH). Presented results include simulated peak side lobe (PSL) and peak cross-correlation values for random OFDM signals, as well as simulated SAR imagery with jammer-induced false targets from the IF and DRFM approaches.
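
    The two waveform families being compared can be sketched as follows (illustrative parameters, not those of the paper): an LFM chirp, and an OFDM pulse whose active subbands are chosen at random, which is what frustrates a jammer trying to predict the next pulse.

        # Illustrative waveform sketch (invented parameters).
        import numpy as np

        fs, T = 100e6, 10e-6                 # sample rate, pulse length
        t = np.arange(int(fs * T)) / fs
        B = 20e6                             # LFM swept bandwidth

        # Linear frequency modulated (LFM) pulse: phase quadratic in time.
        lfm = np.exp(1j * np.pi * (B / T) * t**2)

        # OFDM pulse: N orthogonal subcarriers (integer cycles per pulse),
        # a random subset active, each with a random phase.
        N = 64
        active = np.random.rand(N) < 0.5
        phases = np.exp(2j * np.pi * np.random.rand(N))
        ofdm = sum(phases[k] * np.exp(2j * np.pi * k * t / T)
                   for k in range(N) if active[k])

        print(abs(np.vdot(lfm, lfm)), abs(np.vdot(ofdm, ofdm)))  # energies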

  6. Plasma Sheet Circulation Pathways

    NASA Technical Reports Server (NTRS)

    Moore, Thomas E.; Delcourt, D. C.; Slinker, S. P.; Fedder, J. A.; Damiano, P.; Lotko, W.

    2008-01-01

    Global simulations of Earth's magnetosphere in the solar wind compute the pathways of plasma circulation through the plasma sheet. We address the pathways that supply and drain the plasma sheet, by coupling single fluid simulations with Global Ion Kinetic simulations of the outer magnetosphere and the Comprehensive Ring Current Model of the inner magnetosphere, including plasmaspheric plasmas. We find that the plasma sheet is supplied with solar wind plasmas via the magnetospheric flanks, and that this supply is most effective for northward IMF. For southward IMF, the innermost plasma sheet and ring current region are directly supplied from the flanks, with an asymmetry of single particle entry favoring the dawn flank. The central plasma sheet (near midnight) is supplied, as expected, from the lobes and polar cusps, but the near-Earth supply consists mainly of slowly moving ionospheric outflows for typical conditions. Work with the recently developed multi-fluid LFM simulation shows transport via plasma "fingers" extending Earthward from the flanks, suggestive of an interchange instability. We investigate this with solar wind ion trajectories, seeking to understand the fingering mechanisms and effects on transport rates.

  7. Interfacial Properties of EXXPRO(TM) and General Purpose Elastomers

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Rafailovich, M.; Sokolov, Jon; Qu, S.; Ge, S.; Ngyuen, D.; Li, Z.; Peiffer, D.; Song, L.; Dias, J. A.; McElrath, K. O.

    1998-03-01

    EXXPRO(Trademark) elastomers are used for tires and many other applications. This elastomer (denoted as BIMS) is a random copolymer of p-methylstyrene (MS) and polyisobutylene (I) with varying degrees of PMS content and bromination (B) on the p-methyl group. BIMS is impermeable to gases, and has good heat, ozone and flex resistance. Very often general purpose elastomers are blended with BIMS. The interfacial width between polybutadiene and BIMS is a sensitive function of the Br level and PMS content. By neutron reflectivity (NR), we studied the dynamics of interface formation as a function of time and temperature for BIMS with varying degrees of PMS and Br. We found that in addition to the bulk parameters, the total film thickness and the proximity of an interactive surface can affect the interfacial interaction rates. The interfacial properties can also be modified by inclusion of particles, such as carbon black (a filler component in tire rubbers). Results will be presented on the relation between the interfacial width as measured by NR and compatibilization studies via AFM and LFM.

  8. Effect of impurity molecules on the low-temperature vibrational dynamics of polyisobutylene: Investigation by single-molecule spectroscopy

    NASA Astrophysics Data System (ADS)

    Eremchev, I. Yu.; Naumov, A. V.; Vainer, Yu. G.; Kador, L.

    2009-05-01

    The influence of impurity chromophore molecules—tetra-tert-butylterrylene (TBT) and dibenzo-anthanthrene (DBATT)—on the vibrational dynamics of the amorphous polymer polyisobutylene (PIB) has been studied via single-molecule spectroscopy. The measurements were performed in the temperature region of 7-30 K, where the interaction of the chromophores with quasilocalized low-frequency vibrational modes (LFMs) determines the observed spectral line broadening. The analysis of the individual temperature dependences of the linewidths for a large number of single probe molecules yielded effective frequency values of those LFMs which are located near the respective chromophores. In this way the distributions of the LFM frequencies were measured for the two systems, and they were found to be similar. Moreover, they are in good agreement with the vibrational density of states as measured in pure PIB by inelastic neutron scattering. This allows us to conclude that, at least in the case of PIB, doping with low concentrations of the nonpolar and neutral molecules TBT and DBATT does not affect the vibrational dynamics of the matrix markedly.

  9. Multi-static MIMO along track interferometry (ATI)

    NASA Astrophysics Data System (ADS)

    Knight, Chad; Deming, Ross; Gunther, Jake

    2016-05-01

    Along-track interferometry (ATI) has the ability to generate high-quality synthetic aperture radar (SAR) images and concurrently detect and estimate the positions of ground moving target indicators (GMTI) with moderate processing requirements. This paper focuses on several different ATI system configurations, with an emphasis on low-cost configurations employing no active electronically scanned array (AESA). The objective system has two transmit phase centers and four receive phase centers and supports agile adaptive radar behavior. The advantages of multistatic, multiple input multiple output (MIMO) ATI system configurations are explored. The two transmit phase centers can employ a ping-pong configuration to provide the multistatic behavior. For example, they can toggle between an up and down linear frequency modulated (LFM) waveform every other pulse. The four receive apertures are considered in simple linear spatial configurations. Simulated examples are examined to understand the trade space and verify the expected results. Finally, actual results are collected with the Space Dynamics Laboratory's (SDL) FlexSAR system in diverse configurations. The theory, as well as the simulated and actual SAR results, are presented and discussed.
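
    The quantity ATI measures can be stated compactly (the standard relation, with factor conventions varying by system; the paper's parameters are not repeated here): two phase centers separated along-track by a baseline B observe the same ground patch a time \tau = B/v apart at platform speed v, and a scatterer with radial velocity v_r produces an interferometric phase

        \Delta\phi = \frac{4\pi}{\lambda}\, v_r\, \tau,

    so stationary clutter cancels in the interferogram while movers appear with a phase proportional to their velocity.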

  10. A flight management algorithm and guidance for fuel-conservative descents in a time-based metered air traffic environment: Development and flight test results

    NASA Technical Reports Server (NTRS)

    Knox, C. E.

    1984-01-01

    A simple airborne flight management descent algorithm designed to define a flight profile subject to the constraints of using idle thrust, a clean airplane configuration (landing gear up, flaps zero, and speed brakes retracted), and fixed-time end conditions was developed and flight tested in the NASA TSRV B-737 research airplane. The research test flights, conducted in the Denver ARTCC automated time-based metering LFM/PD ATC environment, demonstrated that time guidance and control in the cockpit was acceptable to the pilots and ATC controllers and resulted in arrival of the airplane over the metering fix with standard deviations in airspeed error of 6.5 knots, in altitude error of 23.7 m (77.8 ft), and in arrival time accuracy of 12 sec. These accuracies indicated a good representation of airplane performance and wind modeling. Fuel savings will be obtained on a fleet-wide basis through a reduction of the time error dispersions at the metering fix and on a single-airplane basis by presenting the pilot with guidance for a fuel-efficient descent.

  11. 7 CFR 1780.72 - Procurement methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Procurement methods. 1780.72 Section 1780.72..., Constructing and Inspections § 1780.72 Procurement methods. Procurement shall be made by one of the following methods: Small purchase procedures; competitive sealed bids (formal advertising); competitive negotiation...

  12. 7 CFR 1780.72 - Procurement methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Procurement methods. 1780.72 Section 1780.72..., Constructing and Inspections § 1780.72 Procurement methods. Procurement shall be made by one of the following methods: Small purchase procedures; competitive sealed bids (formal advertising); competitive negotiation...

  13. 7 CFR 1780.72 - Procurement methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Procurement methods. 1780.72 Section 1780.72..., Constructing and Inspections § 1780.72 Procurement methods. Procurement shall be made by one of the following methods: Small purchase procedures; competitive sealed bids (formal advertising); competitive negotiation...

  14. 7 CFR 1780.72 - Procurement methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Procurement methods. 1780.72 Section 1780.72..., Constructing and Inspections § 1780.72 Procurement methods. Procurement shall be made by one of the following methods: Small purchase procedures; competitive sealed bids (formal advertising); competitive negotiation...

  15. 7 CFR 1780.72 - Procurement methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 12 2011-01-01 2011-01-01 false Procurement methods. 1780.72 Section 1780.72..., Constructing and Inspections § 1780.72 Procurement methods. Procurement shall be made by one of the following methods: Small purchase procedures; competitive sealed bids (formal advertising); competitive negotiation...

  16. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool, which directly translates from Java to PROMELA, was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one that was the focus of the first verification effort. A second, quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates, first, that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy; second, it describes progress in automatic translation and abstraction that will eventually enable formal methods tools to be inserted directly into the aerospace software development cycle.
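
    The kind of error involved can be illustrated with a toy explicit-state reachability search in the spirit of SPIN (plain Python rather than PROMELA; the two-lock model and state encoding are hypothetical):

        from collections import deque

        # Two processes acquire locks A and B in opposite orders, then release.
        # A state is (pc1, pc2, holderA, holderB); None means the lock is free.
        def moves(state):
            pc1, pc2, a, b = state
            nxt = []
            # process 1: take A, then take B, then release both
            if pc1 == 0 and a is None: nxt.append((1, pc2, 1, b))
            elif pc1 == 1 and b is None: nxt.append((2, pc2, a, 1))
            elif pc1 == 2: nxt.append((3, pc2, None, None))
            # process 2: take B, then take A, then release both
            if pc2 == 0 and b is None: nxt.append((pc1, 1, a, 2))
            elif pc2 == 1 and a is None: nxt.append((pc1, 2, 2, b))
            elif pc2 == 2: nxt.append((pc1, 3, None, None))
            return nxt

        seen, queue = set(), deque([(0, 0, None, None)])
        while queue:                       # breadth-first state-space search
            s = queue.popleft()
            if s in seen:
                continue
            seen.add(s)
            succ = moves(s)
            if not succ and (s[0], s[1]) != (3, 3):   # stuck but not terminal
                print("deadlock:", s)
            queue.extend(succ)

    The search reports the classic hold-and-wait deadlock state in which each process holds one lock and waits for the other; SPIN performs the same kind of exhaustive exploration, with far more sophisticated reduction techniques.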

  17. The kinematic dynamo problem, part I: analytical treatment with the Bullard-Gellman formalism

    NASA Astrophysics Data System (ADS)

    Glane, Sebastian; Reich, Felix A.; Müller, Wolfgang H.

    2018-03-01

    This paper is dedicated to the description of kinematic dynamo action in a sphere and its analytical treatment with the Bullard-Gellman formalism. One goal of dynamo theory is to answer the question: Can magnetic fields of stellar objects be generated or sustained due to (fluid) motion in the interior? Bullard and Gellman were among the first to study this question, leading the way for many subsequent studies, cf. Bullard (Philos Trans R Soc A 247(928):213-278, 1954). In their publication the differential equations resulting from a toroidal-poloidal decomposition of the velocity and magnetic field are stated without an in-depth discussion of the employed methods and computation steps. This study derives the necessary formalism in a compact and concise manner by using an operator-based approach. The focus lies on the mathematical steps and necessary properties of the considered formalism. Prior to that, a derivation of the induction equation is presented based on rational continuum electrodynamics. As an example of the formalism the decay of two magnetic fields is analyzed.
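
    In common notation, the induction equation and the toroidal-poloidal decomposition at the heart of the formalism read

        \partial_{t}\mathbf{B} = \nabla \times (\mathbf{u} \times \mathbf{B}) + \eta\,\nabla^{2}\mathbf{B}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \mathbf{B} = \nabla \times \nabla \times (P\,\mathbf{r}) + \nabla \times (T\,\mathbf{r}),

    where eta is the magnetic diffusivity and P, T are the poloidal and toroidal scalars; projecting the induction equation onto spherical harmonics yields the coupled equations for the harmonic coefficients of P and T that Bullard and Gellman studied.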

  18. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    The Attitude & Orbit Control System (AOCS) belongs to a wider class of control systems that are used to determine and control the attitude of a spacecraft while in orbit, based on information obtained from various sensors. In this paper, we propose an approach to evaluating a typical (yet somewhat simplified) AOCS architecture using formal development based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of the original source code specification. In this way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  19. Adaptation of the projector-augmented-wave formalism to the treatment of orbital-dependent exchange-correlation functionals

    NASA Astrophysics Data System (ADS)

    Xu, Xiao; Holzwarth, N. A. W.

    2011-10-01

    This paper presents the formulation and numerical implementation of a self-consistent treatment of orbital-dependent exchange-correlation functionals within the projector-augmented-wave method of Blöchl [Phys. Rev. B 50, 17953 (1994)] for electronic structure calculations. The methodology is illustrated with binding energy curves for C in the diamond structure and LiF in the rock salt structure, by comparing results from the Hartree-Fock (HF) formalism and the optimized effective potential formalism in the so-called KLI approximation [Krieger, Li, and Iafrate, Phys. Rev. A 45, 101 (1992)] with those of the local density approximation. While the work here uses pure Fock exchange only, the formalism can be extended to treat orbital-dependent functionals more generally.

  20. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  1. Human Fecal Source Identification: Real-Time Quantitative PCR Method Standardization

    EPA Science Inventory

    Method standardization or the formal development of a protocol that establishes uniform performance benchmarks and practices is necessary for widespread adoption of a fecal source identification approach. Standardization of a human-associated fecal identification method has been...

  2. On Dramatic Instruction: Towards a Taxonomy of Methods.

    ERIC Educational Resources Information Center

    Courtney, Richard

    1987-01-01

    Examines the many possible methods used by instructors who work with dramatic action: in educational drama, drama therapy, social drama, and theater. Discusses an emergent taxonomy whereby instructors choose either spontaneous/formal, overt/covert, or intrinsic/extrinsic methods. (JC)

  3. De novo reconstruction of gene regulatory networks from time series data, an approach based on formal methods.

    PubMed

    Ceccarelli, Michele; Cerulo, Luigi; Santone, Antonella

    2014-10-01

    Reverse engineering of gene regulatory relationships from genomics data is a crucial task in dissecting the complex regulatory mechanisms operating in a cell. From a computational point of view, the reconstruction of gene regulatory networks is an underdetermined problem, as the number of possible solutions is typically large in comparison with the number of available independent data points. Many possible solutions can fit the available data, explaining it equally well, but only one of them can be the biologically true solution. Several strategies have been proposed in the literature to reduce the search space and/or extend the amount of independent information. In this paper we propose a novel algorithm based on formal methods, the mathematically rigorous techniques widely adopted in engineering to specify and verify complex software and hardware systems. Starting from a formal specification of a gene regulatory hypothesis, we are able to mathematically prove whether or not a time course experiment satisfies the formal specification, determining in fact whether a gene regulation exists or not. The method is able to detect both the direction and the sign (inhibition/activation) of regulations, whereas most methods in the literature are limited to undirected and/or unsigned relationships. We empirically evaluated the approach on experimental and synthetic datasets in terms of precision and recall. In most cases we observed high levels of accuracy, outperforming the current state of the art, although the computational cost increases exponentially with the size of the network. The tool implementing the algorithm is available at the following URL: http://www.bioinformatics.unisannio.it. Copyright © 2014 Elsevier Inc. All rights reserved.
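
    A toy illustration of the underlying idea, checking a discretized inhibition hypothesis against a time course (the encoding and the data below are made up; the paper itself checks formal process specifications with a model checker):

        # Hypothesis "A inhibits B", read here as: whenever A increases,
        # B decreases within k subsequent time steps. Illustrative only.
        def holds_inhibition(a, b, k=2):
            ups = [t for t in range(1, len(a)) if a[t] > a[t - 1]]
            return all(
                any(t + d < len(b) and b[t + d] < b[t + d - 1]
                    for d in range(1, k + 1))
                for t in ups
            )

        a = [1, 2, 3, 3, 4, 4]   # expression of a putative regulator A
        b = [5, 5, 4, 3, 3, 2]   # expression of a putative target B
        print(holds_inhibition(a, b))   # True: each rise in A is followed by a fall in B

    A negative answer refutes the hypothesized regulation on the given data; a positive answer supports both its direction and its sign, which is the extra information the abstract highlights.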

  4. 10 CFR 2.705 - Discovery-additional methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Discovery-additional methods. 2.705 Section 2.705 Energy... Rules for Formal Adjudications § 2.705 Discovery-additional methods. (a) Discovery methods. Parties may obtain discovery by one or more of the following methods: depositions upon oral examination or written...

  5. Implementation of the nudged elastic band method in a dislocation dynamics formalism: Application to dislocation nucleation

    NASA Astrophysics Data System (ADS)

    Geslin, Pierre-Antoine; Gatti, Riccardo; Devincre, Benoit; Rodney, David

    2017-11-01

    We propose a framework to study thermally activated processes in dislocation glide. This approach is based on an implementation of the nudged elastic band method in a nodal mesoscale dislocation dynamics formalism. Special care is taken to develop a variational formulation to ensure convergence to well-defined minimum energy paths. We also propose a methodology to rigorously parametrize the model on atomistic data, including elastic, core and stacking fault contributions. To assess the validity of the model, we investigate the homogeneous nucleation of partial dislocation loops in aluminum, recovering the activation energies and loop shapes obtained with atomistic calculations and extending these calculations to lower applied stresses. The present method is also applied to heterogeneous nucleation on spherical inclusions.
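
    A minimal nudged-elastic-band sketch on a two-dimensional model potential, showing the force projection the method relies on (the potential, spring constant and step size are hypothetical; the paper's implementation acts on nodal dislocation configurations with elastic, core and stacking-fault energies):

        import numpy as np

        def V(p):            # double well: minima near (-1, 0) and (1, 0)
            x, y = p
            return (x**2 - 1.0)**2 + 5.0 * y**2

        def gradV(p):
            x, y = p
            return np.array([4.0 * x * (x**2 - 1.0), 10.0 * y])

        n, kspr, step = 11, 1.0, 0.01
        path = np.linspace([-1.0, 0.0], [1.0, 0.0], n)   # images along the band
        path[1:-1, 1] += 0.1                             # perturb the initial guess

        for _ in range(2000):
            new = path.copy()
            for i in range(1, n - 1):
                tau = path[i + 1] - path[i - 1]          # local tangent estimate
                tau = tau / np.linalg.norm(tau)
                g = gradV(path[i])
                f_perp = -(g - np.dot(g, tau) * tau)     # true force, perpendicular part
                f_spring = kspr * np.dot(path[i + 1] + path[i - 1] - 2.0 * path[i], tau) * tau
                new[i] = path[i] + step * (f_perp + f_spring)
            path = new

        barrier = max(V(p) for p in path) - V(path[0])
        print(f"estimated activation energy: {barrier:.3f}")   # ~1.0 for this potential

    The projection is the essential trick: spring forces act only along the band and true forces only perpendicular to it, so the relaxed band traces a minimum energy path and its highest image approximates the saddle point.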

  6. Methods for solving reasoning problems in abstract argumentation – A survey

    PubMed Central

    Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan

    2015-01-01

    Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has been recently faced from different angles. In this survey, we give an overview on different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods to practice. PMID:25737590
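
    As a concrete instance of such a reasoning problem, the grounded extension of a Dung argumentation framework can be computed by iterating the characteristic function; the three-argument framework below is hypothetical:

        # Grounded extension: least fixed point of F(S) = {a : S defends a}.
        def grounded(args, attacks):
            def defends(S, a):
                # every attacker of a is itself attacked by some member of S
                return all(any((c, atk) in attacks for c in S)
                           for (atk, tgt) in attacks if tgt == a)
            S = set()
            while True:
                nxt = {a for a in args if defends(S, a)}
                if nxt == S:
                    return S
                S = nxt

        args = {"a", "b", "c"}
        attacks = {("a", "b"), ("b", "c")}   # a attacks b, b attacks c
        print(sorted(grounded(args, attacks)))   # ['a', 'c']: a is unattacked and defends c

    This particular problem is polynomial; the high-complexity reasoning tasks surveyed in the paper (e.g. skeptical acceptance under preferred semantics) are what motivate the advanced solving techniques it compares.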

  7. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    PubMed

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the birefringence increase caused by the mode displacement induced by the core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first-order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows the confinement loss of the guided modes to be predicted more accurately than with approximate methods based on equivalent in-plane bending models.

  8. A Comparison of Correctional Adult Educators and Formal Adult Educators in Terms of Their Expressed Beliefs in the Collaborative Teaching Mode. Theory and Methods of Adult Education.

    ERIC Educational Resources Information Center

    Sua, Dangbe Wuo

    A study compared correctional adult educators and formal adult educators in terms of their expressed beliefs in the collaborative teaching mode as measured by the Principles of Adult Learning Scale. The sample consisted of 8 correctional adult educators from the Lake Correctional Institution and 10 adult education teachers from the Manatee Area…

  9. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed fiber optic Bragg strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and related to the fundamental theoretical concepts.
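
    The vibrational idea can be sketched with the Euler-Bernoulli cantilever, whose natural frequencies scale with the square root of the bending stiffness, so a stiffness-reducing crack lowers them (Python rather than the report's Matlab scripts; the beam properties and the 5% stiffness loss are hypothetical):

        import numpy as np

        # Cantilever natural frequencies f_n = (beta_n L)^2 / (2 pi L^2) * sqrt(EI / rho A),
        # with beta_n L the roots of 1 + cos(bL) cosh(bL) = 0. Illustrative values only.
        BETA_L = [1.8751, 4.6941, 7.8548]

        def nat_freqs(E, I, rho, A, L):
            return [(bl / L) ** 2 * np.sqrt(E * I / (rho * A)) / (2 * np.pi)
                    for bl in BETA_L]

        E, I, rho, A, L = 70e9, 1e-8, 2700.0, 1e-4, 1.0   # hypothetical aluminum beam
        healthy = nat_freqs(E, I, rho, A, L)
        damaged = nat_freqs(0.95 * E, I, rho, A, L)       # 5% loss of bending stiffness

        for n, (f0, f1) in enumerate(zip(healthy, damaged), 1):
            shift = (f0 - f1) / f0
            flag = "FAULT?" if shift > 0.01 else "ok"
            print(f"mode {n}: {f0:7.2f} Hz -> {f1:7.2f} Hz ({100 * shift:.1f}% {flag})")

    A fixed relative-shift threshold like the one above is exactly the kind of crisp decision rule that lends itself to formal verification in a tool such as PVS.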

  10. Sex education and contraceptive use at coital debut in the United States: results from Cycle 6 of the National Survey of Family Growth.

    PubMed

    Isley, Michelle M; Edelman, Alison; Kaneshiro, Bliss; Peters, Dawn; Nichols, Mark D; Jensen, Jeffrey T

    2010-09-01

    The study was conducted to characterize the relationship between formal sex education and the use and type of contraceptive method used at coital debut among female adolescents. This study employed a cross-sectional, nationally representative database (2002 National Survey of Family Growth). Contraceptive use and type used were compared among sex education groups [abstinence only (AO), birth control methods only (MO) and comprehensive (AM)]. Analyses also evaluated the association between demographic, socioeconomic, behavioral variables and sex education. Multiple logistic regression with adjustment for sampling design was used to measure associations of interest. Of 1150 adolescent females aged 15-19 years, 91% reported formal sex education (AO 20.4%, MO 4.9%, AM 65.1%). The overall use of contraception at coitarche did not differ between groups. Compared to the AO and AM groups, the proportion who used a reliable method in the MO group (37%) was significantly higher (p=.03) (vs. 15.8% and 14.8%, respectively). Data from the 2002 NSFG do not support an association between type of formal sex education and contraceptive use at coitarche but do support an association between abstinence-only messaging and decreased reliable contraceptive method use at coitarche. Copyright 2010 Elsevier Inc. All rights reserved.

  11. Quasipolynomial generalization of Lotka-Volterra mappings

    NASA Astrophysics Data System (ADS)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry and economics. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable insofar as the quasipolynomial methodology had never before been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.

  12. An analytical formalism accounting for clouds and other `surfaces' for exoplanet transmission spectroscopy

    NASA Astrophysics Data System (ADS)

    Bétrémieux, Yan; Swain, Mark R.

    2017-05-01

    Although the formalism of Lecavelier des Etangs et al. is extremely useful to understand what shapes transmission spectra of exoplanets, it does not include the effects of a sharp change in flux with altitude generally associated with surfaces and optically thick clouds. Recent advances in understanding the effects of refraction in exoplanet transmission spectra have, however, demonstrated that even clear thick atmospheres have such a sharp change in flux due to a refractive boundary. We derive a more widely applicable analytical formalism by including first-order effects from all these 'surfaces' to compute an exoplanet's effective radius, effective atmospheric thickness and spectral modulation for an atmosphere with a constant scaleheight. We show that the effective radius cannot be located below these 'surfaces' and that our formalism matches the formalism of Lecavelier des Etangs et al. in the case of a clear atmosphere. Our formalism explains why clouds and refraction reduce the contrast of spectral features, and why refraction decreases the Rayleigh scattering slope as wavelength increases, but also shows that these are common effects of all 'surfaces'. We introduce the concept of a 'surface' cross-section, the minimum mean cross-section that can be observed, as an index to characterize the location of 'surfaces' and provide a simple method to estimate their effects on the spectral modulation of homogeneous atmospheres. We finally devise a numerical recipe that extends our formalism to atmospheres with a non-constant scaleheight and arbitrary sources of opacity, a potentially necessary step to interpret observations.
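
    In common notation, the clear-atmosphere result of Lecavelier des Etangs et al. that is generalized here ties the slope of the effective radius to the scaleheight, while a 'surface' at radius R_s clips the effective radius from below:

        H = \frac{k_{B} T}{\mu g}, \qquad \frac{\mathrm{d}R_{\mathrm{eff}}}{\mathrm{d}\ln\lambda} = \alpha H \;\; (\sigma \propto \lambda^{\alpha}), \qquad R_{\mathrm{eff}}(\lambda) \geq R_{\mathrm{s}},

    so spectral features whose clear-atmosphere effective radius would fall below R_s are truncated, reducing their contrast exactly as described above.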

  13. Generalized Bondi-Sachs equations for characteristic formalism of numerical relativity

    NASA Astrophysics Data System (ADS)

    Cao, Zhoujian; He, Xiaokai

    2013-11-01

    The Cauchy formalism of numerical relativity has been successfully applied to simulate various dynamical spacetimes without any symmetry assumption, but discovering how to set a mathematically consistent and physically realistic boundary condition is still an open problem for it. In addition, numerical truncation error and finite-region ambiguity affect the accuracy of gravitational waveform calculation. Regarding the finite-region ambiguity, the characteristic extraction method helps considerably, but it does not resolve all of the above issues. Computational efficiency is yet another concern for the Cauchy formalism. Although the characteristic formalism of numerical relativity suffers from caustics in the inner near zone, it has advantages with respect to all of the issues listed above. Cauchy-characteristic matching (CCM) is a possible way to exploit these advantages of the characteristic formalism while treating the inner caustics at the same time. CCM has difficulty, however, in treating the gauge difference between the Cauchy part and the characteristic part. We propose generalized Bondi-Sachs equations for the characteristic formalism for use in Cauchy-characteristic matching. Our proposal admits the same numerical evolution scheme for both the Cauchy part and the characteristic part, and our generalized Bondi-Sachs equations have one adjustable gauge freedom which can be used to match the gauge used in the Cauchy part. These equations can therefore make the Cauchy part and the characteristic part share a consistent gauge condition, giving a possible new starting point for Cauchy-characteristic matching.

  14. HUMAN FECAL SOURCE IDENTIFICATION: REAL-TIME QUANTITATIVE PCR METHOD STANDARDIZATION - abstract

    EPA Science Inventory

    Method standardization or the formal development of a protocol that establishes uniform performance benchmarks and practices is necessary for widespread adoption of a fecal source identification approach. Standardization of a human-associated fecal identification method has been...

  15. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specifications can be prepared using various techniques. One of them is the widely understood and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system will meet the project requirements. In our approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In this paper we propose to use UML activity diagrams for requirements definition and then to formalize them as temporal logic formulas, as in the example below. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
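
    As an example of the kind of requirement that gets formalized, "every request is eventually followed by an acknowledgement" can be written in CTL (the proposition names are hypothetical) as

        \mathrm{AG}\,(\mathit{request} \rightarrow \mathrm{AF}\,\mathit{ack}),

    which a model checker can then verify against the controller specification derived from the activity diagram.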

  16. Teaching Astronomy in non-formal education: stars workshop

    NASA Astrophysics Data System (ADS)

    Hernán-Obispo, M.; Crespo-Chacón, I.; Gálvez, M. C.; López-Santiago, J.

    One of the fields in which teaching Astronomy is most in demand is non-formal education. The Stars Workshop we present in this contribution consisted of an introduction to Astronomy and observation methods. The main objectives were: to know the main components of the Universe, their characteristics and the scales of size and time between them; to understand the movement of the different celestial objects; to know the different observational techniques; to appreciate the different historical explanations of the Earth and the position of Humanity in the Universe. This Stars Workshop was a collaboration with the Escuela de Tiempo Libre Jumavi, a school dedicated to training and non-formal education in the leisure field.

  17. A Framework for Modeling Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  18. The inner mass power spectrum of galaxies using strong gravitational lensing: beyond linear approximation

    NASA Astrophysics Data System (ADS)

    Chatterjee, Saikat; Koopmans, Léon V. E.

    2018-02-01

    In the last decade, the detection of individual massive dark matter sub-haloes has become possible using the potential-correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism relating strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from the dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to the weak-lensing formalism, and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early-type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope with typical signal-to-noise ratios obtained in a single orbit.

  19. E-assessment of prior learning: a pilot study of interactive assessment of staff with no formal education who are working in Swedish elderly care

    PubMed Central

    2014-01-01

    Background The current paper presents a pilot study of interactive assessment using information and communication technology (ICT) to evaluate the knowledge, skills and abilities of staff with no formal education who are working in Swedish elderly care. Methods Theoretical and practical assessment methods were developed and used with simulated patients and computer-based tests to identify strengths and areas for personal development among staff with no formal education. Results Of the 157 staff with no formal education, 87 began the practical and/or theoretical assessments, and 63 completed both assessments. Several of the staff passed the practical assessments, with the exception of the morning hygiene assessment, which several failed. Other areas for staff development, i.e. those in which more than 50% failed, were the theoretical assessments of the learning objectives: Health, Oral care, Ergonomics, hygiene, esthetic, environmental, Rehabilitation, Assistive technology, Basic healthcare and Laws and organization. None of the staff passed all assessments. Number of years working in elderly care and staff age were not statistically significantly related to the total score of grades on the various learning objectives. Conclusion The interactive assessments were useful in assessing staff members’ practical and theoretical knowledge, skills, and abilities and in identifying areas in need of development. It is important that personnel who lack formal qualifications be clearly identified and given a chance to develop their competence through training, both theoretical and practical. The interactive e-assessment approach analyzed in the present pilot study could serve as a starting point. PMID:24742168

  20. 40 CFR 261.20 - General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... will consider a sample obtained using any of the applicable sampling methods specified in appendix I to... appendix I sampling methods are not being formally adopted by the Administrator, a person who desires to employ an alternative sampling method is not required to demonstrate the equivalency of his method under...

  1. Discrete Fourier transforms of nonuniformly spaced data

    NASA Technical Reports Server (NTRS)

    Swan, P. R.

    1982-01-01

    Time series or spatial series of measurements taken with nonuniform spacings have failed to yield fully to analysis using the Discrete Fourier Transform (DFT). This is due to the fact that the formal DFT is the convolution of the transform of the signal with the transform of the nonuniform spacings. Two original methods are presented for deconvolving such transforms for signals containing significant noise. The first method solves a set of linear equations relating the observed data to values defined at uniform grid points, and then obtains the desired transform as the DFT of the uniform interpolates. The second method solves a set of linear equations relating the real and imaginary components of the formal DFT directly to those of the desired transform. The results of numerical experiments with noisy data are presented in order to demonstrate the capabilities and limitations of the methods.
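
    A sketch of the first method, assuming a band-limited signal so that the matrix relating the nonuniform samples to values on a uniform grid is a sinc interpolation matrix (this particular basis and all parameters are illustrative choices, not taken from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 64, 64.0                          # uniform grid size, record length
        t_obs = np.sort(rng.uniform(0.0, T, 80)) # nonuniform sample times
        noisy = np.sin(2 * np.pi * 5 * t_obs / T) + 0.2 * rng.standard_normal(80)

        # linear system: nonuniform samples = A @ (values on the uniform grid)
        t_grid = np.arange(N) * T / N
        A = np.sinc((t_obs[:, None] - t_grid[None, :]) * N / T)
        uniform, *_ = np.linalg.lstsq(A, noisy, rcond=None)

        # the desired transform is then the ordinary DFT of the interpolates
        spectrum = np.fft.rfft(uniform)
        print("peak bin:", np.argmax(np.abs(spectrum[1:])) + 1)   # expect 5

    Solving in the least-squares sense is what gives the method its tolerance to the significant noise levels the abstract mentions.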

  2. Non-perturbative background field calculations

    NASA Astrophysics Data System (ADS)

    Stephens, C. R.

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.

  3. A new method based on the Butler-Volmer formalism to evaluate voltammetric cation and anion sensors.

    PubMed

    Cano, Manuel; Rodríguez-Amaro, Rafael; Fernández Romero, Antonio J

    2008-12-11

    A new method based on the Butler-Volmer formalism is applied to assess the capability of two voltammetric ion sensors based on polypyrrole films: PPy/DBS and PPy/ClO4 modified electrodes were studied as voltammetric cation and anion sensors, respectively. Semilogarithmic plots of reversible potential versus electrolyte concentration provided positive calibration slopes for PPy/DBS and negative ones for PPy/ClO4, as expected both from the proposed method and from that based on the Nernst equation. The slope expressions deduced from the Butler-Volmer formalism include the electron-transfer coefficient, which allows slope values different from the ideal Nernstian value to be explained. Both polymeric films exhibited a degree of ion selectivity when immersed in mixed-analyte solutions. Selectivity coefficients for the two proposed voltammetric cation and anion sensors were obtained by several experimental methods, including the separated solution method (SSM) and the matched potential method (MPM). The K values obtained by the different methods were very close for both polymeric sensors.
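
    For reference, the Butler-Volmer relation in common notation, whose transfer coefficients enter the calibration-slope expressions and thereby allow non-Nernstian slopes:

        i = i_{0}\left[\exp\!\left(\frac{\alpha_{a} n F \eta}{R T}\right) - \exp\!\left(-\frac{\alpha_{c} n F \eta}{R T}\right)\right],

    where i_0 is the exchange current, eta the overpotential, and alpha_a, alpha_c the anodic and cathodic transfer coefficients; a slope proportional to RT/(alpha nF) rather than the Nernstian RT/(nF) accommodates the non-ideal calibration slopes mentioned above.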

  4. Spinor helicity methods in high-energy factorization: Efficient momentum-space calculations in the Color Glass Condensate formalism

    NASA Astrophysics Data System (ADS)

    Ayala, Alejandro; Hentschinski, Martin; Jalilian-Marian, Jamal; Tejeda-Yeomans, Maria Elena

    2017-07-01

    We use the spinor helicity formalism to calculate the cross section for production of three partons of a given polarization in Deep Inelastic Scattering (DIS) off proton and nucleus targets at small Bjorken x. The target proton or nucleus is treated as a classical color field (shock wave) from which the produced partons scatter multiple times. We reported our result for the final expression for the production cross section and studied the azimuthal angular correlations of the produced partons in [1]. Here we provide the full details of the calculation of the production cross section using the spinor helicity methods.

  5. The HACMS program: using formal methods to eliminate exploitable bugs

    PubMed Central

    Launchbury, John; Richards, Raymond

    2017-01-01

    For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA’s HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue ‘Verified trustworthy software systems’. PMID:28871050

  6. The HACMS program: using formal methods to eliminate exploitable bugs.

    PubMed

    Fisher, Kathleen; Launchbury, John; Richards, Raymond

    2017-10-13

    For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA's HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Authors.

  7. Efficacy of changing physics misconceptions held by ninth grade students at varying developmental levels through teacher addition of a prediction phase to the learning cycle

    NASA Astrophysics Data System (ADS)

    Oglesby, Michael L.

    This study examines the efficacy of correcting student misconceptions about science concepts by using the pedagogical method of asking students to make a prediction in science laboratory lessons, for students within pre-formal, transitional, or formal stages of cognitive development. The subjects were students (n = 235) enrolled in ninth grade physical science classes (n = 15) in one high school of an urban profile school district. The four freshmen physical science teachers who were part of the study routinely taught the concepts in the study as a part of the normal curriculum during the time of the school year in which the research was conducted. Classrooms representing approximately half of the students were presented with a prediction phase at the start of each of ten learning cycle lessons. The other classrooms were not presented with a prediction phase. Students were pre- and post-tested using a 40-question instrument based on the Force Concept Inventory, augmented with questions on the concepts taught during the period of the study. Students were also tested using the Test of Scientific Reasoning to determine their cognitive developmental level. Results showed 182 of the students to be cognitively pre-formal, 50 to be transitional, and only 3 to be cognitively formal. There were significantly higher gains (p < .05) for the formal group over the transitional group and for the transitional group over the pre-formal group. However, there were not significantly higher gains (p > .05) for the total students having a prediction phase compared to those not having a prediction phase. Neither were there significant gains (p > .05) within the pre-formal group or within the transitional group. There were too few students within the formal group for meaningful results.

  8. Access and acceptability of community-based services for older Greek migrants in Australia: user and provider perspectives.

    PubMed

    Hurley, Catherine; Panagiotopoulos, Georgia; Tsianikas, Michael; Newman, Lareen; Walker, Ruth

    2013-03-01

    In most developed nations, ageing migrants represent a growing proportion of the older population. Policies that emphasise care in the community depend on older migrants having access to formal services along with informal support, yet little is known about how older migrants experience community-based formal services. By examining the views of both Greek elders in Australia and those of formal service providers, this research fills an important gap in the literature around access to and acceptability of formal community-based services for older migrants. A research team including two Greek background researchers used existing social groups and a snowball sampling method to conduct face-to-face interviews and focus groups with seventy older Greeks in Adelaide, Australia. In addition, 22 community-based service providers were interviewed over the telephone. Results from users and providers showed that while many older Greeks experience service access issues, they also relied heavily on family for support and assistance at home. Reliance on family was both in preference to formal services or where formal services were used, to locate, negotiate and monitor such services. Common barriers identified by both groups included cost, transport and availability, but additional challenges were posed by language, literacy and cultural attitudes. Demographic changes including greater employment mobility and female workforce participation among adult children will have implications for both formal and informal care providers. Formal service providers need to ensure that services are promoted and delivered to take account of the important role of family in informal support while also addressing the access challenges posed by language and literacy. Research conducted by researchers from the same cultural background in the respondent's native language can further advance knowledge in this area. © 2012 Blackwell Publishing Ltd.

  9. Student Perceptions of Instructional Methods towards Alternative Energy Education

    ERIC Educational Resources Information Center

    Sallee, Clayton W.; Edgar, Don W.; Johnson, Donald M.

    2013-01-01

    The effectiveness of different methods of instruction has been discussed since the early years of formal education systems. Lecture has been deemed the most common method of presenting information to students (Kindsvatter, Wilen, & Ishler, 1992; Waldron & Moore, 1991) and the demonstration method has been symbolized as the most effective…

  10. Equivalence of quantum Boltzmann equation and Kubo formula for dc conductivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Z.B.; Chen, L.Y.

    1990-02-01

    This paper presents a derivation of the quantum Boltzmann equation for linear dc transport, with a correction term to Mahan-Hansch's equations, and derives a formal solution to it. Based on this formal solution, the authors find that the electrical conductivity can be expressed as the retarded current-current correlation function. The authors thereby explicitly demonstrate the equivalence of the two most important theoretical methods: the quantum Boltzmann equation and the Kubo formula.
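
    Schematically, the Kubo side of the equivalence expresses the dc conductivity through the retarded current-current correlation function (common notation, per unit volume):

        \sigma_{\mu\nu} = \lim_{\omega \to 0} \frac{1}{i\omega}\left[\Pi^{R}_{\mu\nu}(\omega) - \Pi^{R}_{\mu\nu}(0)\right], \qquad \Pi^{R}_{\mu\nu}(t) = -\frac{i}{\hbar}\,\theta(t)\,\big\langle\,[\,j_{\mu}(t),\, j_{\nu}(0)\,]\,\big\rangle .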

  11. Formal modeling of a system of chemical reactions under uncertainty.

    PubMed

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.

  12. 40 CFR Appendix B to Part 425 - Modified Monier-Williams Method

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Modified Monier-Williams Method B... Appendix B to Part 425—Modified Monier-Williams Method Outline of Method Hydrogen sulfide is liberated from.... Quality Control 1. Each laboratory that uses this method is required to operate a formal quality control...

  13. 40 CFR Appendix B to Part 425 - Modified Monier-Williams Method

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Modified Monier-Williams Method B... Part 425—Modified Monier-Williams Method Outline of Method Hydrogen sulfide is liberated from an.... Quality Control 1. Each laboratory that uses this method is required to operate a formal quality control...

  14. 40 CFR Appendix B to Part 425 - Modified Monier-Williams Method

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Modified Monier-Williams Method B... Part 425—Modified Monier-Williams Method Outline of Method Hydrogen sulfide is liberated from an.... Quality Control 1. Each laboratory that uses this method is required to operate a formal quality control...

  15. Teaching medical students how to teach: a national survey of students-as-teachers programs in U.S. medical schools.

    PubMed

    Soriano, Rainier P; Blatt, Benjamin; Coplit, Lisa; CichoskiKelly, Eileen; Kosowicz, Lynn; Newman, Linnie; Pasquale, Susan J; Pretorius, Richard; Rosen, Jonathan M; Saks, Norma S; Greenberg, Larrie

    2010-11-01

    A number of U.S. medical schools started offering formal students-as-teachers (SAT) training programs to assist medical students in their roles as future teachers. The authors report results of a national survey of such programs in the United States. In 2008, a 23-item survey was sent to 130 MD-granting U.S. schools. Responses to selective choice questions were quantitatively analyzed. Open-ended questions about benefits and barriers to SAT programs were given qualitative analyses. Ninety-nine U.S. schools responded. All used their medical students as teachers, but only 44% offered a formal SAT program. Most (95%) offered formal programs in the senior year. Common teaching strategies included small-group work, lectures, role-playing, and direct observation. Common learning content areas were small-group facilitation, feedback, adult learning principles, and clinical skills teaching. Assessment methods included evaluations from student-learners (72%) and direct observation/videotaping (59%). From the qualitative analysis, benefit themes included development of future physician-educators, enhancement of learning, and teaching assistance for faculty. Obstacles were competition with other educational demands, difficulty in faculty recruitment/retention, and difficulty in convincing others of program value. Formal SAT programs exist for 43 of 99 U.S. medical school respondents. Such programs should be instituted in all schools that use their students as teachers. National teaching competencies, best curriculum methods, and best methods to conduct skills reinforcement need to be determined. Finally, the SAT programs' impacts on patient care, on selection decisions of residency directors, and on residents' teaching effectiveness are areas for future research.

  16. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  17. Using Penelope to assess the correctness of NASA Ada software: A demonstration of formal methods as a counterpart to testing

    NASA Technical Reports Server (NTRS)

    Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey

    1993-01-01

    Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.

  18. Why formal learning theory matters for cognitive science.

    PubMed

    Fulop, Sean; Chater, Nick

    2013-01-01

    This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review. Copyright © 2013 Cognitive Science Society, Inc.

  19. Formal design specification of a Processor Interface Unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1992-01-01

    This report describes work to formally specify the requirements and design of a processor interface unit (PIU), a single-chip subsystem providing memory-interface, bus-interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The need for high-quality design assurance in such applications is an undisputed fact, given the disastrous consequences that even a single design flaw can produce. Thus, the further development and application of formal methods to fault-tolerant systems is of critical importance as these systems see increasing use in modern society.

  20. Application of the Extended Completeness Relation to the Absorbing Boundary Condition

    NASA Astrophysics Data System (ADS)

    Iwasaki, Masataka; Otani, Reiji; Ito, Makoto

    The strength function of the linear response to the external field is calculated in the formalism of the absorbing boundary condition (ABC). The dipole excitation of a schematic two-body system is treated in the present study. The extended completeness relation, which is assumed by analogy with the formulation in the complex scaling method (CSM), is applied to the calculation of the strength function. The calculation of the strength function is successful in the present formalism; hence, the extended completeness relation seems to work well in the ABC formalism. The contributions from the resonance and the non-resonant continuum are also analyzed according to the decomposition of the energy levels in the extended completeness relation.

  1. Ward identity and basis tensor gauge theory at one loop

    NASA Astrophysics Data System (ADS)

    Chung, Daniel J. H.

    2018-06-01

    Basis tensor gauge theory (BTGT) is a reformulation of ordinary gauge theory that is an analog of the vierbein formulation of gravity and is related to the Wilson line formulation. To match ordinary gauge theories coupled to matter, the BTGT formalism requires a continuous symmetry that we call the BTGT symmetry in addition to the ordinary gauge symmetry. After classically interpreting the BTGT symmetry, we construct using the BTGT formalism the Ward identities associated with the BTGT symmetry and the ordinary gauge symmetry. For a way of testing the quantum stability and the consistency of the Ward identities with a known regularization method, we explicitly renormalize the scalar QED at one loop using dimensional regularization using the BTGT formalism.

  2. A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Aoyama, Mikio

    Information systems are ubiquitous in our daily life and thus need to work appropriately anywhere, at any time, for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of system functionality. However, the diversity of usage contexts requires a fundamental change in our thinking on information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of information systems. This chapter presents a method for capturing, structuring and reconciling the diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals in a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through application to self-checkout systems for large-scale supermarkets.
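
    The basic formal-concept-analysis construction behind such a goal lattice can be sketched by brute force for a small context; the stakeholder goals and attributes below are hypothetical:

        from itertools import combinations

        # All formal concepts (extent, intent) of a small context, i.e. the
        # nodes of the concept lattice. Brute force; fine for small contexts.
        objects = {"g1": {"safety", "speed"},
                   "g2": {"safety", "cost"},
                   "g3": {"safety", "speed", "cost"}}

        def common_attrs(objs):              # derivation: objects -> attributes
            sets = [objects[o] for o in objs]
            return set.intersection(*sets) if sets else set.union(*objects.values())

        def common_objs(attrs):              # derivation: attributes -> objects
            return {o for o, a in objects.items() if attrs <= a}

        concepts = set()
        for r in range(len(objects) + 1):
            for objs in combinations(sorted(objects), r):
                intent = common_attrs(set(objs))
                extent = common_objs(intent)         # close the chosen object set
                concepts.add((frozenset(extent), frozenset(intent)))

        for ext, int_ in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(ext), sorted(int_))

    Ordering the resulting concepts by inclusion of their extents yields the lattice in which shared goals sit above the more specific stakeholder intentions they subsume.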

  3. Towards improving phenotype representation in OWL

    PubMed Central

    2012-01-01

    Background Phenotype ontologies are used in species-specific databases for the annotation of mutagenesis experiments and to characterize human diseases. The Entity-Quality (EQ) formalism is a means to describe complex phenotypes based on one or more affected entities and a quality. EQ-based definitions have been developed for many phenotype ontologies, including the Human and Mammalian Phenotype ontologies. Methods We analyze formalizations of complex phenotype descriptions in the Web Ontology Language (OWL) that are based on the EQ model, identify several representational challenges and analyze potential solutions to address these challenges. Results In particular, we suggest a novel, role-based approach to represent relational qualities such as concentration of iron in spleen, discuss its ontological foundation in the General Formal Ontology (GFO) and evaluate its representation in OWL and the benefits it can bring to the representation of phenotype annotations. Conclusion Our analysis of OWL-based representations of phenotypes can contribute to improving consistency and expressiveness of formal phenotype descriptions. PMID:23046625

  4. [How to write an article: formal aspects].

    PubMed

    Corral de la Calle, M A; Encinas de la Iglesia, J

    2013-06-01

    Scientific research and the publication of the results of the studies go hand in hand. Exquisite research methods can only be adequately reflected in formal publication with the optimum structure. To ensure the success of this process, it is necessary to follow orderly steps, including selecting the journal in which to publish and following the instructions to authors strictly as well as the guidelines elaborated by diverse societies of editors and other institutions. It is also necessary to structure the contents of the article in a logical and attractive way and to use an accurate, clear, and concise style of language. Although not all the authors are directly involved in the actual writing, elaborating a scientific article is a collective undertaking that does not finish until the article is published. This article provides practical advice about formal and not-so-formal details to take into account when writing a scientific article as well as references that will help readers find more information in greater detail. Copyright © 2012 SERAM. Published by Elsevier Espana. All rights reserved.

  5. Interoperability between biomedical ontologies through relation expansion, upper-level ontologies and automatic reasoning.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Rebholz-Schuhmann, Dietrich; Schofield, Paul N; Gkoutos, Georgios V

    2011-01-01

    Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data- and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.

  6. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  7. Which Method of Assigning Bond Orders in Lewis Structures Best Reflects Experimental Data? An Analysis of the Octet Rule and Formal Charge Systems for Period 2 and 3 Nonmetallic Compounds

    ERIC Educational Resources Information Center

    See, Ronald F.

    2009-01-01

    Two systems were evaluated for drawing Lewis structures of period 2 and 3 non-metallic compounds: the octet rule and minimization of formal charge. The test set of molecules consisted of the oxides, halides, oxohalides, oxoanions, and oxoacids of B, N, O, F, Al, P, S, and Cl. Bond orders were quantified using experimental data, including bond…
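
    The formal-charge system the study evaluates rests on the textbook definition FC = V − N(lone) − B/2. A small helper makes competing Lewis structures easy to compare; the sulfate example below is a standard illustration, not data from the paper:

    ```python
    def formal_charge(valence: int, lone_pair_electrons: int, bonding_electrons: int) -> int:
        """Textbook formal charge: FC = V - N_lone - B/2."""
        return valence - lone_pair_electrons - bonding_electrons // 2

    # Sulfur in sulfate, expanded-octet structure (two S=O, two S-O bonds):
    print(formal_charge(6, 0, 12))  # 0  (formal charge minimized)
    # Sulfur in sulfate, octet-rule structure (four S-O single bonds):
    print(formal_charge(6, 0, 8))   # +2
    ```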

  8. A Formal Specification and Verification Method for the Prevention of Denial of Service in Ada Services

    DTIC Science & Technology

    1988-03-01

    …denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and… recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy.

  9. Changes in Adolescents’ Receipt of Sex Education, 2006–2013

    PubMed Central

    Lindberg, Laura Duberstein; Maddow-Zimet, Isaac; Boonstra, Heather

    2016-01-01

    Purpose Updated estimates of adolescents’ receipt of sex education are needed to monitor changing access to information. Methods Using nationally representative data from the 2006–2010 and 2011–2013 National Survey of Family Growth, we estimated changes over time in adolescents’ receipt of sex education from formal sources and from parents and differentials in these trends by adolescents’ gender, race/ethnicity, age, and place of residence. Results Between 2006–2010 and 2011–2013, there were significant declines in adolescent females’ receipt of formal instruction about birth control (70% to 60%), saying no to sex (89% to 82%), sexually transmitted disease (94% to 90%), and HIV/AIDS (89% to 86%). There was a significant decline in males’ receipt of instruction about birth control (61% to 55%). Declines were concentrated among adolescents living in nonmetropolitan areas. The proportion of adolescents talking with their parents about sex education topics did not change significantly. Twenty-one percent of females and 35% of males did not receive instruction about methods of birth control from either formal sources or a parent. Conclusions Declines in receipt of formal sex education and low rates of parental communication may leave adolescents without instruction, particularly in nonmetropolitan areas. More effort is needed to understand this decline and to explore adolescents’ potential other sources of reproductive health information. PMID:27032487

  10. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text, which should then be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process helped the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
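
    As a rough illustration of the graph-based checking described above (the paper's actual formalism is richer than this), instructions can be modeled as nodes and "next step" relations as edges, after which unreachable steps and undocumented branch points are easy to flag. The SOP content below is hypothetical:

    ```python
    import networkx as nx

    sop = nx.DiGraph()
    sop.add_edges_from([
        ("assess_patient", "confirm_diagnosis"),
        ("confirm_diagnosis", "select_dose"),
        ("confirm_diagnosis", "order_additional_tests"),
        ("select_dose", "administer_drug"),
        ("monitor_adverse_events", "administer_drug"),  # never reached from the entry point
    ])

    start = "assess_patient"
    reachable = nx.descendants(sop, start) | {start}
    print("Possible omissions (unreachable steps):", set(sop.nodes) - reachable)

    # Branch points whose alternatives lack a recorded decision criterion may
    # indicate ambiguity in the original text.
    print("Branch points to review:", [n for n in sop.nodes if sop.out_degree(n) > 1])
    ```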

  11. VIPER project

    NASA Technical Reports Server (NTRS)

    Kershaw, John

    1990-01-01

    The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.

  12. Magnetopause Standoff Position Changes and Geosynchronous Orbit Crossings: Models and Observations

    NASA Astrophysics Data System (ADS)

    Collado-Vega, Y. M.; Rastaetter, L.; Sibeck, D. G.

    2017-12-01

    The Earth's magnetopause is the boundary that separates the solar wind from the Earth's magnetosphere. Its location has been studied and estimated via simulation models, observational data, and empirical models. This research aims to study the changes of the magnetopause standoff location under different solar wind conditions using a combination of all these methods. We will use the Run-On-Request capabilities of the MHD models available from the Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center, specifically the BATS-R-US (SWMF), OpenGGCM, LFM and GUMICS models. The magnetopause standoff position prediction and response time to solar wind changes will then be compared to results from available empirical models (e.g., Shue et al. 1998) and to magnetopause crossing observations from the THEMIS, Cluster, Geotail and MMS missions. We will also use times of extreme solar wind conditions where magnetopause crossings have been observed by the GOES satellites. Rigorous analysis and comparison of observations and empirical models is critical in determining magnetosphere dynamics for model validation. This research also goes hand in hand with the efforts of the working group at the CCMC/LWS International Forum for Space Weather Capabilities Assessment workshop that aims to analyze different events to define metrics for model-data comparison. Preliminary results of this particular research show that there are some discrepancies between the MHD models' standoff positions of the dayside magnetopause for the same solar wind conditions, which include an increase in solar wind dynamic pressure and a step function in the IMF Bz component. In cases of nominal solar wind conditions, the models mostly agree with the observational data from the different satellite missions.
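
    Since the abstract compares the simulations against the Shue et al. (1998) empirical model, a small implementation of that model's subsolar standoff distance is sketched below. The coefficients are the ones commonly quoted in the literature and should be checked against the original paper:

    ```python
    import numpy as np

    def shue1998_standoff(bz_nT: float, pdyn_nPa: float) -> float:
        """Subsolar magnetopause distance (Earth radii), Shue et al. (1998) form."""
        return (10.22 + 1.29 * np.tanh(0.184 * (bz_nT + 8.14))) * pdyn_nPa ** (-1.0 / 6.6)

    print(shue1998_standoff(0.0, 2.0))     # nominal solar wind: ~10 Re
    print(shue1998_standoff(-10.0, 10.0))  # southward IMF + pressure pulse: ~7 Re,
                                           # approaching geosynchronous orbit (6.6 Re)
    ```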

  13. Theory of the dynamical thermal conductivity of metals

    NASA Astrophysics Data System (ADS)

    Bhalla, Pankaj; Kumar, Pradeep; Das, Nabyendu; Singh, Navinder

    2016-09-01

    Mori's projection method, also known as the memory-function method, is an important theoretical formalism for studying various transport coefficients. In the present work, we calculate the dynamical thermal conductivity of metals using the memory-function formalism. We introduce thermal memory functions for the first time and discuss the behavior of the thermal conductivity both in the zero-frequency limit and at nonzero frequencies. We compare our results for the zero-frequency case with those obtained by the Bloch-Boltzmann kinetic approach and find that the two approaches agree. Motivated by some recent experimental advancements, we obtain several new results for the ac, or dynamical, thermal conductivity.
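
    For orientation, the memory-function (generalized Drude) structure underlying such calculations is shown below; the thermal expression is only schematic of the form, not the paper's exact result:

    ```latex
    % Goetze-Woelfle generalized Drude form for the dynamical conductivity, and
    % the analogous schematic structure with a thermal memory function M_th.
    \begin{equation}
      \sigma(\omega) \;=\; \frac{i}{4\pi}\,\frac{\omega_{p}^{2}}{\omega + M(\omega)},
      \qquad
      \kappa(\omega) \;\propto\; \frac{i}{\omega + M_{\mathrm{th}}(\omega)} .
    \end{equation}
    ```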

  14. Communication: GAIMS—Generalized Ab Initio Multiple Spawning for both internal conversion and intersystem crossing processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curchod, Basile F. E.; Martínez, Todd J., E-mail: toddjmartinez@gmail.com; SLAC National Accelerator Laboratory, Menlo Park, California 94025

    2016-03-14

    Full multiple spawning is a formally exact method to describe the excited-state dynamics of molecular systems beyond the Born-Oppenheimer approximation. However, it has been limited until now to the description of radiationless transitions taking place between electronic states with the same spin multiplicity. This Communication presents a generalization of the full and ab initio multiple spawning methods to both internal conversion (mediated by nonadiabatic coupling terms) and intersystem crossing events (triggered by spin-orbit coupling matrix elements) based on a spin-diabatic representation. The results of two numerical applications, a model system and the deactivation of thioformaldehyde, validate the presented formalism and its implementation.

  15. Thin-film limit formalism applied to surface defect absorption.

    PubMed

    Holovský, Jakub; Ballif, Christophe

    2014-12-15

    The thin-film limit is derived by a nonconventional approach, and equations for transmittance, reflectance and absorptance are presented in a highly versatile and accurate form. In the thin-film limit the optical properties do not depend on the absorption coefficient, thickness and refractive index individually, but only on their product. We show that this formalism is applicable to the problem of an ultrathin defective layer, e.g., on top of a layer of amorphous silicon. We develop a new method for the direct evaluation of the surface defective layer and the bulk defects. Applying this method to amorphous silicon on glass, we show that the surface defective layer differs from bulk amorphous silicon in terms of light soaking.

  16. Formal concept analysis with background knowledge: a case study in paleobiological taxonomy of belemnites

    NASA Astrophysics Data System (ADS)

    Belohlavek, Radim; Kostak, Martin; Osicka, Petr

    2013-05-01

    We present a case study in the identification of taxa in paleobiological data. Our approach utilizes formal concept analysis and is based on conceiving a taxon as a group of individuals sharing a collection of attributes. In addition to the incidence relation between individuals and their attributes, the method uses expert background knowledge regarding the importance of attributes, which helps to filter out correctly formed but paleobiologically irrelevant taxa. We present results of experiments carried out with belemnites, a group of extinct cephalopods that seems particularly suitable for such a purpose. We demonstrate that the methods are capable of revealing taxa, and relationships among them, that are relevant from a paleobiological point of view.

  17. NASA plan for international crustal dynamics studies

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The international activities being planned as part of the NASA geodynamics program are described. Methods of studying the Earth's crustal movements and deformation characteristics are discussed. The significance of the eventual formulation of earthquake prediction methods is also discussed.

  18. Three methods of interfacing with the private sector by mental health agencies.

    PubMed

    McRae, J A

    1989-01-01

    This article outlines three methods of mental health marketing--formal, intermediary, and interactive. It discusses advantages and disadvantages of each method. These approaches are particularly good for public, non-profit agencies and individuals in contacting the private sector. The need for flexibility and marketing mix is emphasized.

  19. An evaluation of the current state of genomic data privacy protection technology and a roadmap for the future.

    PubMed

    Malin, Bradley A

    2005-01-01

    The incorporation of genomic data into personal medical records poses many challenges to patient privacy. In response, various systems for preserving patient privacy in shared genomic data have been developed and deployed. Although these systems de-identify the data by removing explicit identifiers (e.g., name, address, or Social Security number) and incorporate sound security design principles, they suffer from a lack of formal modeling of inferences learnable from shared data. This report evaluates the extent to which current protection systems are capable of withstanding a range of re-identification methods, including genotype-phenotype inferences, location-visit patterns, family structures, and dictionary attacks. For a comparative re-identification analysis, the systems are mapped to a common formalism. Although there is variation in susceptibility, each system is deficient in its protection capacity. The author discovers patterns of protection failure and discusses several of the reasons why these systems are susceptible. The analyses and discussion within provide guideposts for the development of next-generation protection methods amenable to formal proofs.

  20. An Evaluation of the Current State of Genomic Data Privacy Protection Technology and a Roadmap for the Future

    PubMed Central

    Malin, Bradley A.

    2005-01-01

    The incorporation of genomic data into personal medical records poses many challenges to patient privacy. In response, various systems for preserving patient privacy in shared genomic data have been developed and deployed. Although these systems de-identify the data by removing explicit identifiers (e.g., name, address, or Social Security number) and incorporate sound security design principles, they suffer from a lack of formal modeling of inferences learnable from shared data. This report evaluates the extent to which current protection systems are capable of withstanding a range of re-identification methods, including genotype–phenotype inferences, location–visit patterns, family structures, and dictionary attacks. For a comparative re-identification analysis, the systems are mapped to a common formalism. Although there is variation in susceptibility, each system is deficient in its protection capacity. The author discovers patterns of protection failure and discusses several of the reasons why these systems are susceptible. The analyses and discussion within provide guideposts for the development of next-generation protection methods amenable to formal proofs. PMID:15492030

  1. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A service workflow is an aggregation of distributed services that fulfills specific functionalities. With the ever-increasing number of available services, methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.

  2. Methods for converging correlation energies within the dielectric matrix formalism

    NASA Astrophysics Data System (ADS)

    Dixit, Anant; Claudot, Julien; Gould, Tim; Lebègue, Sébastien; Rocca, Dario

    2018-03-01

    Within the dielectric matrix formalism, the random-phase approximation (RPA) and analogous methods that include exchange effects are promising approaches to overcome some of the limitations of traditional density functional theory approximations. RPA-type methods, however, have a significantly higher computational cost and, similarly to correlated quantum-chemical methods, are characterized by slow basis-set convergence. In this work we analyzed two different schemes to converge the correlation energy: one based on a more traditional complete-basis-set extrapolation, and one that converges energy differences by accounting for the size-consistency property. These two approaches have been systematically tested on the A24 test set, for six points on the potential-energy surface of the methane-formaldehyde complex, and for reaction energies involving the breaking and formation of covalent bonds. While both methods converge to similar results at similar rates, the computation of size-consistent energy differences has the advantage of not relying on the choice of a specific extrapolation model.
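
    The "more traditional complete-basis-set extrapolation" is commonly the two-point inverse-cubic form for correlation energies. A minimal sketch, assuming that form and hypothetical energies:

    ```python
    def cbs_two_point(e_x: float, e_y: float, x: int, y: int) -> float:
        """Two-point CBS extrapolation assuming E_corr(X) = E_CBS + A * X**-3."""
        return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

    # Hypothetical RPA correlation energies (hartree) in quadruple- and triple-zeta bases:
    print(cbs_two_point(-0.3105, -0.3042, 4, 3))  # extrapolated E_CBS ~ -0.3151
    ```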

  3. Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions

    NASA Astrophysics Data System (ADS)

    Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus

    2017-10-01

    We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.
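
    For context, the conventional maximum-entropy functional for a positive spectral function A(ω) with default model D(ω) is shown below; the paper's contribution is precisely to generalize the entropy term S to spectra that may be negative (off-diagonal elements):

    ```latex
    % Standard MaxEnt objective balancing data misfit against entropy relative
    % to the default model D; alpha is the regularization parameter.
    \begin{equation}
      Q[A] \;=\; \tfrac{1}{2}\,\chi^{2}[A] \;-\; \alpha\, S[A],
      \qquad
      S[A] \;=\; \int d\omega \left[ A(\omega) - D(\omega)
          - A(\omega)\,\ln\frac{A(\omega)}{D(\omega)} \right].
    \end{equation}
    ```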

  4. Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems

    NASA Astrophysics Data System (ADS)

    Igaki, Hiroshi; Nakamura, Masahide

    This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
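
    A minimal sketch of the appliance-interaction check described above, with hypothetical services and a hand-written conflict table (the paper's framework also covers indirect interactions via the environment):

    ```python
    # Each service is a sequence of (appliance, method) invocations.
    services = {
        "movie_mode":   [("tv", "power_on"), ("curtain", "close"), ("light", "off")],
        "air_cleaning": [("window", "open"), ("curtain", "open"), ("fan", "on")],
    }

    # Pairs of methods that conflict when invoked on the same appliance.
    conflicts = {("curtain", frozenset({"open", "close"}))}

    def appliance_interactions(svc_a: str, svc_b: str):
        """Offline detection: flag conflicting invocations on a shared appliance."""
        return [
            (appl_a, m_a, m_b)
            for appl_a, m_a in services[svc_a]
            for appl_b, m_b in services[svc_b]
            if appl_a == appl_b and (appl_a, frozenset({m_a, m_b})) in conflicts
        ]

    print(appliance_interactions("movie_mode", "air_cleaning"))
    # [('curtain', 'close', 'open')]
    ```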

  5. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem-proving assistant.

  6. ``Dressing'' lines and vertices in calculations of matrix elements with the coupled-cluster method and determination of Cs atomic properties

    NASA Astrophysics Data System (ADS)

    Derevianko, Andrei; Porsev, Sergey G.

    2005-03-01

    We consider the evaluation of matrix elements with the coupled-cluster method. Such calculations formally involve an infinite number of terms, and we devise a method of partial summation (dressing) of the resulting series. Our formalism is built upon an expansion of the product C†C of cluster amplitudes C into a sum of n-body insertions. We consider two types of insertions: particle (hole) line insertions and two-particle (two-hole) random-phase-approximation-like insertions. We demonstrate how to “dress” these insertions and formulate iterative equations. We illustrate the dressing equations in the case when the cluster operator is truncated at single and double excitations. Using univalent systems as an example, we upgrade coupled-cluster diagrams for matrix elements with the dressed insertions and highlight a relation to pertinent fourth-order diagrams. We illustrate our formalism with relativistic calculations of the hyperfine constant A(6s) and the 6s1/2-6p1/2 electric-dipole transition amplitude for the Cs atom. Finally, we augment the truncated coupled-cluster calculations with otherwise omitted fourth-order diagrams. The resulting analysis for Cs is complete through the fourth order of many-body perturbation theory and reveals an important role of triple and disconnected quadruple excitations.

  7. Ensemble Kalman filtering in presence of inequality constraints

    NASA Astrophysics Data System (ADS)

    van Leeuwen, P. J.

    2009-04-01

    Kalman filtering in the presence of constraints is an active area of research. Based on the Gaussian assumption for the probability-density functions, it looks hard to bring extra constraints into the formalism. On the other hand, in geophysical systems we often encounter constraints related to, e.g., the underlying physics or chemistry, which are violated by the Gaussian assumption. For instance, concentrations are always non-negative, model layers have non-negative thickness, and sea-ice concentration is between 0 and 1. Several methods to bring inequality constraints into the Kalman-filter formalism have been proposed. One of them is probability-density-function (pdf) truncation, in which the Gaussian mass from the non-allowed part of the variables is just equally distributed over the part of the pdf where the variables are allowed, as proposed by Shimada et al. 1998. However, a problem with this method is that the probability that, e.g., the sea-ice concentration is zero, is zero! The new method proposed here does not have this drawback. It assumes that the probability-density function is a truncated Gaussian, but the truncated mass is not distributed equally over all allowed values of the variables; instead it is put into a delta distribution at the truncation point. This delta distribution can easily be handled within Bayes' theorem, leading to posterior probability-density functions that are also truncated Gaussians with delta distributions at the truncation location. In this way a much better representation of the system is obtained, while still keeping most of the benefits of the Kalman-filter formalism. For the full Kalman filter the formalism is prohibitively expensive in large-scale systems, but efficient implementation is possible in ensemble variants of the Kalman filter. Applications to low-dimensional systems and large-scale systems will be discussed.
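
    An ensemble analogue of the idea is easy to sketch: after a standard stochastic EnKF update, members violating a bound are moved onto it, which places a point mass exactly at the truncation point, loosely mirroring the delta-distribution construction. This toy scalar example is an illustration, not the paper's formalism:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Prior ensemble for a bounded quantity, e.g., sea-ice concentration in [0, 1].
    ens = rng.normal(0.15, 0.2, size=1000)
    obs, obs_var = 0.05, 0.05**2

    prior_var = ens.var()
    gain = prior_var / (prior_var + obs_var)  # scalar Kalman gain
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=ens.size)
    ens = ens + gain * (perturbed_obs - ens)  # stochastic EnKF update

    ens = np.clip(ens, 0.0, 1.0)              # probability mass piles up at the bound
    print("mean:", ens.mean(), " P(x = 0):", np.mean(ens == 0.0))
    ```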

  8. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: Application to a smoking cessation trial

    PubMed Central

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M.; Hedeker, Donald

    2014-01-01

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables that formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software. PMID:24634315
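
    The idea can be sketched for a binary variable by drawing, for each imputation, a log-odds offset delta from a subjective prior (delta = 0 recovering missing-at-random). Everything below is an illustrative assumption, not the trial's actual model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    p_hat = 0.30                  # success probability fitted to respondents
    n_missing, n_imputations = 50, 20
    delta_prior_sd = 0.5          # subjective nonresponse (MNAR) uncertainty, log-odds

    completed = []
    for _ in range(n_imputations):
        delta = rng.normal(0.0, delta_prior_sd)       # draw an imputation model
        logit = np.log(p_hat / (1 - p_hat)) + delta   # shift log-odds for nonrespondents
        completed.append(rng.binomial(1, 1 / (1 + np.exp(-logit)), size=n_missing))

    rates = [c.mean() for c in completed]
    print("imputed success rates span:", min(rates), "-", max(rates))
    ```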

  9. Spectral analysis of the UFBG-based acousto-optical modulator in V-I transmission matrix formalism

    NASA Astrophysics Data System (ADS)

    Wu, Liang-Ying; Pei, Li; Liu, Chao; Wang, Yi-Qun; Weng, Si-Jun; Wang, Jian-Shuai

    2014-11-01

    In this study, the V-I transmission matrix formalism (V-I method) is proposed to analyze the spectral characteristics of uniform fiber Bragg grating (FBG)-based acousto-optic modulators (UFBG-AOM). The simulation results demonstrate that both the amplitude of the acoustically induced strain and the frequency of the acoustic wave (AW) have an effect on the spectrum. Additionally, the wavelength spacing between the primary reflectivity peak and the secondary reflectivity peak is proportional to the acoustic frequency, with a ratio of 0.1425 nm/MHz. We also compare the computational cost: for an FBG with M periods, the V-I method requires 4 × (2M − 1) additions/subtractions, 8 × (2M − 1) multiplications/divisions and 2M exponent evaluations, almost a quarter of the cost of the multi-film and transfer matrix (TM) methods. The detailed analysis indicates that, compared with the conventional multi-film and transfer matrix (TM) methods, the V-I method is faster and less complex.

  10. Examining Variation in Mental Models of Influence and Leadership Among Nursing Leaders and Direct Care Nurses.

    PubMed

    Weaver, Sallie J; Mossburg, Sarah E; Pillari, MarieSarah; Kent, Paula S; Daugherty Biddison, Elizabeth Lee

    This study explored similarities and differences in the views on team membership and leadership held by nurses in formal unit leadership positions and direct care nurses. We used a mixed-methods approach and a maximum variance sampling strategy, sampling from units with both high and low safety behaviors and safety culture scores. We identified several key differences in mental models of care team membership and leadership between formal leaders and direct care nurses that warrant further exploration.

  11. Synthesis of Imidazopyridines via Copper-Catalyzed, Formal Aza-[3 + 2] Cycloaddition Reaction of Pyridine Derivatives with α-Diazo Oxime Ethers.

    PubMed

    Park, Sangjune; Kim, Hyunseok; Son, Jeong-Yu; Um, Kyusik; Lee, Sooho; Baek, Yonghyeon; Seo, Boram; Lee, Phil Ho

    2017-10-06

    The Cu-catalyzed, formal aza-[3 + 2] cycloaddition reaction of pyridine derivatives with α-diazo oxime ethers in trifluoroethanol was used to synthesize imidazopyridines via the release of molecular nitrogen and elimination of alcohol. These methods enabled modular synthesis of a wide range of N-heterobicyclic compounds such as imidazopyridazines, imidazopyrimidines, and imidazopyrazines with an α-imino Cu-carbenoid generated from the α-diazo oxime ethers and copper.

  12. Crisis leadership in an acute clinical setting: Christchurch Hospital, New Zealand ICU experience following the February 2011 earthquake.

    PubMed

    Zhuravsky, Lev

    2015-04-01

    On Tuesday, February 22, 2011, a 6.3 magnitude earthquake struck Christchurch, New Zealand. This qualitative study explored intensive care unit (ICU) staff experiences and the leadership approaches adopted to manage a large-scale crisis resulting from the city-wide disaster. To date, there have been very few research publications providing a comprehensive overview of crisis leadership from the perspective of multi-level interactions among staff members in the acute clinical environment during the process of crisis management. The research was qualitative in nature. Participants were recruited into the study through purposive sampling. A semi-structured, audio-taped, personal interview method was chosen as the single data collection method for this study. This study employed thematic analysis. Formal team leadership refers to the actions undertaken by a team leader to ensure the needs and goals of the team are met. Three core formal crisis-leadership themes were identified: decision making, ability to remain calm, and effective communication. Informal leaders are those individuals who exert significant influence over other members of the group to which they belong, although no formal authority has been assigned to them. Four core informal crisis-leadership themes were identified: motivation to lead, autonomy, emotional leadership, and crisis as opportunity. Shared leadership is a dynamic process among individuals in groups whose objective is to lead one another to the achievement of group or organizational goals. Two core shared-leadership themes were identified: shared leadership within formal medical and nursing leadership groups, and shared leadership between formal and informal leaders in the ICU. The capabilities of formal leaders all contributed to the overall management of the crisis. Informal leaders are a very cohesive group of motivated people who can make a substantial contribution and improve overall team performance in a crisis. While in many ways the research on shared leadership in a crisis is still in its early stages of development, there are some clear benefits from adopting this leadership approach in the management of complex crises. This study may be useful for the development of competency-based training programs for formal leaders and process improvements in fostering and supporting informal leaders, and it makes important contributions to a growing body of research on shared and collective leadership in crisis.

  13. Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B

    NASA Technical Reports Server (NTRS)

    Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi

    2010-01-01

    Recently a set of guidelines, or cookbook, has been developed for the modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining the states of a system and the events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of this cookbook by following it throughout the formal modelling of the cruise control system found in cars. The outcomes identify the benefits of the cookbook and also give guidance to its future users.

  14. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability features, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be taken as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.

  15. Adiabatic invariance with first integrals of motion

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.

    2002-10-01

    The construction of a microthermodynamic formalism for isolated systems based on the concept of adiabatic invariance is an old but seldom appreciated effort in the literature, dating back at least to P. Hertz [Ann. Phys. (Leipzig) 33, 225 (1910)]. An apparently independent extension of such formalism for systems bearing additional first integrals of motion was recently proposed by Hans H. Rugh [Phys. Rev. E 64, 055101 (2001)], establishing the concept of adiabatic invariance even in such singular cases. After some remarks in connection with the formalism pioneered by Hertz, it will be suggested that such an extension can incidentally explain the success of a dynamical method for computing the entropy of classical interacting fluids, at least in some potential applications where the presence of additional first integrals cannot be ignored.

  16. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    PubMed

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_{e}>3000 electrons.

  17. Revisiting the radio interferometer measurement equation. I. A full-sky Jones formalism

    NASA Astrophysics Data System (ADS)

    Smirnov, O. M.

    2011-03-01

    Context. Since its formulation by Hamaker et al., the radio interferometer measurement equation (RIME) has provided a rigorous mathematical basis for the development of novel calibration methods and techniques, including various approaches to the problem of direction-dependent effects (DDEs). However, acceptance of the RIME in the radio astronomical community at large has been slow, which is partially due to the limited availability of software to exploit its power, and the sparsity of practical results. This needs to change urgently. Aims: This series of papers aims to place recent developments in the treatment of DDEs into one RIME-based mathematical framework, and to demonstrate the ease with which the various effects can be described and understood. It also aims to show the benefits of a RIME-based approach to calibration. Methods: Paper I re-derives the RIME from first principles, extends the formalism to the full-sky case, and incorporates DDEs. Paper II then uses the formalism to describe self-calibration, both with a full RIME, and with the approximate equations of older software packages, and shows how this is affected by DDEs. It also gives an overview of real-life DDEs and proposed methods of dealing with them. Finally, in Paper III some of these methods are exercised to achieve an extremely high-dynamic range calibration of WSRT observations of 3C 147 at 21 cm, with full treatment of DDEs. Results: The RIME formalism is extended to the full-sky case (Paper I), and is shown to be an elegant way of describing calibration and DDEs (Paper II). Applying this to WSRT data (Paper III) results in a noise-limited image of the field around 3C 147 with a very high dynamic range (1.6 million), and none of the off-axis artifacts that plague regular selfcal. The resulting differential gain solutions contain significant information on DDEs and errors in the sky model. Conclusions: The RIME is a powerful formalism for describing radio interferometry, and underpins the development of novel calibration methods, in particular those dealing with DDEs. One of these is the differential gains approach used for the 3C 147 reduction. Differential gains can eliminate DDE-related artifacts, and provide information for iterative improvements of sky models. Perhaps most importantly, sources as faint as 2 mJy have been shown to yield meaningful differential gain solutions, and thus can be used as potential calibration beacons in other DDE-related schemes.
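
    For reference, the basic RIME for a baseline pq with a single discrete source, and its full-sky generalization as a sum over sources s, take the following form in the Jones notation discussed above:

    ```latex
    % J_p, J_q are the Jones matrices of antennas p and q, B (B_s) the source
    % brightness, and H denotes the conjugate transpose.
    \begin{equation}
      V_{pq} \;=\; J_{p}\, B\, J_{q}^{H},
      \qquad
      V_{pq} \;=\; \sum_{s} J_{sp}\, B_{s}\, J_{sq}^{H}.
    \end{equation}
    ```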

  18. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate end goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  19. On the Use of Low-Cost Radar Networks for Collision Warning Systems Aboard Dumpers

    PubMed Central

    González-Partida, José-Tomás; León-Infante, Francisco; Blázquez-García, Rodrigo; Burgos-García, Mateo

    2014-01-01

    The use of dumpers is one of the main causes of accidents on construction sites, many of them with fatal consequences. These kinds of work machines have many blind angles that complicate the driving task due to their large size and volume. To guarantee safe conditions, it is necessary to use automatic aid systems that can detect and locate the different objects and people in a work area. One promising solution is a radar network based on low-cost radar transceivers aboard the dumper. The complete system is specified to operate with a very low false alarm rate to avoid unnecessary stops of the dumper that reduce its productivity. The main sources of false alarm are the heavy ground clutter and the interference between the radars of the network. This article analyses the clutter for LFM signaling and proposes the use of Offset Linear Frequency Modulated Continuous Wave (OLFM-CW) as the radar signal. This kind of waveform can be optimized to reject clutter and self-interference. Jointly, a data fusion chain can be used to reduce the false alarm rate of the complete radar network. A real experiment is shown to demonstrate the feasibility of the proposed system. PMID:24577521
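
    A baseband LFM sweep of the kind analysed here, together with a per-radar frequency offset sketching the OLFM idea, can be generated in a few lines; the parameters are illustrative, not those of the paper's radar:

    ```python
    import numpy as np

    fs = 10e6   # sample rate (Hz)
    T = 1e-3    # sweep duration (s)
    B = 2e6     # swept bandwidth (Hz)
    k = B / T   # chirp rate (Hz/s)

    t = np.arange(0, T, 1 / fs)
    chirp = np.exp(1j * np.pi * k * t**2)  # complex baseband LFM sweep

    # Offsetting each radar's sweep in frequency (the OLFM idea) lets units of
    # the network reject one another's transmissions after deramping.
    f_offset = 250e3
    olfm = chirp * np.exp(2j * np.pi * f_offset * t)
    print(chirp.size, np.allclose(np.abs(olfm), 1.0))  # 10000 True
    ```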

  20. Plasma sheet low-entropy flow channels and dipolarization fronts from macro to micro scales: Global MHD and PIC simulations

    NASA Astrophysics Data System (ADS)

    Merkin, V. G.; Wiltberger, M. J.; Sitnov, M. I.; Lyon, J.

    2016-12-01

    Observations show that much of plasma and magnetic flux transport in the magnetotail occurs in the form of discrete activations such as bursty bulk flows (BBFs). These flow structures are typically associated with strong peaks of the Z-component of the magnetic field normal to the magnetotail current sheet (dipolarization fronts, DFs), as well as density and flux tube entropy depletions also called plasma bubbles. Extensive observational analysis of these structures has been carried out using data from Geotail spacecraft and more recently from Cluster, THEMIS, and MMS multi-probe missions. Global magnetohydrodynamic (MHD) simulations of the magnetosphere reveal similar plasma sheet flow bursts, in agreement with regional MHD and particle-in-cell (PIC) models. We present results of high-resolution simulations using the Lyon-Fedder-Mobarry (LFM) global MHD model and analyze the properties of the bursty flows including their structure and evolution as they propagate from the mid-tail region into the inner magnetosphere. We highlight similarities and differences with the corresponding observations and discuss comparative properties of plasma bubbles and DFs in our global MHD simulations with their counterparts in 3D PIC simulations.

  1. On the use of low-cost radar networks for collision warning systems aboard dumpers.

    PubMed

    González-Partida, José-Tomás; León-Infante, Francisco; Blázquez-García, Rodrigo; Burgos-García, Mateo

    2014-02-26

    The use of dumpers is one of the main causes of accidents on construction sites, many of them with fatal consequences. These kinds of work machines have many blind angles that complicate the driving task due to their large size and volume. To guarantee safe conditions, it is necessary to use automatic aid systems that can detect and locate the different objects and people in a work area. One promising solution is a radar network based on low-cost radar transceivers aboard the dumper. The complete system is specified to operate with a very low false alarm rate to avoid unnecessary stops of the dumper that reduce its productivity. The main sources of false alarm are the heavy ground clutter and the interference between the radars of the network. This article analyses the clutter for LFM signaling and proposes the use of Offset Linear Frequency Modulated Continuous Wave (OLFM-CW) as the radar signal. This kind of waveform can be optimized to reject clutter and self-interference. Jointly, a data fusion chain can be used to reduce the false alarm rate of the complete radar network. A real experiment is shown to demonstrate the feasibility of the proposed system.

  2. A modified fluid percussion device.

    PubMed

    Yamaki, T; Murakami, N; Iwamoto, Y; Yoshino, E; Nakagawa, Y; Ueda, S; Horikawa, J; Tsujii, T

    1994-10-01

    This report examines a modified fluid percussion device with specific improvements made to address deficiencies found in previously reported devices. These improvements include the use of a cylindrical saline reservoir made of stainless steel; placement of the reservoir in a 15-degree head-up position for the easy release of air bubbles; placement of the fluid flushing outlet and the pressure transducer close to the piston on the same plane, with both perpendicular to the direction of the piston; an adjustable reservoir volume to vary the waveform of the pressure pulse; and a metallic central injury screw secured to the animal's skull over the exposed dura. Using this device, midline fluid percussion (MFP) and lateral fluid percussion (LFP) injuries were performed in 70 rats. Histopathologic findings included diffuse axonal injury in the MFP model and cortical contusion in the LFP model. Survival rate was 41.4% in MFP animals and 100% in LFP animals when the device settings were 178 mm3 for the cylindrical reservoir and 50-60 degrees for the height of the pendulum. Our results suggest that this modified fluid percussion device may offer significant improvements over previously reported fluid percussion models for use in experimental head injury.

  3. Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models

    NASA Astrophysics Data System (ADS)

    Rigler, E. J.; Wiltberger, M. J.; Love, J. J.

    2017-12-01

    Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.

  4. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  5. Formalization of the engineering science discipline - knowledge engineering

    NASA Astrophysics Data System (ADS)

    Peng, Xiao

    Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in such a way that the next generation of engineers could not benefit from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion needs to be developed which is capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach to facilitating knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.

  6. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.

  7. Visualizing Matrix Multiplication

    ERIC Educational Resources Information Center

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
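
    The abstract is truncated, so the following is only a generic illustration of the block-matrix idea it mentions: partition both factors conformably, and the block products follow the same row-by-column pattern as scalar entries.

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.integers(0, 5, (4, 4))
        B = rng.integers(0, 5, (4, 4))

        # Partition each matrix into four 2x2 blocks.
        A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
        B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

        # The block layout of C mirrors ordinary matrix multiplication.
        C = np.block([
            [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
            [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
        ])

        assert np.array_equal(C, A @ B)  # agrees with the full product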

  8. Development and Evaluation of an Ontology for Guiding Appropriate Antibiotic Prescribing

    PubMed Central

    Furuya, E. Yoko; Kuperman, Gilad J.; Cimino, James J.; Bakken, Suzanne

    2011-01-01

    Objectives To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. Methods We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. Results The ontology includes 199 classes, 10 properties, and 1,636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: 1) antibiotic-microorganism mismatch alert; 2) medication-allergy alert; and 3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. Conclusions This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component—a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. PMID:22019377
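
    The paper's rules are SWRL expressions over an OWL ontology; as a rough, hypothetical analogue only (the coverage facts below are invented for illustration and are not clinical guidance), the antibiotic-microorganism mismatch alert can be pictured as:

        # Toy stand-in for an ontology-driven prescribing rule: alert when the
        # prescribed antibiotic is not asserted to cover the cultured organism.
        COVERS = {
            "vancomycin": {"MRSA", "Staphylococcus aureus"},
            "ceftriaxone": {"Escherichia coli", "Streptococcus pneumoniae"},
        }

        def mismatch_alert(antibiotic, organism):
            covered = COVERS.get(antibiotic, set())
            if organism not in covered:
                return f"ALERT: {antibiotic} is not expected to cover {organism}"
            return None

        print(mismatch_alert("ceftriaxone", "MRSA"))  # fires an alert

    In the actual system the coverage relations live in the ontology itself, so maintaining the knowledge means editing classes and restrictions rather than code.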

  9. Spin formalism and applications to new physics searches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, H.E.

    1994-12-01

    An introduction to spin techniques in particle physics is given. Among the topics covered are: the helicity formalism and its applications to the decay and scattering of spin-1/2 and spin-1 particles, techniques for evaluating helicity amplitudes (including projection operator methods and the spinor helicity method), and density matrix techniques. The utility of polarization and spin correlations for untangling new physics beyond the Standard Model at future colliders such as the LHC and a high-energy e+e- linear collider is then considered. A number of detailed examples are explored, including the search for low-energy supersymmetry, a non-minimal Higgs boson sector, and new gauge bosons beyond the W± and Z.
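
    As a pointer to the density-matrix techniques mentioned, the polarization state of a spin-1/2 ensemble is conventionally written (standard textbook material, not specific to this report) as

        \rho = \tfrac{1}{2}\left( \mathbb{1} + \vec{P}\cdot\vec{\sigma} \right),
        \qquad \vec{P} = \mathrm{Tr}\,(\rho\,\vec{\sigma}), \qquad |\vec{P}| \le 1,

    where σ denotes the Pauli matrices and P the polarization vector; spin correlations between two final-state particles generalize this to a joint density matrix.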

  10. "Debate" Learning Method and Its Implications for the Formal Education System

    ERIC Educational Resources Information Center

    Najafi, Mohammad; Motaghi, Zohre; Nasrabadi, Hassanali Bakhtiyar; Heshi, Kamal Nosrati

    2016-01-01

    Regarding the importance of enhancement in learner's social skills, especially in learning process, this study tries to introduce one of the group learning programs entitled "debate" as a teaching method in Iran religious universities. It also considers the concept and the history of this method by qualitative and descriptive-analytical…

  11. Contrasting lexical similarity and formal definitions in SNOMED CT: consistency and implications.

    PubMed

    Agrawal, Ankur; Elhanan, Gai

    2014-02-01

    To quantify the presence of, and evaluate an approach for detecting, inconsistencies in the formal definitions of SNOMED CT (SCT) concepts utilizing a lexical method. Utilizing SCT's Procedure hierarchy, we algorithmically formulated similarity sets: groups of concepts with a similar lexical structure of their fully specified names. We formulated five random samples, each with 50 similarity sets, based on a single parameter each: number of parents, attributes, groups, all of the former combined, as well as a randomly selected control sample. All samples' sets were reviewed for types of formal definition inconsistencies: hierarchical, attribute assignment, attribute target values, groups, and definitional. For the Procedure hierarchy, 2,111 similarity sets were formulated, covering 18.1% of eligible concepts. The evaluation revealed that 38% (control) to 70% (different relationships) of similarity sets within the samples exhibited significant inconsistencies. The rate of inconsistencies for the sample with different relationships was highly significant compared with the control, as were the numbers of attribute assignment and hierarchical inconsistencies within their respective samples. While, at this time of the HITECH initiative, the formal definitions of SCT are only a minor consideration, they are essential in the grand scheme of sophisticated, meaningful use of captured clinical data. However, a significant portion of the concepts in the most semantically complex hierarchy of SCT, the Procedure hierarchy, are modeled inconsistently in a manner that affects their computability. Lexical methods can efficiently identify such inconsistencies and possibly allow for their algorithmic resolution. Copyright © 2013 Elsevier Inc. All rights reserved.
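
    The abstract does not spell out the grouping algorithm; one plausible sketch of "similarity sets" is to bucket fully specified names whose token sequences differ in exactly one position (the names below are invented examples, not actual SCT content).

        from collections import defaultdict
        import re

        def tokens(fsn):
            return re.findall(r"[a-z0-9]+", fsn.lower())

        def similarity_sets(names):
            """Group names whose token sequences differ in exactly one slot."""
            buckets = defaultdict(set)
            for name in names:
                toks = tokens(name)
                for i in range(len(toks)):
                    key = (i, tuple(toks[:i]), tuple(toks[i + 1:]))
                    buckets[key].add(name)
            return [group for group in buckets.values() if len(group) > 1]

        concepts = [
            "Biopsy of lung (procedure)",
            "Biopsy of liver (procedure)",
            "Excision of liver (procedure)",
        ]
        for group in similarity_sets(concepts):
            print(sorted(group))

    Concepts sharing such a lexical frame should usually share a modeling pattern too, which is why divergent formal definitions inside one set are a useful inconsistency signal.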

  12. ESD and lifelong learning: a case study of the Shangri-la Institute's current engagement with the Bazhu community in Diqing, China

    NASA Astrophysics Data System (ADS)

    Liu, Yunhua; Constable, Alicia

    2010-06-01

    This article argues that ESD should be integrated into lifelong learning and provides an example of how this might be done. It draws on a case study of a joint project between the Shangri-la Institute and the Bazhu community in Diqing, southwest China, to analyse a community-based approach to Education for Sustainable Development and assess its implications for lifelong learning. The article examines the different knowledge, skills and values needed for ESD across the life span and asserts the need for these competencies to be informed by the local context. The importance of linking ESD with local culture and indigenous knowledge is emphasised. The article goes on to propose methods for integrating ESD into lifelong learning and underscore the need for learning at the individual, institutional and societal levels in formal, non-formal and informal learning settings. It calls for institutional changes that link formal, non-formal and informal learning through the common theme of ESD, and establish platforms to share experiences, reflect on these and thereby continually improve ESD.

  13. BRST Quantization of the Proca Model Based on the BFT and the BFV Formalism

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Wan; Park, Mu-In; Park, Young-Jai; Yoon, Sean J.

    The BRST quantization of the Abelian Proca model is performed using the Batalin-Fradkin-Tyutin (BFT) and Batalin-Fradkin-Vilkovisky (BFV) formalisms. First, the BFT Hamiltonian method is applied in order to systematically convert the second-class constraint system of the model into an effectively first-class one by introducing new fields. In finding the involutive Hamiltonian we adopt a new approach that is simpler than the usual one. We also show that in our model, owing to the linear character of the constraints, the Dirac brackets of the phase-space variables in the original second-class constraint system are exactly the same as the Poisson brackets of the corresponding modified fields in the extended phase space, in agreement with the Dirac and Faddeev-Jackiw formalisms. Then, following the BFV formalism, we show that the resulting Lagrangian, which preserves BRST symmetry under the standard local gauge-fixing procedure, naturally includes the Stückelberg scalar associated with the explicit breaking of gauge symmetry by the mass term. We also analyze the nonstandard nonlocal gauge-fixing procedure.
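
    For orientation, the Stückelberg scalar mentioned at the end restores gauge invariance to the massive Abelian theory; schematically (a standard result, not a reconstruction of the paper's derivation),

        \mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
          + \tfrac{1}{2} m^{2} \left( A_{\mu} - \tfrac{1}{m}\,\partial_{\mu}\theta \right)^{2},

    which is invariant under A_μ → A_μ + ∂_μΛ, θ → θ + mΛ, and reduces to the Proca Lagrangian in the gauge θ = 0.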

  14. Consumer experience of formal crisis-response services and preferred methods of crisis intervention.

    PubMed

    Boscarato, Kara; Lee, Stuart; Kroschel, Jon; Hollander, Yitzchak; Brennan, Alice; Warren, Narelle

    2014-08-01

    The manner in which people with mental illness are supported in a crisis is crucial to their recovery. The current study explored mental health consumers' experiences with formal crisis services (i.e. police and crisis assessment and treatment (CAT) teams), preferred crisis supports, and opinions of four collaborative interagency response models. Eleven consumers completed one-on-one, semistructured interviews. The results revealed that the perceived quality of previous formal crisis interventions varied greatly. Most participants preferred family members or friends to intervene. However, where a formal response was required, general practitioners and mental health case managers were preferred; no participant wanted a police response, and only one indicated a preference for CAT team assistance. Most participants welcomed collaborative crisis interventions. Of four collaborative interagency response models currently being trialled internationally, participants most strongly supported the Ride-Along Model, which enables a police officer and a mental health clinician to jointly respond to distressed consumers in the community. The findings highlight the potential for an interagency response model to deliver a crisis response aligned with consumers' preferences. © 2014 Australian College of Mental Health Nurses Inc.

  15. Adolescent Learning in the Zoo: Embedding a Non-Formal Learning Environment to Teach Formal Aspects of Vertebrate Biology

    NASA Astrophysics Data System (ADS)

    Randler, Christoph; Kummer, Barbara; Wilhelm, Christian

    2012-06-01

    The aim of this study was to assess the outcome of a zoo visit in terms of learning and retention of knowledge concerning the adaptations and behavior of vertebrate species. The basis of the work was the concept of implementing zoo visits as an out-of-school setting for formal, curriculum-based learning. Our theoretical framework centers on self-determination theory; therefore, we used a group-based, hands-on learning environment. To address these questions, we used a treatment-control design (BACI) with different treatments and a control group. Pre-, post- and retention tests were applied. All treatments led to a substantial increase in learned and retained knowledge compared to the control group. Immediately after the zoo visit, the zoo-guide tour produced the highest scores, while after a delay of 6 weeks, the learner-centered environment combined with a teacher-guided summary scored best. We suggest incorporating the zoo as an out-of-school environment into formal school learning, and we propose different methods to improve learning in zoo settings.

  16. A Comparative Framework for Studying the Histories of the Humanities and Science.

    PubMed

    Bod, Rens

    2015-06-01

    While the humanities and the sciences have a closely connected history, there are no general histories that bring the two fields together on an equal footing. This paper argues that there is a level at which some humanistic and scientific disciplines can be brought under a common denominator and compared: the level of underlying methods, especially the formalisms and rule systems used by different disciplines. The essay formally compares linguistics and computer science by noting that the same grammar formalism was used in the 1950s for describing both human and programming languages. Additionally, it examines the influence of philology on molecular biology, and vice versa, by recognizing that the tree formalism and rule system used for text reconstruction were also employed in DNA genetics. It also shows that rule systems for source criticism in history are used in forensic science, evidence-based medicine, and jurisprudence. This paper thus opens up a new comparative approach within which the histories of the humanities and the sciences can be examined on a common level.

  17. The Effective-One-Body Approach to the General Relativistic Two Body Problem

    NASA Astrophysics Data System (ADS)

    Damour, Thibault; Nagar, Alessandro

    The two-body problem in General Relativity has been the subject of many analytical investigations. After reviewing some of the methods used to tackle this problem (and, more generally, the N-body problem), we focus on a new, recently introduced approach to the motion and radiation of (comparable mass) binary systems: the Effective One Body (EOB) formalism. We review the basic elements of this formalism, and discuss some of its recent developments. Several recent comparisons between EOB predictions and Numerical Relativity (NR) simulations have shown the aptitude of the EOB formalism to provide accurate descriptions of the dynamics and radiation of various binary systems (comprising black holes or neutron stars) in regimes that are inaccessible to other analytical approaches (such as the last orbits and the merger of comparable mass black holes). In synergy with NR simulations, post-Newtonian (PN) theory and Gravitational Self-Force (GSF) computations, the EOB formalism is likely to provide an efficient way of computing the very many accurate template waveforms that are needed for Gravitational Wave (GW) data analysis purposes.
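
    The central ingredient of the formalism, quoted here only for orientation (it is standard in the EOB literature), is the energy map between the real two-body system and an effective one-body Hamiltonian:

        H_{\rm EOB} = M \sqrt{ 1 + 2\nu \left( \hat{H}_{\rm eff} - 1 \right) },
        \qquad M = m_1 + m_2, \qquad \nu = \frac{m_1 m_2}{M^{2}},

    where Ĥ_eff is the (dimensionless) Hamiltonian of an effective particle of mass μ = νM moving in a ν-deformed Schwarzschild-like metric.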

  18. R-matrix theory, formal Casimirs and the periodic Toda lattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morosi, C.; Pizzocchero, L.

    The nonunitary r-matrix theory and the associated bi- and tri-Hamiltonian schemes are considered. The language of Poisson pencils and of their formal Casimirs is applied in this framework to characterize the bi-Hamiltonian chains of integrals of motion, pointing out the role of the Schur polynomials in these constructions. This formalism is subsequently applied to the periodic Toda lattice. Some different algebraic settings and Lax formulations proposed in the literature for this system are analyzed in detail, and their full equivalence is exploited. In particular, the equivalence between the loop-algebra approach and the method of differential-difference operators is illustrated; moreover, two alternative Lax formulations are considered, and appropriate reduction algorithms are found in both cases, allowing us to derive the multi-Hamiltonian formalism from r-matrix theory. The systems of integrals for the periodic Toda lattice known after Flaschka and Hénon, and their functional relations, are recovered through systematic application of the previously outlined schemes. © 1996 American Institute of Physics.
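
    For reference, the periodic Toda lattice in question is the N-particle system with exponential nearest-neighbour coupling (standard definition):

        H = \sum_{i=1}^{N} \frac{p_i^{2}}{2} + \sum_{i=1}^{N} e^{\,q_i - q_{i+1}},
        \qquad q_{N+1} \equiv q_1 ,

    whose integrability is what the Lax and r-matrix constructions above encode.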

  19. Calculation of Scattering Amplitude Without Partial Wave Analysis. II; Inclusion of Exchange

    NASA Technical Reports Server (NTRS)

    Temkin, Aaron; Shertzer, J.; Fisher, Richard R. (Technical Monitor)

    2002-01-01

    A method was previously developed for calculating the whole scattering amplitude, f(Ω_k), directly. The idea was to calculate the complete wave function Ψ numerically and use it in an integral expression for f, which can be reduced to a two-dimensional quadrature. The original application was to e-H scattering without exchange, where the Schrödinger equation reduces to a 2-d partial differential equation (PDE), which was solved using the finite element method (FEM). Here we extend the method to the exchange approximation. The Schrödinger equation can be reduced to a pair of coupled PDEs, which are again solved by the FEM. The formal expression for f(Ω_k) consists of two integrals, f± = f_d ± f_e; f_d is formally the same integral as the no-exchange f. We have also succeeded in reducing f_e to a 2-d integral. Results will be presented at the meeting.

  20. Quantum crystallography: A perspective.

    PubMed

    Massa, Lou; Matta, Chérif F

    2018-06-30

    Extraction of the complete quantum mechanics from X-ray scattering data is the ultimate goal of quantum crystallography. This article delivers a perspective on that possibility. It is desirable to have a method for the conversion of X-ray diffraction data into an electron density that reflects the antisymmetry of an N-electron wave function. A formalism for this was developed early on for the determination of a constrained idempotent one-body density matrix. The formalism ensures pure-state N-representability in the single-determinant sense. Applications to crystals show that quantum mechanical density matrices of large molecules can be extracted from X-ray scattering data by implementing a fragmentation method termed the kernel energy method (KEM). It is shown how KEM can be used within the context of quantum crystallography to derive quantum mechanical properties of biological molecules (with low data-to-parameter ratios). © 2017 Wiley Periodicals, Inc.

  1. Cosmological perturbation theory using the FFTLog: formalism and connection to QFT loop integrals

    NASA Astrophysics Data System (ADS)

    Simonović, Marko; Baldauf, Tobias; Zaldarriaga, Matias; Carrasco, John Joseph; Kollmeier, Juna A.

    2018-04-01

    We present a new method for calculating loops in cosmological perturbation theory. This method is based on approximating a ΛCDM-like cosmology as a finite sum of complex power-law universes. The decomposition is naturally achieved using an FFTLog algorithm. For power-law cosmologies, all loop integrals are formally equivalent to loop integrals of massless quantum field theory. These integrals have analytic solutions in terms of generalized hypergeometric functions. We provide explicit formulae for the one-loop and the two-loop power spectrum and the one-loop bispectrum. A chief advantage of our approach is that the difficult part of the calculation is cosmology independent, need be done only once, and can be recycled for any relevant predictions. Evaluation of standard loop diagrams then boils down to a simple matrix multiplication. We demonstrate the promise of this method for applications to higher multiplicity/loop correlation functions.
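
    Schematically (our paraphrase of the FFTLog idea, not the paper's exact formulae), the linear power spectrum is expanded in complex power laws,

        P_{\rm lin}(k) \;\approx\; \sum_{m} c_m \, k^{\nu + i\eta_m},

    so every loop term reduces to integrals of the massless-QFT form

        \int \frac{d^{3}q}{(2\pi)^{3}} \,
          \frac{1}{q^{2\nu_1}\,|\vec{k}-\vec{q}\,|^{2\nu_2}}
          \;=\; k^{\,3-2\nu_1-2\nu_2} \, \mathsf{I}(\nu_1,\nu_2),

    with I(ν1, ν2) a closed-form ratio of Gamma functions; the cosmology dependence then sits entirely in the coefficients c_m.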

  2. On the Adequacy of Bayesian Evaluations of Categorization Models: Reply to Vanpaemel and Lee (2012)

    ERIC Educational Resources Information Center

    Wills, Andy J.; Pothos, Emmanuel M.

    2012-01-01

    Vanpaemel and Lee (2012) argued, and we agree, that the comparison of formal models can be facilitated by Bayesian methods. However, Bayesian methods neither precede nor supplant our proposals (Wills & Pothos, 2012), as Bayesian methods can be applied both to our proposals and to their polar opposites. Furthermore, the use of Bayesian methods to…

  3. A bibliography on formal methods for system specification, design and validation

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Movaghar, A.

    1982-01-01

    Literature on the specification, design, verification, testing, and evaluation of avionics systems was surveyed, providing 655 citations. Journal papers, conference papers, and technical reports are included. Manual and computer-based methods were employed. Keywords used in the online search are listed.

  4. Evaluation of light extraction efficiency for the light-emitting diodes based on the transfer matrix formalism and ray-tracing method

    NASA Astrophysics Data System (ADS)

    Pingbo, An; Li, Wang; Hongxi, Lu; Zhiguo, Yu; Lei, Liu; Xin, Xi; Lixia, Zhao; Junxi, Wang; Jinmin, Li

    2016-06-01

    The internal quantum efficiency (IQE) of light-emitting diodes can be calculated as the ratio of the external quantum efficiency (EQE) to the light extraction efficiency (LEE). The EQE can be measured experimentally, but the LEE is difficult to calculate due to complicated LED structures. In this work, a model was established to calculate the LEE by combining the transfer matrix formalism with an in-plane ray-tracing method. With the calculated LEE, the IQE was determined and found to be in good agreement with that obtained by the ABC model and the temperature-dependent photoluminescence method. The proposed method makes the determination of the IQE more practical and convenient. Project supported by the National Natural Science Foundation of China (Nos. 11574306, 61334009), the China International Science and Technology Cooperation Program (No. 2014DFG62280), and the National High Technology Program of China (No. 2015AA03A101).
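
    As a toy illustration only (a single planar interface with one-pass extraction, far cruder than the paper's combined transfer-matrix and ray-tracing model), the escape-cone contribution to the LEE can be estimated as:

        import numpy as np

        # Escape-cone estimate for a planar semiconductor/air interface.
        n1, n2 = 2.5, 1.0                      # semiconductor and air indices
        theta_c = np.arcsin(n2 / n1)           # total-internal-reflection limit

        theta = np.linspace(0.0, theta_c, 2000)          # rays inside the cone
        sin_t = np.clip(n1 * np.sin(theta) / n2, 0.0, 1.0)
        theta_t = np.arcsin(sin_t)                       # Snell's law
        # Fresnel power transmission, s-polarization.
        r_s = ((n1 * np.cos(theta) - n2 * np.cos(theta_t)) /
               (n1 * np.cos(theta) + n2 * np.cos(theta_t)))
        T_s = 1.0 - r_s ** 2

        # Average an isotropic source over the sphere; one face, single pass.
        d_theta = theta[1] - theta[0]
        lee = float(np.sum(T_s * np.sin(theta)) * d_theta / 2.0)
        print(f"single-face, single-pass LEE ~ {lee:.1%}")  # a few percent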

  5. A new self-shielding method based on a detailed cross-section representation in the resolved energy domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saygin, H.; Hebert, A.

    The calculation of a dilution cross section σ̄_e is the most important step in self-shielding formalisms based on the equivalence principle. If a dilution cross section that accurately characterizes the physical situation can be calculated, it can then be used for calculating the effective resonance integrals and obtaining accurate self-shielded cross sections. A new technique for the calculation of equivalent cross sections, based on the formalism of Riemann integration in the resolved energy domain, is proposed. This new method is compared to the generalized Stamm'ler method, which is also based on an equivalence principle, for a two-region cylindrical cell and for a small pressurized water reactor assembly in two dimensions. The accuracy of each computing approach is assessed using reference results obtained from a fine-group slowing-down code named CESCOL. It is shown that the proposed method leads to slightly better performance than the generalized Stamm'ler approach.
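
    For context, in equivalence-based self-shielding the resonant flux per absorber atom is commonly approximated in the narrow-resonance form (a generic textbook expression, not the authors' new formalism),

        \varphi(E) \;\approx\; \frac{\sigma_{p} + \bar{\sigma}_{e}}
          {E\left[\,\sigma_{t}(E) + \bar{\sigma}_{e}\,\right]},
        \qquad
        \sigma_{x,\mathrm{eff}} = \frac{\int \sigma_{x}(E)\,\varphi(E)\,dE}
          {\int \varphi(E)\,dE},

    where σ_t is the absorber total cross section, σ_p its potential cross section, and σ̄_e the dilution cross section standing in for moderator and geometry effects; the quality of σ̄_e thus controls the quality of the self-shielded cross sections.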

  6. Explicating formal epistemology: Carnap's legacy as Jeffrey's radical probabilism.

    PubMed

    French, Christopher F

    2015-10-01

    Quine's "naturalized epistemology" presents a challenge to Carnapian explication: why try to rationally reconstruct probabilistic concepts instead of just doing psychology? This paper tracks the historical development of Richard C. Jeffrey who, on the one hand, voiced worries similar to Quine's about Carnapian explication but, on the other hand, claims that his own work in formal epistemology—what he calls "radical probabilism"—is somehow continuous with both Carnap's method of explication and logical empiricism. By examining how Jeffrey's claim could possibly be accurate, the paper suggests that Jeffrey's radical probabilism can be seen as a sort of alternative explication project to Carnap's own inductive logic. In so doing, it deflates both Quine's worries about Carnapian explication and so also, by extension, similar worries about formal epistemology. Copyright © 2015 Elsevier Ltd. All rights reserved.
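
    The hallmark of radical probabilism is Jeffrey conditioning (probability kinematics), which generalizes strict Bayesian updating to uncertain evidence (standard formulation):

        P_{\rm new}(A) \;=\; \sum_{i} P_{\rm old}(A \mid E_i)\, P_{\rm new}(E_i),

    where {E_i} partitions the possibilities; strict conditionalization is recovered as the special case in which P_new(E_k) = 1 for some k.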

  7. Non-Formal education in astronomy: The experience of the University of Carabobo

    NASA Astrophysics Data System (ADS)

    Falcón, Nelson

    2011-06-01

    Since 1995, the University of Carabobo, in Venezuela, has been developing a program of astronomical popularization and astronomy learning using non-formal education methods. A synopsis of the activities is presented. We also discuss some conceptual aspects of the extension of knowledge as a function supplementary to research and university teaching. We illustrate the characteristics of the communication with examples of lectures and printed material. The efficiency of the heuristic arguments could be evaluated through an ethnological study. Along these lines, we show some images of the astronomical popularization activities, which drew large audiences of great chronological (and cultural) heterogeneity. We conclude that non-formal education, structured with characteristics different from those of the usual educational instruction, constitutes a successful strategy for the diffusion and communication of astronomy.

  8. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
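
    The geo-containment application invites a small illustration. The ray-casting test below is a generic textbook algorithm, not MINERVA's verified code; in the process described, such an implementation would be numerically evaluated against a formally verified PVS specification on stressing test cases.

        def inside_polygon(point, polygon):
            """Ray-casting point-in-polygon test (generic sketch only)."""
            x, y = point
            inside = False
            n = len(polygon)
            for i in range(n):
                x1, y1 = polygon[i]
                x2, y2 = polygon[(i + 1) % n]
                # Count crossings of a horizontal ray extending right from the point.
                if (y1 > y) != (y2 > y):
                    x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x < x_cross:
                        inside = not inside
            return inside

        fence = [(0, 0), (10, 0), (10, 10), (0, 10)]
        assert inside_polygon((5, 5), fence)
        assert not inside_polygon((12, 5), fence)

    Edge cases (vertices, collinear boundaries, floating-point ties) are exactly where such code can diverge from its specification, which is why the numerical cross-evaluation step earns its keep.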

  9. Learning-assisted theorem proving with millions of lemmas

    PubMed Central

    Kaliszyk, Cezary; Urban, Josef

    2015-01-01

    Large formal mathematical libraries consist of millions of atomic inference steps that give rise to a corresponding number of proved statements (lemmas). Analogously to the informal mathematical practice, only a tiny fraction of such statements is named and re-used in later proofs by formal mathematicians. In this work, we suggest and implement criteria defining the estimated usefulness of the HOL Light lemmas for proving further theorems. We use these criteria to mine the large inference graph of the lemmas in the HOL Light and Flyspeck libraries, adding up to millions of the best lemmas to the pool of statements that can be re-used in later proofs. We show that in combination with learning-based relevance filtering, such methods significantly strengthen automated theorem proving of new conjectures over large formal mathematical libraries such as Flyspeck. PMID:26525678
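
    The precise usefulness criteria are not given in the abstract; one plausible toy criterion, sketched below with a hypothetical inference graph, scores a lemma by how many later statements transitively depend on it relative to the cost of proving it.

        from functools import lru_cache

        # Hypothetical inference graph: statement -> lemmas used in its proof.
        PROOF_DEPS = {
            "l1": [],
            "l2": ["l1"],
            "l3": ["l1", "l2"],
            "l4": ["l3"],
            "l5": ["l3"],
        }

        @lru_cache(maxsize=None)
        def dependents(lemma):
            """All statements whose proofs transitively use `lemma`."""
            direct = {s for s, deps in PROOF_DEPS.items() if lemma in deps}
            out = set(direct)
            for s in direct:
                out |= dependents(s)
            return frozenset(out)

        def usefulness(lemma):
            # Reuse divided by proof size: cheap, widely reused lemmas score high.
            return len(dependents(lemma)) / (1 + len(PROOF_DEPS[lemma]))

        for lemma in sorted(PROOF_DEPS, key=usefulness, reverse=True):
            print(lemma, round(usefulness(lemma), 2))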

  10. Formal thought disorder in people at ultra-high risk of psychosis

    PubMed Central

    Weinstein, Sara; Stahl, Daniel; Day, Fern; Valmaggia, Lucia; Rutigliano, Grazia; De Micheli, Andrea; Fusar-Poli, Paolo; McGuire, Philip

    2017-01-01

    Background Formal thought disorder is a cardinal feature of psychosis. However, the extent to which formal thought disorder is evident in ultra-high-risk individuals, and whether it is linked to the progression to psychosis, remains unclear. Aims To examine the severity of formal thought disorder in ultra-high-risk participants and its association with future psychosis. Method The Thought and Language Index (TLI) was used to assess 24 ultra-high-risk participants, 16 people with first-episode psychosis and 13 healthy controls. Ultra-high-risk individuals were followed up for a mean duration of 7 years (s.d.=1.5) to determine the relationship between formal thought disorder at baseline and transition to psychosis. Results TLI scores were significantly greater in the ultra-high-risk group compared with the healthy control group (effect size (ES)=1.2), but lower than in people with first-episode psychosis (ES=0.8). Total and negative TLI scores were higher in ultra-high-risk individuals who developed psychosis, but this difference was not significant. Combining negative TLI scores with attenuated psychotic symptoms and basic symptoms predicted transition to psychosis (P=0.04; ES=1.04). Conclusions The TLI is useful for evaluating formal thought disorder in ultra-high-risk participants and complements existing instruments for the evaluation of psychopathology in this group. PMID:28713586

  11. Informal and Formal Social Support and Caregiver Burden: The AGES Caregiver Survey

    PubMed Central

    Shiba, Koichiro; Kondo, Naoki; Kondo, Katsunori

    2016-01-01

    Background We examined the associations of informal (eg, family members and friends) and formal (eg, physician and visiting nurses) social support with caregiver’s burden in long-term care and the relationship between the number of available sources of social support and caregiver burden. Methods We conducted a mail-in survey in 2003 and used data of 2998 main caregivers of frail older adults in Aichi, Japan. We used a validated scale to assess caregiver burden. Results Multiple linear regression demonstrated that, after controlling for caregivers’ sociodemographic and other characteristics, informal social support was significantly associated with lower caregiver burden (β = −1.59, P < 0.0001), while formal support was not (β = −0.30, P = 0.39). Evaluating the associations by specific sources of social support, informal social supports from the caregiver’s family living together (β = −0.71, P < 0.0001) and from relatives (β = −0.61, P = 0.001) were associated with lower caregiver burden, whereas formal social support was associated with lower caregiver burden only if it was from family physicians (β = −0.56, P = 0.001). Compared to caregivers without informal support, those who had one support (β = −1.62, P < 0.0001) and two or more supports (β = −1.55, P < 0.0001) had significantly lower burden. This association was not observed for formal support. Conclusions Social support from intimate social relationships may positively affect caregivers’ psychological wellbeing independent of the receipt of formal social support, resulting in less burden. PMID:27180934

  12. How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Hagen, George; Maddalon, Jeffrey M.; Munoz, Cesar A.; Narkawicz, Anthony; Dowek, Gilles

    2010-01-01

    In this paper we describe a process of algorithmic discovery that was driven by our goal of achieving complete, mechanically verified algorithms that compute conflict prevention bands for use in en route air traffic management. The algorithms were originally defined in the PVS specification language and have subsequently been implemented in Java and C++. We do not present the proofs in this paper; instead, we describe the process of discovery and the key ideas that enabled the final formal proof of correctness.
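
    The geometric core of conflict detection can be sketched: with relative horizontal position s and relative velocity v, loss of separation within lookahead time T occurs iff |s + t v| < D for some t in [0, T], a quadratic condition in t. The function below is a generic pairwise test, not the verified band algorithms from the paper.

        import math

        def horizontal_conflict(sx, sy, vx, vy, D=5.0, T=300.0):
            """True if |s + t*v| < D for some t in [0, T] (generic sketch).

            s: relative position, v: relative velocity, D: separation
            minimum, T: lookahead, all in consistent units.
            """
            a = vx * vx + vy * vy
            b = 2.0 * (sx * vx + sy * vy)
            c = sx * sx + sy * sy - D * D
            if c < 0:                       # already inside the protected zone
                return True
            if a == 0:                      # no relative motion
                return False
            disc = b * b - 4.0 * a * c
            if disc <= 0:                   # closest approach never beats D
                return False
            t_in = (-b - math.sqrt(disc)) / (2.0 * a)
            t_out = (-b + math.sqrt(disc)) / (2.0 * a)
            return t_in < T and t_out > 0   # conflict interval meets [0, T]

        # Head-on at 10 units separation, closing at 0.1 units/s.
        print(horizontal_conflict(10.0, 0.0, -0.1, 0.0))  # True

    Prevention bands then amount to sweeping candidate track angles or speeds through such a test, and proving the sweep sound and complete is where mechanical verification pays off.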

  13. Human aspects of mission safety

    NASA Technical Reports Server (NTRS)

    Connors, Mary M.

    1989-01-01

    Recent discussions of psychology's involvement in spaceflight have emphasized its role in enhancing space living conditions and incresing crew productivity. While these goals are central to space missions, behavioral scientists should not lose sight of a more basic flight requirement - that of crew safety. This paper examines some of the processes employed in the American space program in support of crew safety and suggests that behavioral scientists could contribute to flight safety, both through these formal processes and through less formal methods. Various safety areas of relevance to behavioral scientists are discussed.

  14. Quantum localization of classical mechanics

    NASA Astrophysics Data System (ADS)

    Batalin, Igor A.; Lavrov, Peter M.

    2016-07-01

    Quantum localization of classical mechanics within the BRST-BFV and BV (or field-antifield) quantization methods is studied. It is shown that a special choice of gauge-fixing functions (or of the BRST-BFV charge), together with the unitary limit, leads to Hamiltonian localization in the path integral of the BRST-BFV formalism. In turn, we find that a special choice of gauge-fixing functions proportional to the extremals of an initial non-degenerate classical action, together with a very special solution of the classical master equation, results in Lagrangian localization in the partition function of the BV formalism.
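
    For orientation, the BV (field-antifield) ingredients mentioned are the classical master equation for the extended action S(φ, φ*) and gauge fixing by a gauge fermion Ψ (standard definitions, not the paper's special solution):

        (S, S) \;=\; \frac{\partial_r S}{\partial \phi^{A}} \frac{\partial_l S}{\partial \phi^{*}_{A}}
          \;-\; \frac{\partial_r S}{\partial \phi^{*}_{A}} \frac{\partial_l S}{\partial \phi^{A}} \;=\; 0,
        \qquad \phi^{*}_{A} = \frac{\partial \Psi}{\partial \phi^{A}} .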

  15. A Jeziorski-Monkhorst fully uncontracted multi-reference perturbative treatment. I. Principles, second-order versions, and tests on ground state potential energy curves

    NASA Astrophysics Data System (ADS)

    Giner, Emmanuel; Angeli, Celestino; Garniron, Yann; Scemama, Anthony; Malrieu, Jean-Paul

    2017-06-01

    The present paper introduces a new multi-reference perturbation approach developed at second order, based on a Jeziorski-Monkhorst expansion using individual Slater determinants as perturbers. Thanks to this choice of perturbers, an effective Hamiltonian may be built, allowing for the dressing of the Hamiltonian matrix within the reference space, assumed here to be a CAS-CI. Such a formulation then accounts for the coupling between the static and dynamic correlation effects. With our new definition of zeroth-order energies, these two approaches are strictly size-extensive provided that local orbitals are used, as numerically illustrated here and formally demonstrated in the Appendix. Also, the present formalism allows for the factorization of all double excitation operators, just as in internally contracted approaches, strongly reducing the computational cost of these two approaches with respect to other determinant-based perturbation theories. The accuracy of these methods has been investigated on ground-state potential curves up to the full dissociation limit for a set of six molecules involving single, double, and triple bond breaking, together with an excited-state calculation. The spectroscopic constants obtained with the present methods are found to be in very good agreement with full configuration interaction results. As the present formalism does not use any parameter or numerically unstable operation, the curves obtained with the two methods are smooth all along the dissociation path.
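
    Schematically, with individual Slater determinants |μ⟩ as perturbers, the second-order correction takes the familiar sum-over-perturbers form (a generic sketch suppressing the multi-reference bookkeeping, not the paper's working equations):

        E^{(2)} \;=\; \sum_{\mu \notin \mathrm{CAS}}
          \frac{\bigl| \langle \mu | \hat{H} | \Psi^{(0)} \rangle \bigr|^{2}}
               {E^{(0)} - E^{(0)}_{\mu}},

    where the definition of the zeroth-order energies E^{(0)}_μ is precisely what distinguishes the variants and secures size-extensivity here.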

  16. Practical Weak-lensing Shear Measurement with Metacalibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, Erin S.; Huff, Eric M.

    2017-05-20

    Metacalibration is a recently introduced method to accurately measure weak gravitational lensing shear using only the available imaging data, without need for prior information about galaxy properties or calibration from simulations. The method involves distorting the image with a small known shear and calculating the response of a shear estimator to that applied shear. The method was shown to be accurate in moderate-sized simulations with galaxy images that had relatively high signal-to-noise ratios, and without significant selection effects. In this work we introduce a formalism to correct for both shear response and selection biases. We also observe that for images with relatively low signal-to-noise ratios, the correlated noise that arises during the metacalibration process results in significant bias, for which we develop a simple empirical correction. To test this formalism, we created large image simulations based on both parametric models and real galaxy images, including tests with realistic point-spread functions. We varied the point-spread function ellipticity at the five-percent level. In each simulation we applied a small, few-percent shear to the galaxy images. We introduced additional challenges that arise in real data, such as detection thresholds, stellar contamination, and missing data. We applied cuts on the measured galaxy properties to induce significant selection effects. Using our formalism, we recovered the input shear with an accuracy better than a part in a thousand in all cases.
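
    The response described can be written compactly (following the published metacalibration formalism): each galaxy's ellipticity e is remeasured on counterfactually sheared images, giving a finite-difference response and a calibrated ensemble shear,

        R_{ij} \;\approx\; \frac{ e_i(+\Delta\gamma_j) - e_i(-\Delta\gamma_j) }{ 2\,\Delta\gamma_j },
        \qquad
        \langle \gamma \rangle \;\approx\; \langle \mathbf{R} \rangle^{-1} \langle \mathbf{e} \rangle,

    with an analogous selection response computed by applying the selection cuts to the sheared measurements themselves.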

  17. Transforming Elementary Science Teacher Education by Bridging Formal and Informal Science Education in an Innovative Science Methods Course

    ERIC Educational Resources Information Center

    Riedinger, Kelly; Marbach-Ad, Gili; McGinnis, J. Randy; Hestness, Emily; Pease, Rebecca

    2011-01-01

    We investigated curricular and pedagogical innovations in an undergraduate science methods course for elementary education majors at the University of Maryland. The goals of the innovative elementary science methods course included: improving students' attitudes toward and views of science and science teaching, to model innovative science teaching…

  18. Complex Langevin method: When can it be trusted?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aarts, Gert; Seiler, Erhard; Stamatescu, Ion-Olimpiu

    2010-03-01

    We analyze to what extent the complex Langevin method, which is in principle capable of solving the so-called sign problems, can be considered as reliable. We give a formal derivation of the correctness and then point out various mathematical loopholes. The detailed study of some simple examples leads to practical suggestions about the application of the method.
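
    For reference, the method complexifies the field z and evolves it in a fictitious Langevin time θ with drift given by the (generally complex) action S (standard setup):

        \frac{dz}{d\theta} \;=\; -\,\frac{\partial S[z]}{\partial z} \;+\; \eta(\theta),
        \qquad \langle \eta(\theta)\,\eta(\theta') \rangle = 2\,\delta(\theta - \theta'),

    with observables read off as long-time averages along the trajectory; the "loopholes" concern exactly when those averages reproduce the expectation values of the original complex-weight path integral.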

  19. Can Formal Methods Provide (Necessary and) Sufficient Conditions for Measurement?

    ERIC Educational Resources Information Center

    Mari, Luca

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul introduces and discusses several foundational issues and concludes that self-report measures may be particularly difficult to validate and may fall short…

  20. Method and Mythology.

    ERIC Educational Resources Information Center

    Giovanazzi, Anthony

    1993-01-01

    Overviews the teaching of languages, particularly English, over a long time span. For centuries, the creative process of language learning, as opposed to the formalized pattern of language teaching, was largely ignored. It is suggested that there is nothing doctrinal about teaching methods, language learning will never be an exact science, and…
