DOE Office of Scientific and Technical Information (OSTI.GOV)
Faidy, C.; Gilles, P.
The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); and LBB assessment and margins. The improved understanding of LBB gained through the sharing of viewpoints from different countries permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth applications without support modifications; and support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits of the LBB approach should follow, as well as improved codification of methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
Swamy, S.A.; Bhowmick, D.C.; Prager, D.E.
The regulatory requirements for postulated pipe ruptures have changed significantly since the first nuclear plants were designed. The Leak-Before-Break (LBB) methodology is now accepted as a technically justifiable approach for eliminating postulation of double-ended guillotine breaks (DEGB) in high energy piping systems. The previous pipe rupture design requirements for nuclear power plant applications are responsible for the numerous and massive pipe whip restraints and jet shields installed at each plant. These cause significant plant congestion and increase labor costs and radiation dosage for normal maintenance and inspection. The restraints also increase the probability of interference between the piping and supporting structures during plant heatup, thereby potentially impacting overall plant reliability. The LBB approach to eliminating postulated ruptures in high energy piping systems is a significant improvement over former regulatory methodologies, and it is therefore gaining worldwide acceptance. However, the methods and criteria for LBB evaluation depend upon the policies of individual countries, and significant effort continues toward accomplishing uniformity on a global basis. In this paper the historical development of the U.S. LBB criteria is traced, and the results of an LBB evaluation for a typical Japanese PWR primary loop applying U.S. NRC approved methods are presented. In addition, an approach using the Japanese LBB criteria is shown and compared in detail with the U.S. criteria.
Swamy, S.A.; Mandava, P.R.; Bhowmick, D.C.
The leak-before-break (LBB) methodology is accepted as a technically justifiable approach for eliminating postulation of Double-Ended Guillotine Breaks (DEGB) in high energy piping systems. This is the result of extensive research, development, and rigorous evaluations by the NRC and the commercial nuclear power industry since the early 1970s. The DEGB postulation is responsible for the many hundreds of pipe whip restraints and jet shields found in commercial nuclear plants. These restraints and jet shields not only cost many millions of dollars, but also cause plant congestion leading to reduced reliability in inservice inspection and increased man-rem exposure. While use of leak-before-break technology saved hundreds of millions of dollars in backfit costs at many operating Westinghouse plants, the value-impacts resulting from the application of this technology to future plants are greater on a per plant basis. These benefits are highlighted in this paper. LBB technology has been applied extensively to high energy piping systems in operating plants. However, there are differences between the application of LBB technology to an operating plant and to a new plant design. In this paper an approach is proposed which is suitable for application of LBB to a new plant design such as the Westinghouse AP600. The approach is based on generating Bounding Analysis Curves (BAC) for the candidate piping systems. The general methodology and criteria used for developing the BACs are based on modified GDC-4 and Standard Review Plan (SRP) 3.6.3. The BAC allows advance evaluation of the piping system from the LBB standpoint, thereby assuring LBB conformance for the piping system. The piping designer can use the results of the BACs to determine acceptability of design loads and make modifications (in terms of piping layout and support configurations) as necessary at the design stage to assure LBB for the piping systems under consideration.
Bouchard, P.J.
A forthcoming revision to the R6 Leak-before-Break Assessment Procedure is briefly described. Practical application of the LbB concepts to safety-critical nuclear plant is illustrated by examples covering both low temperature and high temperature (>450 °C) operating regimes. The examples highlight a number of issues which can make the development of a satisfactory LbB case problematic: for example, coping with highly loaded components, methodology assumptions and the definition of margins, the effect of crack closure owing to weld residual stresses, complex thermal stress fields or primary bending fields, the treatment of locally high stresses at crack intersections with free surfaces, the choice of local limit load solution when predicting ligament breakthrough, and the scope of calculations required to support even a simplified LbB case for high temperature steam pipework systems.
Wichman, K.; Tsao, J.; Mayfield, M.
The regulatory application of leak before break (LBB) for operating and advanced reactors in the U.S. is described. The U.S. Nuclear Regulatory Commission (NRC) has approved the application of LBB for six piping systems in operating reactors: reactor coolant system primary loop piping, pressurizer surge, safety injection accumulator, residual heat removal, safety injection, and reactor coolant loop bypass. The LBB concept has also been applied in the design of advanced light water reactors. LBB applications, and regulatory considerations, for pressurized water reactors and advanced light water reactors are summarized in this paper. Technology development for LBB performed by the NRC and the International Piping Integrity Research Group is also briefly summarized.
Yang, Kyoung Mo; Jee, Kye Kwang; Pyo, Chang Ryul
The basis of the leak before break (LBB) concept is to demonstrate that piping will leak significantly before a double ended guillotine break (DEGB) occurs. This is demonstrated by quantifying and evaluating the leak process and prescribing safe shutdown of the plant on the basis of the monitored leak rate. The application of LBB in power plant design has reduced plant cost while improving plant integrity. Several evaluations employing LBB analysis on system piping based on DEGB design have been completed. However, application of LBB to main steam (MS) piping, although it is a candidate for LBB, has not been performed due to several uncertainties associated with the occurrence of steam hammer and dynamic strain aging (DSA). The objective of this paper is to demonstrate the applicability of the LBB design concept to main steam lines manufactured from SA106 Gr.C carbon steel. Based on the material properties, including fracture toughness and tensile properties obtained from comprehensive material tests of base and weld metals, a parametric study was performed as described in this paper. The PICEP code was used to determine the leakage size crack (LSC) and the FLET code was used to perform the stability assessment of the MS piping. The effects of the material properties obtained from the tests were evaluated to determine LBB applicability for the MS piping. This parametric study shows that the MS piping is a strong candidate for design using LBB analysis.
Roussel, G.
Leak-Before-Break (LBB) technology was not applied in the original design of the seven Pressurized Water Reactors the Belgian utility currently operates. The design basis of these plants required consideration of the dynamic effects associated with ruptures postulated in the high energy piping. The application of LBB technology to the existing plants has recently been approved by the Belgian Safety Authorities, but with a limitation to the primary coolant loop. LBB analysis was initiated for the Doel 3 and Tihange 2 plants to allow the withdrawal of some of the reactor coolant pump snubbers at both plants and to avoid reinstalling some of the restraints after steam generator replacement at Doel 3. LBB analysis was also found beneficial in demonstrating the acceptability of the primary components and piping under the new conditions resulting from power uprating and stretch-out operation. LBB analysis has subsequently been performed on the primary coolant loop of the Tihange 1 plant and is currently being performed for the Doel 4 plant. Application of LBB to the primary coolant loop in Belgium is based on the U.S. Nuclear Regulatory Commission requirements. However, the Belgian Safety Authorities required some additional analyses and placed some restrictions on the benefits of the LBB analysis to maintain the global safety of the plant at a sufficient level. This paper develops the main steps of the safety evaluation performed by the Belgian Safety Authorities in accepting the application of LBB technology to existing plants and summarizes the requirements imposed in addition to the U.S. Nuclear Regulatory Commission rules.
Cauquelin, C.
This paper presents an overview of the use of leak-before-break (LBB) analysis for EPR reactors. The EPR is an evolutionary Nuclear Island of the four-loop, 1500 MWe class currently in the design phase. Application of LBB to the main coolant lines and the resulting design impacts are summarized. Background information on LBB analysis in France and Germany is also presented.
Li, Ting; Li, Panlai, E-mail: li_panlai@126.com; Fu, Nian, E-mail: funian3678@163.com
A series of Dy³⁺, Ce³⁺/Dy³⁺, Eu²⁺/Dy³⁺ and Ce³⁺/Eu²⁺/Dy³⁺ doped LiBaB₉O₁₅ (LBB) phosphors were synthesized via a high temperature solid-state method. LBB:Dy³⁺ shows no emission under ultraviolet excitation; however, LBB:Ce³⁺, Dy³⁺ produces yellow emission under 295 nm excitation. Energy transfer occurs from Ce³⁺ to Dy³⁺ ions via electric dipole-dipole interaction, and the critical distance is estimated to be 21.15 Å based on the concentration quenching model. Generally, the Eu²⁺ ion is a sensitizer for the Dy³⁺ ion; however, only the emission of Eu²⁺ appears in LBB:Eu²⁺, Dy³⁺, which means there is no energy transfer from Eu²⁺ to Dy³⁺ ions. Interestingly, when Eu²⁺ is doped into LBB:Ce³⁺, Dy³⁺, white emission can be achieved by increasing the blue (350–425 nm) emission intensity. The spectral properties, quantum efficiency, CIE chromaticity coordinates and thermal quenching of LBB:Ce³⁺, Eu²⁺, Dy³⁺ are investigated. The results indicate that LBB:Ce³⁺, Eu²⁺, Dy³⁺ is a potential candidate for white light-emitting diodes. - Graphical abstract: LBB:Ce³⁺, Dy³⁺ can produce white emission when doped with Eu²⁺ ions. - Highlights: • LBB:Ce³⁺, Dy³⁺ can produce white emission by doping with Eu²⁺ ions. • There is no energy transfer from Eu²⁺ to Dy³⁺ ions. • Energy transfer occurs from Ce³⁺ to Dy³⁺ ions. • LBB:Ce³⁺, Eu²⁺, Dy³⁺ is a potential candidate for white LEDs.
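The critical distance quoted above is typically estimated with Blasse's concentration-quenching formula, Rc ≈ 2(3V / (4π·xc·N))^(1/3). The sketch below shows the calculation; the unit-cell volume V, critical dopant concentration xc, and number of substitutable cation sites N are placeholder values for illustration, not the actual LiBaB₉O₁₅ parameters.

```python
import math

# Blasse's concentration-quenching estimate of the critical
# energy-transfer distance:  Rc = 2 * (3V / (4*pi*xc*N))**(1/3)
# V  : unit-cell volume in cubic angstroms (placeholder value)
# xc : critical total dopant concentration (placeholder value)
# N  : substitutable cation sites per unit cell (placeholder value)

def critical_distance(V, xc, N):
    """Critical energy-transfer distance in angstroms."""
    return 2.0 * (3.0 * V / (4.0 * math.pi * xc * N)) ** (1.0 / 3.0)

print(round(critical_distance(V=1000.0, xc=0.05, N=4), 2))  # 21.22
```

With realistic crystallographic inputs the same expression yields the distance reported in the abstract; transfer distances above roughly 5 Å are generally taken to indicate an electric multipolar (rather than exchange) mechanism.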
Crutzen, S.; Koble, T.D.; Lemaitre, P.
Applications of the Leak Before Break (LBB) concept involve knowledge of flaw presence and characteristics. In-service inspection (ISI) is responsible for detecting flaws of a given importance, locating them precisely, and classifying them into broad families. Application of LBB concepts often implies knowledge of flaw characteristics such as through-wall depth; length at the inner diameter (ID) or outer diameter (OD) surface; orientation, or tilt and skew angles; branching; surface roughness; opening or width; and crack tip aspect. Beyond detection and characterization, LBB evaluations consider it important whether a crack lies in the weld material, in the base material, or in the heat affected zone. Cracks in tee junctions, in homogeneous simple welds, and in elbows are not considered in the same way. Essential variables of a flaw or defect are illustrated, and examples of flaws found in primary piping as reported by plant operators or service vendors are given. Since such flaw variables are important in applications of LBB concepts, knowledge of the performance achievable by NDE techniques during an ISI in detecting such flaws, locating them, and correctly evaluating their characteristics is essential.
Eperin, A.P.; Zakharzhevsky, Yu.O.; Arzhaev, A.I.
A two-year Finnish-Russian cooperation program was initiated in 1995 to demonstrate the applicability of the leak-before-break (LBB) concept to the primary circuit piping of the Leningrad NPP. The program includes J-R curve testing of authentic pipe materials at full operating temperature, screening and computational LBB analyses complying with the USNRC Standard Review Plan 3.6.3, and exchange of LBB-related information with emphasis on NDE. Domestic computer codes are mainly used, and all tests and analyses are independently carried out by each party. The results are believed to apply generally to RBMK type plants of the first generation.
Kiselyov, V.A.; Sokov, L.M.
The LBB regulatory approach adopted in Russia in 1993 as an extra safety barrier is described for the steamline of the advanced WWER 1000 reactor. The application of the LBB concept requires the following additional protections. First, the steamline must be highly qualified piping, fabricated in accordance with the applicable regulations and guidelines and carefully screened to verify that it is not subject to any disqualifying failure mechanism. Second, a deterministic fracture mechanics analysis and leak rate evaluation have been performed to demonstrate that a postulated through-wall crack yielding 95 l/min at normal operating conditions is stable even under seismic loads. Finally, it has been verified that the leak detection systems are sufficiently reliable, diverse and sensitive, and that adequate margins exist to detect a through-wall crack smaller than the critical size. The results obtained are encouraging and show the possibility of applying the LBB case to the steamline of the advanced WWER 1000 reactor.
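The margin logic in the second and third steps can be sketched numerically. In U.S. practice (SRP 3.6.3) a margin of 10 is applied between the plant's leak detection capability and the leakage of the postulated crack; the detection threshold used below is an assumed illustrative value, not a property of the WWER 1000 steamline.

```python
# Hedged sketch of an LBB leak-detection margin check.
# The margin factor of 10 follows U.S. SRP 3.6.3 practice; the
# detection threshold of 3.8 l/min (roughly 1 gpm, per U.S.
# Regulatory Guide 1.45) is an assumed illustrative value.

def leakage_margin_ok(postulated_leak_lpm: float,
                      detection_threshold_lpm: float,
                      margin: float = 10.0) -> bool:
    """True if the postulated crack leaks at least `margin` times
    the smallest leak the detection systems can reliably resolve."""
    return postulated_leak_lpm >= margin * detection_threshold_lpm

# The abstract's postulated crack yields 95 l/min at normal operation.
print(leakage_margin_ok(95.0, 3.8))  # 95 >= 10 * 3.8 -> True
```

The point of the check is that the crack must be comfortably detectable long before it approaches the critical size, so the plant can be shut down on the monitored leak rate alone.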
Recent evaluations of crack-opening-area in circumferentially cracked pipes
Rahman, S.; Brust, F.; Ghadiali, N.
1997-04-01
Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields, which are present because of the expected dynamic effects of pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses, which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks, as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to assess quantitatively the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.
Zdarek, J.; Pecinka, L.
Leak-before-break (LBB) analysis of WWER type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.
NASA Astrophysics Data System (ADS)
Polak, Mark L.; Hall, Jeffrey L.; Herr, Kenneth C.
1995-08-01
We present a ratioing algorithm for quantitative analysis of the passive Fourier-transform infrared spectrum of a chemical plume. We show that the transmission of a near-field plume is given by tau_plume = (L_obsd - L_bb-plume)/(L_bkgd - L_bb-plume), where tau_plume is the frequency-dependent transmission of the plume, L_obsd is the spectral radiance of the scene that contains the plume, L_bkgd is the spectral radiance of the same scene without the plume, and L_bb-plume is the spectral radiance of a blackbody at the plume temperature. The algorithm simultaneously achieves background removal, elimination of the spectrometer internal signature, and quantification of the plume spectral transmission. It has applications to both real-time processing for plume visualization and quantitative measurements of plume column densities. The plume temperature, which sets L_bb-plume and is not always precisely known, can have a profound effect on the quantitative interpretation of the algorithm and is discussed in detail. Finally, we provide an illustrative example of the use of the algorithm on a trichloroethylene and acetone plume.
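The ratioing step itself is a per-channel operation on three radiance spectra. A minimal sketch, assuming the spectra are equal-length lists on a common wavenumber grid (names and sample values are illustrative, not from the paper):

```python
# Ratioing algorithm for passive FTIR plume spectra:
#   tau = (L_obsd - L_bb) / (L_bkgd - L_bb)
# Spectra are per-wavenumber radiance samples; the values used
# below are illustrative placeholders, not measured data.

def plume_transmission(L_obsd, L_bkgd, L_bb, eps=1e-12):
    """Frequency-dependent plume transmission, channel by channel.
    Channels where the background and plume-blackbody radiances
    nearly coincide are undefined (division by ~0) -> None."""
    tau = []
    for lo, lb, lp in zip(L_obsd, L_bkgd, L_bb):
        denom = lb - lp
        tau.append(None if abs(denom) < eps else (lo - lp) / denom)
    return tau

# A fully transparent channel gives tau = 1; an opaque channel,
# radiating like a blackbody at the plume temperature, gives tau = 0.
print(plume_transmission([3.0, 1.0, 2.0], [3.0, 3.0, 3.0], [1.0, 1.0, 1.0]))
# [1.0, 0.0, 0.5]
```

The guarded denominator reflects the paper's caveat: when the plume temperature approaches the effective background temperature, the contrast vanishes and the retrieved transmission becomes ill-conditioned.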
Yu, Y.J.; Sohn, G.H.; Kim, Y.J.
Typical LBB (Leak-Before-Break) analysis is performed for the highest stress location for each different type of material in the high energy pipe line. In most cases, the highest stress occurs at the nozzle and pipe interface at the terminal end. The standard finite element analysis approach to calculating J-integral values at the crack tip utilizes symmetry conditions when modeling near the nozzle as well as away from the nozzle region, to minimize the model size and simplify the calculation. A factor of two is typically applied to the J-integral value to account for the symmetric conditions. This simplified analysis can lead to conservative results, especially for small diameter pipes, where the asymmetry of the nozzle-pipe interface is ignored. The stiffness of the residual piping system and non-symmetries of geometry, along with different materials for the nozzle, safe end and pipe, are usually omitted in current LBB methodology. In this paper, the effects of non-symmetries due to geometry and material at the pipe-nozzle interface are presented. Various LBB analyses are performed for a small diameter piping system to evaluate the effect a nozzle has on the J-integral calculation, crack opening area and crack stability. In addition, material differences between the nozzle and pipe are evaluated. Comparison is made between a pipe model and a nozzle-pipe interface model, and an LBB PED (Piping Evaluation Diagram) curve is developed to summarize the results for use by piping designers.
Tendera, P.
At present there are two NPPs equipped with PWR units in the Czech Republic. The Dukovany NPP has been in operation for about ten years (four 440 MW units, WWER model 213) and the Temelin NPP is under construction (two 1000 MW units, WWER model 320). Both NPPs were built to Soviet design and according to Soviet regulations and standards, but most of the equipment for the primary circuits was supplied by domestic manufacturers. The objective of the Czech LBB programme is to prove the LBB status of the primary piping systems of these NPPs, and the LBB concept is part of a strategy to meet western-style safety standards. A further reason for the Czech LBB project is the lack of some standard safety features. For both the Dukovany and Temelin NPPs a full LBB analysis should be carried out. The application of LBB to the piping systems should also be a cost-effective means of avoiding installation of pipe whip restraints and jet shields. The Czech regulatory body issued a non-mandatory requirement, "Leak Before Break", which is in compliance with national legal documents and is based on the US NRC regulatory procedures and US standards (ASME Code, ANSI). The requirement has been published in the document "Safety of Nuclear Facilities" No. 1/1991 as "Requirements on the Content and Format of Safety Reports and their Supplements" and consists of two parts: (1) the procedure for obtaining proof of evidence of "Leak Before Break"; (2) leak detection systems for the pressurized reactor primary circuit. Some changes concerning both parts of the above document will now be introduced, and the reasons for these modifications will be presented.
Turbat, A.; Deschanels, H.; Sperandio, M.
The leak before break (LBB) concept was not used at the design level for SUPERPHENIX (SPX), but studies have been performed or are in progress concerning different components: the main vessel (MV) and piping. These studies were undertaken to improve the defense in depth, an approach used in all French reactors. In a first study, the LBB approach was applied to the MV of the SPX plant to verify the absence of risk as regards the core supporting function and to help define the in-service inspection (ISI) program. Defining a reference semi-elliptical defect located in the welds of the structure, it is verified that the crack growth is limited and that the end-of-life defect is smaller than the critical one. It is then shown that the hoop welds (those most important for safety) located between the roof and the triple point satisfy the leak-before-break criteria. Generally speaking, however, the low level of membrane primary stresses, which is favorable for the integrity of the vessel, makes application of the leak-before-break concept more difficult because of the resulting small crack opening areas. Finally, the extension of the methodology to the secondary piping of SPX, incorporating recent European work of the DCRC, is briefly presented.
Overview of large scale experiments performed within the LBB project in the Czech Republic
Kadecka, P.; Lauerova, D.
1997-04-01
In recent years NRI Rez has been performing LBB analyses of safety-significant primary circuit piping of NPPs in the Czech and Slovak Republics. The analyses covered NPPs with WWER 440 Type 230 and 213 and WWER 1000 Type 320 reactors. Within the relevant LBB projects, undertaken with the aim of proving fulfilment of the LBB requirements, a series of large scale experiments was performed. The goal of these experiments was to verify the properties of the selected components and to prove the quality and/or conservatism of the assessments used in the LBB analyses. In this poster, a brief overview of the experiments performed in the Czech Republic under the guidance of NRI Rez is presented.
Draft Genome Sequence of Lactobacillus delbrueckii subsp. bulgaricus LBB.B5.
Urshev, Zoltan; Hajo, Karima; Lenoci, Leonardo; Bron, Peter A; Dijkstra, Annereinou; Alkema, Wynand; Wels, Michiel; Siezen, Roland J; Minkova, Svetlana; van Hijum, Sacha A F T
2016-10-06
Lactobacillus delbrueckii subsp. bulgaricus LBB.B5 originates from homemade Bulgarian yogurt and was selected for its ability to form a strong association with Streptococcus thermophilus. The genome sequence will facilitate elucidating the genetic background behind the contribution of LBB.B5 to the taste and aroma of yogurt and its exceptional protocooperation with S. thermophilus. Copyright © 2016 Urshev et al.
Leak-Before-Break: Further developments in regulatory policies and supporting research
Wilkowski, G.M.; Chao, K.-S.
1990-02-01
The fourth in a series of international Leak-Before-Break (LBB) Seminars supported in part by the US Nuclear Regulatory Commission was held at the National Central Library in Taipei, Taiwan on May 11 and 12, 1989. The seminar updated the international policies and supporting research on LBB. Attendees included representatives from regulatory agencies, electric utilities, nuclear power plant fabricators, research organizations, and academic institutions. Regulatory policy was the subject of presentations by Mr. G. Arlotto (US NRC, USA), Dr. B. Jarman (AECB, Canada), Dr. P. Milella (ENEA-DISP, Italy), Dr. C. Faidy (EDF/Septen, France), and Dr. K. Takumi (NUPEC, Japan). A paper by Mr. K. Wichman and Mr. A. Lee of the US NRC Office of Nuclear Reactor Regulation is included as background material to these proceedings; it discusses the history and status of LBB applications in US nuclear power plants. In addition, several papers on the supporting research programs described regulatory policy or industry standards for flaw evaluations, e.g., the ASME Section XI code procedures. Supporting research programs were reviewed on the first and second day by several participants from Taiwan, US, Japan, Canada, Italy, and France. Each individual paper has been cataloged separately.
Moulin, D.; Chapuliot, S.; Drubay, B.
For structures like vessels or pipes containing a fluid, the Leak-Before-Break (LBB) assessment requires demonstrating that it is possible, during the lifetime of the component, to detect a rate of leakage due to a possible defect whose growth would result in a leak before break of the component. This LBB assessment can be an important contribution to the overall structural integrity argument for many components. The aim of this paper is to review some practices used for LBB assessment and to describe how new R&D results have been used to provide a simplified approach to fracture mechanics analysis, especially the evaluation of crack shape and size during the lifetime of the component.
Hui, Yue; Jung, Haesung; Kim, Doyoon
While biomineralization in apoferritin has effectively synthesized highly monodispersed nanoparticles of various metal oxides and hydroxides, the detailed kinetics and mechanisms of Mn(III) (hydr)oxide formation inside apoferritin cavities have not been reported. To address this knowledge gap, we first identified the phase of solid Mn(III) formed inside apoferritin cavities as α-MnOOH. To analyze the oxidation and nucleation mechanism of α-MnOOH inside apoferritin by quantifying oxidized Mn, we used a colorimetric method with leucoberbelin blue (LBB) solution. In this method, LBB disassembled apoferritin by inducing an acidic pH environment, and reduced α-MnOOH nanoparticles. The LBB-enabled kinetic analyses of α-MnOOH nanoparticle formation suggested that the orders of reaction with respect to Mn2+ and OH– are 2 and 4, respectively, and α-MnOOH formation follows two-step pathways: First, soluble Mn2+ undergoes apoferritin-catalyzed oxidation at the ferroxidase dinuclear center, forming a Mn(III)-protein complex, P-[Mn2O2(OH)2]. Second, the oxidized Mn(III) dissociates from the protein binding sites and is subsequently nucleated to form α-MnOOH nanoparticles in the apoferritin cavities. This study reveals key kinetics and mechanistic information on the Mn-apoferritin systems, and the results facilitate applications of apoferritin as a means of nanomaterial synthesis.
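The reported reaction orders imply a simple power-law rate expression. The sketch below shows how orders of 2 in Mn2+ and 4 in OH– translate into the rate's sensitivity to each concentration; the rate constant and concentrations are arbitrary illustrative values, not fitted parameters from the study.

```python
# Power-law rate sketch for alpha-MnOOH formation kinetics:
#   rate = k * [Mn2+]**2 * [OH-]**4
# k and the concentrations below are arbitrary illustrative
# values, not parameters reported by the study.

def formation_rate(mn2_conc, oh_conc, k=1.0):
    """Rate with reaction orders 2 (Mn2+) and 4 (OH-)."""
    return k * mn2_conc**2 * oh_conc**4

base = formation_rate(1.0, 1.0)
# Doubling [Mn2+] multiplies the rate by 2**2 = 4; doubling
# [OH-] (about +0.3 pH units) multiplies it by 2**4 = 16.
print(formation_rate(2.0, 1.0) / base)  # 4.0
print(formation_rate(1.0, 2.0) / base)  # 16.0
```

The fourth-order dependence on OH– is why small pH shifts dominate the nucleation behavior in such systems.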
Herzing, Denise L; Augliere, Bethany N; Elliser, Cindy R; Green, Michelle L; Pack, Adam A
2017-01-01
Over the last 20 years, significant habitat shifts have been documented in some populations of cetaceans. On Little Bahama Bank (LBB) there are sympatric communities of resident Atlantic spotted dolphins (Stenella frontalis) and bottlenose dolphins (Tursiops truncatus), monitored since 1985. The size and social structure (three clusters: Northern, Central, Southern) have been stable in the spotted dolphin community with little immigration/emigration, even after large demographic losses (36%) following two major hurricanes in 2004. In 2013 an unprecedented exodus of over 50% (52 individuals) of the spotted dolphin community was documented. The entire Central cluster and a few Northern and Southern individuals relocated 161 km south to Great Bahama Bank (GBB), also home to two sympatric resident communities of spotted dolphins and bottlenose dolphins. During the late summer of 2013 and the summers of 2014 and 2015 both sites were regularly monitored, but no former LBB dolphins returned to LBB. Uncharacteristic matriline splits were observed. Social analyses revealed random associations for those spotted dolphins and very little integration between spotted dolphins that moved to GBB (MGBB) and those dolphins resident to GBB (RGBB). Male alliances among spotted dolphins were present, with some altered patterns. On LBB, the operational sex ratio (OSR) was reduced (0.40 to 0.25). OSRs for MGBB and RGBB dolphins were similar (0.45 and 0.43). A significant steady decrease in sea surface temperature and chlorophyll a (a proxy for plankton production) occurred on LBB leading up to this exodus. Similar trends were not present over the same period on GBB. The sudden large-scale shift of spotted dolphins from LBB to GBB, in association with the gradual decline in certain environmental factors, suggests that a possible "tipping point" was reached in prey availability.
This study provides a unique view into social and genetic implications of large-scale displacement of stable dolphin communities.
Herzing, Denise L.; Augliere, Bethany N.; Elliser, Cindy R.; Green, Michelle L.; Pack, Adam A.
2017-01-01
Over the last 20 years, significant habitat shifts have been documented in some populations of cetaceans. On Little Bahama Bank (LBB) there are sympatric communities of resident Atlantic spotted dolphins (Stenella frontalis) and bottlenose dolphins (Tursiops truncatus), monitored since 1985. The size and social structure (three clusters: Northern, Central, Southern) have been stable among the spotted dolphin community with little immigration/emigration, even after large demographic losses (36%) following two major hurricanes in 2004. In 2013 an unprecedented exodus of over 50% (52 individuals) of the spotted dolphin community was documented. The entire Central cluster and a few Northern and Southern individuals relocated 161 km south to Great Bahama Bank (GBB), also home to two sympatric resident communities of spotted dolphins and bottlenose dolphins. During the late summer of 2013 and the summers of 2014 and 2015 both sites were regularly monitored but no former LBB dolphins returned to LBB. Uncharacteristic matriline splits were observed. Social analyses revealed random associations for those spotted dolphins and very little integration between spotted dolphins that moved to GBB (MGBB) and those dolphin resident to GBB (RGBB). Male alliances among spotted dolphins were present, with some altered patterns. On LBB, the operational sex ratio (OSR) was reduced (.40 to .25). OSR for MGBB and RGBB dolphins were similar (.45 and .43). A significant steady decrease in sea surface temperature and chlorophyll a (a proxy for plankton production) occurred on LBB leading up to this exodus. Similar trends were not present over the same period on GBB. The sudden large-scale shift of spotted dolphins from LBB to GBB in association with the gradual decline in certain environmental factors suggests that a possible “tipping point” was reached in prey availability. 
This study provides a unique view into social and genetic implications of large-scale displacement of stable dolphin communities. PMID:28792947
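The operational sex ratios quoted above are simple proportions. As an illustrative sketch only (the counts below are hypothetical, and it is an assumption that OSR here means the fraction of adult males among sexed adults), the reported drop from .40 to .25 corresponds to arithmetic like this:

```python
# Hypothetical counts; one common OSR convention, not the study's data.
def operational_sex_ratio(males: int, females: int) -> float:
    """Fraction of adult males among sexed adults."""
    return males / (males + females)

before = operational_sex_ratio(males=20, females=30)   # 20/50 = 0.40
after = operational_sex_ratio(males=10, females=30)    # 10/40 = 0.25
print(round(before, 2), round(after, 2))
```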
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kashima, K.; Wilkowski, G.M.
1988-03-01
The third in a series of international Leak-Before-Break (LBB) Seminars supported in part by the US Nuclear Regulatory Commission was held at TEPCO Hall in the Tokyo Electric Power Company's (TEPCO) Electric Power Museum on May 14 and 15, 1987. The seminar updated the international policies and supporting research on LBB. Attendees included representatives of regulatory agencies, electric utilities, fabricators of nuclear power plants, and research organizations, as well as university professors. Regulatory policy was the subject of presentations by Mr. G. Arlotto (US NRC, USA), Dr. H. Schultz (GRS, W. Germany), Dr. P. Milella (ENEA-DISP, Italy), Dr. C. Faidy, P. Jamet, and S. Bhandari (EDF/Septen, CEA/CEN, and Framatome, France), and Mr. T. Fukuzawa (MITI, Japan). Dr. F. Nilsson presented revised nondestructive inspection requirements relative to LBB in Sweden. In addition, several papers on the supporting research programs discussed regulatory policy. Questions following the presentations focused on the impact of various LBB policies or of research findings. Supporting research programs were reviewed on the first and second day by participants from the US, Japan, Germany, Canada, Italy, Sweden, England, and France.
Experiences with leak rate calculation methods for LBB application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebner, H.; Kastner, W.; Hoefler, A.
1997-04-01
In this paper, three leak rate computer programs used in leak-before-break analysis, PIPELEAK, FLORA, and PICEP, are described and compared with each other and with the results of an HDR reactor experiment and two real crack cases. In general, the different leak rate models agree. To obtain reasonable agreement between measured and calculated leak rates, it was necessary to also use data from detailed crack investigations.
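As a minimal sketch of the kind of calculation a leak-rate program performs, the following uses a simple single-phase orifice model with an assumed discharge coefficient; all numbers are hypothetical. The codes compared in the paper (PIPELEAK, FLORA, PICEP) additionally model two-phase critical flow and crack morphology, which this toy omits.

```python
import math

def leak_rate_kg_s(area_m2, dp_pa, rho_kg_m3, cd=0.6):
    """Incompressible orifice flow: m_dot = Cd * A * sqrt(2 * rho * dp)."""
    return cd * area_m2 * math.sqrt(2.0 * rho_kg_m3 * dp_pa)

# Hypothetical numbers: 10 mm^2 crack opening area, 7 MPa pressure drop,
# hot pressurized water (~740 kg/m^3).
m_dot = leak_rate_kg_s(area_m2=10e-6, dp_pa=7e6, rho_kg_m3=740.0)
print(f"{m_dot:.3f} kg/s")
```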
Morphological study of the atrioventricular conduction system and Purkinje fibers in yak.
Duan, Deyong; Yu, Sijiu; Cui, Yan; Li, Chaoxu
2017-07-01
We studied the morphology of the atrioventricular conduction system (AVCS) and Purkinje fibers of the yak. Light and transmission electron microscopy were used to study the histological features of the AVCS. The distributional characteristics of the His bundle, the left bundle branch (LBB), right bundle branch (RBB), and Purkinje fiber network of yak hearts were examined using gross dissection, ink injection, and ABS casting. The results showed that the atrioventricular node (AVN) of the yak is located on the right side of the interatrial septum and has a flattened ovoid shape. The AVN is composed of slender, interweaving cells, formed almost entirely of transitional cells (T-cells). The His bundle extended from the AVN and split into the LBB and RBB at the crest of the interventricular septum. The LBB descended along the left side of the interventricular septum. At approximately the upper 1/3 of the interventricular septum, the LBB typically divided into three branches. The RBB ran under the endocardium of the right side of the interventricular septum, extended to the base of the septal papillary muscle, passed into the moderator band, crossed the right ventricular cavity to reach the base of the anterior papillary muscle, and divided into four fascicles in the subendocardial layer. The Purkinje fibers in the ventricle formed a complex spatial network. The distributional and cellular characteristics of the AVCS and Purkinje fibers ensure normal cardiac function. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerard, R.; Malekian, C.; Meessen, O.
The Leak Before Break (LBB) concept allows the double-ended guillotine break of the primary loop piping to be eliminated from the design basis, provided it can be demonstrated by a fracture mechanics analysis that a through-wall flaw, of a size giving rise to a leakage still well detectable by the plant leak detection systems, remains stable even under accident conditions (including the Safe Shutdown Earthquake (SSE)). This concept was successfully applied to the primary loop piping of several Belgian Pressurized Water Reactor (PWR) units operated by the utility Electrabel. One of the main benefits is to permit justification of supports in the primary loop and justification of the integrity of the reactor pressure vessel and internals in case of a Loss Of Coolant Accident (LOCA) in stretch-out conditions. For two of the Belgian PWR units, the LBB approach also made it possible to reduce the number of large hydraulic snubbers installed on the primary coolant pumps. Last but not least, the LBB concept also facilitates steam generator replacement operations by eliminating the need for some pipe whip restraints located close to the steam generator. In addition to the U.S. regulatory requirements, the Belgian safety authorities impose additional requirements, which are described in detail in a separate paper. A novel aspect of the studies performed in Belgium is the way in which residual loads in the primary loop are taken into account. Such loads may result from displacements imposed to close the primary loop in a steam generator replacement operation, especially when it is performed using the "two cuts" technique. The influence of such residual loads on the LBB margins is discussed in detail and typical results are presented.
The normal variants in the left bundle branch system.
Elizari, M V
This article reviews the main anatomic and physiopathological aspects of the left bundle branch, from its origin in the His bundle to its intraventricular distribution on the left endocardial surface. The results are based on the relevant literature and on personal observations of 206 hearts, distributed as follows: 67 dogs, 60 humans, 45 sheep, 22 pigs, 10 cows, 2 monkeys, 1 guanaco, and 1 sea lion. The main anatomical features of the His-Purkinje conducting system may be summarized as follows. The bundle of His is composed of two segments: the penetrating and branching portions. The LBB originates in the branching portion, located underneath the membranous septum. There is no true bifurcation of the bundle of His in the human heart. Shortly after its origin, the LBB gives rise to its two main fascicles, anterior and posterior, heading toward the anterior and posterior papillary muscles, respectively. The anterior division is thinner and longer than the posterior one. The RBB and the most anterior fibers of the LBB arise at the end of the branching portion. In some cases a well-defined left septal fascicle can be identified, usually arising from the posterior division. Each division gives off small fibers and false tendons that cross the left ventricular cavity, connecting the papillary muscles to each other or to the septal surface. From each division of the LBB, a corresponding Purkinje network emerges, covering the subendocardium of the septum and the free wall of the left ventricle. The proximal segments of the His-Purkinje system have critical relationships with surrounding cardiac structures, whose pathologic processes may damage the conducting tissue. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anikovsky, V.V.; Karzov, G.P.; Timofeev, B.T.
The paper demonstrates the insufficiency of some requirements of the native (Russian) Norms when compared with the corresponding foreign requirements for the following calculational situations: (1) leak before break (LBB); (2) short cracks; (3) preliminary loading (warm prestressing). In particular, the paper presents: (1) comparison of native and foreign normative requirements (PNAE G-7-002-86, ASME Code, BS 1515, KTA) on permissible stress levels, and specifically on the estimation of crack initiation and propagation; (2) comparison of RF and USA norms on pressure vessel material acceptance, together with data from pressure vessel hydrotests; (3) comparison of norms on the presence of defects (RF and USA) in NPP vessels, development of defect schematization rules, and foundation of a calculated defect (semi-axis ratio a/b) for pressure vessel and piping components; (4) the sequence of defect estimation (growth of initial defects and critical crack sizes) proceeding from the LBB concept; (5) analysis of crack initiation and propagation conditions according to the acting Norms (including crack jumps); (6) the necessity to correct estimation methods for ultimate states of brittle and ductile fracture and the elastic-plastic region as applied to the calculational situations (a) LBB and (b) short cracks; (7) the necessity to correct estimation methods for ultimate states with consideration of static and cyclic loading (warm prestressing effect) of the pressure vessel, and estimation of the stability of the effect; (8) proposals for corrections to the PNAE G-7-002-86 Norms.
Determination of leakage areas in nuclear piping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keim, E.
1997-04-01
For the design and operation of nuclear power plants, the Leak-Before-Break (LBB) behavior of a piping component has to be shown. This means that the length of a crack resulting in a leak is smaller than the critical crack length and that the leak is safely detectable by a suitable monitoring system. The LBB concept of Siemens/KWU is based on computer codes, developed by Siemens/KWU, for the evaluation of critical crack lengths, crack openings, leakage areas and leakage rates. Experience with the leak rate program is described elsewhere; this paper deals with the computation of crack openings and leakage areas of longitudinal and circumferential cracks by means of fracture mechanics. The leakage areas are determined by integration of the crack openings along the crack front, considering plasticity and geometrical effects. They are evaluated with respect to minimum values for the design of leak detection systems and maximum values for controlling jet and reaction forces. LBB for subcritical cracks has to be shown by means of fracture mechanics, and the calculation of leakage areas is the basis for quantitatively determining the discharge rate of leaking subcritical through-wall cracks. The analytical approach and its validation are presented for two examples of complex structures: a pipe branch containing a circumferential crack, and a pipe bend with a longitudinal crack.
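The integration of crack openings along the crack front described above can be sketched numerically. The elliptical opening profile and the numbers below are illustrative assumptions, not the Siemens/KWU model, which additionally accounts for plasticity and geometry effects:

```python
import math

def leakage_area(cod_center_m, crack_length_m, n=10_000):
    """Integrate delta(x) = delta0*sqrt(1-(2x/L)^2) over [-L/2, L/2] (midpoint rule)."""
    L = crack_length_m
    dx = L / n
    area = 0.0
    for i in range(n):
        x = -L / 2 + (i + 0.5) * dx
        area += cod_center_m * math.sqrt(max(0.0, 1 - (2 * x / L) ** 2)) * dx
    return area

# For an elliptical profile the exact area is (pi/4)*delta0*L; compare:
A = leakage_area(cod_center_m=0.5e-3, crack_length_m=80e-3)
print(A, math.pi / 4 * 0.5e-3 * 80e-3)
```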
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopper, A.; Wilkowski, G.; Scott, P.
1997-03-01
The IPIRG-2 program was an international group program managed by the US NRC and funded by organizations from 15 nations. The emphasis of the IPIRG-2 program was the development of data to verify fracture analyses for cracked pipes and fittings subjected to dynamic/cyclic load histories typical of seismic events. The scope included: (1) the study of more complex dynamic/cyclic load histories (i.e., multi-frequency, variable-amplitude, simulated seismic excitations) than those considered in the IPIRG-1 program, (2) crack sizes more typical of those considered in Leak-Before-Break (LBB) and in-service flaw evaluations, (3) through-wall-cracked pipe experiments which can be used to validate LBB-type fracture analyses, (4) cracks in and around pipe fittings, such as elbows, and (5) laboratory specimen and separate-effect pipe experiments to provide better insight into the effects of dynamic and cyclic load histories. Also undertaken were an uncertainty analysis to identify the issues most important for LBB or in-service flaw evaluations, updating of computer codes and databases, the development and conduct of a series of round-robin analyses, and analysts' group meetings to provide a forum for nuclear piping experts from around the world to exchange information on pipe fracture technology. 17 refs., 104 figs., 41 tabs.
Distribution of surfactant protein A in rat lung.
Doyle, I R; Barr, H A; Nicholas, T E
1994-10-01
Although surfactant protein A (SP-A) is an integral component of alveolar surfactant, its relative abundance in lamellar bodies, regarded as the intracellular storage organelles for surfactant, remains contentious. We have previously shown that lamellar bodies, isolated from rat lung by upward flotation on a sucrose gradient, can be subfractionated into classic-appearing lamellar bodies (Lb-A) and a vesicular fraction (Lb-B), which we have speculated may be a second release form of surfactant. In the present study, we have used two-dimensional protein electrophoresis and immunochemical analysis to clarify the origin and the composition of these two subcellular fractions. In addition, we have examined the hypothesis that the secretion of SP-A and surfactant phospholipids occurs by independent pathways by examining the distribution of SP-A, total protein, and disaturated phospholipids (DSP) in the tubular myelin-rich (Alv-1) and tubular myelin-poor (Alv-2) fractions separated from lavaged material and in Lb-A and Lb-B isolated from both lung homogenate and purified alveolar type II cells. Our findings indicate that Lb-B is derived from type II cells, although they do not indicate whether it is a secretory form of surfactant, a reuptake vesicle, or a mixture of both. We found that the lung has a large tissue pool of immunoreactive SP-A. The %SP-A/DSP of total lamellar bodies isolated from type II cells was 0.96 +/- 0.1 (mean +/- SE), intermediate between that in Lb-A (1.67 +/- 0.13) and in Lb-B (0.65 +/- 0.04). In contrast, the %SP-A/DSP was 11.16 +/- 0.84 in whole lung homogenate and 13.14 +/- 1.71 in whole type II cells. In the alveolar compartment, the %SP-A/DSP was 17.38 +/- 3.40 in Alv-1, 6.34 +/- 0.31 in Alv-2, and 10.49 +/- 1.43 in macrophages, values an order of magnitude greater than those found in the lamellar bodies.
Our results indicate that only a relatively small portion of alveolar SP-A is derived from lamellar bodies, and we suggest that secretion of SP-A and DSP occurs via independent pathways.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samohyl, P.
The application of LBB also requires fatigue flaw growth assessment. This analysis was performed for PWR nuclear power plants of types VVER 440/230, VVER 440/213c, and VVER 1000/320. Because these NPPs were designed according to Russian codes, which differ from US codes, the two approaches had to be compared. A comparison with our experimental data was also carried out. Margins of applicability of the US methods, and their modifications for the materials used in the construction of Czech and Slovak NPPs, are shown. A computer code performing the analysis according to the described method is presented. Some measurements and calculations show that thermal stratification in horizontal pipelines can lead to additional loads that are not negligible and can be dangerous. An attempt to include the loads induced by steady-state stratification was made.
Elastic plastic fracture mechanics methodology for surface cracks
NASA Technical Reports Server (NTRS)
Ernst, Hugo A.; Lambert, D. M.
1994-01-01
The elastic plastic fracture mechanics methodology has evolved significantly in the last several years. Nevertheless, some of these concepts need to be extended further before the whole methodology can be safely applied to structural parts. Specifically, there is a need to include the effect of constraint in the characterization of material resistance to crack growth, and to extend these methods to the case of 3D defects. As a consequence, this project was started as a 36-month research program with the general objective of developing an elastic plastic fracture mechanics methodology to assess the structural reliability of pressure vessels and other parts of interest to NASA which may contain flaws. The project is divided into three tasks that deal with (1) constraint and thickness effects, (2) three-dimensional cracks, and (3) the Leak-Before-Burst (LBB) criterion. This report period (March 1994 to August 1994) continues attempts to characterize three-dimensional aspects of fracture present in 'two dimensional' or planar configuration specimens (Chapter Two), especially the determination and use of crack face separation data. Also included are a variety of fracture resistance testing results (J(m)R-curve format) and a discussion of two materials of NASA interest (6061-T651 aluminum alloy and IN718-STA1 nickel-base superalloy), involving a basis for like constraint in terms of ligament dimensions and a comparison to the resulting J(m)R-curves (Chapter Two).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chattopadhyay, J.; Dutta, B.K.; Kushwaha, H.S.
Leak-Before-Break (LBB) is being used to design the primary heat transport (PHT) piping system of 500 MWe Indian Pressurized Heavy Water Reactors (IPHWR). The work is categorized in three directions to demonstrate three levels of safety against sudden catastrophic break. Level 1 is inherent in the design procedure of the piping system as per ASME Sec. III, with a well-defined factor of safety. Level 2 consists of a fatigue crack growth study of a postulated part-through flaw at the inside surface of pipes. Level 3 is a stability analysis of a postulated leakage-size flaw under the maximum credible loading condition. Developmental work related to demonstration of level 2 and level 3 confidence is described in this paper. In a case study on fatigue crack growth in PHT straight pipes for level 2, negligible crack growth is predicted for the life of the reactor. For level 3 analysis, the R6 method has been adopted. A database to evaluate the stress intensity factor (SIF) of elbows with through-wall flaws under combined internal pressure and bending moment has been generated to provide one of the inputs for the R6 method. The methodology of safety assessment of elbows using the R6 method has been demonstrated for a typical pump discharge elbow. In this analysis, the limit load of the cracked elbow has been determined by carrying out elasto-plastic finite element analysis. The limit load results compared well with those given by Miller. However, further study is required to give a general form of the limit load solution. On the experimental front, a set of small diameter pipe fracture experiments has been carried out at room temperature and 300°C. Two important observations from the experiments are an appreciable drop in maximum load at 300°C in the case of SS pipes, and out-of-plane crack growth in the case of CS pipes. Experimental load-deflection curves are finally compared with the predictions of five J-estimation schemes. A material database of PHT piping materials is also being generated for use in LBB analysis.
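For readers unfamiliar with the R6 method adopted above, a minimal sketch of an Option-1-style failure assessment diagram (FAD) check follows. The curve is the standard R6/BS 7910 Option 1 form; the assessment point and the Lr cut-off are hypothetical, and a real analysis derives Lr from the limit load and Kr from the stress intensity factor and toughness:

```python
import math

def fad_option1(lr):
    """R6 / BS 7910 Option 1 failure assessment curve f(Lr)."""
    return (1 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def acceptable(lr, kr, lr_max=1.2):
    """Assessment point is acceptable if inside the FAD and below the Lr cut-off."""
    return lr <= lr_max and kr <= fad_option1(lr)

print(acceptable(0.6, 0.5))   # → True  (well inside the curve)
print(acceptable(1.1, 0.9))   # → False (too close to plastic collapse)
```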
Crack instability analysis methods for leak-before-break program in piping systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattar Neto, M.; Maneschy, E.; Nobrega, P.G.B. da
1995-11-01
The instability evaluation of cracks in piping systems is a step that is considered when a high-energy line is investigated in a leak-before-break (LBB) program. Different approaches have been used to assess the stability of cracks: (a) local flow stress (LFS); (b) limit load (LL); (c) elastic-plastic fracture mechanics (EPFM), such as J-integral versus tearing modulus (J-T) analysis. The first two methods are used for highly ductile materials, when it is assumed that the remaining ligament of the cracked pipe section becomes fully plastic prior to crack extension. EPFM is considered for low-ductility piping, when the material reaches unstable ductile tearing prior to plastic collapse of the net section. In this paper the LFS, LL and EPFM J-T methodologies were applied to calculate failure loads in circumferentially through-wall cracked pipes with different materials, geometries and loads. The results obtained from the three formulations are compared with one another and with experimental data available in the literature.
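The J-T criterion in (c) compares an applied tearing modulus with the material's tearing modulus derived from the J-R curve. A minimal sketch, with a hypothetical power-law J-R curve and assumed material constants (not from the paper):

```python
E = 200e9          # Young's modulus, Pa (assumed)
sigma_f = 300e6    # flow stress, Pa (assumed)

def j_r(da_mm, c=200e3, m=0.5):
    """Hypothetical power-law J-R curve: J = C * da^m (J in J/m^2, da in mm)."""
    return c * da_mm ** m

def material_tearing_modulus(da_mm, h=1e-6):
    """T_mat = (dJ_R/da) * E / sigma_f^2, slope taken numerically and converted to per metre."""
    dj_dda_per_m = (j_r(da_mm + h) - j_r(da_mm - h)) / (2 * h) * 1e3
    return dj_dda_per_m * E / sigma_f ** 2

t_mat = material_tearing_modulus(2.0)   # at 2 mm of stable tearing
print(round(t_mat, 1))
# Tearing instability is predicted when the applied tearing modulus
# T_app = (dJ_app/da) * E / sigma_f^2 exceeds t_mat where J_app meets J_R.
```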
Jeong, Hyung Uk; Mun, Hye Yeon; Oh, Hyung Keun; Kim, Seung Bum; Yang, Kwang Yeol; Kim, Iksoo; Lee, Hyang Burm
2010-08-01
To identify novel bioinsecticidal agents, a bacterial strain, Serratia sp. EML-SE1, was isolated from a dead larva of the lepidopteran diamondback moth (Plutella xylostella) collected from a cabbage field in Korea. In this study, the insecticidal activity of liquid cultures of the strain in Luria-Bertani broth (LBB) and nutrient broth (NB) against thirty 3rd and 4th instar larvae of the diamondback moth was investigated on a Chinese cabbage leaf housed in a round plastic cage (Ø 10 x 6 cm). At 72 h after spraying the cabbage leaf with LBB and NB cultures containing the bacterial strain, the mortalities of the larvae were determined to be 91.7% and 88.3%, respectively. In addition, the insecticidal activity on potted cabbage containing 14 leaves in a growth cage (165 x 83 x 124 cm) was found to be similar to that in the plastic cage experiment. The results of this study provide valuable information on the insecticidal activity of the liquid culture of a Serratia species against the diamondback moth.
Simulating squeeze flows in multiaxial laminates using an improved TIF model
NASA Astrophysics Data System (ADS)
Ibañez, R.; Abisset-Chavanne, Emmanuelle; Chinesta, Francisco
2017-10-01
Thermoplastic composites are widely considered in structural parts. In this paper attention is paid to the squeeze flow of continuous fiber laminates. In the case of unidirectional prepregs, the ply constitutive equation is modeled as a transversally isotropic fluid that must satisfy both fiber inextensibility and fluid incompressibility. When the laminate is squeezed, the flow kinematics exhibit a complex dependency along the laminate thickness, requiring a detailed velocity description through the thickness. In a former work, a solution making use of an in-plane-out-of-plane separated representation within the PGD (Proper Generalized Decomposition) framework was successfully accomplished when both kinematic constraints (inextensibility and incompressibility) were introduced using a penalty formulation to circumvent the LBB constraints. However, such a formulation makes the calculation of fiber tractions and compression forces difficult, the latter being required in rheological characterization. In this paper the former penalty formulation is replaced by a mixed formulation making use of two Lagrange multipliers, while addressing the LBB stability conditions within the separated representation framework, a question not previously addressed.
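The difference between the penalty and Lagrange-multiplier (mixed) formulations discussed above can be illustrated on a toy quadratic problem with one linear constraint standing in for incompressibility. This is not the PGD squeeze-flow solver, just the constraint-handling idea: the penalty approach satisfies the constraint only approximately, while the mixed (KKT) system enforces it exactly and yields the multiplier (the analogue of the pressure/traction one wants to recover):

```python
import numpy as np

A = np.diag([4.0, 3.0, 2.0, 1.0])        # SPD "stiffness" (toy)
f = np.array([1.0, 2.0, 3.0, 4.0])
B = np.array([[1.0, 1.0, 1.0, 1.0]])     # one scalar constraint: B @ x = 0

# Penalty formulation: solve (A + k B^T B) x = f for a large penalty k
k = 1e8
x_pen = np.linalg.solve(A + k * B.T @ B, f)

# Mixed formulation: KKT system [[A, B^T], [B, 0]] [x; lam] = [f; 0]
kkt = np.block([[A, B.T], [B, np.zeros((1, 1))]])
sol = np.linalg.solve(kkt, np.concatenate([f, [0.0]]))
x_mix, lam = sol[:4], sol[4]

print(np.allclose(x_pen, x_mix, atol=1e-6))   # penalty ≈ mixed solution
print(abs((B @ x_mix).item()) < 1e-9)         # constraint met exactly by mixed form
```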
N-16 monitors: Almaraz NPP experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adrada, J.
1997-02-01
Almaraz Nuclear Power Plant has installed N-16 monitors (one per steam generator) to control the leakage rate through the steam generator tubes after application of leak-before-break (LBB) criteria for the top tube sheet (TTS). After several years of operation with the N-16 monitors, Almaraz NPP experience may be summarized as follows: N-16 monitors are very useful for following the steam generator leak rate trend and detecting an incipient tube rupture, but they do not provide an exact absolute leak rate value, mainly when leaks are small. The evolution of the measured N-16 leak rates varies along the fuel cycle, with the same trend for the three steam generators. This behaviour is associated with the evolution of the primary water chemistry along the cycle.
Yin, Xiaochen; Salemi, Michelle R.; Phinney, Brett S.; Gotcheva, Velitchka; Angelov, Angel; Marco, Maria L.
2017-01-01
We identified the proteins synthesized by Lactobacillus delbrueckii subsp. bulgaricus strain LBB.B5 in laboratory culture medium (MRS) at 37°C and in milk at 37 and 4°C. Cell-associated proteins were measured by gel-free, shotgun proteomics using high-performance liquid chromatography coupled with tandem mass spectrometry. A total of 635 proteins were recovered from all cultures, among which 72 proteins were milk associated (unique or significantly more abundant in milk). LBB.B5 responded to milk by increasing the production of proteins required for purine biosynthesis, carbohydrate metabolism (LacZ and ManM), energy metabolism (TpiA, PgK, Eno, SdhA, and GapN), amino acid synthesis (MetE, CysK, LBU0412, and AspC) and transport (GlnM and GlnP), and stress response (Trx, MsrA, MecA, and SmpB). The requirement for purines was confirmed by the significantly improved cell yields of L. delbrueckii subsp. bulgaricus when incubated in milk supplemented with adenine and guanine. The L. delbrueckii subsp. bulgaricus-expressed proteome in milk changed upon incubation at 4°C for 5 days and included increased levels of 17 proteins, several of which confer functions in stress tolerance (AddB, UvrC, RecA, and DnaJ). However, even with the activation of stress responses in either milk or MRS, L. delbrueckii subsp. bulgaricus did not survive passage through the murine digestive tract. These findings inform efforts to understand how L. delbrueckii subsp. bulgaricus is adapted to the dairy environment and its implications for its health-benefiting properties in the human digestive tract. IMPORTANCE Lactobacillus delbrueckii subsp. bulgaricus has a long history of use in yogurt production. Although commonly cocultured with Streptococcus salivarius subsp. thermophilus in milk, fundamental knowledge of the adaptive responses of L. delbrueckii subsp. bulgaricus to the dairy environment and the consequences of those responses on the use of L. delbrueckii subsp.
bulgaricus as a probiotic remain to be elucidated. In this study, we identified proteins of L. delbrueckii subsp. bulgaricus LBB.B5 that are synthesized in higher quantities in milk at growth-conducive and non-growth-conducive (refrigeration) temperatures compared to laboratory culture medium, and further examined whether those L. delbrueckii subsp. bulgaricus cultures were affected differently in their capacity to survive transit through the murine digestive tract. This work provides novel insight into how a major, food-adapted microbe responds to its primary habitat. Such knowledge can be applied to improve starter culture and yogurt production and to elucidate matrix effects on probiotic performance. PMID:28951887
20 CFR 10.7 - What forms are needed to process claims under the FECA?
Code of Federal Regulations, 2011 CFR
2011-04-01
.... Form No. Title (1) CA-1 Federal Employee's Notice of Traumatic Injury and Claim for Continuation of Pay... Traumatic Injury or Occupational Disease (8) CA-7a Time Analysis Form (9) CA-7b Leave Buy Back (LBB... claims under the FECA? (a) Notice of injury, claims and certain specified reports shall be made on forms...
Structural influence of mixed transition metal ions on lithium bismuth borate glasses
NASA Astrophysics Data System (ADS)
Yadav, Arti; Dahiya, Manjeet S.; Hooda, A.; Chand, Prem; Khasa, S.
2017-08-01
Lithium bismuth borate glasses containing mixed transition metals, with compositions 7CoO·23Li2O·20Bi2O3·50B2O3 (CLBB), 7V2O5·23Li2O·20Bi2O3·50B2O3 (VLBB) and x(2CoO·V2O5)·(30 - x)Li2O·20Bi2O3·50B2O3 (x = 0.0 (LBB) and x = 2.0, 5.0, 7.0, 10.0 mol% (CVLBB1-4)), were synthesized via the melt-quench route. The synthesized compositions were investigated for their physical properties using density (D) and molar volume (Vm), thermal properties by analyzing DSC/TG thermographs, structural properties using IR absorption spectra in the mid-IR range, and optical properties using UV-Vis-NIR spectroscopy. The Electron Paramagnetic Resonance (EPR) spectra of the vanadyl and cobalt ions were analyzed to study compositional effects on the spin-Hamiltonian parameters. The nonlinear variations in physical properties indicate a strong structural influence of the Co/V oxides on the glassy matrix. The compositional variations in the characteristic temperatures (glass transition temperature Tg, glass crystallization temperature Tp and glass melting temperature Tm) reveal that Tg for the CLBB glass sample is lower than that of the pure lithium bismuth borate (LBB) glass sample, whereas Tg for sample VLBB is higher than that of LBB. The increase in Tg (as compared with LBB) with enhanced substitution of mixed transition metal oxides (2CoO·V2O5) shows a progressive structural modification of the bismuth borate matrix. These predictions are well corroborated by the corresponding compositional trends of Tp and Tm. FTIR studies reveal that Co2+ and VO2+ ions lead to structural rearrangements through the conversion of three-coordinated boron into four-coordinated boron, thereby reducing the number of non-bridging oxygen atoms. Bismuth is found to exist in [BiO6] octahedral units only, whereas boroxol rings are not present in the glass network.
The theoretical optical basicity (Λth) and the corresponding oxide ion polarizability (αO2-) were also calculated to investigate the oxygen covalency of the glass matrix. Trends in both parameters suggest an increase in ionic bonding on substitution of the divalent transition metal cations, causing greater bond compaction in the glass structure. The UV-Vis-NIR spectra suggest that cobalt exists as Co2+ in octahedral coordination in the glass network. The inter-electronic repulsion parameter and crystal field splitting energy were evaluated to understand the site symmetry around the Co2+ ion. X-band EPR spectra suggest that vanadium (V4+) exists as VO2+ ions in octahedral coordination with tetragonal compression. The spin-Hamiltonian g- and A-values of the VO2+ ions in the glass were calculated. For sample CLBB, two resonance lines attributed to octahedral symmetry around Co2+ ions were observed in the EPR spectrum.
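The theoretical optical basicity calculation described above follows the standard Duffy-Ingram scheme (Λth as a composition-weighted sum of component basicities, with the oxide ion polarizability obtained from Duffy's relation Λ = 1.67(1 − 1/αO2-)). A minimal sketch, assuming illustrative component basicity values rather than the paper's own inputs:

```python
def theoretical_basicity(fractions, basicities):
    """Lambda_th as the mole-fraction-weighted sum of component optical
    basicities (Duffy-Ingram scheme); `fractions` should sum to 1."""
    return sum(fractions[ox] * basicities[ox] for ox in fractions)

def oxide_ion_polarizability(lam_th):
    """Duffy relation Lambda = 1.67 * (1 - 1/alpha_O2-), inverted for alpha."""
    return 1.0 / (1.0 - lam_th / 1.67)

# Illustrative (assumed) optical basicities -- not values from the paper.
basicities = {"Li2O": 1.00, "Bi2O3": 1.19, "B2O3": 0.42}
lbb_composition = {"Li2O": 0.30, "Bi2O3": 0.20, "B2O3": 0.50}

lam = theoretical_basicity(lbb_composition, basicities)
alpha = oxide_ion_polarizability(lam)
```

A higher Λth (more basic oxides in the network) maps directly to a larger oxide ion polarizability, which is the trend the abstract uses to argue for increased ionic bonding.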
Research Design for the Chief Joseph Dam Cultural Resources Project.
1984-01-01
Walkling-Ribeiro, Markus; Anany, Hany; Griffiths, Mansel W
2015-01-01
Pulsed electric fields (PEF), heat-assisted PEF (H-PEF), and virulent bacteriophage (VP) are non-thermal techniques for pathogen inactivation in liquids that were investigated individually and in combination (PEF/VP, H-PEF/VP) to control enterohemorrhagic Escherichia coli (EHEC) O157:H7 in Luria-Bertani broth (LBB) and Ringer's solution (RS). Treated cells were subsequently incubated under refrigeration (4°C) and temperature-abuse conditions (12°C) for 5 days. When EHEC cells grown in LBB were subjected to non-thermal processing and subsequently stored at 12°C for 5 days, count reductions of between 0.1 and 0.6 log cycles were observed, and following storage at 4°C the decrease in counts varied between 0.2 and 1.1 log10. For bacterial cells suspended in RS, values ranged from 0.1 to ≥3.9 log cycles at both storage temperatures. The most effective treatments were H-PEF and H-PEF/VP, both producing a >3.4 log cycle reduction of cells suspended in non-nutrient RS. Analysis of EHEC recovery on selective and non-selective media indicated no occurrence of sub-lethal damage in VP-, PEF/VP-, and H-PEF/VP-treated cells. The findings indicate that combining PEF and lytic phage may represent a suitable alternative to conventional fluid decontamination following further process optimization. © 2014 American Institute of Chemical Engineers.
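The log-cycle reductions quoted above are simply the base-10 logarithm of the ratio of viable counts before and after a treatment or storage step. A minimal sketch (the counts used are illustrative, not data from the study):

```python
import math

def log_reduction(count_before, count_after):
    """Log10 reduction in viable count across a treatment/storage step."""
    return math.log10(count_before / count_after)

# e.g. a drop from 1e7 to 4e3 CFU/ml is about a 3.4 log cycle reduction:
r = log_reduction(1e7, 4e3)
```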
Optical and Physical Investigations of Lanthanum Bismuth Borate glasses doped with Ho2O3
NASA Astrophysics Data System (ADS)
Ramesh, P.; Jagannath, G.; Eraiah, B.; Kokila, M. K.
2018-02-01
Holmium-doped 10La2O3-15Bi2O3-(75-x)B2O3 (Ho3+: LBB) glasses were prepared by the melt-quench technique, and the impact of holmium ion concentration on the optical and physical properties of the glasses was examined. The Ho3+-dependent density, molar volume, refractive index, rare earth ion concentration, polaron radius, inter-ionic distance, field strength and energy band gap were calculated and tabulated. The amorphous nature of all the glasses was confirmed by XRD patterns. The room temperature (RT) UV-Vis absorption spectrum of the glass doped with 1 mol% Ho2O3 exhibits eight prominent bands, centred at 895, 641, 537, 486, 472, 467, 451 and 416 nm, due to transitions from the ground state to various excited states. The results show that the density increases and the molar volume decreases with increasing Ho2O3 concentration, which consequently generates more non-bridging oxygens (NBOs) in the glass matrix. The Urbach energy increases with holmium concentration, which exemplifies the degree of disorder present in the LBB glasses. The considerable increase in field strength observed in these glasses is attributed to the occurrence of a strong bridge between Ho3+ and B ions, possibly due to the displacement between Ho3+ and the oxygen atoms generated from the conversion of BO3 to BO4 units.
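The tabulated quantities above follow from the rare-earth ion concentration N via relations that are standard in the glass-physics literature: polaron radius rp = (1/2)(π/6N)^(1/3), inter-ionic distance ri = N^(-1/3), and field strength F = Z/rp². A sketch under those standard definitions (the numerical inputs are illustrative, not the paper's data):

```python
import math

AVOGADRO = 6.022e23

def ion_concentration(mol_percent, density_g_cm3, avg_mol_weight):
    """Dopant ions per cm^3: N = (mol% * density * N_A) / (M_avg * 100)."""
    return mol_percent * density_g_cm3 * AVOGADRO / (avg_mol_weight * 100.0)

def polaron_radius(n_per_cm3):
    """r_p = (1/2) * (pi / 6N)^(1/3), in cm."""
    return 0.5 * (math.pi / (6.0 * n_per_cm3)) ** (1.0 / 3.0)

def inter_ionic_distance(n_per_cm3):
    """r_i = N^(-1/3), in cm."""
    return (1.0 / n_per_cm3) ** (1.0 / 3.0)

def field_strength(valence, rp_cm):
    """F = Z / r_p^2."""
    return valence / rp_cm ** 2
```

These relations reproduce the qualitative trend in the abstract: as N rises with Ho2O3 content, rp shrinks and the field strength F grows.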
Hassan, Ali H M; Ako, Junya; Waseda, Katsuhisa; Honda, Yasuhiro; Zeller, Thomas; Leon, Martin B; Fitzgerald, Peter J
2010-01-01
The purpose of this study was to evaluate the mechanism of luminal gain with a novel atheroablation system (Pathway PV) for the treatment of peripheral artery disease using intravascular ultrasound (IVUS). The atherectomy system is a rotational atherectomy device that employs expandable rotating blades with ports that allow flushing and aspiration of plaque material or thrombus. In this first-in-man clinical study, IVUS analysis was available in 6 patients with lower limb ischemia treated with this device. Treatment results were assessed using IVUS pre- and post-atherectomy. Lumen beyond burr size (LBB) was defined as the lumen gain divided by the estimated burr area determined from the burr size. IVUS analysis was available in six patients (superficial femoral artery n=3, popliteal artery n=2, posterior tibial artery n=1). Atheroablation achieved a significant increase in lumen area (LA) (pre-intervention 3.9+/-0.4, post-atheroablation 8.0+/-1.7 mm(2), P<.05) and a significant reduction in plaque area (27.5+/-4.0 vs. 23.7+/-3.1 mm(2), P=.001), while there was no change in vessel area (31.3+/-4.2 vs. 32.1+/-2.8 mm(2), P=.4). LBB was 57.4+/-51.3%. This novel rotational aspiration atherectomy device achieved significant luminal gain by debulking in the absence of vessel stretching. The LA was greater than the burr-sized lumen expectancy at cross-sections along the treated segments, suggesting a complementary role of aspiration in luminal gain in atherosclerotic peripheral artery lesions.
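Taking the abstract's definition literally (lumen gain divided by the estimated burr cross-sectional area), the LBB metric can be sketched as below; the burr diameter used is an illustrative assumption, not a value reported in the study:

```python
import math

def lumen_beyond_burr(pre_lumen_mm2, post_lumen_mm2, burr_diameter_mm):
    """LBB (%): IVUS lumen gain divided by the burr's estimated
    cross-sectional area, per the definition quoted in the abstract."""
    burr_area = math.pi * (burr_diameter_mm / 2.0) ** 2
    lumen_gain = post_lumen_mm2 - pre_lumen_mm2
    return 100.0 * lumen_gain / burr_area

# With the mean lumen areas reported (3.9 -> 8.0 mm^2) and an assumed
# 2.1 mm burr diameter:
lbb_pct = lumen_beyond_burr(3.9, 8.0, 2.1)
```

A value above 100% would indicate a lumen gain exceeding the burr's own footprint, which is the paper's argument for a complementary aspiration effect.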
Effects of weld residual stresses on crack-opening area analysis of pipes for LBB applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, P.; Rahman, S.; Wilkowski, G.
1997-04-01
This paper summarizes four different studies undertaken to evaluate the effects of weld residual stresses on the crack-opening behavior of a circumferential through-wall crack in the center of a girth weld. The effect of weld residual stress on the crack-opening-area and leak-rate analyses of a pipe is not well understood. There are no simple analyses to account for these effects, and, therefore, they are frequently neglected. The four studies involved the following efforts: (1) Full-field thermoplastic finite element residual stress analyses of a crack in the center of a girth weld, (2) A comparison of the crack-opening displacements from amore » full-field thermoplastic residual stress analysis with a crack-face pressure elastic stress analysis to determine the residual stress effects on the crack-opening displacement, (3) The effects of hydrostatic testing on the residual stresses and the resulting crack-opening displacement, and (4) The effect of residual stresses on crack-opening displacement with different normal operating stresses.« less
Enzyme Mini-Test for Field Identification of Leishmania Isolates from U.S. Military Personnel.
1986-05-15
Bräuer, S. L.; Adams, C.; Kranzler, K.; Murphy, D.; Xu, M.; Zuber, P.; Simon, H. M.; Baptista, A. M.; Tebo, B. M.
2017-01-01
Measurements of dissolved, ascorbate-reducible and total Mn by ICP-OES revealed significantly higher concentrations during estuarine turbidity maxima (ETM) events, compared with non-events, in the Columbia River. Most probable number (MPN) counts of Mn-oxidizing or Mn-reducing heterotrophs were not statistically different from those of other heterotrophs (10³–10⁴ cells ml⁻¹) when grown in defined media, but counts of Mn oxidizers were significantly lower in nutrient-rich medium (13 cells ml⁻¹). MPN counts of Mn oxidizers were also significantly lower on Mn(III)-pyrophosphate and glycerol (21 cells ml⁻¹). Large numbers of Rhodobacter spp. were cultured from dilutions of 10⁻² to 10⁻⁵, and many of these were capable of Mn(III) oxidation. Up to c. 30% of the colonies tested LBB positive, and all 77 of the successfully sequenced LBB-positive colonies (of varying morphology) yielded sequences related to Rhodobacter spp. qPCR indicated that a cluster of Rhodobacter isolates and closely related strains (95–99% identity) represented approximately 1–3% of the total Bacteria, consistent with clone library results. Copy numbers of SSU rRNA genes for either Rhodobacter spp. or Bacteria were four- to eightfold greater during ETM events compared with non-events. Strains of a Shewanella sp. were retrieved from the highest dilutions (10⁻⁵) of Mn reducers, and were also capable of Mn oxidation. The SSU rRNA gene sequences from these strains shared a high identity score (98%) with sequences obtained in clone libraries. Our results support previous findings that ETMs are zones of high microbial activity. Results indicated that Shewanella and Rhodobacter species were present at environmentally relevant concentrations, and further demonstrated that a large proportion of culturable bacteria, including Shewanella and Rhodobacter spp., were capable of Mn cycling in vitro. PMID:20977571
Mazurowski, Artur; Frieske, Anna; Kokoszynski, Dariusz; Mroczkowski, Sławomir; Bernacki, Zenon; Wilkanowska, Anna
2015-01-01
The main objective of the study was to assess the polymorphism in intron 2 of the GH gene and its association with selected morphological traits (body weight, BW; length of trunk with neck, LTN; length of trunk, LT; chest girth, CG; length of breast bone, LBB; length of shank, LS). Polymorphism in intron 2 of the GH gene was evaluated in four duck populations (Pekin ducks AF51, Muscovy ducks from CK and CRAMMLCFF mothers, and Mulard ducks). Genetic polymorphism was determined with the PCR-RFLP method using the BsmFI restriction enzyme. In the studied duck sample, two alleles (GH(C) and GH(T)) and three genotypes (GH/TT, GH/CT, GH/CC) were found at the GH/BsmFI locus. In both groups of Muscovies and in Mulards, the dominant allele was GH(T); in contrast, in Pekin ducks AF51 the frequencies of the two alleles were similar. The most frequent genotype in the examined ducks was GH/TT. In Pekin ducks AF51 three genotypes were observed, while in Mulard ducks and in male Muscovy ducks from the CK mother, two genotypes (GH/TT and GH/CT) were identified. Muscovy duck females from the CK mother and all males and females of Muscovy ducks from the CRAMMLCFF mother were monomorphic, with only the GH/TT genotype detected. The results showed that males of Pekin duck AF51 with the GH/TT genotype were characterized by a higher (P < 0.01) BW value than those with the GH/CC and GH/CT genotypes. In females of Pekin ducks AF51 the same trend was observed; individuals with the GH/TT genotype were superior (P < 0.05 and P < 0.01) to birds with the two other detected genotypes with respect to BW, CG, LBB and LS. In the case of Mulards, ducks with the GH/TT genotype were distinguished by higher values of all evaluated traits compared to ducks with GH/CT and GH/CC genotypes; however, most of the recorded differences were not significant. The only trait markedly impacted (P < 0.05) by the polymorphism of GH gene intron 2 was the LS value in males.
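Allele frequencies of the kind reported above follow directly from genotype counts. A minimal sketch (the counts are illustrative, not the study's data):

```python
def allele_frequencies(n_tt, n_ct, n_cc):
    """Frequencies of the GH(T) and GH(C) alleles from counts of the
    GH/TT, GH/CT and GH/CC genotypes at the GH/BsmFI locus."""
    total_alleles = 2 * (n_tt + n_ct + n_cc)
    f_t = (2 * n_tt + n_ct) / total_alleles  # each TT carries two T alleles
    return f_t, 1.0 - f_t

# Illustrative counts: 50 TT, 40 CT, 10 CC birds.
f_t, f_c = allele_frequencies(50, 40, 10)
```

A monomorphic group (only GH/TT detected) corresponds to f_t = 1.0, as for the CRAMMLCFF Muscovy ducks described above.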
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
2017-09-01
The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence — a research project (CW2 Stockham, Braden E., Jul 2017) examining the use of forensic science resources, law enforcement methodologies and procedures, and basic investigative training in Army counterintelligence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loon, W.M.G.M. van; Hermens, J.L.M.
1994-12-31
A large part of all aquatic pollutants can be classified as narcosis-type (baseline toxicity) chemicals. Many chemicals contribute to a joint baseline aquatic toxicity even at trace concentrations. A novel surrogate parameter, which simulates bioconcentration of hydrophobic substances from water and estimates internal molar concentrations, has been explored by Verhaar et al. These estimated biological concentrations can be used to predict narcosis-type toxic effects using the Lethal Body Burden (LBB) concept. The authors applied this toxicological-analytical concept to river water, and some recent technological developments and field results are pointed out. The simulation of bioconcentration is performed by extracting water samples with Empore(trademark) disks. The authors developed two extraction procedures: laboratory extraction and field extraction. Molar concentration measurements are performed using vapor pressure osmometry, GC-FID and GC-MS. Results on the molar concentrations of hydrophobic compounds that can be bioaccumulated from several Dutch river systems will be presented.
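The LBB reasoning above can be sketched numerically: estimate internal molar concentrations from water concentrations via a bioconcentration factor, sum them across chemicals (narcosis toxicity is treated as additive in molar body burden), and compare the total against a lethal body burden threshold. The Kow-based BCF approximation, the 5 mmol/kg threshold, and the 1 L ≈ 1 kg tissue equivalence below are all simplifying assumptions for illustration, not values from the paper:

```python
def internal_molar_concentration(c_water_mol_per_l, log_kow):
    """One-compartment estimate: C_internal = BCF * C_water, with BCF
    crudely approximated by Kow for hydrophobic narcosis-type chemicals."""
    return (10.0 ** log_kow) * c_water_mol_per_l

def exceeds_lethal_body_burden(chemicals, lbb_mol_per_kg=0.005):
    """Sum the estimated internal molar concentrations of all chemicals
    (list of (C_water in mol/l, log Kow) pairs) and compare against an
    assumed LBB threshold; 1 l of tissue is equated to 1 kg."""
    total = sum(internal_molar_concentration(c, k) for c, k in chemicals)
    return total >= lbb_mol_per_kg
```

The additivity is the point of the abstract's opening sentences: many chemicals at individually harmless trace concentrations can jointly exceed the baseline-toxicity body burden.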
A mimetic finite difference method for the Stokes problem with selected edge bubbles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipnikov, K; Berirao, L
2009-01-01
A new mimetic finite difference method for the Stokes problem is proposed and analyzed. The unstable P1-P0 discretization is stabilized by adding a small number of bubble functions to selected mesh edges. A simple strategy for selecting such edges is proposed and verified with numerical experiments. Discretization schemes for the Stokes and Navier-Stokes equations must satisfy the celebrated inf-sup (or LBB) stability condition, which implies a balance between the discrete spaces for velocity and pressure. In finite elements, this balance is frequently achieved by adding bubble functions to the velocity space. The goal of this article is to show that the stabilizing edge bubble functions can be added to only a small set of mesh edges. This results in a smaller algebraic system and potentially in faster calculations. We employ the mimetic finite difference (MFD) discretization technique, which works for general polyhedral meshes and can accommodate a non-uniform distribution of stabilizing bubbles.
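One concrete symptom of an inf-sup (LBB) failure in a pair like P1-P0 is that the discrete divergence operator B admits spurious pressure modes: ker(Bᵀ) contains more than the constant pressure, so rank(B) falls below (number of pressure dofs − 1). A small exact-arithmetic sketch of that rank check (the matrices in the tests are illustrative, not an actual Stokes discretization):

```python
from fractions import Fraction

def matrix_rank(rows):
    """Exact rank via Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in r] for r in rows]
    rank, row = 0, 0
    n_cols = len(m[0]) if m else 0
    for col in range(n_cols):
        # find a pivot in this column at or below the current row
        piv = next((r for r in range(row, len(m)) if m[r][col] != 0), None)
        if piv is None:
            continue
        m[row], m[piv] = m[piv], m[row]
        for r in range(len(m)):
            if r != row and m[r][col] != 0:
                f = m[r][col] / m[row][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[row])]
        row += 1
        rank += 1
    return rank

def has_spurious_pressure_modes(B):
    """B has one row per pressure dof. For a stable velocity-pressure
    pair, ker(B^T) holds only the constant pressure, i.e.
    rank(B) == n_pressure_dofs - 1; anything less signals instability."""
    return matrix_rank(B) < len(B) - 1
```

This rank deficiency is exactly what adding bubble functions to the velocity space repairs: the extra velocity dofs enlarge the range of B until only the constant pressure remains unresolved.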
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications
1991-12-01
programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
...dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: 1) correlation methodologies...data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES: 1. COMPONENT BUILD-UP IN DRAG: The new correlation can be used for an engineering
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
2016-09-15
78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level Design Methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
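The PFA structure described above (an analytical failure model evaluated under parameter uncertainty to produce a failure probability) can be sketched as a plain Monte Carlo loop. The life model and the lognormal load scatter below are illustrative stand-ins, not the methodology's actual engineering models:

```python
import random

def pfa_failure_probability(life_model, samplers, mission_life,
                            n=20000, seed=1):
    """Fraction of Monte Carlo trials in which the modeled life,
    evaluated at sampled uncertain parameters, falls short of the
    required mission life."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        params = {name: draw(rng) for name, draw in samplers.items()}
        if life_model(**params) < mission_life:
            failures += 1
    return failures / n

# Illustrative use: life inversely sensitive to a load with
# lognormal scatter (median 1, sigma 0.2).
samplers = {"load": lambda rng: rng.lognormvariate(0.0, 0.2)}
p_fail = pfa_failure_probability(lambda load: 100.0 / load ** 4,
                                 samplers, mission_life=100.0)
```

Updating such a distribution with test or flight experience, as the abstract describes, would be a Bayesian step layered on top of this sampling loop.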
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
End State: The Fallacy of Modern Military Planning
2017-04-06
operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology...cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no...strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane
1994-01-01
An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion-system-generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced-maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing-approach-to-hover transition flight phase is presented, with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvement in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn, R.Y.S.; Bolmarcich, J.J.
The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and the OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may aid crop forecasting, the basic concepts of spectral signatures of vegetation, the methodology of LANDSAT data utilization in agriculture, and the application of INPE's (Institute for Space Research) remote sensing program in agriculture.
Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
The Methodology for Developing Mobile Agent Application for Ubiquitous Environment
NASA Astrophysics Data System (ADS)
Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi
This study presents a methodology that enables flexible and reusable development of mobile agent applications for mobility-aware indoor environments. The methodology, named the Workflow-Awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring the user's physical movement and coordinating various services. The SA performs additional, environment-dependent tasks to help the MA achieve efficient execution without compromising the application logic. "Workflow-awareness" (WFA) means that the SA knows the MA's execution-state transitions, so that the SA can provide the proper task at the proper time. A prototype implementation of the methodology makes practical use of AspectJ, which automates WFA by weaving communication modules into both the MA and the SA. The usefulness of the methodology is analyzed with respect to efficiency and software engineering aspects. Regarding efficiency, the overhead of WFA is small relative to total execution time; from the software engineering viewpoint, WFA makes it possible to deploy one application in various situations.
Application of ion chromatography in pharmaceutical and drug analysis.
Jenke, Dennis
2011-08-01
Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents the new methodology of management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology was applied at the Port of Dunkirk (France). A characterization of the sediments of this site allowed a zoning of the port to be established into zones of probable sediment homogeneity. Moreover, sources of pollution were identified with a view to prevention. Routes for waste valorisation were also developed to meet regional needs, from the standpoint of competitive and territorial intelligence. Their development required a pooling of resources between professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool was used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications confirmed the relevance of this methodology for the management of dredging operations.
NASA Astrophysics Data System (ADS)
Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel
2013-09-01
Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and Scrum, in other areas. We analyzed these methodologies and concluded that they can be applied to develop the software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may be used, in general, by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
Using Modern Methodologies with Maintenance Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.
2014-01-01
Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either Class B (mission critical) or Class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of what everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority; the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: MPS has many software applications in maintenance, team members working on disparate applications, many users, and work that is interruptible by mission needs, issues, and requirements. In order to use Scrum, the methodology needed to be adapted to MPS; Scrum was chosen because it is adaptable. This paper is about the development of a process for using Scrum, a newer development methodology, with a team that works on disparate, interruptible tasks across multiple software applications.
Grounded theory as a method for research in speech and language therapy.
Skeat, J; Perry, A
2008-01-01
The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature, both generally describing qualitative research, and detailing its applicability to health practice(s). However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. To describe the methodology of grounded theory, and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main difference between two major 'modes'. The article supports the application of a theory-building approach in the profession, which is sometimes complex to learn and apply, but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapists. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.
Global-local methodologies and their application to nonlinear analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Application of Fuzzy Logic to Matrix FMECA
NASA Astrophysics Data System (ADS)
Shankar, N. Ravi; Prabhu, B. S.
2001-04-01
A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method: fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved to develop a pictorial representation retaining all relevant qualitative and quantitative information about the relationships among several FMEA elements. The methodology is demonstrated by application to an illustrative example.
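The fuzzy-logic step the abstract describes, replacing the crisp RPN product with a rule-based criticality rank, can be sketched as below. The membership functions, the two rules, and the output ranks are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_criticality(severity, occurrence, detection):
    """Hypothetical sketch of a fuzzy criticality rank: rate each FMEA
    factor on a 1-10 scale, fire two illustrative rules (min as AND),
    and defuzzify by a weighted average of the rule outputs."""
    high = lambda x: tri(x, 5, 10, 15)   # 'high' peaks at 10
    low = lambda x: tri(x, -5, 1, 6)     # 'low' peaks at 1
    # Rule 1: high severity AND high occurrence -> critical (rank 9).
    r1 = min(high(severity), high(occurrence))
    # Rule 2: low occurrence AND easy detection -> minor (rank 2).
    r2 = min(low(occurrence), low(detection))
    if r1 + r2 == 0:
        return 5.0                       # neutral rank when no rule fires
    return (9 * r1 + 2 * r2) / (r1 + r2)
```

Unlike the crisp RPN (severity x occurrence x detection), the fuzzy rank distinguishes failure modes whose RPN products happen to coincide but whose factor profiles differ.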
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... VIRGIN ISLANDS General Financial Eligibility Requirements and Options § 436.601 Application of financial... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
NASA Technical Reports Server (NTRS)
Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.
2003-01-01
This paper presents a reliability evaluation methodology for obtaining statistical reliability information on memory chips for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I refers to the possibility of using the theory and methodology of Project Outcomes for problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of
NASA Astrophysics Data System (ADS)
Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David
2014-01-01
Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
Weltje, Lennart; Janz, Philipp; Sowig, Peter
2017-12-01
This paper presents a model to predict acute dermal toxicity of plant protection products (PPPs) to terrestrial amphibian life stages from (regulatory) fish data. By combining existing concepts, including interspecies correlation estimation (ICE), allometric relations, lethal body burden (LBB) and bioconcentration modelling, an equation was derived that predicts the amphibian median lethal dermal dose (LD50) from standard acute toxicity values (96-h LC50) for fish and bioconcentration factors (BCF) in fish. Where possible, fish BCF values were corrected to 5% lipid, and to parent compound. Then, BCF values were adjusted to an exposure duration of 96 h, in case steady state took longer to be achieved. The derived correlation equation is based on 32 LD50 values from acute dermal toxicity experiments with 15 different species of anuran amphibians, comprising 15 different PPPs. The developed ICE model can be used in a screening approach to estimate the acute risk to amphibian terrestrial life stages from dermal exposures to PPPs with organic active substances. This has the potential to reduce unnecessary testing of vertebrates.
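The structure of such an ICE prediction, an LD50 estimated from the fish LC50 and BCF via a log-log correlation, can be sketched as follows. The coefficients a and b below are placeholders only; the fitted values belong to the study and are not reproduced here.

```python
def predict_amphibian_ld50(fish_lc50_mg_per_l, fish_bcf, a=1.0, b=0.9):
    """Illustrative sketch of an ICE-style prediction: the product of the
    fish 96-h LC50 and the fish BCF serves as an internal-dose (lethal
    body burden) surrogate, which a power-law correlation maps to an
    amphibian dermal LD50.

    a and b are hypothetical placeholder coefficients, NOT the values
    fitted in the paper."""
    internal_dose = fish_lc50_mg_per_l * fish_bcf  # LBB proxy
    return a * internal_dose ** b
```

A power law in the original scale is equivalent to a linear regression after log transformation, which is the usual form of ICE correlation equations.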
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
Turbofan engine control system design using the LQG/LTR methodology
NASA Technical Reports Server (NTRS)
Garg, Sanjay
1989-01-01
Application of the linear-quadratic-Gaussian with loop-transfer-recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution a...
Multilevel Modeling: A Review of Methodological Issues and Applications
ERIC Educational Resources Information Center
Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.
2009-01-01
This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…
An Application of the Methodology for Assessment of the Sustainability of Air Transport System
NASA Technical Reports Server (NTRS)
Janic, Milan
2003-01-01
An assessment and operationalization of the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. As part of the academic efforts to properly address the problem, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes a methodology for assessing sustainability and its potential application. The methodology consists of indicator systems relating to the operational, economic, social and environmental dimensions of air transport system performance. The particular indicator systems are relevant to particular actors such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. To apply the methodology, specific cases are selected to estimate the particular indicators and thus to assess the system's sustainability under given conditions.
An object-oriented approach for harmonization of multimedia markup languages
NASA Astrophysics Data System (ADS)
Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay
2003-12-01
An object-oriented methodology is proposed in this research to harmonize several different markup languages. First, we adopt the Unified Modeling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology, which can be generalized to various application domains.
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are necessary tools for depicting possible general increases or decreases in a given time series. There are many trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of probability distribution function. The validity of the method is demonstrated through extensive Monte Carlo simulation and comparison with existing trend identification methodologies. The methodology is applied to a set of annual daily-extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
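The crossing criterion described above can be sketched numerically: detrend the series with a candidate trend line through the centroid, count up-crossings of the residuals about zero, and keep the slope that maximizes the count. This is a minimal sketch of the idea, not the paper's full procedure.

```python
def crossing_count(series, slope):
    """Count up-crossings of the series about a trend line with the
    given slope passing through the centroid of the data."""
    n = len(series)
    tbar = (n - 1) / 2
    ybar = sum(series) / n
    detrended = [y - (ybar + slope * (t - tbar))
                 for t, y in enumerate(series)]
    # An up-crossing: residual goes from negative to non-negative.
    return sum(1 for a, b in zip(detrended, detrended[1:]) if a < 0 <= b)

def best_trend_slope(series, candidates):
    """Sketch of the crossing criterion: pick the candidate slope whose
    trend line yields the maximum number of up-crossings."""
    return max(candidates, key=lambda s: crossing_count(series, s))
```

With the correct slope removed, the residuals oscillate about zero and cross it often; an over- or under-estimated slope leaves a residual trend that suppresses crossings.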
Simoens, Steven
2013-01-01
Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
Railroad classification yard design methodology study, Elkhart Yard Rehabilitation: a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...
Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo
2010-01-01
Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
Security Quality Requirements Engineering (SQUARE) Methodology
2005-11-01
Elicitation techniques investigated for use with SQUARE included misuse cases [Jacobson 92], Soft Systems Methodology (SSM) [Checkland 89], and Quality Function Deployment (QFD) [QFD 05], as well as Joint Application Development and the Accelerated Requirements Method [Wood 89, Hubbard 99].
[Checkland 89] Checkland, Peter. Soft Systems Methodology. Rational Analysis for a Problematic World. New York, NY: John Wiley & Sons.
Radioactive waste disposal fees-Methodology for calculation
NASA Astrophysics Data System (ADS)
Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich
2014-11-01
This paper summarizes the methodological approach used to calculate the fees for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology is based on simulating the cash flows related to operating the waste disposal system. The paper also demonstrates the methodology's application under the conditions of the Czech Republic.
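A cash-flow-based fee of this kind is typically a levelized cost: the fee per unit of waste that makes discounted fee revenue cover the discounted costs of the disposal system. A minimal sketch, with illustrative cash-flow figures rather than the paper's data:

```python
def disposal_fee_per_unit(annual_costs, annual_waste, discount_rate):
    """Levelized-fee sketch: present value of disposal-system costs
    divided by the present value of waste quantities received, so that
    discounted fee revenue equals discounted costs."""
    pv_costs = sum(c / (1 + discount_rate) ** t
                   for t, c in enumerate(annual_costs, start=1))
    pv_waste = sum(w / (1 + discount_rate) ** t
                   for t, w in enumerate(annual_waste, start=1))
    return pv_costs / pv_waste
```

Because repository costs fall decades after the waste is received, the discount rate strongly shifts the fee; a simulation over uncertain cost and waste streams, as the paper describes, yields a fee distribution rather than a point value.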
ERIC Educational Resources Information Center
Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus
2017-01-01
The paper aims to present a methodology of learning personalisation based on applying the Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of a systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
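The RDF "subject-predicate-object" triple model the abstract refers to can be illustrated with plain tuples; the resource names and the matching rule below are hypothetical, chosen only to show how triples support personalisation queries.

```python
# Minimal sketch of the RDF triple model: each fact is a
# (subject, predicate, object) tuple. URIs abbreviated with an
# illustrative 'ex:' prefix.
triples = {
    ("ex:Student42", "ex:hasLearningStyle", "ex:Visual"),
    ("ex:Lesson7", "ex:suitsLearningStyle", "ex:Visual"),
    ("ex:Lesson9", "ex:suitsLearningStyle", "ex:Auditory"),
}

def recommend(student, triples):
    """Return resources whose declared learning style matches one of
    the student's learning styles."""
    styles = {o for s, p, o in triples
              if s == student and p == "ex:hasLearningStyle"}
    return sorted(s for s, p, o in triples
                  if p == "ex:suitsLearningStyle" and o in styles)
```

In practice such data would be stored in an RDF triple store and queried with SPARQL; the tuple form above shows only the underlying data model.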
Gu, Huidong; Wang, Jian; Aubry, Anne-Françoise; Jiang, Hao; Zeng, Jianing; Easter, John; Wang, Jun-sheng; Dockens, Randy; Bifano, Marc; Burrell, Richard; Arnold, Mark E
2012-06-05
A methodology for the accurate calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) assays and its application in supporting microdose absolute bioavailability studies are reported for the first time. For simplicity, this calculation methodology and the strategy to minimize isotopic interference are demonstrated using a simple molecular entity, then applied to actual development drugs. The exact isotopic interferences calculated with this methodology were often much less than the traditionally used, overestimated isotopic interferences based simply on molecular isotope abundance. One application of the methodology is the selection of a stable isotopically labeled internal standard (SIL-IS) for an LC-MS/MS bioanalytical assay. The second application is the selection of an SIL analogue for use in intravenous (i.v.) microdosing for the determination of absolute bioavailability. In the case of microdosing, the traditional approach of calculating isotopic interferences can result in selecting a labeling scheme that overlabels the i.v.-dosed drug or leads to incorrect conclusions on the feasibility of using an SIL drug and analysis by LC-MS/MS. The methodology presented here can guide the synthesis by accurately calculating the isotopic interferences when labeling at different positions, using different selective reaction monitoring (SRM) transitions, or adding more labeling positions. This methodology has been successfully applied to the selection of the labeled i.v.-dosed drugs for use in two microdose absolute bioavailability studies, before initiating the chemical synthesis. With this methodology, significant time and cost savings can be achieved in supporting microdose absolute bioavailability studies with stable isotopically labeled drugs.
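The starting point for any such calculation is the isotope pattern of a molecular formula, obtained by convolving per-element isotope distributions; the abundance at the mass shift of the labeled analogue is the naive interference estimate that the paper's exact method refines. A simplified nominal-mass sketch (a real assay calculation would also account for the specific SRM transitions):

```python
def convolve(d1, d2):
    """Convolve two {mass_shift: abundance} isotope distributions."""
    out = {}
    for m1, a1 in d1.items():
        for m2, a2 in d2.items():
            out[m1 + m2] = out.get(m1 + m2, 0.0) + a1 * a2
    return out

def isotope_pattern(counts, iso):
    """Nominal-mass isotope pattern of a formula, given per-element
    isotope abundances, e.g. iso['C'] = {0: 0.9893, 1: 0.0107}.

    counts maps element symbol -> number of atoms."""
    dist = {0: 1.0}  # start from the monoisotopic peak
    for element, n in counts.items():
        for _ in range(n):
            dist = convolve(dist, iso[element])
    return dist
```

For a drug labeled with n heavy atoms, the unlabeled analyte's abundance at mass shift n read from this pattern bounds the interference in the SIL channel, which is why adding labeling positions or choosing a different SRM transition can push the interference below the assay's tolerance.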
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to elicit expert opinion for quantifying uncertainties as probability distributions so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta
2006-01-01
Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019
Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)
NASA Technical Reports Server (NTRS)
1984-01-01
The HARDMAN methodology was applied to the various configurations of employment for an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operating and maintenance support addressed through the general support maintenance echelon.
Product environmental footprint in policy and market decisions: Applicability and impact assessment.
Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias
2015-07-01
In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology--a life cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods, while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-y pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict the ISO 14044 (2006) and ISO 14025 (2006). Others, such as prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them. © 2015 SETAC.
ERIC Educational Resources Information Center
Cheung, Alan C. K.; Slavin, Robert E.
2013-01-01
The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…
ERIC Educational Resources Information Center
Vitale, Michael R.; Romance, Nancy
Adopting perspectives based on applications of artificial intelligence proven in industry, this paper discusses methodological strategies and issues that underlie the development of such software environments. The general concept of an expert system is discussed in the context of its relevance to the problem of increasing the accessibility of…
Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y
2015-11-01
To gain insight into the methodology of different computer-aided design-computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects. We reviewed and analyzed the available literature pertaining to CAD-CAM for use in CMF reconstruction. We proposed a classification system of the techniques of implant and cutting, drilling, and/or guiding template design and manufacturing. The system consisted of 4 classes (I-IV). These classes combine techniques used for both the implant and template to most accurately describe the methodology used. Our classification system can be widely applied. It should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.
Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis
2016-12-24
The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in the last years has gained a lot of popularity in applications like access control, payment cards or logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol.
Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications
Fernández-Caramés, Tiago M.; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis
2016-01-01
The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in the last years has gained a lot of popularity in applications like access control, payment cards or logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol. PMID:28029119
Calibration Modeling Methodology to Optimize Performance for Low Range Applications
NASA Technical Reports Server (NTRS)
McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.
2010-01-01
Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full scale. However, in some applications enhanced performance is sought at the low end of the range, and expressing the accuracy as a percent of reading should therefore be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
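The percent-of-reading idea can be sketched as a weighted least-squares fit. The calibration data below are made up, and weighting by 1/y² is one common way to approximate a relative-error objective; it stands in for, and is not necessarily, the paper's exact statistical method:

```python
# Weighting each point by 1/y^2 makes the fit minimize *relative*
# (percent-of-reading) error rather than absolute (percent-of-full-scale)
# error, which favors accuracy at the low end of the range.
xs = [1.0, 2.0, 5.0, 10.0, 50.0, 100.0]   # applied loads (assumed data)
ys = [1.02, 1.98, 5.1, 10.1, 50.8, 98.5]  # instrument response (assumed data)

def wls_line(xs, ys, weights):
    """Closed-form weighted least-squares fit of y = a + b*x."""
    sw = sum(weights)
    mx = sum(w * x for w, x in zip(weights, xs)) / sw
    my = sum(w * y for w, y in zip(weights, ys)) / sw
    b = sum(w * (x - mx) * (y - my) for w, x, y in zip(weights, xs, ys)) / \
        sum(w * (x - mx) ** 2 for w, x in zip(weights, xs))
    return my - b * mx, b

# Uniform weights ~ percent-of-full-scale; 1/y^2 ~ percent-of-reading.
a_fs, b_fs = wls_line(xs, ys, [1.0] * len(xs))
a_rd, b_rd = wls_line(xs, ys, [1.0 / y**2 for y in ys])
print(f"full-scale fit:  y = {a_fs:.4f} + {b_fs:.4f} x")
print(f"per-reading fit: y = {a_rd:.4f} + {b_rd:.4f} x")
```

On this toy data the 1/y² fit tracks the lowest calibration point far more closely, which is exactly the low-range behavior the abstract motivates.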
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, the applicability of the Cofrentes licensing methodology had to be extended to the analysis of new transients. This is the case of the Total Loss of Feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of a Total Loss of Feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
DOT National Transportation Integrated Search
1977-06-01
Author's abstract: In this report, a methodology for analyzing general categorical data with misclassification errors is developed and applied to the study of seat belt effectiveness. The methodology assumes the availability of an original large samp...
Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio
2009-01-01
Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful for rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.
Empirical Distributional Semantics: Methods and Biomedical Applications
Cohen, Trevor; Widdows, Dominic
2009-01-01
Over the past fifteen years, a range of methods have been developed that are able to learn human-like estimates of the semantic relatedness between terms from the way in which these terms are distributed in a corpus of unannotated natural language text. These methods have also been evaluated in a number of applications in the cognitive science, computational linguistics and the information retrieval literatures. In this paper, we review the available methodologies for derivation of semantic relatedness from free text, as well as their evaluation in a variety of biomedical and other applications. Recent methodological developments, and their applicability to several existing applications are also discussed. PMID:19232399
Fault Modeling of Extreme Scale Applications Using Machine Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.
Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware, resulting in an error. This paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in an error. We present the design elements, such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.
Fault Modeling of Extreme Scale Applications Using Machine Learning
Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.; ...
2016-05-01
Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware, resulting in an error. This paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in an error. We present the design elements, such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.
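The predict-whether-a-fault-matters idea can be sketched with a toy classifier. The fault-signature features, the synthetic labeling rule, and the plain logistic-regression learner below are all stand-ins for illustration; real signatures and labels would come from fault-injection campaigns like the ones the paper describes:

```python
import random
from math import exp

random.seed(1)

def make_example():
    """Synthetic 'fault signature': flipped-bit position and whether the
    corrupted data is still live (will be read again)."""
    bit = random.randint(0, 63)
    live = random.random() < 0.5
    # ground-truth rule for the synthetic labels: faults in high bits of
    # live data are assumed to produce an application error
    label = 1 if (live and bit > 20) else 0
    return [1.0, bit / 63.0, 1.0 if live else 0.0], label  # [bias, features]

data = [make_example() for _ in range(2000)]

# plain logistic regression trained by stochastic gradient descent
w = [0.0, 0.0, 0.0]
for _ in range(300):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + exp(-z))
        w = [wi + 0.1 * (y - p) * xi for wi, xi in zip(w, x)]

accuracy = sum(
    (sum(wi * xi for wi, xi in zip(w, x)) > 0) == (y == 1) for x, y in data
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

A model like this lets the runtime invoke recovery only when the predicted error probability is high, instead of treating every multi-bit fault as fatal.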
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisoning and often the information on the involved substance is lacking. The great variety of pesticides that could be responsible for intoxication makes necessary the development of powerful and versatile analytical methodologies, which allows the identification of the unknown toxic substance. Here we developed a methodology for simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help in minimizing the cost of this type of chemical identification, maximizing the chances of identifying the pesticide involved. In the methodology that we present here, we use a liquid-liquid extraction, followed by one single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry, which is operated in the mode of multiple reaction monitoring. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Sparks, A N; Gadal, L; Ni, X
2015-08-01
The primary Lepidoptera pests of sweet corn (Zea mays L. convar. saccharata) in Georgia are the corn earworm, Helicoverpa zea (Boddie), and the fall armyworm, Spodoptera frugiperda (J. E. Smith). Management of these pests typically requires multiple insecticide applications from first silking until harvest, with commercial growers frequently spraying daily. This level of insecticide use presents problems for small growers, particularly for "pick-your-own" operations. Injection of oil into the corn ear silk channel 5-8 days after silking initiation has been used to suppress damage by these insects. Initial work with this technique in Georgia provided poor results. Subsequently, a series of experiments was conducted to evaluate the efficacy of silk channel injections as an application methodology for insecticides. A single application of synthetic insecticide, at greatly reduced per acre rates compared with common foliar applications, provided excellent control of Lepidoptera insects attacking the ear tip and suppressed damage by sap beetles (Nitidulidae). While this methodology is labor-intensive, it requires a single application of insecticide at reduced rates applied ∼2 wk prior to harvest, compared with potential daily applications at full rates up to the day of harvest with foliar insecticide applications. This methodology is not likely to eliminate the need for foliar applications because of other insect pests which do not enter through the silk channel or are not affected by the specific selective insecticide used in the silk channel injection, but would greatly reduce the number of applications required. This methodology may prove particularly useful for small acreage growers. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
2013-02-07
specific biosurveillance activities as well as clinical applications and alternative versions preformatted and categorized as 'high-tech' and 'low-tech' ...methodologies. Leishmaniasis...LHL assay and the need to develop novel and unique sample preparation methodologies. Application for patent protection of this DoD intellectual property is underway.
The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.
2006-01-01
This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandor, Debra; Chung, Donald; Keyser, David
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
NASA Astrophysics Data System (ADS)
Sharma, Amita; Sarangdevot, S. S.
2010-11-01
Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to the elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for modular design and implementation of real-world quality business software.
Evaluation of stormwater harvesting sites using multi criteria decision methodology
NASA Astrophysics Data System (ADS)
Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.
2018-07-01
Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and various related variables. This paper is aimed at developing a comprehensive methodology framework for evaluating stormwater harvesting sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential stormwater harvesting (SWH) sites using spatial characteristics in a GIS environment. In the second phase, an MCDA methodology is used for evaluating and ranking SWH sites in a multi-objective and multi-stakeholder environment. The paper briefly describes the first phase of the framework and focuses chiefly on the second. The application of the methodology is demonstrated through a case study of the local government area of the City of Melbourne (CoM), Australia, for the benefit of the wider water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites used to demonstrate the developed methodology. To reflect stakeholder interests, four stakeholder participant groups were identified: water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision-making scenarios. The major innovation of this work is the development and application of a comprehensive methodology framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective and multi-stakeholder environment. It is expected that the proposed methodology will provide water professionals and managers with better knowledge and reduce the subjectivity in the selection and evaluation of SWH sites.
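The PROMETHEE II net-flow ranking mentioned in this abstract can be sketched as follows. The site names, scores, and weights are illustrative, and the simple "usual" preference function (full preference whenever one site beats another on a criterion) stands in for whatever preference functions the study actually used:

```python
# Illustrative alternatives: each site scored on three performance measures,
# all to be maximized.
sites = {
    "Site A": [0.8, 0.3, 0.6],
    "Site B": [0.5, 0.9, 0.4],
    "Site C": [0.6, 0.5, 0.7],
}
weights = [0.5, 0.3, 0.2]  # one weight per performance measure

def net_flow(name):
    """PROMETHEE II net flow: phi(a) = (1/(n-1)) * sum_b [pi(a,b) - pi(b,a)],
    with the 'usual' criterion: pi gains a criterion's weight whenever the
    first site strictly beats the second on that criterion."""
    a = sites[name]
    n = len(sites)
    total = 0.0
    for other, b in sites.items():
        if other == name:
            continue
        pi_ab = sum(w for w, x, y in zip(weights, a, b) if x > y)
        pi_ba = sum(w for w, x, y in zip(weights, a, b) if y > x)
        total += pi_ab - pi_ba
    return total / (n - 1)

ranking = sorted(sites, key=net_flow, reverse=True)
print("PROMETHEE II ranking:", ranking)
```

Group scenarios like the WA/AC/CS/CL ones in the study can be modeled by re-running the ranking with each stakeholder group's own weight vector.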
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The objective is to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries.
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
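One ready-to-use indicator of the kind this abstract describes can be sketched as a precalculated per-test turn-around-time summary with simple anomaly flagging. The test names, times, and targets below are invented, not DB4US data:

```python
from statistics import median

# Hypothetical turn-around times (minutes) per laboratory test, as they
# might be consolidated from the data warehouse.
tat_minutes = {
    "glucose":    [35, 40, 38, 90, 41],
    "hemoglobin": [25, 28, 30, 27, 26],
    "troponin":   [55, 60, 61, 63, 62],
}
targets = {"glucose": 45, "hemoglobin": 45, "troponin": 60}  # assumed targets

# Precalculate the indicator (median TAT per test) so the dashboard can
# display it without recomputing, and flag tests exceeding their target.
indicators = {test: median(times) for test, times in tat_minutes.items()}
anomalies = [t for t, m in indicators.items() if m > targets[t]]

print("median TAT:", indicators)
print("over target:", anomalies)
```

Using the median rather than the mean keeps a single delayed sample (like the 90-minute glucose outlier) from masking or faking an anomaly, which matches the dashboard's goal of surfacing genuine resource problems.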
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
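The core PFA idea, propagating uncertainty about analysis parameters through an analytical failure model to obtain a failure-probability estimate, can be illustrated with a minimal Monte Carlo sketch; the toy fatigue-life model and parameter distributions are assumptions for illustration only, not the documented PFA software:

```python
import random

random.seed(1)

def cycles_to_failure(c, initial_flaw):
    # Toy stand-in for an analytical fatigue model: predicted life shrinks
    # with a larger initial flaw and a faster crack-growth coefficient c.
    return 1e4 / (c * initial_flaw)

def failure_probability(service_cycles, n=100_000):
    """Estimate P(predicted life < required service life) by sampling the
    uncertain model parameters, in the spirit of the PFA framework."""
    failures = 0
    for _ in range(n):
        c = random.lognormvariate(0.0, 0.3)      # uncertain growth coefficient
        flaw = random.lognormvariate(-1.0, 0.5)  # uncertain initial flaw size
        if cycles_to_failure(c, flaw) < service_cycles:
            failures += 1
    return failures / n

p = failure_probability(service_cycles=2e4)
```

In the full methodology, this prior estimate would then be modified by the prescribed statistical procedures to reflect accumulated test and flight experience.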
78 FR 68449 - Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With Multiple Chronic... applications for the ``AHRQ RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With...
Technology transfer methodology
NASA Technical Reports Server (NTRS)
Labotz, Rich
1991-01-01
Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.
An applicational process for dynamic balancing of turbomachinery shafting
NASA Technical Reports Server (NTRS)
Verhoff, Vincent G.
1990-01-01
The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.
T-4G Methodology: Undergraduate Pilot Training T-37 Phase.
ERIC Educational Resources Information Center
Woodruff, Robert R.; And Others
The report's brief introduction describes the application of T-4G methodology to the T-37 instrument phase of undergraduate pilot training. The methodology is characterized by instruction in trainers, proficiency advancement, a highly structured syllabus, the training manager concept, early exposure to instrument training, and hands-on training.…
2016-12-22
assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming...the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool...
Methodological Limitations of the Application of Expert Systems Methodology in Reading.
ERIC Educational Resources Information Center
Willson, Victor L.
Methodological deficiencies inherent in expert-novice reading research make it impossible to draw inferences about curriculum change. First, comparisons of intact groups are often used as a basis for making causal inferences about how observed characteristics affect behaviors. While comparing different groups is not by itself a useless activity,…
Applications of Mass Spectrometry for Cellular Lipid Analysis
Wang, Chunyan; Wang, Miao; Han, Xianlin
2015-01-01
Mass spectrometric analysis of cellular lipids is an enabling technology for lipidomics, a rapidly developing research field. In this review, we briefly discuss the principles, advantages, and possible limitations of electrospray ionization (ESI) and matrix-assisted laser desorption/ionization (MALDI) mass spectrometry-based methodologies for the analysis of lipid species. The applications of these methodologies to lipidomic research are also summarized. PMID:25598407
Recent developments and applications of immobilized laccase.
Fernández-Fernández, María; Sanromán, M Ángeles; Moldes, Diego
2013-12-01
Laccase is a promising biocatalyst with many possible applications, including bioremediation, chemical synthesis, biobleaching of paper pulp, biosensing, textile finishing and wine stabilization. The immobilization of enzymes offers several improvements for enzyme applications because the storage and operational stabilities are frequently enhanced. Moreover, the reusability of immobilized enzymes represents a great advantage compared with free enzymes. In this work, we discuss the different methodologies of enzyme immobilization that have been reported for laccases, such as adsorption, entrapment, encapsulation, covalent binding and self-immobilization. The applications of laccase immobilized by the aforementioned methodologies are presented, paying special attention to recent approaches regarding environmental applications and electrobiochemistry. Copyright © 2012 Elsevier Inc. All rights reserved.
Investigating transport pathways in the ocean
NASA Astrophysics Data System (ADS)
Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.
2013-01-01
The ocean is a very complex medium, with scales of motion that range from thousands of kilometers down to the dissipation scales. Transport by ocean currents plays an important role in many practical applications, ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical systems theory, which allows the identification of barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it assumes the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been developed recently, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first applications of these methodologies to the real ocean, including testing with Lagrangian in-situ data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing to open questions and potential for improvements, with an outlook toward future strategies.
Zhou, Jia; Hu, Fang; Wu, Jing; Zou, Zhi Yong; Wang, Yi Xin; Peng, Hua Can; Vermund, Sten H; Hu, Yi Fei; Ma, Ying Hua
2018-05-01
We sought to identify differences in subjective well-being and family functioning due to parental migration between adolescents left behind in their home villages/towns (LBA) and non-left-behind adolescents (NLB) in south China. We used a stratified cluster sampling method to recruit middle school students in a city experiencing population emigration in Jiangxi Province in 2010. Participants included adolescents from families with: (1) one migrant parent, (2) both parents migrated, or (3) no migrant parent (non-left-behind adolescents). To determine predictors of subjective well-being, we used structural equation models. Adolescents left behind by both parents (LBB) were less likely to express life satisfaction (P = 0.038) in terms of their environments (P = 0.011) compared with NLB. Having one or both parents migrate predicted lower subjective well-being of adolescents (P = 0.051) and also lower academic performance. Being apart from their parents may affect family functioning negatively from an adolescent's viewpoint. Given the hundreds of millions of persons in China, many of whom are parents, migrating for work, there may be mental health challenges in some of the adolescents left behind. Copyright © 2018 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
Evidence for the presence of restriction/modification systems in Lactobacillus delbrueckii.
Suárez, Viviana; Zago, Miriam; Giraffa, Giorgio; Reinheimer, Jorge; Quiberoni, Andrea
2009-11-01
The bacteriophages Cb1/204 and Cb1/342 were obtained by induction from the commercial strain Lactobacillus delbrueckii subsp. lactis Cb1, and propagated on Lactobacillus delbrueckii subsp. lactis 204 (Lb.l 204) and Lactobacillus delbrueckii subsp. bulgaricus 342 (Lb.b 342), respectively. By cross sensitivity, it was possible to detect a delay in the lysis of Lb.l 204 with phage Cb1/342, while the adsorption rate was high (99.5%). Modified and unmodified phages were isolated using phage Cb1/342 and strain Lb.l 204. The EOP (Efficiency of Plaquing) values for the four phages (Cb1/204, Cb1/342, Cb1/342modified and Cb1/342unmodified) suggested that an R/M system modified the original temperate phage, and the BglII-DNA restriction patterns of these phages suggested the presence of a Type II R/M system. The existence of a Type I R/M system was also demonstrated by PCR and nucleotide sequencing, with alignment homology to previously reported Type I R/M systems higher than 95%. This study demonstrates that native phage-resistance mechanisms and the occurrence of prophages in commercial host strains contribute strongly to diversifying the phage population in a factory environment.
Semchonok, Dmitry A.; Chauvin, Jean-Paul; Frese, Raoul N.; Jungas, Colette; Boekema, Egbert J.
2012-01-01
Electron microscopy and single-particle averaging were performed on isolated reaction centre (RC)–antenna complexes (RC–LH1–PufX complexes) of Rhodobaca bogoriensis strain LBB1, with the aim of establishing the LH1 antenna conformation, and, in particular, the structural role of the PufX protein. Projection maps of dimeric complexes were obtained at 13 Å resolution and show the positions of the 2 × 14 LH1 α- and β-subunits. This new dimeric complex displays two open, C-shaped LH1 aggregates of 13 αβ polypeptides partially surrounding the RCs plus two LH1 units forming the dimer interface in the centre. Between the interface and the two half rings are two openings on each side. Next to the openings, there are four additional densities present per dimer, considered to be occupied by four copies of PufX. The position of the RC in our model was verified by comparison with RC–LH1–PufX complexes in membranes. Our model differs from previously proposed configurations for Rhodobacter species in which the LH1 ribbon is continuous in the shape of an S, and the stoichiometry is of one PufX per RC. PMID:23148268
The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.
Schmidt, Silke; Verweij, Marcel
2013-01-01
The contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its modules is provided, and areas of application and intended target groups are indicated.
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
GPS system simulation methodology
NASA Technical Reports Server (NTRS)
Ewing, Thomas F.
1993-01-01
The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.
Methodologies for Root Locus and Loop Shaping Control Design with Comparisons
NASA Technical Reports Server (NTRS)
Kopasakis, George
2017-01-01
This paper describes the basics of the root locus and loop shaping control design methods, and establishes approaches to expedite the application of these two design methodologies so that control designs meeting requirements with superior performance can be obtained easily. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional-Integral-Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of these designs is covered as it pertains to these control methodologies and to the example problems.
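The root locus idea referenced in this record can be made concrete with a small sketch; the plant 1/(s(s+2)) under unity feedback with proportional gain K is an assumed textbook example, not one of the paper's design cases. The closed-loop poles are the roots of s^2 + 2s + K:

```python
import cmath

def closed_loop_poles(K, a=2.0):
    """Closed-loop poles of the plant 1/(s(s+a)) under unity feedback with
    proportional gain K: the roots of s^2 + a*s + K (quadratic formula)."""
    disc = cmath.sqrt(a * a - 4 * K)
    return ((-a + disc) / 2, (-a - disc) / 2)

# Sweeping K traces the root locus: the poles start at 0 and -a for K -> 0,
# meet at -a/2 when K = a^2/4, then break away as a complex-conjugate pair.
locus = {K: closed_loop_poles(K) for K in (0.5, 1.0, 2.0, 5.0)}
```

Plotting the pole pairs in `locus` over a denser sweep of K reproduces the familiar root locus diagram; gains past the breakaway point trade damping for speed of response.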
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2010-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2011-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Waring, Mike; Bielfeldt, Stephan; Mätzold, Katja; Wilhelm, Klaus-Peter
2013-02-01
Chronic wounds require frequent dressing changes. Adhesive dressings used for this indication can be damaging to the stratum corneum, particularly in the elderly where the skin tends to be thinner. Understanding the level of damage caused by dressing removal can aid dressing selection. This study used a novel methodology that applied a stain to the skin and measured the intensity of that stain after repeated application and removal of a series of different adhesive types. Additionally, a traditional method of measuring skin barrier damage (transepidermal water loss) was also undertaken and compared with the staining methodology. The staining methodology and measurement of transepidermal water loss differentiated the adhesive dressings, showing that silicone adhesives caused least trauma to the skin. The staining methodology was shown to be as effective as transepidermal water loss in detecting damage to the stratum corneum and was shown to detect disruption of the barrier earlier than the traditional technique. © 2012 John Wiley & Sons A/S.
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
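The design-space computation described in this record, i.e. the probability that a separation criterion lies in its acceptance range given model uncertainty, can be sketched with a Monte Carlo estimate; the operating points, predicted separations, and the 0.90 acceptance probability below are hypothetical, not values from the paper:

```python
import random

random.seed(42)

def p_criterion_met(mean_sep, sd, threshold=1.0, n=50_000):
    """Monte Carlo estimate of the probability that the predicted separation
    criterion (e.g. retention-time difference between the two closest peaks,
    in minutes) exceeds an acceptance threshold, given model uncertainty."""
    hits = sum(random.gauss(mean_sep, sd) >= threshold for _ in range(n))
    return hits / n

# Hypothetical operating points (e.g. pH / gradient-time combinations) with
# a model-predicted mean separation and a prediction standard deviation;
# the design space keeps the points whose success probability is high enough.
operating_points = {"pt_A": (1.6, 0.3), "pt_B": (1.1, 0.4), "pt_C": (0.7, 0.3)}
design_space = {name for name, (m, s) in operating_points.items()
                if p_criterion_met(m, s) >= 0.90}
```

Note that pt_B has an acceptable *mean* separation but is excluded: the design space concept filters on the probability of meeting the criterion, not on the point prediction alone.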
2008-07-23
This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
The materials/goods handling function is fundamental to the smooth running of company warehouses. Efficiency and organization within every aspect of the business are essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily record finished goods from the production department, the warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can support a variety of warehouse logistics processes. The author uses the Java programming language, which is used for building Java web applications, to develop the application, with MySQL as the database. The system development methodology used is the Waterfall methodology, which comprises the stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. For data collection, the author used observation, interviews, and a literature review.
Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F
2011-04-01
This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)], followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.
A manual for conducting environmental impact studies.
DOT National Transportation Integrated Search
1971-01-01
This report suggests methodologies which should enable an interdisciplinary team to assess community values. The methodologies are applicable to the conceptual, location, and design phases of highway planning, respectively. The approach employs a wei...
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively arrive at an implementation that builds a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H
2018-07-01
Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial...information. In the course of this research, we developed and studied efficient simulation-based methodologies for dynamic decision making under...uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
The relevance of the issue under research is determined by the need of practical pedagogy to expand the methodological and methodical tools of contemporary didactics. The purpose of the article is to identify the methodological core of reflection as a form of thinking and to provide insight into it on the basis of the systematic attributes of the…
Determining Faculty and Student Views: Applications of Q Methodology in Higher Education
ERIC Educational Resources Information Center
Ramlo, Susan
2012-01-01
William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…
A theoretical and experimental investigation of propeller performance methodologies
NASA Technical Reports Server (NTRS)
Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.
1980-01-01
This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.
Vertically aligned carbon nanotubes for microelectrode arrays applications.
Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric
2012-09-01
In this work, a methodology to fabricate carbon nanotube-based electrodes using plasma-enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode-based devices should present specific properties that make them suitable for microelectrode array applications. The methodology studied has focused on the preparation of highly regular and dense vertically aligned carbon nanotube (VACNT) mats compatible with the standard lithography used in microelectrode array technology.
Benefit-cost methodology study with example application of the use of wind generators
NASA Technical Reports Server (NTRS)
Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.
1975-01-01
An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
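The plant factor reported in this record (the ratio of average output power to rated power) can be sketched for an idealized turbine and a Rayleigh wind-speed distribution; the power curve, cut-in/cut-out speeds, and site mean wind speed below are assumed for illustration, not taken from the study:

```python
import math

def power_output(v, rated_power=100.0, cut_in=4.0, rated_speed=12.0, cut_out=25.0):
    """Idealized turbine power curve (kW): cubic rise between cut-in and
    rated speed, constant at rated power up to cut-out, zero elsewhere."""
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_power
    return rated_power * (v**3 - cut_in**3) / (rated_speed**3 - cut_in**3)

def rayleigh_pdf(v, mean_speed):
    sigma2 = 2 * mean_speed**2 / math.pi  # scale parameter from the mean speed
    return (v / sigma2) * math.exp(-v * v / (2 * sigma2))

def plant_factor(mean_speed, rated_power=100.0, dv=0.01):
    """Plant factor = average output power / rated power, averaging the
    power curve over the site's wind-speed distribution."""
    avg_power = sum(power_output(v) * rayleigh_pdf(v, mean_speed) * dv
                    for v in (i * dv for i in range(1, int(30 / dv))))
    return avg_power / rated_power

pf = plant_factor(mean_speed=7.0)
```

Raising the site mean wind speed increases the computed plant factor, qualitatively matching the geographic variation the study maps across the contiguous United States.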
DB4US: A Decision Support System for Laboratory Information Management
Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-01-01
Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries.
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
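As a hedged illustration of the kind of turn-around-time (TAT) indicators such a dashboard precomputes, the snippet below derives median, 90th-percentile, and within-2-hours TAT from hypothetical request timestamps. The data, field layout, and thresholds are invented for the sketch, not DB4US internals:

```python
from datetime import datetime
from statistics import median

# Hypothetical (received, reported) timestamp pairs for laboratory requests.
requests = [
    ("2008-05-01 08:15", "2008-05-01 10:05"),
    ("2008-05-01 09:00", "2008-05-01 09:50"),
    ("2008-05-01 09:30", "2008-05-01 13:10"),
    ("2008-05-01 10:45", "2008-05-01 11:40"),
]

def tat_minutes(received, reported, fmt="%Y-%m-%d %H:%M"):
    """Turn-around time of one request, in minutes."""
    delta = datetime.strptime(reported, fmt) - datetime.strptime(received, fmt)
    return delta.total_seconds() / 60.0

tats = sorted(tat_minutes(r, p) for r, p in requests)
indicators = {
    "median_tat_min": median(tats),
    "p90_tat_min": tats[min(len(tats) - 1, round(0.9 * (len(tats) - 1)))],
    "pct_within_2h": 100.0 * sum(t <= 120 for t in tats) / len(tats),
}
print(indicators)
```

Precalculating such a dictionary per test type and per day is one plausible shape for the parallel indicator-precomputation processes the abstract describes.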
78 FR 66681 - Census Advisory Committees
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
..., filing of petitions and applications and agency statements of organization and functions are examples... policies, research and methodology, tests, operations, communications/messaging and other activities to..., socioeconomic, linguistic, technological, methodological, geographic, behavioral and operational variables...
Application of a statewide intermodal freight planning methodology.
DOT National Transportation Integrated Search
2001-08-01
Anticipating the need for Virginia to comply with the new freight planning requirements mandated by ISTEA and TEA-21, the Virginia Transportation Research Council in 1998 developed a Statewide Intermodal Freight Transportation Planning Methodology, w...
Seventh NASTRAN User's Colloquium
NASA Technical Reports Server (NTRS)
1978-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are described. Topics include: fluids and thermal applications, NASTRAN programming, substructuring methods, unique new applications, general auxiliary programs, specific applications, and new capabilities.
Evaluating Multi-Input/Multi-Output Digital Control Systems
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek
1994-01-01
A controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems was developed. The procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. The methodology is generic and can be used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. It is also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.
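One core CPE task, screening out potentially destabilizing controllers, can be sketched for a discrete-time state-feedback loop: the closed loop x[k+1] = (A - BK)x[k] is stable exactly when all eigenvalues of A - BK lie inside the unit circle. The plant and gain matrices below are illustrative assumptions, not from the NASA work:

```python
import numpy as np

# Hypothetical discrete-time plant (2 states, 2 inputs); the open-loop
# eigenvalue 1.1 lies outside the unit circle, so the plant is unstable.
A = np.array([[1.1, 0.2],
              [0.0, 0.9]])
B = np.eye(2)

def is_stabilizing(K):
    """True iff every eigenvalue of the closed-loop matrix A - B K has
    magnitude strictly less than 1 (discrete-time stability)."""
    eigvals = np.linalg.eigvals(A - B @ K)
    return bool(np.all(np.abs(eigvals) < 1.0))

K_good = np.array([[0.5, 0.2],
                   [0.0, 0.3]])   # moves both eigenvalues to 0.6
K_bad = np.zeros((2, 2))          # leaves the unstable open loop unchanged

print(is_stabilizing(K_good), is_stabilizing(K_bad))
```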
Application of Design Methodologies for Feedback Compensation Associated with Linear Systems
NASA Technical Reports Server (NTRS)
Smith, Monty J.
1996-01-01
The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well-behaved closed-loop system in terms of stability and robustness (internal signals remain bounded under a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed-loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H∞ performance criterion and algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.
Coussot, G; Ladner, Y; Bayart, C; Faye, C; Vigier, V; Perrin, C
2015-01-09
This work aims to study the potential of an on-line capillary electrophoresis (CE)-based digestion methodology for evaluating the degradability of polymer-drug conjugates in the presence of free trypsin (in-solution digestion). A sandwich plug injection scheme with transverse diffusion of laminar flow profiles (TDLFP) mode was used to achieve on-line digestions. Electrophoretic separation conditions were established using poly-L-lysine (PLL) as the reference substrate. Comparison with off-line digestion was carried out to demonstrate the feasibility of the proposed methodology. The applicability of the on-line CE-based digestion methodology was evaluated for two PLL-drug conjugates and for the first four generations of dendrigrafts of lysine (DGL). Different electrophoretic profiles showing the formation of di-, tri-, and tetralysine were observed for PLL-drug and DGL. These findings are in good agreement with the nature of the linker used to attach the drug to the PLL structure and the predicted degradability of DGL. The applicability of the present on-line methodology was also successfully proven for protein conjugate hydrolysis. In summary, the described methodology provides a powerful tool for the rapid study of biodegradable polymers. Copyright © 2014 Elsevier B.V. All rights reserved.
Methodology for estimating helicopter performance and weights using limited data
NASA Technical Reports Server (NTRS)
Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard
1990-01-01
Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.
Drenkard, K N
2001-01-01
The application of a strategic planning methodology for the discipline of nursing, as used by a large nonprofit integrated healthcare system, is described. The methodology uses a transformational leadership assessment tool, quality planning methods, and large group intervention to engage nurses in the implementation of strategies. Based on systems theory, the methodology outlined by the author has application at any level in an organization, from an entire delivery network to a patient care unit. The author discusses getting started on a strategic planning journey, tools that are useful in the process, integrating already existing business plans into the strategies for nursing, preliminary measures to monitor progress, and lessons learned along the journey.
Determining The Various Perspectives And Consensus Within A Classroom Using Q Methodology
NASA Astrophysics Data System (ADS)
Ramlo, Susan E.
2008-10-01
Q methodology was developed by PhD physicist and psychologist William Stephenson 73 years ago as a new way of investigating people's views of any topic. Yet its application has primarily been in the fields of marketing, psychology, and political science. Still, Q offers an opportunity for the physics education research community to determine the perspectives and consensus within a group, such as a classroom, related to topics of interest such as the nature of science and epistemology. This paper presents the basics of using Q methodology with a classroom application as an example, and subsequently compares this example's results to similar studies using qualitative and survey methods.
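The mechanics of Q methodology, correlating persons rather than variables and then factor-analyzing the result, can be sketched with hypothetical Q-sorts (the participants and rankings below are invented):

```python
import numpy as np

# Hypothetical Q-sorts: rows = participants, columns = statements,
# values = how strongly each participant agrees with each statement.
q_sorts = np.array([
    [ 2,  1,  0, -1, -2],   # participant A
    [ 2,  0,  1, -1, -2],   # participant B (view similar to A's)
    [-2, -1,  0,  1,  2],   # participant C (opposing view)
])

# Q methodology correlates *persons*, not variables:
person_corr = np.corrcoef(q_sorts)

# Factor analysis of the person-correlation matrix reveals shared
# perspectives; here a plain eigendecomposition stands in for the
# rotation-based factoring used in practice.
eigvals, eigvecs = np.linalg.eigh(person_corr)
dominant = eigvecs[:, np.argmax(eigvals)]   # loadings on the main factor

print(np.round(person_corr, 2))
print(np.round(dominant, 2))
```

Participants loading together on one factor (A and B here) share a perspective; C loads with the opposite sign, marking a distinct viewpoint.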
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making access easier for users. The application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of this application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.
Integration Methodology For Oil-Free Shaft Support Systems: Four Steps to Success
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; DellaCorte, Christopher; Bruckner, Robert J.
2010-01-01
Commercial applications for Oil-Free turbomachinery are slowly becoming a reality. Micro-turbine generators, high-speed electric motors, and electrically driven centrifugal blowers are a few examples of products available in today's commercial marketplace. Gas foil bearing technology makes most of these applications possible. A significant volume of component level research has led to recent acceptance of gas foil bearings in several specialized applications, including those mentioned above. Component tests identifying such characteristics as load carrying capacity, power loss, thermal behavior, rotordynamic coefficients, etc. all help the engineer design foil bearing machines, but the development process can be just as important. As the technology gains momentum and acceptance in a wider array of machinery, the complexity and variety of applications will grow beyond the current class of machines. Following a robust integration methodology will help improve the probability of successful development of future Oil-Free turbomachinery. This paper describes a previously successful four-step integration methodology used in the development of several Oil-Free turbomachines. Proper application of the methods put forward here enables successful design of Oil-Free turbomachinery. In addition, when significant design changes or unique machinery are developed, this four-step process must be considered.
Recommendations for benefit-risk assessment methodologies and visual representations.
Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain
2016-03-01
The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
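A minimal sketch of the quantitative-framework stage is a weighted-sum multi-criteria decision analysis (MCDA): each criterion gets a normalized weight, each option a per-criterion preference score, and options are compared on the weighted total. The criteria, weights, and scores below are invented for illustration, not PROTECT BR data:

```python
# Hypothetical MCDA inputs: weights sum to 1; scores are 0-100 preference values.
weights = {"efficacy": 0.5, "serious_ae": 0.3, "tolerability": 0.2}
scores = {
    "drug":    {"efficacy": 80, "serious_ae": 60, "tolerability": 70},
    "placebo": {"efficacy": 10, "serious_ae": 95, "tolerability": 90},
}

def weighted_score(option):
    """Overall benefit-risk score: weighted sum of per-criterion scores."""
    return sum(weights[c] * scores[option][c] for c in weights)

for option in scores:
    print(option, weighted_score(option))
```

The same table structure doubles as the "effects table" the case studies universally adopted; sensitivity to the weights is typically explored before drawing conclusions.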
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.
Railroad classification yard technology : computer system methodology : case study : Potomac Yard
DOT National Transportation Integrated Search
1981-08-01
This report documents the application of the railroad classification yard computer system methodology to Potomac Yard of the Richmond, Fredericksburg, and Potomac Railroad Company (RF&P). This case study entailed evaluation of the yard traffic capaci...
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. And so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and methodological lessons learned from their application. These tools were used in a case study. Detailed results of this study are in process for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implication for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the results section, and followed in the discussion section by the critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, intensive training in descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d'A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.
1994-10-01
The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.
A Generalizable Methodology for Quantifying User Satisfaction
NASA Astrophysics Data System (ADS)
Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung
Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors like loudness and sidetone can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
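The survival-analysis core of such a methodology can be sketched with a Kaplan-Meier estimate over session times, where censored sessions are those still running when measurement stopped. The session data below are invented for the sketch:

```python
# Hypothetical session times (minutes) with censoring flags:
# event=True means the user actually quit; False means the session was
# still in progress when measurement ended (right-censored).
sessions = [(5, True), (12, True), (12, False), (20, True), (30, False), (30, True)]

def kaplan_meier(data):
    """Kaplan-Meier survival estimate S(t) at each observed quit time."""
    times = sorted({t for t, event in data if event})
    survival, s = [], 1.0
    for t in times:
        deaths = sum(1 for u, e in data if e and u == t)   # quits exactly at t
        at_risk = sum(1 for u, _ in data if u >= t)        # still in session at t
        s *= 1.0 - deaths / at_risk
        survival.append((t, s))
    return survival

for t, s in kaplan_meier(sessions):
    print(t, round(s, 3))
```

Fitting such survival curves separately for sessions grouped by network conditions (loss, delay) is one way the impact of performance factors on satisfaction could then be compared.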
NASA Technical Reports Server (NTRS)
Young, G.
1982-01-01
A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brossmann, U.B.
1981-01-01
The application of the methodological design is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology all boundary conditions and design criteria are more easily satisfied in a technically and economically sound way.
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves...the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission-capable 6U...spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced
2009-03-01
III. Methodology: Overview...applications relating to this research and the results they have obtained, as well as the background on LEEDR. Chapter 3 will detail the methodology...different in that the snow dissipates faster and it is better to descend slower, at rates of 200-300 ft/min. III. Methodology: This chapter
NASA Technical Reports Server (NTRS)
Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai
2011-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating applications of the Kalman-filter-based prognostics methods typically used for remaining-useful-life predictions in other applications.
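A hedged sketch of the Kalman-filter idea, not the authors' actual degradation model: a linear health-plus-drift state is filtered from noisy synthetic measurements and extrapolated to a failure threshold to produce a remaining-useful-life (RUL) estimate. All matrices, noise levels, and thresholds below are illustrative assumptions:

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition: health drifts linearly
H = np.array([[1.0, 0.0]])               # only the health value is measured
Q = np.diag([1e-5, 1e-6])                # process noise covariance
R = np.array([[1e-3]])                   # measurement noise covariance

x = np.array([1.0, 0.0])                 # initial health 1.0, unknown drift
P = np.eye(2)

true_drift = -0.01
rng = np.random.default_rng(0)
for k in range(50):
    z = 1.0 + true_drift * (k + 1) + rng.normal(0, 0.03)  # synthetic measurement
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

threshold = 0.2                          # failure when health falls below this
health, drift = x
rul = (threshold - health) / drift if drift < 0 else float("inf")
print(round(health, 3), round(drift, 4), round(rul, 1))
```

The filtered drift plays the role of the empirical degradation rate; projecting the filtered state to the threshold crossing gives the RUL estimate that prognostics updates as new measurements arrive.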
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai
2012-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validating applications of the Kalman-filter-based prognostics methods typically used for remaining-useful-life predictions in other applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (engineering branch) have been developing during the last 2 yr the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses of the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been required to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case of the total loss of feedwater (TLFW) transient. The content of this paper shows the benefits of having an in-house design and licensing methodology and describes the process to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients, particularly the TLFW transient.
NASA Astrophysics Data System (ADS)
Navarro, Manuel
2014-05-01
This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology (evolutionary maps or emaps), whose implementation on certain domains unfolds the web of itineraries that children may follow in the construction of concrete conceptual knowledge and pinpoints, for each conception, the architecture of the conceptual change that leads to the scientific concept. Remarkably, the generative character of its syntax yields conceptions that, if unknown, amount to predictions that can be tested experimentally. Its application to the diurnal cycle (including the sun's trajectory in the sky) indicates that the model is correct and the methodology works (in some domains). Specifically, said emap predicts a number of exotic trajectories of the sun in the sky that, in the experimental work, were drawn spontaneously both on paper and a dome. Additionally, the application of the emaps theoretical framework in clinical interviews has provided new insight into other cognitive processes. The field of validity of the methodology and its possible applications to science education are discussed.
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives on method development and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
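The paper's virtual-pulse integrator itself is not reproduced here; as a stand-in, the sketch below integrates the same kind of benchmark problem, a hardening spring m*x'' + k*x + k3*x**3 = 0, with the standard explicit central-difference scheme and checks that total energy is approximately conserved. All parameters are invented:

```python
# Hardening-spring (Duffing-type) oscillator via explicit central differences.
m, k, k3 = 1.0, 1.0, 0.5        # mass, linear stiffness, cubic hardening term
dt, steps = 0.01, 1000

def accel(x):
    return -(k * x + k3 * x**3) / m

# Start from rest at x = 1; seed the second point with a Taylor step.
x_prev = 1.0
x_curr = 1.0 + 0.5 * dt**2 * accel(1.0)
energy0 = 0.5 * k * 1.0**2 + 0.25 * k3 * 1.0**4   # initial potential energy

for _ in range(steps):
    # Central-difference update: x[n+1] = 2 x[n] - x[n-1] + dt^2 a(x[n])
    x_next = 2.0 * x_curr - x_prev + dt**2 * accel(x_curr)
    x_prev, x_curr = x_curr, x_next

v = (x_curr - x_prev) / dt      # backward-difference velocity estimate
energy = 0.5 * m * v**2 + 0.5 * k * x_curr**2 + 0.25 * k3 * x_curr**4
print(round(energy0, 4), round(energy, 4))
```

An explicit scheme like this needs no iteration per step, which is the efficiency argument usually made against the implicit Newmark trapezoidal rule for such problems; stability, however, is only conditional on the step size.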
Development of risk-based decision methodology for facility design.
DOT National Transportation Integrated Search
2014-06-01
This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides : illustrative examples for the use of the proposed framework. An overview of the current practices and applications to : illustrate t...
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented with the R software.
Analysis of pressure distortion testing
NASA Technical Reports Server (NTRS)
Koch, K. E.; Rees, R. L.
1976-01-01
The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.
Propellant Readiness Level: A Methodological Approach to Propellant Characterization
NASA Technical Reports Server (NTRS)
Bossard, John A.; Rhys, Noah O.
2010-01-01
A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.
NASA Technical Reports Server (NTRS)
Page, J.
1981-01-01
The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed. The development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve the development and product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretation; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.
Ab initio quantum chemistry: methodology and applications.
Friesner, Richard A
2005-05-10
This Perspective provides an overview of state-of-the-art ab initio quantum chemical methodology and applications. The methods that are discussed include coupled cluster theory, localized second-order Moller-Plesset perturbation theory, multireference perturbation approaches, and density functional theory. The accuracy of each approach for key chemical properties is summarized, and the computational performance is analyzed, emphasizing significant advances in algorithms and implementation over the past decade. Incorporation of a condensed-phase environment by means of mixed quantum mechanical/molecular mechanics or self-consistent reaction field techniques is presented. A wide range of illustrative applications, focusing on materials science and biology, are discussed briefly.
The TMIS life-cycle process document, revision A
NASA Technical Reports Server (NTRS)
1991-01-01
The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll out of TMIS Standards Document (SSP 30546). The purpose of this document is to define the life cycle methodology that the developers of all products and data base applications and any subsequent modifications shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.
The probability of transportation accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brobst, W.A.
1972-11-10
We examined the relative safety of different modes of transportation on a statistical basis, rather than an emotional one. As we collected data and evaluated its applicability, we found that our own emotions came into play in judging which data would be useful and which data we should discard. We developed a methodology of simple data analysis that lends itself to similar evaluations of related questions. The author describes that methodology and demonstrates its application to shipments of radioactive materials. 31 refs., 7 tabs.
U.S. Heat Demand by Sector for Potential Application of Direct Use Geothermal
Katherine Young
2016-06-23
This dataset includes heat demand for potential application of direct use geothermal broken down into 4 sectors: agricultural, commercial, manufacturing and residential. The data for each sector are organized by county, were disaggregated specifically to assess the market demand for geothermal direct use, and were derived using methodologies customized for each sector based on the availability of data and other sector-specific factors. This dataset also includes a paper containing a full explanation of the methodologies used.
NASA Astrophysics Data System (ADS)
Chen, Zhiming; Feng, Yuncheng
1988-08-01
This paper describes an algorithmic structure for combining simulation and optimization techniques both in theory and practice. Response surface methodology is used to optimize the decision variables in the simulation environment. A simulation-optimization software has been developed and successfully implemented, and its application to an aggregate production planning simulation-optimization model is reported. The model's objective is to minimize the production cost and to generate an optimal production plan and inventory control strategy for an aircraft factory.
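The core loop the abstract describes (run the simulation, fit a response surface to the outputs, optimize the decision variables on the fitted surface) can be sketched as follows. This is a hedged illustration using an invented one-variable cost model, not the paper's aggregate production planning model:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cost(x):
    """Hypothetical noisy simulation of production cost; true optimum at x = 3."""
    return (x - 3.0) ** 2 + 5.0 + rng.normal(0.0, 0.05)

# 1. Run the simulation over a design of sample points.
xs = np.linspace(0.0, 6.0, 13)
ys = np.array([simulate_cost(x) for x in xs])

# 2. Fit a second-order (quadratic) response surface by least squares.
c2, c1, c0 = np.polyfit(xs, ys, 2)

# 3. Minimize the fitted surface analytically: d/dx (c2*x^2 + c1*x + c0) = 0.
x_opt = -c1 / (2.0 * c2)
print(round(x_opt, 1))  # close to the true optimum at 3.0
```

In practice response surface methodology iterates this loop, re-sampling around the current optimum with a formal experimental design (e.g. central composite) rather than a single grid.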
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described: the theoretical spectral responses of crops; reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
Sodium MRI: Methods and applications
Madelin, Guillaume; Lee, Jae-Seung; Regatte, Ravinder R.; Jerschow, Alexej
2014-01-01
Sodium NMR spectroscopy and MRI have become popular in recent years through the increased availability of high-field MRI scanners, advanced scanner hardware and improved methodology. Sodium MRI is being evaluated for stroke and tumor detection, for breast cancer studies, and for the assessment of osteoarthritis and muscle and kidney functions, to name just a few. In this article, we aim to present an up-to-date review of the theoretical background, the methodology, the challenges and limitations, and current and potential new applications of sodium MRI. PMID:24815363
Characterizing Postural Sway during Quiet Stance Based on the Intermittent Control Hypothesis
NASA Astrophysics Data System (ADS)
Nomura, Taishin; Nakamura, Toru; Fukada, Kei; Sakoda, Saburo
2007-07-01
This article illustrates a signal processing methodology for time series of postural sway and accompanying electromyographs from the lower limb muscles during quiet stance. It is shown that the proposed methodology is capable of identifying the underlying postural control mechanisms. A preliminary application of the methodology provided evidence that supports the intermittent control hypothesis as an alternative to the conventional stiffness control hypothesis during human quiet upright stance.
Designing for fiber composite structural durability in hygrothermomechanical environment
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1985-01-01
A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.
UNICORN (Version III) Methodology.
1976-10-01
AD-A124 766. UNICORN (Version III) Methodology, Technical Memorandum, by L. M. Blackwell, H. E. Hock, T. A. Kriz, et al. Science Applications Inc., Englewood, CO. Report SAI-76-048-DEN, October 1976. Contract DCAII-75-C-802... Contents include a discussion section and findings and conclusions on the UNICORN methodology.
Force on Force Modeling with Formal Task Structures and Dynamic Geometry
2017-03-24
task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test
The retrospective chart review: important methodological considerations.
Vassar, Matt; Holzmann, Matthew
2013-01-01
In this paper, we review and discuss ten common methodological mistakes found in retrospective chart reviews. The retrospective chart review is a widely applicable research methodology that can be used by healthcare disciplines as a means to direct subsequent prospective investigations. In many cases in this review, we have also provided suggestions or accessible resources that researchers can apply as a "best practices" guide when planning, conducting, or reviewing this investigative method.
Manfredi, Simone; Cristobal, Jorge
2016-09-01
Trying to respond to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining the boundaries and scope of the evaluation, evaluating environmental and economic impacts and identifying best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste, and can thus provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example on the choice of the indicators to express the environmental and economic performance. © The Author(s) 2016.
A Systematic Determination of Skill and Simulator Requirements for Airplane Pilot Certification
DOT National Transportation Integrated Search
1985-03-01
This research report describes: (1) the FAA's ATP airman certification system; (2) needs of the system regarding simulator use; (3) a systematic methodology for meeting these needs; (4) application of the methodology; (5) results of the study; and (6...
Load and resistance factor rating (LRFR) in New York State : volume II.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...
Load and resistance factor rating (LRFR) in NYS : volume II final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
17 CFR 39.5 - Review of swaps for Commission determination on clearing requirement.
Code of Federal Regulations, 2012 CFR
2012-04-01
... publicly; (vi) Risk management procedures, including measurement and monitoring of credit exposures, initial and variation margin methodology, methodologies for stress testing and back testing, settlement procedures, and default management procedures; (vii) Applicable rules, manuals, policies, or procedures...
Load and resistance factor rating (LRFR) in NYS : volume I final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
Teaching Camera Calibration by a Constructivist Methodology
ERIC Educational Resources Information Center
Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.
2010-01-01
This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
Railroad classification yard design methodology study : East Deerfield Yard, a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabiliation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...
Load and resistance factor rating (LRFR) in New York State : volume I.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included of which 50 % (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
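For readers unfamiliar with VOI, the simplest measure the review discusses, the expected value of perfect information (EVPI), is the gap between deciding with and without resolved parameter uncertainty and is easily estimated by Monte Carlo. The two-option net-benefit model, parameter distribution, and willingness-to-pay value below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain parameter: incremental effectiveness of treatment B vs A.
effect_b = rng.normal(0.5, 1.0, n)   # QALYs gained, may be negative
wtp = 20_000.0                        # willingness to pay per QALY
cost_b = 8_000.0                      # extra cost of B

# Net benefit of each option under every parameter draw.
nb_a = np.zeros(n)
nb_b = wtp * effect_b - cost_b

# Decide now (under uncertainty) vs decide with perfect information.
value_current = max(nb_a.mean(), nb_b.mean())
value_perfect = np.maximum(nb_a, nb_b).mean()

evpi = value_perfect - value_current
print(f"EVPI per decision: {evpi:,.0f}")  # always >= 0
```

EVSI, the review's preferred measure for prioritizing specific studies, follows the same pattern but replaces "perfect information" with the posterior after a simulated sample, which is where the computational demands noted above arise.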
Methodology for extracting local constants from petroleum cracking flows
Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.
2000-01-01
A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations, to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set, to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
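Step (4) of the procedure, adjusting a local kinetic constant until calculated yields match the experimental matrix, might be sketched as below. The one-step first-order yield model and the "measured" data are illustrative stand-ins for the coupled CFD/kinetics calculation, not the patent's reaction set:

```python
import numpy as np

def predicted_yield(k, residence_time):
    """Toy one-step first-order conversion model: y = 1 - exp(-k * tau)."""
    return 1.0 - np.exp(-k * residence_time)

# Experimental matrix: residence times (s) and measured conversions.
taus = np.array([1.0, 2.0, 3.0, 4.0])
measured = np.array([0.39, 0.63, 0.78, 0.86])   # roughly consistent with k = 0.5 1/s

# Step (4): adjust k over a grid to minimize the sum of squared yield errors.
k_grid = np.linspace(0.01, 2.0, 2000)
errors = [np.sum((predicted_yield(k, taus) - measured) ** 2) for k in k_grid]
k_local = k_grid[int(np.argmin(errors))]

print(round(k_local, 2))  # near 0.5
```

Step (5) would then rerun the forward calculation with `k_local` at held-out operating conditions and compare against the additional test runs.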
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel with the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant ones, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating; the QDA™ results, however, provide more precise detail regarding singular attributes.
Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio
2016-02-29
This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measuring its temperature at that depth. The development has been carried out considering its application to the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which allow obtaining the actual temperature and ground thermal diffusivity at the depth of interest. The methodology and developed system have been tested and used in the design of a VLEGE facility for a chalet with basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach.
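The abstract does not give the authors' estimation formula. One classical route from temperature logs to diffusivity, the amplitude-damping method for a periodic surface wave in a homogeneous semi-infinite soil, can be sketched as below; the depths and amplitudes are invented for illustration:

```python
import math

# For a sinusoidal surface temperature wave of angular frequency omega, the
# amplitude decays with depth as A(z) = A0 * exp(-z * sqrt(omega / (2 * alpha))),
# so from amplitudes at two depths:
#   alpha = omega * (z2 - z1)**2 / (2 * (ln(A1 / A2))**2)

def soil_diffusivity(a1, a2, z1, z2, period_s):
    """Thermal diffusivity (m^2/s) from wave amplitudes a1, a2 at depths z1 < z2."""
    omega = 2.0 * math.pi / period_s
    return omega * (z2 - z1) ** 2 / (2.0 * math.log(a1 / a2) ** 2)

# Daily wave: amplitude 6.0 K at 0.1 m depth damped to 0.2 K at 0.5 m depth.
alpha = soil_diffusivity(6.0, 0.2, 0.1, 0.5, period_s=86400.0)
print(f"{alpha:.2e} m^2/s")  # around 5e-7, a plausible soil value
```

The paper's own procedure may differ (e.g. using the phase lag of the wave rather than its amplitude), so this is only one standard textbook approach.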
Sadiq, Rehan; Rodriguez, Manuel J
2005-04-01
Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and interpretation of water quality, methodologies for the aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the results obtained, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
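The fusion step at the heart of the theory of evidence is Dempster's rule of combination. A minimal sketch, with invented water quality mass assignments rather than the paper's monitoring data:

```python
from itertools import product

def combine(m1, m2):
    """Combine two basic mass assignments with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2   # mass that falls on the empty set
    # Normalize by the non-conflicting mass (1 - K).
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
EITHER = GOOD | POOR   # the full frame of discernment (total ignorance)

# Two monitoring sources: one fairly sure the water is good, one uncertain.
sensor = {GOOD: 0.8, EITHER: 0.2}
lab = {GOOD: 0.5, POOR: 0.3, EITHER: 0.2}

fused = combine(sensor, lab)
print(fused[GOOD])  # belief mass on "good" after fusion
```

Unlike a simple average, the rule lets each source reserve mass for "don't know" (the `EITHER` set), which is what makes the theory attractive for heterogeneous water quality indicators.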
Corvalán, Roberto M; Osses, Mauricio; Urrutia, Cristian M
2002-02-01
Depending on the final application, several methodologies for traffic emission estimation have been developed. Emission estimation based on total miles traveled or other average factors is a sufficient approach only for extended areas such as national or worldwide inventories. For road emission control and strategy design, microscale analysis based on real-world emission estimates is often required. This involves the actual driving behavior and emission factors of the local vehicle fleet under study. This paper reports on a microscale model for hot road emissions and its application to the metropolitan region of the city of Santiago, Chile. The methodology considers street-by-street hot emission estimation with its temporal and spatial distribution. The input data come from experimental emission factors based on local driving patterns and from traffic surveys of traffic flows for different vehicle categories. The methodology developed is able to estimate hourly hot road CO, total unburned hydrocarbon (THC), particulate matter (PM), and NO(x) emissions for predefined day types and vehicle categories.
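Per link and vehicle category, the street-by-street hot emission calculation reduces to traffic flow times link length times an emission factor, summed over categories. A sketch with invented factors, links and flows (not the Santiago values):

```python
# Emission factors for CO in g/km, indexed by vehicle category (illustrative).
EF_CO = {"car": 2.5, "bus": 6.0, "truck": 8.0}

links = [
    # (link id, length in km, {category: vehicles per hour})
    ("av_principal", 1.2, {"car": 1800, "bus": 60, "truck": 40}),
    ("calle_lateral", 0.6, {"car": 400, "bus": 10, "truck": 5}),
]

def hot_emissions_g_per_h(links, ef):
    """Hourly hot emissions (g/h) per link, summed over vehicle categories."""
    out = {}
    for link_id, length_km, flows in links:
        out[link_id] = sum(flow * length_km * ef[cat]
                           for cat, flow in flows.items())
    return out

co = hot_emissions_g_per_h(links, EF_CO)
print(round(co["av_principal"], 3))  # 1800*1.2*2.5 + 60*1.2*6.0 + 40*1.2*8.0 g/h
```

The temporal and spatial distribution mentioned in the abstract would come from repeating this calculation per hour of a predefined day type, with speed-dependent emission factors rather than the constants used here.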
Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio
2016-01-01
This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measuring its temperature at that depth. The development has been carried out considering its application to the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which allow obtaining the actual temperature and ground thermal diffusivity at the depth of interest. The methodology and developed system have been tested and used in the design of a VLEGE facility for a chalet with basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach. PMID:26938534
Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier
2014-04-01
Recently, the need for crude glycerol valorisation from the biodiesel industry has generated many studies for practical and economic applications. Amongst them, fermentations based on glycerol media for the production of high value metabolites are prominent applications. This has generated a need to develop analytical techniques which allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive to be adopted in research, as well as in industrial applications. In this study three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and the new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the other conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations to monitor fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.
Applications of mixed-methods methodology in clinical pharmacy research.
Hadi, Muhammad Abdul; Closs, S José
2016-06-01
Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; development of a scale/questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.
Proteomic Profiling of Rat Thyroarytenoid Muscle
ERIC Educational Resources Information Center
Welham, Nathan V.; Marriott, Gerard; Bless, Diane M.
2006-01-01
Purpose: Proteomic methodologies offer promise in elucidating the systemwide cellular and molecular processes that characterize normal and diseased thyroarytenoid (TA) muscle. This study examined methodological issues central to the application of 2-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis (2D SDS-PAGE) to the study of…
Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants
Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by...micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.
Social Network Analysis: A New Methodology for Counseling Research.
ERIC Educational Resources Information Center
Koehly, Laura M.; Shivy, Victoria A.
1998-01-01
Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…
On the Evolving Nature of Exposure Therapy
ERIC Educational Resources Information Center
Schare, Mitchell L.; Wyatt, Kristin P.
2013-01-01
Four articles examining methodological applications of exposure therapy and its limited dissemination were briefly reviewed. Methodological articles included those by Abramowitz et al., Gryczkowski et al., and Weiner and McKay, which addressed couple treatment of obsessive-compulsive disorder (OCD), modification of evidence-based anxiety…
Methodology of Education and R&D in Mechatronics.
ERIC Educational Resources Information Center
Yamazaki, K.; And Others
1985-01-01
Describes the concept and methodology of "mechatronics" (application of microelectronics to mechanism control) and research and development (R&D) projects through the activities initiated at the Precision Machining Laboratory of the Department of Production Systems Engineering of the new Toyohashi University of Technology. (JN)
Single Subject Research: Applications to Special Education
ERIC Educational Resources Information Center
Cakiroglu, Orhan
2012-01-01
Single subject research is a scientific research methodology that is increasingly used in the field of special education. Therefore, understanding the unique characteristics of single subject research methodology is critical both for educators and practitioners. Certain characteristics make single subject research one of the most preferred…
DEVELOPMENT OF RISK ASSESSMENT METHODOLOGY FOR MUNICIPAL SLUDGE INCINERATION
This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution an...
DEVELOPMENT OF RISK ASSESSMENT METHODOLOGY FOR MUNICIPAL SLUDGE LANDFILLING
This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution an...
Weighted Ensemble Simulation: Review of Methodology, Applications, and Software
Zuckerman, Daniel M.; Chong, Lillian T.
2018-01-01
The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772
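The split/merge resampling at the core of the WE strategy can be sketched in a few lines. The bin function, target walker count, and scalar walker state below are illustrative assumptions, and this is a simplified resampling variant rather than the exact algorithm of the review:

```python
import random

def we_resample(walkers, bin_of, target=4):
    """One weighted-ensemble resampling step: within each occupied bin,
    split/merge walkers to `target` copies while conserving the bin's
    total probability weight."""
    bins = {}
    for x, w in walkers:                       # walker = (state, weight)
        bins.setdefault(bin_of(x), []).append((x, w))
    out = []
    for group in bins.values():
        total = sum(w for _, w in group)
        # Survivors are drawn proportionally to weight (merge step), then
        # the bin's weight is shared evenly among them (split step).
        states = random.choices([x for x, _ in group],
                                weights=[w for _, w in group], k=target)
        out.extend((x, total / target) for x in states)
    return out

walkers = [(0.1, 0.5), (0.4, 0.3), (1.2, 0.2)]
resampled = we_resample(walkers, bin_of=lambda x: int(x))
```

Because weight is redistributed only within each bin, the total probability carried by the ensemble is conserved exactly, which is what makes the rate-constant estimates unbiased.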
Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling
Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja
2016-01-01
Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
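The binarization and code-matching loop described in the abstract can be sketched as follows; the threshold rule and the example code `[1, 0, 1]` are hypothetical choices for illustration, not parameters from the toolbox:

```python
def binarize(signal, threshold):
    """Map a sampled signal to a binary event sequence (1 = event)."""
    return [1 if s >= threshold else 0 for s in signal]

def code_driven_trigger(events, code):
    """Yield the indices at which the target code completes, i.e. the
    instants at which closed-loop stimulation would be delivered."""
    n = len(code)
    for i in range(n, len(events) + 1):
        if events[i - n:i] == code:
            yield i - 1

events = binarize([0.1, 0.9, 0.2, 0.8, 0.7, 0.1], threshold=0.5)
triggers = list(code_driven_trigger(events, [1, 0, 1]))
```

The response recorded after each trigger can then be compared against a no-stimulation control to decide whether that code evokes a distinct response.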
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and is probabilistic in character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II model rainfall, are no longer sufficient. There is an urgent need to clarify a methodology for standardized rainfall hyetographs that takes into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs using cluster analysis. Its application is presented using selected rain gauges located in Poland as an example. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
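The two steps named in the abstract — standardizing hyetographs into dimensionless form and then clustering them — can be sketched as below. The sample resolution, the use of a dimensionless mass curve, and the minimal k-means stand-in for the paper's cluster analysis are all illustrative assumptions:

```python
import numpy as np

def standardize(hyetograph, n=10):
    """Dimensionless mass curve: cumulative rain depth vs cumulative time,
    both rescaled to [0, 1] and resampled at n points."""
    cum = np.concatenate([[0.0], np.cumsum(hyetograph, dtype=float)])
    cum /= cum[-1]
    t = np.linspace(0.0, 1.0, len(cum))
    return np.interp(np.linspace(0.0, 1.0, n), t, cum)

def kmeans(curves, k=2, iters=25, seed=1):
    """Minimal k-means over standardized curves (Euclidean distance)."""
    rng = np.random.default_rng(seed)
    centers = curves[rng.choice(len(curves), size=k, replace=False)]
    for _ in range(iters):
        d = ((curves[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        centers = np.array([curves[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

# Two front-loaded and two back-loaded storms (depths per time step).
storms = ([8, 4, 2, 1], [9, 3, 2, 1], [1, 2, 4, 8], [1, 2, 3, 9])
curves = np.array([standardize(h) for h in storms])
labels = kmeans(curves, k=2)
```

Each cluster centroid is then a candidate characteristic synthetic hyetograph for that rainfall type.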
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of the Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus reveals more about the condition of the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension reduction methods. In this work, the incorporation of the Laplacian Eigenmap and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To demonstrate the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.
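The basic diffusion-map embedding that DM-EVD builds on can be sketched in a few lines; the kernel bandwidth and the dense eigendecomposition below are illustrative simplifications, not the paper's implementation:

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=2):
    """Minimal diffusion map: Gaussian kernel, row-normalized Markov
    matrix, top nontrivial eigenvectors as low-dimensional coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    K = np.exp(-d2 / eps)                                 # affinity kernel
    P = K / K.sum(axis=1, keepdims=True)                  # Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # Skip the trivial constant eigenvector (eigenvalue 1).
    idx = order[1:n_components + 1]
    return vecs.real[:, idx] * vals.real[idx]

# Two tight point clouds far apart: the first diffusion coordinate
# should separate them.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
emb = diffusion_map(X, eps=1.0, n_components=1)
```

For machine monitoring, each row of `X` would be a feature vector from one measurement window, and drift of new points in the embedded space signals degradation.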
NASA Technical Reports Server (NTRS)
Jones, Thomas C.; Dorsey, John T.; Doggett, William R.
2015-01-01
The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.
Least-squares finite element solution of 3D incompressible Navier-Stokes problems
NASA Technical Reports Server (NTRS)
Jiang, Bo-Nan; Lin, Tsung-Liang; Povinelli, Louis A.
1992-01-01
Although significant progress has been made in the finite element solution of incompressible viscous flow problems, more efficient methods are still needed before large-scale computation of 3D problems becomes feasible. This paper presents such a development. The most popular finite element method for the solution of the incompressible Navier-Stokes equations is the classic Galerkin mixed method based on the velocity-pressure formulation. The mixed method requires the use of different elements to interpolate the velocity and the pressure in order to satisfy the Ladyzhenskaya-Babuska-Brezzi (LBB) condition for the existence of the solution. On the other hand, due to the lack of symmetry and positive definiteness of the linear equations arising from the mixed method, iterative methods for the solution of the linear systems have been hard to come by, and direct Gaussian elimination has been considered the only viable option. But for three-dimensional problems, the computer resources required by a direct method become prohibitively large. To overcome these difficulties, a least-squares finite element method (LSFEM) has been developed. This method is based on the first-order velocity-pressure-vorticity formulation. In this paper the LSFEM is extended to the solution of the three-dimensional incompressible Navier-Stokes equations written in first-order quasi-linear velocity-pressure-vorticity form.
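The abstract does not write out the velocity-pressure-vorticity system; a standard steady-state form (velocity u, pressure p, vorticity ω, kinematic viscosity ν, body force f), which should match the formulation referred to up to scaling conventions, is:

```latex
\begin{aligned}
\nabla \cdot \mathbf{u} &= 0, \\
\boldsymbol{\omega} - \nabla \times \mathbf{u} &= \mathbf{0}, \\
(\mathbf{u} \cdot \nabla)\,\mathbf{u} + \nabla p + \nu\,\nabla \times \boldsymbol{\omega} &= \mathbf{f}.
\end{aligned}
```

The second-order viscous term is eliminated via the identity \(\Delta \mathbf{u} = -\nabla \times (\nabla \times \mathbf{u})\) when \(\nabla \cdot \mathbf{u} = 0\); the LSFEM then minimizes the sum of squared residuals of this first-order system, yielding a symmetric positive-definite discrete problem amenable to iterative solvers.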
A methodological review of qualitative case study methodology in midwifery research.
Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn
2016-10-01
To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library Health Source: Nursing/Academic Edition, Wiley online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore if case study research is suitable for their investigation. The raised profile will demonstrate further applicability; encourage support and wider adoption in the midwifery setting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Xiao-Ying; Yao, Juan; He, Hua
2012-01-01
Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements designed to improve its application of Health Code Numbers (HCNs) and to employ weighting factors to reduce over-conservatism.
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
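The assessment logic described — probability distributions for numbers and sizes of undiscovered accumulations, with an associated risk of occurrence — lends itself to a Monte Carlo sketch. The distributions, parameters, and aggregation rule below are hypothetical placeholders, not the CARA inputs:

```python
import random

def assess_basin(p_occurrence, n_dist, size_dist, trials=10000, seed=7):
    """Monte Carlo aggregation in the spirit of the CARA approach: draw a
    risk-of-occurrence Bernoulli, then a number of accumulations, then a
    lognormal size for each; report mean and 95th-percentile totals."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        if rng.random() > p_occurrence:      # geologic risk: nothing present
            totals.append(0.0)
            continue
        n = rng.choice(n_dist)               # empirical count distribution
        totals.append(sum(rng.lognormvariate(*size_dist) for _ in range(n)))
    totals.sort()
    return sum(totals) / trials, totals[int(0.95 * trials)]

mean_total, p95_total = assess_basin(0.5, [1, 2, 3], (0.0, 1.0), trials=2000)
```

Analog basins enter this scheme by constraining `n_dist` and `size_dist` for frontier areas with little or no drilling history.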
High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike
2010-01-01
We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, underlying often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods that are exemplified in the oncogenic pathway studies. Supplementary supporting material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M
2018-05-10
This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessment of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …) which limits the correct application of DEA. This paper proposes and describes the Robust Energy Efficiency DEA (REED) in its various stages, a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guideline to guide the user in the determination of WWTPs energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study related to the comparison of the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions.
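The core DEA efficiency score underlying frontier methods like REED is the solution of a small linear program per unit. The following is a textbook input-oriented CCR model, not the REED formulation itself, and the two-plant data are made up for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j):
    """Input-oriented CCR DEA score for unit j.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). Returns theta in (0, 1]:
    minimize theta s.t. X^T lam <= theta * x_j, Y^T lam >= y_j, lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # decision vars: [theta, lam]
    A_in = np.hstack([-X[j][:, None], X.T])      # inputs scaled down by theta
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # outputs at least y_j
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[j]])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Two plants with equal output; plant 1 uses twice the energy input.
X = np.array([[1.0], [2.0]])
Y = np.array([[1.0], [1.0]])
scores = [ccr_efficiency(X, Y, j) for j in range(2)]
```

REED then adjusts such scores for exogenous variables (climate, size, function) so that plants are benchmarked only against comparable peers.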
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
Novel thermal management system design methodology for power lithium-ion battery
NASA Astrophysics Data System (ADS)
Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro
2014-12-01
Battery packs composed of large format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use the energy more efficiently and for a better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cell's operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration in the battery pack product design methodology in order to improve the overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
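The simplest heat generation/dissipation balance behind such TMS models is a lumped-capacitance energy equation. The sketch below and its parameter values are illustrative assumptions only, far cruder than the coupled models the paper validates:

```python
def cell_temperature(t_amb, q_gen, hA, m_cp, dt, steps):
    """Lumped-capacitance sketch of one cell/module:
    m*cp * dT/dt = q_gen - h*A * (T - T_amb), forward-Euler integrated.
    t_amb [C], q_gen [W], hA [W/K], m_cp [J/K], dt [s]."""
    T = t_amb
    for _ in range(steps):
        T += dt * (q_gen - hA * (T - t_amb)) / m_cp
    return T

# Hypothetical module: 2 W of heat, 0.5 W/K of convective dissipation.
T_final = cell_temperature(25.0, 2.0, 0.5, 40.0, 1.0, 2000)
```

The steady-state temperature rise is `q_gen / hA` (here 4 K), which is the kind of asymmetry a TMS is sized to limit across cells.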
Methodology of modeling and measuring computer architectures for plasma simulations
NASA Technical Reports Server (NTRS)
Wang, L. P. T.
1977-01-01
A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.
NASA Technical Reports Server (NTRS)
Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun
1994-01-01
A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none will satisfy all of the needs of the Networks Division. Areas for improvement of the methodology used by the Networks Division are listed, with recommendations for specific action.
[Clinical practice guidelines in Peru: evaluation of its quality using the AGREE II instrument].
Canelo-Aybar, Carlos; Balbin, Graciela; Perez-Gomez, Ángela; Florez, Iván D
2016-01-01
To evaluate the methodological quality of clinical practice guidelines (CPGs) put into practice by the Peruvian Ministry of Health (MINSA), 17 CPGs from the ministry, published between 2009 and 2014, were independently evaluated by three methodological experts using the AGREE II instrument. The scores of the AGREE II domains were low or very low across all CPGs: scope and purpose (median, 44%), clarity of presentation (median, 47%), participation of decision-makers (median, 8%), methodological rigour (median, 5%), applicability (median, 5%), and editorial independence (median, 8%). In conclusion, the methodological quality of the CPGs implemented by MINSA is low; consequently, their use cannot be recommended. The implementation of the methodology for the development of CPGs described in the recently published CPG methodological preparation manual in Peru is a pressing need.
The Statistical point of view of Quality: the Lean Six Sigma methodology
Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto
2015-01-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management and has focused on efficiency outcomes. After the revision of methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of the complications during and after lobectomies. Using Lean Six Sigma methodology, the multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from statistical point of view, of the surgical quality. PMID:25973253
Application-specific coarse-grained reconfigurable array: architecture and design methodology
NASA Astrophysics Data System (ADS)
Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu
2015-06-01
Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploiting different levels of parallelism. However, a performance gap remains between CGRAs and application-specific integrated circuits (ASICs). Some application domains, such as software-defined radios (SDRs), require flexibility even as performance demands increase, so more effective CGRA architectures are needed. Customising a CGRA to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented, along with a mapping algorithm based on ant colony optimisation. Experimental results on the SDR target domain show that, compared with other general-purpose and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.
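As a rough illustration of the pattern-identification step (our sketch only; the paper's actual method operates on dataflow graphs and maps with ant colony optimisation), counting recurring operation sequences in a linearised trace flags candidates for a special PE:

```python
from collections import Counter

def frequent_patterns(op_trace, width=2):
    """Count length-`width` runs of consecutive operations in a linearised
    operation trace; the most frequent runs are candidates for hardware
    specialisation in an SPE."""
    windows = zip(*(op_trace[i:] for i in range(width)))
    return Counter(windows)

# A multiply-accumulate-heavy kernel makes ('mul', 'add') a frequent pattern,
# suggesting a fused multiply-add function unit.
```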
DOT National Transportation Integrated Search
1995-01-01
Prepared ca. 1995. This paper illustrates the use of the simulation-optimization technique of response surface methodology (RSM) in traffic signal optimization of urban networks. It also quantifies the gains of using the common random number (CRN) va...
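The core of response surface methodology is fitting a low-order polynomial to simulation observations so the surface, rather than the expensive simulator, can be optimised. A minimal sketch (illustrative, not the report's model; factor names like green split and cycle length are our assumptions), using coded factor levels:

```python
import numpy as np

def fit_response_surface(g, c, d):
    """Least-squares fit of the second-order model
    d = b0 + b1*g + b2*c + b3*g^2 + b4*c^2 + b5*g*c,
    where g and c are coded signal-timing factors (e.g. green split,
    cycle length) and d is the simulated delay response."""
    X = np.column_stack([np.ones_like(g), g, c, g ** 2, c ** 2, g * c])
    beta, *_ = np.linalg.lstsq(X, d, rcond=None)
    return beta

def predict(beta, g, c):
    """Evaluate the fitted quadratic surface."""
    return (beta[0] + beta[1] * g + beta[2] * c
            + beta[3] * g ** 2 + beta[4] * c ** 2 + beta[5] * g * c)
```

Common random numbers (CRN), mentioned above, reduce the variance of *differences* between simulated design points, sharpening exactly this kind of fit.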
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…
Nonlinear and adaptive control
NASA Technical Reports Server (NTRS)
Athans, Michael
1989-01-01
The primary thrust of the research was to conduct fundamental research in the theories and methodologies for designing complex high-performance multivariable feedback control systems, and to conduct feasibility studies in application areas of interest to NASA sponsors that point out advantages and shortcomings of available control system design methodologies.
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.
2016-12-01
As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resource and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, B.P.; Legg, J.; Travis, C.C.
1995-06-01
This document describes a worker health risk evaluation methodology for assessing risks associated with Environmental Restoration (ER) and Waste Management (WM). The methodology is appropriate for estimating worker risks across the Department of Energy (DOE) Complex at both programmatic and site-specific levels. This document supports the worker health risk methodology used to perform the human health risk assessment portion of the DOE Programmatic Environmental Impact Statement (PEIS), although it has applications beyond the PEIS, such as installation-wide worker risk assessments, screening-level assessments, and site-specific assessments.
Design consideration of resonance inverters with electro-technological application
NASA Astrophysics Data System (ADS)
Hinov, Nikolay
2017-12-01
This study presents design considerations for resonant inverters with electro-technological applications. The presented methodology results from the author's investigations and analyses of different types and operating regimes of resonant inverters. Schemes of resonant inverters without inverse diodes are considered. The first harmonic method is used in the analysis and design; for inverters with electro-technological applications this method gives very good accuracy and does not require a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The validity of the results is confirmed by simulation and by research on physical prototypes.
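The first harmonic method replaces the inverter's switched output with its fundamental Fourier component; for an ideal square wave of amplitude V_d the fundamental amplitude is 4·V_d/π. A small numerical check of that standard result (our sketch, not the author's design procedure):

```python
import math

def fundamental_amplitude(v_dc, n_samples=100_000):
    """Numerically extract the fundamental (first harmonic) Fourier
    coefficient of a square wave of amplitude v_dc over one period.
    Analytically this is 4 * v_dc / pi."""
    total = 0.0
    for k in range(n_samples):
        theta = 2 * math.pi * (k + 0.5) / n_samples  # midpoint sampling
        square = v_dc if math.sin(theta) >= 0 else -v_dc
        total += square * math.sin(theta)
    return 2 * total / n_samples  # b1 Fourier sine coefficient

# For a 100 V DC link, the fundamental is about 127.3 V peak.
```

Designing against this single sinusoid is what lets the load circuit be analysed with ordinary AC phasor methods.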
Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?
NASA Technical Reports Server (NTRS)
Gregory, Irene M.
1999-01-01
High performance aircraft of the future will be designed lighter, more maneuverable, and to operate over an ever-expanding flight envelope. From the flight control perspective, one of the largest differences between current and future advanced aircraft is elasticity. Over the last decade, dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores the application of dynamic inversion to an advanced highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed rather promising. An analytical study has been undertaken to better understand the nature of these modifications and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion to control large flexible aircraft.
Semantic Network Adaptation Based on QoS Pattern Recognition for Multimedia Streams
NASA Astrophysics Data System (ADS)
Exposito, Ernesto; Gineste, Mathieu; Lamolle, Myriam; Gomez, Jorge
This article proposes an ontology based pattern recognition methodology to compute and represent common QoS properties of the Application Data Units (ADU) of multimedia streams. The use of this ontology by mechanisms located at different layers of the communication architecture will allow implementing fine per-packet self-optimization of communication services regarding the actual application requirements. A case study showing how this methodology is used by error control mechanisms in the context of wireless networks is presented in order to demonstrate the feasibility and advantages of this approach.
Fuzzy logic modeling of high performance rechargeable batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, P.; Fennie, C. Jr.; Reisner, D.E.
1998-07-01
Accurate battery state-of-charge (SOC) measurements are critical in many portable electronic device applications. Yet conventional techniques for battery SOC estimation are limited in their accuracy, reliability, and flexibility. In this paper the authors present a powerful new approach to estimate battery SOC using a fuzzy logic-based methodology. This approach provides a universally applicable, accurate method for battery SOC estimation either integrated within, or as an external monitor to, an electronic device. The methodology is demonstrated in modeling impedance measurements on Ni-MH cells and discharge voltage curves of Li-ion cells.
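A toy sketch of the fuzzy-logic idea (the membership breakpoints and rule centroids below are invented for illustration; the paper's actual model is fit to measured Ni-MH impedance and Li-ion discharge data): fuzzify a measured impedance, apply rules such as "high impedance implies low SOC", and defuzzify by weighted centroid.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_soc(impedance_ohm):
    """Toy fuzzy rule base: low impedance -> high SOC, high impedance -> low SOC.
    All breakpoints (ohms) and output centroids (% SOC) are illustrative only."""
    mu_low = tri(impedance_ohm, -0.05, 0.05, 0.15)   # "impedance is low"
    mu_med = tri(impedance_ohm, 0.05, 0.15, 0.25)    # "impedance is medium"
    mu_high = tri(impedance_ohm, 0.15, 0.25, 0.35)   # "impedance is high"
    num = mu_low * 90 + mu_med * 50 + mu_high * 10   # rule centroids: 90/50/10 % SOC
    den = mu_low + mu_med + mu_high
    return num / den if den else None                # weighted-centroid defuzzification
```

The appeal noted in the abstract is that such rule bases interpolate smoothly between calibration points without an explicit electrochemical model.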
Jones, Cheryl Bland
2005-01-01
This is the second article in a 2-part series focusing on nurse turnover and its costs. Part 1 (December 2004) described nurse turnover costs within the context of human capital theory, and using human resource accounting methods, presented the updated Nursing Turnover Cost Calculation Methodology. Part 2 presents an application of this method in an acute care setting and the estimated costs of nurse turnover that were derived. Administrators and researchers can use these methods and cost information to build a business case for nurse retention.
A new vision of carbonate slopes: the Little Bahama Bank
NASA Astrophysics Data System (ADS)
Mulder, Thierry; Gillet, Hervé; Hanquiez, Vincent; Reijmer, John J.; Tournadour, Elsa; Chabaud, Ludivine; Principaud, Mélanie; Schnyder, Jara; Borgomano, Jean
2015-04-01
Recent data collected in November 2014 (RV Walton Smith) on the upper slope of the Little Bahama Bank (LBB) between 30 and 400 m water depth allowed us to characterize the uppermost slope (Rankey et al., 2012) over an area of 170 km2. The new data set includes multibeam bathymetry and acoustic imagery, 3.5 kHz very-high-resolution (VHR) seismic reflection lines, 21 gravity cores and 11 Van Veen grabs. The upper slope of the LBB does not show a steep submarine cliff as known from the western Great Bahama Bank. The carbonate bank progressively deepens towards the basin through a slightly inclined plateau. The slope value is < 6° down to a water depth of about 70 m. The plateau is incised by decameter-wide gullies that are covered with indurated sediment. Some gullies, like Roberts Cuts, are larger and may play an important role in sediment transfer from the shallow-water carbonate bank down to the canyon heads at 400-500 m water depth (Mulder et al., 2012). In the gully area, the present-day reef rests on paleo-reefs that outcrop at a water depth of about 40 m. These paleo-reef structures could represent reefs that established themselves during past periods of sea-level stagnation. Below this water depth, the slope steepens up to 30° to form the marginal escarpment (Rankey et al., 2012), which is succeeded by the open margin realm (Rankey et al., 2012). The slope inclination decreases at about 180-200 m water depth. Between 20 and 200 m water depth, the VHR seismic shows no sub-bottom reflector. Between 180 and 320 m water depth, the seafloor smoothens. The VHR seismic shows an onlapping sediment wedge, which starts at this water depth and shows a blind or very crudely stratified echo facies. The sediment thickness of this Holocene unit may exceed 20 m. It fills small depressions in the substratum and thickens in front of gullies that cut the carbonate platform edge.
Sediment samples show the abundance of carbonate mud on the present-day Bahamian seafloor. In gullies, coarser sediment can be found. In some cases, soft sediments are absent, suggesting sediment bypassing. At water depths between 40 and 100 m, the present-day seafloor is covered with bioclastic sediments. The main carbonate producer seems to be the alga genus Halimeda. Sediments collected in the deeper part of the basin (water depth = 1080 m) on the distal lobe consist of massive, fine to medium, well-sorted aragonitic sand. This suggests that carbonate slope systems are able to sort sediment despite the relatively short slope distance. Sorting could either be due to flow spilling above the terraces identified in the canyon heads (Mulder et al., 2012) or could result from bottom currents. In this area, flow velocity profiles in the water column show the presence of two superposed water masses with a pycnocline at about 600-700 m water depth. Mulder, T., Ducassou, E., Gillet, H., Hanquiez, V., Tournadour, E., Combes, J., Eberli, G.P., Kindler, P., Gonthier, E., Conesa, G., Robin, C., Sianipar, R., Reijmer, J.J.G., and François, A. (2012). Canyon morphology on a modern carbonate slope of the Bahamas: Evidence of regional tectonic tilting. Geology, 40(9), 771-774. Rankey, E.C., and Doolittle, D.F. (2012). Geomorphology of carbonate platform-marginal uppermost slopes: Insights from a Holocene analogue, Little Bahama Bank, Bahamas. Sedimentology, 59, 2146-2171.
The RAAF Logistics Study. Volume 4,
1986-10-01
Use of Issue-Based Root Definitions ... Application of Soft Systems Methodology to Information Systems Analysis ... Conclusion ... List of Abbreviations ... 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle ... the soft systems methodology has many advantages which recommend it to this type of study area, but it does not model the time evolution of a system.
Aircraft optimization by a system approach: Achievements and trends
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1992-01-01
Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.
2016-08-01
STO TECHNICAL REPORT TR-MSG-106: Enhanced CAX Architecture, Design and Methodology – SPHINX (Architecture, définition et méthodologie améliorées des exercices...) ... transition, application and field-testing, experimentation and a range of related scientific activities that include systems engineering, operational...
Comparing Characteristics of Highly Circulated Titles for Demand-Driven Collection Development.
ERIC Educational Resources Information Center
Britten, William A; Webster, Judith D.
1992-01-01
Describes methodology for analyzing MARC (machine-readable cataloging) records of highly circulating titles to document common characteristics for collection development purposes. Application of the methodology in a university library is discussed, and data are presented on commonality of subject heading, author, language, and imprint date for…
Workshop on LCA: Methodology, Current Development, and Application in Standards - LCA Methodology
As ASTM standards that include Life Cycle Assessment are being developed, it is imperative that practitioners in the field learn more about what LCA is and how to conduct it. This presentation will include an overview of the LCA process and will concentrate on ...
Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application
ERIC Educational Resources Information Center
Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah
2004-01-01
This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…
Characterizing wood-plastic composites via data-driven methodologies
John G. Michopoulos; John C. Hermanson; Robert Badaliance
2007-01-01
The recent increase in the use of wood-plastic composite materials across various application areas has underlined the need for an efficient and robust methodology to characterize their nonlinear anisotropic constitutive behavior. In addition, the multiplicity of loading conditions in structures utilizing these materials further increases the need for a characterization...
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Development of a Teaching Methodology for Undergraduate Human Development in Psychology
ERIC Educational Resources Information Center
Rodriguez, Maria A.; Espinoza, José M.
2015-01-01
The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…
Modern Psychometric Methodology: Applications of Item Response Theory
ERIC Educational Resources Information Center
Reid, Christine A.; Kolakowsky-Hayner, Stephanie A.; Lewis, Allen N.; Armstrong, Amy J.
2007-01-01
Item response theory (IRT) methodology is introduced as a tool for improving assessment instruments used with people who have disabilities. Need for this approach in rehabilitation is emphasized; differences between IRT and classical test theory are clarified. Concepts essential to understanding IRT are defined, necessary data assumptions are…
Marzocchini, Manrico; Tatàno, Fabio; Moretti, Michela Simona; Antinori, Caterina; Orilisi, Stefano
2018-06-05
A possible approach for determining soil and groundwater quality criteria for contaminated sites is the comparative risk assessment. Originating from but not limited to Italian interest in a decentralised (regional) implementation of comparative risk assessment, this paper first addresses the proposal of an original methodology called CORIAN REG-M, which was created with initial attention to the context of potentially contaminated sites in the Marche Region (Central Italy). To deepen the technical-scientific knowledge and applicability of the comparative risk assessment, the following characteristics of the CORIAN REG-M methodology appear to be relevant: the simplified but logical assumption of three categories of factors (source and transfer/transport of potential contamination, and impacted receptors) within each exposure pathway; the adaptation to quality and quantity of data that are available or derivable at the given scale of concern; the attention to a reliable but unsophisticated modelling; the achievement of a conceptual linkage to the absolute risk assessment approach; and the potential for easy updating and/or refining of the methodology. Further, the application of the CORIAN REG-M methodology to some case-study sites located in the Marche Region indicated the following: a positive correlation can be expected between air and direct contact pathway scores, as well as between individual pathway scores and the overall site scores based on a root-mean-square algorithm; the exposure pathway, which presents the highest variability of scores, tends to be dominant at sites with the highest computed overall site scores; and the adoption of a root-mean-square algorithm can be expected to emphasise the overall site scoring.
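The root-mean-square aggregation mentioned above can be sketched as follows (our illustration of the aggregation step only; the methodology's actual factor scoring within each pathway is more elaborate). RMS aggregation gives extra weight to the highest-scoring pathway, which is the emphasising behaviour the abstract notes:

```python
import math

def overall_site_score(pathway_scores):
    """Root-mean-square aggregation of per-pathway exposure scores.

    Compared with a plain mean, RMS is pulled upward by the single
    highest-scoring pathway, so dominant pathways drive site ranking.
    """
    return math.sqrt(sum(s * s for s in pathway_scores) / len(pathway_scores))
```

For example, one pathway at 10 with two at 0 yields an RMS score well above the arithmetic mean of the same three values.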
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables. That is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed; reasonably good approximate prediction equations can be developed using the methodology described here.
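The report's full inspection-effort model is not reproduced here, but the standard attribute-sampling relationship underlying goal detection probabilities can be sketched (our illustration): the chance that a random sample of n items from a population of N containing d falsified items includes at least one of them.

```python
from math import comb

def detection_probability(N, d, n):
    """P(at least one defective item in the sample) when n items are drawn
    without replacement from N items of which d are defective (e.g. falsified).

    Hypergeometric complement: 1 - C(N-d, n) / C(N, n).
    """
    if n > N - d:
        return 1.0  # sample cannot avoid the defectives
    return 1 - comb(N - d, n) / comb(N, n)
```

This is the kind of relationship through which goal quantity (via d), sampling effort (n), and throughput (via N) jointly drive both detection probability and inspection cost.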
Utilizing a structural meta-ontology for family-based quality assurance of the BioPortal ontologies.
Ochs, Christopher; He, Zhe; Zheng, Ling; Geller, James; Perl, Yehoshua; Hripcsak, George; Musen, Mark A
2016-06-01
An Abstraction Network is a compact summary of an ontology's structure and content. In previous research, we showed that Abstraction Networks support quality assurance (QA) of biomedical ontologies. The development of an Abstraction Network and its associated QA methodologies, however, is a labor-intensive process that previously was applicable only to one ontology at a time. To improve the efficiency of the Abstraction-Network-based QA methodology, we introduced a QA framework that uses uniform Abstraction Network derivation techniques and QA methodologies that are applicable to whole families of structurally similar ontologies. For the family-based framework to be successful, it is necessary to develop a method for classifying ontologies into structurally similar families. We now describe a structural meta-ontology that classifies ontologies according to certain structural features that are commonly used in the modeling of ontologies (e.g., object properties) and that are important for Abstraction Network derivation. Each class of the structural meta-ontology represents a family of ontologies with identical structural features, indicating which types of Abstraction Networks and QA methodologies are potentially applicable to all of the ontologies in the family. We derive a collection of 81 families, corresponding to classes of the structural meta-ontology, that enable a flexible, streamlined family-based QA methodology, offering multiple choices for classifying an ontology. The structure of 373 ontologies from the NCBO BioPortal is analyzed and each ontology is classified into multiple families modeled by the structural meta-ontology.
Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M
2013-01-01
In Brazil, Solid Waste Disposal Sites have operated without consideration of environmental criteria, these areas being characterized by methane (CH4) emissions during the anaerobic degradation of organic matter. The United Nations organization has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, where projects that seek to reduce the emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards related to the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through the application of the methodology in the landfill case study was that it would be possible to achieve an ex-ante emission reduction of 74,013 tCO2 equivalent if the proposed CDM project activity were implemented.
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Darwinism and positivism as methodological influences on the development of psychology.
Mackenzie, B
1976-10-01
The methodological significance of evolutionary theory for psychology may be distinguished from its substantive or theoretical significance. The methodological significance was that evolutionary theory broadened the current conceptions of scientific method and rendered them relatively independent of physics. It thereby made the application of the "scientific method" to psychology much more feasible than it had been previously, and thus established the possibility of a wide-ranging scientific psychology for the first time. The methodological eclecticism that made scientific psychology possible did not, however, remain a feature of psychology for very long. Psychology's methodology rapidly became restricted and codified through the influence of, and in imitation of, the rigorously positivistic orientation of physics around the turn of the twentieth century.
The Role of Ambulatory Assessment in Psychological Science.
Trull, Timothy J; Ebner-Priemer, Ulrich
2014-12-01
We describe the current use and future promise of an innovative methodology, ambulatory assessment (AA), that can be used to investigate psychological, emotional, behavioral, and biological processes of individuals in their daily life. The term AA encompasses a wide range of methods used to study people in their natural environment, including momentary self-report, observational, and physiological. We emphasize applications of AA that integrate two or more of these methods, discuss the smart phone as a hub or access point for AA, and discuss future applications of AA methodology to the science of psychology. We pay particular attention to the development and application of Wireless Body Area Networks (WBANs) that can be implemented with smart phones and wireless physiological monitoring devices, and we close by discussing future applications of this approach to matters relevant to psychological science.
Applications of aerospace technology
NASA Technical Reports Server (NTRS)
Rouse, Doris J.
1984-01-01
The objective of the Research Triangle Institute Technology Transfer Team is to assist NASA in achieving widespread utilization of aerospace technology in terrestrial applications. Widespread utilization implies that the application of NASA technology is to benefit a significant sector of the economy and population of the Nation. This objective is best attained by stimulating the introduction of new or improved commercially available devices incorporating aerospace technology. A methodology is presented for the team's activities as an active transfer agent linking NASA Field Centers, industry associations, user groups, and the medical community. This methodology is designed to: (1) identify priority technology requirements in industry and medicine, (2) identify applicable NASA technology that represents an opportunity for a successful solution and commercial product, (3) obtain the early participation of industry in the transfer process, and (4) successfully develop a new product based on NASA technology.
Development of a North American paleoclimate pollen-based reconstruction database application
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Mosher, Steven; Viau, Andre
2013-04-01
Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; therefore there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology using the open-source R language and North American pollen databases (e.g. NAPD, NEOTOMA), where this application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results. It allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; a choice of one of 70 available climate variables; the ability to substitute an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for a thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
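The application itself is written in R, but the core of a pollen-based climate reconstruction can be sketched in a few lines (shown here in Python for consistency with other sketches in this collection; this illustrates the modern analog technique, one of several methods such a tool might offer, and is not the program's code): fossil assemblages are matched to modern samples by squared chord distance, and the climate values of the closest analogs are averaged.

```python
import math

def squared_chord_distance(p, q):
    """Dissimilarity between two pollen assemblages, each given as a list
    of taxon proportions summing to 1."""
    return sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

def mat_reconstruct(fossil, modern_assemblages, modern_climates, k=3):
    """Modern analog technique: average the climate values of the k modern
    samples whose pollen assemblages are most similar to the fossil one."""
    ranked = sorted(
        zip(modern_assemblages, modern_climates),
        key=lambda mc: squared_chord_distance(fossil, mc[0]),
    )
    return sum(climate for _, climate in ranked[:k]) / k
```

Choices such as the pollen sum, the spatial target domain, and k are exactly the methodological knobs the abstract says the application exposes for testing.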
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., a consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine-tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to manage both the excruciating level of detail and the user's requirement for transparency.
Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N
2015-03-01
A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and the data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in a 50 mm thick coarse grain austenitic stainless steel specimen. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved as compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for successful imaging of defects in a B-scan.
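The reconstruction idea, keep the signal-bearing components and discard the scattering noise, can be sketched without the EEMD machinery itself. Below, a frequency-band mask stands in for "sum the selected IMFs" so the example stays dependency-free (a faithful implementation would obtain IMFs from an EEMD package such as PyEMD and apply the minimisation step); the echo shape, noise level, and band edges are synthetic assumptions:

```python
import numpy as np

def snr_db(clean, residual_noise):
    # SNR of a reconstruction: clean-signal energy over residual-noise energy
    return 10 * np.log10(np.sum(clean**2) / np.sum(residual_noise**2))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
echo = np.exp(-((t - 0.5) ** 2) / 1e-4) * np.sin(2 * np.pi * 200 * t)  # defect echo
y = echo + 0.5 * rng.standard_normal(t.size)  # broadband grain-noise stand-in

# Stand-in for "reconstruct from the selected IMFs": keep only the band
# around the probe frequency and discard everything else.
spec = np.fft.rfft(y)
freq = np.fft.rfftfreq(t.size, d=t[1] - t[0])
spec[(freq < 100) | (freq > 300)] = 0.0
recon = np.fft.irfft(spec, n=t.size)

gain = snr_db(echo, recon - echo) - snr_db(echo, y - echo)
print(f"SNR improvement: {gain:.1f} dB")
```

Because the synthetic noise is spread over the full band while the echo is narrowband, discarding the out-of-band components yields several dB of SNR gain; selecting IMFs plays the analogous role when the noise and signal occupy different decomposition scales.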
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to support an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation, the Goal-Decision Network (GDN), to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. The negotiation framework is also applied to the problem of planning for cogeneration interconnection. Simulation results are presented to illustrate the cogeneration planning process.
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
NASA Technical Reports Server (NTRS)
Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek
2002-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.
Gillis, Richard B; Rowe, Arthur J; Adams, Gary G; Harding, Stephen E
2014-10-01
This short review considers the range of modern techniques for the hydrodynamic characterisation of macromolecules, particularly the large glycosylated systems used in the food, biopharma and healthcare industries. The range or polydispersity of molecular weights and conformations presents special challenges compared to proteins. The review aims, without going into any great theoretical or methodological depth, to help the Industrial Biotechnologist choose the appropriate methodology or combination of methodologies for providing the detail he/she needs for particular applications.
Data Mining for Financial Applications
NASA Astrophysics Data System (ADS)
Kovalerchuk, Boris; Vityaev, Evgenii
This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
NASA Astrophysics Data System (ADS)
D'silva, Oneil; Kerrison, Roger
2013-09-01
A key feature for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned spaceflight. The principal scope of the paper is to evaluate the use of industry-standard probability risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probability risk assessment methodology and hazard risk frequency criteria in order to apply the necessary safety controls that allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probability risk assessments. The paper concludes with suggestions for the incorporation of existing industry risk/safety plans to create an applicable safety process for future activities and programs.
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti
NASA Technical Reports Server (NTRS)
Johnson, Jerry
1992-01-01
The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Financial options methodology for analyzing investments in new technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenning, B.D.
1994-12-31
The evaluation of investments in longer term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
[SciELO: method for electronic publishing].
Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C
2001-01-01
It describes the SciELO (Scientific Electronic Library Online) Methodology for the electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles on which the methodology's development was founded, its application in the building of the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible, and comprehensive solution for scientific electronic publishing.
Financial options methodology for analyzing investments in new technology
NASA Technical Reports Server (NTRS)
Wenning, B. D.
1995-01-01
The evaluation of investments in longer term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options evaluation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
Segmentation of medical images using explicit anatomical knowledge
NASA Astrophysics Data System (ADS)
Wilson, Laurie S.; Brown, Stephen; Brown, Matthew S.; Young, Jeanne; Li, Rongxin; Luo, Suhuai; Brandt, Lee
1999-07-01
Knowledge-based image segmentation is defined in terms of the separation of image analysis procedures and the representation of knowledge. Such an architecture is particularly suitable for medical image segmentation because of the large amount of structured domain knowledge. A general methodology for the application of knowledge-based methods to medical image segmentation is described. This includes frames for knowledge representation, fuzzy logic for anatomical variations, and a strategy for determining the order of segmentation from the model specification. This method has been applied to three separate problems: 3D thoracic CT, chest X-rays, and CT angiography. The application of the same methodology to such a range of applications suggests a major role in medical imaging for segmentation methods incorporating representation of anatomical knowledge.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Biffis, Andrea; Dvorakova, Gita; Falcimaigne-Cordin, Aude
2012-01-01
The current state of the art in the development of methodologies for the preparation of molecularly imprinted polymers (MIPs) in predetermined physical forms is critically reviewed, with particular attention paid to the forms most widely employed in practical applications, such as spherical beads in the micro- to nanometer range, microgels, monoliths, and membranes. Although applications of the various MIP physical forms are mentioned, the focus of the paper is mainly on the description of the various preparative methods. The aim is to provide the reader with an overview of the latest achievements in the field, as well as with a means of critically evaluating the various proposed methodologies towards an envisaged application. The review covers the literature up to early 2010, with special emphasis on the developments of the last 10 years.
Q and you: The application of Q methodology in recreation research
Whitney Ward
2010-01-01
Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...
76 FR 71920 - Payment for Home Health Services and Hospice Care by Non-VA Providers
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
... concerning the billing methodology for non-VA providers of home health services and hospice care. The proposed rulemaking would include home health services and hospice care under the VA regulation governing payment for other non-VA health care providers. Because the newly applicable methodology cannot supersede...
Classification of Word Levels with Usage Frequency, Expert Opinions and Machine Learning
ERIC Educational Resources Information Center
Sohsah, Gihad N.; Ünal, Muhammed Esad; Güzey, Onur
2015-01-01
Educational applications for language teaching can utilize the language levels of words to target proficiency levels of students. This paper and the accompanying data provide a methodology for making educational standard-aligned language-level predictions for all English words. The methodology involves expert opinions on language levels and…
Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process
ERIC Educational Resources Information Center
Arnab, Sylvester; Clarke, Samantha
2017-01-01
The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…
The Professionalization of Dental Students: The Application of Socio-Anthropological Methodology.
ERIC Educational Resources Information Center
Platt, Larry A.; Bailey, Wilfrid C.
This paper discusses the advantages of using both qualitative and quantitative methodological procedures in investigating attitudinal and perception changes in the population studied. This project is part of a 4-year longitudinal study involving 24 dental students and 29 faculty members of a new southern dental school. The paper reviews some of…
Action Learning, Team Learning and Co-Operation in the Czech Republic
ERIC Educational Resources Information Center
Kubatova, Slava
2012-01-01
This account of practice presents two cases of the application of Action Learning (AL) communication methodology as described by Marquardt [2004. "Optimising the power of action learning". Mountain View, CA: Davies-Black Publishing]. The teams were Czech and international top management teams. The AL methodology was used to improve…
School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology
ERIC Educational Resources Information Center
Newman, Daniel S.; Clare, Mary M.
2016-01-01
The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…
Using Self-Experimentation and Single-Subject Methodology to Promote Critical Thinking
ERIC Educational Resources Information Center
Cowley, Brian J.; Lindgren, Ann; Langdon, David
2006-01-01
Critical thinking is often absent from classroom endeavor because it is hard to define (Gelder, 2005) or is difficult to assess (Bissell & Lemons, 2006). Critical thinking is defined as application, analysis, synthesis, and evaluation (Browne & Minnick, 2005). This paper shows how self-experimentation and single-subject methodology can be used to…
Science, Technology, and Society: A Perspective on the Enhancement of Scientific Education
ERIC Educational Resources Information Center
Courville, Keith
2009-01-01
(Purpose) This literature review discusses the history and application of science, technology, and society (STS) teaching methodologies. (Findings) Topics addressed in this paper include: (1) developmental history of STS; (2) fundamental beliefs of STS practitioners; (3) STS methodology in the classroom; (4) Difficulty in implementing STS; (5) STS…
ERIC Educational Resources Information Center
Cosier, Meghan
2012-01-01
Historically, researchers focused on individuals with severe disabilities have utilized single-subject research methodologies to study the application of the behavioral theory to learning. In contrast, disability studies scholars have primarily used qualitative research methodologies to study quality of life or policy issues related to individuals…
18 CFR 342.4 - Other rate changing methodologies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Other rate changing methodologies. 342.4 Section 342.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... regard to the applicable ceiling level under § 342.3. (b) Market-based rates. A carrier may attempt to...
ERIC Educational Resources Information Center
Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.
2003-01-01
This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…
Educational Strategies for Teaching Basic Family Dynamics to Non-Family Therapists.
ERIC Educational Resources Information Center
Merkel, William T.; Rudisill, John R.
1985-01-01
Presents six-part methodology for teaching basic concepts of family systems to non-family therapists and describes application of methodology to teach primary care physicians. Explains use of simulated encounters in which a physically symptomatic adolescent is interviewed alone, then with his mother, then with his whole family. (Author/NRB)
NASA Astrophysics Data System (ADS)
Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.
2018-04-01
Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.
NASA Technical Reports Server (NTRS)
Izygon, Michel
1993-01-01
This report summarizes the work accomplished during the past nine months to help three different organizations involved in Flight Planning and Mission Operations systems transition to object-oriented technology by adopting one of the most widely used object-oriented analysis and design methodologies.
Education and Training in Ethical Decision Making: Comparing Context and Orientation
ERIC Educational Resources Information Center
Perri, David F.; Callanan, Gerard A.; Rotenberry, Paul F.; Oehlers, Peter F.
2009-01-01
Purpose: The purpose of this paper is to present a teaching methodology for improving the understanding of ethical decision making. This pedagogical approach is applicable in college courses and in corporate training programs. Design/methodology/approach: Participants are asked to analyze a set of eight ethical dilemmas with differing situational…
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
ERIC Educational Resources Information Center
Woods, Charlotte
2012-01-01
This article presents an original application of Q methodology in investigating the challenging arena of emotion in the Higher Education (HE) workplace. Q's strength lies in capturing holistic, subjective accounts of complex and contested phenomena but is unusual in employing a statistical procedure within an interpretivist framework. Here Q is…
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…
Cho, Kyung Hwa; Lee, Seungwon; Ham, Young Sik; Hwang, Jin Hwan; Cha, Sung Min; Park, Yongeun; Kim, Joon Ha
2009-01-01
The present study proposes a methodology for determining the effective dispersion coefficient based on field measurements performed in Gwangju (GJ) Creek in South Korea, which is environmentally degraded by artificial interferences such as weirs and culverts. Many previous works determining the dispersion coefficient were limited in application due to the complexity and artificial interferences in natural streams. Therefore, a sequential combination of the N-Tank-In-Series (NTIS) model and the Advection-Dispersion-Reaction (ADR) model is proposed in this study for evaluating the dispersion process in a complex stream channel. A series of water quality data was intensively monitored in the field to determine the effective dispersion coefficient of E. coli on a rainy day. As a result, the suggested methodology reasonably estimated the dispersion coefficient for GJ Creek as 1.25 m2/s. The sequential combined method also provided Number of tanks-Velocity-Dispersion coefficient (NVD) curves for convenient evaluation of the dispersion coefficients of other rivers or streams. Compared with previous studies, the present methodology is quite general and simple for determining effective dispersion coefficients applicable to other rivers and streams.
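The tanks-in-series half of such a combined NTIS/ADR scheme is simple to sketch. The code below uses the classical equivalence D ≈ uL/(2N) between an N-tank cascade and a dispersive channel, plus a first-order decay term standing in for the reaction part; every number is illustrative, not a value calibrated in the study:

```python
def ntis_dispersion(n_tanks, velocity_m_s, reach_length_m):
    # classical tanks-in-series <-> dispersion-model equivalence, D ~ u*L/(2N)
    return velocity_m_s * reach_length_m / (2 * n_tanks)

def tank_cascade(n_tanks, residence_s, decay_per_s, pulse, dt, steps):
    # route a pulse (e.g. of E. coli) through N equal stirred tanks with
    # first-order decay; returns the outlet concentration history
    c = [0.0] * n_tanks
    c[0] = pulse
    tau = residence_s / n_tanks       # residence time of each tank
    outlet = []
    for _ in range(steps):
        nxt = []
        for i in range(n_tanks):
            inflow = c[i - 1] / tau if i > 0 else 0.0
            nxt.append(c[i] + dt * (inflow - c[i] / tau - decay_per_s * c[i]))
        c = nxt
        outlet.append(c[-1])
    return outlet

# illustrative: 20 tanks over a 100 m reach flowing at 0.5 m/s
print(ntis_dispersion(20, 0.5, 100.0))            # -> 1.25 (m^2/s)
history = tank_cascade(20, 200.0, 1e-4, 1.0, 0.1, 4000)
```

The outlet pulse rises and falls around the 200 s mean residence time, the tanks-in-series analogue of a dispersing tracer cloud; increasing the number of tanks sharpens the pulse, i.e. lowers the implied dispersion coefficient.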
Classification of samples into two or more ordered populations with application to a cancer trial.
Conde, D; Fernández, M A; Rueda, C; Salvador, B
2012-12-10
In many applications, especially in cancer treatment and diagnosis, investigators are interested in classifying patients into various diagnosis groups on the basis of molecular data such as gene expression or proteomic data. Often, some of the diagnosis groups are known to be related to higher or lower values of some of the predictors. The standard methods of classifying patients into various groups do not take into account the underlying order. This could potentially result in high misclassification rates, especially when the number of groups is larger than two. In this article, we develop classification procedures that exploit the underlying order among the mean values of the predictor variables and the diagnostic groups by using ideas from order-restricted inference. We generalize the existing methodology on discrimination under restrictions and provide empirical evidence to demonstrate that the proposed methodology improves over the existing unrestricted methodology. The proposed methodology is applied to a bladder cancer data set where the researchers are interested in classifying patients into various groups.
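The core of order-restricted discrimination, fit group means subject to a known ordering and classify to the nearest restricted mean, can be sketched with the pool-adjacent-violators algorithm (PAVA). This is a generic one-predictor illustration of the idea, not the authors' actual discriminant procedure:

```python
def pava(y, w=None):
    # pool-adjacent-violators: weighted least-squares fit of an
    # increasing sequence to the raw group means y
    w = w or [1.0] * len(y)
    blocks = []  # each block: [pooled value, total weight, count]
    for v, wt in zip(y, w):
        blocks.append([v, wt, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            blocks.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2, c1 + c2])
    fitted = []
    for v, _, c in blocks:
        fitted.extend([v] * c)
    return fitted

def classify(x, sample_means):
    # nearest-restricted-mean rule for a predictor whose group means
    # are known a priori to increase with diagnosis group
    means = pava(sample_means)
    return min(range(len(means)), key=lambda g: abs(x - means[g]))

# sample means violate the known ordering (group 1 < group 0); PAVA pools them
print(pava([2.1, 1.9, 3.5]))           # -> [2.0, 2.0, 3.5]
print(classify(3.2, [2.1, 1.9, 3.5]))  # nearest restricted mean is 3.5 -> 2
```

Pooling the violating neighbours is what distinguishes this from unrestricted nearest-mean classification: noisy sample means that contradict the known order are replaced by their weighted average before any patient is assigned.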
NASA Technical Reports Server (NTRS)
Leifeste, Mark R.
2007-01-01
Composite Overwrapped Pressure Vessels (COPVs) are commonly used in spacecraft for containment of pressurized gases and fluids, offering strength and weight savings. The energy stored is capable of causing extensive spacecraft damage and personal injury in the event of sudden failure. These apparently simple structures, composed of an impermeable metallic liner and a fiber/resin composite overwrap, are really complex structures with numerous material and structural phenomena interacting during pressurized use, which requires multiple, interrelated monitoring methodologies to track and understand subtle changes critical to safe use. Testing of COPVs at NASA Johnson Space Center White Sands Test Facility (WSTF) has employed multiple in-situ, real-time nondestructive evaluation (NDE) methodologies as well as pre- and post-test comparative techniques to monitor changes in material and structural parameters during advanced pressurized testing. The use of NDE methodologies and their relationship to monitoring changes is discussed based on testing of real-world spacecraft COPVs. Lessons learned are used to present recommendations for use in testing, as well as a discussion of potential applications to vessel health monitoring in future applications.
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
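The kind of lightweight trace-driven simulation described, driven only by base virtual addresses and loop bounds rather than a gathered address trace, might look like the following sketch for the matrix-multiply example. The cache geometry, matrix size, and base addresses are hypothetical, and a direct-mapped cache is assumed for simplicity:

```python
def simulate_direct_mapped(addresses, cache_size=32768, line_size=64):
    """Count misses of an address stream in a direct-mapped cache."""
    n_sets = cache_size // line_size
    tags = [None] * n_sets
    misses = 0
    for addr in addresses:
        line = addr // line_size
        s = line % n_sets
        if tags[s] != line:   # tag mismatch: miss, fill the set
            tags[s] = line
            misses += 1
    return misses

def matmul_trace(n, base_a, base_b, base_c, elem=8):
    """Synthesize the data-address trace of a naive i-j-k matrix multiply
    (row-major arrays) from base addresses and loop bounds alone."""
    for i in range(n):
        for j in range(n):
            for k in range(n):
                yield base_a + (i * n + k) * elem   # A[i][k]
                yield base_b + (k * n + j) * elem   # B[k][j]
            yield base_c + (i * n + j) * elem       # C[i][j]

misses = simulate_direct_mapped(matmul_trace(32, 0x10000, 0x40000, 0x80000))
miss_rate = misses / (2 * 32**3 + 32**2)
```

Because the trace is synthesized on the fly from loop bounds, no instrumented run or trace file is needed, which is the point the abstract makes about avoiding detailed address-trace collection.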
Floating-to-Fixed-Point Conversion for Digital Signal Processors
NASA Astrophysics Data System (ADS)
Menard, Daniel; Chillet, Daniel; Sentieys, Olivier
2006-12-01
Digital signal processing applications are specified with floating-point data types but they are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which establish automatically the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for the floating-to-fixed point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the position of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to the existing methods based on simulation. The methodology stages are described and several experiment results are presented to underline the efficiency of this approach.
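A minimal illustration of the floating-to-fixed-point step, assuming a signed Q1.15 format common on 16-bit DSPs; the round-to-nearest and saturation policy here is one plausible choice, not necessarily the paper's:

```python
def to_fixed(x, frac_bits, word_bits=16):
    """Quantize float x to a signed fixed-point word
    (round-to-nearest, saturate on overflow)."""
    scale = 1 << frac_bits
    q = round(x * scale)
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    return max(lo, min(hi, q))

def to_float(q, frac_bits):
    """Reconstruct the real value represented by fixed-point word q."""
    return q / (1 << frac_bits)

x = 0.7071
q = to_fixed(x, frac_bits=15)        # Q1.15
err = abs(x - to_float(q, 15))       # quantization error, bounded by 2**-16
```

Choosing the fractional bit width per variable, and placing the scaling (shift) operations so they are cheap on the target DSP, is exactly the optimisation space the methodology searches under an accuracy constraint.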
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Formulating accident occurrence as a survival process.
Chang, H L; Jovanis, P P
1990-10-01
A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
Dekant, Wolfgang; Bridges, James
2016-11-01
Quantitative weight of evidence (QWoE) methodology utilizes detailed scoring sheets to assess the quality/reliability of each publication on the toxicity of a chemical and gives numerical scores for quality and observed toxicity. This QWoE methodology was applied to the reproductive toxicity data on diisononylphthalate (DINP), di-n-hexylphthalate (DnHP), and dicyclohexylphthalate (DCHP) to determine if the scientific evidence for adverse effects meets the requirements for classification as reproductive toxicants. The scores for DINP were compared to those obtained when applying the methodology to DCHP and DnHP, which have harmonized classifications. Based on the quality/reliability scores, application of the QWoE shows that the three databases are of similar quality, but the effect scores differ widely. Application of QWoE to DINP studies resulted in an overall score well below the benchmark required to trigger classification. For DCHP, the QWoE also results in low scores. The high scores from the application of the QWoE methodology to the toxicological data for DnHP represent clear evidence for adverse effects and justify a classification of DnHP as category 1B for both development and fertility. The conclusions on classification based on the QWoE are well supported by a narrative assessment of consistency and biological plausibility. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
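A toy aggregation of the QWoE idea: per-study quality and effect scores are combined into a quality-weighted overall score that is compared against a classification benchmark. The score ranges, the weighting scheme, and the benchmark below are invented for illustration and do not reproduce the authors' scoring sheets:

```python
def qwoe_overall(studies, benchmark=1.0):
    """Quality-weighted mean effect score across studies.
    studies: iterable of (quality_score, effect_score) pairs.
    Classification is triggered when the overall score meets the benchmark."""
    wsum = sum(q * e for q, e in studies)
    qsum = sum(q for q, _ in studies)
    overall = wsum / qsum if qsum else 0.0
    return overall, overall >= benchmark

# Hypothetical database: three studies scored on 0-4 quality and 0-4 effect scales
overall, classify = qwoe_overall([(3, 0.5), (4, 0.2), (2, 0.8)], benchmark=1.0)
```

With these illustrative scores the database is of decent quality but the weighted effect evidence falls below the benchmark, so no classification is triggered, mirroring the DINP outcome described above.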
A multicriteria-based methodology for site prioritisation in sediment management.
Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos
2009-08-01
Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, a first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. This proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
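The prioritisation step can be sketched as a simple weighted-sum MCA over normalized criteria scores; the management units, criteria, and weights below are hypothetical, and the paper itself compares several MCA methods rather than committing to this one:

```python
def mca_rank(units, weights):
    """Rank management units by the weighted sum of their criteria scores.
    units: {name: tuple of normalized criteria scores in [0, 1]}."""
    scores = {u: sum(w * c for w, c in zip(weights, crit))
              for u, crit in units.items()}
    return sorted(scores, key=scores.get, reverse=True), scores

# Hypothetical units scored on (sediment-quality risk, social, economic) criteria
units = {
    "U1": (0.9, 0.4, 0.6),
    "U2": (0.5, 0.8, 0.7),
    "U3": (0.2, 0.3, 0.9),
}
ranking, scores = mca_rank(units, weights=(0.5, 0.25, 0.25))
```

The sensitivity analysis the paper describes amounts to re-running `mca_rank` with perturbed weights or criteria scores and checking whether the ranking order is stable.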
Evaluative methodology for prioritizing transportation energy conservation strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pang, L.M.G.
An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah, demonstrated its utility, ease of use and favorability among decision makers.
Production methodologies of polymeric and hydrogel particles for drug delivery applications.
Lima, Ana Catarina; Sher, Praveen; Mano, João F
2012-02-01
Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro and nano scale spherical materials have been developed as carriers for therapies, using appropriated methodologies, in order to achieve a prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluids precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need for improving the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods to produce polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.
Lykins, Amy D; Meana, Marta; Kambe, Gretchen
2006-10-01
As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images while their eye movements were tracked during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from the existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, incomprehensive data and reconstruct vegetation and buildings separately, but concurrently; (iii) recommendations for future generations of software. A case study is presented as an example of the methodology. Experience from the case study, such as the trade-off between time consumption and data quality, is discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, and a typical fraction of solar-useful roofs extracted per method. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest resolution and least comprehensive data to provide solar electric assessments over large areas, which can be adapted anywhere in the world. PMID:22666044
Applying a contemporary grounded theory methodology.
Licqurish, Sharon; Seibold, Carmel
2011-01-01
The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.
Review of Land Use Models: Theory and Application
DOT National Transportation Integrated Search
1997-01-01
This paper discusses methodology in reviewing land use models and identifying desired attributes for recommending a model for application by the Delaware Valley Planning Commission (DVRPC). The need for land-use transportation interaction is explored...
Assessment of Automated Measurement and Verification (M&V) Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Custodio, Claudine
This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
Ninth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1980-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are addressed. Comparisons with other approaches and new methods of analysis with NASTRAN are included.
Statistical Inferences from Formaldehyde Dna-Protein Cross-Link Data
Physiologically-based pharmacokinetic (PBPK) modeling has reached considerable sophistication in its application in the pharmacological and environmental health areas. Yet, mature methodologies for making statistical inferences have not been routinely incorporated in these applic...
Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner
2013-09-01
This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology behind this work is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, having a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnosis system was evaluated by means of this methodology, yielding positive results (statistically significant) when comparing the system with the assessors that participated in the evaluation process of the system through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology used. Copyright © 2013 Elsevier Ltd. All rights reserved.
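The PRAS-M metrics named above reduce to standard confusion-matrix arithmetic; a minimal sketch for one disease class, with illustrative counts rather than data from the evaluated system:

```python
import math

def prasm(tp, fp, tn, fn):
    """Precision, Recall, Accuracy, Specificity and MCC from a 2x2 confusion matrix."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                      # sensitivity
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return precision, recall, accuracy, specificity, mcc

# Hypothetical arbitration outcome for one disease: 100 adjudicated cases
p, r, a, s, m = prasm(tp=40, fp=10, tn=45, fn=5)
```

MCC is the most informative of the five here because it stays meaningful under class imbalance, which is why gains such as the +32.19% MCC reported above are a strong signal.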
Quality of Service in Networks Supporting Cultural Multimedia Applications
ERIC Educational Resources Information Center
Kanellopoulos, Dimitris N.
2011-01-01
Purpose: This paper aims to provide an overview of representative multimedia applications in the cultural heritage sector, as well as research results on quality of service (QoS) mechanisms in internet protocol (IP) networks that support such applications. Design/methodology/approach: The paper's approach is a literature review. Findings: Cultural…
USDA-ARS?s Scientific Manuscript database
As remote sensing and variable rate technology are becoming more available for aerial applicators, practical methodologies on effective integration of these technologies are needed for site-specific aerial applications of crop production and protection materials. The objectives of this study were to...
Methodology for 2012 Application Trends Survey
ERIC Educational Resources Information Center
Graduate Management Admission Council, 2012
2012-01-01
From early June to late July 2012, the Graduate Management Admission Council[R] (GMAC[R]) conducted the "Application Trends Survey", its annual survey of business school admission professionals worldwide to assess how application volume at MBA and other graduate management programs compared with that from the same period in 2011. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
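One way to sketch the relevancy test's three-property comparison (function, failure modes, environment/boundary conditions) is a weighted similarity score. The weights, the Jaccard overlap for failure-mode sets, and the component descriptions below are assumptions for illustration, not the project's actual scoring rules:

```python
def relevancy(candidate, target, weights=(0.4, 0.4, 0.2)):
    """Score 0-1 similarity of a candidate data source's component to the
    PRA component of interest, over (function, failure modes, environment)."""
    f = 1.0 if candidate["function"] == target["function"] else 0.0
    cm, tm = set(candidate["modes"]), set(target["modes"])
    m = len(cm & tm) / len(cm | tm) if cm | tm else 1.0   # Jaccard overlap
    e = 1.0 if candidate["environment"] == target["environment"] else 0.0
    return weights[0] * f + weights[1] * m + weights[2] * e

# Hypothetical target: sodium-fast-reactor intermediate heat exchanger
ihx = {"function": "heat transfer", "modes": {"tube leak", "plugging"},
       "environment": "sodium"}
# Hypothetical candidate from a non-nuclear (water-side) database
surrogate = {"function": "heat transfer", "modes": {"tube leak", "fouling"},
             "environment": "water"}
score = relevancy(surrogate, ihx)
```

A threshold on such a score would decide whether the surrogate's failure data enters the RDB directly, is down-weighted, or is replaced by expert judgment.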
A decision model for planetary missions
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.; Brigadier, W. L.
1976-01-01
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
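The decision model's three ingredients (science value, cost, and risk of failure) can be sketched as a Monte Carlo expected-utility comparison. The exponential utility form, the risk-aversion coefficient, and the two "Encke mission" alternatives below are hypothetical stand-ins for the paper's actual model:

```python
import math
import random

def expected_utility(alternatives, n=20000, risk_aversion=0.002, seed=1):
    """Monte Carlo E[u] with u(net) = 1 - exp(-a*net), net = science - cost.
    Each alternative: (mean science, sd, mean cost, sd, P(mission success))."""
    rng = random.Random(seed)
    out = {}
    for name, (sv, svs, c, cs, p) in alternatives.items():
        total = 0.0
        for _ in range(n):
            science = rng.gauss(sv, svs) if rng.random() < p else 0.0
            net = science - rng.gauss(c, cs)
            total += 1.0 - math.exp(-risk_aversion * net)
        out[name] = total / n
    return out

# Hypothetical alternatives (science value and cost in comparable units)
eu = expected_utility({"flyby": (300, 50, 150, 20, 0.95),
                       "rendezvous": (600, 100, 400, 60, 0.80)})
best = max(eu, key=eu.get)
```

Under these invented numbers the cheaper, more reliable flyby wins despite its lower science return, illustrating how risk aversion penalizes the high-cost, higher-failure-probability option.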
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
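Two-parameter Weibull analysis of fracture strengths is commonly done by linearizing the CDF and regressing on median-rank plotting positions; the following self-contained sketch uses synthetic data whose true modulus and characteristic strength are chosen arbitrarily, not the A357-T6 values from the study:

```python
import math
import random

def weibull_fit(strengths):
    """Two-parameter Weibull fit by linearizing
    F = 1 - exp(-(s/s0)^m)  ->  ln(-ln(1-F)) = m*ln(s) - m*ln(s0),
    using median-rank plotting positions F_i = (i - 0.3)/(n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))          # Weibull modulus (slope)
    s0 = math.exp(xbar - ybar / m)                    # characteristic strength
    return m, s0

# Synthetic strengths (MPa) drawn from a Weibull with s0 = 300, m = 10
rng = random.Random(0)
data = [300.0 * (-math.log(1.0 - rng.random())) ** (1.0 / 10.0)
        for _ in range(200)]
m, s0 = weibull_fit(data)
```

The scatter of `m` across repeated batches of a given size is what drives the study's question of how many coupon specimens are needed for a reliable strength estimate.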
Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.
Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X
2015-01-01
Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.
Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred
2013-01-01
Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
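A stripped-down, one-lag, bivariate version of the Granger test underlying the paper can be written with plain least squares; the full method works with factor models on high-dimensional series, so this sketch only shows the restricted-vs-full regression comparison, on simulated data where x genuinely drives y:

```python
import random

def lstsq(X, y):
    """Least squares via normal equations + Gaussian elimination (tiny systems)."""
    n, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))   # partial pivoting
        A[i], A[p] = A[p], A[i]; b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][c] * beta[c] for c in range(i + 1, k))) / A[i][i]
    return beta

def rss(X, y, beta):
    return sum((yi - sum(bc * xc for bc, xc in zip(beta, xr))) ** 2
               for xr, yi in zip(X, y))

def granger_1lag(x, y):
    """F-type statistic for 'x Granger-causes y' with one lag (no intercept)."""
    Xr = [[y[t - 1]] for t in range(1, len(y))]            # restricted: own past only
    Xf = [[y[t - 1], x[t - 1]] for t in range(1, len(y))]  # full: adds x's past
    yt = y[1:]
    r0, r1 = rss(Xr, yt, lstsq(Xr, yt)), rss(Xf, yt, lstsq(Xf, yt))
    n = len(yt)
    return (r0 - r1) / (r1 / (n - 2))

rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(500)]
y = [0.0]
for t in range(1, 500):
    y.append(0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0, 0.5))
F = granger_1lag(x, y)
```

The numerical trouble the paper addresses arises precisely when many such regressions are run jointly on strongly co-moving channels (e.g. EEG), which is why the authors pass through a factor model first.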
NASA Technical Reports Server (NTRS)
Perry, B., III
1982-01-01
The relationships between elevon deflection and static margin are emphasized, using elements from static and dynamic stability and control and from classical control theory. Expressions are derived and presented for calculating elevon deflections required to trim the vehicle in 1g straight-and-level flight and to perform specified longitudinal and lateral maneuvers. Applications of this methodology are made at several flight conditions for the ARW-2 wing. On the basis of these applications, it appears possible to trim and maneuver the vehicle with the existing elevons at -15% static margin.
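The trim computation reduces to a 2x2 linear system in angle of attack and elevon deflection once pitch stiffness is tied to static margin via Cm_alpha = -SM * CL_alpha; the aerodynamic coefficients below are hypothetical, with SM = -0.15 only echoing the unstable case mentioned above:

```python
def trim(CL_req, CL0, CLa, CLd, Cm0, SM, Cmd):
    """Solve CL = CL_req and Cm = 0 for angle of attack and elevon deflection.
    Cm_alpha = -SM * CL_alpha ties pitch stiffness to static margin (per MAC).
    System: [CLa CLd; Cma Cmd] [alpha, delta] = [CL_req - CL0, -Cm0]."""
    Cma = -SM * CLa
    det = CLa * Cmd - CLd * Cma
    alpha = ((CL_req - CL0) * Cmd - (-Cm0) * CLd) / det
    delta = (CLa * (-Cm0) - Cma * (CL_req - CL0)) / det
    return alpha, delta  # radians

# Hypothetical per-radian coefficients; negative SM means statically unstable
alpha, delta = trim(CL_req=0.4, CL0=0.05, CLa=5.0, CLd=0.4,
                    Cm0=0.02, SM=-0.15, Cmd=-0.6)
```

Substituting the solution back into the lift and moment equations recovers the required CL and zero pitching moment, which is the 1g straight-and-level trim condition.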
Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles
NASA Technical Reports Server (NTRS)
Wieting, A. R.
1979-01-01
The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendation are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne and ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.
Improving ED specimen TAT using Lean Six Sigma.
Sanders, Janet H; Karr, Tedd
2015-01-01
Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study for the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED to lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000 bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) median TAT, a 50 percent decrease in CBCA TAT variation, a 10 percent decrease in Troponin TAT variation, an 18.2 percent decrease in URPN TAT variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time.
This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000 bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study that has demonstrated the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way in which it utilizes the strength of a project-focused approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.
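The headline TAT metrics in the case study above (reductions in median TAT and in TAT variation) are straightforward to compute from before-and-after specimen samples. A minimal sketch, using hypothetical TAT values rather than the study's data:

```python
import statistics

def tat_improvement(before_min, after_min):
    """Compare turnaround-time (TAT) samples before and after an improvement.

    Returns the percent reduction in median TAT and in variation
    (sample standard deviation), the two headline CBCA metrics.
    """
    med_b, med_a = statistics.median(before_min), statistics.median(after_min)
    sd_b, sd_a = statistics.stdev(before_min), statistics.stdev(after_min)
    return {
        "median_reduction_pct": 100 * (med_b - med_a) / med_b,
        "variation_reduction_pct": 100 * (sd_b - sd_a) / sd_b,
    }

# Hypothetical ED specimen TATs in minutes (illustrative, not the study's data)
before = [42, 55, 38, 60, 47, 52, 44, 58]
after = [30, 36, 28, 40, 33, 35, 31, 38]
result = tat_improvement(before, after)
```

Whether standard deviation, interquartile range, or a control-chart statistic is the right "variation" measure depends on the distribution of TATs; the paper does not specify, so standard deviation is assumed here.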
VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.
Little, Todd D; Wang, Eugene W; Gorrall, Britt K
2017-06-01
This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.
Observations of fallibility in applications of modern programming methodologies
NASA Technical Reports Server (NTRS)
Gerhart, S. L.; Yelowitz, L.
1976-01-01
Errors, inconsistencies, or confusing points are noted in a variety of published algorithms, many of which are being used as examples in formulating or teaching principles of such modern programming methodologies as formal specification, systematic construction, and correctness proving. Common properties of these points of contention are abstracted. These properties are then used to pinpoint possible causes of the errors and to formulate general guidelines which might help to avoid further errors. The common characteristic of mathematical rigor and reasoning in these examples is noted, leading to some discussion about fallibility in mathematics, and its relationship to fallibility in these programming methodologies. The overriding goal is to cast a more realistic perspective on the methodologies, particularly with respect to older methodologies, such as testing, and to provide constructive recommendations for their improvement.
Novel Biomaterials Methodology, Development and Application
USDA-ARS?s Scientific Manuscript database
Traditionally the use of carbohydrate-based wound dressings including cotton, xerogels, charcoal cloth, alginates, chitosan and hydrogels, have afforded properties such as absorbency, ease of application and removal, bacterial protection, fluid balance, occlusion, and elasticity. Recent efforts in ...
von Stosch, Moritz; Davy, Steven; Francois, Kjell; Galvanauskas, Vytautas; Hamelink, Jan-Martijn; Luebbert, Andreas; Mayer, Martin; Oliveira, Rui; O'Kennedy, Ronan; Rice, Paul; Glassey, Jarka
2014-06-01
This report highlights the drivers, challenges, and enablers of hybrid modeling applications in the biopharmaceutical industry. It is a summary of an expert panel discussion of European academics and industrialists with relevant scientific and engineering backgrounds. Hybrid modeling is viewed in its broader sense, namely as the integration of different knowledge sources in the form of parametric and nonparametric models into a hybrid semi-parametric model, for instance the integration of fundamental and data-driven models. A brief description of the current state-of-the-art and industrial uptake of the methodology is provided. The report concludes with a number of recommendations to facilitate further developments and a wider industrial application of this modeling approach. These recommendations are limited to further exploiting the benefits of this methodology within process analytical technology (PAT) applications in the biopharmaceutical industry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.
Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo
2017-11-05
Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliability. To solve this problem, we introduce a fully automatic solution to design power consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.
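The per-node side of such a power evaluation often reduces to a duty-cycle energy model: current draw per radio/CPU state multiplied by the time spent in each state. A minimal sketch of that idea, with illustrative (not measured) current figures:

```python
def node_energy_mj(profile, currents_ma, voltage_v=3.0):
    """Energy (mJ) consumed by one sensor node over a reporting period.

    Sums V * I * t over operating states: since V is in volts, I in mA,
    and t in seconds, each term is milliwatt-seconds, i.e. millijoules.

    profile: dict of state -> seconds spent in that state
    currents_ma: dict of state -> current draw in mA (illustrative values)
    """
    return sum(voltage_v * currents_ma[s] * t for s, t in profile.items())

# Illustrative duty cycle for one 60-second reporting period
profile = {"tx": 0.05, "rx": 0.10, "idle": 0.85, "sleep": 59.0}
currents = {"tx": 17.4, "rx": 19.7, "idle": 0.4, "sleep": 0.001}
energy = node_energy_mj(profile, currents)
```

The paper's methodology derives such figures from formal models rather than a hand-written table; this sketch only shows why time-in-state dominates the estimate (sleeping 98% of the period still leaves the brief radio states as the main energy cost).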
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
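The crisp (non-fuzzy) core of the first-phase variable screening can be illustrated with classical Shannon entropy weighting: candidate variables that discriminate more strongly between firms receive higher weights. A sketch with made-up data for four firms; the paper's imprecise/interval extension and acceptability index are not shown:

```python
import math

def entropy_weights(matrix):
    """Shannon entropy weights for candidate DEA variables.

    matrix[i][j]: value of variable j for firm i (crisp stand-in for the
    paper's interval data). For each variable, normalize the column to
    proportions, compute normalized entropy e_j, and weight by the
    divergence degree 1 - e_j: near-uniform columns get weight near zero.
    """
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes entropy to [0, 1]
    degrees = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        degrees.append(1.0 - e)
    s = sum(degrees)
    return [d / s for d in degrees]

# Illustrative data: 4 firms, 3 candidate variables; the middle variable
# varies most in relative terms, so it should dominate the weights.
data = [[20, 5, 300], [22, 9, 310], [19, 2, 305], [21, 14, 295]]
w = entropy_weights(data)
```

This screening rule is a common precursor to a DEA run; the hybrid model in the paper replaces the crisp proportions with imprecise (interval) counterparts.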
Contributions of mobile technologies to addiction research.
Swendsen, Joel
2016-06-01
Mobile technologies are revolutionizing the field of mental health, and particular progress has been made in their application to addiction research and treatment. The use of smartphones and other mobile devices has been shown to be feasible with individuals addicted to any of a wide range of substances, with few biases being observed concerning the repeated monitoring of daily life experiences, craving, or substance use. From a methodological point of view, the use of mobile technologies overcomes longstanding limitations of traditional clinical research protocols, including the more accurate assessment of temporal relationships among variables, as well as the reduction in both contextual constraints and discipline-specific methodological isolation. The present article presents a conceptual review of these advances while using illustrations of research applications that are capable of overcoming specific methodological barriers. Finally, a brief review of both the benefits and risks of mobile technology use for the treatment of patients will be addressed.
Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.
Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan
2014-09-01
The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
Bigger is Better, but at What Cost? Estimating the Economic Value of Incremental Data Assets.
Dalessandro, Brian; Perlich, Claudia; Raeder, Troy
2014-06-01
Many firms depend on third-party vendors to supply data for commercial predictive modeling applications. An issue that has received very little attention in the prior research literature is the estimation of a fair price for purchased data. In this work we present a methodology for estimating the economic value of adding incremental data to predictive modeling applications and present two cases studies. The methodology starts with estimating the effect that incremental data has on model performance in terms of common classification evaluation metrics. This effect is then translated into economic units, which gives an expected economic value that the firm might realize with the acquisition of a particular data asset. With this estimate a firm can then set a data acquisition price that targets a particular return on investment. This article presents the methodology in full detail and illustrates it in the context of two marketing case studies.
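The two-step translation described above, from model lift to economic units and from there to an ROI-targeted price cap, can be sketched as follows. All rates and values below are illustrative assumptions, not figures from the paper's case studies:

```python
def data_asset_value(base_rate, lift, audience, value_per_conv,
                     cost_per_contact=0.0):
    """Expected incremental profit from adding a data asset to a
    targeting model.

    base_rate: conversion rate of the model without the new data
    lift: multiplicative improvement with the data (1.15 = +15%)
    audience: number of targeted individuals
    value_per_conv: profit per conversion
    """
    extra_conversions = audience * base_rate * (lift - 1.0)
    return extra_conversions * value_per_conv - audience * cost_per_contact

def max_price_for_roi(incremental_value, target_roi):
    """Highest data price still achieving the target return on investment,
    from price * (1 + target_roi) <= incremental_value."""
    return incremental_value / (1.0 + target_roi)

# Illustrative campaign: 1M contacts, 2% base conversion, 15% model lift
value = data_asset_value(base_rate=0.02, lift=1.15, audience=1_000_000,
                         value_per_conv=40.0)
price_cap = max_price_for_roi(value, target_roi=0.5)
```

In the paper the lift is estimated empirically from classification metrics on held-out data; the fixed `lift` parameter here simply stands in for that estimate.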
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
Parks, Nathan A.
2013-01-01
The simultaneous application of transcranial magnetic stimulation (TMS) with non-invasive neuroimaging provides a powerful method for investigating functional connectivity in the human brain and the causal relationships between areas in distributed brain networks. TMS has been combined with numerous neuroimaging techniques including electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and positron emission tomography (PET). Recent work has also demonstrated the feasibility and utility of combining TMS with non-invasive near-infrared optical imaging techniques, functional near-infrared spectroscopy (fNIRS) and the event-related optical signal (EROS). Simultaneous TMS and optical imaging affords a number of advantages over other neuroimaging methods but also involves a unique set of methodological challenges and considerations. This paper describes the methodology of concurrently performing optical imaging during the administration of TMS, focusing on experimental design, potential artifacts, and approaches to controlling for these artifacts. PMID:24065911
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lala, J.H.; Nagle, G.A.; Harper, R.E.
1993-05-01
The Maglev control computer system should be designed to verifiably possess high reliability and safety as well as high availability to make Maglev a dependable and attractive transportation alternative to the public. A Maglev control computer system has been designed using a design-for-validation methodology developed earlier under NASA and SDIO sponsorship for real-time aerospace applications. The present study starts by defining the maglev mission scenario and ends with the definition of a maglev control computer architecture. Key intermediate steps included definitions of functional and dependability requirements, synthesis of two candidate architectures, development of qualitative and quantitative evaluation criteria, and analytical modeling of the dependability characteristics of the two architectures. Finally, the applicability of the design-for-validation methodology was also illustrated by applying it to the German Transrapid TR07 maglev control system.
Forecasting the Economic Impact of Future Space Station Operations
NASA Technical Reports Server (NTRS)
Summer, R. A.; Smolensky, S. M.; Muir, A. H.
1967-01-01
Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive, and flexible methodology are discussed. A brief review of the suggested methodology is presented. This methodology will be exercised through five case studies which were chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.
A methodology for selecting optimum organizations for space communities
NASA Technical Reports Server (NTRS)
Ragusa, J. M.
1978-01-01
This paper suggests that a methodology exists for selecting optimum organizations for future space communities of various sizes and purposes. Results of an exploratory study to identify an optimum hypothetical organizational structure for a large earth-orbiting multidisciplinary research and applications (R&A) Space Base manned by a mixed crew of technologists are presented. Since such a facility does not presently exist, in situ empirical testing was not possible. Study activity was, therefore, concerned with the identification of a desired organizational structural model rather than the empirical testing of it. The principal finding of this research was that a four-level project type 'total matrix' model will optimize the effectiveness of Space Base technologists. An overall conclusion which can be reached from the research is that application of this methodology, or portions of it, may provide planning insights for the formal organizations which will be needed during the Space Industrialization Age.
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual implementability of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects that are variations of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via particular related software or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that will exhibit increased actual implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
Experiences of Structured Elicitation for Model-Based Cost-Effectiveness Analyses.
Soares, Marta O; Sharples, Linda; Morton, Alec; Claxton, Karl; Bojke, Laura
2018-06-01
Empirical evidence supporting the cost-effectiveness estimates of particular health care technologies may be limited, or it may even be missing entirely. In these situations, additional information, often in the form of expert judgments, is needed to reach a decision. There are formal methods to quantify experts' beliefs, termed structured expert elicitation (SEE), but only limited research is available in support of methodological choices. Perhaps as a consequence, the use of SEE in the context of cost-effectiveness modelling is limited. This article reviews applications of SEE in cost-effectiveness modelling with the aim of summarizing the basis for methodological choices made in each application and recording the difficulties and challenges reported by the authors in the design, conduct, and analyses. The methods used in each application were extracted along with the criteria used to support methodological and practical choices and any issues or challenges discussed in the text. Issues and challenges were extracted using an open field, and then categorised and grouped for reporting. The review demonstrates considerable heterogeneity in methods used, and authors acknowledge great methodological uncertainty in justifying their choices. Specificities of the context emerging as potentially important in determining further methodological research in elicitation include between-expert variation and its interpretation, the fact that substantive experts in the area may not be trained in quantitative subjects, the fact that judgments are often needed on various parameter types, the need for some form of assessment of validity, and the need for more integration with behavioural research to devise relevant debiasing strategies. This review of experiences of SEE highlights a number of specificities and constraints that can shape the development of guidance and target future research efforts in this area.
Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Applying axiomatic design to a medication distribution system
NASA Astrophysics Data System (ADS)
Raguini, Pepito B.
As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools, and design data. We therefore apply the AD methodology to medical applications with the main objective of allowing nurses the opportunity to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stocked and conveniently located near patients, as well as a mobile apparatus commonly used by hospitals that can also store medications, the medication cart. Moreover, a robust methodology called the focused store methodology will be introduced and developed for both the uncapacitated and capacitated case studies, which will set up an appropriate AD framework and design problem for a medication distribution case study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seimenis, Ioannis; Tsekos, Nikolaos V.; Keroglou, Christoforos
2012-04-15
Purpose: The aim of this work was to develop and test a general methodology for the planning and performance of robot-assisted, MR-guided interventions. This methodology also includes the employment of software tools with appropriately tailored routines to effectively exploit the capabilities of MRI and address the relevant spatial limitations. Methods: The described methodology consists of: (1) patient-customized feasibility study that focuses on the geometric limitations imposed by the gantry, the robotic hardware, and interventional tools, as well as the patient; (2) stereotactic preoperative planning for initial positioning of the manipulator and alignment of its end-effector with a selected target; and (3) real-time, intraoperative tool tracking and monitoring of the actual intervention execution. Testing was performed inside a standard 1.5T MRI scanner in which the MR-compatible manipulator is deployed to provide the required access. Results: A volunteer imaging study demonstrates the application of the feasibility stage. A phantom study on needle targeting is also presented, demonstrating the applicability and effectiveness of the proposed preoperative and intraoperative stages of the methodology. For this purpose, a manually actuated, MR-compatible robotic manipulation system was used to accurately acquire a prescribed target through alternative approaching paths. Conclusions: The methodology presented and experimentally examined allows the effective performance of MR-guided interventions. It is suitable for, but not restricted to, needle-targeting applications assisted by a robotic manipulation system, which can be deployed inside a cylindrical scanner to provide the required access to the patient facilitating real-time guidance and monitoring.
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
ERIC Educational Resources Information Center
Galeeva, Railya B.
2016-01-01
Purpose: The purpose of this study is to demonstrate an adaptation of the SERVQUAL survey method for measuring the quality of higher educational services in a Russian university context. We use a new analysis and a graphical technique for presentation of results. Design/methodology/approach: The methodology of this research follows the classic…
ERIC Educational Resources Information Center
Sun, Shuyan; Pan, Wei
2014-01-01
As applications of multilevel modelling in educational research increase, researchers realize that multilevel data collected in many educational settings are often not purely nested. The most common multilevel non-nested data structure is one that involves student mobility in longitudinal studies. This article provides a methodological review of…
Visual sensitivity of river recreation to power plants
David H. Blau; Michael C. Bowie
1979-01-01
The consultants were asked by the Power Plant Siting Staff of the Minnesota Environmental Quality Council to develop a methodology for evaluating the sensitivity of river-related recreational activities to visual intrusion by large coal-fired power plants. The methodology, which is applicable to any major stream in the state, was developed and tested on a case study...
Working in the Methodological "Outfield": The Case of Bourdieu and Occupational Therapy
ERIC Educational Resources Information Center
Watson, Jo; Grenfell, Michael
2016-01-01
The article reports on a study of methodological innovation involving occupational therapy (OT) students in higher education (HE). It is based on an original project which examined the experiences and outcomes of non-traditional entrants to pre-registration OT education. A feature of the original project was the application of the epistemological…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... Methodology for Boiling Water Reactors, June 2011. To support use of Topical Report ANP-10307PA, Revision 0... the NRC's E-Filing system does not support unlisted software, and the NRC Meta System Help Desk will... Water Reactors with AREVA Topical Report ANP- 10307PA, Revision 0, ``AREVA MCPR Safety Limit Methodology...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
... the large break loss-of-coolant accident (LOCA) analysis methodology with a reference to WCAP-16009-P... required by 10 CFR 50.91(a), the licensee has provided its analysis of the issue of no significant hazards... Section 5.6.5 to incorporate a new large break LOCA analysis methodology. Specifically, the proposed...
Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.
ERIC Educational Resources Information Center
Lazinger, Susan S.; Shoval, Peretz
This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
USDA-ARS?s Scientific Manuscript database
We developed a cost-based methodology to assess the value of forested watersheds to improve water quality in public water supplies. The developed methodology is applicable to other source watersheds to determine ecosystem services for water quality. We assess the value of forest land for source wate...
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A conference, held in Washington, D. C., in 1967 by the Association for Educational Data Systems and the U.S. Office of Education, attempted to lay the groundwork for an efficient automatic data processing training program for the Federal Government utilizing new instructional methodologies. The rapid growth of computer applications and computer…
Long-term land use and land cover change, and the associated impacts, pose critical challenges to sustaining healthy communities and ecosystems. In this study, a methodology was developed to use parcel data to evaluate land use trends in southeast Arizona’s San Pedro River Water...
Development of regional stump-to-mill logging cost estimators
Chris B. LeDoux; John E. Baumgras
1989-01-01
Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...
Application of Real Options Theory to DoD Software Acquisitions
2009-02-20
Future Combat Systems Program. Washington, DC: U.S. Government Printing Office. Damodaran, A. (2007). Investment Valuation: The Options to Expand... valuation methodology, when enhanced and properly formulated around a proposed or existing software investment employing the spiral development approach... ABSTRACT: The traditional real options valuation methodology, when enhanced and properly formulated
ERIC Educational Resources Information Center
Kalathaki, Maria
2015-01-01
The Greek school community emphasizes the discovery-oriented direction of teaching methodology in school Environmental Education (EE) in order to promote Education for Sustainable Development (ESD). In ESD school projects, the methodology used is experiential teamwork for inquiry-based learning. The proposed tool checks whether and how a school…
ERIC Educational Resources Information Center
Vineberg, Robert; Joyner, John N.
Instructional System Development (ISD) methodologies and practices were examined in the Army, Navy, Marine Corps, and Air Force, each of which prescribes the ISD system involving rigorous derivation of training requirements from job requirements, selection of instructional strategies to maximize training efficiency, and revision of instruction…
Treatment of Farm Families under Need Analysis for Student Aid. Final Report.
ERIC Educational Resources Information Center
National Computer Systems, Inc., Arlington, VA.
In response to Congressional request, this report compares the treatment of student financial aid applicants from farm families and non-farm families under two need-analysis formulae. Both the need-analysis methodology for Pell Grants and the Congressional Methodology (CM) for other federal aid calculate ability to pay as a function of income and…
The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.
ERIC Educational Resources Information Center
Greer, John T.
Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…
ERIC Educational Resources Information Center
Bachore, Zelalem
2012-01-01
Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…
ERIC Educational Resources Information Center
Strømskag, Heidi
2017-01-01
This theoretical paper presents a methodology for instructional design in mathematics. It is a theoretical analysis of a proposed model for instructional design, where tasks are embedded in situations that preserve meaning with respect to particular pieces of mathematical knowledge. The model is applicable when there is an intention of teaching…
One common way - The strategic and methodological influence on environmental planning across Europe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiricka, Alexandra, E-mail: alexandra.jiricka@boku.ac.a; Proebstl, Ulrike, E-mail: ulrike.proebstl@boku.ac.a
Over recent decades the European Union has shaped precautionary environmental planning through several Directives, most notably the Habitats Directive, the EIA Directive, the SEA Directive and the Water Framework Directive. Comparing these EU policies in the area of environmental precaution reveals considerable common ground, which suggests that the European Union intended to establish general planning concepts by introducing the methodological steps indicated by the regulations. The goal of this article is, first, to identify the common planning principles embodied in these methodological elements and, second, to examine how these planning concepts are considered in implementation and application in the member states. In this context it is analysed whether the connections and divergences between the directives lead to significant differences in the implementation process. To this end the directives are briefly introduced and the significant steps of the processes they regulate are outlined. In a second step, the national legal implementation in the Alpine states and its consequences for practical application are discussed. The results show a heterogeneous application of the EU principles. The comparative view of the four directives identified influences and causalities between national implementation and practical application, which can be simplified into four types. Since a coherent strategic and methodological concept for improving precautionary environmental planning on the part of the EU is noticeable, more unity and comparability in implementation is desirable, particularly in areas with comparable habitats such as the Alpine space. Beyond this, the trade-offs between the directives pose an important task for the future.
Discrete and continuous dynamics modeling of a mass moving on a flexible structure
NASA Technical Reports Server (NTRS)
Herman, Deborah Ann
1992-01-01
A general discrete methodology for modeling the dynamics of a mass that moves on the surface of a flexible structure is developed. This problem was motivated by the Space Station/Mobile Transporter system. A model reduction approach is developed to make the methodology applicable to large structural systems. To validate the discrete methodology, continuous formulations are also developed. Three different systems are examined: (1) simply-supported beam, (2) free-free beam, and (3) free-free beam with two points of contact between the mass and the flexible beam. In addition to validating the methodology, parametric studies were performed to examine how the system's physical properties affect its dynamics.
Software reuse in spacecraft planning and scheduling systems
NASA Technical Reports Server (NTRS)
Mclean, David; Tuchman, Alan; Broseghini, Todd; Yen, Wen; Page, Brenda; Johnson, Jay; Bogovich, Lynn; Burkhardt, Chris; Mcintyre, James; Klein, Scott
1993-01-01
The use of a software toolkit and development methodology that supports software reuse is described. The toolkit includes source-code-level library modules and stand-alone tools which support such tasks as data reformatting and report generation, simple relational database applications, user interfaces, tactical planning, strategic planning and documentation. The current toolkit is written in C and supports applications that run on IBM-PCs under DOS and UNIX-based workstations under OpenLook and Motif. The toolkit is fully integrated for building scheduling systems that reuse AI knowledge base technology. A typical scheduling scenario and three examples of applications that utilize the reuse toolkit will be briefly described. In addition to the tools themselves, a description of the software evolution and reuse methodology that was used is presented.
Pediatric echocardiography: new developments and applications.
Ge, Shuping
2013-04-01
In this Special Issue of the Journal, 6 review articles that represent the new developments and applications of echocardiography for diagnosis and assessment of congenital heart disease from fetus to adult are included. The goal is to provide an updated review of the evidence for the current and potential use of some of the new methodologies, i.e. fetal echocardiography, tissue Doppler imaging, strain imaging by speckle tracking imaging, ventricular synchrony, quantification using real time three-dimensional (3D) echocardiography, and 3D echocardiography for adults with congenital heart disease. We hope this effort will provide an impetus for more investigation and ultimately clinical application of these new methodologies to improve the care of those with congenital and acquired heart diseases in the pediatric population and beyond. © 2013, Wiley Periodicals, Inc.
Application of Adjoint Methodology in Various Aspects of Sonic Boom Design
NASA Technical Reports Server (NTRS)
Rallabhandi, Sriram K.
2014-01-01
One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.
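The core mechanics behind such adjoint sensitivities can be sketched on a generic discrete problem (the toy system, coefficients, and function names below are illustrative assumptions, not the paper's sonic-boom tools): for a residual R(u, p) = A u - b(p) = 0 and an objective J(u) = c·u, a single adjoint solve Aᵀλ = c yields all the sensitivities dJ/dp_k = λ·(∂b/∂p_k) at once.

```python
# Minimal adjoint-sensitivity sketch on a hypothetical 2x2 toy problem,
# not the paper's sonic-boom code: residual R(u, p) = A u - b(p) = 0
# with objective J(u) = c . u. One adjoint solve A^T lam = c gives the
# gradient dJ/dp_k = lam . db/dp_k for every parameter at once.

def solve2(A, rhs):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(rhs[0] * A[1][1] - rhs[1] * A[0][1]) / det,
            (A[0][0] * rhs[1] - A[1][0] * rhs[0]) / det]

A = [[4.0, 1.0], [1.0, 3.0]]         # state operator (invented numbers)
c = [1.0, 2.0]                       # objective weights: J = c . u

def b(p):                            # forcing depends on the parameters p
    return [p[0] + 2.0 * p[1], 3.0 * p[1]]

# Adjoint solve: A^T lam = c
At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
lam = solve2(At, c)

# Sensitivities dJ/dp_k = lam . db/dp_k (db/dp is constant here:
# column 1 is [1, 0], column 2 is [2, 3])
dJdp = [lam[0] * 1.0, lam[0] * 2.0 + lam[1] * 3.0]

# Cross-check against forward finite differences
def J(p):
    u = solve2(A, b(p))
    return c[0] * u[0] + c[1] * u[1]

eps, p0 = 1e-6, [1.0, 1.0]
fd = [(J([p0[0] + eps, p0[1]]) - J(p0)) / eps,
      (J([p0[0], p0[1] + eps]) - J(p0)) / eps]
```

The appeal mirrors the abstract's point: one adjoint solve replaces one forward solve per parameter, which pays off when there are many design parameters but few cost functionals.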
Close Combat Missile Methodology Study
2010-10-14
Modeling: Industrial Applications of DEX." Informatica 23 (1999): 487-491. Bohanec, Marko, Blaz Zupan, and Vladislav Rajkovic. "Applications of...Lisec. "Multi-attribute Decision Analysis in GIS: Weighted Linear Combination and Ordered Weighted Averaging." Informatica 33 (1999): 459-474
Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations
Palmiotti, Giuseppe; Salvatores, Massimo
2012-01-01
Sensitivity methodologies have a remarkable track record in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.
Self-Organisation and Capacity Building: Sustaining the Change
ERIC Educational Resources Information Center
Bain, Alan; Walker, Allan; Chan, Anissa
2011-01-01
Purpose: The paper aims to describe the application of theoretical principles derived from a study of self-organisation and complex systems theory and their application to school-based capacity building to support planned change. Design/methodology/approach: The paper employs a case example in a Hong Kong School to illustrate the application of…
Chemical Contaminant and Decontaminant Test Methodology Source Document. Second Edition
2012-07-01
performance as described in "A Statistical Overview on Univariate Calibration, Inverse Regression, and Detection Limits: Application to Gas Chromatography/Mass Spectrometry Technique." ... APPLICATIONS INTERNATIONAL CORPORATION, Gunpowder, MD 21010-0068, July 2012. Approved for public release; distribution is unlimited
On Raviart-Thomas and VMS formulations for flow in heterogeneous materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Daniel Zack
It is well known that the continuous Galerkin method (in its standard form) is not locally conservative, yet many stabilized methods are constructed by augmenting the standard Galerkin weak form. In particular, the Variational Multiscale (VMS) method has achieved popularity for combating numerical instabilities that arise for mixed formulations that do not otherwise satisfy the LBB condition. Among alternative methods that satisfy local and global conservation, many employ Raviart-Thomas function spaces. The lowest-order Raviart-Thomas finite element formulation (RT0) consists of evaluating fluxes over the midpoint of element edges and constant pressures within the element. Although the RT0 element poses many advantages, it has only been shown viable for triangular or tetrahedral elements (quadrilateral variants of this method do not pass the patch test). In the context of heterogeneous materials, both of these methods have been used to model the mixed form of the Darcy equation. This work aims, in a comparative fashion, to evaluate the strengths and weaknesses of either approach for modeling Darcy flow for problems with highly varying material permeabilities and predominantly open flow boundary conditions. Such problems include carbon sequestration and enhanced oil recovery simulations, for which the far-field boundary is typically described with some type of pressure boundary condition. We intend to show the degree to which the VMS formulation violates local mass conservation for these types of problems and compare the performance of the VMS and RT0 methods at boundaries between disparate permeabilities.
Effect of Combined Loading Due to Bending and Internal Pressure on Pipe Flaw Evaluation Criteria
NASA Astrophysics Data System (ADS)
Miura, Naoki; Sakai, Shinsuke
Considering a rule for the rationalization of maintenance of Light Water Reactor piping, reliable flaw evaluation criteria are essential for determining how a detected flaw will be detrimental to continuous plant operation. Ductile fracture is one of the dominant failure modes that must be considered for carbon steel piping and can be analyzed by elastic-plastic fracture mechanics. Some analytical efforts have provided various flaw evaluation criteria using load correction factors, such as the Z-factors in the JSME Codes on Fitness-for-Service for Nuclear Power Plants and Section XI of the ASME Boiler and Pressure Vessel Code. The present Z-factors were conventionally determined, taking conservativity and simplicity into account; however, the effect of internal pressure, which is an important factor under actual plant conditions, was not adequately considered. Recently, a J-estimation scheme, LBB.ENGC, for the ductile fracture analysis of circumferentially through-wall-cracked pipes subjected to combined loading was developed for more accurate prediction under more realistic conditions. This method explicitly incorporates the contributions of both bending and tension due to internal pressure by means of a scheme that is compatible with an arbitrary combined-loading history. In this study, the effect of internal pressure on the flaw evaluation criteria was investigated using the new J-estimation scheme. The Z-factor obtained in this study was compared with the presently used Z-factors, and the predictability of the current flaw evaluation criteria was quantitatively evaluated in consideration of the internal pressure.
PIA and REWIND: Two New Methodologies for Cross Section Adjustment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmiotti, G.; Salvatores, M.
2017-02-01
This paper presents two new cross section adjustment methodologies intended to cope with the problem of compensations. The first, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy on which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications for different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.
Fourteenth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1986-01-01
The proceedings of a colloquium are presented along with technical papers contributed during the conference. Reviewed are general applications of finite element methodology and the specific application of the NASA Structural Analysis System, NASTRAN, to a variety of static and dynamic structural problems.
A self-describing data transfer methodology for ITS applications : executive summary
DOT National Transportation Integrated Search
2000-12-01
A wide variety of remote sensors used in Intelligent Transportation Systems (ITS) applications (loops, probe vehicles, radar, cameras) has created a need for general methods by which data can be shared among agencies and users who own disparate computer ...
A self-describing data transfer methodology for ITS applications
DOT National Transportation Integrated Search
1999-01-01
The wide variety of remote sensors used in Intelligent Transportation Systems (ITS) applications (loops, probe vehicles, radar, cameras, etc.) has created a need for general methods by which data can be shared among agencies and users who own dis...
MICROBIOLOGICAL RISK ASSESSMENT FOR LAND APPLICATION OF MUNICIPAL SLUDGE
Each major option for the disposal/reuse of municipal sludges poses potential risks to human health or the environment because of the microbial contaminants in sludge. Therefore, risk assessment methodology appropriate for pathogen risk evaluation for land application and distrib...
Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth
2017-11-28
The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive) or (2) examined the utility of a supplementary search method (analytical) or (3) identified/explored factors that impact on the utility of a supplementary method, when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods.
Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.
Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A
2010-01-01
Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.
Design and Customization of Telemedicine Systems
Martínez-Alcalá, Claudia I.; Muñoz, Mirna; Monguet-Fierro, Josep
2013-01-01
In recent years, the advances in information and communication technology (ICT) have resulted in the development of systems and applications aimed at supporting rehabilitation therapy that contributes to enrich patients' life quality. This work is focused on the improvement of the telemedicine systems with the purpose of customizing therapies according to the profile and disability of patients. For doing this, as salient contribution, this work proposes the adoption of user-centered design (UCD) methodology for the design and development of telemedicine systems in order to support the rehabilitation of patients with neurological disorders. Finally, some applications of the UCD methodology in the telemedicine field are presented as a proof of concept. PMID:23762191
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik
1991-01-01
A methodology for optimizing organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. The semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application for efficient data storage is addressed, as is the mapping of the application structure to the mass storage.
Introduction to Computational Methods for Stability and Control (COMSAC)
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Fremaux, C. Michael; Chambers, Joseph R.
2004-01-01
This Symposium is intended to bring together the often distinct cultures of the Stability and Control (S&C) community and the Computational Fluid Dynamics (CFD) community. The COMSAC program is itself a new effort by NASA Langley to accelerate the application of high-end CFD methodologies to the demanding job of predicting stability and control characteristics of aircraft. This talk is intended to explain why a program like COMSAC is needed, not to give details of the program itself. The topics include: 1) S&C challenges; 2) Aero prediction methodology; 3) CFD applications; 4) NASA COMSAC planning; 5) Objectives of the symposium; and 6) Closing remarks.
The role of empirical Bayes methodology as a leading principle in modern medical statistics.
van Houwelingen, Hans C
2014-11-01
This paper reviews and discusses the role of Empirical Bayes methodology in medical statistics in the last 50 years. It gives some background on the origin of the empirical Bayes approach and its link with the famous Stein estimator. The paper describes the application in four important areas in medical statistics: disease mapping, health care monitoring, meta-analysis, and multiple testing. It ends with a warning that the application of the outcome of an empirical Bayes analysis to the individual "subjects" is a delicate matter that should be handled with prudence and care. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
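The link to the Stein estimator noted in the abstract can be illustrated in a few lines (a toy sketch with invented numbers, not any analysis from the paper): given several observed unit-level means with known sampling variance, the empirical Bayes rule shrinks each observation toward the grand mean by a weight estimated from the data themselves.

```python
# Illustrative empirical Bayes (positive-part James-Stein) shrinkage
# toward the grand mean; a toy sketch, not an analysis from the paper.
# Each x[i] is a noisy estimate with known sampling variance s2; the
# shrinkage weight is estimated from the spread of the x[i] themselves.

def js_shrink(x, s2):
    """Shrink observations x (known sampling variance s2) toward their mean."""
    k = len(x)
    xbar = sum(x) / k
    ssq = sum((xi - xbar) ** 2 for xi in x)
    # Empirical Bayes estimate of the shrinkage weight, clipped at zero
    b = max(0.0, 1.0 - (k - 3) * s2 / ssq)
    return [xbar + b * (xi - xbar) for xi in x]

raw = [2.0, 5.0, 9.0, 12.0]          # hypothetical unit-level estimates
shrunk = js_shrink(raw, s2=4.0)      # each value pulled toward the mean, 7.0
```

The weight b approaches 1 (little shrinkage) when the between-unit spread dwarfs the sampling noise, and approaches 0 (pool everything) when it does not; this data-driven weight is the "empirical" part of empirical Bayes.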
An Application of Six Sigma to Reduce Supplier Quality Cost
NASA Astrophysics Data System (ADS)
Gaikwad, Lokpriya Mohanrao; Teli, Shivagond Nagappa; Majali, Vijay Shashikant; Bhushi, Umesh Mahadevappa
2016-01-01
This article presents an application of Six Sigma to reduce supplier quality cost in the manufacturing industry. Although there is wider acceptance of Six Sigma in many organizations today, there is still a lack of in-depth case studies of Six Sigma; the present research therefore used the case study methodology. The company decided to reduce quality cost and improve selected processes using Six Sigma methodologies. Given the scarcity of case studies dealing with Six Sigma in individual manufacturing organizations, this article should also be of value to practitioners. The paper discusses quality and productivity improvement in a supplier enterprise through a case study. It deals with an application of the Six Sigma define-measure-analyze-improve-control methodology, which provides a framework to identify, quantify and eliminate sources of variation in the operational process in question, to optimize the operation variables, and to improve and sustain performance, viz. process yield, with well-executed control plans. Six Sigma improves the process performance (process yield) of the critical operational process, leading to better utilization of resources, decreased variation and consistent quality of the process output.
Bao, Yihai; Main, Joseph A; Noh, Sam-Young
2017-08-01
A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
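The proposed robustness metric can be stated compactly (the capacities and loads below are invented for illustration, not data from the study): for each column-removal scenario, normalize the ultimate capacity under sudden column loss by the applicable service-level gravity load, then take the minimum of this ratio over all scenarios, so the worst scenario governs.

```python
# Sketch of the normalized-capacity robustness metric described above.
# All capacities and service loads here are invented for illustration.

def robustness(ultimate_capacities, service_loads):
    """Min over column-loss scenarios of (ultimate capacity / service load)."""
    return min(cap / load for cap, load in zip(ultimate_capacities, service_loads))

# Hypothetical sudden-column-loss capacities and service-level gravity
# loads, one entry per column-removal scenario (units: kN)
caps = [1850.0, 2100.0, 1620.0]
loads = [1200.0, 1250.0, 1180.0]

R = robustness(caps, loads)   # governed by the worst (third) scenario
```

A value of R above 1.0 indicates that even the governing scenario retains capacity beyond the service-level demand, which is the sense in which the SMF building in the abstract shows "greater robustness" than the IMF building.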
Toward Green Acylation of (Hetero)arenes: Palladium-Catalyzed Carbonylation of Olefins to Ketones
2017-01-01
Green Friedel–Crafts acylation reactions belong to the most desired transformations in organic chemistry. The resulting ketones constitute important intermediates, building blocks, and functional molecules in organic synthesis as well as for the chemical industry. Over the past 60 years, advances in this topic have focused on how to make this reaction more economically and environmentally friendly by using green acylating conditions, such as stoichiometric acylations and catalytic homogeneous and heterogeneous acylations. However, currently well-established methodologies for their synthesis either produce significant amounts of waste or proceed under harsh conditions, limiting applications. Here, we present a new protocol for the straightforward and selective introduction of acyl groups into (hetero)arenes without directing groups by using available olefins with inexpensive CO. In the presence of commercial palladium catalysts, inter- and intramolecular carbonylative C–H functionalizations take place with good regio- and chemoselectivity. Compared to classical Friedel–Crafts chemistry, this novel methodology proceeds under mild reaction conditions. The general applicability of this methodology is demonstrated by the direct carbonylation of industrial feedstocks (ethylene and diisobutene) as well as of natural products (eugenol and safrole). Furthermore, synthetic applications to drug molecules are showcased. PMID:29392174
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application to health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
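The reported Sigma score of 2.10 is consistent with the customary conversion from process yield to a sigma level (a sketch assuming the conventional 1.5-sigma long-term shift; the paper does not spell out which conversion it used):

```python
from statistics import NormalDist

# Convert a process yield (fraction of defect-free outcomes) to a
# short-term sigma level, assuming the customary 1.5-sigma shift.
# This is an assumed conversion convention, not taken from the paper.

def sigma_level(yield_fraction, shift=1.5):
    return NormalDist().inv_cdf(yield_fraction) + shift

before = sigma_level(0.54)   # 54% sphincter preservation, earlier technique
after = sigma_level(0.73)    # 73% with the double stapling technique (DST)
```

With these inputs the DST yield of 73% maps to a sigma level of about 2.1, matching the score quoted in the abstract, while the earlier 54% yield sits near 1.6.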
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the Value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.
Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria
2013-01-01
The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor-active compounds. The GC-O technique is already widely used for the evaluation of food aromas, and its application in environmental fields is increasing, thus moving odor emission assessment from solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding the operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potentials of GC-O are described, highlighting the improvements in this methodology relative to other conventional approaches used for odor detection, such as sensor-based, sensorial, and traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities, and odorous compounds emitted by materials and medical applications. PMID:24316571
Yusuf, Afiqah; Elsabbagh, Mayada
2015-12-15
Identifying biomarkers for autism can improve outcomes for those affected by autism. Engaging the diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles coupled with their methodological limitations suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.
A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.
Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego
2017-09-22
Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.
ERIC Educational Resources Information Center
Gaven, Patricia; Williams, R. David
An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…
Christopher A. Lupoli; Wayde C. Morse; Conner Bailey; John Schelhas
2015-01-01
Two prominent critiques of volunteer tourism are that it is a top-down imposed form of development treating host communities as passive recipients of international aid, and that the impacts of volunteer tourism in host communities are not systematically evaluated. To address this we identified a pre-existing participatory methodology for assessing community...
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the Proceedings contains the following 20 papers: "Information Sufficiency and Risk Communication" (Robert J. Griffin, Kurt Neuwirth, and Sharon Dunwoody); "The Therapeutic Application of Television: An Experimental Study" (Charles Kingsley); "A Path Model Examining the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...
Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments
2015-09-30
statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model...to begin planning experiments for statistical inference applications. APPROACH In the ocean acoustics community over the past two decades...solutions for waveguide parameters. With the introduction of statistical inference to the field of ocean acoustics came the desire to interpret marginal
ERIC Educational Resources Information Center
Renard, Colette; And Others
Principles of the "St. Cloud" audiovisual language instruction methodology based on "Le Francais fondamental" are presented in this guide for teachers. The material concentrates on course content, methodology, and application--including criteria for selection and gradation of course content, a description of the audiovisual and written language…
2009-06-01
Immanuel Kant, Perpetual Peace, trans. Mary Campbell Smith (New York: Cosimo, Inc., 2005), 10. Democratic Peace theory derives from Kant's...for the JADSC, the thesis takes a methodological approach through theory, history, application, and analysis. First, prevailing theories of
Epel, Boris; Sundramoorthy, Subramanian V.; Barth, Eugene D.; Mailer, Colin; Halpern, Howard J.
2011-01-01
Purpose: The authors compare two electron paramagnetic resonance imaging modalities at 250 MHz to determine advantages and disadvantages of those modalities for in vivo oxygen imaging. Methods: Electron spin echo (ESE) and continuous wave (CW) methodologies were used to obtain three-dimensional images of a narrow linewidth, water soluble, nontoxic oxygen-sensitive trityl molecule OX063 in vitro and in vivo. The authors also examined sequential images obtained from the same animal injected intravenously with trityl spin probe to determine temporal stability of methodologies. Results: A study of phantoms with different oxygen concentrations revealed a threefold advantage of the ESE methodology in terms of reduced imaging time and more precise oxygen resolution for samples with less than 70 torr oxygen partial pressure. Above ~100 torr, CW performed better. The images produced by both methodologies showed pO2 distributions with similar mean values. However, ESE images demonstrated superior performance in low pO2 regions while missing voxels in high pO2 regions. Conclusions: ESE and CW have different areas of applicability. ESE is superior for hypoxia studies in tumors. PMID:21626937
Sanchez-Vazquez, Manuel J; Nielen, Mirjam; Edwards, Sandra A; Gunn, George J; Lewis, Fraser I
2012-08-31
Abattoir-detected pathologies are of crucial importance to both pig production and food safety. Usually, more than one pathology coexists in a pig herd, although it often remains unknown how these different pathologies interrelate. Identification of the associations between different pathologies may facilitate an improved understanding of their underlying biological linkage, and support veterinarians in encouraging control strategies aimed at reducing the prevalence of not just one, but two or more conditions simultaneously. Multi-dimensional machine learning methodology was used to identify associations between ten typical pathologies in 6485 batches of slaughtered finishing pigs, to aid comprehension of their biological associations. Pathologies potentially associated with septicaemia (e.g. pericarditis, peritonitis) appear interrelated, suggesting on-going bacterial challenges by pathogens such as Haemophilus parasuis and Streptococcus suis. Furthermore, hepatic scarring appears interrelated with both milk spot livers (Ascaris suum) and bacteria-related pathologies, suggesting a potential multi-pathogen nature for this pathology. The application of novel multi-dimensional machine learning methodology provided new insights into how typical pig pathologies are potentially interrelated at batch level. The methodology presented is a powerful exploratory tool to generate hypotheses, applicable to a wide range of studies in veterinary research.
Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen
2015-01-01
The current key challenge in the floating offshore wind turbine industry and research is designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. Preliminary platform design and early experimental design assessments are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870
A primer on systematic reviews in toxicology.
Hoffmann, Sebastian; de Vries, Rob B M; Stephens, Martin L; Beck, Nancy B; Dirven, Hubert A A M; Fowle, John R; Goodman, Julie E; Hartung, Thomas; Kimber, Ian; Lalu, Manoj M; Thayer, Kristina; Whaley, Paul; Wikoff, Daniele; Tsaioun, Katya
2017-07-01
Systematic reviews, pioneered in the clinical field, provide a transparent, methodologically rigorous and reproducible means of summarizing the available evidence on a precisely framed research question. Having matured to a well-established approach in many research fields, systematic reviews are receiving increasing attention as a potential tool for answering toxicological questions. In the larger framework of evidence-based toxicology, the advantages and obstacles of, as well as the approaches for, adapting and adopting systematic reviews to toxicology are still being explored. To provide the toxicology community with a starting point for conducting or understanding systematic reviews, we herein summarized available guidance documents from various fields of application. We have elaborated on the systematic review process by breaking it down into ten steps, starting with planning the project, framing the question, and writing and publishing the protocol, and concluding with interpretation and reporting. In addition, we have identified the specific methodological challenges of toxicological questions and have summarized how these can be addressed. Ultimately, this primer is intended to stimulate scientific discussions of the identified issues to fuel the development of toxicology-specific methodology and to encourage the application of systematic review methodology to toxicological issues.
Meta-Study as Diagnostic: Toward Content Over Form in Qualitative Synthesis.
Frost, Julia; Garside, Ruth; Cooper, Chris; Britten, Nicky
2016-02-01
Having previously conducted qualitative syntheses of the diabetes literature, we wanted to explore the changes in theoretical approaches, methodological practices, and the construction of substantive knowledge which have recently been presented in the qualitative diabetes literature. The aim of this research was to explore the feasibility of synthesizing existing qualitative syntheses of patient perspectives of diabetes using meta-study methodology. A systematic review of qualitative literature, published between 2000 and 2013, was conducted. Six articles were identified as qualitative syntheses. The meta-study methodology was used to compare the theoretical, methodological, analytic, and synthetic processes across the six studies, exploring the potential for an overarching synthesis. We identified that while research questions have increasingly concentrated on specific aspects of diabetes, the focus on systematic review processes has led to the neglect of qualitative theory and methods. This can inhibit the production of compelling results with meaningful clinical applications. Although unable to produce a synthesis of syntheses, we recommend that researchers who conduct qualitative syntheses pay equal attention to qualitative traditions and systematic review processes, to produce research products that are both credible and applicable. © The Author(s) 2015.
Vanegas, Fernando; Weiss, John; Gonzalez, Felipe
2018-01-01
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote-sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
On the importance of methods in hydrological modelling. Perspectives from a case study
NASA Astrophysics Data System (ADS)
Fenicia, Fabrizio; Kavetski, Dmitri
2017-04-01
The hydrological community generally appreciates that developing any non-trivial hydrological model requires a multitude of modelling choices. These choices may range from a (seemingly) straightforward application of mass conservation, to the (often) guesswork-like selection of constitutive functions, parameter values, etc. The application of a model itself requires a myriad of methodological choices: the selection of numerical solvers, objective functions for model calibration, validation approaches, performance metrics, etc. Not unreasonably, hydrologists embarking on ever more ambitious projects prioritize hydrological insight over the morass of methodological choices. Perhaps to emphasize "ideas" over "methods", some journals have even reduced the font size of the methodology sections of their articles. However, the very nature of modelling is that seemingly routine methodological choices can significantly affect the conclusions of case studies and investigations, making it dangerous to gloss over methodological details in an enthusiastic rush towards the next great hydrological idea. This talk shares modelling insights from a hydrological study of a 300 km2 catchment in Luxembourg, where the diversity of hydrograph dynamics observed at 10 locations begs the question of whether external forcings or internal catchment properties act as dominant controls on streamflow generation. The hydrological insights are fascinating (at least to us), but in this talk we emphasize the impact of modelling methodology on case study conclusions and recommendations. How did we construct our prior set of hydrological model hypotheses? What numerical solver was implemented and why was an objective function based on Bayesian theory deployed? And what would have happened had we omitted model cross-validation, or not used a systematic hypothesis testing approach?
Evaluating a collaborative IT based research and development project.
Khan, Zaheer; Ludlow, David; Caceres, Santiago
2013-10-01
In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application-specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project - The HUMBOLDT project - with a project duration of 54 months, involving contributions from 27 partner organisations, plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains not only resulted in an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.
Application of control theory to dynamic systems simulation
NASA Technical Reports Server (NTRS)
Auslander, D. M.; Spear, R. C.; Young, G. E.
1982-01-01
Control theory is applied to dynamic systems simulation. Theory and methodology applicable to controlled ecological life support systems are considered. Spatial effects on system stability, design of control systems with uncertain parameters, and an interactive computing language (PARASOL-II) designed for dynamic system simulation, report quality graphics, data acquisition, and simple real time control are discussed.
3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications
ERIC Educational Resources Information Center
Pesaresi, Cristano; Van Der Schee, Joop; Pavia, Davide
2017-01-01
The project "3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications" has been devised with the intention to deal with the demand for research, innovation and applicative methodology on the part of the international programme, requiring concrete results to increase the capacity to know, anticipate…
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse Gas (GHG) performances have been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.
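The structure of such a simplified model can be illustrated as follows: a fixed embodied-emissions total spread over lifetime electricity production, so that the two key parameters identified (load factor and lifetime) drive the result. The function name and all numbers below are illustrative assumptions, not the paper's fitted curves:

```python
def ghg_intensity(capacity_kw, load_factor, lifetime_yr, embodied_kg_co2eq):
    """Life-cycle GHG intensity in g CO2-eq per kWh.

    Spreads a fixed embodied-emissions total over lifetime electricity
    production; load factor and lifetime dominate the result.
    """
    kwh_produced = capacity_kw * load_factor * 8760 * lifetime_yr  # 8760 h/yr
    return embodied_kg_co2eq * 1000.0 / kwh_produced  # kg -> g

# Hypothetical 2 MW onshore turbine, 25% load factor, 20-year lifetime,
# 1500 t CO2-eq embodied over the whole life cycle (illustrative figures):
print(round(ghg_intensity(2000, 0.25, 20, 1.5e6), 1), "g CO2-eq/kWh")
```

Plotting this function over a grid of load factors and lifetimes reproduces the kind of performance curves the abstract describes.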
Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.
1999-01-01
A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
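The translation of a J solution into a life prediction can be sketched with a Paris-type power law in ΔJ_eff integrated over crack length. This is a common form for elastic-plastic fatigue crack growth, not the NASGRO modules' actual algorithm; the constants and function names are illustrative:

```python
def cycles_to_failure(a0, a_crit, delta_j_eff, C, m, n_steps=1000):
    """Euler-integrate a Paris-type law da/dN = C * (dJ_eff(a))**m
    from initial crack size a0 to critical size a_crit.

    delta_j_eff: callable returning the closure-corrected effective
    J-integral range at crack length a (from a geometry-specific
    J solution); C, m: material fitting constants.
    """
    da = (a_crit - a0) / n_steps
    a, cycles = a0, 0.0
    for _ in range(n_steps):
        cycles += da / (C * delta_j_eff(a) ** m)  # dN = da / (da/dN)
        a += da
    return cycles
```

In a real analysis, `delta_j_eff` would embed the crack opening level and load solution the abstract describes, and integration would stop early if a crack instability condition were reached.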
Q methodology in health economics.
Baker, Rachel; Thompson, Carl; Mannion, Russell
2006-01-01
The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.
A hierarchical clustering methodology for the estimation of toxicity.
Martin, Todd M; Harten, Paul; Venkatapathy, Raghuraman; Das, Shashikala; Young, Douglas M
2008-01-01
A quantitative structure-activity relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural similarity is defined in terms of 2-D physicochemical descriptors (such as connectivity and E-state indices). A genetic algorithm-based technique is used to generate statistically valid QSAR models for each cluster (using the pool of descriptors described above). The toxicity for a given query compound is estimated using the weighted average of the predictions from the closest cluster from each step in the hierarchical clustering, assuming that the compound is within the domain of applicability of the cluster. The hierarchical clustering methodology was tested using a Tetrahymena pyriformis acute toxicity data set containing 644 chemicals in the training set and with two prediction sets containing 339 and 110 chemicals. The results from the hierarchical clustering methodology were compared to the results from several different QSAR methodologies.
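The final weighted-average prediction step can be sketched as below. Inverse-distance weighting is an illustrative assumption (the abstract does not specify the exact weighting scheme), and levels whose closest cluster fails the applicability-domain check are assumed to be omitted from the inputs:

```python
def predict_toxicity(level_predictions, level_distances):
    """Combine the closest-cluster prediction from each step of the
    hierarchy into a single toxicity estimate.

    level_predictions: estimate from that level's closest applicable
    cluster; level_distances: distance of the query compound to that
    cluster in descriptor space. Inverse-distance weighting is an
    illustrative choice, not the paper's documented scheme.
    """
    weights = [1.0 / (d + 1e-9) for d in level_distances]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, level_predictions)) / total
```

For example, two equidistant levels simply average their predictions, while a much closer cluster dominates the estimate.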
The Technology Acceptance of Mobile Applications in Education
ERIC Educational Resources Information Center
Camilleri, Mark Anthony; Camilleri, Adriana Caterina
2017-01-01
This research explores the educators' attitudes and behavioural intention toward mobile applications. The methodology integrates measures from "the pace of technological innovativeness" and the "technology acceptance model" to understand the rationale for further investment in mobile learning (m-learning). A quantitative study…
COMPREHENSIVE SUMMARY OF SLUDGE DISPOSAL RECYCLING HISTORY
Since 1971 the only mode of sludge disposal used by Denver District No. 1 has been land application. A number of different application procedures have been tried over the intervening years. The development of methodology and problems associated with each procedure are discussed i...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Applicability. 301.1 Section 301.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY FOR...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Applicability. 301.1 Section 301.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY FOR...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Applicability. 301.1 Section 301.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY FOR...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Applicability. 301.1 Section 301.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY FOR...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Applicability. 301.1 Section 301.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS FOR FEDERAL POWER MARKETING ADMINISTRATIONS AVERAGE SYSTEM COST METHODOLOGY FOR...
Corbett, Andrea M; Francis, Karen; Chapman, Ysanne
2007-04-01
Identifying a methodology to guide a study that aims to enhance service delivery can be challenging. Participatory action research offers a solution to this challenge as it both informs and is informed by critical social theory. In addition, using a feminist lens helps establish this approach as a suitable methodology for changing practice. This methodology embraces empowerment, self-determination, and the facilitation of agreed change as central tenets that guide the research process. Encouraged by the work of Foucault, Freire, Habermas, and Maguire, this paper explicates the philosophical assumptions underpinning critical social theory and outlines how feminist influences are complementary in exploring the processes and applications of nursing research that seeks to embrace change.
Methodological Variability Using Electronic Nose Technology For Headspace Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knobloch, Henri; Turner, Claire; Spooner, Andrew
Since the idea of the electronic nose was first published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about the methodological pitfalls that may be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.
BATSE gamma-ray burst line search. 2: Bayesian consistency methodology
NASA Technical Reports Server (NTRS)
Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.
1994-01-01
We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
pysimm: A Python Package for Simulation of Molecular Systems
NASA Astrophysics Data System (ADS)
Fortunato, Michael; Colina, Coray
pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and the design philosophies that highlight a generalized methodology for incorporating third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to a general-purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains capable of controlling chain morphology, such as molecular weight distribution and monomer composition.
Varas, Lautaro R; Pontes, F C; Santos, A C F; Coutinho, L H; de Souza, G G B
2015-09-15
The ion-ion-coincidence mass spectrometry technique provides useful information about the fragmentation dynamics of doubly and multiply charged ionic species. We advocate the use of a matrix-parameter methodology in order to represent and interpret the entire ion-ion spectra associated with the ionic dissociation of doubly charged molecules. This method makes it possible, among other things, to infer fragmentation processes and to extract information about overlapped ion-ion coincidences. This important piece of information is difficult to obtain from other previously described methodologies. A Wiley-McLaren time-of-flight mass spectrometer was used to discriminate the positively charged fragment ions resulting from sample ionization by a pulsed 800 eV electron beam. We exemplify the application of this methodology by analyzing the fragmentation and ionic dissociation of the dimethyl disulfide (DMDS) molecule as induced by fast electrons. The doubly charged dissociation was analyzed using the multivariate normal distribution. The ion-ion spectrum of the DMDS molecule was obtained at an incident electron energy of 800 eV and was represented in matrix form using multivariate distribution theory. The proposed methodology allows us to distinguish information among [CHnSHn]+/[CH3]+ (n = 1-3) fragment ions in the ion-ion coincidence spectra using ion-ion coincidence data. Using the momenta-balance methodology for the inferred parameters, a secondary decay mechanism is proposed for the [CHS]+ ion formation. As an additional check on the methodology, previously published data on the SiF4 molecule were re-analyzed with the present methodology and the results were shown to be statistically equivalent. The use of a multivariate normal distribution allows for the representation of the whole ion-ion mass spectrum of doubly or multiply ionized molecules as a combination of parameters and the extraction of information among overlapped data.
We have successfully applied this methodology to the analysis of the fragmentation of the DMDS molecule. Copyright © 2015 John Wiley & Sons, Ltd.
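The matrix-parameter idea, summarizing each coincidence island in the (t1, t2) map by the parameters of a bivariate normal, can be illustrated with a toy calculation. The synthetic flight times, the shared-momentum model, and the near −1 slope are illustrative assumptions for an idealized two-body breakup, not the paper's data.

```python
import random

random.seed(1)

# synthetic (t1, t2) arrival-time pairs for one fragment-pair channel:
# a shared momentum kick dp advances one fragment and delays the other
pairs = []
for _ in range(500):
    dp = random.gauss(0, 5)
    pairs.append((1000 + dp + random.gauss(0, 1),
                  2000 - dp + random.gauss(0, 1)))

# bivariate-normal parameters of the coincidence island
n = len(pairs)
m1 = sum(t1 for t1, _ in pairs) / n
m2 = sum(t2 for _, t2 in pairs) / n
v1 = sum((t1 - m1) ** 2 for t1, _ in pairs) / n
cov = sum((t1 - m1) * (t2 - m2) for t1, t2 in pairs) / n

slope = cov / v1  # island slope; ≈ -1 here, reflecting anticorrelated times
print(round(slope, 2))
```

In the matrix representation the abstract advocates, a full spectrum of overlapping islands would be modeled as a sum of such components, one parameter set per fragment pair.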
Application of low-cost methodologies for mobile phone app development.
Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon
2014-12-09
The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the apps. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrate the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps.
This is one of the few studies that have demonstrated low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained demonstrate that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit for other specialties and disciplines.
Application of Low-Cost Methodologies for Mobile Phone App Development
Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon
2014-01-01
Background The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. Objective The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users’ self-rated perceptions of the apps. Methods In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the “Mastering Psychiatry” app for undergraduates and the “Déjà vu” app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the apps. Results For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrate the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps.
Conclusions This is one of the few studies that have demonstrated low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained demonstrate that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit for other specialties and disciplines. PMID:25491323
2013-01-01
Background Community-based health care planning and regulation necessitate grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state’s Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905
Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P
2013-08-22
Community-based health care planning and regulation necessitate grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.
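The K-means half of the "2-step K-means + Ward's" algorithm can be sketched in plain Python with Lloyd's iterations. The six point "hospitals", the naive seeding, and k = 2 are invented for illustration; the actual methodology clusters on patient-utilization and geographic features.

```python
def kmeans(points, k, iters=50):
    centroids = list(points[:k])  # naive seeding, purely for illustration
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid's bucket
        buckets = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            buckets[nearest].append(p)
        # update step: move each centroid to the mean of its bucket
        centroids = [tuple(sum(p[d] for p in b) / len(b)
                           for d in range(len(points[0]))) if b else centroids[i]
                     for i, b in enumerate(buckets)]
    return buckets

# two well-separated groups of hypothetical "hospitals"
hospitals = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
groups = kmeans(hospitals, 2)
print(sorted(len(g) for g in groups))  # → [3, 3]
```

In the reported 2-step scheme, a hierarchical (Ward's) pass would then refine or select among candidate groupings before the final cluster count is chosen.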
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences with using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also includes a description of possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software, and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Novel optoelectronic methodology for testing of MOEMS
NASA Astrophysics Data System (ADS)
Pryputniewicz, Ryszard J.; Furlong, Cosme
2003-01-01
Continued demands for delivery of high-performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on the methods used in their development and operation. Metrology is a major and inseparable part of these methods. Optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing MOEMS, where measurements must be made with ever increasing accuracy and precision. This has been particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements rapidly increased as emerging technologies introduced new products, especially optical MEMS. In this paper, a novel optoelectronic methodology for testing MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate the capability to measure submicron deformations of various components of a micromirror device under operating conditions, and show the viability of the optoelectronic methodology for testing MOEMS.
Integration of infrared thermography into various maintenance methodologies
NASA Astrophysics Data System (ADS)
Morgan, William T.
1993-04-01
Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid progress in technical advancements has placed an additional strain on maintenance organizations to progressively change. Accompanying needs for advanced training and documentation is the demand for utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. Popular thought is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and the infrared applications integrated into each will be presented in tabular form.
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis, a positive gain was obtained for all of these series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
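The state-space reconstruction underlying such methodologies is a delay embedding of the scalar price series. The sketch below shows the embedding step only; the embedding dimension, delay, and the sinusoidal toy series are illustrative assumptions, not the authors' settings.

```python
import math

def delay_embed(series, dim, tau):
    """Reconstruct state vectors (x[t], x[t-tau], ..., x[t-(dim-1)*tau])."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

prices = [math.sin(0.3 * t) for t in range(100)]  # stand-in for an FX tick series
vectors = delay_embed(prices, dim=3, tau=2)

print(len(vectors))     # → 96  (100 - (3 - 1) * 2)
print(len(vectors[0]))  # → 3
```

A trading rule in this spirit would locate the nearest neighbours of the current reconstructed state and use their observed successors as a short-horizon forecast.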
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was originally developed to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method is used for this demonstration, but the concepts apply to any numerical method. The results agree with those of a detailed analysis and indicate the usefulness of a probabilistic approximate analysis in determining efficient solution strategies.
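The core idea, propagating input randomness through a cheap closed-form response model instead of a full finite element solve, can be illustrated with a toy Monte Carlo study. The cantilever-frequency formula, the parameter distributions, and the sample count below are illustrative assumptions, not the paper's turbopump model.

```python
import math
import random

random.seed(0)

def first_frequency(E, rho, L=0.1, t=0.002):
    # closed-form first bending frequency (Hz) of a uniform rectangular
    # cantilever, used here as the cheap surrogate for a full FE eigensolve
    return (1.875 ** 2 / (2 * math.pi * L ** 2)) * math.sqrt(E * t ** 2 / (12 * rho))

# propagate randomness in material properties through the surrogate
samples = [first_frequency(random.gauss(200e9, 10e9),   # Young's modulus (Pa)
                           random.gauss(7800, 200))     # density (kg/m^3)
           for _ in range(5000)]

mean = sum(samples) / len(samples)
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
print(f"first frequency: {mean:.0f} Hz (sd {sd:.0f} Hz)")
```

The output is a distribution of the eigenvalue-related response rather than a single deterministic number, which is exactly what the approximate probabilistic methodology aims to deliver cheaply.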
Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations
Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.
2013-01-01
Purpose To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized the methods used in empirical applications overall and by publication year (<2000, ≥2000). Results Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived the DRS using logistic regression (47%), used the DRS as a categorical variable in regression (93%), and applied the DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692
Disease risk score as a confounder summary method: systematic review and recommendations.
Tadrous, Mina; Gagne, Joshua J; Stürmer, Til; Cadarette, Suzanne M
2013-02-01
To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized the methods used in empirical applications overall and by publication year (<2000, ≥2000). Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived the DRS using logistic regression (47%), used the DRS as a categorical variable in regression (93%), and applied the DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. Copyright © 2012 John Wiley & Sons, Ltd.
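The most common approach the review reports, deriving the DRS from a logistic regression of the outcome on baseline covariates, can be sketched with synthetic data. The cohort, the two covariates, and the plain gradient-ascent fit below are illustrative assumptions, not any reviewed study's data or software.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# synthetic cohort: two baseline covariates; outcome risk rises with both
cohort = []
for _ in range(500):
    x1, x2 = random.random(), random.random()
    y = 1 if random.random() < sigmoid(-2 + 2 * x1 + 2 * x2) else 0
    cohort.append((x1, x2, y))

# fit the outcome model by plain gradient ascent on the log-likelihood
w = [0.0, 0.0, 0.0]  # intercept, coefficient on x1, coefficient on x2
for _ in range(500):
    grad = [0.0, 0.0, 0.0]
    for x1, x2, y in cohort:
        resid = y - sigmoid(w[0] + w[1] * x1 + w[2] * x2)
        for i, xi in enumerate((1.0, x1, x2)):
            grad[i] += resid * xi
    w = [wi + 0.1 * gi / len(cohort) for wi, gi in zip(w, grad)]

# each subject's DRS is the fitted outcome probability
drs = [sigmoid(w[0] + w[1] * x1 + w[2] * x2) for x1, x2, _ in cohort]
print(len(drs))  # → 500
```

In the applications the review tabulates, the continuous DRS would then typically be categorized (e.g. into quantiles) and entered as a covariate when estimating the exposure effect.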
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with comprehensive natural resource planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented, and the data requirements for each of the models are noted.
Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Buege, L. L.
1983-09-01
Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
Patients' perspective of the design of provider-patients electronic communication services.
Silhavy, Petr; Silhavy, Radek; Prokopova, Zdenka
2014-06-12
Information delivery is one of the most important tasks in healthcare practice. This article discusses patients' tasks and perspectives, which are then used to design a new effective electronic methodology. The system design methods applicable to electronic communication in the healthcare sector are also described. The architecture and the methodology for the healthcare service portal are set out in the proposed system design.