Kumar, Sri; Enz, Bruce; Ponder, Perry L; Anderson, Bob
2009-01-01
Traffic safety has improved significantly over the past several decades, reducing injury and fatality rates. However, there is a paucity of research addressing the safety issues in underride accidents, specifically side underride crashes. It is well known that compromise of the occupant space in the vehicle leads to a higher probability of serious or fatal injuries. A better understanding of occupant protection and the mechanisms of injury involved in side underride accidents assists in the advancement of safety measures. The present work evaluates the injury potential to occupants during side underride crashes using a car-to-trailer crash methodology. Four crash tests were conducted into the side of a stationary trailer fitted with the side underride guard system (SURG). The SURG used in these tests is 25% lighter than the previous design. A 5th-percentile Hybrid III female dummy was placed in the driver seat and restrained with the three-point lap and shoulder harness. The anthropometric dummy was instrumented with a head triaxial accelerometer, a chest triaxial accelerometer, a load cell to measure neck force and moment, and a load cell to measure femur force. Vehicle acceleration was measured using a triaxial accelerometer in the rear center tunnel. High-speed video, standard video, and still photos were taken. In all tests, intrusion was limited to the front structure of the vehicle without any significant compromise of the occupant space. Results indicate that the resultant head and chest accelerations, head injury criterion (HIC), neck force and moment, and femur force were well below the injury tolerance limits. The present findings support the hypothesis that the SURG not only limits or eliminates intrusion into the occupant space but also results in biomechanical injury values well below the tolerance limits in motor vehicle crashes.
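Since this abstract reports HIC alongside peak accelerations, a minimal sketch of how HIC is conventionally computed from a resultant head-acceleration trace may help readers interpret the metric; the half-sine pulse below is synthetic and illustrative, not data from these tests.

```python
import numpy as np

def hic(t, a, max_window=0.015):
    """Head Injury Criterion from resultant head acceleration a(t) in g.

    HIC = max over windows [t1, t2] (t2 - t1 <= max_window) of
          (t2 - t1) * (mean acceleration over the window) ** 2.5
    """
    # cumulative trapezoidal integral so window averages are cheap to evaluate
    cum = np.concatenate(([0.0], np.cumsum(0.5 * (a[1:] + a[:-1]) * np.diff(t))))
    best = 0.0
    n = len(t)
    for i in range(n - 1):
        for j in range(i + 1, n):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            avg = (cum[j] - cum[i]) / dt
            best = max(best, dt * avg ** 2.5)
    return best

# synthetic half-sine pulse: 40 g peak, 50 ms duration (illustrative values only)
t = np.linspace(0.0, 0.05, 501)
a = 40.0 * np.sin(np.pi * t / 0.05)
print(f"HIC15 ~ {hic(t, a):.1f}")   # well below the 700 reference limit used for HIC15
```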
Evaluation of US rear underride guard regulation for large trucks using real-world crashes.
Brumbelow, Matthew L; Blanar, Laura
2010-11-01
Current requirements for rear underride guards on large trucks are set by the National Highway Traffic Safety Administration in Federal Motor Vehicle Safety Standards (FMVSS) 223 and 224. The standards have been in place since 1998, but their adequacy has not been evaluated apart from two series of controlled crash tests. The current study used detailed reviews of real-world crashes from the Large Truck Crash Causation Study to assess the ability of guards that comply with certain aspects of the regulation to mitigate passenger vehicle underride. It also evaluated the dangers posed by underride of large trucks that are exempt from guard requirements. For the 115 cases meeting the inclusion criteria, coded data, case narratives, photographs, and measurements were used to examine the interaction between study vehicles. The presence and type of underride guard was determined, and its performance in mitigating underride was categorized. Overall, almost one-half of the passenger vehicles had underride damage classified as severe or catastrophic. These vehicles accounted for 23 of the 28 in which occupants were killed. For the cases involving trailers with underride guards compliant with one or both FMVSS, guard deformation or complete failure was frequent and most commonly due to weak attachments, buckling of the trailer chassis, or bending of the lateral end of the guard under narrow overlap loading. Most of the truck units studied qualified for at least one of the FMVSS exemptions. The two largest groups were trailers with small wheel setbacks and single-unit straight trucks. Dump trucks represented a particularly hazardous category of straight truck. The current study suggests several weaknesses in the rear underride guard regulation. The standard allows too much ground clearance, the quasi-static test conditions allow guard designs that fail in narrow overlap crashes, and certifying guards independent of trailers leads to systems with inadequate attachment and chassis strength. Additionally, the regulation should be expanded to cover a higher percentage of the large truck fleet.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
... Report, on the Effectiveness of Underride Guards for Heavy Trailers AGENCY: National Highway Traffic...: The Effectiveness of Underride Guards for Heavy Trailers. DATES: Comments must be received no later...
Vachal, Kimberly; Tumuhairwe, Esther K; Berwick, Mark
2009-04-01
The North Dakota Legislature recently passed a law exempting the state's agricultural truck fleet from a federal safety program requirement for rear-guard equipment on large trucks. This equipment has been shown to reduce crash severity when a passenger vehicle collides with the rear of the truck. This study uses truck fleet, truck crash, and injury severity data to estimate the public safety benefit derived from passenger-vehicle underride protection during rear-end crashes involving large agricultural trucks in North Dakota. A benefit-cost analysis of crash injury avoidance is developed based on the frequency and severity of rear-end truck collisions in North Dakota between 2001 and 2007. The injury avoidance benefits and commercial vehicle safety grant benefits are estimated to be $11.4 to $20.2 million during the seven-year depreciable truck life. The public safety benefits for rear-impact guards are higher than the estimated lifetime cost for the equipment and maintenance of $8.1 million.
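Using only the figures quoted in the abstract, the implied benefit-cost ratio can be checked in a few lines; this is a sketch of the arithmetic, not the study's full model.

```python
# Illustrative benefit-cost check using the ranges quoted in the abstract
# (values in millions of USD over the seven-year depreciable truck life).
benefit_low, benefit_high = 11.4, 20.2
lifetime_cost = 8.1

bcr_low = benefit_low / lifetime_cost    # ~1.41
bcr_high = benefit_high / lifetime_cost  # ~2.49
print(f"benefit-cost ratio: {bcr_low:.2f} to {bcr_high:.2f}")
```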
Effects of changes in effective rail height on barrier performance. Volume 1, Research report
DOT National Transportation Integrated Search
1987-04-01
The objective of this project was to determine the critical rail mounting heights to prevent underride and override for traffic barriers. W-beam guardrails, which are the most commonly specified barrier in the U. S., were used to develop criteria for...
Safety performance evaluation of cable median barriers on freeways in Florida.
Alluri, Priyanka; Haleem, Kirolos; Gan, Albert; Mauthner, John
2016-07-03
This article aims to evaluate the safety performance of cable median barriers on freeways in Florida. The safety performance evaluation was based on the percentages of barrier and median crossovers by vehicle type, crash severity, and cable median barrier type (Trinity Cable Safety System [CASS] and Gibraltar system). Twenty-three locations with cable median barriers totaling about 101 miles were identified. Police reports of 6,524 crashes from years 2005-2010 at these locations were reviewed to verify and obtain detailed crash information. A total of 549 crashes were determined to be barrier related (i.e., crashes involving vehicles hitting the cable median barrier) and were reviewed in further detail to identify crossover crashes and the manner in which the vehicles crossed the barriers; that is, by either overriding, underriding, or penetrating the barriers. Overall, 2.6% of vehicles that hit the cable median barrier crossed the median and traversed into the opposite travel lane. Overall, 98.1% of cars and 95.5% of light trucks that hit the barrier were prevented from crossing the median. In other words, 1.9% of cars and 4.5% of light trucks that hit the barrier had crossed the median and encroached on the opposite travel lanes. There is no significant difference in the performance of cable median barrier for cars versus light trucks in terms of crossover crashes. In terms of severity, overrides were more severe compared to underrides and penetrations. The statistics showed that the CASS and Gibraltar systems performed similarly in terms of crossover crashes. However, the Gibraltar system experienced a higher proportion of penetrations compared to the CASS system. The CASS system resulted in a slightly higher percentage of moderate and minor injury crashes compared to the Gibraltar system. Cable median barriers are successful in preventing median crossover crashes; 97.4% of the cable median barrier crashes were prevented from crossing over the median. Of all of the vehicles that hit the barrier, 83.6% were either redirected or contained by the cable barrier system. Barrier crossover crashes were found to be more severe compared to barrier noncrossover crashes. In addition, overrides were found to be more severe compared to underrides and penetrations.
Rail height effects on safety performance of Midwest Guardrail System.
Asadollahi Pajouh, Mojdeh; Julin, Ramen D; Stolle, Cody S; Reid, John D; Faller, Ronald K
2018-02-17
Guardrail heights play a crucial role in the way that errant vehicles interact with roadside barriers. Low rail heights increase the propensity of vehicle rollover and override, whereas excessively tall rails promote underride. Further, rail mounting heights and post embedment depths may be altered by variations in roadside terrain. An increased guardrail height may be desirable to accommodate construction tolerances, soil erosion, frost heave, and future roadway overlays. This study aimed to investigate and identify a maximum safe installation height for the Midwest Guardrail System (MGS) that would be robust and remain crashworthy before and after pavement overlays. A research investigation was performed to evaluate the safety performance of increased mounting heights for the standard 787-mm (31-in.)-tall MGS through crash testing and computer simulation. Two full-scale crash tests with small passenger cars were performed on the MGS with top-rail mounting heights of 864 and 914 mm (34 and 36 in.). In the first test, a small car impacted the MGS with an 864-mm (34-in.) rail height at 102 km/h (63.6 mph) and 25.0° and was successfully redirected. In the second test, another small car impacted the MGS with a 914-mm (36-in.) rail height at 103 km/h (64.1 mph) and 25.6° and was also successfully redirected. Both system heights satisfied the Manual for Assessing Safety Hardware (MASH) Test Level 3 (TL-3) evaluation criteria. Test results were then used to calibrate computer simulation models. Simulations confirmed that taller guardrail heights (i.e., 37 in.) would likely result in small car underride. In addition, simulation results indicated that passenger vehicle models were successfully contained by the 34- and 36-in.-tall MGS installed on approach slopes as steep as 6:1. A mounting height of 914 mm (36 in.) was determined to be the maximum guardrail height that would safely contain and redirect 1100C small car vehicles without allowing underride or excessive vehicle snag on support posts. Recommendations were also provided regarding the safety performance of the MGS with increased height.
NASA Astrophysics Data System (ADS)
Kaye, Andrew J.; Cho, Jaehyun; Basu, Nandita B.; Chen, Xiaosong; Annable, Michael D.; Jawitz, James W.
2008-11-01
This study investigated the benefits of partial removal of dense nonaqueous phase liquid (DNAPL) source zones using enhanced dissolution in eight laboratory scale experiments. The benefits were assessed by characterizing the relationship between reductions in DNAPL mass and the corresponding reduction in contaminant mass flux. Four flushing agents were evaluated in eight controlled laboratory experiments to examine the effects of displacement fluid property contrasts and associated override and underride on contaminant flux reduction (R_j) vs. mass reduction (R_m) relationships (R_j(R_m)): 1) 50% ethanol/50% water (less dense than water), 2) 40% ethyl-lactate/60% water (more dense than water), 3) 18% ethanol/26% ethyl-lactate/56% water (neutrally buoyant), and 4) 2% Tween-80 surfactant (also neutrally buoyant). For each DNAPL architecture evaluated, replicate experiments were conducted where source zone dissolution was conducted with a single flushing event to remove most of the DNAPL from the system, and with multiple shorter-duration floods to determine the path of the R_j(R_m) relationship. All of the single-flushing experiments exhibited similar R_j(R_m) relationships, indicating that override and underride effects associated with cosolvents did not significantly affect the remediation performance of the agents. The R_j(R_m) relationship of the multiple injection experiments for the cosolvents with a density contrast with water tended to be less desirable in the sense that there was less R_j for a given R_m. UTCHEM simulations supported the observations from the laboratory experiments and demonstrated the capability of this model to predict R_j(R_m) relationships for non-uniformly distributed NAPL sources.
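For readers unfamiliar with the R_j(R_m) formalism, a minimal sketch of how the two reduction fractions are computed from before/after flux and mass measurements follows; the multi-flood numbers are hypothetical, not the paper's data.

```python
import numpy as np

def flux_and_mass_reduction(mass0, mass_removed, flux0, flux_after):
    """R_m and R_j as fractional reductions from pre-flushing baselines."""
    R_m = mass_removed / mass0
    R_j = 1.0 - flux_after / flux0
    return R_m, R_j

# hypothetical multi-flood sequence: cumulative DNAPL mass removed per flood
# and the contaminant flux measured after each flood (not the paper's data)
mass0, flux0 = 100.0, 50.0
removed = np.cumsum([20.0, 25.0, 30.0, 15.0])
flux = np.array([42.0, 30.0, 12.0, 3.0])
for r, f in zip(removed, flux):
    m, j = flux_and_mass_reduction(mass0, r, flux0, f)
    print(f"R_m = {m:.2f} -> R_j = {j:.2f}")
```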
Potential benefits of underride guards in large truck side crashes.
Brumbelow, Matthew L
2012-01-01
To evaluate the maximum potential for side underride guards (SUGs) to reduce passenger vehicle occupant fatalities and injuries in crashes with large trucks in the United States. Examination of the Large Truck Crash Causation Study (LTCCS) identified 206 crash events involving a passenger vehicle impact with the side of a large truck. Each case was evaluated to determine whether the most severe injury sustained by a passenger vehicle occupant was a result of the impact with the side of the truck and whether an SUG could have reduced the injury severity. Data from the 2006-2008 Fatality Analysis Reporting System (FARS) and Trucks Involved in Fatal Accidents (TIFA) survey were used to compare the types of trucks involved in all fatal side impacts with passenger vehicles with the truck types in the LTCCS cases that were studied. FARS and TIFA data also were used to estimate the total annual number of passenger vehicle occupants killed in truck side impacts. In 143 of the 206 cases, the truck side impact produced the most severe injury sustained by a passenger vehicle occupant. In the other cases, no passenger vehicle occupant was injured or the most severe injury was due to an event preceding or following the truck side impact. Forty-nine of these occupants sustained injuries coded as level 3 or higher on the abbreviated injury scale (AIS) or were killed. SUGs could have reduced injury severity in 76 of the 143 cases, including 38 of the 49 cases with an AIS ≥ 3 coded injury or fatality. Semi-trailers were the most common type of impacted truck unit, both overall and when considering only cases where an SUG could have mitigated injury severity. Crashes where the front of the passenger vehicle struck the side of the semi-trailer perpendicularly or obliquely from the oncoming direction were less common overall than side-to-side and oblique/same direction crashes but more often produced an AIS ≥ 3 injury or fatality. The distribution of truck types in the LTCCS sample was similar to that in the FARS and TIFA data. Overall, around 1600 passenger vehicle occupants were killed in 2-vehicle truck side impact crashes during 2006-2008, or 22 percent of all passenger vehicle occupants who died in 2-vehicle crashes with large trucks. Structural incompatibility was a common factor in LTCCS crashes between passenger vehicles and the sides of large trucks. SUGs could have reduced injury risk in around three fourths of the crashes that produced an AIS ≥ 3 injury or fatality. Most of these crashes involved semi-trailers. However, the necessary strength and location of these SUGs present technical challenges that need to be addressed.
Investigation of Vehicle Rear Under Run Protection Device (RUPD) Using Aluminium Foam
NASA Astrophysics Data System (ADS)
Nagaraj Goud, B.; Pachori, Avinash
2017-08-01
When a passenger car collides with the rear of a heavy-duty truck, it tends to penetrate under the truck bed in what is called a truck-trailer underride crash. Such crashes account for thousands of accidents causing severe injuries and on-the-spot deaths, mostly due to the lack of an effective guarding system. The present paper focuses on the energy-absorption mechanism of a Rear Underrun Protection Device (RUPD) under crash loading. The aim of the study is to replace a steel RUPD with aluminum foam, which promises improved vehicle crashworthiness as well as reduced vehicle weight. Aluminum foam is selected for its high specific strength and specific stiffness; this inherent character makes it a promising candidate for modern lightweight structures in automotive engineering, which can contribute to improved mileage in addition to occupant safety.
NASA Astrophysics Data System (ADS)
Toganel, George-Radu; Ovidiu Soica, Adrian
2017-10-01
The presented study was directed at two types of airbag mis-deployments: late deployment and non-deployment. Late deployment can be a product of override or underride road traffic accidents. Non-deployment can be a product of technical failure or of the trigger algorithm's inability to correctly assess the state of the impending accident. In order to analyse the phenomena through physical tests, a specialized test device was used for a series of 8 non-deployment tests and a series of 4 airbag firing tests, totalling 12 tests. Acceleration-based data were recorded and analysed for the movement of the device part simulating the driver head. High-speed video recording was used to analyse the mechanics of airbag deployment and to correlate it with the acceleration-based data. Within the limitations of the laboratory testing environment, a significant variation in the timing of airbag deployment was determined, despite similar testing conditions and identical tested products. Also, the initial airbag deployment delay was overshadowed by other factors such as time to impact.
NASA Astrophysics Data System (ADS)
Juhlin, Christopher; Anderson, Mark; Dopson, Mark; Lorenz, Henning; Pascal, Christophe; Piazolo, Sandra; Roberts, Nick; Rosberg, Jan-Erik; Tsang, Chin-Fu
2016-04-01
The Collisional Orogeny in the Scandinavian Caledonides (COSC) scientific drilling project employs two fully cored boreholes for investigating mountain building processes at mid-crustal levels in a deeply eroded Paleozoic collisional orogen of Alpine-Himalayan size. The two COSC boreholes will provide a unique c. 5 km deep composite section from a hot allochthon through the underlying 'colder' nappes, the main décollement and into the basement of the collisional underriding plate. COSC's unprecedented wealth of geophysical field and borehole data combined with the petrology, geochronology and rock physics information obtained from the drill cores will develop into an integrated model for a major collisional mountain belt. This can be utilized as an analogue to better understand similar modern tectonic settings (Himalaya, Izu-Bonin-Mariana, amongst others) and, thus, advance our understanding of such complex systems and how they affect the (human) environment. COSC investigations and drilling activities are focused in the Åre-Mörsil area (Sweden) of central Scandinavia. The first drill hole, COSC-1, was completed in late August 2014 with near 100% core recovery down to 2.5 km. It targeted the high-grade metamorphic Seve Nappe Complex (SNC) and its contact with the underlying allochthon, investigating how this metasedimentary unit, that was initially deeply subducted during orogeny, was exhumed and then, still hot, emplaced as an allochthon onto the foreland of the underriding plate. COSC-2 will investigate the main Caledonian décollement, which is the major detachment that separates the Caledonian allochthons from the autochthonous basement of the Fennoscandian Shield, and the character of the deformation in the basement. Combined seismic, magnetotelluric (MT) and magnetic data provide control on the basement structure and the depth to the main décollement, believed to be hosted in the carbon-rich highly conductive Alum Shale. Key targets are to understand the geometry, stress distribution and rheology of the main décollement and associated fault systems in the foreland of one of the Earth's largest orogens, and to determine the relationship between the basement deformation and the thrust tectonics in the nappes above. COSC-2 will provide insights into the evolution of Baltica near the Ordovician-Silurian boundary by providing a new, distal section from the Early Paleozoic sedimentary basin. High-quality, high-resolution temperature profiles will allow the reconstruction of the ground surface temperature history and its variations for up to 100000 years and gather new knowledge about the Weichselian glaciation and climate evolution in northern Europe during the Holocene, including industrial age trends. Furthermore, research will address the hydrogeological and geothermic characteristics of the mountain belt and investigate the geological energy sources utilized by the deep biosphere. The drilling program and on-site science will build on the experience from drilling COSC-1. Applications for drilling related costs have been made to ICDP and the Swedish Research Council and if funded, drilling can be performed in 2017 at the earliest. Researchers interested in any aspect of the COSC project are invited to join and provide parallel funding for drilling, on-site science, and studies on core and downhole geophysics.
NASA Astrophysics Data System (ADS)
Kusky, Timothy M.; Bradley, Dwight C.
1999-12-01
Permian to Cretaceous mélange of the McHugh Complex on the Kenai Peninsula, south-central Alaska includes blocks and belts of graywacke, argillite, limestone, chert, basalt, gabbro, and ultramafic rocks, intruded by a variety of igneous rocks. An oceanic plate stratigraphy is repeated hundreds of times across the map area, but most structures at the outcrop scale extend lithological layering. Strong rheological units occur as blocks within a matrix that flowed around the competent blocks during deformation, forming broken formation and mélange. Deformation was noncoaxial, and disruption of primary layering was a consequence of general strain driven by plate convergence in a relatively narrow zone between the overriding accretionary wedge and the downgoing, generally thinly sedimented oceanic plate. Soft-sediment deformation processes do not appear to have played a major role in the formation of the mélange. A model for deformation at the toe of the wedge is proposed in which layers oriented at low angles to σ1 are contracted in both the brittle and ductile regimes, layers at 30-45° to σ1 are extended in the brittle regime and contracted in the ductile regime, and layers at angles greater than 45° to σ1 are extended in both the brittle and ductile regimes. Imbrication in thrust duplexes occurs at deeper levels within the wedge. Many structures within mélange of the McHugh Complex are asymmetric and record kinematic information consistent with the inferred structural setting in an accretionary wedge. A displacement field for the McHugh Complex on the lower Kenai Peninsula includes three belts: an inboard belt of Late Triassic rocks records west-to-east-directed slip of hanging walls, a central belt of predominantly Early Jurassic rocks records north-south directed displacements, and Early Cretaceous rocks in an outboard belt preserve southwest-northeast directed slip vectors. Although precise ages of accretion are unknown, slip directions are compatible with inferred plate motions during the general time frame of accretion of the McHugh Complex. The slip vectors are interpreted to preserve the convergence directions between the overriding and underriding plates, which became more oblique with time. They are not considered indicative of strain partitioning into belts of orogen-parallel and orogen-perpendicular displacements, because the kinematic data are derived from the earliest preserved structures, whereas fabrics related to strain partitioning would be expected to be superimposed on earlier accretion-related fabrics.
77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
ERIC Educational Resources Information Center
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-01-01
This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical analysis technique and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…
Fractography of ceramic and metal failures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1984-01-01
STP 827 is organized into the two broad areas of ceramics and metals. The ceramics section covers fracture analysis techniques, surface analysis techniques, and applied fractography. The metals section covers failure analysis techniques, the latest approaches to fractography, and applied fractography.
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
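As a concrete illustration of the time-domain statistical classification in terms of moments mentioned above, the sketch below computes common waveform statistics on synthetic vibration records; the impact-train fault model is an assumption for demonstration, not an SSME measurement.

```python
import numpy as np

def waveform_moments(x):
    """Time-domain statistics commonly used for incipient-failure screening."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    rms = np.sqrt(np.mean(x ** 2))
    std = x.std()
    skew = np.mean(((x - mean) / std) ** 3)
    kurt = np.mean(((x - mean) / std) ** 4)   # ~3 for Gaussian noise
    crest = np.max(np.abs(x)) / rms           # spikes raise the crest factor
    return {"mean": mean, "rms": rms, "skewness": skew,
            "kurtosis": kurt, "crest_factor": crest}

rng = np.random.default_rng(0)
healthy = rng.normal(0, 1, 8192)
faulty = healthy.copy()
faulty[::512] += 8.0   # periodic impacts, e.g., a bearing defect
print("healthy kurtosis:", round(waveform_moments(healthy)["kurtosis"], 2))
print("faulty kurtosis: ", round(waveform_moments(faulty)["kurtosis"], 2))
```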
Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke
2013-07-01
This paper summarizes recent developments in rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assays, and portable gas chromatographs. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references for establishing selective, precise, and quantitative rapid detection methods in food security analysis.
Mulware, Stephen Juma
2015-01-01
The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed. The details of sample preparation, detection, and data collection and analysis are described. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
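The report compares regression with parametric methods; as a hedged illustration of the general idea, the sketch below fits a least-squares regression of log stress on the Larson-Miller parameter (one classical parametric form), using hypothetical data and the conventional constant C = 20.

```python
import numpy as np

# Hypothetical creep-rupture data (temperature K, stress MPa, rupture time h);
# illustrative values only, not data from the report.
T  = np.array([900., 900., 1000., 1000., 1100., 1100.])
S  = np.array([200., 150., 120.,  90.,   60.,   40.])
tr = np.array([120., 900., 300., 2500., 800., 9000.])

C = 20.0                                  # conventional Larson-Miller constant
lmp = T * (C + np.log10(tr)) / 1000.0     # scaled to keep the fit well conditioned

# least-squares regression of log stress on the Larson-Miller parameter
coeffs = np.polyfit(lmp, np.log10(S), deg=2)

# predicted allowable stress at 1000 K for a 10,000 h design life
lmp_design = 1000.0 * (C + np.log10(1e4)) / 1000.0
print(f"predicted stress ~ {10 ** np.polyval(coeffs, lmp_design):.0f} MPa")
```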
Determining the Number of Factors in P-Technique Factor Analysis
ERIC Educational Resources Information Center
Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael
2017-01-01
Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…
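One widely used criterion for the number-of-factors decision is Horn's parallel analysis; the sketch below applies it to a synthetic single-subject (P-technique-style) data matrix. Note that it treats occasions as exchangeable and ignores the lagged structure that makes P-technique data special, which is part of the issue the article raises.

```python
import numpy as np

def parallel_analysis(data, n_iter=200, quantile=95, seed=0):
    """Horn's parallel analysis: retain factors whose observed correlation-matrix
    eigenvalues exceed those of random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.normal(size=(n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand, quantile, axis=0)
    return int(np.sum(obs > thresh))

# P-technique layout: rows are repeated occasions for ONE subject, columns are variables
rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 2))          # two latent factors drive the series
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.5 * rng.normal(size=(300, 6))
print("factors to retain:", parallel_analysis(X))   # expect 2
```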
Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour
2018-01-12
Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
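A minimal sketch of the kind of subgrouping described here: k-means on hypothetical post/pre response ratios, with the responder cluster identified by its grand mean. The data and the two-cluster criterion are illustrative assumptions, not those of any reviewed study.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical response ratios (post/pre stimulation) at four time points for
# 40 participants; values > 1 indicate facilitation. Illustrative data only.
rng = np.random.default_rng(2)
responders = rng.normal(1.4, 0.15, size=(20, 4))
non_responders = rng.normal(1.0, 0.15, size=(20, 4))
patterns = np.vstack([responders, non_responders])

# two-cluster solution on the response patterns
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(patterns)
grand_means = [patterns[km.labels_ == c].mean() for c in (0, 1)]
responder_cluster = int(np.argmax(grand_means))
print("cluster sizes:", np.bincount(km.labels_),
      "| responder cluster:", responder_cluster)
```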
Sidaway, Ben; Euloth, Tracey; Caron, Heather; Piskura, Matthew; Clancy, Jessica; Aide, Alyson
2012-07-01
The purpose of this study was to compare the reliability of three previously used techniques for the measurement of ankle dorsiflexion ROM (open-chained goniometry, closed-chained goniometry, and inclinometry) to a novel trigonometric technique. Twenty-one physiotherapy students used the four techniques to assess dorsiflexion range of motion in 24 healthy volunteers. All student raters underwent training to establish competence in the four techniques. Raters then measured dorsiflexion with a randomly assigned measuring technique four times over two sessions, one week apart. Data were analyzed using a technique-by-session analysis of variance, technique measurement variability being the primary index of reliability. Comparisons were also made between the measurements derived from the four techniques and those obtained from a computerized video analysis system. Analysis of the rater measurement variability around the technique means revealed significant differences between techniques, with the least variation found in the trigonometric technique. Significant differences were also found between the technique means, but no differences between sessions were evident. The trigonometric technique produced mean ROMs closest in value to those derived from computer analysis. Application of the trigonometric technique resulted in the least variability in measurement across raters and consequently should be considered for use when changes in dorsiflexion ROM need to be reliably assessed.
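The abstract does not spell out the geometry of the trigonometric technique; one plausible reading, assuming the two measured quantities are the forward excursion of the knee over the fixed foot and the tibia length, is sketched below as a purely hypothetical formulation.

```python
import math

def dorsiflexion_angle(knee_excursion_cm, tibia_length_cm):
    """Hypothetical trigonometric estimate: with the foot fixed, the tibia
    leans forward, so the dorsiflexion angle from vertical is
    atan(horizontal knee excursion / tibia length)."""
    return math.degrees(math.atan(knee_excursion_cm / tibia_length_cm))

# 10 cm forward excursion over a 38 cm tibia -> ~14.7 degrees of dorsiflexion
print(f"{dorsiflexion_angle(10.0, 38.0):.1f} deg")
```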
Advances in the analysis and design of constant-torque springs
NASA Technical Reports Server (NTRS)
McGuire, John R.; Yura, Joseph A.
1996-01-01
In order to improve the design procedure of constant-torque springs used in aerospace applications, several new analysis techniques have been developed. These techniques make it possible to accurately construct a torque-rotation curve for any general constant-torque spring configuration. These new techniques allow for friction in the system to be included in the analysis, an area of analysis that has heretofore been unexplored. The new analysis techniques also include solutions for the deflected shape of the spring as well as solutions for drum and roller support reaction forces. A design procedure incorporating these new capabilities is presented.
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
NASA Astrophysics Data System (ADS)
Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.
2002-03-01
Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of crashed aircraft. Another survival analysis tool is competing risks, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of models using Root Mean Square Error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
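A minimal sketch of the Lunn-McNeil data augmentation with a single stratified Cox fit, using the lifelines package and a toy data set. The small ridge penalty is only to keep the toy fit numerically stable; this is a simplified illustration, not the simulation protocol of the study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# toy competing-risks data: time, cause (0 = censored, 1 or 2), one covariate
df = pd.DataFrame({"time": [5., 8., 3., 9., 6., 4.],
                   "cause": [1, 2, 1, 0, 2, 1],
                   "x": [0.2, 1.1, 0.5, 0.9, 1.4, 0.1]})

# Lunn-McNeil augmentation: one row per subject per failure type, with the
# event flag set only on the row matching the observed cause
rows = []
for _, r in df.iterrows():
    for cause in (1, 2):
        rows.append({"time": r["time"],
                     "event": int(r["cause"] == cause),
                     "type2": int(cause == 2),           # failure-type indicator
                     "x": r["x"],
                     "x_type2": r["x"] * (cause == 2)})  # cause-specific effect
aug = pd.DataFrame(rows)

# single Cox fit on the augmented data, stratified by failure type;
# the ridge penalizer stabilizes estimation on this tiny example
cph = CoxPHFitter(penalizer=0.1)
cph.fit(aug, duration_col="time", event_col="event", strata=["type2"])
print(cph.summary[["coef", "p"]])
```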
Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M
2007-02-19
A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
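As an illustration of generating synthetic spectra by optimization, the sketch below fits Gaussian emission lines plus a linear background to a simulated measurement with scipy's least-squares routine. The line positions, widths, and noise level are invented, and the paper's actual physical model is certainly richer.

```python
import numpy as np
from scipy.optimize import least_squares

def synthetic_spectrum(params, wl):
    """Sum of Gaussian emission lines on a linear background."""
    bg0, bg1 = params[:2]
    model = bg0 + bg1 * wl
    for amp, center, width in params[2:].reshape(-1, 3):
        model += amp * np.exp(-0.5 * ((wl - center) / width) ** 2)
    return model

# simulated "measured" plasma spectrum: two emission lines plus noise
wl = np.linspace(500., 520., 400)
truth = np.array([5., 0.01, 80., 507., 0.4, 40., 515., 0.6])
rng = np.random.default_rng(3)
measured = synthetic_spectrum(truth, wl) + rng.normal(0, 1.0, wl.size)

# optimize the synthetic-spectrum parameters to match the measurement
x0 = np.array([0., 0., 50., 506., 1.0, 20., 514., 1.0])   # rough initial guess
fit = least_squares(lambda p: synthetic_spectrum(p, wl) - measured, x0)
print(f"fitted line centers: {fit.x[3]:.2f}, {fit.x[6]:.2f} nm")
```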
Digital techniques for ULF wave polarization analysis
NASA Technical Reports Server (NTRS)
Arthur, C. W.
1979-01-01
Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences among the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
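A hedged sketch of the spectral-matrix construction common to these techniques: average 3x3 cross-spectral matrices over data segments, then evaluate Samson's degree of polarization at the wave frequency. The circularly polarized test signal is synthetic, not magnetometer data.

```python
import numpy as np

def spectral_matrix(bx, by, bz, fs, nseg=8):
    """Average 3x3 cross-spectral matrices over segments (Welch-style)."""
    comps = [np.asarray(c, float) for c in (bx, by, bz)]
    n = len(comps[0]) // nseg
    S = None
    for k in range(nseg):
        F = np.array([np.fft.rfft(c[k*n:(k+1)*n] * np.hanning(n)) for c in comps])
        Sk = np.einsum('if,jf->fij', F, F.conj())   # per-frequency outer products
        S = Sk if S is None else S + Sk
    return np.fft.rfftfreq(n, 1.0 / fs), S / nseg

def degree_of_polarization(S3):
    """Samson's degree of polarization for one 3x3 spectral matrix."""
    tr = np.trace(S3).real
    tr2 = np.trace(S3 @ S3).real
    return np.sqrt((3 * tr2 - tr ** 2) / (2 * tr ** 2))

# synthetic circularly polarized wave at 0.1 Hz buried in noise (illustrative)
fs = 1.0
t = np.arange(4096)
rng = np.random.default_rng(4)
bx = np.cos(2 * np.pi * 0.1 * t) + 0.3 * rng.normal(size=t.size)
by = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.normal(size=t.size)
bz = 0.3 * rng.normal(size=t.size)

freqs, S = spectral_matrix(bx, by, bz, fs)
i = np.argmin(np.abs(freqs - 0.1))
print(f"degree of polarization at 0.1 Hz: {degree_of_polarization(S[i]):.2f}")
```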
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1985-01-01
Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on optimal control frequency-domain techniques. These techniques stem from an optimal control treatment of a Neal-Smith-like analysis of aircraft attitude dynamics, extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in-depth analysis of the effect of the experimental variables, such as the prefilter, is conducted to gain further insight into the flared landing task for this class of vehicle dynamics.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
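For orientation, the sketch below builds the ordinary reachability tree of a small untimed Petri net by breadth-first search; the paper's contribution, clock-stamped state classes for time Petri nets, adds clock information to each class and is not reproduced here.

```python
from collections import deque

# A Petri net as (pre, post) matrices: rows = transitions, columns = places.
pre  = [(1, 0, 0), (0, 1, 0)]        # tokens consumed by t0, t1
post = [(0, 1, 0), (0, 0, 1)]        # tokens produced by t0, t1
m0 = (1, 0, 0)                       # initial marking

def enabled(m, t):
    return all(m[p] >= pre[t][p] for p in range(len(m)))

def fire(m, t):
    return tuple(m[p] - pre[t][p] + post[t][p] for p in range(len(m)))

# breadth-first construction of the reachability graph
seen, queue, edges = {m0}, deque([m0]), []
while queue:
    m = queue.popleft()
    for t in range(len(pre)):
        if enabled(m, t):
            m2 = fire(m, t)
            edges.append((m, t, m2))
            if m2 not in seen:
                seen.add(m2)
                queue.append(m2)

print(len(seen), "reachable markings;", edges)
```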
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
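A deliberately small sketch of the borrowing-strength idea behind hierarchical Bayesian treatment of rare events: a beta-binomial layer pools event counts across data sources. The method-of-moments hyperparameters and all counts below are hypothetical; the paper's model and inference are more elaborate.

```python
import numpy as np
from scipy import stats

# Rare-event counts from several data sources (hypothetical): n trials, k events.
# A beta-binomial hierarchy lets sparse sources borrow strength from the pool.
k = np.array([0, 1, 0, 2, 0])
n = np.array([120, 300, 80, 450, 60])

# crude pooled estimate used to set Beta prior hyperparameters
p = (k.sum() + 0.5) / (n.sum() + 1.0)
alpha0, beta0 = p * 10.0, (1 - p) * 10.0   # prior "weight" of 10 pseudo-trials

# posterior event probability per source: Beta(alpha0 + k, beta0 + n - k)
post_mean = (alpha0 + k) / (alpha0 + beta0 + n)
lo, hi = stats.beta.interval(0.9, alpha0 + k, beta0 + n - k)
for i in range(len(k)):
    print(f"source {i}: mean {post_mean[i]:.4f}, 90% CI ({lo[i]:.4f}, {hi[i]:.4f})")
```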
Factor Analysis and Counseling Research
ERIC Educational Resources Information Center
Weiss, David J.
1970-01-01
Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…
Elimination of Perchlorate Oxidizers from Pyrotechnic Flare Compositions
2007-03-09
cd: the unit of luminous intensity, the candela, defined as 1 cd = 1 lumen/steradian. DSC: a thermal analysis technique known as Differential Scanning Calorimetry. Shorter Wavelength Infrared: a band routinely monitored in decoy flare performance tests. TGA: a thermal analysis technique known as Thermogravimetric Analysis. DTA: a thermal analysis technique known as Differential Thermal Analysis. GAP: Glycidyl Azide Polymer, used as a curable binder in some...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2014 CFR
2014-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2013 CFR
2013-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...
Advanced Undergraduate Experiments in Thermoanalytical Chemistry.
ERIC Educational Resources Information Center
Hill, J. O.; Magee, R. J.
1988-01-01
Describes several experiments using the techniques of thermal analysis and thermometric titrimetry. Defines thermal analysis and several recent branches of the technique. Notes most of the experiments use simple equipment and standard laboratory techniques. (MVL)
Development of analysis techniques for remote sensing of vegetation resources
NASA Technical Reports Server (NTRS)
Draeger, W. C.
1972-01-01
Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.
Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.
ERIC Educational Resources Information Center
Lynch, James; And Others
1996-01-01
Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Towards generating ECSS-compliant fault tree analysis results via ConcertoFLA
NASA Astrophysics Data System (ADS)
Gallina, B.; Haider, Z.; Carlsson, A.
2018-05-01
Attitude Control Systems (ACSs) maintain the orientation of the satellite in three-dimensional space. ACSs need to be engineered in compliance with ECSS standards and need to ensure a certain degree of dependability. Thus, dependability analysis is conducted at various levels and by using ECSS-compliant techniques. Fault Tree Analysis (FTA) is one of these techniques. FTA is being automated within various Model Driven Engineering (MDE)-based methodologies. The tool-supported CHESS methodology is one of them. This methodology incorporates ConcertoFLA, a dependability analysis technique enabling failure behavior analysis and thus the generation of FTA results. ConcertoFLA, however, similarly to other techniques, still belongs to the academic research niche. To promote this technique within the space industry, we apply it to an ACS and discuss its multi-faceted potential in the context of ECSS-compliant engineering.
Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.
Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao
2018-01-01
To study the application of the dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata, cases with suspected urethral condylomata acuminata were examined by dermoscopy in order to explore an effective clinical method. A total of 220 suspected urethral condylomata acuminata cases were first diagnosed clinically with the naked eye, and then by using the dermoscopy image analysis technique. Afterwards, a comparative analysis was made of the two diagnostic methods. Among the 220 suspected cases, dermoscopy examination yielded a higher positive rate than visual observation. The dermoscopy examination technique is still restricted by its inapplicability in the deep urethral orifice and in skin wrinkles, and concordance between different clinicians may also vary. The dermoscopy image analysis technique features high sensitivity and quick, accurate, non-invasive diagnosis, and we recommend its use.
NASA Technical Reports Server (NTRS)
Lindstrom, David J.; Lindstrom, Richard M.
1989-01-01
Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
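One standard way to extract a frequency response from a throttle frequency sweep is the H1 spectral estimate; the sketch below applies it to a simulated first-order engine lag. The lag constant, sample rate, and noise level are assumptions for illustration, not values from the flight tests.

```python
import numpy as np
from scipy import signal

# Frequency-domain comparison of a measured response (e.g., calculated thrust)
# to the commanded throttle sweep, via Welch spectra: H1(f) = Pxy / Pxx.
fs = 50.0                                             # sample rate, Hz (assumed)
t = np.arange(0, 120, 1 / fs)
throttle = signal.chirp(t, f0=0.05, f1=2.0, t1=120)   # slow-to-fast sweep

# stand-in "engine": first-order lag with ~0.5 s time constant, plus noise
b, a = signal.butter(1, 1 / (2 * np.pi * 0.5), fs=fs)
rng = np.random.default_rng(7)
thrust = signal.lfilter(b, a, throttle) + 0.02 * rng.normal(size=t.size)

f, Pxx = signal.csd(throttle, throttle, fs=fs, nperseg=2048)
_, Pxy = signal.csd(throttle, thrust, fs=fs, nperseg=2048)
H = Pxy / Pxx                                         # H1 frequency-response estimate
i = np.argmin(np.abs(f - 0.3))
print(f"gain at 0.3 Hz: {abs(H[i]):.2f}, "
      f"phase: {np.degrees(np.angle(H[i])):.1f} deg")
```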
Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory
NASA Astrophysics Data System (ADS)
Steels, R. S., Jr.; Babelay, E. F., Jr.
1980-07-01
Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high-performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure the torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.
Approaches to answering critical CER questions.
Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y
2015-01-01
While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
Rasch Analysis for Instrument Development: Why, When, and How?
ERIC Educational Resources Information Center
Boone, William J.
2016-01-01
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to…
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error during chemical analysis with the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry which allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. Trends in the development of direct analysis using the aforementioned techniques are also presented.
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No. 2 (BSM2). These techniques make it possible to (i) determine natural groups or clusters of control strategies with similar behaviour, (ii) find and interpret hidden, complex and causal relationships in the data set, and (iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
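The following sketch shows the general shape of such a workflow (standardize an evaluation matrix, extract principal components, cluster the strategies). The matrix here is a random placeholder, not BSM2 output, and the component and cluster counts are arbitrary.

```python
# Illustrative sketch of the kind of multivariable workflow described above.
# Rows are control strategies, columns are evaluation criteria (effluent
# quality, aeration energy, etc.); values here are random placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8))                 # 30 strategies x 8 criteria
Xs = StandardScaler().fit_transform(X)       # standardize the criteria

pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("cluster of each control strategy:", labels)
```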
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
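A minimal sketch of the model-fitting step the standard describes: fit linear, quadratic, and exponential trend models to a time series and compare RMS residuals. The data and parameter choices below are invented, not drawn from the standard.

```python
# Fit the three trend-model families named above to synthetic monthly data
# and compare residuals. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(24, dtype=float)                      # e.g. months
y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(2).normal(0, 1, t.size)

lin = np.polyfit(t, y, 1)                           # linear trend
quad = np.polyfit(t, y, 2)                          # quadratic trend
pars, _ = curve_fit(lambda x, a, b: a * np.exp(b * x), t, y, p0=(1.0, 0.05))

for name, fit in [("linear", np.polyval(lin, t)),
                  ("quadratic", np.polyval(quad, t)),
                  ("exponential", pars[0] * np.exp(pars[1] * t))]:
    print(name, "RMS residual:", np.sqrt(np.mean((y - fit) ** 2)).round(3))
```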
Towards Effective Clustering Techniques for the Analysis of Electric Power Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh
2013-11-30
Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.
Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E
2017-01-01
Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]. Copyright 2017, SLACK Incorporated.
2015-01-01
…for IC fault detection. This section provides background information on inversion methods. Conventional inversion techniques and their shortcomings are… physical techniques, electron beam imaging/analysis, ion beam techniques, scanning probe techniques. Electrical tests are used to detect faults in an… hand, there is also the second harmonic technique, through which duty cycle degradation faults are detected by collecting the magnitude and the phase of…
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
Biostatistics Series Module 10: Brief Overview of Multivariate Methods.
Hazra, Avijit; Gogtay, Nithya
2017-01-01
Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation, with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, which make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count-type data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It examines whether a difference persists after "controlling" for the effect of a covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied in psychometrics, the social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract, from a larger number of metric variables, a smaller number of composite factors or components that are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation-intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with the wider availability and increasing sophistication of statistical software, and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
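To make the dependence/interdependence distinction concrete, here is a toy sketch pairing one technique from each family: multiple linear regression (dependence) and principal component analysis (interdependence). All data are simulated.

```python
# Toy illustration of one dependence technique (multiple linear regression)
# and one interdependence technique (PCA) from the overview above.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(0, 0.5, n)    # numerical outcome

reg = LinearRegression().fit(np.column_stack([x1, x2]), y)
print("regression coefficients:", reg.coef_.round(2))   # about [2, -1]

# Interdependence: extract a common component from correlated measurements
latent = rng.normal(size=n)
X = np.column_stack([latent + rng.normal(0, 0.3, n) for _ in range(4)])
pca = PCA().fit(X)
print("variance explained by first component:",
      pca.explained_variance_ratio_[0].round(3))
```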
Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is also discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
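One simple way to combine elicited subjective estimates, consistent with the handbook's theme though not necessarily one of its twelve approaches, is to encode each expert's (low, most likely, high) triple as a triangular distribution and aggregate by Monte Carlo simulation. The numbers below are invented.

```python
# Sketch: encode experts' (low, most likely, high) cost estimates as
# triangular distributions and aggregate by Monte Carlo. Values invented.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
structures = rng.triangular(8, 10, 15, N)    # expert 1's estimate
propulsion = rng.triangular(20, 24, 35, N)   # expert 2's estimate
avionics = rng.triangular(5, 6, 9, N)        # expert 3's estimate

total = structures + propulsion + avionics
print("mean total cost:", total.mean().round(2))
print("80th percentile:", np.percentile(total, 80).round(2))
print("P(total > 50):", (total > 50).mean().round(3))
```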
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
Quaternary Pollen Analysis in Secondary School Ecology
ERIC Educational Resources Information Center
Slater, F. M.
1972-01-01
Describes techniques for studying historic changes in climate by analysis of pollen preserved in peat bogs. Illustrates the methodology and data analysis techniques by reference to results from English research. (AL)
Residual stresses of thin, short rectangular plates
NASA Technical Reports Server (NTRS)
Andonian, A. T.; Danyluk, S.
1985-01-01
The analysis of the residual stresses in thin, short rectangular plates is presented. The analysis is used in conjunction with a shadow moire interferometry technique by which residual stresses are obtained over a large spatial area from a strain measurement. The technique and analysis are applied to a residual stress measurement of polycrystalline silicon sheet grown by the edge-defined film-fed growth technique.
Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media
Chen, Jing; Fang, Yanjun
2007-01-01
A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis using the flow injection technique over the last 10 years. A brief discussion of both chemiluminescence and the flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
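For orientation, the classical four-step phase-stepping calculation (a standard quasi-heterodyne method, not necessarily the authors' exact variant) can be sketched in a few lines: four fringe images with 90-degree phase shifts yield the wrapped phase, which is then unwrapped.

```python
# Four-step phase-stepping sketch on a synthetic fringe pattern.
# I_k = A + B*cos(phase + k*pi/2) for k = 0..3; arctan2 recovers the phase.
import numpy as np

x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
true_phase = 6 * np.pi * x * y                 # synthetic phase field

A, B = 1.0, 0.5
I = [A + B * np.cos(true_phase + k * np.pi / 2) for k in range(4)]

phase = np.arctan2(I[3] - I[1], I[0] - I[2])   # wrapped to (-pi, pi]
unwrapped = np.unwrap(np.unwrap(phase, axis=0), axis=1)
print("max abs reconstruction error:", np.abs(unwrapped - true_phase).max())
```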
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing: Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2011 CFR
2011-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 15.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2012 CFR
2012-10-01
... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...
48 CFR 215.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
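As a small taste of the statistical domain of such techniques, the sketch below computes a few common time-domain variability measures for a synthetic R-R interval series; the names and thresholds follow common heart-rate-variability usage rather than the review's own classification.

```python
# Simple time-domain variability measures on a synthetic R-R series [s].
import numpy as np

rng = np.random.default_rng(5)
rr = 0.8 + 0.05 * np.sin(np.arange(300) / 10) + rng.normal(0, 0.02, 300)

sdnn = rr.std(ddof=1)                    # overall variability
diffs = np.diff(rr)
rmssd = np.sqrt(np.mean(diffs ** 2))     # short-term variability
pnn50 = np.mean(np.abs(diffs) > 0.05)    # fraction of deltas > 50 ms
cv = sdnn / rr.mean()                    # coefficient of variation

print(f"SDNN={sdnn:.4f}s RMSSD={rmssd:.4f}s pNN50={pnn50:.2%} CV={cv:.3f}")
```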
NASA Technical Reports Server (NTRS)
Landmann, A. E.; Tillema, H. F.; Marshall, S. E.
1989-01-01
The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation, including finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used for comparison with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to remain cautious and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.
Flow analysis techniques for phosphorus: an overview.
Estela, José Manuel; Cerdà, Víctor
2005-04-15
A bibliographical review of the implementation and the results obtained with different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification and a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, covering their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms.
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
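Galway's algorithm itself is not reproduced in the abstract, but the underlying idea of iterating between a calibration fit and a tare-load estimate can be caricatured as a fixed-point loop. The sketch below is a deliberately simplified single-gage, linear-model stand-in with invented numbers; it is not the actual balance algorithm.

```python
# Highly simplified sketch of the *idea* of iterative tare-load estimation
# (not Galway's algorithm): alternate between fitting a calibration model
# and re-estimating a constant tare load until the estimate converges.
import numpy as np

rng = np.random.default_rng(6)
applied = np.linspace(0, 100, 21)            # hand loads [lbf]
true_tare, sens = 3.7, 0.02                  # assumed tare load, sensitivity
output = sens * (applied + true_tare) + rng.normal(0, 1e-4, applied.size)

tare = 0.0
for it in range(50):
    # Fit sensitivity assuming the current tare estimate
    s = np.linalg.lstsq((applied + tare)[:, None], output, rcond=None)[0][0]
    # Re-estimate the tare from the zero-applied-load (wind-off) output
    new_tare = output[0] / s - applied[0]
    if abs(new_tare - tare) < 1e-9:
        break
    tare = new_tare

print(f"converged after {it} iterations: tare ~ {tare:.3f} (true {true_tare})")
```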
Hybrid soft computing systems for electromyographic signals analysis: a review.
Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates
2014-02-03
The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, system designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Peters, Jeanne M.
1987-01-01
An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
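The general PCG recipe the technique builds on can be written compactly. In the sketch below a diagonal (Jacobi) preconditioner stands in for the orthotropic part of the stiffness matrix, and the matrix is a random symmetric positive definite placeholder rather than a structural model.

```python
# Textbook preconditioned conjugate gradient, shown to illustrate the idea
# of preconditioning with a simpler "nearby" matrix. A Jacobi preconditioner
# here plays the role the orthotropic stiffness plays in the paper.
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=500):
    """Solve A x = b; M_inv(r) applies an approximate inverse of A to r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

rng = np.random.default_rng(7)
Q = rng.normal(size=(50, 50))
A = Q @ Q.T + 50 * np.eye(50)            # symmetric positive definite
b = rng.normal(size=50)
d = np.diag(A)
x, iters = pcg(A, b, lambda r: r / d)    # Jacobi preconditioner
print(iters, "iterations, residual norm:", np.linalg.norm(A @ x - b))
```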
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
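A toy contrast between the two treatments of uncertainty on a miniature fault tree, TOP = OR(AND(A, B), C): point probabilities give a crisp top-event probability, while propagating [low, high] bounds (in the spirit of an alpha-cut of a fuzzy number) gives an interval. The values are invented, and this is far simpler than IRRAS or FUZZYFTA.

```python
# Crisp vs interval evaluation of a tiny fault tree TOP = OR(AND(A, B), C),
# assuming independent basic events. Values are invented for illustration.
def p_and(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Probabilistic: point estimates of component failure probabilities
print("crisp TOP probability:", p_or(p_and(1e-3, 2e-3), 5e-4))

# Fuzzy-style: propagate [low, high] bounds (the tree is monotone, so the
# interval endpoints map to the endpoints of the result)
A, B, C = (5e-4, 2e-3), (1e-3, 4e-3), (2e-4, 1e-3)
low = p_or(p_and(A[0], B[0]), C[0])
high = p_or(p_and(A[1], B[1]), C[1])
print(f"TOP interval: [{low:.2e}, {high:.2e}]")
```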
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
Performance analysis of clustering techniques over microarray data: A case study
NASA Astrophysics Data System (ADS)
Dash, Rasmita; Misra, Bijan Bihari
2018-03-01
Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. Because the grading approach depends on the characteristics of the dataset as well as on the validity indices, a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also confirmed by the Nemenyi post-hoc hypothesis test.
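A reduced version of such a comparison (two clustering algorithms scored by two validity indices on stand-in data, rather than five techniques on microarray sets) might look like this; scikit-learn's AgglomerativeClustering plays the role of AGNES here.

```python
# Compare clustering techniques with validity indices, in the spirit of the
# grading approach above. Data are random stand-ins with three true groups.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score, calinski_harabasz_score

rng = np.random.default_rng(8)
X = np.vstack([rng.normal(c, 0.5, size=(50, 10)) for c in (0, 3, 6)])

for name, algo in [("k-means", KMeans(n_clusters=3, n_init=10, random_state=0)),
                   ("AGNES-style", AgglomerativeClustering(n_clusters=3))]:
    labels = algo.fit_predict(X)
    print(name,
          "silhouette:", round(silhouette_score(X, labels), 3),
          "Calinski-Harabasz:", round(calinski_harabasz_score(X, labels), 1))
```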
Analysis technique for controlling system wavefront error with active/adaptive optics
NASA Astrophysics Data System (ADS)
Genberg, Victor L.; Michels, Gregory J.
2017-08-01
The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
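The core least-squares step of wavefront control with a linear optics model can be sketched as follows, with W = W0 + A c; the influence matrix A and disturbance W0 below are random placeholders rather than outputs of SigFit or any structural model.

```python
# Conceptual least-squares wavefront control with a linear optics model:
# residual WFE = W0 + A c, solve for actuator commands c minimizing its norm.
import numpy as np

rng = np.random.default_rng(9)
n_nodes, n_act = 400, 12
A = rng.normal(size=(n_nodes, n_act))    # influence functions (placeholder)
W0 = rng.normal(size=n_nodes)            # disturbance wavefront error

c, *_ = np.linalg.lstsq(A, -W0, rcond=None)   # least-squares commands
residual = W0 + A @ c

print("RMS WFE before:", np.std(W0).round(3))
print("RMS WFE after: ", np.std(residual).round(3))
```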
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information, such as the localization constraint used in rotational techniques.
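The contrast is easy to demonstrate on synthetic data: mix two independent sources linearly, then compare how well PCA and ICA recover them. This sketch uses scikit-learn's FastICA rather than the authors' implementation.

```python
# PCA decorrelates; ICA seeks statistically independent components and can
# unmix a linear sum of sources. Two synthetic sources, linear mixing.
import numpy as np
from sklearn.decomposition import PCA, FastICA

t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)                      # smooth oscillation
s2 = np.sign(np.sin(3 * np.pi * t))             # square wave
S = np.column_stack([s1, s2])
X = S @ np.array([[1.0, 0.5], [0.4, 1.0]]).T    # mixed observations

pca_comps = PCA(n_components=2).fit_transform(X)
ica_comps = FastICA(n_components=2, random_state=0).fit_transform(X)

def corr_with_sources(comps):
    # |correlation| of each extracted component with each true source
    return np.abs(np.corrcoef(comps.T, S.T)[:2, 2:])

print("PCA |corr| with sources:\n", corr_with_sources(pca_comps).round(2))
print("ICA |corr| with sources:\n", corr_with_sources(ica_comps).round(2))
```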
The combined use of order tracking techniques for enhanced Fourier analysis of order components
NASA Astrophysics Data System (ADS)
Wang, K. S.; Heyns, P. S.
2011-04-01
Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
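For readers unfamiliar with computed order tracking, the essential resampling step can be sketched as follows: map the vibration signal from uniform time increments to uniform shaft-angle increments, then take an FFT to obtain an order spectrum. The run-up, angle signal, and order content below are synthetic.

```python
# Computed order tracking sketch: time-domain signal -> angle-domain
# resampling -> order spectrum. Synthetic run-up with orders 2 and 5.
import numpy as np

fs = 2048.0
t = np.arange(0, 10, 1 / fs)
f_shaft = 10 + 4 * t                              # shaft speed 10 -> 50 Hz
angle = 2 * np.pi * np.cumsum(f_shaft) / fs       # shaft angle [rad]
vib = np.sin(2 * angle) + 0.3 * np.sin(5 * angle)

# Resample to constant angle increments (the "computed" step)
samples_per_rev = 64
ang_uniform = np.arange(0, angle[-1], 2 * np.pi / samples_per_rev)
vib_ang = np.interp(ang_uniform, angle, vib)

spec = 2 * np.abs(np.fft.rfft(vib_ang)) / len(vib_ang)
orders = np.fft.rfftfreq(len(vib_ang), d=1 / samples_per_rev)
print("two largest peaks at orders:", orders[np.argsort(spec)[-2:]])  # 5, 2
```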
Cognitive task analysis: Techniques applied to airborne weapons training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terranova, M.; Seamster, T.L.; Snyder, C.E.
1989-01-01
This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C
2004-09-30
Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.
Technologies for Clinical Diagnosis Using Expired Human Breath Analysis
Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji
2015-01-01
This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze some exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath, and its related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142
Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques
Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.
2013-01-01
Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.
1976-10-01
A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique… includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of…
Eigensystem analysis of classical relaxation techniques with applications to multigrid analysis
NASA Technical Reports Server (NTRS)
Lomax, Harvard; Maksymiuk, Catherine
1987-01-01
Classical relaxation techniques are related to numerical methods for solution of ordinary differential equations. Eigensystems for Point-Jacobi, Gauss-Seidel, and SOR methods are presented. Solution techniques such as eigenvector annihilation, eigensystem mixing, and multigrid methods are examined with regard to the eigenstructure.
Characterization of emission microscopy and liquid crystal thermography in IC fault localization
NASA Astrophysics Data System (ADS)
Lau, C. K.; Sim, K. S.
2013-05-01
This paper characterizes two fault localization techniques, Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT), using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. Many analysts mistakenly assume that both techniques are the same and that both detect hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a much simpler test setup than EMMI. Omitting EMMI as the alternative fault localization technique leads to incomplete analysis whenever LCT fails to localize a hot spot on a failing chip. This research was therefore established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method, the backside LCT technique, was also proposed as an alternative. Both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach; EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations, and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative to backside EMMI. The research confirmed that LCT detects heat generation and EMMI detects photon emission (recombination radiation). The analysis results also suggest that the two techniques complement each other in IC fault localization, and that a failure analyst should use both techniques when one of them produces no result.
Analysis of "The Wonderful Desert." Technical Report No. 170.
ERIC Educational Resources Information Center
Green, G. M.; And Others
This report presents a text analysis of "The Wonderful Desert," a brief selection from the "Reader's Digest Skill Builder" series. (The techniques used in arriving at the analysis are presented in a Reading Center Technical Report, Number 168, "Problems and Techniques of Text Analysis.") Tables are given for a…
Ali, S. J.; Kraus, R. G.; Fratanduono, D. E.; ...
2017-05-18
Here, we developed an iterative forward analysis (IFA) technique with the ability to use hydrocode simulations as a fitting function for analysis of dynamic compression experiments. The IFA method optimizes over parameterized quantities in the hydrocode simulations, breaking the degeneracy of contributions to the measured material response. Velocity profiles from synthetic data generated using a hydrocode simulation are analyzed as a first-order validation of the technique. We also analyze multiple magnetically driven ramp compression experiments on copper and compare with more conventional techniques. Excellent agreement is obtained in both cases.
NASA Astrophysics Data System (ADS)
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-09-01
This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills developed include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, and digital seals to open source searches and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.
NASA Technical Reports Server (NTRS)
Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.
2013-01-01
Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.
Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
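The object recognition and quantitative analysis phases can be illustrated with standard connected-component labeling on a toy binary image; the sketch below uses scipy.ndimage rather than the thesis's own implementation.

```python
# Count objects in a binary image and compute simple geometric properties
# per object via connected-component labeling (segmentation assumed done).
import numpy as np
from scipy import ndimage

img = np.zeros((64, 64), dtype=int)
img[5:15, 5:20] = 1                      # rectangle
img[30:50, 30:50] = 1                    # square
img[55:58, 10:12] = 1                    # small blob

labels, n_objects = ndimage.label(img)   # default 4-connectivity in 2D
print("objects found:", n_objects)

areas = ndimage.sum(img, labels, range(1, n_objects + 1))
centroids = ndimage.center_of_mass(img, labels, range(1, n_objects + 1))
for i, (area, c) in enumerate(zip(areas, centroids), 1):
    print(f"object {i}: area={int(area)} px, "
          f"centroid=({c[0]:.1f}, {c[1]:.1f})")
```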
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
A methodological comparison of customer service analysis techniques
James Absher; Alan Graefe; Robert Burns
2003-01-01
Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
Phased-mission system analysis using Boolean algebraic methods
NASA Technical Reports Server (NTRS)
Somani, Arun K.; Trivedi, Kishor S.
1993-01-01
Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state space explosion that commonly plagues Markov chain-based analysis. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the use of our technique by means of an example and present numerical results to show the effects of mission phases on the system reliability.
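As a sanity check on what a phased-mission computation produces, the sketch below estimates mission reliability for a tiny two-phase example by Monte Carlo simulation, which is a stand-in for, not an instance of, the Boolean-algebraic technique. Because failures are permanent, checking each phase's success criterion at the phase end is equivalent to checking it throughout the phase.

```python
# Monte Carlo estimate of phased-mission reliability: two components with
# phase-dependent failure rates; phase 1 needs A OR B, phase 2 needs A AND B.
# Rates and durations are invented for illustration.
import numpy as np

rng = np.random.default_rng(10)
N = 500_000
durations = [10.0, 5.0]                          # phase durations
rates = {"A": [1e-3, 4e-3], "B": [2e-3, 2e-3]}   # per-phase hazard rates

alive = {c: np.ones(N, dtype=bool) for c in rates}
mission_ok = np.ones(N, dtype=bool)
for ph, d in enumerate(durations):
    for c in rates:
        # survival through this phase, given still alive (failures persist)
        alive[c] &= rng.random(N) < np.exp(-rates[c][ph] * d)
    ok = (alive["A"] | alive["B"]) if ph == 0 else (alive["A"] & alive["B"])
    mission_ok &= ok

print("simulated mission reliability:", mission_ok.mean().round(4))
```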
Bilek, Maciej; Namieśnik, Jacek
2016-01-01
For a long time, chromatographic techniques and techniques related to them have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on the review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was the so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, the chromatographic techniques were mostly a subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography, took place at the end of the 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products, as well as in pharmacokinetics studies. The first decade of the 21st century was the time of new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Pharmacia". The number of published works using various chromatography techniques was assessed based on the content description of individual issues of the journal "Acta Poloniae Pharmaceutica".
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
Application of a sensitivity analysis technique to high-order digital flight control systems
NASA Technical Reports Server (NTRS)
Paduano, James D.; Downing, David R.
1987-01-01
A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis techniques; computational methods and considerations therefore form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program running on the NASA/Dryden Elxsi computer.
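A rough sketch of the core computation, assuming a made-up 2x2 loop transfer matrix; the minimum singular value of the return-difference matrix and a finite-difference gradient stand in for the analytic gradient equations derived in the report:

```python
import numpy as np

def loop_tf(w, k):
    """Hypothetical 2x2 loop transfer matrix L(jw) with scalar gain k."""
    s = 1j * w
    return np.array([[k / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 0.5), k / (s + 3.0)]])

def min_sv_return_difference(k, freqs):
    # Minimum singular value of I + L(jw) over the frequency grid.
    return min(np.linalg.svd(np.eye(2) + loop_tf(w, k), compute_uv=False)[-1]
               for w in freqs)

freqs = np.logspace(-2, 2, 200)
k0, dk = 2.0, 1e-6
s0 = min_sv_return_difference(k0, freqs)
grad = (min_sv_return_difference(k0 + dk, freqs) - s0) / dk
print(f"min sigma(I+L) = {s0:.4f}, d(min sigma)/dk ~ {grad:.4f}")
```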
NASA Astrophysics Data System (ADS)
Shao, Xupeng
2017-04-01
Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, but their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. Compared with the conventional methods, the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage of dividing sequence stratigraphy quantitatively. On the basis of the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both dividing sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence in granularity. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
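As a hedged illustration of the logging-data side, the sketch below applies a continuous wavelet transform to a synthetic gamma-ray log using PyWavelets; ridges of wavelet power at large scales are the kind of feature used to pick sequence boundaries (the log, cycles, and scales are invented):

```python
import numpy as np
import pywt

depth = np.arange(0.0, 500.0, 0.5)                  # depth axis in metres
log = (np.sin(2 * np.pi * depth / 80.0)             # long-period cycle
       + 0.5 * np.sin(2 * np.pi * depth / 15.0)     # short-period cycle
       + 0.2 * np.random.default_rng(0).standard_normal(depth.size))

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(log, scales, "morl", sampling_period=0.5)

# Persistent maxima of |coeffs|^2 at large scales mark candidate boundaries.
power = np.abs(coeffs) ** 2
print(power.shape)   # (scales, samples): scan rows for persistent ridges
```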
Finite element modeling of truss structures with frequency-dependent material damping
NASA Technical Reports Server (NTRS)
Lesieutre, George A.
1991-01-01
A physically motivated modeling technique for structural dynamic analysis that accommodates frequency-dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (ATF) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The ATF method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.
Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.
2004-01-01
A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.
There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determine whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature, cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques, are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.
Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques
2018-04-30
The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice data.
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A proposed signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and gives a good basis for an integrated induction machine condition monitor.
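A simplified sketch of the kurtogram idea, assuming SciPy: band-pass the vibration record over a grid of (centre frequency, bandwidth) cells and map the kurtosis of each band's envelope, with high-kurtosis cells flagging impulsive bearing faults (the signal and band grid are synthetic, not from the paper):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import kurtosis

fs = 20_000.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
signal = rng.standard_normal(t.size)     # stand-in vibration record
signal[::173] += 8.0                     # synthetic periodic fault impacts

def band_kurtosis(x, f_lo, f_hi):
    # Kurtosis of the analytic envelope of the band-passed signal.
    b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
    envelope = np.abs(hilbert(filtfilt(b, a, x)))
    return kurtosis(envelope)

best = max(((band_kurtosis(signal, lo, lo + bw), lo, bw)
            for bw in (500.0, 1000.0, 2000.0)
            for lo in np.arange(500.0, fs / 2 - bw, bw / 2)),
           key=lambda r: r[0])
print(f"max kurtosis {best[0]:.1f} in band "
      f"[{best[1]:.0f}, {best[1] + best[2]:.0f}] Hz")
```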
Holographic Interferometry and Image Analysis for Aerodynamic Testing
1980-09-01
...tunnels, (2) development of automated image analysis techniques for reducing quantitative flow-field data from holographic interferograms, and (3) investigation and development of software for the application of digital image analysis to other photographic techniques used in wind tunnel testing.
Iterative categorization (IC): a systematic technique for analysing qualitative data
2016-01-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155
Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero
2000-01-01
This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using a Genetic Algorithm (GA) search combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were also considered: one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search combined with a criterion based on HSVs. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor and shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using the various reduction techniques, a Hybrid TAM reduction technique was selected and used for all three vehicle fuel level configurations.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
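A minimal sketch of the evaluation flow the claims describe: per-step parameters are compiled and summed into simple process metrics. The field names and metrics below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time_min: float      # value-adding processing time
    queue_time_min: float      # waiting time between steps
    defect_rate: float         # fraction scrapped or reworked

steps = [
    ProcessStep("stamp", 2.0, 30.0, 0.02),
    ProcessStep("weld", 5.0, 45.0, 0.05),
    ProcessStep("paint", 3.0, 60.0, 0.01),
]

# Compile per-step data, then sum into process-level metrics.
total_cycle = sum(s.cycle_time_min for s in steps)
total_lead = total_cycle + sum(s.queue_time_min for s in steps)
rolled_yield = 1.0
for s in steps:
    rolled_yield *= 1.0 - s.defect_rate
print(f"lead time {total_lead:.0f} min, "
      f"value-added ratio {total_cycle / total_lead:.1%}, "
      f"rolled yield {rolled_yield:.1%}")
```

Comparing such metrics before and after a candidate lean change is the kind of batch-versus-lean comparison the method automates.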
ERIC Educational Resources Information Center
Foley, John P., Jr.
A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques, specialized to address three specific hot section components, is presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress, and strain histories throughout a complete flight mission.
Gorzsás, András; Sundberg, Björn
2014-01-01
Fourier transform infrared (FT-IR) spectroscopy is a fast, sensitive, inexpensive, and nondestructive technique for chemical profiling of plant materials. In this chapter we discuss the instrumental setup, the basic principles of analysis, and the possibilities for and limitations of obtaining qualitative and semiquantitative information by FT-IR spectroscopy. We provide detailed protocols for four fully customizable techniques: (1) Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS): a sensitive and high-throughput technique for powders; (2) attenuated total reflectance (ATR) spectroscopy: a technique that requires no sample preparation and can be used for solid samples as well as for cell cultures; (3) microspectroscopy using a single element (SE) detector: a technique used for analyzing sections at low spatial resolution; and (4) microspectroscopy using a focal plane array (FPA) detector: a technique for rapid chemical profiling of plant sections at cellular resolution. Sample preparation, measurement, and data analysis steps are listed for each of the techniques to help the user collect the best quality spectra and prepare them for subsequent multivariate analysis.
A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.
Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian
2018-01-19
This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during sampling has a great impact on measurement precision in conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work using the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Srivastava, Anjali
The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Instrumental activation analysis techniques using probing radiation are excellent tools for identifying the materials present in kidney stones. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different material detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental setup was developed and utilized for XRF and NAA analysis of the kidney stones. To verify the experimental results against analytical calculations, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, Zr, and by neutron activation analysis (NAA): Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, Zn. This thesis presents a new approach for accurate detection of the material composition of kidney stone materials using the XRF and NAA instrumental activation analysis techniques.
Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan
2018-04-23
Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these depend strongly on comprehensive analysis of the chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. The application of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, is reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants, due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research aimed at both their empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.
Techniques for the analysis of data from coded-mask X-ray telescopes
NASA Technical Reports Server (NTRS)
Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.
1987-01-01
Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed, and ways of using FFT techniques to perform the deconvolution are considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
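A toy 1-D sketch of FFT-based decoding for a coded-mask system: the sky estimate is the circular cross-correlation of the detector image with a zero-mean decoding array, computed in the Fourier domain (the mask and sources are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
mask = rng.integers(0, 2, 64).astype(float)   # stand-in open/closed mask
sky = np.zeros(64)
sky[[10, 40]] = [5.0, 3.0]                    # two point sources

# Detector counts: circular convolution of sky with mask, plus noise.
detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))
detector += 0.05 * rng.standard_normal(64)

# Decoding: correlate with the balanced (zero-mean) mask via FFT.
decode = mask - mask.mean()
sky_hat = np.real(np.fft.ifft(np.fft.fft(detector)
                              * np.conj(np.fft.fft(decode))))
print(np.argsort(sky_hat)[-2:])               # peaks near positions 10, 40
```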
A histogram-based technique for rapid vector extraction from PIV photographs
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.
1991-01-01
A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.
Three Techniques for Task Analysis: Examples from the Nuclear Utilities.
ERIC Educational Resources Information Center
Carlisle, Kenneth E.
1984-01-01
Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
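A minimal sketch of the Monte Carlo approach the authors promote, propagating calibration and measurement uncertainty through a linear dose-response y = a + bD to obtain a dose distribution and interval; all distributions and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
a = rng.normal(1.0, 0.1, n)    # calibration intercept and its uncertainty
b = rng.normal(2.5, 0.2, n)    # calibration slope and its uncertainty
y = rng.normal(6.0, 0.4, n)    # measured biological/physical signal

dose = (y - a) / b             # invert the dose-response for each draw
lo, hi = np.percentile(dose, [2.5, 97.5])
print(f"dose = {dose.mean():.2f} Gy, 95% interval [{lo:.2f}, {hi:.2f}] Gy")
```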
Predicting Effective Course Conduction Strategy Using Datamining Techniques
ERIC Educational Resources Information Center
Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.
2017-01-01
Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the analysis' results, it is recommended that suggestions be provided to decision making authorities. The data mining techniques can be used in educational domain to improve the outcome of the educational sectors. The authors carried out this research…
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique
Tipper, J.C.
1979-01-01
Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60 and determined to be computationally competitive with most other methods. © 1979.
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit the minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases sample manipulation. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Sample preparation techniques for the determination of trace residues and contaminants in foods.
Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M
2007-06-15
The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.
Use of different spectroscopic techniques in the analysis of Roman age wall paintings.
Agnoli, Francesca; Calliari, Irene; Mazzocchin, Gian-Antonio
2007-01-01
In this paper the analysis of samples of Roman age wall paintings from Pordenone, Vicenza, and Verona is carried out using three different techniques: energy dispersive X-ray spectroscopy (EDS), X-ray fluorescence (XRF), and proton induced X-ray emission (PIXE). The features of the three spectroscopic techniques in the analysis of samples of archaeological interest are discussed. The pigments studied were: cinnabar, yellow ochre, green earth, Egyptian blue, and carbon black.
Current status of the real-time processing of complex radar signatures
NASA Astrophysics Data System (ADS)
Clay, E.
The real-time processing technique developed by ONERA to characterize radar signatures at the Brahms station is described. This technique is used for the real-time analysis of the RCS of airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys. Using this technique, it is also possible to optimize the experimental parameters, i.e., the analysis band, the microwave-network gain, and the electromagnetic window of the analysis.
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
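A small sketch of one of the numerical indicators mentioned above, the Modal Assurance Criterion MAC(i,j) = |phi_i^T phi_j|^2 / ((phi_i^T phi_i)(phi_j^T phi_j)), used here to recover a known mode permutation between two synthetic models (the mode shapes are random stand-ins, not from the paper):

```python
import numpy as np

def mac_matrix(phi_a, phi_b):
    # Element (i, j) is the MAC between mode i of model A and mode j of B.
    num = (phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a**2, axis=0), np.sum(phi_b**2, axis=0))
    return num / den

rng = np.random.default_rng(4)
phi_a = rng.standard_normal((50, 6))               # 6 baseline mode shapes
# Comparison model: reordered modes plus small perturbations.
phi_b = phi_a[:, [1, 0, 2, 4, 3, 5]] + 0.05 * rng.standard_normal((50, 6))

mac = mac_matrix(phi_a, phi_b)
pairs = mac.argmax(axis=1)        # best-matching comparison mode per baseline mode
print(pairs)                       # recovers the permutation [1 0 2 4 3 5]
```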
Evidential Reasoning in Expert Systems for Image Analysis.
1985-02-01
...techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the... (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify... approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths...
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
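For context, a sketch of the conventional moment-based baseline such work improves on: estimating a fibre's in-plane orientation from the second-order image moments of its elliptical cross-section in a binary section image (the synthetic ellipse provides a quick check; this is not the paper's improved technique):

```python
import numpy as np

def fiber_orientation(binary):
    """Return the major-axis angle (radians) of the white region."""
    ys, xs = np.nonzero(binary)
    x, y = xs - xs.mean(), ys - ys.mean()
    mu20, mu02, mu11 = np.mean(x * x), np.mean(y * y), np.mean(x * y)
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

# Synthetic elliptical cross-section at a known angle for a quick check.
yy, xx = np.mgrid[0:64, 0:64]
theta = np.deg2rad(30.0)
xr = (xx - 32) * np.cos(theta) + (yy - 32) * np.sin(theta)
yr = -(xx - 32) * np.sin(theta) + (yy - 32) * np.cos(theta)
ellipse = (xr / 20.0) ** 2 + (yr / 8.0) ** 2 <= 1.0
print(np.rad2deg(fiber_orientation(ellipse)))      # ~30 degrees
```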
Earth rotation, station coordinates and orbit determination from satellite laser ranging
NASA Astrophysics Data System (ADS)
Murata, Masaaki
The Project MERIT, a special program of international collaboration to Monitor Earth Rotation and Intercompare the Techniques of observation and analysis, has come to an end with great success. Its major objective was to evaluate the ultimate potential of space techniques such as VLBI and satellite laser ranging, in contrast with the other, conventional techniques, in determining the rotational dynamics of the earth. The National Aerospace Laboratory (NAL) officially participated in the project as an associate analysis center for the satellite laser technique for the period of the MERIT Main Campaign (September 1983-October 1984). In this paper, the NAL analysis center results are presented.
NASA Astrophysics Data System (ADS)
Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya
2018-07-01
In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, a HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated how the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.
Techniques for Analysis of DSN 64-meter Antenna Azimuth Bearing Film Height Records
NASA Technical Reports Server (NTRS)
Stevens, R.; Quach, C. T.
1983-01-01
The DSN 64-m antennas use oil pad azimuth thrust bearings. Instrumentation on the bearing pads measures the height of the oil film between the pad and the bearing runner. Techniques to analyze the film height record are developed and discussed. The analysis techniques present the unwieldy data in a compact form for assessment of bearing condition. The techniques are illustrated by analysis of a small sample of film height records from each of the three 64-m antennas. The results show the general condition of the bearings of DSS 43 and DSS 63 to be good to excellent, and that of DSS 14 to be marginal.
Some failure modes and analysis techniques for terrestrial solar cell modules
NASA Technical Reports Server (NTRS)
Shumka, A.; Stern, K. H.
1978-01-01
Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of the paper focuses on the modeling method developed and utilized during this analysis effort.
Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.
Sisco, Edward; Dake, Jeffrey; Bridge, Candice
2013-10-10
Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A review of intelligent systems for heart sound signal analysis.
Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S
2017-10-01
Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques and their state of the art in phonocardiogram (PCG) signal analysis. The published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that CAD systems for PCG signal analysis are still an open problem. Related studies are compared in terms of their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.
A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
2016-07-18
This paper describes a framework of incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
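As a hedged illustration of the "smart sampling" ingredient, the sketch below pushes Latin hypercube samples of wind and load through a stand-in post-contingency line-flow function to estimate a violation probability; the flow model, bounds, and limit are invented, not taken from the paper's framework:

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=7)
# Scale unit-cube samples: wind in [0, 300] MW, load in [80, 120] MW.
sample = qmc.scale(sampler.random(n=500), [0.0, 80.0], [300.0, 120.0])
wind, load = sample[:, 0], sample[:, 1]

def post_contingency_flow(wind_mw, load_mw):
    """Stand-in transfer function: monitored-line flow after an outage."""
    return 0.6 * load_mw - 0.4 * wind_mw + 45.0

flows = post_contingency_flow(wind, load)
print(f"P(flow > 100 MW) ~ {np.mean(flows > 100.0):.3f}")
```

The stratified samples cover the uncertainty space more evenly than plain Monte Carlo, which is what lets such frameworks run ahead of real time.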
K-Fold Crossvalidation in Canonical Analysis.
ERIC Educational Resources Information Center
Liang, Kun-Hsia; And Others
1995-01-01
A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
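A brief sketch of the idea, assuming scikit-learn: fit canonical correlation analysis on training folds and correlate the held-out canonical variates, so the gap between in-sample and cross-validated correlations exposes sample-specific inflation (the data are synthetic):

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import KFold

rng = np.random.default_rng(5)
n = 200
shared = rng.standard_normal((n, 1))        # latent signal linking X and Y
X = shared @ rng.standard_normal((1, 5)) + rng.standard_normal((n, 5))
Y = shared @ rng.standard_normal((1, 4)) + rng.standard_normal((n, 4))

cv_corrs = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    cca = CCA(n_components=1).fit(X[train], Y[train])
    u, v = cca.transform(X[test], Y[test])   # held-out canonical variates
    cv_corrs.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
print(f"cross-validated canonical correlation: {np.mean(cv_corrs):.3f}")
```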
Automated analysis and classification of melanocytic tumor on skin whole slide images.
Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal
2018-06-01
This paper presents a computer-aided technique for automated analysis and classification of melanocytic tumor on skin whole slide biopsy images. The proposed technique consists of four main modules. First, skin epidermis and dermis regions are segmented by a multi-resolution framework. Next, epidermis analysis is performed, where a set of epidermis features reflecting nuclear morphologies and spatial distributions is computed. In parallel with epidermis analysis, dermis analysis is also performed, where dermal cell nuclei are segmented and a set of textural and cytological features are computed. Finally, the skin melanocytic image is classified into different categories such as melanoma, nevus or normal tissue by using a multi-class support vector machine (mSVM) with extracted epidermis and dermis features. Experimental results on 66 skin whole slide images indicate that the proposed technique achieves more than 95% classification accuracy, which suggests that the technique has the potential to be used for assisting pathologists on skin biopsy image analysis and classification. Copyright © 2018 Elsevier Ltd. All rights reserved.
Bonetti, Jennifer; Quarino, Lawrence
2014-05-01
This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
Development of a sensitivity analysis technique for multiloop flight control systems
NASA Technical Reports Server (NTRS)
Vaillard, A. H.; Paduano, J.; Downing, D. R.
1985-01-01
This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system, with variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, the relative stability criteria based on the concept of singular values were explored.
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Yang, Jyh-Bin
2017-10-01
The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence; more information used in delay analysis usually produces more accurate and fairer analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As Building Information Modeling (BIM) techniques have developed rapidly, using BIM and 4D simulation techniques has been proposed and implemented, with obvious benefits achieved especially in identifying and solving construction sequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by BIM techniques. The results could serve as a foundation for developing new approaches to resolving schedule delay disputes.
NASA Astrophysics Data System (ADS)
Tolstikov, Vladimir V.
Metabolic analysis aims to cover all of the detectable components in a sample, rather than analyzing each individual metabolite at a given time. Targeted and/or nontargeted approaches are applied as needed for particular experiments. Monitoring hundreds or more metabolites at a time requires high-throughput, high-end techniques that enable screening for relative changes in, rather than absolute concentrations of, compounds within a wide dynamic range. Most of the analytical techniques useful for these purposes use GC or HPLC/UPLC separation modules coupled to a fast and accurate mass spectrometer. GC separations require chemical modification (derivatization) before analysis and work efficiently for small molecules. HPLC separations are better suited for the analysis of labile and nonvolatile polar and nonpolar compounds in their native form. Direct infusion and NMR-based techniques are mostly used for fingerprinting and snap phenotyping, where applicable. Discovery and validation of metabolic biomarkers are exciting and promising opportunities offered by metabolic analysis applied to biological and biomedical experiments. We have demonstrated that the GC-TOF-MS, HPLC/UPLC-RP-MS and HILIC-LC-MS techniques used for metabolic analysis offer sufficient metabolome mapping, providing researchers with confident data for subsequent multivariate analysis and data mining.
Integrative sparse principal component analysis of gene expression data.
Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge
2017-12-01
In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
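As a single-dataset illustration (scikit-learn's SparsePCA with an L1 penalty standing in for iSPCA's grouped, contrasted penalties across multiple datasets), the sketch below contrasts dense and sparse loadings on a synthetic "small n, large p" expression matrix:

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(6)
n, p = 40, 200                                      # "small n, large p"
signal = rng.standard_normal((n, 1))
X = np.hstack([signal @ rng.standard_normal((1, 10)),   # 10 informative genes
               rng.standard_normal((n, p - 10))])       # 190 noise genes

dense = PCA(n_components=1).fit(X)
sparse = SparsePCA(n_components=1, alpha=2.0, random_state=0).fit(X)
# Sparse loadings concentrate on the informative genes; dense ones do not.
print("nonzero loadings (PCA):      ", int(np.sum(dense.components_ != 0)))
print("nonzero loadings (SparsePCA):",
      int(np.sum(np.abs(sparse.components_) > 1e-8)))
```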
Error analysis of multi-needle Langmuir probe measurement technique.
Barjatya, Aroh; Merritt, William
2018-04-01
Multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.
Recent Electrochemical and Optical Sensors in Flow-Based Analysis
Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn
2006-01-01
Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.
[Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].
Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia
2008-07-01
Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirements of rapid and on-line analysis because of the complex pretreatment and experience they demand. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits, and discrimination of the geographical origins of herbal medicines, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflectance spectroscopy and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.
Safety analysis in test facility design
NASA Astrophysics Data System (ADS)
Valk, A.; Jonker, R. J.
1990-09-01
The application of safety analysis techniques as developed in, for example, the nuclear and petrochemical industries can be very beneficial in coping with the increasing complexity of modern test facility installations and their operations. To illustrate the various techniques available and their phasing within a project, an overview of the most commonly used techniques is presented. Two case studies are described: the hazard and operability study technique, and safety zoning in relation to the possible presence of asphyxiating atmospheres.
NASA Technical Reports Server (NTRS)
Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.
1981-01-01
A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.
Electrolytic preconcentration in instrumental analysis.
Sioda, R E; Batley, G E; Lund, W; Wang, J; Leach, S C
1986-05-01
The use of electrolytic deposition as a separation and preconcentration step in trace metal analysis is reviewed. Both the principles and applications of the technique are dealt with in some detail. Electrolytic preconcentration can be combined with a variety of instrumental techniques. Special attention is given to stripping voltammetry, potentiometric stripping analysis, different combinations with atomic-absorption spectrometry, and the use of flow-through porous electrodes. It is pointed out that the electrolytic preconcentration technique deserves more extensive use as well as fundamental investigation.
A microhistological technique for analysis of food habits of mycophagous rodents.
Patrick W. McIntire; Andrew B. Carey
1989-01-01
We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...
NASA Technical Reports Server (NTRS)
Zamora, M. A.
1977-01-01
Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: consumables analysis support techniques to crew training simulator for advanced spacecraft programs, and the applicability of the above techniques to the crew training simulator for the space shuttle program in particular.
USDA-ARS?s Scientific Manuscript database
Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1975-01-01
Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.
Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G
2015-01-01
Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique had a significant influence.
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development processes interact with each other at different phases through the development life cycle. (2) Reviews are emphasized in both system and software development (Fig. 1.3); for some reviews (e.g., SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Lo, Shih-Jie; Yao, Da-Jeng
2015-07-23
This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidic (e.g., little sample amount required, high-throughput performance) make this tool suitable to answer and to solve biological questions of interest about a single cell. This review aims to introduce microfluidic related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced; the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in analysis of a single cell.
Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.
Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko
2017-11-01
A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within the granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability. For example, the tablet weight variation of GARC granules was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates for mean granule sizes. Overall, SA was a time-consuming but accurate technique that provided reliable information for the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution, but nevertheless yielded acceptable estimates for mean particle size. In general, FS was two to three orders of magnitude faster than SA.
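To make the sieve-analysis step concrete, a generic mass-weighted geometric mean computation is sketched below (a common convention, not necessarily the authors' exact protocol; the apertures and masses are illustrative):

```python
import numpy as np

def geometric_mean_size(sieve_openings_um, retained_mass_g):
    """Mass-weighted geometric mean granule size from sieve analysis.
    Sieves are listed with descending apertures; the fraction retained
    on each lower sieve is assigned the geometric midpoint of that
    sieve's aperture and the one above it."""
    d = np.asarray(sieve_openings_um, dtype=float)
    m = np.asarray(retained_mass_g, dtype=float)
    mid = np.sqrt(d[:-1] * d[1:])   # geometric midpoints between sieves
    w = m[1:]                       # mass retained on each lower sieve
    return np.exp(np.sum(w * np.log(mid)) / np.sum(w))

# Apertures in micrometres and retained masses in grams (illustrative)
print(geometric_mean_size([1400, 1000, 710, 500, 355, 250],
                          [0.0, 2.1, 5.4, 7.8, 4.2, 1.5]))
```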
Transgender Phonosurgery: A Systematic Review and Meta-analysis.
Song, Tara Elena; Jiang, Nancy
2017-05-01
Objectives: Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources: CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods: A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results: Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions: Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.
Analysis of thin plates with holes by using exact geometrical representation within XFEM.
Perumal, Logah; Tso, C P; Leng, Lim Thong
2016-05-01
This paper presents analysis of thin plates with holes within the context of XFEM. New integration techniques are developed for exact geometrical representation of the holes. Numerical and exact integration techniques are presented, with some limitations noted for the exact integration technique. Simulation results show that the proposed techniques help to reduce the solution error, due to the exact geometrical representation of the holes and the use of appropriate quadrature rules. A discussion of the minimum integration order needed to achieve good accuracy and convergence for the techniques presented in this work is also included.
A combination of selected mapping and clipping to increase energy efficiency of OFDM systems
Lee, Byung Moo; Rim, You Seung
2017-01-01
We propose an energy efficient combination design for OFDM systems based on selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and show the related energy efficiency (EE) performance analysis. The combination of two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it can take advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former technique is quite simple and effective, and the latter technique does not cause any signal distortion. We provide the structure and the systematic operating method, and show the various analyses used to derive the EE gain based on the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to only using the SLM technique.
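A compact sketch of the combined scheme under simplified assumptions (QPSK subcarriers, random phase sequences, magnitude clipping at a fixed margin above the RMS level; all parameters are illustrative, not the paper's design values): SLM first selects the lowest-PAPR candidate, then clipping limits the residual peaks.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm_then_clip(X, n_candidates=8, clip_margin_db=5.0):
    """X: frequency-domain QPSK subcarrier vector. SLM picks the random
    phase rotation whose time-domain signal has the lowest PAPR; the
    winner is then clipped at clip_margin_db above its RMS magnitude."""
    best, best_papr = None, np.inf
    for _ in range(n_candidates):
        phases = np.exp(2j * np.pi * rng.random(X.size))
        x = np.fft.ifft(X * phases)
        if papr_db(x) < best_papr:
            best, best_papr = x, papr_db(x)
    rms = np.sqrt(np.mean(np.abs(best) ** 2))
    a_max = rms * 10 ** (clip_margin_db / 20)
    mag = np.abs(best)
    clipped = np.where(mag > a_max, best / (mag + 1e-12) * a_max, best)
    return best_papr, papr_db(clipped)

X = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], size=256) / np.sqrt(2)
p_slm, p_both = slm_then_clip(X)
print(f"PAPR after SLM: {p_slm:.2f} dB; after SLM + clipping: {p_both:.2f} dB")
```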
GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration
Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng
2015-01-01
The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore, analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation), we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs.
Non-destructive evaluation techniques, high temperature ceramic component parts for gas turbines
NASA Technical Reports Server (NTRS)
Reiter, H.; Hirsekorn, S.; Lottermoser, J.; Goebbels, K.
1984-01-01
This report concerns studies of various nondestructive tests, i.e., tests conducted on material without destroying it. Tests included microradiographic techniques, vibration analysis, high-frequency ultrasonic tests with evaluation of defects and structure through analysis of ultrasonic scattering data, microwave tests, and analysis of sound emission.
Image Analysis, Microscopic, and Spectrochemical Study of the PVC Dry Blending Process,
The dry blending process used in the production of electrical-grade PVC formulations has been studied using a combination of image analysis, microscopic...by image analysis techniques. Optical and scanning electron microscopy were used to assess morphological differences. Spectrochemical techniques were used to indicate chemical changes.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
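A schematic rendering of the two-part composite scheme (the first pass below is a simplified distance-threshold stand-in for the sequential variance analysis; the threshold is an analyst choice):

```python
import numpy as np

def sequential_seed_clusters(X, threshold):
    """Pass 1: scan samples in order, starting a new cluster whenever a
    sample is farther than `threshold` from every existing cluster mean."""
    means, counts = [X[0].astype(float)], [1]
    for x in X[1:]:
        d = [np.linalg.norm(x - m) for m in means]
        j = int(np.argmin(d))
        if d[j] <= threshold:
            counts[j] += 1
            means[j] += (x - means[j]) / counts[j]  # running mean update
        else:
            means.append(x.astype(float))
            counts.append(1)
    return np.array(means)

def kmeans_refine(X, means, n_iter=20):
    """Pass 2: generalized K-means iterations seeded by pass 1's output."""
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(means)):
            if np.any(labels == k):
                means[k] = X[labels == k].mean(axis=0)
    return labels, means
```

As in the composite scheme, the output of the first stage serves only to initialize the iterative second stage.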
Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laskin, Julia; Lanekoff, Ingela
2015-11-13
Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported with quantitative parameters. The results also produce preliminary evidence that these techniques can be used as a measure of the physiological impact on subjects performing meditation.
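Of the two techniques, the visibility network construction is compact enough to sketch in full, using the standard natural visibility criterion (nodes i < j are linked if every intermediate sample lies strictly below the line of sight between them); the heart-rate series below is illustrative:

```python
import numpy as np

def visibility_graph(y):
    """Natural visibility graph of a time series: node pair (i, j) is
    linked if all samples between them lie below the connecting line."""
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            if np.all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)):
                edges.append((i, j))
    return edges

# Mean degree as one simple descriptor of heart-rate complexity
hr = np.array([72, 70, 74, 71, 69, 73, 75, 70], dtype=float)
e = visibility_graph(hr)
print(f"{len(e)} edges, mean degree {2 * len(e) / len(hr):.2f}")
```

Network measures (degree distribution, clustering) computed on such graphs are what allow meditation and pre-meditation epochs to be compared quantitatively.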
A strategy for selecting data mining techniques in metabolomics.
Banimustafa, Ahmed Hmaidan; Hardy, Nigel W
2012-01-01
There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.
Spectral Analysis and Experimental Modeling of Ice Accretion Roughness
NASA Technical Reports Server (NTRS)
Orr, D. J.; Breuer, K. S.; Torres, B. E.; Hansman, R. J., Jr.
1996-01-01
A self-consistent scheme for relating wind tunnel ice accretion roughness to the resulting enhancement of heat transfer is described. First, a spectral technique of quantitative analysis of early ice roughness images is reviewed. The image processing scheme uses a spectral estimation technique (SET) which extracts physically descriptive parameters by comparing scan lines from the experimentally-obtained accretion images to a prescribed test function. Analysis using this technique for both streamwise and spanwise directions of data from the NASA Lewis Icing Research Tunnel (IRT) are presented. An experimental technique is then presented for constructing physical roughness models suitable for wind tunnel testing that match the SET parameters extracted from the IRT images. The icing castings and modeled roughness are tested for enhancement of boundary layer heat transfer using infrared techniques in a "dry" wind tunnel.
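A rough sketch of the scan-line spectral comparison idea (the paper's exact test function is not reproduced here; a Gaussian form and the initial parameter values are assumptions for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def scanline_spectrum(line):
    """One-sided power spectrum of a single detrended image scan line."""
    line = line - line.mean()
    psd = np.abs(np.fft.rfft(line)) ** 2 / len(line)
    freq = np.fft.rfftfreq(len(line))
    return freq[1:], psd[1:]            # drop the DC bin

def test_function(f, amp, f0):
    """Prescribed test function (assumed Gaussian here); the fitted
    parameters stand in for physically descriptive roughness parameters."""
    return amp * np.exp(-(f / f0) ** 2)

def set_parameters(image):
    """Fit the test function to every scan line and average the results."""
    params = [curve_fit(test_function, *scanline_spectrum(row),
                        p0=(1.0, 0.1), maxfev=5000)[0]
              for row in image]
    return np.mean(params, axis=0)
```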
Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan
2018-02-05
Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts like noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, while technique feature analysis was a good predictor of pretest-to-posttest score change but not of during-task activity. The implications of the results are discussed in light of preparing vocabulary tasks.
Directed Incremental Symbolic Execution
NASA Technical Reports Server (NTRS)
Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz
2011-01-01
The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
NASA Astrophysics Data System (ADS)
Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław
Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-mass compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.
Application of phyto-indication and radiocesium indicative methods for microrelief mapping
NASA Astrophysics Data System (ADS)
Panidi, E.; Trofimetz, L.; Sokolova, J.
2016-04-01
Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
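A minimal sketch of the Box-Cox idea in a nonlinear setting, using the transform-both-sides approach around a log-logistic concentration-response model (the model form, data, and lambda value are illustrative; in practice lambda is typically chosen by profile likelihood):

```python
import numpy as np
from scipy.optimize import least_squares

def boxcox(y, lam):
    """Box-Cox transform; lam = 0 reduces to the natural logarithm."""
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def loglogistic(x, lower, upper, ec50, slope):
    return lower + (upper - lower) / (1.0 + (x / ec50) ** slope)

def fit_tbs(x, y, lam, theta0):
    """Transform both sides: the same Box-Cox transform is applied to the
    response and to the model prediction, stabilizing the variance
    without distorting the concentration-response shape."""
    def resid(theta):
        return boxcox(y, lam) - boxcox(loglogistic(x, *theta), lam)
    return least_squares(resid, theta0, bounds=([1e-6] * 4, np.inf)).x

# Illustrative (hypothetical) growth-inhibition data
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = np.array([98.0, 95.0, 80.0, 45.0, 12.0, 3.0])
print(fit_tbs(conc, resp, lam=0.5, theta0=[1.0, 100.0, 2.0, 1.0]))
```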
ERIC Educational Resources Information Center
Moriarty, Dick; Zarebski, John
This paper delineates the exact methodology developed by the Sports Institute for Research/Change Agent Research (SIR/CAR) for applying a systems analysis technique to a voluntary mutual benefit organization, such as a school or amateur athletic group. The functions of the technique are to compare avowed and actual behavior, to utilize group…
Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane
2008-02-01
MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the present peptides in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance, and ordinary least squares regression to examine the association of dentists' characteristics with the number of communication techniques used; the significance level was set at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: professional education is needed, both in dental school curricula and continuing education courses, to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Large space antennas: A systems analysis case history
NASA Technical Reports Server (NTRS)
Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)
1987-01-01
The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
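The multiple headspace extraction step rests on a geometric-series extrapolation that is easy to sketch: successive extraction peak areas decay geometrically, so the total (exhaustive) area can be computed from a few extractions. The peak areas below are illustrative; the paper's contribution is pairing this extrapolation with full evaporation.

```python
import numpy as np

def mhe_total_area(areas):
    """Multiple headspace extraction: peak areas follow
    A_i = A_1 * q**(i-1), so the exhaustive total is the geometric-series
    sum A_1 / (1 - q), with q estimated from a fit of ln(A_i) vs. i."""
    areas = np.asarray(areas, dtype=float)
    i = np.arange(areas.size)
    slope = np.polyfit(i, np.log(areas), 1)[0]
    q = np.exp(slope)
    return areas[0] / (1.0 - q)

# Three successive extractions (illustrative areas): q ~ 0.62
print(mhe_total_area([1000.0, 620.0, 384.0]))   # -> roughly 2630
```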
EPA Method 245.2: Mercury (Automated Cold Vapor Technique)
Method 245.2 describes procedures for preparation and analysis of drinking water samples for analysis of mercury using acid digestion and cold vapor atomic absorption. Samples are prepared using an acid digestion technique.
Light stable isotope analysis of meteorites by ion microprobe
NASA Technical Reports Server (NTRS)
Mcsween, Harry Y., Jr.
1994-01-01
The main goal was to develop the necessary secondary ion mass spectrometer (SIMS) techniques to use a Cameca ims-4f ion microprobe to measure light stable isotope ratios (H, C, O and S) in situ and in non-conducting mineral phases. The intended application of these techniques was the analysis of meteorite samples, although the techniques that have been developed are equally applicable to the investigation of terrestrial samples. The first year established techniques for the analysis of O isotope ratios (delta O-18 and delta O-17) in conducting mineral phases and the measurement of S isotope ratios (delta S-34) in a variety of sulphide phases. In addition, a technique was developed to measure delta S-34 values in sulphates, which are insulators. Other research undertaken in the first year resulted in SIMS techniques for the measurement of wide variety of trace elements in carbonate minerals, with the aim of understanding the nature of alteration fluids in carbonaceous chondrites. In the second year we developed techniques for analyzing O isotope ratios in nonconducting mineral phases. These methods are potentially applicable to the measurement of other light stable isotopes such as H, C and S in insulators. Also, we have further explored the analytical techniques used for the analysis of S isotopes in sulphides by analyzing troilite in a number of L and H ordinary chondrites. This was done to see if there was any systematic differences with petrological type.
NASA Astrophysics Data System (ADS)
Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho
2013-10-01
Cost-effective and time-efficient analytical techniques are required to screen large food lots in accordance with their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper gave intermediate results (700-5000 PCs), while paprika showed low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used, considering their potential applications, for the screening analysis of irradiated foods.
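The PSL screening logic reduces to the photon-count (PC) thresholds quoted above; a direct transcription:

```python
def classify_psl(photon_counts):
    """Screening decision from PSL photon counts, using the thresholds
    in the study: <700 PCs negative, >5000 PCs positive, otherwise
    intermediate and flagged for confirmatory (e.g., TL) analysis."""
    if photon_counts < 700:
        return "negative (not irradiated)"
    if photon_counts > 5000:
        return "positive (irradiated)"
    return "intermediate - confirm by thermoluminescence"

for pc in (250, 2400, 81000):   # illustrative counts
    print(pc, "->", classify_psl(pc))
```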
NASA Astrophysics Data System (ADS)
Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.
1993-06-01
Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique, problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface. An obvious choice was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparative method. For INAA, the rest of the sample was then irradiated with thermal neutrons and element concentrations were determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.
Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.
Kemp, Pavlina S; VanderVeen, Deborah K
2016-01-01
The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
Automated thermal mapping techniques using chromatic image analysis
NASA Technical Reports Server (NTRS)
Buck, Gregory M.
1989-01-01
Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
Using Machine Learning Techniques in the Analysis of Oceanographic Data
NASA Astrophysics Data System (ADS)
Falcinelli, K. E.; Abuomar, S.
2017-12-01
Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
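Of the named techniques, fuzzy c-means is compact enough to sketch in full (a generic numpy implementation, not the authors' code; the cluster count `c` and fuzzifier `m` are analyst choices):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means. X: (n_profiles, n_features) array of ADCP
    current profiles; returns cluster centers and the soft membership
    matrix U of shape (n_profiles, c)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))   # random initial memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)  # standard FCM update
    return centers, U
```

Because memberships are soft, a profile straddling two flow regimes can belong partly to both clusters, which is one reason fuzzy clustering suits oceanographic records.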
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational
Developing techniques for cause-responsibility analysis of occupational accidents.
Jabbari, Mousa; Ghorbani, Roghayeh
2016-11-01
The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. This study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accident tree analysis (OATA) and occupational accident components analysis (OACA), used in parallel for the determination of responsible groups and their shares of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.
A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.
Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir
2017-06-01
This work concerns 2D crystal growth studies of L-ascorbic acid using the composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed into composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background. The crystal boundaries were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of composite images were in good agreement (correlation coefficient "r" = 0.99) with the lengths measured by standard microscopy. In contrast, the lengths measured by laser diffraction were poorly correlated with both techniques. Therefore, composite image analysis can replace the standard microscopy technique for crystal growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.
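A sketch of the peak-based boundary measurement described here (the peak height, spacing, and pixel calibration values are assumptions, not the authors' settings):

```python
import numpy as np
from scipy.signal import find_peaks

def boundary_positions(intensity_row, min_height=128, min_dist=5):
    """Boundaries in a processed composite image appear as white lines on
    a black background, i.e., as peaks in a row's intensity graph."""
    peaks, _ = find_peaks(intensity_row, height=min_height, distance=min_dist)
    return peaks

def boundary_length_um(peaks, pixel_size_um):
    """Crystal length between the outermost boundary peaks, given the
    image calibration in micrometres per pixel (an assumed input)."""
    return (peaks[-1] - peaks[0]) * pixel_size_um
```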
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background: NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is ongoing. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a human factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods: Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results: At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques were found at the substep level of the hierarchy and in the instruments being used. The timeline analysis showed an increase in the time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion: As pure NOTES is extremely difficult given the current state of instrumentation design, most surgeons utilize different hybrid methods: combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods for performing cholecystectomy, with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks, leading to an increase in surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, and was used to construct a clear roadmap for training and technology development.
ADP of multispectral scanner data for land use mapping
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1971-01-01
The advantages and disadvantages of various remote sensing instrumentation and analysis techniques are reviewed. The use of multispectral scanner data and the automatic data processing techniques are considered. A computer-aided analysis system for remote sensor data is described with emphasis on the image display, statistics processor, wavelength band selection, classification processor, and results display. Advanced techniques in using spectral and temporal data are also considered.
Solid State Audio/Speech Processor Analysis.
1980-03-01
techniques. The techniques were demonstrated to be worthwhile in an efficient real-time AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD
Estimating Mass of Inflatable Aerodynamic Decelerators Using Dimensionless Parameters
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2011-01-01
This paper describes a technique for estimating mass for inflatable aerodynamic decelerators. The technique uses dimensional analysis to identify a set of dimensionless parameters for inflation pressure, mass of inflation gas, and mass of flexible material. The dimensionless parameters enable scaling of an inflatable concept with geometry parameters (e.g., diameter), environmental conditions (e.g., dynamic pressure), inflation gas properties (e.g., molecular mass), and mass growth allowance. This technique is applicable for attached (e.g., tension cone, hypercone, and stacked toroid) and trailing inflatable aerodynamic decelerators. The technique uses simple engineering approximations that were developed by NASA in the 1960s and 1970s, as well as some recent important developments. The NASA Mars Entry and Descent Landing System Analysis (EDL-SA) project used this technique to estimate the masses of the inflatable concepts that were used in the analysis. The EDL-SA results compared well with two independent sets of high-fidelity finite element analyses.
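To illustrate the flavor of the dimensionless scaling (these are not the paper's actual relations, which also cover flexible-material mass and mass growth allowance): if inflation pressure is scaled to the dynamic pressure by a dimensionless coefficient, the required inflation gas mass follows from the ideal gas law.

```python
R_UNIV = 8.314462618  # universal gas constant [J/(mol*K)]

def inflation_gas_mass(c_p, q_dyn, volume, molar_mass, gas_temp):
    """Assumed scaling for illustration: inflation pressure proportional
    to dynamic pressure, p = c_p * q_dyn; gas mass from m = p V M / (R T),
    so mass scales with geometry (V ~ diameter cubed), environment (q),
    and gas properties (M, T), mirroring the dimensionless-parameter idea."""
    p = c_p * q_dyn                               # [Pa]
    return p * volume * molar_mass / (R_UNIV * gas_temp)

# 10 m class torus (~40 m^3), helium, entry dynamic pressure ~1500 Pa
m = inflation_gas_mass(c_p=3.0, q_dyn=1500.0, volume=40.0,
                       molar_mass=4.0e-3, gas_temp=250.0)
print(f"inflation gas mass ~ {m:.2f} kg")
```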
S-192 analysis: Conventional and special data processing techniques. [Michigan
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.
1975-01-01
The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and the SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved-object processing techniques were disappointing, due at least in part to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.
Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Existing techniques judged able to meet, or be adapted to meet, the requirements were selected; refinements or changes were recommended to improve others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
Rasch Analysis for Instrument Development: Why, When, and How?
Boone, William J.
2016-01-01
This essay describes Rasch analysis psychometric techniques and how such techniques can be used by life sciences education researchers to guide the development and use of surveys and tests. Specifically, Rasch techniques can be used to document and evaluate the measurement functioning of such instruments. Rasch techniques also allow researchers to construct “Wright maps” to explain the meaning of a test score or survey score and develop alternative forms of tests and surveys. Rasch techniques provide a mechanism by which the quality of life sciences–related tests and surveys can be optimized and the techniques can be used to provide a context (e.g., what topics a student has mastered) when explaining test and survey results. PMID:27856555
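For orientation, the dichotomous Rasch model underlying these techniques gives the probability that person n succeeds on item i in terms of the person ability θ_n and the item difficulty b_i, both expressed on the same logit scale (which is exactly what a Wright map plots side by side):

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}
```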
Analysis of Synthetic Polymers.
ERIC Educational Resources Information Center
Smith, Charles G.; And Others
1989-01-01
Reviews techniques for the characterization and analysis of synthetic polymers, copolymers, and blends. Includes techniques for structure determination, separation, and quantitation of additives and residual monomers; determination of molecular weight; and the study of thermal properties including degradation mechanisms. (MVL)
Fabrication Techniques and Principles for Flat Plate Antennas
DOT National Transportation Integrated Search
1973-09-01
The report documents the fabrication techniques and principles selected to produce one and ten million flat plate antennas per year. An engineering analysis of the reliability, electrical integrity, and repeatability is made, and a cost analysis summ...
NASA Astrophysics Data System (ADS)
Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.
2013-07-01
The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in guaranteeing its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in evaluating the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a nontrivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry. The AM technique instead produces scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. We report here an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
Moini, Mehdi
2018-05-01
In the past few years, there has been a significant effort by the forensic science community to develop new scientific techniques for the analysis of forensic evidence. Forensic chemists have spearheaded the development of information-rich confirmatory technologies and techniques and applied them to a broad array of forensic challenges. The purpose of these confirmatory techniques is to provide alternatives to presumptive techniques that rely on data such as color changes, pattern matching, or retention time alone, which are prone to more false positives. To this end, the application of separation techniques in conjunction with mass spectrometry has played an important role in the analysis of forensic evidence. Moreover, in the past few years the role of liquid separation techniques, such as liquid chromatography and capillary electrophoresis in conjunction with mass spectrometry, has gained significant traction, and these techniques have been applied to a wide range of chemicals, from small molecules such as drugs and explosives to large molecules such as proteins. For example, proteomics and peptidomics have been used for identification of humans, organs, and bodily fluids. A wide range of HPLC techniques, including reversed phase, hydrophilic interaction, mixed-mode, supercritical fluid, multidimensional chromatography, and nanoLC, as well as several modes of capillary electrophoresis mass spectrometry, including capillary zone electrophoresis, partial filling, full filling, and micellar electrokinetic chromatography, have been applied to the analysis of drugs, explosives, and questioned documents. In this article, we review recent (2015-2017) applications of liquid separation in conjunction with mass spectrometry to the analysis of forensic evidence. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John
2013-05-01
Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, which highlights the advantages of using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
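As a rough illustration of the idea (not the authors' exact QSA feature set; the windows, labels, and features below are invented), four EEG channels can be packed into a quaternion-valued sequence and summarized by rotation-style features before classification with one of the cited classifiers:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def quaternion_features(eeg):
    # eeg: (n_samples, 4) window; treat each row as a quaternion (w, x, y, z).
    norms = np.linalg.norm(eeg, axis=1)            # quaternion magnitudes
    w = eeg[:, 0] / np.maximum(norms, 1e-12)       # normalized scalar part
    angles = 2 * np.arccos(np.clip(w, -1, 1))      # rotation-angle interpretation
    return np.array([norms.mean(), norms.std(), angles.mean(), angles.std()])

rng = np.random.default_rng(0)
windows = rng.standard_normal((60, 128, 4))        # 60 four-channel EEG windows (toy data)
labels = np.repeat([0, 1], 30)                     # left vs. right motor imagery
X = np.array([quaternion_features(w) for w in windows])
clf = KNeighborsClassifier(n_neighbors=5).fit(X, labels)
print(clf.predict(X[:5]))
```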
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Matthew W.
2013-01-01
This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
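As a concrete illustration of the substructuring building block (a minimal static-condensation sketch with made-up matrices, not the paper's software design), interior degrees of freedom are eliminated so each substructure contributes only a reduced stiffness and load acting on its boundary:

```python
import numpy as np

def condense(K, f, boundary, interior):
    # Static condensation (Guyan reduction): eliminate interior DOFs of a
    # substructure, leaving a reduced system on the boundary DOFs only.
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, interior)]
    Kii = K[np.ix_(interior, interior)]
    Kii_inv = np.linalg.inv(Kii)
    K_red = Kbb - Kbi @ Kii_inv @ Kbi.T   # reduced boundary stiffness
    f_red = f[boundary] - Kbi @ Kii_inv @ f[interior]
    return K_red, f_red

# Toy 4-DOF symmetric stiffness matrix and load vector.
K = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
f = np.array([1., 0., 0., 2.])
K_red, f_red = condense(K, f, boundary=[0, 1], interior=[2, 3])
print(K_red, f_red)
```

In multilevel substructuring this condensation is applied recursively: reduced substructures are assembled into parent substructures and condensed again.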
Determination of dynamic fracture toughness using a new experimental technique
NASA Astrophysics Data System (ADS)
Cady, Carl M.; Liu, Cheng; Lovato, Manuel L.
2015-09-01
In other studies dynamic fracture toughness has been measured using Charpy impact and modified Hopkinson Bar techniques. In this paper results will be shown for the measurement of fracture toughness using a new test geometry. The crack propagation velocities range from ~0.15 mm/s to 2.5 m/s. Digital image correlation (DIC) will be the technique used to measure both the strain and the crack growth rates. The boundary of the crack is determined using the correlation coefficient generated during image analysis, and with interframe timing the crack growth rate and crack opening can be determined. A comparison of static and dynamic loading experiments will be made for brittle polymeric materials. The analysis technique presented by Sammis et al. [1] is a semi-empirical solution; however, additional Linear Elastic Fracture Mechanics analysis of the strain fields generated as part of the DIC analysis allows for the more commonly used method resembling the crack tip opening displacement (CTOD) experiment. It should be noted that this technique was developed because limited amounts of material were available and crack growth rates were too fast for a standard CTOD method.
Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko
2018-06-01
Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of the maxillary third molar, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and divided into two groups by means of simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared with the other group of patients. This novel technique proved better than the conventional third molar extraction technique, with minimal complications, provided cases are properly selected and the right technique is used.
Matrix Perturbation Techniques in Structural Dynamics
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1973-01-01
Matrix perturbation techniques are developed that can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering and other fields concerned with the dynamical analysis of large complex structures or systems of second order differential equations. A number of simple examples are included to illustrate the techniques.
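A representative first-order result of this kind, from standard perturbation theory rather than quoted from the paper: for a symmetric matrix A_0 with orthonormal eigenvectors v_i and eigenvalues λ_i^0, a small perturbation εΔA shifts the eigenvalues approximately as

```latex
\lambda_i(\epsilon) \approx \lambda_i^{0} + \epsilon\, v_i^{T} \Delta A\, v_i ,
```

so an eigensolution of the nominal structure can be corrected cheaply instead of re-solving the full eigenproblem.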
Data Unfolding with Wiener-SVD Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, W.; Li, X.; Qian, X.
2017-10-04
Data unfolding is a common analysis technique used in HEP. Inspired by the deconvolution technique in digital signal processing, a new unfolding technique based on SVD and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal-to-noise ratio in the effective frequency domain given expectations of signal and noise, and is free from a regularization parameter. Through a couple of examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.
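A minimal sketch of the approach as described in the abstract, assuming unit-variance uncorrelated noise (a real analysis would first whiten the data with the noise covariance); the toy response matrix and spectrum are made up:

```python
import numpy as np

def wiener_svd_unfold(R, data, s_bar):
    # Unfold `data` given response matrix R and an expected signal s_bar.
    U, S, Vt = np.linalg.svd(R, full_matrices=False)
    c = Vt @ s_bar                              # expected signal in the SVD basis
    W = (S * c) ** 2 / ((S * c) ** 2 + 1.0)     # Wiener weight per mode: signal/(signal+noise)
    return Vt.T @ (W / S * (U.T @ data))        # filtered pseudo-inverse

# Toy example: Gaussian smearing matrix, true spectrum, noisy measurement.
rng = np.random.default_rng(1)
n = 30
idx = np.arange(n)
R = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
truth = 100 * np.exp(-0.5 * ((idx - 12) / 4.0) ** 2)
data = R @ truth + rng.standard_normal(n)
print(wiener_svd_unfold(R, data, s_bar=truth))
```

The weights suppress SVD modes dominated by noise, which is what replaces the explicit regularization parameter of conventional SVD unfolding.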
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Laboratory Spectrometer for Wear Metal Analysis of Engine Lubricants.
1986-04-01
analysis, the acid digestion technique for sample pretreatment is the best approach available to date because of its relatively large sample size (1000...microliters or more). However, this technique has two major shortcomings limiting its application: (1) it requires the use of hydrofluoric acid (a...accuracy. Sample preparation including filtration or acid digestion may increase analysis times by 20 minutes or more. b. Repeatability In the analysis
NASA Technical Reports Server (NTRS)
Viezee, W.; Russell, P. B.; Hake, R. D., Jr.
1974-01-01
The matching method of lidar data analysis is explained, and the results from two flights studying the stratospheric aerosol using lidar techniques are summarized and interpreted. The results lend support to the matching method of lidar data analysis, but it is not yet apparent that the technique leads to acceptable results on all nights in all seasons.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 nm range and of MDL in the 0.1 to 0.2 wt.% range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 nm range and MDL as low as 0.01 wt.%. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level, light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given, along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed, as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
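The ratio method with k(sub AB) factors mentioned above is the standard Cliff-Lorimer relation: for two elements A and B in a thin specimen, the measured characteristic X-ray intensities I_A and I_B give the concentration ratio

```latex
\frac{C_A}{C_B} = k_{AB}\,\frac{I_A}{I_B}, \qquad \sum_i C_i = 1 ,
```

where the sensitivity factor k_AB is determined from standards or calculated, and the normalization condition closes the system for a binary specimen.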
Digital Dental X-ray Database for Caries Screening
NASA Astrophysics Data System (ADS)
Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila
2016-06-01
A standard database is an essential requirement for comparing the performance of image analysis techniques. A main obstacle in dental image analysis is therefore the lack of an available image database, which this paper provides. Periapical dental X-ray images, which are suitable for analysis and have been approved by many dental experts, were collected. This type of dental radiograph imaging is common and inexpensive, and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 various periapical X-ray images covering the upper and lower jaws. This digital dental database was constructed to provide a source for researchers to use and compare image analysis techniques and to improve the performance of each technique.
Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse
NASA Astrophysics Data System (ADS)
Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa
1994-01-01
Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) spectrometry can be successfully employed in urinalysis for drugs of abuse, with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence in the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly in the case of amphetamines, where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.
LIBS: a potential tool for industrial/agricultural waste water analysis
NASA Astrophysics Data System (ADS)
Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.
2016-04-01
Laser Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages, including the ability to detect any element in real time. The technique holds potential for environmental monitoring, and analyses of soil, glass, paint, water, plastic, etc. confirm its robustness for such applications. Compared to the currently available water quality monitoring methods and techniques, LIBS has several advantages, viz. no need for sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of the LIBS technique is that it can perform in standoff mode for real-time analysis. Water samples from industrial and agricultural strata tend to contain many pollutants, making the water harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is made incident on the sample, a plasma is generated which gives a multielemental emission spectrum. LIBS analysis has shown outstanding success for solid samples. For liquid samples, the analysis is challenging, as the liquid can splash due to the high energy of the laser, making it difficult to generate a plasma. This project also deals with determining the most efficient method for testing water samples for qualitative as well as quantitative analysis using LIBS.
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
NASA Technical Reports Server (NTRS)
Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.
1982-01-01
Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.
A technique for conducting point pattern analysis of cluster plot stem-maps
C.W. Woodall; J.M. Graham
2004-01-01
Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
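For readers unfamiliar with the statistic, here is a minimal sketch of the naive Ripley's K estimator (no edge correction, which a production analysis would normally apply) evaluated on simulated complete spatial randomness:

```python
import numpy as np

def ripley_k(points, ts, area):
    # Naive estimator K(t) = (A / n^2) * sum over ordered pairs of 1(d_ij < t).
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)        # exclude self-pairs
    lam = n / area                     # intensity (trees per unit area)
    return np.array([(d < t).sum() / (lam * n) for t in ts])

# Under complete spatial randomness, K(t) is approximately pi * t^2.
pts = np.random.rand(200, 2)           # simulated stem-map on the unit square
ts = np.linspace(0.01, 0.2, 8)
print(ripley_k(pts, ts, area=1.0))
print(np.pi * ts ** 2)                 # theoretical CSR benchmark for comparison
```

Values above the CSR benchmark indicate clustering at that scale; values below indicate regularity.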
NASA Technical Reports Server (NTRS)
1974-01-01
The use of optical data processing (ODP) techniques for motion analysis in two-dimensional imagery was studied. The basic feasibility of this approach was demonstrated, but inconsistent performance of the photoplastic used for recording spatial filters prevented totally automatic operation. Promising solutions to the problems encountered are discussed, and it is concluded that ODP techniques could be quite useful for motion analysis.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
Nondestructive evaluation of turbine blades vibrating in resonant modes
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Ahmadshahi, Mansour A.
1991-12-01
The paper presents the analysis of the strain distribution of turbine blades. The holographic moire technique is used in conjunction with computer analysis of the fringes. The application of the computer fringe analysis technique reduces the number of holograms to be recorded to two. Stroboscopic illumination is used to record the patterns. Strains and stresses are computed.
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
Multiscale Analysis of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C. A.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with a greater amount of higher-complexity data than previous missions. We have improved our view of the Sun yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
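A minimal sketch of the kind of multiscale decomposition described, using the PyWavelets package (an assumed dependency) on a random stand-in array rather than a real EIT/LASCO/TRACE image; per-scale detail energy is one simple quantitative summary an observer's qualitative judgment could be replaced with:

```python
import numpy as np
import pywt

image = np.random.rand(256, 256)              # stand-in for a solar image
coeffs = pywt.wavedec2(image, 'haar', level=4)
# coeffs[0] is the coarse approximation; the rest are (horiz, vert, diag) details.
for lvl, detail in enumerate(coeffs[1:], start=1):
    energy = sum(float((d ** 2).sum()) for d in detail)
    print(f"detail level {lvl}: energy = {energy:.4g}")
```

Comparing how this per-scale energy evolves across an image sequence gives an objective proxy for feature evolution at each spatial scale.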
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-06-04
Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acid analysis, and the developed AUDP real-time PCR technique offers an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.
Wear Debris Analysis of Grease Lubricated Ball Bearings.
1982-04-12
Ferrography method was performed by the Naval Air Engineering Center (NAVAIRENGCEN), Lakehurst, New Jersey. A total of three sets of two 6309 deep-groove ball... Ferrography technique. The analysis of the grease-retained wear debris necessitated the development of a technique to reduce the grease samples to a...condition where they were compatible with the Ferrography technique. A major achievement was the successful application of dissolving the grease
Determination of authenticity of brand perfume using electronic nose prototypes
NASA Astrophysics Data System (ADS)
Gebicki, Jacek; Szulczynski, Bartosz; Kaminski, Marian
2015-12-01
The paper presents the practical application of an electronic nose technique for fast and efficient discrimination between authentic and fake perfume samples. Two self-built electronic nose prototypes equipped with a set of semiconductor sensors were employed for that purpose. Additionally, 10 volunteers took part in the sensory analysis. The following perfumes and their fake counterparts were analysed: Dior—Fahrenheit, Eisenberg—J’ose, YSL—La nuit de L’homme, 7 Loewe and Spice Bomb. The investigations were carried out using the headspace of the aqueous solutions. Data analysis utilized multidimensional techniques: principal component analysis (PCA), linear discriminant analysis (LDA), and k-nearest neighbour (k-NN). The results obtained confirmed the legitimacy of the electronic nose technique as an alternative to sensory analysis as far as the determination of authenticity of perfume is concerned.
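A minimal sketch of the classification stage under stated assumptions (scikit-learn available; the sensor responses and labels below are placeholders, not the paper's data), chaining PCA with one of the cited classifiers:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per headspace measurement across 8 semiconductor sensors (toy data);
# y: 0 = fake, 1 = authentic.
rng = np.random.default_rng(4)
X = rng.random((40, 8))
y = np.repeat([0, 1], 20)

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=3),          # multidimensional reduction step
                    KNeighborsClassifier(n_neighbors=3))
clf.fit(X, y)
print(clf.predict(X[:5]))
```

The same reduced PCA scores could instead feed an LDA classifier, mirroring the PCA/LDA/k-NN comparison in the paper.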
The Utility of Template Analysis in Qualitative Psychology Research.
Brooks, Joanna; McCluskey, Serena; Turley, Emma; King, Nigel
2015-04-03
Thematic analysis is widely used in qualitative psychology research, and in this article, we present a particular style of thematic analysis known as Template Analysis. We outline the technique and consider its epistemological position, then describe three case studies of research projects which employed Template Analysis to illustrate the diverse ways it can be used. Our first case study illustrates how the technique was employed in data analysis undertaken by a team of researchers in a large-scale qualitative research project. Our second example demonstrates how a qualitative study that set out to build on mainstream theory made use of the a priori themes (themes determined in advance of coding) permitted in Template Analysis. Our final case study shows how Template Analysis can be used from an interpretative phenomenological stance. We highlight the distinctive features of this style of thematic analysis, discuss the kind of research where it may be particularly appropriate, and consider possible limitations of the technique. We conclude that Template Analysis is a flexible form of thematic analysis with real utility in qualitative psychology research.
Wang, Chuji; Sahay, Peeyush
2009-01-01
Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503
DOT National Transportation Integrated Search
1980-01-01
This project was undertaken for the Virginia Department of Transportation Safety to assess the feasibility of implementing the Data Analysis and Reporting Techniques (DART) computer software system in Virginia. Following a review of available literat...
Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto
2013-01-01
Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793
Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...
Technical Fact Sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: plutonium-238 and plutonium-239 in water and air filters. Method selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • Rapid methods for acid or fusion digestion • Rapid Radiochemical Method for Plutonium-238 and Plutonium 239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
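The single-tone step can be prototyped with the simplest of the candidate estimators, an FFT peak pick; the HIM-based method would apply such an estimator to the HIM operator output rather than to a raw cosine, so the sketch below is illustrative only:

```python
import numpy as np

def tone_freq_fft(x, fs):
    # Coarse single-tone frequency estimate: locate the FFT magnitude peak.
    X = np.fft.rfft(x * np.hanning(len(x)))   # window to reduce leakage
    return np.argmax(np.abs(X)) * fs / len(x)

fs, f0 = 1000.0, 123.0
t = np.arange(1024) / fs
x = np.cos(2 * np.pi * f0 * t)
print(tone_freq_fft(x, fs))   # ~123 Hz, accurate to within one FFT bin
```

Higher-accuracy alternatives (interpolated FFT, phase-based estimators) trade computation for resolution, which is exactly the comparison the paper's analysis addresses.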
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
Uranium Detection - Technique Validation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.
As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ ‘Accelerator Technology’ we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrixes. The FY15 work scope incorporated technical development that would improve accuracy, specificity, linearity & range, precision & ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 addressing these technical challenges, as is summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of this presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectrum in 1 M H2SO4, at λmax = 419.5 nm.
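A minimal sketch of the underlying calibration idea (Beer-Lambert absorbance at 419.5 nm, linear in concentration at fixed path length); all absorbance and concentration values below are invented for illustration, not LANL data:

```python
import numpy as np

# Hypothetical calibration: absorbance at 419.5 nm vs. U(VI) concentration (g/L).
conc = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
absorbance = np.array([0.016, 0.040, 0.081, 0.160, 0.321])

# Beer-Lambert: A = eps * b * C, so a straight-line fit gives the calibration.
slope, intercept = np.polyfit(conc, absorbance, 1)

unknown_A = 0.120                               # measured absorbance of an unknown
print((unknown_A - intercept) / slope)          # estimated concentration (g/L)
```

The temperature sensitivity noted in the abstract would enter here as a drift in the effective slope, which is why calibration and measurement temperatures must match.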
Recent advances in capillary electrophoretic migration techniques for pharmaceutical analysis.
Deeb, Sami El; Wätzig, Hermann; El-Hady, Deia Abd; Albishri, Hassan M; de Griend, Cari Sänger-van; Scriba, Gerhard K E
2014-01-01
Since the introduction about 30 years ago, CE techniques have gained a significant impact in pharmaceutical analysis. The present review covers recent advances and applications of CE for the analysis of pharmaceuticals. Both small molecules and biomolecules such as proteins are considered. The applications range from the determination of drug-related substances to the analysis of counterions and the determination of physicochemical parameters. Furthermore, general considerations of CE methods in pharmaceutical analysis are described. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Economou, Anastasios
2018-01-01
This work reviews the field of screen-printed electrodes (SPEs) modified with “green” metals for electrochemical stripping analysis of toxic elements. Electrochemical stripping analysis has been established as a useful trace analysis technique offering many advantages compared to competing optical techniques. Although mercury has been the preferred electrode material for stripping analysis, the toxicity of mercury and the associated legal requirements in its use and disposal have prompted research towards the development of “green” metals as alternative electrode materials. When combined with the screen-printing technology, such environment-friendly metals can lead to disposable sensors for trace metal analysis with excellent operational characteristics. This review focuses on SPEs modified with Au, Bi, Sb, and Sn for stripping analysis of toxic elements. Different modification approaches (electroplating, bulk modification, use of metal precursors, microengineering techniques) are considered and representative applications are described. A developing related field, namely biosensing based on stripping analysis of metallic nanoprobe labels, is also briefly mentioned. PMID:29596391
On the Power of Abstract Interpretation
NASA Technical Reports Server (NTRS)
Reddy, Uday S.; Kamin, Samuel N.
1991-01-01
Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code is altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Sekar et al. Hence, our proofs are much simpler, and they should be useful for future improvements.
A novel pulse height analysis technique for nuclear spectroscopic and imaging systems
NASA Astrophysics Data System (ADS)
Tseng, H. H.; Wang, C. Y.; Chou, H. P.
2005-08-01
The proposed pulse height analysis technique is based on the constant and linear relationship between pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of radon daughters brought down by rain. The technique is also used for a nuclear medical imaging system. The system uses a position sensitive photomultiplier tube coupled with a scintillator. The proposed technique has greatly simplified the electronic design and made the system feasible for portable applications.
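A toy reconstruction of a pulse-height spectrum from width measurements under the assumed linear width-to-height relation; the calibration constants and width data below are made up for illustration:

```python
import numpy as np

# Assumed linear width-height map h = a*w + b from the front-end shaping stage.
a, b = 2.5, -0.3                                  # hypothetical calibration constants
rng = np.random.default_rng(5)
widths = rng.normal(2.0, 0.1, 10_000)             # measured pulse widths (e.g., us)
heights = a * widths + b                          # inferred pulse heights

# Histogram the inferred heights to form the pulse-height spectrum.
spectrum, edges = np.histogram(heights, bins=128)
peak_bin = spectrum.argmax()
print(peak_bin, edges[peak_bin])                  # location of the spectral peak
```

Measuring width instead of height lets a simple time-over-threshold circuit replace a fast ADC, which is the simplification the abstract refers to.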
A simple white noise analysis of neuronal light responses.
Chichilnisky, E J
2001-05-01
A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
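The classical core of such a white-noise technique is the spike-triggered average; here is a minimal sketch with a toy threshold neuron (the full method also characterizes the static nonlinearity, which is omitted here):

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, n_lags):
    # Average the stimulus segment preceding each spike: the white-noise
    # estimate of the neuron's linear filter (up to a scale factor).
    sta = np.zeros(n_lags)
    spike_times = np.where(spikes)[0]
    spike_times = spike_times[spike_times >= n_lags - 1]
    for t in spike_times:
        sta += stimulus[t - n_lags + 1:t + 1]
    return sta / max(len(spike_times), 1)

rng = np.random.default_rng(6)
stim = rng.standard_normal(200_000)               # white-noise stimulus
h = np.exp(-np.arange(20) / 5.0)                  # hidden filter; h[0] = most recent
drive = np.convolve(stim, h, mode='full')[:len(stim)]
spikes = (drive > 2.5).astype(int)                # toy threshold spike generator
print(spike_triggered_average(stim, spikes, 20))  # ~ scaled, time-reversed copy of h
```

Because the stimulus is white, the average over spike-preceding segments converges to the linear kernel even though the spike generator itself is nonlinear.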
NASA Technical Reports Server (NTRS)
Behbehani, K.
1980-01-01
A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation, are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed, and the effects of model degradation are studied.
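A minimal sketch of a GLR detector for a sustained sensor bias (the hard-over case), with Gaussian residuals and the maximization over the unknown bias size taken in closed form; this is illustrative, not the QCSEE-specific implementation:

```python
import numpy as np

def glr_bias_detector(residuals, sigma):
    # GLR statistic for a constant bias starting at sample k: for Gaussian
    # residuals, maximizing over the bias size gives (n-k)*mean^2 / (2*sigma^2).
    n = len(residuals)
    stats = np.empty(n)
    for k in range(n):
        m = residuals[k:].mean()
        stats[k] = (n - k) * m ** 2 / (2 * sigma ** 2)
    return stats

rng = np.random.default_rng(7)
r = rng.standard_normal(200)
r[120:] += 3.0                        # simulated hard-over sensor bias at sample 120
g = glr_bias_detector(r, sigma=1.0)
print(int(g.argmax()))                # estimated failure onset, near 120
```

Declaring a failure when the peak GLR statistic exceeds a threshold, and comparing peaks across sensor and actuator detectors, mirrors the isolation-by-maximum logic described in the abstract.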
A study of data analysis techniques for the multi-needle Langmuir probe
NASA Astrophysics Data System (ADS)
Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.
2018-06-01
In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to be better than measurements from incoherent scatter radar and in situ instruments, m-NLPs can be longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
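A minimal sketch of the linear-fit technique: in the orbital-motion-limited electron-saturation regime the square of the collected current grows approximately linearly with probe bias, so a straight-line fit across the fixed-bias needles yields an electron density estimate. The biases, currents, and lumped constant below are placeholders, not calibrated values:

```python
import numpy as np

V_bias = np.array([2.5, 4.0, 5.5, 10.0])         # probe bias voltages (V), hypothetical
I = np.array([0.9e-6, 1.2e-6, 1.4e-6, 1.9e-6])   # collected currents (A), hypothetical

slope, _ = np.polyfit(V_bias, I ** 2, 1)         # OML regime: I^2 is linear in bias
K_GEOM = 1.0e18                                  # lumped geometry/physics constant (assumed)
n_e = K_GEOM * np.sqrt(slope)                    # electron density estimate (m^-3)
print(f"n_e ~ {n_e:.3g} m^-3")
```

The non-linear least squares alternative instead fits the full probe-current model to all probes simultaneously, which is why it benefits from deploying more needles.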
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, which uses data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some different techniques, and to reveal patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
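A minimal sketch of two of the handbook's options, Kendall's rank correlation and an exponential trend fit via log-linear regression, on invented monthly problem-report counts (scipy assumed available):

```python
import numpy as np
from scipy.stats import kendalltau

months = np.arange(24)
reports = np.array([5, 3, 4, 6, 2, 4, 7, 5, 6, 8, 7, 9,
                    8, 10, 9, 11, 12, 10, 13, 12, 14, 13, 15, 16])

# Nonparametric trend test: positive tau with small p suggests an upward trend.
tau, p = kendalltau(months, reports)
print(f"Kendall tau = {tau:.2f}, p = {p:.4f}")

# Exponential trend via log-linear regression: reports ~ a * exp(b * t).
b, log_a = np.polyfit(months, np.log(reports), 1)
print(f"fitted trend: {np.exp(log_a):.2f} * exp({b:.3f} * t)")
```

Note the log transform requires strictly positive counts, which is exactly where the abstract's caution about frequent zero-report periods bites; rank-based tests like Kendall's tau remain usable in that case.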
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated, such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
NASA Astrophysics Data System (ADS)
Kajiya, E. A. M.; Campos, P. H. O. V.; Rizzutto, M. A.; Appoloni, C. R.; Lopes, F.
2014-02-01
This paper presents systematic studies and analysis that contributed to the identification of the forgery of a work by the artist Emiliano Augusto Cavalcanti de Albuquerque e Melo, known as Di Cavalcanti. The use of several areas of expertise such as brush stroke analysis ("pinacologia"), applied physics, and art history resulted in an accurate diagnosis for ascertaining the authenticity of the work entitled "Violeiro" (1950). For this work we used non-destructive methods such as techniques of infrared, ultraviolet, visible and tangential light imaging combined with chemical analysis of the pigments by portable X-Ray Fluorescence (XRF) and graphic gesture analysis. Each applied method of analysis produced specific information that made possible the identification of materials and techniques employed and we concluded that this work is not consistent with patterns characteristic of the artist Di Cavalcanti.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaBelle, S.J.; Smith, A.E.; Seymour, D.A.
1977-02-01
The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on desired regional emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to changes in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for use by state or regional reviewing agencies.
Regional environmental analysis and management: New techniques for current problems
NASA Technical Reports Server (NTRS)
Honea, R. B.; Paludan, C. T. N.
1974-01-01
Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.
Another Look at the Power of Meta-Analysis in the Solomon Four-Group Design.
ERIC Educational Resources Information Center
Sawilowsky, Shlomo S.; Markman, Barry S.
This paper demonstrates that a meta-analysis technique applied to the Solomon Four-Group Design (SFGD) can fail to find significance even though an earlier "weaker" test may have found significance. The meta-analysis technique was promoted by Braver and Braver as the most powerful single test for analyzing data from an SFGD. They…
ERIC Educational Resources Information Center
Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi
2006-01-01
This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…
Multidimensional chromatography in food analysis.
Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose
2009-10-23
In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2008-01-01
Qualitative researchers in school psychology have a multitude of analyses available for data. The purpose of this article is to present several of the most common methods for analyzing qualitative data. Specifically, the authors describe the following 18 qualitative analysis techniques: method of constant comparison analysis, keywords-in-context,…
ERIC Educational Resources Information Center
Mosier, Nancy R.
Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…
ERIC Educational Resources Information Center
Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.
Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…
The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
2010-05-01
This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
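A hedged sketch of the experiment's structure, i.e. classification accuracy as the number of retained dimensions grows, using PCA and a naive Bayes classifier on synthetic data (scikit-learn assumed; the paper's sorted-covariance and feature-subset-selection variants are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for object-code feature vectors.
X, y = make_classification(n_samples=600, n_features=64,
                           n_informative=12, random_state=0)

# Classification accuracy as the number of retained dimensions grows.
for k in (2, 4, 8, 16, 32):
    Xk = PCA(n_components=k, random_state=0).fit_transform(X)
    acc = cross_val_score(GaussianNB(), Xk, y, cv=5).mean()
    print(f"{k:2d} components: accuracy = {acc:.3f}")
```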
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
Investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method ... stiffness, matrix methods. Key words: finite element technique; statistical energy analysis; experimental techniques; framed structures; computer programs. In order to further understand the practical application of statistical energy analysis, a two-section plate-like frame structure is ...
Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach
ERIC Educational Resources Information Center
Lending, Diane; May, Jeffrey
2013-01-01
Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…
Maione, Camila; Barbosa, Rommel Melgaço
2018-01-24
Rice is one of the most important staple foods around the world. Authentication of rice is one of the most frequently addressed concerns in the current literature, including recognition of its geographical origin and variety, certification of organic rice, and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality and yield. This paper presents a review of recent research on the discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and other sources are promising indicators of geographical origin, variety and other aspects of rice, and are widely used in combination with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other data classification techniques, such as support vector machines and artificial neural networks, are also frequently present in some studies and show high performance for the discrimination of rice.
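As a minimal sketch of the preferred PCA/LDA workflow, assuming scikit-learn and using synthetic stand-in data for elemental fingerprints (not any dataset from the reviewed studies):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 8 trace-element concentrations, 3 origins.
X, y = make_blobs(n_samples=300, n_features=8, centers=3,
                  cluster_std=2.0, random_state=1)

# PCA: unsupervised projection, typically used for exploratory plots.
scores = PCA(n_components=2).fit_transform(X)
print(f"PCA score matrix: {scores.shape}")

# LDA: supervised discrimination of geographical origin.
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)
lda = LinearDiscriminantAnalysis().fit(Xtr, ytr)
print(f"LDA hold-out accuracy: {lda.score(Xte, yte):.3f}")
```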
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1995-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
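A hedged sketch of the closed-form idea, with the inertia-relief bookkeeping omitted: if the applied load is f = f0 + A u with a displacement-dependent coupling matrix A, the equilibrium K u = f0 + A u rearranges to (K - A) u = f0, replacing the iterative update with a single linear solve. The matrices below are illustrative, not from any NASTRAN model.

```python
import numpy as np

# Closed-form treatment of displacement-dependent loads:
# with stiffness K and loads f = f0 + A @ u, equilibrium
#   K u = f0 + A u   rearranges to   (K - A) u = f0,
# one linear solve instead of an iterative update.
# (Inertia-relief balancing of the free-free system is omitted here.)
K = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  3.0, -1.0],
              [ 0.0, -1.0,  2.0]])
A = 0.2 * np.eye(3)              # hypothetical load/displacement coupling
f0 = np.array([1.0, 0.0, 0.5])

u_direct = np.linalg.solve(K - A, f0)

# The iterative alternative converges to the same answer.
u = np.zeros(3)
for _ in range(50):
    u = np.linalg.solve(K, f0 + A @ u)
print(np.allclose(u, u_direct))  # True
```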
Automatic differentiation evaluated as a tool for rotorcraft design and optimization
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.; Young, Katherine C.
1995-01-01
This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables; the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation; this method produces derivatives to machine accuracy at a cost that is comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact within machine accuracy and, unlike the derivatives obtained with finite-differencing techniques, do not depend on the selection of a step size.
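A minimal sketch of the chain-rule mechanism behind AD, using forward-mode dual numbers rather than ADIFOR's source transformation; it shows the machine-accuracy derivative alongside a step-size-dependent central difference:

```python
import math

class Dual:
    """Forward-mode AD value: carries f and df/dx alongside each other."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def f(x):
    return x * x.sin() if isinstance(x, Dual) else x * math.sin(x)

x0 = 1.3
exact = f(Dual(x0, 1.0)).der            # AD: exact to machine precision
h = 1e-6
fd = (f(x0 + h) - f(x0 - h)) / (2 * h)  # central difference: depends on h
print(exact, fd, abs(exact - fd))
```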
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
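As a hedged sketch of the probabilistic idea (plain Monte Carlo here; the fast analytical methods referenced in the abstract are not reproduced), with a hypothetical solar-array power model and made-up input uncertainties:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical solar-array power model with uncertain inputs:
# P = area * insolation * efficiency * (1 - degradation)
area = 100.0                             # m^2, fixed
insol = rng.normal(1361.0, 15.0, N)      # solar flux [W/m^2]
eff = rng.normal(0.14, 0.005, N)         # cell efficiency
degr = rng.uniform(0.00, 0.05, N)        # lifetime degradation

P = area * insol * eff * (1.0 - degr) / 1000.0  # kW

# A deterministic run gives one number; the probabilistic run gives
# a distribution and percentile-based capability margins.
print(f"mean = {P.mean():.1f} kW, "
      f"5th pct = {np.percentile(P, 5):.1f} kW, "
      f"95th pct = {np.percentile(P, 95):.1f} kW")
```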
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... overfilling and spill, or overpressurization and venting through relief valves or rupture disks; and (v... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as...
New Results in Software Model Checking and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2010-01-01
This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.
Analysis of Gold Ores by Fire Assay
ERIC Educational Resources Information Center
Blyth, Kristy M.; Phillips, David N.; van Bronswijk, Wilhelm
2004-01-01
Students of an Applied Chemistry degree course carried out a fire-assay exercise. The analysis showed that the technique was a worthwhile quantitative analytical technique and covered interesting theory including acid-base and redox chemistry and other concepts such as inquarting and cupelling.
Separation and Analysis of Citral Isomers.
ERIC Educational Resources Information Center
Sacks, Jeff; And Others
1983-01-01
Provides background information, procedures, and results of an experiment designed to introduce undergraduates to the technique of steam distillation as a means of isolating thermally sensitive compounds. Chromatographic techniques (HPLC) and mass spectrometric analysis are used in the experiment, which requires three laboratory periods. (JN)
A CHARTING TECHNIQUE FOR THE ANALYSIS OF BUSINESS SYSTEMS,
This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)
Fault detection in digital and analog circuits using an i(DD) temporal analysis technique
NASA Technical Reports Server (NTRS)
Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark
1993-01-01
An i(DD) temporal analysis technique is presented that detects defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents. A simple bias voltage is required at all the inputs to excite the defects. Data from hardware tests supporting this technique are presented.
1976-09-01
The purpose of this research effort was to determine the financial management educational needs of USAF graduate logistics positions. Goal analysis...was used to identify financial management techniques and task analysis was used to develop a method to identify the use of financial management techniques...positions. The survey identified financial management techniques in five areas: cost accounting, capital budgeting, working capital, financial forecasting, and programming. (Author)
A Sensitivity Analysis of Circular Error Probable Approximation Techniques
1992-03-01
Sensitivity analysis of circular error probable approximation techniques. Thesis presented to the Faculty of the School of Engineering of the Air Force Institute of Technology. ... The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. ...
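A hedged sketch contrasting the two kinds of technique such a thesis compares: a closed-form CEP approximation versus numerical integration of the bivariate normal over a disk. The 0.589(sigma_x + sigma_y) rule is one commonly cited approximation, and SciPy is assumed; none of this is taken from the thesis itself.

```python
import numpy as np
from scipy import integrate, optimize

def hit_prob(R, sx, sy):
    """P(impact within radius R) for a zero-mean bivariate normal."""
    def integrand(r, th):  # polar-coordinate density times Jacobian r
        x, y = r * np.cos(th), r * np.sin(th)
        pdf = np.exp(-0.5 * ((x / sx) ** 2 + (y / sy) ** 2)) \
              / (2.0 * np.pi * sx * sy)
        return pdf * r
    val, _ = integrate.dblquad(integrand, 0.0, 2.0 * np.pi,
                               lambda th: 0.0, lambda th: R)
    return val

def cep_numeric(sx, sy):
    """Radius containing 50% probability, found by root bracketing."""
    return optimize.brentq(lambda R: hit_prob(R, sx, sy) - 0.5,
                           1e-6, 10.0 * max(sx, sy))

sx, sy = 10.0, 6.0
approx = 0.589 * (sx + sy)  # common closed-form CEP approximation
print(f"numeric CEP = {cep_numeric(sx, sy):.2f}, "
      f"0.589*(sx+sy) = {approx:.2f}")
```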
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
Application of pattern recognition techniques to crime analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.
1976-08-15
The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted.
Presentation-Oriented Visualization Techniques.
Kosara, Robert
2016-01-01
Data visualization research focuses on data exploration and analysis, yet the vast majority of visualizations people see were created for a different purpose: presentation. Whether we are talking about charts showing data to help make a presenter's point, data visuals created to accompany a news story, or the ubiquitous infographics, many more people consume charts than make them. Traditional visualization techniques treat presentation as an afterthought, but are there techniques uniquely suited to data presentation but not necessarily ideal for exploration and analysis? This article focuses on presentation-oriented techniques, considering their usefulness for presentation first and any other purposes as secondary.
Marcos-Garcés, V; Harvat, M; Molina Aguilar, P; Ferrández Izquierdo, A; Ruiz-Saurí, A
2017-08-01
Measurement of collagen bundle orientation in histopathological samples is a widely used and useful technique in many research and clinical scenarios. Fourier analysis is the preferred method for performing this measurement, but the most appropriate staining and microscopy technique remains unclear. Some authors advocate the use of Haematoxylin-Eosin (H&E) and confocal microscopy, but there are no studies comparing this technique with other classical collagen stainings. In our study, 46 human skin samples were collected, processed for histological analysis and stained with Masson's trichrome, Picrosirius red and H&E. Five microphotographs of the reticular dermis were taken at 200× magnification with light microscopy, polarized microscopy and confocal microscopy, respectively. Two independent observers measured collagen bundle orientation with semiautomated Fourier analysis using the Image-Pro Plus 7.0 software, and three independent observers performed a semiquantitative evaluation of the same parameter. The average orientation for each case was calculated from the values of the five pictures. We analyzed the interrater reliability, the consistency between Fourier analysis and average semiquantitative evaluation, and the consistency between measurements in Masson's trichrome, Picrosirius red and H&E-confocal. Statistical analysis for reliability and agreement was performed with the SPSS 22.0 software and consisted of the intraclass correlation coefficient (ICC), Bland-Altman plots with limits of agreement, and the coefficient of variation. Interrater reliability was almost perfect (ICC > 0.8) with all three histological and microscopy techniques, and was always higher for Fourier analysis than for average semiquantitative evaluation. Measurements were consistent between Fourier analysis by one observer and average semiquantitative evaluation by three observers, with an almost perfect agreement for the Masson's trichrome and Picrosirius red techniques (ICC > 0.8) and a strong agreement for H&E-confocal (0.7 < ICC < 0.8). Comparison of measurements between the three techniques for the same observer showed an almost perfect agreement (ICC > 0.8), better with Fourier analysis than with semiquantitative evaluation (single and average). These results in nonpathological skin samples were also confirmed in a preliminary analysis of eight scleroderma skin samples. Our results show that Masson's trichrome and Picrosirius red are consistent with H&E-confocal for measuring collagen bundle orientation in histological samples and could thus be used interchangeably for this purpose. Fourier analysis is superior to average semiquantitative evaluation and should continue to be used as the preferred method.
Determining Kinetic Parameters for Isothermal Crystallization of Glasses
NASA Technical Reports Server (NTRS)
Ray, C. S.; Zhang, T.; Reis, S. T.; Brow, R. K.
2006-01-01
Non-isothermal crystallization techniques are frequently used to determine the kinetic parameters for crystallization in glasses. These techniques are experimentally simple and quick compared to the isothermal techniques. However, the analytical models used for non-isothermal data analysis, originally developed for describing isothermal transformation kinetics, are fundamentally flawed. The present paper describes a technique for determining the kinetic parameters for isothermal crystallization in glasses, which eliminates most of the common problems that generally make the studies of isothermal crystallization laborious and time consuming. In this technique, the volume fraction of glass that is crystallized as a function of time during an isothermal hold was determined using differential thermal analysis (DTA). The crystallization parameters for the lithium-disilicate (Li2O.2SiO2) model glass were first determined and compared to the same parameters determined by other techniques to establish the accuracy and usefulness of the present technique. This technique was then used to describe the crystallization kinetics of a complex Ca-Sr-Zn-silicate glass developed for sealing solid oxide fuel cells.
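The abstract does not spell out the kinetic model; a common choice for isothermal crystallization is the Johnson-Mehl-Avrami (JMA) equation, x(t) = 1 - exp(-(kt)^n), which linearizes so that a straight-line fit yields the exponent n and rate constant k. A sketch with synthetic data, assuming JMA kinetics rather than the paper's exact procedure:

```python
import numpy as np

# JMA kinetics: x(t) = 1 - exp(-(k t)^n). Linearizing,
#   ln(-ln(1 - x)) = n ln t + n ln k,
# so a straight-line fit of ln(-ln(1-x)) vs ln t gives the Avrami
# exponent n (slope) and rate constant k (from the intercept).
n_true, k_true = 3.0, 2e-3         # hypothetical exponent and rate [1/s]
t = np.linspace(60.0, 1200.0, 40)  # time during the isothermal hold [s]
x = 1.0 - np.exp(-(k_true * t) ** n_true)

mask = (x > 0.05) & (x < 0.95)     # avoid the poorly conditioned tails
slope, intercept = np.polyfit(np.log(t[mask]),
                              np.log(-np.log(1.0 - x[mask])), 1)
n_fit = slope
k_fit = np.exp(intercept / n_fit)
print(f"n = {n_fit:.2f}, k = {k_fit:.2e} s^-1")
```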
Evaluation of methods for rapid determination of freezing point of aviation fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1982-01-01
Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
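A minimal sketch of the selected inflection-point method, assuming a digitized cooling-type trace: smooth, differentiate, and take the steepest-descent instant, which is where the second derivative crosses zero. The data and smoothing window are illustrative.

```python
import numpy as np

def inflection_time(t, signal, smooth=7):
    """Freezing point as the inflection of the digitized signal:
    smooth with a moving average, then take the instant where the
    first derivative is most negative (steepest descent), i.e. where
    the second derivative crosses zero."""
    kernel = np.ones(smooth) / smooth
    s = np.convolve(signal, kernel, mode="same")
    d1 = np.gradient(s, t)
    interior = slice(smooth, len(t) - smooth)  # skip edge artifacts
    i = int(np.argmin(d1[interior])) + smooth
    return t[i]

# Synthetic trace: smooth drop centred at t = 50 s, plus noise.
t = np.linspace(0.0, 100.0, 501)
rng = np.random.default_rng(0)
sig = 1.0 / (1.0 + np.exp((t - 50.0) / 3.0)) + 0.005 * rng.normal(size=t.size)
print(f"inflection at t = {inflection_time(t, sig):.1f} s")
```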
Enamel paint techniques in archaeology and their identification using XRF and micro-XRF
NASA Astrophysics Data System (ADS)
Hložek, M.; Trojek, T.; Komoróczy, B.; Prokeš, R.
2017-08-01
This investigation focuses in detail on the analysis of discoveries in South Moravia - important sites from the Roman period in Pasohlávky and Mušov. Using X-ray fluorescence analysis and micro-analysis, we help identify the techniques of enamel paint and give a chemical analysis at a level of detail that would not be possible by means of macroscopic examination. We thus address the influence of elemental composition on the final colour of the enamel paint and describe the less known technique of combining enamel with millefiori. The material analyses of the metal artefacts decorated with enamel paint contribute significantly to our knowledge of the technology used during the Roman period.
Medvedev, Nickolay S; Shaverina, Anastasiya V; Tsygankova, Alphiya R; Saprykin, Anatoly I
2016-08-01
The paper presents a combined technique of germanium dioxide analysis by inductively coupled plasma atomic emission spectrometry (ICP-AES) with preconcentration of trace elements by distilling off the matrix and electrothermal (ETV) introduction of the trace element concentrate into the ICP. Evaluation of the metrological characteristics of the developed technique for high-purity germanium dioxide analysis was performed. The limits of detection (LODs) for 25 trace elements ranged from 0.05 to 20 ng/g. The accuracy of the proposed technique is confirmed by an "added-found" (spiking) experiment and by comparing the results of ETV-ICP-AES and ICP-AES analysis of high-purity germanium dioxide samples.
Factor weighting in DRASTIC modeling.
Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F
2015-02-01
Evaluation of aquifer vulnerability requires the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to insecurity in the selection of an appropriate technique. This paper reports the comparison of 5 weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because the classes are 20 points wide. Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional expertise to be set up satisfactorily. Following a general criterion that weights must be proportional to the range of the ratings, Correspondence Analysis may be recommended as the best adjustment technique.
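For reference, the DRASTIC index itself is a weighted sum of seven factor ratings; the sketch below uses the original Delphi-consensus weights, which the adjustment techniques compared in the paper would replace. The cell ratings are made up.

```python
# DRASTIC intrinsic vulnerability index: a weighted sum of seven
# factor ratings (each typically 1-10). Weights below are the
# original Delphi-consensus weights.
WEIGHTS = {
    "Depth_to_water": 5, "Recharge": 4, "Aquifer_media": 3,
    "Soil_media": 2, "Topography": 1, "Impact_vadose": 5,
    "Conductivity": 3,
}

def drastic_index(ratings):
    """ratings: dict mapping each DRASTIC factor to its rating (1-10)."""
    return sum(WEIGHTS[f] * r for f, r in ratings.items())

# Hypothetical ratings for one map cell.
cell = {"Depth_to_water": 7, "Recharge": 6, "Aquifer_media": 8,
        "Soil_media": 6, "Topography": 10, "Impact_vadose": 8,
        "Conductivity": 4}
print(drastic_index(cell))  # higher index = more vulnerable
```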
Commentary: Status of road safety in Asia.
Wismans, Jac; Skogsmo, Ingrid; Nilsson-Ehle, Anna; Lie, Anders; Thynell, Marie; Lindberg, Gunnar
2016-01-01
The objective of this article is to assess the status of road safety in Asia, present accident and injury prevention strategies based on global road safety improvement experiences, and discuss the way forward by indicating opportunities and countermeasures that could be implemented to achieve a new level of safety in Asia. This study provides a review and analyses of data in the literature, including from the World Health Organization (WHO) and World Bank, and a review of lessons learned from best practices in high-income countries. In addition, an estimation of costs due to road transport injuries in Asia and a review of future trends in road transport are provided. Data on the global and Asian road safety problem and the status of prevention strategies in Asia, as well as recommendations for future actions, are discussed. The total number of deaths due to road accidents in the 24 Asian countries, encompassing 56% of the total world population, is 750,000 per year (2010 statistics). The total number of injuries is more than 50 million, of which 12% are hospital admissions. The loss to the economy in the 24 Asian countries is estimated at around US$800 billion, or 3.6% of the gross domestic product (GDP). This article clearly shows that road safety is causing large problems and high costs in Asia, with an enormous impact on the well-being of people, the economy, and productivity. In many Asian low- and middle-income countries, the yearly number of fatalities and injuries is increasing. Vulnerable road users (pedestrians, cyclists, and motorcyclists combined) are particularly at risk. Road safety in Asia should be given its rightful attention, including taking powerful, effective actions. This review stresses the need for reliable accident data, because there is considerable underreporting in the official statistics. Reliable accident data are imperative to determine evidence-based intervention strategies and to monitor the success of these interventions and analyses. On the other hand, lack of good high-quality accident data should not be an excuse to postpone interventions. There are many opportunities for evidence-based transport safety improvements, including measures concerning the 5 key risk factors: speed, drunk driving, not wearing motorcycle helmets, not wearing seat belts, and not using child restraints in cars, as specified in the Decade of Action for Road Safety 2011-2020. In this commentary, a number of additional measures are proposed that are not covered in the Decade of Action Plan. These new measures include separate roads or lanes for pedestrians and cyclists; helmet wearing for e-bike riders; special attention to elderly persons in public transportation; introduction of emerging collision avoidance technologies, in particular automatic emergency braking (AEB) and alcohol locks; improved truck safety focusing on the other road user (including blind spot detection technology; underride protection at the front, rear, and side; and energy-absorbing fronts); and improvements in motorcycle safety concerning protective clothing, requirements for advanced braking systems, improved visibility of motorcycles by using daytime running lights, and better guardrails.
NASA Technical Reports Server (NTRS)
Hall, David G.; Heidelberg, Laurence; Konno, Kevin
1993-01-01
The rotating microphone measurement technique and data analysis procedures used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model are documented. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design of the data analysis software and the performance of the rotating rake apparatus are also described. The effect of experimental errors is also discussed.
Application of a novel new multispectral nanoparticle tracking technique
NASA Astrophysics Data System (ADS)
McElfresh, Cameron; Harrington, Tyler; Vecchio, Kenneth S.
2018-06-01
Fast, reliable, and accurate particle size analysis techniques must meet the demands of evolving industrial and academic research in areas of functionalized nanoparticle synthesis, advanced materials development, and other nanoscale-enabled technologies. In this study a new multispectral particle tracking analysis (m-PTA) technique enabled by the ViewSizer™ 3000 (MANTA Instruments, USA) was evaluated using solutions of monomodal and multimodal gold and polystyrene latex nanoparticles, as well as a spark-eroded polydisperse 316L stainless steel nanopowder and large (non-Brownian) borosilicate particles. It was found that m-PTA performed comparably to dynamic light scattering (DLS) in the evaluation of monomodal particle size distributions. When measuring bimodal, trimodal and polydisperse solutions, the m-PTA technique overwhelmingly outperformed traditional DLS in both peak detection and relative particle concentration analysis. It was also observed that the m-PTA technique is less susceptible to large-particle overexpression errors. The ViewSizer™ 3000 was also found to be successful in accurately evaluating sizes and concentrations of monomodal and bimodal sinking borosilicate particles.
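Particle tracking analysis in general converts tracked Brownian motion to size via the Stokes-Einstein relation; the sketch below shows that conversion on a simulated track. The multispectral detection that distinguishes m-PTA is not modeled, and all numbers are illustrative.

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant [J/K]

def hydrodynamic_diameter(track_xy, dt, T=293.15, eta=1.0e-3):
    """Particle size from a single 2-D Brownian track.

    The mean squared displacement of 2-D Brownian motion grows as
    MSD(tau) = 4 D tau; Stokes-Einstein then gives the diameter
    d = k_B T / (3 pi eta D).
    """
    steps = np.diff(track_xy, axis=0)
    msd_per_step = np.mean(np.sum(steps**2, axis=1))  # MSD at tau = dt
    D = msd_per_step / (4.0 * dt)
    return KB * T / (3.0 * np.pi * eta * D)

# Simulated 100 nm particle in water, 30 fps video.
d_true, T, eta, dt = 100e-9, 293.15, 1.0e-3, 1.0 / 30.0
D = KB * T / (3.0 * np.pi * eta * d_true)
rng = np.random.default_rng(3)
track = np.cumsum(rng.normal(0.0, np.sqrt(2.0 * D * dt), (5000, 2)), axis=0)
print(f"estimated d = {hydrodynamic_diameter(track, dt) * 1e9:.0f} nm")
```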
NASA Astrophysics Data System (ADS)
Sinha, Mangalika; Modi, Mohammed H.
2017-10-01
In-depth compositional analysis of a 240 Å thick aluminium oxide thin film has been carried out using soft x-ray reflectivity (SXR) and x-ray photoelectron spectroscopy (XPS) techniques. The compositional details of the film are estimated by modelling the optical index profile obtained from the SXR measurements over the 60-200 Å wavelength region. The SXR measurements were carried out at the Indus-1 reflectivity beamline. The method suggests that the principal film region consists of Al2O3 and AlOx (x = 1.6) phases, whereas the interface region consists of a SiO2 and AlOx (x = 1.6) mixture. The soft x-ray reflectivity technique combined with XPS measurements explains the compositional details of the principal layer. Since the interface region cannot be analyzed non-destructively with XPS, the SXR technique is a powerful tool for nondestructive compositional analysis of the interface region.
NASA Astrophysics Data System (ADS)
Avitabile, Peter; O'Callahan, John
2009-01-01
Generally, response analysis of systems containing discrete nonlinear connection elements, such as typical mounting connections, requires the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response solution. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components, and their individual behavior may be essentially linear when compared to the total assembled system. However, joining these linear subsystems using highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis, since much effort has usually been expended in fine-tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques, the equivalent reduced model technique (ERMT), the modal modification response technique (MMRT), and the component element method (CEM), are presented in this paper and compared to traditional methods.
Analysis of intracranial pressure: past, present, and future.
Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D
2013-12-01
The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although these largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and to explore the potential of using advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique and propose that nonlinear analysis could be a reliable approach to describe ICP signals over time, with the fractal dimension as a potentially predictive, clinically meaningful biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate the clinical utility of a tool capable of simplifying multiple variables obtained from various sensors.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.
ERIC Educational Resources Information Center
Anderson, Margaret A.
1986-01-01
Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)
Behavior Analysis: Methodological Foundations.
ERIC Educational Resources Information Center
Owen, James L.
Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline…
Classroom Dialogue and Science Achievement.
ERIC Educational Resources Information Center
Clarke, John A.
This study reports the application to classroom dialogue of the Thematic and Structural Analysis (TSA) Technique which has been used previously in the analysis of text materials. The TSA Technique identifies themes (word clusters) and their structural relationship throughout sequentially organized material. Dialogues from four Year 8 science…
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
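Electronic temperature from lines of multiple species is classically obtained with a Boltzmann plot: under local thermodynamic equilibrium, ln(I*lambda/(gA)) falls linearly with upper-level energy, with slope -1/(kT). A sketch with made-up line data (the atomic constants below are illustrative placeholders, not real values):

```python
import numpy as np

KB_EV = 8.617333262e-5  # Boltzmann constant [eV/K]

def boltzmann_plot_temperature(intensity, wavelength_nm, g, A, E_upper_eV):
    """Plasma electronic temperature from several lines of one species.

    Under LTE, ln(I * lambda / (g * A)) = -E_k / (k T) + const,
    so the slope of that quantity vs upper-level energy gives T.
    """
    y = np.log(intensity * wavelength_nm / (g * A))
    slope = np.polyfit(E_upper_eV, y, 1)[0]
    return -1.0 / (KB_EV * slope)

# Illustrative (made-up) line set consistent with T = 9000 K.
E = np.array([3.0, 3.6, 4.2, 4.9])   # upper-level energies [eV]
g = np.array([5.0, 7.0, 3.0, 5.0])   # statistical weights
A = np.array([2e7, 5e7, 1e7, 4e7])   # transition probabilities [1/s]
lam = np.array([510.0, 480.0, 460.0, 430.0])  # wavelengths [nm]
T_true = 9000.0
I = g * A / lam * np.exp(-E / (KB_EV * T_true))
print(f"T = {boltzmann_plot_temperature(I, lam, g, A, E):.0f} K")
```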
A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin McCarthy; Milos Manic
Data Fusion requires the ability to combine or "fuse" data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.
Schnabel, Thomas; Musso, Maurizio; Tondi, Gianluca
2014-01-01
Vibrational spectroscopy is one of the most powerful tools in polymer science. Three main techniques--Fourier transform infrared spectroscopy (FT-IR), FT-Raman spectroscopy, and FT near-infrared (NIR) spectroscopy--can also be applied to wood science. Here, these three techniques were used to investigate the chemical modification occurring in wood after impregnation with tannin-hexamine preservatives. These spectroscopic techniques have the capacity to detect the externally added tannin. FT-IR has very strong sensitivity to the aromatic peak at around 1610 cm(-1) in the tannin-treated samples, whereas FT-Raman reflects the peak at around 1600 cm(-1) for the externally added tannin. This high efficacy in distinguishing chemical features was demonstrated in univariate analysis and confirmed via cluster analysis. Conversely, the results of the NIR measurements show noticeable sensitivity for small differences. For this technique, multivariate analysis is required and with this chemometric tool, it is also possible to predict the concentration of tannin on the surface.
NASA Astrophysics Data System (ADS)
Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.
2016-01-01
In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
[The progress in speciation analysis of trace elements by atomic spectrometry].
Wang, Zeng-Huan; Wang, Xu-Nuo; Ke, Chang-Liang; Lin, Qin
2013-12-01
The main purpose of the present work is to review the different non-chromatographic methods for the speciation analysis of trace elements in geological, environmental, biological and medical areas. In this paper, the sample processing methods in speciation analysis were summarized, and the main strategies for non-chromatographic technique were evaluated. The basic principles of the liquid extractions proposed in the published literatures recently and their advantages and disadvantages were discussed, such as conventional solvent extraction, cloud point extraction, single droplet microextraction, and dispersive liquid-liquid microextraction. Solid phase extraction, as a non-chromatographic technique for speciation analysis, can be used in batch or in flow detection, and especially suitable for the online connection to atomic spectrometric detector. The developments and applications of sorbent materials filled in the columns of solid phase extraction were reviewed. The sorbents include chelating resins, nanometer materials, molecular and ion imprinted materials, and bio-sorbents. Other techniques, e. g. hydride generation technique and coprecipitation, were also reviewed together with their main applications.
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, Joseph M.
1988-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or finite element analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid frequencies between the optimum frequency regimes for SEA and FEA. Power flow analysis has in general been used on 1-D beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to 2-D plate-like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA results at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
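For context, the quantity propagated in such analyses is the time-averaged power for complex force and velocity amplitudes, P = (1/2) Re{F v*}. A minimal sanity-check sketch on a point-driven damped oscillator, where input power must equal dissipated power (all parameter values are illustrative):

```python
import numpy as np

def input_power(F, v):
    """Time-averaged power for complex amplitudes: P = 0.5*Re{F conj(v)}."""
    return 0.5 * np.real(F * np.conj(v))

# Point-driven damped SDOF oscillator as a sanity check:
# v = j*w*F / (k - w^2*m + j*w*c); input power must equal c*|v|^2/2.
m, c, k = 1.0, 2.0, 1.0e4
w = 2.0 * np.pi * 12.0
F = 1.0 + 0.0j
v = 1j * w * F / (k - w**2 * m + 1j * w * c)
print(input_power(F, v), 0.5 * c * abs(v) ** 2)  # the two values agree
```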
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or Finite Element Analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid- frequencies between the optimum frequency regimes for FEA and SEA. Power flow analysis has in general been used on one-dimensional beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to two-dimensional plate like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1972-01-01
Significant preliminary results identified by the author from the Ouachita portion of the Texoma frame of data indicate much potential in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining particularly meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant from the standpoint of demonstrating a fast turn-around analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.
Reduction and analysis of data collected during the electromagnetic tornado experiment
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1976-01-01
Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, among the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines.
The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine
2017-07-18
Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8-13.4 mg U/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl, and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
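As a hedged illustration of the weighting step this abstract describes (not code from the paper), the sketch below derives AHP criterion weights from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio as commonly described for AHP; the factor names and matrix entries are invented.

```python
# Illustrative AHP priority computation: principal eigenvector of a
# pairwise comparison matrix, normalized to weights, plus a consistency
# check against tabulated random-index values.
import numpy as np

def ahp_weights(pairwise):
    """Return priority weights and consistency ratio for a pairwise matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                      # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalize to sum to 1
    ci = (eigvals[k].real - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.24)    # random index (tabulated)
    return w, ci / ri

# Three hypothetical landslide factors: slope, lithology, land cover
M = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(M)
print(w, cr)   # CR < 0.1 is conventionally taken as acceptable
```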
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-01-01
Background Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique has been successfully applied in nucleic acids analysis, and the developed AUDP real-time PCR technique offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
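The review introduces importance sampling as a way to accelerate Monte Carlo evaluation of rare failures. A minimal sketch, assuming an exponential lifetime model with invented rates, shows how a biased sampling distribution plus likelihood-ratio weights sharpens the estimate:

```python
# Importance sampling for a rare failure probability: the "failure" event is
# an exponential lifetime shorter than the mission time t0. Values are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
lam, t0, n = 1e-4, 1.0, 100_000          # true rate, mission time, samples
p_true = 1 - np.exp(-lam * t0)           # analytic answer for reference

# Naive Monte Carlo: very few hits when failure is rare
t = rng.exponential(1 / lam, n)
p_naive = np.mean(t < t0)

# Importance sampling: draw from a biased, much faster-failing distribution
lam_b = 1.0                              # biasing rate (a design choice)
tb = rng.exponential(1 / lam_b, n)
# likelihood ratio f(t)/g(t) between the two exponential densities
w = (lam * np.exp(-lam * tb)) / (lam_b * np.exp(-lam_b * tb))
p_is = np.mean((tb < t0) * w)

print(p_true, p_naive, p_is)             # the IS estimate is far less noisy
```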
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But many analytic techniques traditionally applied to AA data, techniques that average across people and time, do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
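As a hedged sketch of the first person-oriented technique discussed, the code below fits a random-intercept multilevel model to simulated ambulatory-assessment data with statsmodels; the variable names (craving, stress, subject) are invented for illustration.

```python
# Random-intercept multilevel model on simulated repeated-measures data:
# a within-person stress effect plus person-level intercept variation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_obs = 30, 40
subj = np.repeat(np.arange(n_subj), n_obs)
stress = rng.normal(size=n_subj * n_obs)
intercepts = rng.normal(0, 1, n_subj)[subj]          # person-level variation
craving = 2.0 + 0.5 * stress + intercepts + rng.normal(0, 1, subj.size)
data = pd.DataFrame({"subject": subj, "stress": stress, "craving": craving})

# MixedLM with a random intercept per person (group)
model = smf.mixedlm("craving ~ stress", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```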
Umeta, Ricardo S G; Avanzi, Osmar
2011-07-01
Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected. To identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better fusion rates and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Carden, J. L.; Browner, R.
1982-01-01
The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is presented that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.
Price Analysis on Commercial Item Purchases within the Department of the Navy
2015-02-05
[Fragmentary excerpt: the report attributes paying more than a fair and reasonable price for products and services in part to a shortage of qualified contract personnel, and notes that, due to exigent situations, supplies or services are sometimes purchased even though an adequate price or cost… List-of-figures residue: "Supplies - Price Analysis Techniques used"; "Figure 10: Services - Price Analysis Techniques used".]
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field samples focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equal technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM and the technique promises to be a useful tool for assigning ecological function to microbial populations.
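The multivariate step described above can be illustrated with a minimal PCA of mole-percent FAME profiles; the data here are synthetic stand-ins, not measurements from the study.

```python
# PCA of compositional fatty-acid profiles via mean-centering and SVD:
# samples project onto principal components for community comparison.
import numpy as np

rng = np.random.default_rng(2)
X = rng.dirichlet(np.ones(12), size=20) * 100   # 20 samples x 12 FAMEs (mol%)

Xc = X - X.mean(axis=0)                         # mean-center each fatty acid
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                  # sample coordinates on PCs
explained = s**2 / np.sum(s**2)                 # variance explained per PC

print(scores[:, :2])                            # first two PC scores
print(explained[:2])
```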
Numerical analysis of thermal drilling technique on titanium sheet metal
NASA Astrophysics Data System (ADS)
Kumar, R.; Hynes, N. Rajesh Jesudoss
2018-05-01
Thermal drilling is a technique used in drilling of sheet metal for various applications. It involves a conical tool rotating at high speed that drills the sheet metal and forms a hole with a bushing below the surface of the sheet metal. This article investigates the finite element analysis of thermal drilling on Ti6Al4V alloy sheet metal. This analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Due to the contribution of high temperature deformation in this technique, the output performances which are difficult to measure by the experimental approach can be successfully obtained by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution, and temperature of the workpiece.
Practical issues of hyperspectral imaging analysis of solid dosage forms.
Amigo, José Manuel
2010-09-01
Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis
NASA Technical Reports Server (NTRS)
Lindstrom, D. G.; Normand, E.; Wilcox, A. D.
1972-01-01
In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.
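One simple way to realize the coupling described, though not necessarily the authors' scheme, is to build a cumulative distribution over the tabulated ordinate fluxes and sample Monte Carlo source angles from it; the flux table below is invented.

```python
# Turn tabulated discrete-ordinates results into Monte Carlo source data:
# pick an angular bin from the flux-weighted CDF, then sample uniformly
# within the bin.
import numpy as np

rng = np.random.default_rng(3)
mu_edges = np.linspace(-1.0, 1.0, 9)          # direction-cosine bin edges
flux = np.array([0.1, 0.3, 0.9, 2.0, 3.5, 2.2, 1.0, 0.4])  # per-bin flux

pdf = flux / flux.sum()
cdf = np.cumsum(pdf)

def sample_mu(n):
    """Sample n direction cosines: choose a bin by CDF, then uniform in bin."""
    r = rng.random(n)
    bins = np.searchsorted(cdf, r)
    lo, hi = mu_edges[bins], mu_edges[bins + 1]
    return lo + rng.random(n) * (hi - lo)

print(sample_mu(5))
```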
Determination of fiber volume in graphite/epoxy materials using computer image analysis
NASA Technical Reports Server (NTRS)
Viens, Michael J.
1990-01-01
The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
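A minimal sketch of the image-analysis step (threshold, then area fraction), with a synthetic image standing in for a polished cross-section micrograph:

```python
# Estimate fiber volume fraction as the foreground area fraction of a
# thresholded gray-scale image. The synthetic "image" below stands in for
# a real micrograph; the threshold would normally come from the histogram.
import numpy as np

rng = np.random.default_rng(4)
image = rng.normal(100, 10, (512, 512))         # matrix background
image[100:400, 100:400] += 80                   # brighter "fiber" region

threshold = 150.0                               # chosen from the histogram
fiber_mask = image > threshold
fiber_fraction = fiber_mask.mean()              # fraction of fiber pixels

print(f"estimated fiber volume fraction: {fiber_fraction:.1%}")
```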
Structural study of Cu(II) complexes with benzo[b]furancarboxylic acids
NASA Astrophysics Data System (ADS)
Kalinowska, Diana; Klepka, Marcin T.; Wolska, Anna; Drzewiecka-Antonik, Aleksandra; Ostrowska, Kinga; Struga, Marta
2017-11-01
Four Cu(II) complexes with 2- and 3-benzo[b]furancarboxylic acids have been synthesized and characterized using a combination of two spectroscopic techniques: (i) FTIR and (ii) XAFS. FTIR analysis confirmed that the complexes were formed and gave insight into the identification of the groups possibly coordinating to the metallic center. XANES analysis indicated that the oxidation state of Cu is +2. EXAFS analysis showed that the first coordination sphere is formed by 4-5 oxygen atoms with Cu-O distances around 2 Å. Combining these techniques, it was possible to structurally describe the novel Cu(II) complexes with benzo[b]furancarboxylic acids.
Cullen, Jared; Lobo, Charlene J; Ford, Michael J; Toth, Milos
2015-09-30
Electron-beam-induced deposition (EBID) is a direct-write chemical vapor deposition technique in which an electron beam is used for precursor dissociation. Here we show that Arrhenius analysis of the deposition rates of nanostructures grown by EBID can be used to deduce the diffusion energies and corresponding preexponential factors of EBID precursor molecules. We explain the limitations of this approach, define growth conditions needed to minimize errors, and explain why the errors increase systematically as EBID parameters diverge from ideal growth conditions. Under suitable deposition conditions, EBID can be used as a localized technique for analysis of adsorption barriers and prefactors.
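The Arrhenius analysis the authors apply reduces, in its simplest form, to a linear regression of ln(rate) against 1/(kB T); the sketch below recovers an assumed diffusion energy and prefactor from simulated growth rates.

```python
# Arrhenius fit: ln(rate) = ln(A) - E/(kB*T), so the slope of ln(rate)
# versus 1/(kB*T) gives -E and the intercept gives ln(A). Rates and
# temperatures are invented for illustration.
import numpy as np

kB = 8.617e-5                                   # Boltzmann constant, eV/K
T = np.array([300., 320., 340., 360., 380.])    # substrate temperatures (K)
E_true, A_true = 0.45, 1e6                      # "unknown" parameters
rate = A_true * np.exp(-E_true / (kB * T))      # simulated growth rates

slope, intercept = np.polyfit(1.0 / (kB * T), np.log(rate), 1)
print(f"E_d = {-slope:.3f} eV, prefactor = {np.exp(intercept):.2e}")
```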
NASA Astrophysics Data System (ADS)
Grassi, N.
2005-06-01
In the framework of the extensive study on the wood painting "Madonna dei fusi" attributed to Leonardo da Vinci, Ion Beam Analysis (IBA) techniques were used at the Florence accelerator laboratory to get information about the elemental composition of the paint layers. After a brief description of the basic principle and the general features of IBA techniques, we will illustrate in detail how the analysis allowed us to characterise the pigments of original and restored areas and the substrate composition, and to obtain information about the stratigraphy of the painting, also providing an estimate of the paint layer thickness.
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
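PRECiSA's semantics are beyond a few lines, but the underlying first-order rounding-error bookkeeping can be sketched: carry a (value, error bound) pair and grow the bound by one unit roundoff per operation. This is an illustrative bound, not the tool's algorithm.

```python
# Toy forward error-bound propagation: each arithmetic result contributes
# at most |result| * u of fresh rounding error (u = unit roundoff), on top
# of the error propagated from its operands.
import sys

U = sys.float_info.epsilon / 2          # unit roundoff for binary64

class Bounded:
    def __init__(self, v, e=0.0):
        self.v, self.e = v, e           # float value, accumulated error bound
    def __add__(self, o):
        v = self.v + o.v
        return Bounded(v, self.e + o.e + abs(v) * U)
    def __mul__(self, o):
        v = self.v * o.v
        prop = abs(self.v) * o.e + abs(o.v) * self.e + self.e * o.e
        return Bounded(v, prop + abs(v) * U)

x = Bounded(0.1, abs(0.1) * U)          # decimal literals already rounded
y = Bounded(0.2, abs(0.2) * U)
z = x * y + x                           # bound tracks both operations
print(z.v, z.e)
```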
Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.
Summers, A E
2000-01-01
ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
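For concreteness, the widely quoted simplified equations for average probability of failure on demand (PFDavg) can be written directly; common-cause, repair, and diagnostic terms are omitted here, so this is a sketch of the simplest forms only.

```python
# Simplified PFDavg equations used in SIL verification, with invented
# example numbers: lambda_DU is the dangerous undetected failure rate and
# TI is the proof-test interval.
def pfd_1oo1(lambda_du, ti):
    """Single channel: PFDavg ~ lambda_DU * TI / 2."""
    return lambda_du * ti / 2.0

def pfd_1oo2(lambda_du, ti):
    """Dual redundant channel: PFDavg ~ (lambda_DU * TI)**2 / 3."""
    return (lambda_du * ti) ** 2 / 3.0

lam = 1e-6          # per hour
ti = 8760.0         # one year, in hours
print(pfd_1oo1(lam, ti), pfd_1oo2(lam, ti))   # compare against SIL bands
```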
NASA Astrophysics Data System (ADS)
Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.
2012-06-01
Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).
Edge compression techniques for visualization of dense directed graphs.
Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher
2013-12-01
We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules' (groups of nodes) such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all have the same goal, to compress the set of edges that need to be rendered to fully convey connectivity, but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothetical trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by, and discuss in particular, the application to software dependency analysis.
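The first and simplest compression the paper considers, grouping nodes with identical neighbor sets, is easy to state in code; the toy graph below is invented.

```python
# Group nodes with identical neighbor sets so that one module edge can
# replace many individual edges; lossless by construction.
from collections import defaultdict

def group_identical_neighbors(adj):
    """Map each distinct neighbor set to the nodes sharing it."""
    groups = defaultdict(list)
    for node, nbrs in adj.items():
        groups[frozenset(nbrs)].append(node)
    return [sorted(g) for g in groups.values() if len(g) > 1]

adj = {
    "a": {"c", "d"}, "b": {"c", "d"},   # a, b share neighbors -> one module
    "c": {"a", "b"}, "d": {"a", "b"},   # c, d likewise
}
print(group_identical_neighbors(adj))   # [['a', 'b'], ['c', 'd']]
```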
DART-MS: A New Analytical Technique for Forensic Paint Analysis.
Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice
2018-06-05
Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation affords a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded from this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and software that may be used are covered, along with special cases for meta-analysis (subgroup analysis, individual patient data, and meta-regression) and a discussion of criticisms.
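As a worked example of the computations such a commentary walks through, the sketch below pools invented effect sizes by inverse-variance weighting with a DerSimonian-Laird estimate of between-study variance and reports I-squared heterogeneity.

```python
# Random-effects meta-analysis: Cochran's Q, DerSimonian-Laird tau^2,
# pooled effect with 95% interval, and I^2. Effect sizes and variances
# are invented.
import numpy as np

y = np.array([0.60, 0.05, 0.45, 0.25])      # per-study effect sizes
v = np.array([0.02, 0.03, 0.05, 0.01])      # per-study sampling variances

w = 1 / v                                   # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)   # Cochran's Q
df = len(y) - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)               # DerSimonian-Laird tau^2

w_re = 1 / (v + tau2)                       # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
i2 = max(0.0, (q - df) / q) * 100           # I^2 heterogeneity, percent

print(f"pooled = {pooled:.3f} +/- {1.96*se:.3f}, I^2 = {i2:.0f}%")
```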
Comparison of point intercept and image analysis for monitoring rangeland transects
USDA-ARS?s Scientific Manuscript database
Amidst increasing workloads and static or declining budgets, both public and private land management agencies face the need to adapt resource-monitoring techniques or risk falling behind on resource monitoring volume and quality with old techniques. Image analysis of nadir plot images, acquired with...
DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES
A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...
Secondary Analysis of Qualitative Data.
ERIC Educational Resources Information Center
Turner, Paul D.
The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…
Code of Federal Regulations, 2012 CFR
2012-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
Code of Federal Regulations, 2010 CFR
2010-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
Code of Federal Regulations, 2011 CFR
2011-10-01
... licensed medical professional, for a billed item or service identified by data analysis techniques or probe... rate based on the results of a probe review prior to the initiation of complex medical review. Medical... licensed medical professional, for a billed item or service identified by data analysis techniques or probe...
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
Developing Scenarios: Linking Environmental Scanning and Strategic Planning.
ERIC Educational Resources Information Center
Whiteley, Meredith A.; And Others
1990-01-01
The multiple scenario analysis technique for organizational planning used by multinational corporations is adaptable for colleges and universities. Arizona State University launched a futures-based planning project using the Delphi technique and cross-impact analysis to produce three alternative scenarios (stable, turbulent, and chaotic) to expand…
Fourier Spectroscopy: A Simple Analysis Technique
ERIC Educational Resources Information Center
Oelfke, William C.
1975-01-01
Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
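The point-by-point integration the article describes translates directly into a discrete cosine-transform sum; the interferogram below is synthetic (two cosine components), so the recovered spectrum shows two peaks.

```python
# Numerical version of the manual technique: approximate the cosine Fourier
# transform of a sampled interferogram by summing point by point at each
# trial wavenumber.
import numpy as np

dx = 0.01                                  # path-difference step
x = np.arange(-5, 5, dx)                   # optical path difference
interferogram = np.cos(2*np.pi*2.0*x) + 0.5*np.cos(2*np.pi*3.5*x)

nu = np.linspace(0.0, 5.0, 501)            # trial wavenumbers
# point-by-point sum: S(nu) = sum_i I(x_i) * cos(2*pi*nu*x_i) * dx
spectrum = np.array([np.sum(interferogram * np.cos(2*np.pi*f*x)) * dx
                     for f in nu])
print(nu[np.argmax(spectrum)])             # ~2.0; a second peak sits near 3.5
```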
NASA Astrophysics Data System (ADS)
Sait, Abdulrahman S.
This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the gear tooth profile of a spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery, including engines, bearings, and gearboxes, are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing the angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domain using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders to gear fault detection is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
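A hedged sketch of the IAS extraction at the heart of this approach: with N pulses per revolution, each inter-pulse interval dt spans 2*pi/N radians, so IAS = (2*pi/N)/dt. The pulse train below is simulated with a once-per-revolution speed dip of the kind a damaged tooth would produce.

```python
# Instantaneous angular speed from encoder pulse timestamps. The shaft
# speed dips briefly once per revolution; the IAS trace reveals both the
# dip and its angular position.
import numpy as np

N = 1024                                      # encoder pulses per revolution
base_speed = 2 * np.pi * 25                   # 25 Hz nominal shaft speed

angles = np.arange(0, 10 * 2 * np.pi, 2 * np.pi / N)   # pulse angles, 10 revs
dip = np.abs((angles % (2 * np.pi)) - np.pi) < 0.05    # fault near pi rad
speed = base_speed * (1 - 0.02 * dip)                   # 2% speed dip
timestamps = np.cumsum((2 * np.pi / N) / speed)         # pulse arrival times

dt = np.diff(timestamps)
ias = (2 * np.pi / N) / dt                    # rad/s between pulse pairs
print(ias.min(), ias.max())                   # the dip marks the fault angle
```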
Kernel analysis in TeV gamma-ray selection
NASA Astrophysics Data System (ADS)
Moriarty, P.; Samuelson, F. W.
2000-06-01
We discuss the use of kernel analysis as a technique for selecting gamma-ray candidates in Atmospheric Cherenkov astronomy. The method is applied to observations of the Crab Nebula and Markarian 501 recorded with the Whipple 10 m Atmospheric Cherenkov imaging system, and the results are compared with the standard Supercuts analysis. Since kernel analysis is computationally intensive, we examine approaches to reducing the computational load. Extension of the technique to estimate the energy of the gamma-ray primary is considered.
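A minimal kernel-selection sketch in the spirit of the method (not the authors' exact implementation): estimate Gaussian-kernel densities of image parameters for simulated gamma-ray and hadron training sets, then cut on the density ratio.

```python
# Kernel (Parzen window) selection: keep events whose gamma-to-hadron
# density ratio exceeds a cut. Training distributions are simulated
# stand-ins for Hillas image parameters (width, length).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
gammas = rng.normal([0.10, 0.25], 0.02, (500, 2)).T     # rows: dimensions
hadrons = rng.normal([0.16, 0.35], 0.05, (500, 2)).T

kde_g = gaussian_kde(gammas)
kde_h = gaussian_kde(hadrons)

events = rng.normal([0.11, 0.26], 0.04, (10, 2)).T
ratio = kde_g(events) / kde_h(events)
selected = ratio > 1.0                # candidate gamma rays
print(selected)
```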
Qualitative computer aided evaluation of dental impressions in vivo.
Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H
2006-01-01
Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase, and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin and the mantle and occlusal surfaces). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis regarding the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
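The simplest metamodel in the survey, a quadratic response surface, fits in a few lines; the "expensive analysis" below is a stand-in function and the design points are random rather than a formal experimental design.

```python
# Quadratic response-surface metamodel: fit a full quadratic in two design
# variables by least squares, then evaluate the cheap surrogate instead of
# the expensive code.
import numpy as np

def expensive_analysis(x1, x2):               # placeholder for a real code
    return (x1 - 0.3) ** 2 + 2 * (x2 + 0.1) ** 2 + 0.5 * x1 * x2

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, (15, 2))               # sampled design points
y = expensive_analysis(X[:, 0], X[:, 1])

# Design matrix for a full quadratic: 1, x1, x2, x1^2, x2^2, x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(p1, p2):
    return coef @ np.array([1.0, p1, p2, p1**2, p2**2, p1 * p2])

print(surrogate(0.3, -0.1), expensive_analysis(0.3, -0.1))
```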
Group decision-making techniques for natural resource management applications
Coughlan, Beth A.K.; Armour, Carl L.
1992-01-01
This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study and is applicable to natural resource management issues.
The edge of chaos: A nonlinear view of psychoanalytic technique.
Galatzer-Levy, Robert M
2016-04-01
The field of nonlinear dynamics (or chaos theory) provides ways to expand concepts of psychoanalytic process that have implications for the technique of psychoanalysis. This paper describes how concepts of "the edge of chaos," emergence, attractors, and coupled oscillators can help shape analytic technique, resulting in an approach to doing analysis which is at the same time freer and more firmly based in an enlarged understanding of the ways in which psychoanalysis works than some current recommendations about technique. Illustrations from a lengthy analysis of an analysand with obsessive-compulsive disorder show this approach in action. Copyright © 2016 Institute of Psychoanalysis.
Automated quantification of the synchrogram by recurrence plot analysis.
Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart
2012-04-01
Recently, the concept of phase synchronization of two weakly coupled oscillators has raised great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
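The recurrence-plot core of the proposed technique can be sketched directly: mark time pairs whose phases fall within a radius eps of each other and summarize the resulting matrix; the phase series here is synthetic.

```python
# Recurrence plot of a (synthetic) phase series: circular distance between
# all time pairs, thresholded at eps, then the simplest recurrence
# quantification (recurrence rate).
import numpy as np

rng = np.random.default_rng(7)
phase = np.mod(0.4 * np.arange(300) + 0.1 * rng.normal(size=300), 2 * np.pi)

# Distance on the circle so that phases 0 and 2*pi count as close
d = np.abs(phase[:, None] - phase[None, :])
d = np.minimum(d, 2 * np.pi - d)

eps = 0.3
R = (d < eps).astype(int)        # recurrence matrix: 1 = recurrent pair
recurrence_rate = R.mean()
print(recurrence_rate)
```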
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2016-02-01
Multiresolution analysis techniques including continuous wavelet transform, empirical mode decomposition, and variational mode decomposition are tested in the context of next-day interest rate variation prediction. In particular, multiresolution analysis techniques are used to decompose the actual interest rate variation, and a feedforward neural network is used for training and prediction. The particle swarm optimization technique is adopted to optimize its initial weights. For comparison purposes, an autoregressive moving average model, a random walk process, and the naive model are used as the main reference models. In order to show the feasibility of the presented hybrid models that combine multiresolution analysis techniques and a feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates, including Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-month, 6-month and 1-year treasury bills, and the effective federal fund rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations as they provide good forecasting performance.
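A hedged sketch of the multiresolution idea, assuming the PyWavelets package: split a synthetic rate series into wavelet components, forecast each with a plain least-squares autoregression (standing in for the paper's PSO-tuned feedforward network), and sum the component forecasts.

```python
# Decompose-forecast-recombine: wavelet components of the series are each
# forecast one step ahead, and the component forecasts are summed.
import numpy as np
import pywt

rng = np.random.default_rng(8)
series = np.cumsum(rng.normal(0, 0.01, 256))     # synthetic daily variations
coeffs = pywt.wavedec(series, "db4", level=3)

# Reconstruct each resolution level as a time-domain component; by
# linearity the components sum back to the original series.
components = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(kept, "db4")[: len(series)])

def ar_forecast(x, p=3):
    """One-step forecast by order-p least-squares autoregression."""
    X = np.column_stack([x[i : len(x) - p + i] for i in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return x[-p:] @ a

forecast = sum(ar_forecast(c) for c in components)
print(forecast)                                   # next-day value estimate
```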
Novel near-infrared sampling apparatus for single kernel analysis of oil content in maize.
Janni, James; Weinstock, B André; Hagen, Lisa; Wright, Steve
2008-04-01
A method of rapid, nondestructive chemical and physical analysis of individual maize (Zea mays L.) kernels is needed for the development of high value food, feed, and fuel traits. Near-infrared (NIR) spectroscopy offers a robust nondestructive method of trait determination. However, traditional NIR bulk sampling techniques cannot be applied successfully to individual kernels. Obtaining optimized single kernel NIR spectra for applied chemometric predictive analysis requires a novel sampling technique that can account for the heterogeneous forms, morphologies, and opacities exhibited in individual maize kernels. In this study such a novel technique is described and compared to less effective means of single kernel NIR analysis. Results of the application of a partial least squares (PLS) derived model for predictive determination of percent oil content per individual kernel are shown.
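The chemometric step the study relies on, a PLS model mapping NIR spectra to percent oil, can be sketched with scikit-learn on synthetic spectra; the spectral model below is invented for illustration.

```python
# PLS calibration: synthetic single-kernel "spectra" whose intensity along
# an oil-related band shape scales with oil content, plus noise.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
n_kernels, n_wavelengths = 120, 200
oil = rng.uniform(3, 12, n_kernels)                  # percent oil per kernel
basis = rng.normal(size=n_wavelengths)               # oil-related band shape
spectra = (oil[:, None] * basis[None, :]
           + rng.normal(0, 2.0, (n_kernels, n_wavelengths)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, oil, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
print(np.corrcoef(pred, y_te)[0, 1])                 # calibration quality
```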
Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K
2008-06-01
The applicability of a new mass spectrometric technique, DART (direct analysis in real time) has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The confirmation of the structures of the identified compounds was made through their accurate molecular formula determinations. This is the first report of the application of DART technique for the characterization of compounds that are expressed in the hairy root cultures of Rauvolfia serpentina. Moreover, this also constitutes the first report of expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.
Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa
2014-11-06
Comprehensive analysis of the post-translational modifications (PTMs) on proteins at the proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategies, the identification of protein PTMs has made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduce the recent progress of PTM enrichment methods for the analysis of several major PTMs, including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarize the challenges for PTM enrichment. Finally, we introduce the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
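As a hedged companion to the discriminant-analysis step, the sketch below classifies two simulated flour types from four-mineral profiles with scikit-learn's linear discriminant analysis; the concentration ranges are invented, with magnesium and sodium made to separate the classes as the study reports.

```python
# LDA on simulated mineral profiles for two flour types. Columns are
# Na, K, Ca, Mg with invented means and spreads.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(10)
n = 40   # samples per flour type
cavendish = rng.normal([120, 800, 30, 90], [10, 50, 5, 8], (n, 4))
dream = rng.normal([150, 820, 32, 60], [10, 50, 5, 8], (n, 4))

X = np.vstack([cavendish, dream])
y = np.array([0] * n + [1] * n)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print(lda.score(X, y))          # resubstitution accuracy (~1.0 here)
```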
Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...
2015-09-10
We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
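The displacement-measurement core of DIC can be sketched with subpixel phase cross-correlation, assuming scikit-image is available; real HRTEM-DIC tracks many subsets to build a full strain field, and the library's sign convention is noted in the comment.

```python
# Register a known-shifted subset against a reference subset by subpixel
# phase cross-correlation; one subset pair stands in for a full DIC grid.
import numpy as np
from scipy.ndimage import shift as subpixel_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(11)
reference = rng.normal(size=(64, 64))                 # speckle-like subset
deformed = subpixel_shift(reference, (0.6, -1.3))     # known displacement

measured, error, _ = phase_cross_correlation(reference, deformed,
                                             upsample_factor=100)
# skimage returns the shift that registers the moving image onto the
# reference, so the expected result here is approximately (-0.6, 1.3)
print(measured)
```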
Fundamentals of Biomolecule Analysis by Electrospray Ionization Mass Spectrometry
ERIC Educational Resources Information Center
Weinecke, Andrea; Ryzhov, Victor
2005-01-01
Electrospray ionization (ESI) is a soft ionization technique that allows transfer of fragile biomolecules directly from solution into the gas phase. An instrumental analysis laboratory experiment is designed that would introduce the students to the ESI technique, major parameters of the ion trap mass spectrometers and some caveats in…
Science in Drama: Using Television Programmes to Teach Concepts and Techniques
ERIC Educational Resources Information Center
Rutter, Gordon
2011-01-01
By using a specific episode of the popular television cartoon series "The Simpsons," a range of techniques can be communicated, including microscope setup and use, simple chemical analysis, observation, and interpretation. Knowledge of blood groups and typing, morphological comparison of hair samples, fingerprint analysis, and DNA fingerprinting…
Pressure-Assisted Chelating Extraction as a Teaching Tool in Instrumental Analysis
ERIC Educational Resources Information Center
Sadik, Omowunmi A.; Wanekaya, Adam K.; Yevgeny, Gelfand
2004-01-01
A novel instrumental-digestion technique using pressure-assisted chelating extraction (PACE) for the undergraduate laboratory is reported. This procedure is used for exposing students to safe sample-preparation techniques, for correlating wet-chemical methods with modern instrumental analysis, and for comparing the performance of PACE with conventional…
Decision Analysis Techniques for Adult Learners: Application to Leadership
ERIC Educational Resources Information Center
Toosi, Farah
2017-01-01
Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…
Developing a Systematic Patent Search Training Program
ERIC Educational Resources Information Center
Zhang, Li
2009-01-01
This study aims to develop a systematic patent training program using patent analysis and citation analysis techniques applied to patents held by the University of Saskatchewan. The results indicate that the target audience will be researchers in life sciences, and aggregated patent database searching and advanced search techniques should be…
Fourier Analysis and the Rhythm of Conversation.
ERIC Educational Resources Information Center
Dabbs, James M., Jr.
Fourier analysis, a common technique in engineering, breaks down a complex wave form into its simple sine wave components. Communication researchers have recently suggested that this technique may provide an index of the rhythm of conversation, since vocalizing and pausing produce a complex wave form pattern of alternation between two speakers. To…
USDA-ARS?s Scientific Manuscript database
Ambient desorption ionization techniques, such as laser desorption with electrospray ionization assistance (ELDI), direct analysis in real time (DART) and desorption electrospray ionization (DESI) have been developed as alternatives to traditional mass spectrometric-based methods. Such techniques al...
A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.
ERIC Educational Resources Information Center
Mayberry, Paul W.
A technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure comparing multivariate factor structures across various subpopulations, often referred to as SIFASP. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…
The creation of chiral chromatography techniques significantly advanced the development of methods for the analysis of individual enantiomers of chiral compounds. These techniques are being employed at the US EPA for human exposure and ecological research studies with indoor samp...
Regression Commonality Analysis: A Technique for Quantitative Theory Building
ERIC Educational Resources Information Center
Nimon, Kim; Reio, Thomas G., Jr.
2011-01-01
When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
Graphical Representation of University Image: A Correspondence Analysis.
ERIC Educational Resources Information Center
Yavas, Ugar; Shemwell, Donald J.
1996-01-01
Correspondence analysis, an easy-to-interpret interdependence technique, portrays data graphically to show associations of factors more clearly. A study used the technique with 58 students in one university to determine factors in college choice. Results identified the institution's closest competitors and its positioning in terms of college…
Analysis techniques for multivariate root loci. [a tool in linear control systems
NASA Technical Reports Server (NTRS)
Thompson, P. M.; Stein, G.; Laub, A. J.
1980-01-01
Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
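A far simpler gain-sweep sketch conveys what a multivariable root locus is (invented matrices; the paper's generalized-eigenvalue algorithm is not reproduced here):
    import numpy as np
    A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # state matrix
    B = np.array([[0.0], [1.0]])               # input matrix
    C = np.array([[1.0, 0.0]])                 # output matrix
    for k in (0.1, 1.0, 10.0, 100.0):          # feedback gain sweep
        poles = np.linalg.eigvals(A - k * (B @ C))
        print(f"k={k:7.1f}  poles={np.round(poles, 3)}")
The loci are the paths these closed-loop eigenvalues trace as the gain varies.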
48 CFR 215.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
… reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494…] 48 CFR 215.404-1 (2010-10-01), Federal Acquisition Regulations System, Defense Acquisition: Proposal analysis techniques.
Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques
NASA Technical Reports Server (NTRS)
McDonald, G.; Storrie-Lombardi, M.; Nealson, K.
1999-01-01
The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
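For reference, PCA itself reduces to an SVD of the centered data matrix; a minimal sketch with simulated compositions (not the study's data):
    import numpy as np
    rng = np.random.default_rng(1)
    X = rng.random((10, 6))                    # 10 samples x 6 amino acids
    X = X / X.sum(axis=1, keepdims=True)       # relative abundances
    Xc = X - X.mean(axis=0)                    # center columns
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:2].T                     # projection on first two PCs
    print(s**2 / (s**2).sum(), scores[:3])     # explained variance, scores
Statistically distinct groups of samples then appear as separated clusters in the score plot.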
ERIC Educational Resources Information Center
Misra, Anjali; Schloss, Patrick J.
1989-01-01
The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…
USDA-ARS?s Scientific Manuscript database
Nondestructive methods based on fluorescence hyperspectral imaging (HSI) techniques were developed in order to detect worms on fresh-cut lettuce. The optimal wavebands for detecting worms on fresh-cut lettuce were investigated using one-way ANOVA and correlation analysis. The worm detec...
Application of a substructuring technique to the problem of crack extension and closure
NASA Technical Reports Server (NTRS)
Armen, H., Jr.
1974-01-01
A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions - crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a problem of a center-crack panel to demonstrate the efficiency and accuracy of the technique.
Killgore, George; Thompson, Angela; Johnson, Stuart; Brazier, Jon; Kuijper, Ed; Pepin, Jacques; Frost, Eric H; Savelkoul, Paul; Nicholson, Brad; van den Berg, Renate J; Kato, Haru; Sambol, Susan P; Zukowski, Walter; Woods, Christopher; Limbago, Brandi; Gerding, Dale N; McDonald, L Clifford
2008-02-01
Using 42 isolates contributed by laboratories in Canada, The Netherlands, the United Kingdom, and the United States, we compared the results of analyses done with seven Clostridium difficile typing techniques: multilocus variable-number tandem-repeat analysis (MLVA), amplified fragment length polymorphism (AFLP), surface layer protein A gene sequence typing (slpAST), PCR-ribotyping, restriction endonuclease analysis (REA), multilocus sequence typing (MLST), and pulsed-field gel electrophoresis (PFGE). We assessed the discriminating ability and typeability of each technique as well as the agreement among techniques in grouping isolates by allele profile A (AP-A) through AP-F, which are defined by toxinotype, the presence of the binary toxin gene, and deletion in the tcdC gene. We found that all isolates were typeable by all techniques and that discrimination index scores for the techniques tested ranged from 0.964 to 0.631 in the following order: MLVA, REA, PFGE, slpAST, PCR-ribotyping, MLST, and AFLP. All the techniques were able to distinguish the current epidemic strain of C. difficile (BI/027/NAP1) from other strains. All of the techniques showed multiple types for AP-A (toxinotype 0, binary toxin negative, and no tcdC gene deletion). REA, slpAST, MLST, and PCR-ribotyping all included AP-B (toxinotype III, binary toxin positive, and an 18-bp deletion in tcdC) in a single group that excluded other APs. PFGE, AFLP, and MLVA grouped two, one, and two different non-AP-B isolates, respectively, with their AP-B isolates. All techniques appear to be capable of detecting outbreak strains, but only REA and MLVA showed sufficient discrimination to distinguish strains from different outbreaks.
Song, Dawei; Meng, Bin; Gan, Minfeng; Niu, Junjie; Li, Shiyan; Chen, Hao; Yuan, Chenxi; Yang, Huilin
2015-08-01
Percutaneous vertebroplasty (PVP) and balloon kyphoplasty (BKP) are minimally invasive and effective vertebral augmentation techniques for managing osteoporotic vertebral compression fractures (OVCFs). Recent meta-analyses have compared the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques or conservative treatment; however, the inclusions were not thorough and rigorous enough, and the effects of each technique on the incidence of secondary vertebral fractures remain unclear. To perform an updated systematic review and meta-analysis of the studies with more rigorous inclusion criteria on the effects of vertebral augmentation techniques and conservative treatment for OVCF on the incidence of secondary vertebral fractures. PubMed, MEDLINE, EMBASE, SpringerLink, Web of Science, and the Cochrane Library database were searched for relevant original articles comparing the incidence of secondary vertebral fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. Randomized controlled trials (RCTs) and prospective non-randomized controlled trials (NRCTs) were identified. The methodological qualities of the studies were evaluated, relevant data were extracted and recorded, and an appropriate meta-analysis was conducted. A total of 13 articles were included. The pooled results from included studies showed no statistically significant differences in the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques and conservative treatment. Subgroup analysis comparing different study designs, durations of symptoms, follow-up times, races of patients, and techniques were conducted, and no significant differences in the incidence of secondary fractures were identified (P > 0.05). No obvious publication bias was detected by either Begg's test (P = 0.360 > 0.05) or Egger's test (P = 0.373 > 0.05). Despite current thinking in the field that vertebral augmentation procedures may increase the incidence of secondary fractures, we found no differences in the incidence of secondary fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. © The Foundation Acta Radiologica 2014.
The report describes a new technique for sulfur forms analysis based on low-temperature oxygen plasma ashing. The technique involves analyzing the low-temperature plasma ash by modified ASTM techniques after selectively removing the organic material. The procedure has been tested...
Use of satellite images in the evaluation of farmlands. [in Mexico
NASA Technical Reports Server (NTRS)
Lozano H., A. E.
1978-01-01
Remote sensing techniques in the evaluation of farmland in Mexico are discussed. Electronic analysis techniques and photointerpretation techniques are analyzed. Characteristics of the basic crops in Mexico as related to remote sensing are described.
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate the performance of replicated distributed databases with both shared and exclusive locks.
1990-01-01
Various techniques were used to decipher the sedimentation history of Site 765, including Markov chain analysis of facies transitions, XRD analysis of clay and other minerals, and multivariate analysis of smear-slide data, in addition to the standard descriptive procedures employed by the shipboard sedimentologist. This chapter presents brief summaries of methodology and major findings of these three techniques, a summary of the sedimentation history, and a discussion of trends in sedimentation through time.
Automated Sneak Circuit Analysis Technique
1990-06-01
Automated Sneak Circuit Analysis Technique. RADC, Systems Reliability & Engineering Division, Rome Air Development Center, June 1990. (Scanned-report fragment; the recoverable requirements state that circuit ports are defined via the OrCAD/SDT module Port facility, and that the terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port…)
Stochastic Game Analysis and Latency Awareness for Self-Adaptation
2014-01-01
In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the… Key words and phrases: proactive adaptation, stochastic multiplayer games, latency. The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to…
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that the informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
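The statistical core of the estimation step can be sketched in a few lines: sample inputs, count how often the target event is reached, and form a Beta posterior (toy stand-in program; path pruning and symbolic execution itself are not modeled):
    import numpy as np
    from scipy.stats import beta
    rng = np.random.default_rng(2)
    def program(x):            # stand-in for the program under analysis
        return x * x > 0.8     # "target event" region
    samples = rng.uniform(-1, 1, size=5000)
    hits = int(sum(program(x) for x in samples))
    post = beta(hits + 1, len(samples) - hits + 1)   # uniform prior
    print(post.mean(), post.interval(0.95))          # estimate, 95% interval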
Niu, Guanghui; Shi, Qi; Xu, Mingjun; Lai, Hongjun; Lin, Qingyu; Liu, Kunping; Duan, Yixiang
2015-10-01
In this article, a novel and alternative method of laser-induced breakdown spectroscopy (LIBS) analysis for liquid samples is proposed, which involves the removal of metal ions from a liquid to a solid substrate using a cost-efficient adsorbent, dehydrated carbon, obtained using a dehydration reaction. Using this new technique, researchers can detect trace metal ions in solutions qualitatively and quantitatively, and the drawbacks of performing liquid analysis using LIBS can be avoided because the analysis is performed on a solid surface. To achieve better performance using this technique, we considered parameters potentially influencing both adsorption performance and LIBS analysis. The calibration curves were evaluated, and the limits of detection obtained for Cu(2+), Pb(2+), and Cr(3+) were 0.77, 0.065, and 0.46 mg/L, respectively, which are better than those reported in previous studies. In addition, compared to other adsorbents, the adsorbent used in this technique is much cheaper, easier to obtain, and contains few or no elements other than C, H, and O that could cause spectral interference during analysis. We also used the recommended method to analyze spiked samples, obtaining satisfactory results. Thus, this new technique is helpful and promising for use in wastewater analysis and management.
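The calibration step follows the usual pattern; a sketch with made-up intensities (the paper's actual spectra are not used) and a 3-sigma limit of detection:
    import numpy as np
    conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])           # standards, mg/L
    intensity = np.array([12.0, 60.0, 110.0, 205.0, 512.0])
    slope, intercept = np.polyfit(conc, intensity, 1)    # linear calibration
    sigma_blank = 3.0                                    # blank signal std. dev.
    print("LOD =", 3 * sigma_blank / slope, "mg/L")      # 3-sigma criterion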
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose high-precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
Why bundled payments could drive innovation: an example from interventional oncology.
Steele, Joseph R; Jones, A Kyle; Ninan, Elizabeth P; Clarke, Ryan K; Odisio, Bruno C; Avritscher, Rony; Murthy, Ravi; Mahvash, Armeen
2015-03-01
Some have suggested that the current fee-for-service health care payment system in the United States stifles innovation. However, there are few published examples supporting this concept. We implemented an innovative temporary balloon occlusion technique for yttrium 90 radioembolization of nonresectable liver cancer. Although our balloon occlusion technique was associated with similar patient outcomes, lower cost, and faster procedure times compared with the standard-of-care coil embolization technique, our technique failed to gain widespread acceptance. Financial analysis revealed that because the balloon occlusion technique avoided a procedural step associated with a lucrative Current Procedural Terminology billing code, this new technique resulted in a significant decrease in hospital and physician revenue in the current fee-for-service payment system, even though the new technique would provide a revenue enhancement through cost savings in a bundled payment system. Our analysis illustrates how in a fee-for-service payment system, financial disincentives can stifle innovation and advancement of health care delivery. Copyright © 2015 by American Society of Clinical Oncology.
An Introduction to Markov Modeling: Concepts and Uses
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Lau, Sonie (Technical Monitor)
1998-01-01
Markov modeling is a modeling technique that is widely useful for dependability analysis of complex fault tolerant systems. It is very flexible in the type of systems and system behavior it can model. It is not, however, the most appropriate modeling technique for every modeling situation. The first task in obtaining a reliability or availability estimate for a system is selecting which modeling technique is most appropriate to the situation at hand. A person performing a dependability analysis must confront the question: is Markov modeling most appropriate to the system under consideration, or should another technique be used instead? The need to answer this gives rise to other more basic questions regarding Markov modeling: what are the capabilities and limitations of Markov modeling as a modeling technique? How does it relate to other modeling techniques? What kind of system behavior can it model? What kinds of software tools are available for performing dependability analyses with Markov modeling techniques? These questions and others will be addressed in this tutorial.
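A minimal example of the kind of model the tutorial covers: a one-component system with failure rate lam and repair rate mu, whose transient availability follows from the matrix exponential of the generator (rates invented):
    import numpy as np
    from scipy.linalg import expm
    lam, mu = 1e-3, 1e-1            # failures/hour, repairs/hour
    Q = np.array([[-lam, lam],      # state 0 = up, state 1 = down
                  [mu, -mu]])
    p0 = np.array([1.0, 0.0])       # start in the up state
    for t in (10.0, 100.0, 1000.0):
        print(t, (p0 @ expm(Q * t))[0])      # availability at time t
    print("steady state:", mu / (lam + mu))
Larger fault-tolerant systems differ only in having more states and a bigger generator matrix.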
Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Schultz, Marc R.
2012-01-01
Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
NASA Technical Reports Server (NTRS)
1996-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.
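The closed-form idea reduces to moving the displacement-dependent term to the left-hand side: if the applied load is F0 + A u, equilibrium K u = F0 + A u becomes (K - A) u = F0, solvable in one step. A tiny sketch with invented matrices (the inertia-relief bookkeeping for the free-free case is omitted):
    import numpy as np
    K = np.array([[4.0, -2.0], [-2.0, 3.0]])   # stiffness matrix
    A = np.array([[0.5, 0.0], [0.0, 0.3]])     # load sensitivity to displacement
    F0 = np.array([1.0, 0.0])                  # displacement-independent load
    u = np.linalg.solve(K - A, F0)             # direct, non-iterative solution
    print(u, K @ u - (F0 + A @ u))             # equilibrium residual ~ 0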
Steam generator tubing NDE performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, G.; Welty, C.S. Jr.
1997-02-01
Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g., system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.
Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.
Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris
2017-03-09
Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively.
Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang
2016-01-01
This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954
Yang, Jun-Ho; Yoh, Jack J
2018-01-01
A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real time analysis and high frequency scanning as well as the data regarding the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints by implementing appropriate statistical multivariate analysis. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clegg, Samuel M; Barefield, James E; Wiens, Roger C
2008-01-01
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
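The PLS calibration step maps spectra to concentrations with a single regression model; a sketch with simulated spectra (not the 18 rock samples) using scikit-learn:
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    rng = np.random.default_rng(3)
    spectra = rng.random((18, 200))                # 18 samples x 200 channels
    conc = spectra[:, 40] * 5 + rng.normal(scale=0.1, size=18)  # synthetic truth
    pls = PLSRegression(n_components=5).fit(spectra, conc)
    print(pls.predict(spectra[:3]))                # predicted concentrations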
Using sentiment analysis to review patient satisfaction data located on the internet.
Hopper, Anthony M; Uriyo, Maria
2015-01-01
The purpose of this paper is to test the usefulness of sentiment analysis and time-to-next-complaint methods in quantifying text-based information located on the internet. As important, the authors demonstrate how managers can use time-to-next-complaint techniques to organize sentiment analysis derived data into useful information, which can be shared with doctors and other staff. The authors used sentiment analysis to review patient feedback for a select group of gynecologists in Virginia. The authors utilized time-to-next-complaint methods along with other techniques to organize this data into meaningful information. The authors demonstrated that sentiment analysis and time-to-next-complaint techniques might be useful tools for healthcare managers who are interested in transforming web-based text into meaningful, quantifiable information. This study has several limitations. For one thing, neither the data set nor the techniques the authors used to analyze it will account for biases that resulted from selection issues related to gender, income, and culture, as well as from other socio-demographic concerns. Additionally, the authors lacked key data concerning patient volumes for the targeted physicians. Finally, it may be difficult to convince doctors to consider web-based comments as truthful, thereby preventing healthcare managers from using data located on the internet. The report illustrates some of the ways in which healthcare administrators can utilize sentiment analysis, along with time-to-next-complaint techniques, to mine web-based, patient comments for meaningful information. The paper is one of the first to illustrate ways in which administrators at clinics and physicians' offices can utilize sentiment analysis and time-to-next-complaint methods to analyze web-based patient comments.
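A toy version of the two steps, with an invented lexicon and comments (the authors' actual scoring tool is not specified here): score each dated comment, then compute the gaps between negative ones:
    from datetime import date
    NEG = {"rude", "late", "dirty"}
    POS = {"kind", "helpful", "quick"}
    def score(text):
        words = text.lower().split()
        return sum(w in POS for w in words) - sum(w in NEG for w in words)
    comments = [(date(2014, 1, 5), "staff was kind and quick"),
                (date(2014, 2, 9), "doctor was rude and late"),
                (date(2014, 3, 2), "waiting room was dirty")]
    complaints = [d for d, txt in comments if score(txt) < 0]
    gaps = [(b - a).days for a, b in zip(complaints, complaints[1:])]
    print("days to next complaint:", gaps)
Shrinking gaps between complaints flag a deteriorating trend that can be shared with doctors and staff.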
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2° with the exception of the Field Centervs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu
2011-01-01
The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights to the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
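The dynamic-programming core of local similarity can be sketched compactly (no replicates or permutation p-values, so this mirrors plain LSA rather than the full eLSA pipeline):
    import numpy as np
    def local_similarity(x, y, max_delay=3):
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        best = 0.0
        for d in range(-max_delay, max_delay + 1):    # time-delay sweep
            xi = x[max(0, d):len(x) + min(0, d)]
            yi = y[max(0, -d):len(y) + min(0, -d)]
            for sign in (1.0, -1.0):                  # +/- associations
                run = 0.0
                for p in sign * xi * yi:
                    run = max(0.0, run + p)           # reset at zero
                    best = max(best, run)
        return best / len(x)
    rng = np.random.default_rng(4)
    a = rng.normal(size=50)
    b = np.roll(a, 2) + 0.3 * rng.normal(size=50)     # b lags a by 2 steps
    print(local_similarity(a, b))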
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for supporting drug approval, formulating clinical protocols and guidelines, and informing decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the comparative profiles (e.g., efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduction, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of network meta-analysis conduction, highlighting its risks and benefits for evidence-based practice, including information on statistical methods evolution, assumptions and steps for performing the analysis. PMID:28503228
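The elementary indirect-comparison step (the Bucher method) behind these models is simple arithmetic: with trials of A vs B and of C vs B, the A vs C effect is the difference of effects, and the variances add (illustrative numbers only):
    import math
    d_ab, se_ab = -0.30, 0.10     # log odds ratio, A vs B
    d_cb, se_cb = -0.10, 0.12     # log odds ratio, C vs B
    d_ac = d_ab - d_cb            # indirect A vs C estimate
    se_ac = math.sqrt(se_ab**2 + se_cb**2)
    print(d_ac, (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac))  # effect, 95% CI
A full network meta-analysis generalizes this to all treatments at once, pooling direct and indirect evidence in one model.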
Peek, Mirjam Cl; Charalampoudis, Petros; Anninga, Bauke; Baker, Rose; Douek, Michael
2017-02-01
The combined technique (radioisotope and blue dye) is the gold standard for sentinel lymph node biopsy (SLNB) and there is wide variation in techniques and blue dyes used. We performed a systematic review and meta-analysis to assess the need for radioisotope and the optimal blue dye for SLNB. A total of 21 studies were included. The SLNB identification rates are high with all the commonly used blue dyes. Furthermore, methylene blue is superior to iso-sulfan blue and Patent Blue V with respect to false-negative rates. The combined technique remains the most accurate and effective technique for SLNB. In order to standardize the SLNB technique, comparative trials to determine the most effective blue dye and national guidelines are required.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decompositions in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
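A toy equivalence query of the kind the decision procedure answers (using the z3-solver Python package, which is an assumption here, not the authors' tool chain): assert that two program versions can disagree, and an unsat answer proves them equivalent:
    from z3 import Int, Solver, unsat
    x = Int("x")
    v1 = x * 2                 # original version
    v2 = x + x                 # refactored version
    s = Solver()
    s.add(v1 != v2)            # search for a distinguishing input
    print("equivalent" if s.check() == unsat else s.model())
Impact summaries shrink such queries by restricting them to the behaviors actually affected by a change.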
LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation
NASA Technical Reports Server (NTRS)
Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.
1997-01-01
This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.
Application of optical correlation techniques to particle imaging velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1988-01-01
Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended 2-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSF) are used in a pattern recognition system. Usually, MSFs are used to identify assembly-line parts; in this application, they are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. A global analysis technique is used in comparison to the more common particle tracking algorithms and Young's fringe analysis technique.
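The digital analogue of this optical correlation is easy to sketch: cross-correlate two frames via the FFT and read the displacement off the correlation peak (synthetic image with a known shift; not the matched-spatial-filter system itself):
    import numpy as np
    rng = np.random.default_rng(5)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))    # known displacement
    corr = np.fft.ifft2(np.fft.fft2(frame1).conj() * np.fft.fft2(frame2)).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    print("estimated displacement:", dy, dx)               # -> 3 5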
Jo, Javier A.; Fang, Qiyin; Marcu, Laura
2007-01-01
We report a new deconvolution method for fluorescence lifetime imaging microscopy (FLIM) based on the Laguerre expansion technique. The performance of this method was tested on synthetic and real FLIM images. The following interesting properties of this technique were demonstrated. 1) The fluorescence intensity decay can be estimated simultaneously for all pixels, without a priori assumption of the decay functional form. 2) The computation speed is extremely fast, performing at least two orders of magnitude faster than current algorithms. 3) The estimated maps of Laguerre expansion coefficients provide a new domain for representing FLIM information. 4) The number of images required for the analysis is relatively small, allowing reduction of the acquisition time. These findings indicate that the developed Laguerre expansion technique for FLIM analysis represents a robust and extremely fast deconvolution method that enables practical applications of FLIM in medicine, biology, biochemistry, and chemistry. PMID:19444338
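A simplified single-decay version of the expansion (no instrument-response deconvolution; the time scale and basis size are invented) shows why the fit is one linear least-squares call:
    import numpy as np
    from scipy.special import eval_laguerre
    t = np.linspace(0, 25, 256)                       # ns
    rng = np.random.default_rng(7)
    decay = np.exp(-t / 3.0) + 0.02 * rng.normal(size=t.size)
    a = 1.0                                           # time-scale parameter
    basis = np.column_stack([np.exp(-a * t / 2) * eval_laguerre(j, a * t)
                             for j in range(5)])      # Laguerre functions
    coeffs, *_ = np.linalg.lstsq(basis, decay, rcond=None)
    print(coeffs, np.abs(decay - basis @ coeffs).max())
Because the coefficients solve a linear system, every pixel of a FLIM image can be fit simultaneously and quickly.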
A constant current charge technique for low Earth orbit life testing
NASA Technical Reports Server (NTRS)
Glueck, Peter
1991-01-01
A constant current charge technique for low earth orbit testing of nickel cadmium cells is presented. The method mimics the familiar taper charge of the constant potential technique while maintaining cell independence for statistical analysis. A detailed example application is provided and the advantages and disadvantages of this technique are discussed.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications namely, gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulation such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
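Of the pieces above, the Kreisselmeier-Steinhauser aggregation is the easiest to make concrete: it collapses many objectives or constraints g_i into one smooth envelope function (a common numerically stable form; rho controls sharpness):
    import numpy as np
    def ks(g, rho=50.0):
        g = np.asarray(g, dtype=float)
        gmax = g.max()
        return gmax + np.log(np.exp(rho * (g - gmax)).sum()) / rho
    print(ks([0.2, -0.1, 0.35]))   # approaches max(g) as rho grows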
Mishra, Gautam; Easton, Christopher D.; McArthur, Sally L.
2009-01-01
Physical and photolithographic techniques are commonly used to create chemical patterns for a range of technologies including cell culture studies, bioarrays and other biomedical applications. In this paper, we describe the fabrication of chemical micropatterns from commonly used plasma polymers. Atomic force microscopy (AFM) imaging, Time-of-Flight Static Secondary Ion Mass Spectrometry (ToF-SSIMS) imaging and multivariate analysis have been employed to visualize the chemical boundaries created by these patterning techniques and assess the spatial and chemical resolution of the patterns. ToF-SSIMS analysis demonstrated that well defined chemical and spatial boundaries were obtained from photolithographic patterning, while the resolution of physical patterning via a transmission electron microscopy (TEM) grid varied depending on the properties of the plasma system including the substrate material. In general, physical masking allowed diffusion of the plasma species below the mask and bleeding of the surface chemistries. Multivariate analysis techniques including Principal Component Analysis (PCA) and Region of Interest (ROI) assessment were used to investigate the ToF-SSIMS images of a range of different plasma polymer patterns. In the most challenging case, where two strongly reacting polymers, allylamine and acrylic acid, were deposited, PCA confirmed the fabrication of micropatterns with defined spatial resolution. ROI analysis allowed for the identification of an interface between the two plasma polymers for patterns fabricated using the photolithographic technique which has been previously overlooked. This study clearly demonstrated the versatility of photolithographic patterning for the production of multichemistry plasma polymer arrays and highlighted the need for complementary characterization and analytical techniques during the fabrication of plasma polymer micropatterns. PMID:19950941
Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin
2018-01-01
Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethyl phosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar and, depending on pH, even ionic functional groups, and its lack of a chromophore, it is difficult to analyze with chromatographic techniques. Its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also derivatization reactions. Its ability to form chelates with metal cations is another obstacle for precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.
Time-Frequency Analyses of Tide-Gauge Sensor Data
Erol, Serdar
2011-01-01
The real world phenomena being observed by sensors are generally non-stationary in nature. The classical linear techniques for analysis and modeling natural time-series observations are inefficient and should be replaced by non-linear techniques of whose theoretical aspects and performances are varied. In this manner adopting the most appropriate technique and strategy is essential in evaluating sensors’ data. In this study, two different time-series analysis approaches, namely least squares spectral analysis (LSSA) and wavelet analysis (continuous wavelet transform, cross wavelet transform and wavelet coherence algorithms as extensions of wavelet analysis), are applied to sea-level observations recorded by tide-gauge sensors, and the advantages and drawbacks of these methods are reviewed. The analyses were carried out using sea-level observations recorded at the Antalya-II and Erdek tide-gauge stations of the Turkish National Sea-Level Monitoring System. In the analyses, the useful information hidden in the noisy signals was detected, and the common features between the two sea-level time series were clarified. The tide-gauge records have data gaps in time because of issues such as instrumental shortcomings and power outages. Concerning the difficulties of the time-frequency analysis of data with voids, the sea-level observations were preprocessed, and the missing parts were predicted using the neural network method prior to the analysis. In conclusion the merits and limitations of the techniques in evaluating non-stationary observations by means of tide-gauge sensors records were documented and an analysis strategy for the sequential sensors observations was presented. PMID:22163829
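Least squares spectral analysis of a gappy record is closely related to the Lomb-Scargle periodogram, which handles uneven sampling directly; a sketch with a synthetic semidiurnal constituent and random gaps (not the Antalya-II or Erdek records):
    import numpy as np
    from scipy.signal import lombscargle
    rng = np.random.default_rng(6)
    t = np.sort(rng.choice(np.arange(0, 720.0, 0.5), size=800, replace=False))
    sea = 0.4 * np.sin(2 * np.pi * t / 12.42) + 0.05 * rng.normal(size=t.size)
    periods = np.linspace(6, 30, 2000)                # hours
    power = lombscargle(t, sea - sea.mean(), 2 * np.pi / periods)
    print("dominant period ~", periods[power.argmax()], "h")
Unlike an FFT, no interpolation over the data gaps is required before the spectrum is computed.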
Chao, T.T.; Sanzolone, R.F.
1992-01-01
Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor for sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods.
Lagrangian analysis of multiscale particulate flows with the particle finite element method
NASA Astrophysics Data System (ADS)
Oñate, Eugenio; Celigueta, Miguel Angel; Latorre, Salvador; Casas, Guillermo; Rossi, Riccardo; Rojek, Jerzy
2014-05-01
We present a Lagrangian numerical technique for the analysis of flows incorporating physical particles of different sizes. The numerical approach is based on the particle finite element method (PFEM), which blends concepts from particle-based techniques and the FEM. The basis of the Lagrangian formulation for particulate flows and the procedure for modelling the motion of small and large particles that are submerged in the fluid are described in detail. The numerical technique for analysis of this type of multiscale particulate flow using a stabilized mixed velocity-pressure formulation and the PFEM is also presented. Examples of the application of the PFEM to several particulate flow problems are given.
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1980-11-01
The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques that are quite different from the methods currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
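As a rough illustration of the approach described (Zernike characterization followed by FFT beam propagation), the sketch below applies a single Zernike defocus term to a beam and propagates it with an FFT-based Fresnel step. The grid, wavelength, and aberration amplitude are illustrative assumptions, not the Los Alamos codes.

```python
# Minimal sketch: a Zernike-style aberration (defocus, Z(2,0)) applied to a
# Gaussian beam, then one FFT-based Fresnel propagation step.
import numpy as np

n, width = 512, 20e-3                          # grid points, physical width (m)
wavelength, z = 10.6e-6, 1.0                   # CO2 laser wavelength (m), distance (m)
x = np.linspace(-width / 2, width / 2, n)
X, Y = np.meshgrid(x, x)
r = np.hypot(X, Y) / (width / 4)               # radius normalized to the aperture

defocus = np.sqrt(3) * (2 * r**2 - 1)          # Zernike Z(2,0) over the unit disk
phase_waves = 0.5 * defocus                    # assumed half-wave defocus amplitude
beam = np.exp(-r**2) * np.exp(1j * 2 * np.pi * phase_waves) * (r <= 1)

fx = np.fft.fftfreq(n, d=width / n)
FX, FY = np.meshgrid(fx, fx)
H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))  # Fresnel transfer function
out = np.fft.ifft2(np.fft.fft2(beam) * H)
print("peak intensity ratio after 1 m:",
      (np.abs(out)**2).max() / (np.abs(beam)**2).max())
```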
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript provides a review of the current state and the most recent advances, as well as current trends and future prospects, in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. The individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures, are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.
An overview of data acquisition, signal coding and data analysis techniques for MST radars
NASA Technical Reports Server (NTRS)
Rastogi, P. K.
1986-01-01
An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are covered, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.
The use of interactive graphic displays for interpretation of surface design parameters
NASA Technical Reports Server (NTRS)
Talcott, N. A., Jr.
1981-01-01
An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Display Data method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.
Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine
2017-01-01
One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution. However, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm. The technique can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to perform a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized. PMID:28447998
Multiscale Image Processing of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.
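To make the multiresolution idea concrete, the sketch below decomposes an image with a 2-D wavelet transform (PyWavelets) and hard-thresholds the detail coefficients, the basic building block of a multiresolution support. The random image and the MAD-based threshold rule are assumptions standing in for real EIT/LASCO/TRACE frames.

```python
# Minimal sketch: 2-D wavelet decomposition and detail thresholding.
import numpy as np
import pywt

rng = np.random.default_rng(1)
image = rng.standard_normal((256, 256))        # placeholder for a solar frame

coeffs = pywt.wavedec2(image, wavelet="db4", level=4)
# coeffs[0] is the coarsest approximation; coeffs[1:] are (cH, cV, cD)
# detail triples from coarse to fine. Hard-threshold small detail
# coefficients using a MAD-based noise estimate:
thresholded = [coeffs[0]] + [
    tuple(pywt.threshold(c, value=3 * np.median(np.abs(c)) / 0.6745, mode="hard")
          for c in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(thresholded, wavelet="db4")
print(denoised.shape)
```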
NASA Technical Reports Server (NTRS)
Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.
2015-01-01
The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
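The final step of the retrieval, integrating mass continuity to recover the unobserved wind component, can be sketched as follows. This is a minimal incompressible version on a synthetic Cartesian grid; the real HIWRAP retrieval works in coplane coordinates with an anelastic (density-weighted) continuity equation and estimated boundary-condition winds.

```python
# Minimal sketch: integrate dw/dz = -(du/dx + dv/dy) upward from w(z=0) = 0.
import numpy as np

nx = ny = 64; nz = 40
dx = dy = 1000.0; dz = 250.0                   # grid spacing (m), assumed
x = np.arange(nx) * dx; y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y, indexing="ij")

# Synthetic horizontal winds, constant with height for simplicity
u = 10.0 + 2.0 * np.sin(2 * np.pi * X / (nx * dx))
v = 5.0 + 2.0 * np.cos(2 * np.pi * Y / (ny * dy))
div = np.gradient(u, dx, axis=0) + np.gradient(v, dy, axis=1)

w = np.zeros((nx, ny, nz))
for k in range(1, nz):                         # upward integration
    w[:, :, k] = w[:, :, k - 1] - div * dz
print("max |w| at top (m/s): %.2f" % np.abs(w[:, :, -1]).max())
```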
NASA Astrophysics Data System (ADS)
Bordovsky, Michal; Catrysse, Peter; Dods, Steven; Freitas, Marcio; Klein, Jackson; Kotacka, Libor; Tzolov, Velko; Uzunov, Ivan M.; Zhang, Jiazong
2004-05-01
We present the state of the art for commercial design and simulation software in the 'front end' of photonic circuit design. One recent advance is to extend the flexibility of the software by using more than one numerical technique on the same optical circuit. There are a number of popular and proven techniques for analysis of photonic devices. Examples of these techniques include the Beam Propagation Method (BPM), the Coupled Mode Theory (CMT), and the Finite Difference Time Domain (FDTD) method. For larger photonic circuits, it may not be practical to analyze the whole circuit by any one of these methods alone, but often some smaller part of the circuit lends itself to at least one of these standard techniques. Later the whole problem can be analyzed on a unified platform. This kind of approach can enable analysis for cases that would otherwise be cumbersome, or even impossible. We demonstrate solutions for more complex structures ranging from the sub-component layout, through the entire device characterization, to the mask layout and its editing. We also present recent advances in the above well established techniques. This includes the analysis of nano-particles, metals, and non-linear materials by FDTD, photonic crystal design and analysis, and improved models for high concentration Er/Yb co-doped glass waveguide amplifiers.
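Of the techniques named, FDTD is the most self-contained to sketch. Below is a minimal 1-D Yee-scheme FDTD loop in normalized units with an additive Gaussian source; the grid size and pulse parameters are assumptions, and production photonic solvers add materials, dispersion, and absorbing boundaries.

```python
# Minimal sketch: 1-D FDTD (Yee scheme) in vacuum, normalized units.
import numpy as np

nz, nt = 400, 800
ez = np.zeros(nz)            # electric field on integer grid points
hy = np.zeros(nz - 1)        # magnetic field, staggered half a cell
for t in range(nt):
    hy += np.diff(ez)                              # update H from curl E
    ez[1:-1] += np.diff(hy)                        # update E from curl H
    ez[nz // 4] += np.exp(-((t - 60) / 20.0) ** 2) # additive Gaussian source
print("field energy proxy:", float(np.sum(ez**2)))
```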
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.
2015-01-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
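The heart of any phase-shifting method is the arctangent recovery of the fringe phase. Below is a minimal sketch of the standard four-step (90 degree) algorithm on synthetic fringes; the carrier frequency and the object phase bump are illustrative assumptions.

```python
# Minimal sketch: four-step phase-shifting recovery of a wrapped phase map.
import numpy as np

x = np.linspace(0, 20 * np.pi, 256)
X, Y = np.meshgrid(x, x)
surface = 2.0 * np.exp(-((X - 30) ** 2 + (Y - 30) ** 2) / 200)  # object phase (rad)
phi_true = X + surface                    # carrier fringes plus object phase

# Four frames shifted by 0, 90, 180, 270 degrees
I1, I2, I3, I4 = (1 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4))
phi_wrapped = np.arctan2(I4 - I2, I1 - I3)          # standard 4-step formula
phi_unwrapped = np.unwrap(np.unwrap(phi_wrapped, axis=1), axis=0)
print("wrapped range: [%.2f, %.2f] rad" % (phi_wrapped.min(), phi_wrapped.max()))
```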
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke
2017-08-01
In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be below 4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the ranges of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-06-01
The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar) considering technical and economic benefits. The objective of the analysis was to develop guidelines for: reducing energy requirements; reducing conventional fuel use; and identifying economic alternatives for building owners. The analysis was done with a solar system in place. This makes the study unique in that it determines the interaction of energy conservation with a solar system. The study, therefore, established guidelines as to how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. The lists of energy-conservation techniques and alternative energy sources were then reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and installed costs determined. The list of alternative energy sources reduced to solar. Building construction characteristics were defined for each building type for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination.
Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda
2014-01-01
In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design.
Flow-based analysis using microfluidics-chemiluminescence systems.
Al Lawati, Haider A J
2013-01-01
This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs).
A Bio Medical Waste Identification and Classification Algorithm Using MLTrP and RVM.
Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi
2016-10-01
We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using the median filtering technique, which efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, the Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. When compared to the existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications such as hospital and healthcare management systems for proper BMW disposal.
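A minimal sketch of the pipeline shape (filter, local-pattern histogram, classifier input) follows. The classical local binary pattern is used here as a stand-in for the proposed MLTrP, and the random image is a placeholder for real BMW photographs; this is not the authors' exact method.

```python
# Minimal sketch: median filtering followed by a local-pattern histogram,
# producing the feature vector a classifier (RVM/SVM) would consume.
import numpy as np
from scipy.ndimage import median_filter
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(128, 128)).astype(float)

filtered = median_filter(image, size=3)              # noise suppression step
lbp = local_binary_pattern(filtered, P=8, R=1, method="uniform")
hist, _ = np.histogram(lbp, bins=int(lbp.max()) + 1, density=True)
print("feature vector length:", hist.size)           # classifier input
```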
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Because of spatial non-uniformity, different locations in an image are of different importance in terms of perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM, based on image structural information; VIF, based on the information that the human brain can ideally gain from the reference image; and FSIM, utilizing low-level features to assign a different importance to each location in the image. Still, none of these objective metrics utilize the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper, the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
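For readers unfamiliar with the metrics named, the sketch below scores a distorted image with SSIM via scikit-image. The images are synthetic placeholders; a ROI-aware variant would weight the SSIM map inside regions of interest more heavily before pooling to a single score.

```python
# Minimal sketch: global SSIM between a reference and a distorted image.
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(3)
reference = rng.random((256, 256))
reconstructed = np.clip(reference + 0.05 * rng.standard_normal(reference.shape), 0, 1)

score = structural_similarity(reference, reconstructed, data_range=1.0)
print("global SSIM: %.3f" % score)
```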
Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K
2018-06-01
Possible advantages and risks associated with ultrasound-guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound-guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables, respectively. Data from 1895 patients in 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001]. No difference was seen in time to cannulation [SMD (95% CI) -0.31 (-0.65, 0.04); p = 0.30] or mean number of attempts [MD (95% CI) -0.65 (-1.32, 0.02); p = 0.06] between the ultrasound guided and palpation techniques. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, the results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity.
Ambient ionisation mass spectrometry for in situ analysis of intact proteins
Kocurek, Klaudia I.; Griffiths, Rian L.
2018-01-01
Ambient surface mass spectrometry is an emerging field which shows great promise for the analysis of biomolecules directly from their biological substrate. In this article, we describe ambient ionisation mass spectrometry techniques for the in situ analysis of intact proteins. As a broad approach, the analysis of intact proteins offers unique advantages for the determination of primary sequence variations and posttranslational modifications, as well as interrogation of tertiary and quaternary structure and protein-protein/ligand interactions. In situ analysis of intact proteins offers the potential to couple these advantages with information relating to their biological environment, for example, their spatial distributions within healthy and diseased tissues. Here, we describe the techniques most commonly applied to in situ protein analysis (liquid extraction surface analysis, continuous flow liquid microjunction surface sampling, nano desorption electrospray ionisation, and desorption electrospray ionisation), their advantages, and limitations and describe their applications to date. We also discuss the incorporation of ion mobility spectrometry techniques (high field asymmetric waveform ion mobility spectrometry and travelling wave ion mobility spectrometry) into ambient workflows. Finally, future directions for the field are discussed. PMID:29607564
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online... other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum (CUSUM) control charts, are discussed in... statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be...
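A minimal sketch of one of the two techniques named, a one-sided tabular CUSUM control chart, follows; the target mean, slack k, and decision threshold h are illustrative assumptions, not the study's parameters.

```python
# Minimal sketch: upper tabular CUSUM detecting an upward mean shift.
import numpy as np

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(0.0, 1.0, 100),      # in-control readings
                       rng.normal(1.5, 1.0, 50)])      # shifted condition
target, k, h = 0.0, 0.5, 5.0                           # assumed chart parameters

s_hi = 0.0
for i, x in enumerate(data):
    s_hi = max(0.0, s_hi + (x - target - k))           # upper CUSUM recursion
    if s_hi > h:
        print("upward shift signalled at sample", i)
        break
```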
Deng, Yuqiang; Yang, Weijian; Zhou, Chun; Wang, Xi; Tao, Jun; Kong, Weipeng; Zhang, Zhigang
2008-12-01
We propose and demonstrate an analysis method to directly extract the group delay rather than the phase from the white-light spectral interferogram. By the joint time-frequency analysis technique, group delay is directly read from the ridge of wavelet transform, and group-delay dispersion is easily obtained by additional differentiation. The technique shows reasonable potential for the characterization of ultra-broadband chirped mirrors.
A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain
2015-05-18
approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our... The project has two parts to it: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a...
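The kernel density method mentioned can be sketched in a few lines with SciPy's Gaussian KDE; the bimodal sample set below is an assumption, not the report's beam-propagation data.

```python
# Minimal sketch: approximate a pdf from samples with a Gaussian KDE.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
samples = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(1, 1.0, 500)])

kde = gaussian_kde(samples)                 # bandwidth chosen by Scott's rule
grid = np.linspace(-5, 5, 200)
pdf = kde(grid)
print("pdf mass on grid ~ %.3f" % (pdf.sum() * (grid[1] - grid[0])))
```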
Comparative Analysis of RF Emission Based Fingerprinting Techniques for ZigBee Device Classification
quantify the differences in various RF fingerprinting techniques via comparative analysis of MDA/ML classification results. The findings herein demonstrate... correct classification rates followed by COR-DNA and then RF-DNA in most test cases, and especially in low Eb/N0 ranges, where ZigBee is designed to operate.
A Flipped Classroom Approach to Teaching Systems Analysis, Design and Implementation
ERIC Educational Resources Information Center
Tanner, Maureen; Scott, Elsje
2015-01-01
This paper describes a flipped classroom approach followed to teach systems analysis, design and implementation at university level. The techniques employed are described. These techniques were underpinned by a theory of coherent practice: a pedagogy that provides a framework for the design of highly structured interventions to guide students in…
Rasch Analysis: A Primer for School Psychology Researchers and Practitioners
ERIC Educational Resources Information Center
Boone, William J.; Noltemeyer, Amity
2017-01-01
In order to progress as a field, school psychology research must be informed by effective measurement techniques. One approach to address the need for careful measurement is Rasch analysis. This technique can (a) facilitate the development of instruments that provide useful data, (b) provide data that can be used confidently for both descriptive…
Sensitivity analysis of hybrid thermoelastic techniques
W.A. Samad; J.M. Considine
2017-01-01
Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...
A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.
ERIC Educational Resources Information Center
Feeney, J. D.
Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
Techniques for the Analysis of Human Movement.
ERIC Educational Resources Information Center
Grieve, D. W.; And Others
This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…
ERIC Educational Resources Information Center
Griffith, James
2002-01-01
Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…
Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
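A minimal sketch of the two conventional strategies the passage mentions, discarding incomplete rows versus imputation before PCA, is given below with scikit-learn; the random data and missingness pattern are illustrative assumptions.

```python
# Minimal sketch: deletion vs. mean imputation ahead of PCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 10))
X[rng.random(X.shape) < 0.1] = np.nan        # ~10% missing entries

complete_only = X[~np.isnan(X).any(axis=1)]                   # strategy 1: discard
X_imputed = SimpleImputer(strategy="mean").fit_transform(X)   # strategy 2: impute

scores = PCA(n_components=2).fit_transform(X_imputed)
print("rows kept by deletion:", complete_only.shape[0], "| score matrix:", scores.shape)
```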
A Comparison of Mean Phase Difference and Generalized Least Squares for Analyzing Single-Case Data
ERIC Educational Resources Information Center
Manolov, Rumen; Solanas, Antonio
2013-01-01
The present study focuses on single-case data analysis specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least square regression analysis and is compared to a proposed non-regression technique, which allows obtaining similar information. The…
Figure Analysis: An Implementation Dialogue
ERIC Educational Resources Information Center
Wiles, Amy M.
2016-01-01
Figure analysis is a novel active learning teaching technique that reinforces visual literacy. Small groups of students discuss diagrams in class in order to learn content. The instructor then gives a brief introduction and later summarizes the content of the figure. This teaching technique can be used in place of lecture as a mechanism to deliver…
A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries
Vijay S. Reddy; Robert J. Bush; Ronen Roudik
1996-01-01
Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...
Multilevel Latent Class Analysis: Parametric and Nonparametric Models
ERIC Educational Resources Information Center
Finch, W. Holmes; French, Brian F.
2014-01-01
Latent class analysis is an analytic technique often used in educational and psychological research to identify meaningful groups of individuals within a larger heterogeneous population based on a set of variables. This technique is flexible, encompassing not only a static set of variables but also longitudinal data in the form of growth mixture…
ERIC Educational Resources Information Center
Duxbury, Mark
2004-01-01
An enzymatic laboratory experiment based on the analysis of serum is described that is suitable for students of clinical chemistry. The experiment incorporates an introduction to mathematical method-comparison techniques in which three different clinical glucose analysis methods are compared using linear regression and Bland-Altman difference…
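A minimal sketch of the Bland-Altman difference computation mentioned above follows; the paired glucose readings are synthetic assumptions, not the experiment's data.

```python
# Minimal sketch: Bland-Altman bias and 95% limits of agreement.
import numpy as np

rng = np.random.default_rng(7)
method_a = rng.normal(5.5, 1.0, 40)               # mmol/L readings, assumed
method_b = method_a + rng.normal(0.1, 0.2, 40)    # small bias plus noise

diff = method_a - method_b                        # plotted against the pair means
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print("bias %.3f, limits of agreement [%.3f, %.3f]" % (bias, bias - loa, bias + loa))
```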
USDA-ARS's Scientific Manuscript database
A technique is described where an atmospheric pressure-thermal desorption (AP-TD) device and electrospray ionization (ESI)-mass spectrometry are coupled and used for the rapid analysis of Bacillus spores in complex matrices. The resulting AP-TD/ESI-MS technique combines the generation of volatile co...
Effect of various putty-wash impression techniques on marginal fit of cast crowns.
Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael
2013-01-01
Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.
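The statistical step described, a one-way ANOVA across the three technique groups, can be sketched with SciPy as below; the marginal-gap numbers are synthetic assumptions, not the study's measurements.

```python
# Minimal sketch: one-way ANOVA across three impression-technique groups.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(8)
group_a = rng.normal(90, 12, 15)    # single-step (gap in micrometres, assumed)
group_b = rng.normal(70, 10, 15)    # two-step, 2-mm relief
group_c = rng.normal(95, 14, 15)    # two-step, polyethylene spacer

f_stat, p_value = f_oneway(group_a, group_b, group_c)
print("F = %.2f, p = %.4f" % (f_stat, p_value))   # follow with post hoc contrasts
```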
Supercritical fluid chromatography for lipid analysis in foodstuffs.
Donato, Paola; Inferrera, Veronica; Sciarrone, Danilo; Mondello, Luigi
2017-01-01
The task of lipid analysis has always challenged separation scientists, and new techniques in chromatography were often developed for the separation of lipids; however, no single technique or methodology is yet capable of affording a comprehensive screening of all lipid species and classes. This review examines the role of supercritical fluid chromatography within the field of lipid analysis, from the early capillary separations based on pure CO2 to the most recent techniques employing packed columns under subcritical conditions, including the niche multidimensional techniques using supercritical fluids in at least one of the separation dimensions. A short history of supercritical fluid chromatography is introduced first, from its early popularity in the late 1980s, through its sudden fall and oblivion until the last decade, to the recent regain of interest within the chromatographic community. Afterwards, the subject of lipid nomenclature and classification is briefly dealt with, before discussing the main applications of supercritical fluid chromatography for food analysis, according to the specific class of lipids.
NASA Astrophysics Data System (ADS)
Volodin, Boris; Dolgy, Sergei; Ban, Vladimir S.; Gracin, Davor; Juraić, Krunoslav; Gracin, Leo
2014-03-01
Shifted Excitation Raman Difference Spectroscopy (SERDS) has proven an effective method for performing Raman analysis of fluorescent samples. This technique allows excellent signal-to-noise performance to be achieved with shorter excitation wavelengths, thus taking full advantage of the superior signal strength afforded by shorter excitation wavelengths and the superior performance, combined with lower cost, delivered by silicon CCDs. The technique is enabled by the use of two closely spaced fixed-wavelength laser diode sources stabilized with Volume Bragg gratings (VBGs). A side-by-side comparison reveals that the SERDS technique delivers superior signal-to-noise ratio and better detection limits in most situations, even when a longer excitation wavelength is employed for the purpose of eliminating fluorescence. We have applied the SERDS technique to the quantitative analysis of trace amounts of methanol in red wines, which is an important task in quality control within the wine industry and is currently difficult to perform in the field. So far, conventional Raman spectroscopy analysis of red wines has been impractical due to the high degree of fluorescence.
Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs
NASA Technical Reports Server (NTRS)
Somani, Arun K.
1996-01-01
Accurate reliability analysis of a system requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same; however, multiple phases are natural. We present a new, computationally efficient technique for the analysis of phased-mission systems where the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and is expensive in computation. We avoid the state-space explosion. The phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate our technique by means of several examples and present numerical results to show the effects of phases and repairs on the system reliability/availability.
NASA Astrophysics Data System (ADS)
Fosnight, Alyssa M.; Moran, Benjamin L.; Branco, Daniela R.; Thomas, Jessica R.; Medvedev, Ivan R.
2013-06-01
As many as 3000 chemicals are reported to be found in exhaled human breath. Many of these chemicals are linked to certain health conditions and environmental exposures. Present state-of-the-art techniques used for analysis of exhaled human breath include mass spectrometry based methods, infrared spectroscopic sensors, electrochemical sensors, and semiconductor oxide based testers. Some of these techniques are commercially available but are somewhat limited in their specificity and exhibit a fairly high probability of false alarm. Here, we present the results of our most recent study, which demonstrated a novel application of a terahertz high-resolution spectroscopic technique to the analysis of exhaled human breath, focused on detection of ethanol in the exhaled breath of a person who consumed an alcoholic drink. This technique possesses nearly "absolute" specificity, and we demonstrated its ability to uniquely identify ethanol, methanol, and acetone in human breath. This project is now complete, and we are looking to extend this method of chemical analysis of exhaled human breath to a broader range of chemicals in an attempt to demonstrate its potential for biomedical diagnostic purposes.
Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis
NASA Technical Reports Server (NTRS)
Eberhart, C. J.; Casiano, M. J.
2015-01-01
Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as multifractal detrended fluctuation analysis (MFDFA), has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. A modified technique is then leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
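A minimal sketch of the monofractal core of MFDFA (detrended fluctuation analysis) follows; the window sizes and white-noise series are assumptions, and the multifractal version repeats the fluctuation step over a range of moments q.

```python
# Minimal sketch: detrended fluctuation analysis (DFA) scaling exponent.
import numpy as np

rng = np.random.default_rng(9)
signal = rng.standard_normal(4096)                  # stand-in for combustion noise
profile = np.cumsum(signal - signal.mean())         # integrated series

scales = np.unique(np.logspace(1.2, 3.0, 15).astype(int))
flucts = []
for s in scales:
    n_seg = profile.size // s
    segs = profile[:n_seg * s].reshape(n_seg, s)
    t = np.arange(s)
    rms = []
    for seg in segs:
        coef = np.polyfit(t, seg, 1)                # local linear detrend
        rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
    flucts.append(np.sqrt(np.mean(rms)))

alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
print("DFA scaling exponent alpha ~ %.2f" % alpha)  # ~0.5 for white noise
```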
Dinç, Erdal; Ozdemir, Abdil
2005-01-01
A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on linear regression equations constructed from the relationship between concentration and peak area at a five-wavelength set. The algorithm of this mathematical calibration model, having simple mathematical content, is briefly described. This approach is a powerful mathematical tool for optimum chromatographic multivariate calibration and for eliminating fluctuations arising from instrumental and experimental conditions. The multivariate chromatographic calibration involves reduction of multivariate linear regression functions to a univariate data set. The model was validated by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The obtained results were compared with those obtained by a classical HPLC method; it was observed that the proposed multivariate chromatographic calibration gives better results than classical HPLC.
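The reduction described, per-wavelength linear regressions inverted jointly for the two analytes, can be sketched as a small least-squares problem; the sensitivity matrix and concentrations below are illustrative assumptions, not the paper's calibration data.

```python
# Minimal sketch: five-wavelength, two-component calibration via least squares.
import numpy as np

rng = np.random.default_rng(10)
# Assumed sensitivities (peak area per unit concentration) of EA and HCT
# at five wavelengths, one row per wavelength:
K = np.array([[1.2, 0.3], [0.9, 0.7], [0.5, 1.1], [0.2, 1.4], [0.8, 0.9]])

c_true = np.array([4.0, 2.5])                        # unknown mixture (assumed units)
areas = K @ c_true + 0.02 * rng.standard_normal(5)   # measured peak areas + noise

c_est, *_ = np.linalg.lstsq(K, areas, rcond=None)    # joint inversion of the regressions
print("estimated concentrations:", np.round(c_est, 2))
```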
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
MeV ion-beam analysis of optical data storage films
NASA Technical Reports Server (NTRS)
Leavitt, J. A.; Mcintyre, L. C., Jr.; Lin, Z.
1993-01-01
Our objectives are threefold: (1) to accurately characterize optical data storage films by MeV ion-beam analysis (IBA) for ODSC collaborators; (2) to develop new and/or improved analysis techniques; and (3) to expand the capabilities of the IBA facility itself. Using H-1(+), He-4(+), and N-15(++) ion beams in the 1.5 MeV to 10 MeV energy range from a 5.5 MV Van de Graaff accelerator, film thickness (in atoms/sq cm), stoichiometry, impurity concentration profiles, and crystalline structure were determined by Rutherford backscattering (RBS), high-energy backscattering, channeling, nuclear reaction analysis (NRA) and proton induced X-ray emission (PIXE). Most of these techniques are discussed in detail in the ODSC Annual Report (February 17, 1987), p. 74. The PIXE technique is briefly discussed in the ODSC Annual Report (March 15, 1991), p. 23.
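Since RBS underlies most of these measurements, a short sketch of the standard elastic-scattering kinematic factor, which sets the mass discrimination of the technique, may be helpful; the target masses and the 170-degree detector angle are illustrative assumptions, not parameters from this facility.

```python
import numpy as np

def rbs_kinematic_factor(m1, m2, theta_deg):
    """Kinematic factor K = E1/E0 for a projectile of mass m1 elastically
    backscattered from a surface atom of mass m2 at lab angle theta."""
    th = np.radians(theta_deg)
    return ((np.sqrt(m2**2 - (m1 * np.sin(th))**2) + m1 * np.cos(th))
            / (m1 + m2)) ** 2

# 4He backscattered at 170 degrees from a heavy film constituent
# (m ~ 128, e.g. Te) versus a light one (m ~ 16, e.g. O): the widely
# separated K values are what make film stoichiometry resolvable.
print(round(rbs_kinematic_factor(4, 128, 170), 3))   # ~0.883
print(round(rbs_kinematic_factor(4, 16, 170), 3))    # ~0.363
```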
Biomedical application of MALDI mass spectrometry for small-molecule analysis.
van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M
2011-01-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relatively high tolerance towards salts and buffers, and the possibility to store samples on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges, as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.
Watkinson, D; Rimmer, M; Kasztovszky, Z; Kis, Z; Maróti, B; Szentmiklósi, L
2014-01-01
Chloride (Cl) ions diffuse into iron objects during burial and drive corrosion after excavation. Located under corrosion layers, Cl is inaccessible to many analytical techniques. Neutron analysis offers non-destructive avenues for determining Cl content and distribution in objects. A pilot study used prompt gamma activation analysis (PGAA) and prompt gamma activation imaging (PGAI) to analyse the bulk concentration and longitudinal distribution of Cl in archaeological iron objects. This correlated with the object corrosion rate measured by oxygen consumption, and compared well with Cl measurement using a specific ion meter. High-Cl areas were linked with visible damage to the corrosion layers and attack of the iron core. Neutron techniques have significant advantages in the analysis of archaeological metals, including penetration depth and low detection limits.
NASA Astrophysics Data System (ADS)
Yedra, Lluís; Eswara, Santhana; Dowsett, David; Wirtz, Tom
2016-06-01
Isotopic analysis is of paramount importance across the entire gamut of scientific research. To advance the frontiers of knowledge, a technique for nanoscale isotopic analysis is indispensable. Secondary Ion Mass Spectrometry (SIMS) is a well-established technique for analyzing isotopes, but its spatial resolution is fundamentally limited. Transmission Electron Microscopy (TEM) is a well-known method for high-resolution imaging down to the atomic scale. However, isotopic analysis in TEM is not possible. Here, we introduce a powerful new paradigm for in-situ correlative microscopy, called Parallel Ion Electron Spectrometry, that synergizes SIMS with TEM. We demonstrate this technique by distinguishing lithium carbonate nanoparticles according to the isotopic label of lithium (6Li or 7Li) and imaging them at high resolution by TEM, adding a new dimension to correlative microscopy.
Model for spectral and chromatographic data
Jarman, Kristin [Richland, WA; Willse, Alan [Richland, WA; Wahl, Karen [Richland, WA; Wahl, Jon [Richland, WA
2002-11-26
A method and apparatus using a spectral analysis technique are disclosed. In one form of the invention, probabilities are selected to characterize the presence (and in another form, also a quantification of a characteristic) of peaks in an indexed data set for samples that match a reference species, and other probabilities are selected for samples that do not match the reference species. An indexed data set is acquired for a sample, and a determination is made according to techniques exemplified herein as to whether the sample matches or does not match the reference species. When quantification of peak characteristics is undertaken, the model is appropriately expanded, and the analysis accounts for the characteristic model and data. Further techniques are provided to apply the methods and apparatuses to process control, cluster analysis, hypothesis testing, analysis of variance, and other procedures involving multiple comparisons of indexed data.
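One plausible reading of the probabilistic peak model is a Bernoulli presence/absence likelihood-ratio test, sketched below; the peak probabilities and the decision rule are assumptions for illustration and are not taken from the patent's claims.

```python
import numpy as np

def log_likelihood_ratio(peaks, p_match, p_nonmatch):
    """Log-likelihood ratio that a binary peak-presence vector was drawn
    from the reference species' peak probabilities rather than the
    non-match (background) probabilities."""
    peaks = np.asarray(peaks, dtype=float)
    ll_m = peaks * np.log(p_match) + (1 - peaks) * np.log(1 - p_match)
    ll_n = peaks * np.log(p_nonmatch) + (1 - peaks) * np.log(1 - p_nonmatch)
    return float(np.sum(ll_m - ll_n))

p_match = np.array([0.95, 0.90, 0.85, 0.10])     # hypothetical reference probs
p_nonmatch = np.array([0.30, 0.25, 0.40, 0.10])  # hypothetical background probs
sample = [1, 1, 1, 0]                            # observed indexed data set
print(log_likelihood_ratio(sample, p_match, p_nonmatch) > 0)  # True -> match
```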
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.; Tanner, J. A.
1984-01-01
An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present Adam Webber's original DC-invariant analysis, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; the method analysis, however, can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to extend it to an analysis that yields class-wide invariants describing the shapes of linked data structures. We have a prototype implementation, a system we refer to as "the analyzer", that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array-bounds checks in Java programs and to perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. The technique that produces the best results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy programs.
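By way of analogy only, the toy sketch below shows the kind of unary and binary field relations such an analysis reports and how falsified candidates drop out. Webber's DC-invariant analysis is static and operates on Java bytecode; this sketch merely checks hypothetical candidate relations against example object states, so it illustrates the output, not the algorithm.

```python
# Example object states of a hypothetical buffer class.
states = [
    {"buf": [0, 0, 0, 0], "count": 0},
    {"buf": [1, 2, 0, 0], "count": 2},
    {"buf": [1, 2, 3, 0], "count": 3},
]

# Candidate unary and binary relations over the fields.
candidates = {
    "buf is never None":  lambda s: s["buf"] is not None,
    "0 <= count":         lambda s: 0 <= s["count"],
    "count <= len(buf)":  lambda s: s["count"] <= len(s["buf"]),  # binary
    "count == len(buf)":  lambda s: s["count"] == len(s["buf"]),  # falsified
}

# Relations surviving every state are reported as invariants; relations
# like count <= len(buf) are what justify removing array-bounds checks.
invariants = [name for name, holds in candidates.items()
              if all(holds(s) for s in states)]
print(invariants)
```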
Artificial Intelligence Techniques: Applications for Courseware Development.
ERIC Educational Resources Information Center
Dear, Brian L.
1986-01-01
Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…
Findlay, I; Wong, F; Smith, C; Back, D; Davies, A; Ajuied, A
2016-03-01
Recent meta-analyses support not resurfacing the patella at the time of TKA. Several different modes of intervention are reported for non-resurfacing management of the patella at TKA. We have conducted a systematic review and meta-analysis of non-resurfacing interventions in TKA. The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) study methodology and reporting system was adopted, utilising the PRISMA checklist and statement. Classes of patellar intervention were defined as: 0, no intervention; 1, osteophyte excision only; 2, osteophyte excision, denervation, and soft tissue debridement; 3, osteophyte excision, denervation, soft tissue debridement, and drilling or micro-fracture of eburnated bone; and 4, patellar resurfacing. A meta-analysis was conducted on the pre- and post-operative KSS for each technique. Four hundred and twenty-three studies were identified; 12 met the inclusion criteria for the systematic review and eight for the meta-analysis. Two studies compared different non-resurfacing patellar techniques; the other studies used the non-resurfacing cohort as controls for their prospective RCTs comparing patellar resurfacing with non-resurfacing. The meta-analysis revealed no significant difference between the techniques. We conclude that there is no significant difference in KSS for differing non-resurfacing patellar techniques, but further trials using patellofemoral-specific scores may better demonstrate superior efficacy of specific classes of patellar intervention, by virtue of greater sensitivity for patellofemoral pain and dysfunction. Level of evidence: I. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
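For readers unfamiliar with the pooling step, a minimal sketch of fixed-effect inverse-variance meta-analysis of a mean KSS change follows; the per-study means and standard errors are hypothetical, not the values from the eight included trials.

```python
import numpy as np

# Hypothetical per-study mean KSS improvement and standard errors.
means = np.array([32.0, 28.5, 35.1, 30.2])
ses = np.array([3.0, 4.1, 2.6, 3.5])

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2.
w = 1.0 / ses**2
pooled = np.sum(w * means) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(pooled, 1), np.round(ci, 1))   # pooled effect and 95% CI
```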
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis, assessing the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparison of the obtained landslide susceptibility maps of both MCDA techniques with known landslides shows that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
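A minimal sketch of the Monte Carlo weight-sensitivity step is given below, assuming a simple weighted linear combination of criteria; the criterion scores, base weights, and perturbation scale are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Criterion scores (slope, land cover, distance to faults, ...) scaled
# to 0..1 for five map cells; values are invented.
criteria = rng.random((5, 3))
w0 = np.array([0.5, 0.3, 0.2])   # assumed AHP-derived base weights

# Monte Carlo sensitivity: perturb the weights, renormalize, and track
# the spread of the weighted-linear-combination susceptibility score.
scores = []
for _ in range(1000):
    w = np.clip(w0 + rng.normal(0, 0.05, 3), 1e-6, None)
    scores.append(criteria @ (w / w.sum()))
scores = np.array(scores)
print(scores.mean(axis=0), scores.std(axis=0))  # per-cell mean / uncertainty
```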
Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C
2000-09-01
The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput, chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference: plasma by 10,000 Da exclusion filtration, saliva by sedimentation, and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction, with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water studies. Copyright 2000 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Cubillos, G. I.; Bethencourt, M.; Olaya, J. J.
2015-02-01
ZrOxNy/ZrO2 thin films were deposited on stainless steel using two different methods: ultrasonic spray pyrolysis-nitriding (SPY-N) and the DC unbalanced magnetron sputtering technique (UBMS). Using the first method, ZrO2 was initially deposited and subsequently nitrided in an anhydrous ammonia atmosphere at 1023 K at atmospheric pressure. For UBMS, the film was deposited in an atmosphere of air/argon with a Φair/ΦAr flow ratio of 3.0. Structural analysis was carried out through X-ray diffraction (XRD), and morphological analysis was done through scanning electron microscopy (SEM) and atomic force microscopy (AFM). Chemical analysis was carried out using X-ray photoelectron spectroscopy (XPS). ZrOxNy rhombohedral polycrystalline film was produced with spray pyrolysis-nitriding, whereas using the UBMS technique, the oxynitride films grew with cubic Zr2ON2 crystalline structures preferentially oriented along the (2 2 2) plane. Upon chemical analysis of the surface, the coatings exhibited spectral lines of Zr3d, O1s, and N1s, characteristic of zirconium oxynitride/zirconia. SEM analysis showed the homogeneity of the films, and AFM showed morphological differences according to the deposition technique of the coatings. Zirconium oxynitride films enhanced the stainless steel's resistance to corrosion using both techniques. The protective efficacy was evaluated using electrochemical techniques based on linear polarization (LP). The results indicated that the layers provide good resistance to corrosion when exposed to chloride-containing media.
Cost-revenue analysis in the surgical treatment of the obstructed defecation syndrome.
Schiano di Visconte, Michele; Piccin, Alessandra; Di Bella, Raimondo; Giomo, Priscilla; Pederiva, Vania; Cina, Livio Dal; Munegato, Gabriele
2006-01-01
The obstructed defecation syndrome is a frequent condition in the female population. Rectocele and rectal intussusception may cause symptoms of obstructed defecation. The aim of this study is to carry out an economic cost-revenue analysis comparing surgical repair of rectocele and rectal intussusception with a double-transanal circular stapler (Stapled Trans-Anal Rectal Resection, STARR) against other techniques used to repair the same defects. The analysis involved the systematic calculation of the costs incurred during hospitalisation. The revenue estimate was obtained from the rate quantification of the Diagnosis Related Group (DRG) associated with each hospitalisation. Our analysis confirmed that the global expenditure for the STARR technique amounts to 3,579.09 Euro, as against 5,401.15 Euro for abdominal rectocele repair and 3,469.32 Euro for perineal repair. Intussusception repair according to Delorme's procedure costs 5,877.41 Euro, as against 3,579.09 Euro for the STARR technique. The revenue analysis revealed a substantial gain for the Health Authority as regards the treatment of rectocele and rectal intussusception for obstructed defecation syndrome. The highest revenue, 6,168.52 Euro, was obtained with intussusception repair with STARR, as compared to Delorme's procedure, which yielded revenue of 2,359.04 Euro. Lower revenues are recorded if the STARR technique is intended for rectocele repair; in this case the revenue amounts to 1,778.12 Euro, as against 869.67 Euro and 1,887.89 Euro for abdominal and perineal repair, respectively.
An intercomparison of five ammonia measurement techniques
NASA Technical Reports Server (NTRS)
Williams, E. J.; Sandholm, S. T.; Bradshaw, J. D.; Schendel, J. S.; Langford, A. O.; Quinn, P. K.; Lebel, P. J.; Vay, S. A.; Roberts, P. D.; Norton, R. B.
1992-01-01
Results obtained from five techniques for measuring gas-phase ammonia at low concentration in the atmosphere are compared. These methods are: (1) a photofragmentation/laser-induced fluorescence (PF/LIF) instrument; (2) a molybdenum oxide annular denuder sampling/chemiluminescence detection technique; (3) a tungsten oxide denuder sampling/chemiluminescence detection system; (4) a citric-acid-coated denuder sampling/ion chromatographic analysis (CAD/IC) method; and (5) an oxalic-acid-coated filter pack sampling/colorimetric analysis method. It was found that two of the techniques, the PF/LIF and the CAD/IC methods, measured approximately 90 percent of the calculated ammonia added in the spiking tests and agreed very well with each other in the ambient measurements.
Analytical techniques of pilot scanning behavior and their application
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.
1986-01-01
The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
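As a small illustration of two of these metrics, the sketch below computes dwell percentages and a first-order entropy-rate estimate from a fixation sequence; the instrument labels and the sequence itself are invented.

```python
import numpy as np
from collections import Counter, defaultdict

# Hypothetical fixation sequence over cockpit instruments.
seq = ["ADI", "ADI", "ALT", "ADI", "HSI", "HSI", "ADI", "ALT", "ADI", "ADI"]

# Dwell percentage: share of samples spent on each instrument.
dwell_pct = {k: 100 * v / len(seq) for k, v in Counter(seq).items()}

# First-order entropy rate H(next | current) in bits per transition:
# low values indicate stereotyped scanning, high values erratic scanning.
trans = defaultdict(Counter)
for a, b in zip(seq, seq[1:]):
    trans[a][b] += 1
n_total = len(seq) - 1
H = 0.0
for a, nexts in trans.items():
    n_a = sum(nexts.values())
    H -= (n_a / n_total) * sum(
        (c / n_a) * np.log2(c / n_a) for c in nexts.values())
print(dwell_pct, round(H, 2))
```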
Mass spectrometry. [in organic chemistry
NASA Technical Reports Server (NTRS)
Burlingame, A. L.; Shackleton, C. H. L.; Howe, I.; Chizhov, O. S.
1978-01-01
A review of mass spectrometry in organic chemistry is given, dealing with advances in instrumentation and computer techniques, selected topics in gas-phase ion chemistry, and applications in such fields as biomedicine, natural-product studies, and environmental pollution analysis. Innovative techniques and instrumentation are discussed, along with chromatographic-mass spectrometric on-line computer techniques, mass spectral interpretation and management techniques, and such topics in gas-phase ion chemistry as electron-impact ionization and decomposition, photoionization, field ionization and desorption, high-pressure mass spectrometry, ion cyclotron resonance, and isomerization reactions of organic ions. Applications of mass spectrometry are examined with respect to bio-oligomers and their constituents, biomedically important substances, microbiology, environmental organic analysis, and organic geochemistry.
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot, which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail, and a novel approach to the a priori selection of pertinent model parameters is introduced. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
The Shock and Vibration Bulletin. Part 3. Dynamic Analysis, Design Techniques
1980-09-01
[Extraction-garbled excerpt from The Shock and Vibration Bulletin, Part 3: Dynamic Analysis, Design Techniques (September 1980). Recoverable fragments: the discussion concerns response at certain discrete frequencies, not over a random-frequency spectrum, using a technique for dynamic analysis pioneered by Myklestad [1] and later Pestel and Leckie [2]. Recoverable references: 1. "Fundamentals of Vibration Analysis," McGraw-Hill, New York, 1956; 2. E.C. Pestel and F.A. Leckie, "Matrix ..." The remainder of the record is cover-page and nomenclature residue.]
An Introduction to Data Analysis in Asteroseismology
NASA Astrophysics Data System (ADS)
Campante, Tiago L.
A practical guide is presented to some of the main data analysis concepts and techniques employed contemporarily in the asteroseismic study of stars exhibiting solar-like oscillations. The subjects of digital signal processing and spectral analysis are introduced first. These concern the acquisition of continuous physical signals to be subsequently digitally analyzed. A number of specific concepts and techniques relevant to asteroseismology are then presented as we follow the typical workflow of the data analysis process, namely, the extraction of global asteroseismic parameters and individual mode parameters (also known as peak-bagging) from the oscillation spectrum.
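A minimal sketch of the first step, turning a time series into an oscillation power spectrum, follows; the cadence, mode frequency, and noise level are invented, and real gapped time series would call for a Lomb-Scargle periodogram rather than the plain FFT used here.

```python
import numpy as np

# Evenly sampled toy "light curve": one solar-like oscillation mode at
# 3000 micro-Hz buried in white noise.
dt = 60.0                                   # 60 s cadence
t = np.arange(0, 10 * 86400, dt)            # 10 days of data
rng = np.random.default_rng(4)
flux = 1e-4 * np.sin(2 * np.pi * 3000e-6 * t) + rng.normal(0, 5e-4, t.size)

# Power spectrum; frequency axis converted to micro-Hz.
freq = np.fft.rfftfreq(t.size, dt) * 1e6
power = np.abs(np.fft.rfft(flux)) ** 2
print(freq[np.argmax(power[1:]) + 1])       # ~3000 micro-Hz (skip DC bin)
```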
Bayesian linkage and segregation analysis: factoring the problem.
Matthysse, S
2000-01-01
Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration is explicitly examined.
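To make the integration problem concrete, the sketch below estimates a posterior mean by self-normalized importance sampling, one of the three techniques named; the one-dimensional toy posterior stands in for the high-dimensional pedigree likelihood and is purely an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Unnormalized posterior p(theta) ~ likelihood * prior; the Gaussian
# shape below is invented and stands in for the expensive pedigree
# likelihood of real linkage/segregation problems.
def unnorm_posterior(theta):
    return np.exp(-0.5 * ((theta - 1.2) / 0.3) ** 2)

# Importance sampling with a broad normal proposal q(theta) = N(0, 2^2).
theta = rng.normal(0.0, 2.0, 100_000)
q = np.exp(-0.5 * (theta / 2.0) ** 2) / (2.0 * np.sqrt(2 * np.pi))
w = unnorm_posterior(theta) / q

# Self-normalized estimate of the posterior mean of theta.
post_mean = np.sum(w * theta) / np.sum(w)
print(round(post_mean, 2))   # ~1.2
```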